Re-imagining the College Degree

What is the definition of a bachelor’s degree? At least in the United States, you receive a bachelor’s degree by completing 120 credit units with a passing grade point average.

So what is the definition of a credit unit? One credit unit equals one hour each week in a room with an instructor, for a total of 15 weeks (corresponding to one semester). Most university classes are three credits, which means you spend three hours each week–“contact hours”–in a room with the instructor. The usual expectation is that for each contact hour, a student will also spend two hours of work outside of class time–reading, studying, or working on a project or assignment.

To get 120 units in four years–eight semesters–you need 15 credits each semester. (Or ten credits a quarter, if your school year has three quarters, like the University of Chicago.) With two hours of out-of-class time for each of those contact hours, a full-time student should be spending 45 hours each week, which roughly corresponds to a full-time job, so that seems about right.
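The arithmetic above can be sketched in a few lines of Python. This is a toy illustration; the constants simply restate the figures from the text (120 credits, eight semesters, two out-of-class hours per contact hour):

```python
# Restating the credit-hour arithmetic from the text above.

TOTAL_CREDITS = 120     # credits required for a bachelor's degree
SEMESTERS = 8           # four years, two semesters per year
OUT_OF_CLASS_RATIO = 2  # hours of outside work expected per contact hour

credits_per_semester = TOTAL_CREDITS // SEMESTERS  # 15 credits per semester
contact_hours_per_week = credits_per_semester      # 1 credit = 1 contact hour/week
weekly_hours = contact_hours_per_week * (1 + OUT_OF_CLASS_RATIO)  # in + out of class

print(credits_per_semester, weekly_hours)  # 15 credits, 45 hours per week
```

The 45-hour figure is what supports the comparison to a full-time job.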

But why should this be the definition of a college degree? The 120-credit-unit rule is an “input-based” definition, meaning it’s a measure of how much input (instructor contact hours) students are getting. Contrast this with an “outcome-based” definition, which would define a college degree in terms of what the student had learned and achieved. Let’s say we had a college exit exam–something like the high-school exit exams that some states administer, like the New York State Regents Exam. If you pass the exam, you get a college degree–regardless of where and how you acquired the skills and information necessary to pass the exam.

This alternative is being seriously proposed by some influential people and organizations, according to a story in the Wall Street Journal.* A new SAT-like assessment, called the Collegiate Learning Assessment, or CLA+ for short, is “the latest threat to the fraying monopoly that traditional four-year colleges have enjoyed in defining what it means to be well educated” (according to the WSJ). The group behind the test is the Council for Aid to Education, a New York-based nonprofit that was once part of the Rand Corporation. Anyone can take the test for $35, whether or not they have ever set foot in a classroom. However, unlike most college degrees–where your major is evidence of content-area mastery–the CLA+ assesses general abilities like critical thinking, analytical reasoning, document literacy, writing, and communication.

Outcome-based measures like the CLA+ are, in part, a response to concerns about grade inflation and to the perception among some employers that colleges aren’t doing a good job preparing graduates for the 21st-century workplace. ACT, the nonprofit that administers the college-admission exam, has its own National Career Readiness Certificate, which also measures general abilities like synthesizing and applying graphical information. A recent study found that over 25% of businesses are using the GRE–designed as a graduate-school admission test–to evaluate job applicants with bachelor’s degrees. The MacArthur Foundation has provided funding for a series of “badges” (think of scouting’s merit badges) that each affirm mastery of a specific skill set. Last Thursday, President Obama said he wants the federal government to develop a rating system based on student outcomes.

If this ever comes to pass, it will open up opportunities for all sorts of higher education innovation. MOOCs are in the news today, but it could be another technology or learning design tomorrow. Do you think this will really threaten the traditional campus-based university?

*Douglas Belkin, “Colleges Set To Offer Exit Tests.” WSJ, Monday, August 26, 2013, pp. A1, A2.

The Future of College

I just watched this fascinating 30-minute interview from June 2012, discussing potentially dramatic innovations in higher education. The on-stage interview was part of the Wall Street Journal’s “D: All Things Digital” series, and the host was the Journal’s legendary technology columnist, Walt Mossberg. The two guests were knowledgeable, brilliant, and well-spoken:

  • Salman Khan, creator of the Khan Academy web site (with its instructional videos)
  • John Hennessy, President of Stanford University

There’s a lot of serious change on the horizon. MIT and Harvard have teamed up to offer many of their courses online, for free, through edX. Stanford has its own consortium of universities, also offering free courses online, called Coursera. These initiatives are called “Massive Open Online Courses,” or MOOCs for short. My employer, Washington University, just announced a partnership with ten top universities to offer online courses–but not for free, and only for students who meet admissions criteria.

Khan and Hennessy describe several potential futures. For example, maybe some students could get a college degree without ever setting foot on a campus. Maybe others would do a hybrid degree, with some courses on campus and others over the Internet. Khan proposed the most radical change: maybe employers will stop treating elite college degrees as a certification of your ability to do a job. Instead, your abilities would be certified by an entity that is unattached to any college, and anyone could take any test to demonstrate mastery of a specific ability or topic. It doesn’t matter how you learn it–on a campus, at home, in an informal study group with a few friends. If you pass, you would get a certificate that today’s digerati refer to as a “badge” (by analogy with Boy Scout merit badges). Khan talks about “separating out the teaching part of college from the certification part.”

Also see my post “Will the Internet Transform College?” from May 31, 2012.

What do you think the future will be?

Online “Badges”: Do They Threaten Colleges?

The Chronicle of Higher Education has just published an article (Jan 8, 2012) wondering whether online “badges” pose a challenge to colleges and universities. Here’s the phenomenon:

The spread of a seemingly playful alternative to traditional diplomas, inspired by Boy Scout achievement patches and video-game power-ups, suggests that the standard certification system no longer works in today’s fast-changing job market. Educational upstarts across the Web are adopting systems of “badges” to certify skills and abilities. At the free online-education provider Khan Academy, for instance, students get a “Great Listener” badge for watching 30 minutes of videos from its collection of thousands of short educational clips. With enough of those badges, paired with badges earned for passing standardized tests administered on the site, users can earn the distinction of “Master of Algebra” or other “Challenge Patches.”

This has the potential to be a serious challenge to the traditional university. The reason is that universities serve two functions in modern society. One function is to help students learn; that’s the one we professors spend most of our time thinking about. The other function is to credential young adults as being prepared for the workplace: what I call the certification function. That’s the one a lot of students (and parents) are mostly thinking about. The certification function is not necessarily linked to the learning function. Yes, in a well-functioning university, the certification attests to mastery of the knowledge learned. But how many of you have heard the cynical phrase “You pretend to teach us, and we pretend to learn”?

Employers need information to help them decide whom to hire. They could develop tests and evaluation systems in their human resources departments, but they don’t need to, because universities provide this information for free. If it weren’t for universities and their degrees, employers would have to come up with some other way to evaluate potential hires–and they would much rather keep getting that evaluation for free.

Voilà! Enter the badges: exactly what employers need–a mechanism that serves the certification function and costs them nothing. From the perspective of the employer, badges serve the same function that universities do. Of course, one can argue about their relative effectiveness at serving that function. At this point in history, I trust the university degree far more than these badges, but things could change quickly. So will universities lose their monopoly over the certification function?

The Chronicle article quotes David Wiley, a professor at Brigham Young University: “We have to question the tyranny of the degree…As soon as big employers everywhere start accepting these new credentials, either singly or in bundles, the gig is up completely.” The potential is that a system of badges could completely reframe the relationship between employers and universities. Universities benefit tremendously from their monopoly over the certification function.

Is it really that serious? What do you think?