Do you remember First Lieutenant Milo Minderbinder? He’s the guy in Joseph Heller’s Catch-22 who bought eggs in Malta for 84 cents a dozen and sold them back to the Maltese for 51 cents a dozen. The important thing was that the syndicate made a profit, so everyone who owned a share made out in the end.
Almost four years ago now, I wrote a series of posts that, at the time, seemed to me to be a bit like Milo’s strategies for getting rich from World War II [start here to read them in the original]. Given the current level of interest in “learning badges,” massive online lectures, and other “new” ideas about the future of content delivery and learning in higher education, I thought I would take a minute to pull together what I wrote way back in 2008 into something a bit more like an essay.
What follows is a slightly edited version of what I wrote back then, all in one document. The only real changes are those I’ve made to knit the six posts together, rather than have them read like a series of posts in a blog.
The End of Western Civilization As We Know It
March 21-28, 2008
Over the past couple of years I’ve written a number of posts in which I wrestle with what technological change means for the future of higher education, general education, and history education specifically. Much of my speculating and ranting in these posts has centered on what seems to me to be a clash between the traditional methods by which knowledge is delivered to students (curriculum, teaching) and the world that our students live in (tech-centric, socially networked, etc.).
An article in the March issue of Wired by Editor-in-Chief Chris Anderson (Mr. Long Tail) caught my attention because Anderson’s argument dovetailed so nicely with what I’ve been trying to articulate for a while. In the piece (“Free! Why $0.00 is the Future of Business”), Anderson argues that the “free economy” of the Internet has already transformed the way that business is done globally (but especially in the U.S.) and that to deny the impact of “free” on consumer behavior is to deny the reality in front of your face.
Case in point: Google provides all of its services for free and the last time I looked Google was a profitable company.
Bands give away their music. Yahoo! now provides infinite email storage. And lots of other businesses are in what Anderson terms the “race to the bottom.” For instance, you can fly from London to Barcelona on RyanAir for $20, despite the fact that it costs the airline $70 to get you there. All of these businesses are, at last check, making money.
In its usual way, higher education is attempting to prove the market wrong. According to the National Center for Education Statistics, over the past 30 years the average cost to attend an American college or university has increased by 543%, and over the past ten years alone it has increased by 59% (from $9,206 to $14,629). Although private institutions have gotten the lion’s share of the bad press for raising their prices aggressively, price increases (on a percentage basis) at public institutions were actually greater over that same decade.
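For reference, the ten-year figure follows directly from those two dollar amounts:

$$\frac{14{,}629 - 9{,}206}{9{,}206} = \frac{5{,}423}{9{,}206} \approx 0.589 \approx 59\%$$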
How long can colleges and universities ignore what is happening in the larger economy around them? If your first reaction was “quite a long time,” you might be right: they have built powerful brands, and they sell an intangible future benefit that study after study shows is worth its price over time.
But already we are seeing cracks in the model that has sustained prices in higher education for so long. And I think that these examples portend a change in the way higher education does business. Before you object that higher education is not a “business,” I’ll just throw out the fact that according to the 2002 Economic Census, educational services in the United States accounted for more than $30 billion in receipts, employing more than 430,000 people.
Given that the Internet is changing the way that American (and global) companies do business in many sectors, what will that mean for American higher education–an industry that ignores market forces at its peril?
To start with, let me ask this question: when was the last time you heard someone say, “Man, I just love teaching those general education courses!”
To be sure, there are a few crazies out there (like me) who actually enjoy teaching courses like Western Civ, but by and large, in most academic departments general education courses are seen as necessary chores–and as chores that can often be fobbed off to graduate students or adjunct faculty.
Once upon a time, introductory courses in the general education curriculum had a real and abiding purpose–to teach first- and second-year college students a certain set of basic things that would enable them to (a) prosper in later courses, (b) be exposed to things they might not otherwise sign up to learn about, and (c) prepare for citizenship in the nation. Over the years, debates about general education curricula have swirled around the question of breadth versus depth, but rarely around the economic model that general education courses are part of.
And this is one of the many dirty little secrets of American higher education.
In my own university, for instance, every single undergraduate student is required to pass or place out of Western Civilization in order to graduate. For my department, this requirement is both a huge undertaking and an economic windfall. In any given semester we teach Western Civ to something like 1,500 undergraduates (all in sections of 50 or fewer). But when budget time rolls around, we get credit for filling approximately 3,000 seats in classrooms each year from this one course alone. Because George Mason is a relatively new university with little in the way of endowment, our PhD program is sustained, in large part, by those 3,000 undergraduates.
Given this economic reality, what I’m about to say will sound controversial at best, cracked at worst. I think we ought to take Chris Anderson seriously and start giving the course away. In fact, I think we ought to be giving away the entire general education curriculum at George Mason. Based on our current requirements, that means 40 credits of a college education. For free.
Lest you think that I’m the reincarnation of Milo Minderbinder and his egg buying scheme, hear me out.
Right now the cost of providing a general education course on any campus is not insignificant. Even with graduate students or adjunct faculty teaching the course, there are still many, many costs involved. These include the fixed overhead costs of the buildings and all the support services necessary to make classroom teaching possible (IT support, janitorial services, security, etc.). And then there is the elastic overhead of discounting practices–every student in the classroom is paying a slightly different price for that class based upon a host of factors including scholarships, alumni discounts, and employer support. And every semester these costs recur (and typically go up).
So what would happen if we gave up on the mercantilist vision of higher education as a zero-sum game, where there are only so many students who can fill so many seats in any given semester, and replaced it with what Anderson calls “an ecosystem with many parties, only some of which exchange cash”?
Can’t happen, right?
But it is starting to happen already. You can already learn from professors at UC Berkeley on YouTube–for free. You can already learn from professors at MIT via their OpenCourseWare project–for free. And many institutions of higher education have signed on to iTunes U and made lots (if not all) of the content there free. Already, it seems, higher education may have started Anderson’s “race to the bottom.”
I can think of three principal objections to giving away the general education curriculum.
The first objection has to do with learning. What kind of real learning would take place in a free, online content delivery system? Later in this essay I’ll go into detail about what such a system might look like, but for now, let me deal with the objection that students might not learn much through it.
In the 19th century, instruction at American colleges and universities almost universally took place in small classes or in individual tutorials with professors. Toward the end of the century, a new system arose in which “lecture” courses, especially introductory courses, got larger and larger. Concerned that students in these larger lecture halls weren’t receiving sufficient direct instruction, many institutions adopted what some called the “Harvard system,” which blended the large lecture with a discussion section run by a graduate student or junior professor. And an economic model was born.
Unfortunately for slightly more than 100 years’ worth of students attending those lectures, there is virtually no evidence that lecturing is an effective method of teaching–that is, if we assume that the goal of teaching is to promote learning. Learning may take place in those discussion sections, but very little takes place in the lecture halls. Quite the contrary, actually.
Cognitive researchers will tell you that the vast majority of what goes in via students’ ears exits their brains within 30 minutes, and that a substantial fraction of what remains is gone by the end of the day. We retain only tiny amounts of information acquired by listening to a lecture. Thus, it may make us feel better to note that our students listened to a lecture on the Renaissance, or Kafka, or constitutional government, or whatever, but the object of general education is not to make us feel better–it is for them to learn something.
Viewed from the business side of the house, however, those large lectures with discussion sections are very cost-effective. In fact, on many campuses around the United States they make it possible to offer junior- and senior-level courses with small enrollments–courses where there is lots of evidence that real learning is happening.
Thus, for just over a century we’ve been charging students for courses where they haven’t actually learned very much. For this reason alone we probably ought to stop charging them.
But this leads to the second big objection to launching a free economy in higher education: what would happen to our budgets? At a place like George Mason, the thought is almost too horrible to contemplate.
As it turns out, lots of companies are making money in the free economy. In his Wired essay, Chris Anderson offers half a dozen different ways to do it, and below I’ll offer suggestions for how universities like mine can actually make money (or at least break even) in the free economy.
And the final objection is reputational. What value would potential students assign to a George Mason University education if the first 40 credits of that education were free? For decades, private institutions of higher education have lived off of the idea that the higher your price, the better you are perceived in the marketplace. Public institutions like mine have looked on in a combination of envy and horror as private tuitions have gone stratospheric.
But if we offered 40 credits for free–if we were giving it away–how good could it be?
I submit that this worry is so 20th century.
Before the Internet introduced us to the free economy, I think we would have been right to worry about public perceptions of our quality based on price. But not anymore. I think that the prospect of a college education where one pays for only 80 credits would be more than a little appealing to the average American family–especially now that graduate education seems more and more to be a necessary further expense.
And George Mason is, after all, a state-funded institution, providing a lower-cost education to the people of our state (plus a growing number of out-of-state students). If we can advance our mission at two-thirds of the cost to the students, then aren’t we doing just the sort of public service we were chartered to do?
Here at George Mason, the free portion would mean 40 credits, or one-third of the credits required to graduate.
How would we do that?
The place to start is with the first objection raised above, the one concerning teaching and learning. Right now, today, it wouldn’t be possible to offer the entire set of required courses on our campus for free. But that assumes that courses completed are a reasonable measure of learning.
What I’m suggesting here is that we have to throw out our assumptions about what “teaching and learning” mean in the context of the general education curriculum. Right now, we assume that students enroll in things we call courses, in which faculty members impart knowledge to them in various ways. And we further assume that if a student successfully completes the 40 credit hours we require, he or she will know the things (or be able to do the things) our general education curriculum is set up to impart.
At George Mason, here are the goals of the general education curriculum:
1. To ensure that all undergraduates develop skills in information gathering, written and oral communication, and analytical and quantitative reasoning;
2. To expose students to the development of knowledge by emphasizing major domains of thought and methods of inquiry;
3. To enable students to attain a breadth of knowledge that supports their specializations and contributes to their education in both personal and professional ways;
4. To encourage students to make important connections across boundaries (for example: among disciplines; between the university and the external world; between the United States and other countries).
I think these are all worthy goals, don’t get me wrong. Where I beg to differ with the current system is that I don’t think that these goals have to be accomplished with courses.
What if, instead of thinking about the university as a place where faculty members teach and students take courses, we thought of it as a place that fosters learning? If we let learning be our standard, rather than courses completed, then I think we can liberate ourselves from the feeling that if we don’t teach our students X, they won’t be able to do Y when they leave our campus for the “real” world after graduation.
And, I would further suggest that this sort of approach might just be one cure for something colleagues complain about a lot–the instrumental approach that so many students take when it comes to their education. But really, who can blame them? When so much emphasis is placed on completing courses with a certain grade point average, the goal becomes completing courses, not learning things worth knowing.
So, I think our approach needs to change and change radically. Rather than teaching courses in the general education curriculum, we need to go beyond those lofty statements I listed above and come up with some real, honest benchmarks–benchmarks that allow us to say to our students, okay, you now know enough and have enough of the necessary skills to progress on to upper division courses here at our university.
What, then, would it look like in reality if we were to move to a competency-based approach to general education?
If we start thinking about the university (or college) as an aggregator and re-distributor of knowledge and skills (as well as a place where new knowledge is created), then I think we’ll be on the right path. Right now, though, we view each campus as a place where unique teaching and learning take place. But really, it’s not all that unique.
Consider the introductory Western Civilization course as one example. A simple search of online syllabi using our [now defunct thanks to Google] Syllabus Finder tool returns 23,800 syllabi. Allowing for errors in the search and a large amount of duplication, let’s say that the actual number of unique syllabi is around 5,000. Having taught Western Civ at four different universities over the past 12 years, and having spent countless hours in a national study of the course for the College Board, I can say with confidence that it just isn’t taught that differently from one campus to the next.
Already, instructors around the country and around the world are making video lectures, podcasts, specific assignments, and other aspects of their Western Civ courses available online for free. And, on most of our campuses, we let students test out of Western Civ if they’ve taken an AP or an IB course in European history. So why not just expand on what we are already doing and let any student test out, regardless of whether or not they took one of these high school courses? As I have discussed previously in this space, the legal profession already does this in a number of states.
Of course, in such a model we would still need to provide some basic services to our students to help them navigate their way through preparing for these exams. But these services don’t have to be free. What would a typical student’s initial experience at our institution be like if we were to move to such a model? Here’s one possible vision.
Student X enrolls at our institution. After a week of getting oriented to campus, he or she sits down with an academic adviser and charts a path through the 40 credits needed to move on beyond the general education curriculum. This adviser would show him/her how to access the many learning resources the university has aggregated over time for each course–resources vetted by faculty members for their quality and organized, perhaps, on a wiki page for each exam that the student needs to pass.
Then the student and his or her adviser would establish a schedule of regular meetings, say once every two weeks, for that first semester. They would also establish a schedule for preparing for and then taking the various exams. Given recent research on student persistence in higher education that identifies time management as the number one issue confronting new college students, this sort of regular check-in would be necessary for most students.
Instead of classrooms (in short supply on lots of campuses these days), our fictional university would reallocate space as “learning commons” where students could work individually or collaboratively in preparation for specific exams. Students preparing for an exam could establish Facebook groups or use other social networking tools to find the peers they need. The physical spaces would include robust wireless, dedicated workstations, librarians, and some academic specialists. Faculty members from the various departments might take turns holding “learning hours” rather than office hours in the learning commons spaces.
I think you can see that if we abandon the course as the delivery system for the general education curriculum, then we are free to find lots of interesting ways for teaching and learning to occur.
But it will take some serious letting go of a model we’ve been wedded to for more than 100 years now.
And it will take some creative approaches to the financial side of things, because providing the credits for free doesn’t mean it is free for us to provide them.
Thus far I’ve written a lot about the academic aspects of what free means for those of us in post-secondary education, so now I want to turn to the economic aspects of the argument.
How could it possibly work for an institution like George Mason University–a mass market university with almost no endowment–to give away as much as one-third of its undergraduate degree for free?
In his article (that is the precursor to a book), Chris Anderson of Wired offers a “taxonomy of free” that higher education needs to take very seriously. Among the examples he cites are:

1. “Freemium,” where users of the basic version of a website or service get it free, but for a fee they get access to premium services;
2. The advertising model, where websites carry advertising, whether banners or Google search links;
3. Cross-subsidies, where the free stuff entices you to buy more expensive stuff;
4. The zero marginal cost model, where inexpensively stored and delivered items (think music files) are given away as a vehicle for marketing other goods and services (like concerts, in the case of music);
5. Labor exchange, where the web user does something online in order to help a company build something else entirely (directory assistance queries helping to build databases of consumer information);
6. The gift economy, where web users share things with one another for free without any expectation of compensation (think Wikipedia).
Of these, the “freemium,” cross-subsidy, and zero marginal cost models seem the most relevant to higher education. In the scenario that I’ve laid out above, here is how I see these models working:
Freemium: A student enrolls, paying a one-time enrollment fee to cover the costs of admission, setting up an account in the registrar’s office, obtaining a campus email account and web access, etc. This fee would be pretty low relative to current educational costs–say $500. Then our student has the right to test out of as much of the general education curriculum (up to the maximum 40 credits) as he wishes. The university provides access to lots of free educational content (lectures, learning modules, podcasts, etc.) to help the student prepare for these exams. If, however, the student needs “live help,” that’s not free. So, for instance, an appointment with the writing center costs $20, an hour with a math tutor costs $50, and so on. Anderson cites Flickr.com as the best example of how the freemium model works, and since more than 5 billion photographs have been uploaded to Flickr since the site went live, I’d say their model seems to work fairly well.
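To make the arithmetic of this model concrete, here is a minimal back-of-the-envelope sketch. The $500 enrollment fee and the $20 and $50 service prices come from the paragraph above; the usage counts are purely hypothetical assumptions, not figures from Anderson or from George Mason.

```python
# A back-of-the-envelope sketch of the freemium pricing described above.
# The fee and prices come from the text; the usage counts are hypothetical.

ENROLLMENT_FEE = 500        # one-time fee (from the text)
WRITING_CENTER_VISIT = 20   # per appointment (from the text)
MATH_TUTOR_HOUR = 50        # per hour (from the text)

def freemium_cost(writing_visits: int, tutor_hours: int) -> int:
    """Total out-of-pocket cost of the free 40-credit gen-ed block."""
    return (ENROLLMENT_FEE
            + writing_visits * WRITING_CENTER_VISIT
            + tutor_hours * MATH_TUTOR_HOUR)

# A hypothetical student who visits the writing center six times and buys
# ten hours of math tutoring while preparing for the qualifying exams:
print(freemium_cost(writing_visits=6, tutor_hours=10))  # -> 1120
```

Even that fairly heavy user of premium services would pay a small fraction of the $14,629 average annual cost cited earlier.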
One could argue that the way I’ve just laid out the freemium model will advantage students with more money–they’ll be able to pay for the premium services, while less prosperous students won’t. This assumes that financial aid is not available in such a model–and I think it would be–and it assumes that there are only a limited number of opportunities to test out of portions of the general education curriculum. With financial aid and with multiple opportunities to pass a qualifying exam, less prosperous students would have plenty of access to the upper levels of the university curriculum (which wouldn’t be free).
Cross-Subsidy: The cross-subsidy model is really essential to what I’m thinking about here. Being able to obtain one-third of your college degree for free seems like a real enticement to enroll at a mass market university like GMU. Students who take advantage of our free general education curriculum are, I submit, highly likely to stay with us for the last two-thirds of their degree. Thus, recruitment and retention costs, both of which are significant portions of our administrative overhead, go down.
Zero Marginal Cost: Delivery of learning content online (especially when a lot of that content has been aggregated from elsewhere) has a very low (but not zero) marginal cost for universities. Our bandwidth costs are low relative to the market, and we’ve already built out pretty robust networks. Giving students access to this learning content for free just doesn’t cost us all that much. And where we do incur costs, that’s where the premium service and cross-subsidy models kick in.
Why no advertising? I’m not opposed to the idea that my university will advertise on its websites–we already do, especially for events on campus that cost money to attend. For me it’s a purely aesthetic objection–I hate coming to websites with advertising. If we could do something much less intrusive (think the ads on Facebook), I’d be okay with that. But banner advertising (think Yahoo!) would just bother me too much. But that’s just me.