Wither general education in America? Tom Ehrlich, senior scholar at the Carnegie Foundation for the Advancement of Teaching, is blue about the Harvard plan for general education. In an essay for Carnegie Perspectives, Ehrlich (formerly president of Indiana University) laments the proposal of the Harvard Faculty of Arts and Sciences to substantially water down Harvard’s general education requirements.
Ehrlich’s main criticism is that “the sad reality is that the new plan looks like it was crafted to serve the faculty and not the students. It will ensure that faculty need teach only what they want to teach, leaving it up to students to make whatever connections they can among their courses.” In particular, Ehrlich worries that students will lose the sort of common academic experience that the core curriculum provides and will end up taking an idiosyncratic cluster of freshman and sophomore courses aimed at narrow, career-oriented goals, rather than acquiring the broad-based knowledge the core curriculum was designed to deliver.
On the one hand, Ehrlich is right to be concerned. For more than a century, Harvard and other institutions at the pinnacle of the American educational system have set the standard for undergraduate education that many institutions further down the food chain then seek to emulate. For instance, it was at Harvard in the 19th century that the current large lecture/TA recitation section model began. In its first decades this approach to the large course was even known as the “Harvard system.” So it is reasonable to expect that if Harvard waters down its approach to general education, others will follow suit.
But is this a bad thing?
It is if one assumes that the course will continue to be the unit of content delivery over time. It is also a bad thing if one assumes that students should not be allowed to decide what knowledge they ought to acquire during their undergraduate years. While I agree with Ehrlich that students ought to expose themselves to a wide fund of knowledge, especially as freshmen and sophomores, before declaring a major, I’m less convinced that they must be required to do so.
For more than a century we have delivered an undergraduate curriculum that is largely unchanged. Students enroll at our universities, take 40 or so courses, 10-15 of which are narrowly focused in a major, and then graduate and head off to the world beyond the campus boundaries. And then they find out what they don’t know. Employers demand specialized or general knowledge and skills that they didn’t get in college, and so they attend corporate training courses, take continuing education from us, go back to graduate school, or simply change careers to avoid those first options altogether.
Why not let the students and the market decide what is best? Why not give them the chance to accumulate the knowledge they think they ought to learn?
The downside to this, of course, is that many students will over-concentrate. But won’t those students find themselves at a disadvantage in the job market? Won’t employers want employees who can think as well as do?
And, if I’m right that the course is being undermined by technological change, then I suspect we’ll see the proliferation of content modules that students can take either as part of a course or on their own. If this happens, students will be able to over-concentrate even more. But if employers care about this, then I suspect we’ll see less and less over-concentration as students align their educational experiences with employer expectations.
So, is Ehrlich right to be worried? At one level he is, but at another level he’s worrying about a problem of the previous century rather than the one taking shape in this one.