
The History Curriculum in 2023

In my last post I argued that if we don’t start making substantial changes to the history curriculum, we’ll be in a world of trouble before too much longer. I’m not a fan of those who simply predict doom without offering possible solutions. Now that the semester is over and I have more than ten minutes to think about something other than the most pressing item on my to-do list, I want to propose my own solution for getting us out of the corner we have largely painted ourselves into.

Just to be clear from the outset, I am not going to propose what the content knowledge of that curriculum ought to be. I think that faculty in high school and postsecondary history departments around the world will continue to make very interesting decisions about the content of their courses and their curricula. My thinking, which I’m going to lay out in a series of posts over the next few days, is about the procedural knowledge we need to be teaching our students so that they can prosper in the information and service economy they will live in once they graduate.

Also, I should stipulate that I am specifically not proposing that we stop teaching our students analytical writing about the past or traditional research skills (e.g., how to locate and analyze primary sources). These are essential components of the history curriculum. But, as I have argued previously in this blog, they cannot be the only skills we teach, and not every course we offer needs to be based, even in part, on teaching them. There is more to success in the economy our students will live in than being able to write a really good five-page paper based on primary sources.

My proposal for additions to the history curriculum of the future can be summed up in just four words: Making, Mining, Marking, and Mashing.

In the posts that follow in this series, I will elaborate on each of these four core concepts, which I think will form essential foundations of the curriculum we ought to be developing in the coming years. Yes, students will still be required to find and analyze primary sources, to form arguments, and to place those arguments (and the sources they find) into a larger conversation among scholars. But those skills alone will position our students ideally for the economy of 1993, not the economy of 2013, much less 2023. If we want to be true to ourselves as educators and true to our students’ needs and expectations, we need to admit that the skills we have been teaching them since the late 1890s are no longer sufficient preparation for the world those students will live in once they graduate.

You may not agree with me on the Four Ms of the future history curriculum, and if you don’t, I hope you’ll express that disagreement in very specific terms here or elsewhere (and then link back here). But the very fact that we have been teaching history much the same way for more than 100 years is, in and of itself, worth reflecting on. The world has changed an awful lot in the last century, and the fact that our teaching has changed so little over that span should give all of us pause.

So, read on as this series of posts unfolds, think about what I’m proposing, and let me know what you think. If you are at the American Historical Association annual meeting or at THATCampAHA in New Orleans, by all means track me down and tell me what you think in person. I also strongly suggest reading the Top Ed Tech Trends of 2012 by Audrey Watters of HackEducation. Much of my thinking about the history curriculum ten years hence has been influenced by Audrey’s writing about educational technology.


History on Thin Ice?

In his Opinionator blog at the New York Times yesterday, Timothy Egan argues that “history, the formal teaching and telling of it, has never been more troubled.” According to Egan, the twin forces of educators caving in to corporate demands to phase out the liberal arts and what he calls the “circular firing squad of academics who loathe popular histories” have teamed up to push history to the edge of irrelevance.

My own view is that, while Egan’s essay is heavy on hyperbole, he’s more than a little correct–just not for the reasons he cites.

I share Egan’s view that the teaching and learning of history is in trouble, but not because, as he writes, “Too many history books are boring, badly written and jargon-weighted with politically correct nonsense.” To be sure, much of academic history writing these days is all of those things, and many of my colleagues share a strong prejudice against anything written for a broader market. For instance, a number of my colleagues here at George Mason recently criticized my forthcoming book Teaching History in the Digital Age (Michigan, March 2013) as being “under theorized.” I certainly could have written a more heavily “theorized” book, but doing so would have limited its appeal to the small number of academic historians who see theory as the marker of excellence. For good or ill, I chose instead to write for a much larger audience. This is not a new debate. See, for instance, my coverage of Barbara Weinstein’s commentary on this same topic more than five years ago.

But, as impenetrable as it can sometimes be, I don’t think overspecialized academic writing is the real problem. In fact, I think it is an overly convenient straw man. Instead, I think history is in trouble for two reasons: bad teaching and flawed curricular design.

First, the teaching. It’s not news that the vast majority of history classes in high school and at the post-secondary level are taught primarily through lecture, with a smattering of discussion thrown in just to keep things lively (or sort of lively). It’s also not news, or at least it shouldn’t be, that research in cognitive science demonstrates quite conclusively that lecturing is the worst form of teaching, at least if learning is the goal. And, for what it’s worth, historians have been writing about how ineffective lecturing is as a mode of instruction in the history classroom since 1897. Yes, 1897.

While students in other disciplines are engaging in more and more active learning in their courses, solving problems, moving around, making things in the analog and online worlds, and negotiating their way through group projects, the vast majority of history students sit still, listen, and take notes. If history teachers, at whatever level, continue to cling to the lecture as the primary mode of instruction, our field will become more irrelevant with each passing year.

And then there is the curriculum. Around the United States, history curricula are depressingly similar. Almost anywhere a student might choose to enroll, he or she will almost certainly find requirements that include the following: a few introductory surveys, upper-level distribution requirements that almost always divide the past into some version of American, European, and non-Western history, a methods course, and a capstone research seminar. To give some credence to my contention, I selected four history departments at random (plus George Mason); here are links to their requirements: Boston College, University of Missouri, Denison University, UC Irvine. There is almost no variation in the requirements from department to department, and I am quite certain that any random sample you generated would produce the same results.

In a recent paper (Trends Toward Global Excellence in Undergraduate Education), Marijk van der Wende of Amsterdam University College argues that “leaders of the future will have to work together across the boundaries of nationalities, cultures, and disciplines, in order to be successful in the globally engaged and culturally diverse society of the 21st century.” Look through the degree requirements I linked to above and you’ll find not one hint of interdisciplinarity, or of providing history majors with the knowledge and skills they will need to succeed in the globalized and increasingly digitized knowledge economy they will enter after graduation. Given the very parochial, very siloed approach to education that typifies the university history degree, it’s no wonder that students are bored.

And they aren’t just bored. They’re voting with their feet. According to the recently published Digest of Education Statistics, enrollments in bachelor’s programs in history have grown by 5.6% since 2001, compared with growth of almost 10% in all other social science bachelor’s programs over the same period. A growth rate roughly half that of the other social science disciplines should be cause for significant concern.

The way out of the box we’ve put ourselves in is actually pretty simple. First, dump the lecture as the primary mode of instruction. So many other disciplines have managed this trick that for historians to say we just can’t is disingenuous at best, ridiculous at worst. It’s just not that hard to teach without lecturing. Second, take seriously the notion that our curricula are ideally positioned for 1973, not 2013. Rewriting curricula is much more difficult than dumping the lecture model of teaching, because there is a lot of administrative overhead (curriculum committees, catalog copy, etc.) that has to be dealt with, not to mention good old-fashioned inertia. But rewrite the curriculum we must if we are going to do right by our students.

If we don’t make these changes, then Timothy Egan will be right about our field being in a world of trouble.

Shaving Years Off the PhD in History

For years historians have wrung their hands over how long it takes our doctoral students to complete their PhDs. Six years? Seven? Eight? More? In fact, a 2008 report by the American Historical Association indicates that eight years is the average, with a range of 4 to 11 years to complete a PhD in history.

The longer it takes our students, the more expensive it is for them (and for us), in particular because every additional year in school is a year of forgone post-graduation income. Most of the solutions I’ve heard revolve around offering students more funding so they can spend more time on their studies and dissertations. It is interesting to note, however, that program size seems to matter more to time-to-degree than funding does, as students in small programs seem to complete their degrees in much less time.

I spent a fair amount of time last week in Switzerland chatting with PhD students there. If you are familiar with the typical European PhD program, you’ll know that PhD students on the Swiss side of the pond take no, or almost no, classes. They enroll in their doctoral programs and, as one student told me last week, begin “making a PhD.” In other words, they start on their dissertations right away, which means that they are generally done in three or four years.

My view is that both versions are problematic. Our students spend too much time on their degrees, while European students don’t have the opportunities ours have to deepen their knowledge of a topic or to develop knowledge of more than one subject area via minor fields; and because they aren’t spending time in class with fellow students, they often lack a community of practice–or so several have told me over the past year.

Given these issues, I have a modest proposal for changing the PhD degree–a proposal that puts the onus on us rather than on our students or the administration. Assuming they come to us with an MA in history, doctoral students could follow a curriculum that includes:

Years 1-2
12 credits of course work
6 credits of advanced reading
Qualifying exams

Years 3-5
Dissertation research and writing

Students who followed such a curriculum would have the benefit of studying two specific areas of history–say, a 12-credit major field and a 6-credit minor field–as compared to European students, who launch right into the dissertation. These same students would also have had the opportunity to begin building a community of fellow students, one that could lead to such things as writing groups as their careers progress.

If we are honest with ourselves and our students, three years is certainly enough time to research and write a dissertation. Too often we either load them up with expectations that can only be satisfied by spending four, five, or even more years on the dissertation, or we allow them to work on topics more suited for monographs than for dissertations (or simply allow them to dawdle).

It’s possible to imagine fully funding students for four or possibly five years in such a degree program, especially if they spend (no more than) one year working as a teaching assistant to gather some useful classroom experience.

I realize that it’s unlikely that any PhD program out there will willingly shave credits off its program, if only because of the revenue losses that would result. In the case of our program here at George Mason, such a proposal would mean the loss of at least 6, and probably 12 or more, credits sold to each doctoral student.

There are many difficulties with such a proposal, not least of which is the willingness of external accrediting agencies to accept a doctoral degree that includes fewer credits. Nevertheless, I think a discussion of such a modified degree path is well worth having.

NB: There was an article in Perspectives back in October that dealt with some of these same issues.

Productivity in Higher Education

Lately I’ve been thinking a lot about the issue of productivity in higher education. There are many ways to measure what we accomplish: numbers of graduates, the kinds of jobs our graduates get, research dollars, patents received, research productivity (publishing), to name some of the most obvious. But any discussion of such factors leads immediately to the budget, to what the marginal cost of any of these things is, and to whether that marginal cost is increasing or decreasing.

This question of marginal cost is one I had to think about a lot last year, because part of my brief as an Associate Dean was to monitor enrollment in the many hundreds of course sections in our College to make sure that each section had sufficient numbers of students given how much it was costing us to offer that course. Was the class being taught by a tenure-line faculty member (cost = x), by a term faculty member, i.e., non-tenure-track but full-time (cost = x - 1.5), or by an adjunct faculty member (cost = x - 4)? These formulas are made up and arbitrary, but they approximate the kind of decision making we had to engage in. Considerations of marginal cost per course had to be balanced against student needs, e.g., a particular capstone course had to be taught so some students could graduate. But as the start of a new semester approached, we had to think carefully about which sections to cancel, and the marginal cost per section was an important consideration–one of many, to be sure, but an important one.
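To make that arithmetic concrete, here is a minimal sketch, in Python, of the kind of section-triage calculation described above. Everything in it is hypothetical: the cost constants simply mirror the admittedly made-up formulas in the previous paragraph, and the course names, enrollments, and cost threshold are invented for illustration.

```python
# A minimal sketch of the section-triage arithmetic described above.
# The cost constants mirror the post's admittedly made-up formulas
# (x, x - 1.5, x - 4); real figures would come from a budget office.

X = 10.0  # hypothetical base cost (arbitrary units) of a tenure-line section

SECTION_COST = {
    "tenure-line": X,        # cost = x
    "term": X - 1.5,         # full-time, non-tenure-track: cost = x - 1.5
    "adjunct": X - 4,        # cost = x - 4
}

def cost_per_student(instructor_type: str, enrollment: int) -> float:
    """Marginal cost per enrolled student for one course section."""
    return SECTION_COST[instructor_type] / enrollment

# Hypothetical sections: flag under-enrolled ones whose per-student cost
# exceeds a threshold, while exempting courses students need to graduate.
sections = [
    {"name": "HIST 100-001", "type": "adjunct",     "enrolled": 45, "required": False},
    {"name": "HIST 499-002", "type": "tenure-line", "enrolled": 6,  "required": True},
    {"name": "HIST 325-001", "type": "term",        "enrolled": 9,  "required": False},
]

THRESHOLD = 0.75  # arbitrary per-student cost ceiling

for s in sections:
    per_student = cost_per_student(s["type"], s["enrolled"])
    if per_student > THRESHOLD and not s["required"]:
        print(f"Review {s['name']}: {per_student:.2f} per student")
```

The point of the sketch is only that the calculation itself is mechanical; the hard part, as noted above, is weighing its output against student needs, such as the capstone a handful of students must take to graduate.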

As education budgets and endowments imploded across the United States during what some are now calling the Great Recession, plenty of people asked whether efficiencies could be found that would allow us to lower the marginal cost per course. The obvious solution is to shrink the number of sections and expand the size of each one. There are well-known educational downsides to fewer and larger course sections, but such has been the state of many university budgets that bigger classes are a necessity.

Here at George Mason I sometimes hear the argument that the solution to all our budget woes lies in ever greater adoption of technology as a course content delivery vehicle. After all, the University of Phoenix is hugely profitable, and most of what it offers is online. Why can’t we just follow that model, reduce the cost per student taught, and possibly even teach many more students with our existing faculty? After all, we have run out of classroom space but need to teach more students to keep bringing in more money. I think this argument is flawed for two reasons. First, the University of Phoenix, Walden University, and others have already beaten us to that punch, and anything we tried to do on a large scale would not measure up to what they do. Second, a shift to substantially online teaching would require both significant retraining of faculty and investment in technology infrastructure that we don’t currently have. Faculty would resist such retraining, and right now there is almost no money for investment in infrastructure.

So where does that leave us? Should we, as the Chancellor and Vice Chancellor of UC Berkeley have argued, make a massive federal investment in a small number of top-flight universities and thereby beggar smaller or less well-known institutions? Or should we demand an ever greater share of the federal budget for education, but for all sectors of education? As a former Provost of the University of Southern California has pointed out, there simply isn’t enough money in the federal budget for higher education to grab a bigger slice. And if it’s true that higher education is in the midst of a bubble economy, we are in serious trouble, because the only way industries recover from the popping of a bubble is through massive restructuring.

If there were clear and obvious answers to the linked questions of funding and productivity, we would have found them already. Whether we would have embraced them is another story, but at least we would know what we were turning down. Instead, colleges and universities keep muddling along, hoping that the changes we need to make won’t really be necessary after all. Some institutions are willing to try out new ideas such as the outsourcing of grading. Our students are already changing the old higher education business model by renting textbooks, and others are saving thousands of dollars by taking courses from companies such as Straighterline.com.

If we accept the idea that the business model of higher education has to change, then I think we need to take some very, very hard looks at how we measure productivity in our industry. In my next post in this thread, I’ll outline some of the ways higher education may be able to make much greater use of technology without (a) having to engage in massive investment in infrastructure, (b) retraining faculty, or (c) diving into the online degree ocean while having failed to take swimming lessons.