
The Future of History

In the December 2013 issue of Perspectives, AHA Executive Director Jim Grossman wrote a very interesting essay on the future of history education in America: “Disrupting the Disruptors.” I couldn’t agree more with Grossman’s premise that higher education is a public good and deserves to be treated that way. Alas, as a recent report by the GAO makes clear, all across the country legislatures are inexorably defunding public higher education. And there is no reason to believe this reality is going to change.

In his essay Grossman also makes a strong pitch for the value of a traditional liberal arts education in the face of the disruptions in the higher education business model brought to us courtesy of those who would “unbundle” the degree. I too am a passionate defender of the value of a liberal arts education. I think that as a nation we are making a big mistake if we turn our backs on the value of the liberal arts to our economy, our political and social system, and to our citizens.

Where I have to part company with Grossman, however, is with his argument that an unbundled degree is “a narrow and often isolated experience compared to the liberal education that is available in the hundreds of institutions across the nation that offer curricula, rather than courses.” Alas, that ship has already sailed.

For one thing, history departments all across the country essentially unbundled their degrees decades ago. Last year I did a quick and dirty study of history major requirements at a random sample of institutions — large, small, public, private — and what I found is that history majors look much the same everywhere. They are, by and large, baskets of courses that students select from, with the only thing approaching a “curriculum” being a requirement for a methods seminar or capstone seminar experience. Otherwise, it’s pick your courses, add up your credits, and get your degree.

For another, the view of liberal education as “bundled”, meaning students take all their courses at the same institution, is hopelessly nostalgic. Only a tiny number of students in the United States follow this path, and even those who do increasingly arrive on our campuses having skipped substantial numbers of our courses courtesy of the AP/IB courses they took in high school.

And finally, even if the disruptors attempting to eat our lunch with their new and more flexible approaches to course delivery fail, the rising cost of tuition at BA-granting institutions, coupled with the truly excellent teaching happening at our country’s community colleges, is driving more and more students every year to complete some or all of their first two years of college at one of those community colleges.

Using my own, putatively low-cost, institution as an example, tuition alone for a full-time student in the spring 2015 semester is just over $5,000 for an in-state student and a whisker under $15,000 for an out-of-state student. That means that before housing, books, meals, parking, and all the various fees we charge them, a full-time history major will pay George Mason $40,000 in tuition over eight semesters if she is an in-state student and $120,000 if she is an out-of-state student. Just tuition. Our office of admissions estimates that four years here for an in-state student will cost around $90,000, while out-of-state students will pay around $170,000.
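For readers who want the arithmetic spelled out, here is a minimal sketch in Python. The per-semester figures are the ones quoted above; the assumption that a full-time student pays for eight semesters is mine, not something the admissions office publishes in this form.

SEMESTERS = 8
# Approximate spring 2015 tuition per semester, as quoted above
TUITION_PER_SEMESTER = {"in-state": 5_000, "out-of-state": 15_000}

for residency, per_semester in TUITION_PER_SEMESTER.items():
    # Tuition only: fees, housing, books, meals, and parking are all on top of this
    print(f"{residency}: ~${per_semester * SEMESTERS:,} in tuition alone over {SEMESTERS} semesters")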

Our local community college, Northern Virginia Community College (NOVA), charges in-state students less than half what we charge, and out-of-state students around 25 percent of what we charge. Given the excellent teaching that happens at NOVA and these cost differentials, it’s no surprise that almost half of our undergraduate students come to us as transfers. And it will be no surprise a decade from now when something like two-thirds of our students follow this same path to our campus.

What does all this mean for History? It means that our departments are going to get smaller and our graduate programs, largely financed through the large enrollments in our general education courses, are in danger of running out of funding. Fewer faculty, graduate programs downsized or dropped altogether — that sounds like a calamity to us.

But to our students? Probably not.

What they want is a quality education that prepares them for life and for work after college. And if we are asking them to spend somewhere between $90,000 and $170,000 for a degree, it seems to me they have every right to this expectation. How they get that quality education that prepares them for a successful life and a successful career matters much less to them than the results do.

Fortunately, we don’t have to sit back and accept that market forces are destiny. But to change our fate, we have to change. For example, why not guarantee every history major an internship? Some institutions, such as our Virginia colleagues at Longwood University, do just that. Why not create some history courses that are more directly employment focused — such as training in digital archiving (a growth industry)? Why not develop a version of the major that is built around service learning, or environmental sustainability, or global engagement, or public policy?

Or, we can just keep doing what we’re doing now — offering lots of interesting courses that students can pick from, cafeteria style, with a smattering of required seminars — and hope for the best. Maybe that will work.

 

Getting History in Tune

Over the past year or so the American Historical Association has been working on what they call the “Tuning Project”. For those who are not members of the Association, the April 2013 edition of Perspectives included an entire forum on the project. Now the AHA has issued a new (pdf) document detailing the current state of the Tuning Project’s work on what they are calling the Discipline Core: “a statement of the central habits of mind, skills, and understanding that students achieve when they major in history.”

There is much to like in this document, which, as Julia Brookins of the AHA writes, is intended to foster dialogue among history educators, students, the general public, and others interested in how history is (and isn’t) taught. If I were starting a history major from scratch, this document would be one of the source documents I would use with my colleagues as a basis for our conversation about what we ought to be teaching (competencies and skills, not content) to and with our students. And because I’m teaching historical methods this coming semester, I plan to revisit my syllabus to see how well my assignments align with the core competencies laid out in the Tuning document.

In the spirit of Brookins’ call for conversation, I would also say that I found the document surprisingly disappointing in a couple of important ways. The first is that the document seems to be focused primarily on undergraduate education. As someone who teaches both graduate and undergraduate students, and who also spends a lot of time working with K-12 history educators, I was hoping to see a bit more conversation on the trajectory of history education from the earliest grades through the terminal degree. Because the authors of the document speak to a desire for a broader conversation, I think that more of that conversation would be likely if all phases of history education were a part of the report.

A second critique I would offer is that the document just doesn’t seem very forward-looking. While the authors have done a very nice job of capturing what is common to historical study as it is right now at most colleges and universities, there is no sense of future possibility here. A reader coming to this document for the first time will have to be excused for concluding that what history students do is read, research, and write. What about the making of historical things — websites, digital archives, digital stories, re-created artifacts, museum exhibits (virtual/analog), and all the other ways that history students are beginning to use new media and other tools to make history in new and different ways?

The report does mention the creation of a website/blog/e-portfolio toward the end, but that is really the only mention of the digital world history students live in, other than saying that students should be able to locate appropriate materials online as well as in libraries. Those two statements about the digital world our students inhabit just strike me as not nearly enough. For instance, shouldn’t history majors learn to apply their critical thinking skills to databases — not as tools for locating sources, but as resources that have historical arguments all their own? Shouldn’t history majors learn how to source digital sources (digital forensics)? Shouldn’t they learn to think critically about what the maker movement might have to say to historical scholarship? Or about what it means for historical information to be open source? At what point in the trajectory of historical study should students begin learning to work with big data? These are altogether different and yet very pressing issues in history education, and they are largely missing from the Tuning document.

I would also like to see a much richer conversation about the ethics of historical research and production. Too often our conversations with our students about ethics come down to a series of admonitions about plagiarism in the first week of the semester, and that’s that. Before the Library of Congress went offline last week, I did a search for books on the ethics of the historical profession and found exactly three. Three. I think we need to find new ways to spread the conversation about ethics across our curricula, and so if I were editing this document, I’d include more on ethics.

 

Teaching Digital History: Beyond Tech Support

I taught my first “digital humanities” course in the spring of 1998 when I was a visiting assistant professor at Grinnell College. My students created a “virtual archive” of primary sources, building a website that made it easy (in 1998 terms) to access the sources they placed in the archive. They wrestled with such things as metadata, whether or not to post the sources in both English and the original language, user interface, and website design issues. While they liked the class, that group of pioneering students found their lack of technical knowledge – when it came to such things as website design and information architecture – to be very frustrating and inhibiting.

Fifteen years later, not much has changed.

Sure, the technology has changed a lot, and there are many tools that have lowered the barrier to entry for students to start building digital humanities projects. But the challenges I faced in 1998 are, in many ways, the same challenges I face today. Every course I teach that has a digital humanities component requires me to spend a significant amount of time getting the class up to speed with the technologies they need to use so they can create whatever it is that either I’ve assigned or they’ve determined they ought to create.

I find that I am doing just as much tech support in 2013 as I did in 1998, and all that time devoted to tech support detracts substantially from the final results my students achieve. We just don’t get to spend enough time on the important and interesting historical and humanities issues that are central to the course. And my students are often just as frustrated by this problem as I am, if not more so.

There are plenty of reasons why many undergraduate students come into our digital humanities classes ill prepared to do the work we expect. Despite their facility with technology when it comes to making connections with others or locating that video/song/story/picture/meme they are interested in, they are often very inexperienced with digital work beyond the creation of a slideware presentation.

One solution would be to urge our colleagues to add a digital “making” course to the general education curriculum. But doing that means either adding one more course to often overly burdensome general education requirements, or deleting some other course, with all the controversy such a change to the general education requirements can cause on our campuses.

Another possible solution, and the one I plan to start advocating, is to try to break free from the 14-week semester or 10-week quarter when we teach the digital humanities. The semester/quarter, it turns out, is just not enough time to do sophisticated work in this emerging field. My proposed solution is a new digital history “course” that will extend over multiple semesters, giving students the opportunity to enroll for one, two, three, or even four semesters as they work together to realize a much larger and more sophisticated group project than is possible in just 14 weeks.

The idea I have in mind lives somewhere between a standard course and an internship and so for lack of a better term, I’m calling it a workshop. We have no such name or classification in our catalog, so I’ll end up having to call it a course, unless I can get away with calling it a lab, which is actually much closer to the reality of what I have in mind. Because I’m also very interested in learning spaces, I’m planning to use this “course” or “lab” or whatever as a way of experimenting with the intersection between public digital history and making space on a college campus.

Right now I have a draft proposal just starting to float around campus. My hope is that by the end of the summer I’ll have something acceptable enough that I can start it through the necessary approvals that will then lead to a roll out of the course in the fall of 2014. Once I get some feedback on version 0.1, I’ll post it here for further public comment. In the meantime, I’d love to hear from people who have been teaching digital humanities to undergraduates – what has worked, what hasn’t?

The History Curriculum in 2023 (Mining)

When I was a freshman in college one of the first history classes I took included a tour of the university’s main library and an introduction to its vast card catalog, the like of which none of us had ever seen. Our professor patiently explained the arcana of the Library of Congress subject heading system, showed us how a work might turn up in the catalog either by title, author, or subject heading, and then sent us off on a scavenger hunt through the thousands of little file drawers. By the end of our class period, each of us had the beginnings of a bibliography on the subject of our course.

That first foray into the world of real historical research was fun, overwhelming, and educational all at the same time. But it was also limited to secondary sources, and only to those works available in the university library.

How the history student’s world has changed.

Today our students have access to primary and secondary sources beyond count: quite literally tens of millions of primary sources and an equally large and growing corpus of scanned secondary works. My professors taught me in a pedagogical world based on scarcity. Today we teach in a world dominated by abundance.

“Big data” is one of the “big ideas” of the current decade across many sectors of the information economy, and historians and other humanists have already begun working on exciting projects [see also and also] that are helping us find ways to mine emerging super massive datasets of historical information. One maturing example is the Criminal Intent project funded by the NEH’s Digging into Data program (my colleagues Fred Gibbs and Dan Cohen are central players in this project).

As exciting as the Criminal Intent project and other similar data mining efforts are, they are currently operating at a level a bit too complex for the average undergraduate. Simpler data mining tools like Google’s NGram viewer offer a more frictionless introduction to data mining concepts. For instance, I’ve written about how undergraduates might use the NGram viewer to mine millions of words from the Google book database and begin to think about what sorts of historians’ questions might then come out of such a mining exercise.
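To make that concrete, here is a minimal sketch, in Python, of the kind of exercise I have in mind once students want to step behind the NGram viewer’s web interface. The assumptions are mine, not details from the viewer: a downloaded tab-separated 1-gram file with the columns ngram, year, match_count, volume_count, and a simplified, hypothetical totals file listing total words per year; the file names below are made up.

import csv
from collections import defaultdict

def yearly_frequency(ngram_path, totals_path, term):
    """Return {year: relative frequency} for `term`, normalized by total words per year."""
    counts = defaultdict(int)
    with open(ngram_path, encoding="utf-8") as f:
        # Assumed columns: ngram, year, match_count, volume_count
        for ngram, year, match_count, _volumes in csv.reader(f, delimiter="\t"):
            if ngram.lower() == term.lower():
                counts[int(year)] += int(match_count)

    totals = {}
    with open(totals_path, encoding="utf-8") as f:
        # Hypothetical simplified totals file: one "year <tab> total_words" row per year
        for year, total_words in csv.reader(f, delimiter="\t"):
            totals[int(year)] = int(total_words)

    return {year: counts[year] / totals[year]
            for year in sorted(counts) if year in totals}

# Hypothetical usage: how often does "suffrage" appear, year by year?
if __name__ == "__main__":
    for year, freq in yearly_frequency("eng-1grams.tsv", "total_counts.tsv", "suffrage").items():
        print(year, f"{freq:.2e}")

Even a small step like this surfaces exactly the historians’ questions I mean: why normalize by total words per year at all, and what is actually in the corpus being counted?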

Right now, today, getting much beyond these basic sorts of exercises with undergraduates will be difficult. But it is useful to remember that ten years ago it was not so easy to make a web page. Before too much longer the user interfaces for mining massive data sets of historical information — especially texts and images — will be appropriate for the undergraduate curriculum. That means it is already past time for historians to be thinking about how we can incorporate data mining into the undergraduate curriculum. Some interesting graduate syllabi have begun to appear, but data mining, whether text or image mining, seems to be largely absent from the undergraduate history curriculum.

Imagine, for instance, a course that begins with the simplest tools, such as Many Eyes or the NGram viewer, helping history students to see both the strengths and weaknesses of these tools. From there the course could move on to increasingly complex forays into data mining, letting the students range further and further afield as their skills grow. Our colleagues in computer science have already developed such courses, but those courses would need to be adapted heavily to work with history students who (mostly) lack a background in programming.
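As one illustration of what an early “roll your own” step in such a course might look like (my example, not a fixed syllabus item), here is a short keyword-in-context concordance over a folder of plain-text primary sources; the folder name and search term are hypothetical.

import re
from pathlib import Path

def kwic(corpus_dir, keyword, window=5):
    """Yield (filename, left context, keyword, right context) for every hit in the corpus."""
    word_pattern = re.compile(r"\w+")
    for path in sorted(Path(corpus_dir).glob("*.txt")):
        words = word_pattern.findall(path.read_text(encoding="utf-8"))
        for i, word in enumerate(words):
            if word.lower() == keyword.lower():
                left = " ".join(words[max(0, i - window):i])
                right = " ".join(words[i + 1:i + 1 + window])
                yield path.name, left, word, right

# Hypothetical usage: where and how does "liberty" show up across a set of transcribed sources?
if __name__ == "__main__":
    for name, left, hit, right in kwic("transcribed_sources", "liberty"):
        print(f"{name}: ...{left} [{hit}] {right}...")

An exercise on this scale keeps the programming minimal while forcing exactly the interpretive questions historians care about: what counts as a word, which texts made it into the corpus, and what is lost when context is reduced to a five-word window.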

In my previous post I pointed out that incorporating “making” into the history curriculum gives us opportunities to build connections to other academic disciplines (art, engineering, graphic design). Data mining offers us similar opportunities (computer science, library science, computational sciences). The more creative we can be about building such linkages, the richer our curriculum can be and the better prepared our students will be for the world they’ll face when they graduate.

But just as important, we’ll be training a new generation of historians to work with the unimaginable wealth of historical information that a decade’s worth of scanning and marking up of texts, images, video, and sound files has made available to us all.

[Next post in this series]