Tag Archives: teaching

More Than a Few Tough Things (2)

In my previous post in this series, a response to a column my colleague Steve Pearlstein wrote in the Washington Post over the weekend, I discussed some difficult choices that public universities will need to make in the future as enrollments change, legislative investment declines, and options for students proliferate. And just to be clear, I’m very specifically talking about public colleges and universities, not other higher ed institutions, while Pearlstein generalizes across the higher education spectrum.

Less research, more teaching: It’s simply not the case, as Pearlstein erroneously claims, that the vast majority of work published in the humanities and social sciences is not cited by other scholars and so has no value. As Yoni Applebaum pointed out yesterday, Pearlstein is guilty of citing bad data when he repeats this claim. We don’t accept such carelessness from our students, so we shouldn’t accept it from our professors.

But being wrong about one thing doesn’t make him wrong about everything.

I happen to think he is correct when he argues that we should “offer comparable pay and status to professors who spend most of their time teaching, reserving reduced teaching loads for professors whose research continues to have significance and impact.”

One of the questions the Ernst & Young report on Australian higher education asks is: “Can your institution maintain a strong competitive position across a range of disciplines?” [19] I would say that the answer is “no” for the vast majority of public colleges and universities in the U.S. There just isn’t enough money to go around in public higher education, and, really, how many doctoral programs in X, or MA programs in Y, or BA programs in Z, does a state higher education sector need?

But we all seem to want to offer everything to our students, leading to a lack of differentiation. The result is market confusion and, as the Bain report on U.S. higher education points out, “Who will pay $40,000 per year to go to a school that is completely undistinguished [from similar schools]?”

What’s the solution? First, as I argued in my previous post, we need to eliminate some programs and downsize others. In addition to the examples I offered earlier (including my own department, which I argue should be downsized over time), I would offer up the examples of Geology and Philosophy. According to the State Council of Higher Education in Virginia, in the 2013-14 academic year, the top 10 public colleges and universities in the state awarded 108 bachelor’s degrees in Philosophy and 126 in Geology. Students graduated with Philosophy degrees from seven different schools, and those receiving Geology degrees graduated from five.

It seems (to me, anyway) quite reasonable to ask why, in a state system, if only slightly more than 100 students per year are receiving degrees in a given discipline, it is necessary to staff up sufficiently (and allocate the physical space) to offer those degrees at five or seven different institutions. Wouldn’t it make much more sense to consolidate those degree programs and offer them at only three or perhaps four institutions? Courses in Geology and Philosophy could (and should) still be offered anywhere in the system as part of a general education curriculum, but given the general lack of differentiation from one university to another, it seems to make sense to focus our resources a bit so we can build stronger programs at fewer institutions.

In such a scenario we would then have to say to students who wanted a degree in Geology or Philosophy: “Here are your three choices in Virginia.” Would that be so wrong?

The Bain report calls this “differentiation” and the Ernst & Young report calls it becoming “niche dominators,” but the result is the same. Students who want a degree in a less popular discipline would have fewer choices, but those choices would be stronger, more diverse, and have more resources.

The second part of the answer, as Pearlstein correctly argues, is that we need a clear path to professional success–pay and status–for excellent teachers who are not productive researchers at our public colleges and universities. This is already the case at the majority of public institutions, but with each passing year, colleges and universities chase elusive rankings that revolve around research productivity by emphasizing research over teaching. Larry Cuban explained how this happened in history departments in a book published way back in 1999, and the story he told then just continues to repeat itself in a variety of disciplines across the country.

If the pathway to success at our top-ranked public colleges and universities had two lanes, a research lane and a teaching lane, that led to the same salary, benefits, and other rewards, it’s quite easy to imagine that some significant number of our colleagues would opt for the teaching lane, even if it meant teaching more classes and more students. But the reward and status structure would need to be the same, or almost no one would make this choice when they could have more reward and status in the research lane.

If, however, we got the incentives right, and reduced, eliminated, or consolidated academic programs across state systems, cost structures at our public colleges and universities would look a heck of a lot better than they do today.

Back to the Future

It’s all but impossible for me to believe it, but 10 years ago this week I wrote my first post in this blog. And, oddly enough, this post is #500. If I were a numerologist I’m sure I could make something of that symmetry.

Way back in 2005, that first post was about my attempts to teach students to be more critical consumers of historical content they found online–and in 2015, I’m still at it. While I’ve tried many different approaches to teaching this skill and habit of mind to my students (some controversial, some not), the biggest change between 2005 and 2015 is this: in 2005 I asked them to review historical websites using a rubric of my own devising; in 2015 I ask them to build websites using a rubric of their own devising.

Between then and now, I think the biggest lesson I’ve learned as a history teacher is that students learn best by doing digital history rather than by learning about digital history. I should have known this, of course, because my first true digital history courses were “doing” courses — the first was a seminar at Grinnell College in which my students built a database of historical sources and the second, at Texas Tech, was a seminar in which my students took one of my colleagues online (creating his website for him). And along the way, I’ve taught lots of other digital history courses that involve really doing digital history.

What’s different now–really since 2007–is that I’ve found ways to combine the creation of digital history (which involves a lot of teaching of technical skills) with careful consideration of the underlying principles of digital information and the underlying principles of historical thinking. Once I found that sweet spot, my students’ results improved substantially.

The tools available to do this work are so much more accessible and user friendly than they were in 2005, and I suspect that by 2025 they will be even more conducive to the kinds of deeper learning about the past that I’m after.

When I started writing this blog, Google and YouTube were still very new, and Twitter, data phones, and 3D printing didn’t exist, at least in the commercial space. No one, or almost no one, was talking about “big data” or data visualizations in the humanities. Zotero and Omeka, which I use all the time in my teaching, weren’t available, and the big thing everyone seemed to want to talk about was how to use Facebook to teach about the past (not so much a topic these days).

I’ll also be very interested to see what new challenges the tech innovators of the world can throw at us. No matter what they throw, however, I strongly suspect that in 2025 we’ll still be talking about how to teach students to be critical consumers of online historical content.

Does Playfulness Crowd Out Rigor?

If you’ve been a reader of what I’ve been writing about teaching and learning the past several years, you’ll know that I’ve been arguing that historians should make room for a more playful approach to the past in the undergraduate history curriculum.

I’ve never argued that playful teaching and learning should be the only way we pursue our goals in history education. But I do think we need to lighten up a bit and make room for courses that are not so dependent on the classic style of history teaching: the lecture or seminar that has as its primary goal the writing of one or several analytical essays and, perhaps, a final presentation to the class, with a mid-term and a final examination.

The fact is that the vast majority of undergraduate history courses taught in the United States are taught in pretty similar ways. Students have every right to be bored with the sameness of it all and, I suspect, this sameness is one of many causes of the continuing slide in history majors around the country.

Way back in 2008, I started to experiment with more playful approaches to teaching and learning. My forays into teaching students to create online historical hoaxes generated more than their fair share of commentary and controversy around the world. That course, and my more playful version of the historical methods seminar [syllabus], also generated some blowback within my own department.

When I came up for promotion to full professor in 2012-13, my departmental tenure and promotion committee (which ultimately voted against my promotion), wrote the following in their letter to the dean:

Nevertheless, members of the department are concerned that the playfulness of Kelly’s courses can crowd out rigor. Some faculty are concerned that Dead in Virginia [my methods course] was offered as a section of the required historical research methods course yet did not require students to do as much analytical writing as do other sections of that course, which is designated as writing intensive.

Because no one on the committee ever spoke to me about the course, I don’t know for certain, but I suspect that the concern about the supposed paucity of analytical writing in my version of the methods course arose from the fact that, instead of a 10-15 page essay, I required my students to write a series of database entries and the first two pages of a long essay (which I then iterated with them). My students wrote a lot, just not in the format historians are more used to: the 3-, 5-, or 10-page (or longer) essay.

The bigger and more interesting issue here is whether, by having my students write in chunks rather than in long form, I was adequately preparing them for the rigors of our capstone seminar, in which they must write a 20-plus page essay built on primary sources they acquire through their own research.

The promotion and tenure committee also criticized me, quite correctly, for not testing the claims I made in my most recent book [relevant chapter] about the success of the more playful methods course “by comparing the outcomes of Kelly’s section with those of other sections.”

Ever since reading their critique I’ve been a little worried that, in fact, I had not adequately prepared my students for the rigors of the capstone seminar. So, I decided to do what I should have done all along — compare my students’ outcomes with those of other sections of our methods seminar.

To get at that information, I asked the registrar’s office to pull student data from all the sections of our methods course offered in the semesters when I used my more playful syllabus (spring 2011, spring 2013). I compared student grades in the methods seminar (HIST 300 here at George Mason) to the grades those same students received in their capstone seminar (HIST 499). I did not teach the capstone seminar to any of these students.

Here’s what I learned.

  1. My students outperformed the students who enrolled in other sections. Students who took my section of the methods course in 2011 earned an average grade of 88.47 in the capstone seminar. Four of my colleagues taught methods that same semester, and their students’ average grade in the capstone seminar was 88.28. In the spring 2013 semester, 64 students took methods (22 of them in my section). The average capstone grade of the students who took the course from someone else was 88.06, while the average for the students who took the course from me was 90.15.
  2. More of my students have completed the capstone seminar. Only 80% of the students in the other 2011 sections ever went on to take the capstone seminar, while 100% of mine have done so. Given the slow pace of some of our students, it’s likely that more students from the 2013 sections will take the capstone in the coming year. As of now, though, 69% of the students who took the methods course from someone else in spring 2013 have taken the capstone seminar, while 80% of my students have done so.
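For anyone who wants to run a similar comparison on their own course data, the two numbers above (average capstone grade and capstone completion rate per methods section) reduce to a few lines of code. The records, field names, and grade values below are made up for illustration; the actual analysis was done on registrar data not reproduced here.

```python
from statistics import mean

# Hypothetical registrar export: one record per student who took the
# methods course, tagged by which section they took, with their capstone
# grade (None if the student has not yet taken the capstone).
# All names and numbers here are illustrative, not the real data.
records = [
    {"section": "mine", "capstone_grade": 92.0},
    {"section": "mine", "capstone_grade": 88.3},
    {"section": "other", "capstone_grade": 87.5},
    {"section": "other", "capstone_grade": None},  # never took the capstone
]

def summarize(records, section):
    """Return (average capstone grade, capstone completion rate) for one group."""
    group = [r for r in records if r["section"] == section]
    completed = [r["capstone_grade"] for r in group if r["capstone_grade"] is not None]
    avg = mean(completed) if completed else None
    rate = len(completed) / len(group)
    return avg, rate

print(summarize(records, "mine"))
print(summarize(records, "other"))
```

Reporting the completion rate alongside the average matters: a section whose weaker students never reach the capstone would show an inflated average, which is why both figures appear in the comparison above.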

I will admit to being much relieved that the students who took the methods course from me did not suffer from having taken a more playful version of historical methods in which they wrote database entries rather than a long essay. In fact, quite the opposite happened. They did just fine.

While I’m relieved, I’m also a little peeved with myself for letting the criticism I got during my promotion year convince me to go back to teaching methods the more traditional way. I’m teaching the course again this fall, and while I can’t ditch that more traditional syllabus entirely for the more playful one, I will certainly ditch the 10-15 page paper in favor of shorter, more iterative writing assignments.

And, like a zombie, Dead in Virginia will rise again…

Improving the Past

This semester I’m offering a new course, Improving the Past [syllabus], that is another attempt on my part to capitalize on what we’ve learned from recent research about how young people use digital media. Last year I wrote a series of posts I called The History Curriculum in 2023 in which I argued that within a decade we should be focusing our teaching around four key areas of skill: making, mining, marking, and mashing. Improving the Past takes on the first and last of these skills.

Last year my department decided that I couldn’t teach my admittedly controversial course, Lying About the Past, in its full form and I chose not to teach it in the version our undergraduate committee proposed, one that would limit my students’ creative endeavor to the confines of our classroom. Because that course had generated so much student enthusiasm, I started thinking about ways to capture that enthusiasm that would also be acceptable to my colleagues. A close friend and former George Mason colleague helped me clarify my thinking on this and had several fantastic suggestions, one of which morphed into the current course.

The basic premise underlying the course is that there is a long history of attempts to “improve” the past, whether it was the sudden disappearance of Trotsky from the history of the Soviet Union, or a more recent claim by a Virginia textbook writer that thousands of slaves took up arms in the Civil War to defend the institution that held them in bondage. And then there are those faked Civil War photographs like the one provided here. Of course, this history of improvement extends all the way to the origins of our profession when, for instance, Thucydides put words into the mouths of his subjects in his history of the Peloponnesian War. At least Thucydides was up front about his improving of the past.

Given this long history of improvement of the past — whether with good intent or bad — it seemed important to me that students, whether history majors or not, learn to think critically not only about why the past is being improved, but about how. How is information altered and woven into compelling new narratives? What role does technology play in both the alteration and the dissemination of such knowledge? How can technological tools help us ferret out distortions of the historical record?

One of the most important takeaways for me as an educator from my experiences with Lying About the Past is that my students learned best when they were making a hoax out of the available (mostly true) historical facts. As a result, Improving the Past is built around making and mashing. In addition to studying the many ways the past has been improved, my students will do some of their own improving. They will select historical texts, images, and maps that they will then alter, preferably subtly, to create a new and improved narrative about the past. Then they will write about why they made the choices they made, how the new narrative might change our understanding of the past, how an improved past might be easier to teach, and what they learned from their experiences.

A glance at the syllabus will show that I’m placing a big premium on collaborative work in the course. There are two reasons for that emphasis. The first is that the work I’m asking them to do is difficult and each student will come to class with a different level of experience with history and with technology. The more they can pool their intellectual resources, the more they’ll get out of the class. The second is that I’m emphasizing the lesson that historical work is heavily collaborative, especially in these days of digital scholarship, and so I want to drive home the idea that by working together they are mirroring what, increasingly, we do in our own work. And lest anyone be concerned, my students’ “improvements” of the past will not be released to the Internet.

I am fortunate that the university has just opened two new active learning classrooms and I was able to grab one for this course (see below). I have not had the good fortune to teach in such a space before, so I’m looking forward to monitoring the ways the classroom design does (or doesn’t) facilitate the kind of work I’m expecting from my students. Given what I’ve written recently about spaces for history teaching and learning, I’m excited to be in such a new and different room. Notice, for instance, the wraparound whiteboards and the lack of an obvious “front” to the room.

[Photo of the active learning classroom, RobB106]

Needless to say, I’m looking forward to the class. I’ll report back later in the semester on whether it’s working or not.