Tag Archives: data

History’s Future

The March 2014 issue of Perspectives includes a very clear analysis of the most recent IPEDS data on history BAs by Allen Mikaelian. Everyone currently teaching college history or planning to do so should read this article.

Why? A quick glance at this graph should at least give one pause.

[Figure 1 from Mikaelian's article: history's share of all bachelor's degrees.] What it shows is a five-year decline in history's share of all bachelor's degrees awarded in the United States. In the data analysis business, we call this a trend. In an era of stagnant or declining funding for colleges and universities, this is a particularly bad moment for history departments to be smaller players on the enrollment stage. While the overall number of bachelor's degrees awarded in history is actually up slightly, deans, provosts, and campus accounting types all take note of a discipline's relative share of resources provided and consumed, and so graphs like this one are a real (not imagined) problem.
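The arithmetic behind that point is worth making explicit. Here is a minimal sketch with hypothetical numbers (not the actual IPEDS figures) showing how a field's absolute degree count can inch upward while its share of all degrees falls, simply because the denominator grows faster:

```python
# Hypothetical counts, purely to illustrate the arithmetic: history's
# absolute BA count rises slightly, but all degrees awarded rise faster,
# so history's *share* declines.
history_bas = {2007: 34_000, 2012: 35_000}      # hypothetical
all_bas = {2007: 1_600_000, 2012: 1_850_000}    # hypothetical

for year in sorted(history_bas):
    share = 100 * history_bas[year] / all_bas[year]
    print(f"{year}: {history_bas[year]:,} history BAs = {share:.2f}% of all degrees")
```

With these invented figures, history awards 1,000 more degrees in 2012 than in 2007 yet falls from about 2.1 percent to about 1.9 percent of the total, which is exactly the pattern the graph shows.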

As I have written previously, one reason for history's relative decline as a share of overall degrees awarded is the inescapable fact that, at the undergraduate level, our discipline has a gender problem. The 2011-12 IPEDS data (the most recent available) show that 57 percent of all bachelor's degrees in the United States were awarded to women, but only 40 percent of degrees in our field went to women. That's a problem. And it's not getting better. The IPEDS data also show that history is getting whiter by the year, even as higher education as a whole is becoming more diverse.

What's new to me in Mikaelian's article is that the share of bachelor's degrees in history awarded by our most research-intensive universities (the "very high" category in the Carnegie classification) has fallen substantially over the past 25 years. In 1989, 38 percent of all bachelor's degrees in history were awarded at these universities, almost all of which have very large history departments with substantial doctoral enrollments. But in 2012, only 31 percent of bachelor's degrees in our field came from these departments. And, as Mikaelian points out, those same institutions saw their overall bachelor's degree totals drop by only three percent, so there has been a real decline in history degrees at our most research-intensive departments.

I've spent a lot of time in university administration over the past five years, and one thing I know for sure is that a measurable decline in degrees awarded is something that gets noticed, even if that decline took 25 years. There just aren't enough resources to go around anymore, and so those fields that are generating more tuition revenue are blessed with more resources, while those generating less revenue see their budgets decline. That's the inescapable reality of higher education in 2014.

What does this mean for the future of our discipline? It means that in the near term we shouldn't be surprised to see tenure lines at the most research-intensive universities being shifted away from history. Unless those faculty who remain agree to teach more undergraduates (unlikely in most cases), those large departments will either become smaller still or will begin relying on ever more contingent labor for their undergraduate teaching.

More worrisome than any possible decline of the biggest and most research-intensive history departments is the ongoing gender problem we have at the undergraduate level. If we don't start coming up with new ways of thinking about that long-standing problem, we're all in the same boat — a boat that has sprung more than a few leaks.

Future of Higher Education Conference (5)

After being forced to miss the final session yesterday (sorry Dan), I'm back at it at the #masonfuture conference. The day began with a tentative summary of Day 1 by President Cabrera, who charged the room with continuing to think about "preconceived notions we came with that we are now questioning."

One of the notions that I came with, and that I continue to question, is the degree to which universities like ours can find our way into the future with our current staffing patterns. As a recent report by Ernst & Young argues, universities in Australia cannot survive to 2025 with their current business models. A big problem, the authors argue, is that all the universities in Australia (and, I suspect, in the U.S. as well) have become too staff-heavy and will need to rebalance their staffing patterns if they are going to become more nimble in coming decades. The authors argue that faculty represent income centers while staff represent cost centers, and that unless the two are brought into balance, universities are in big trouble.

Given the clear interest in pushing online education coming out of the rhetoric of this conference, universities like Mason are going to have to take a long, hard look at how we might implement these sorts of sweeping changes without a significant addition of staff to make it possible. Online is not even remotely frictionless, and staffing the effort will be very, very expensive.

Will we add staff? Or will we repurpose existing staff? We don't have the money for the former, and if we do the latter, what things will those staff stop doing? This is a conversation we are not having at this conference, and I'll be interested to see if we start.

Our morning plenary speaker was Suzanne Walsh of the Gates Foundation who began with a question: How might we use data to make better and different decisions? A key part of her argument is that universities are not doing a good job mining data about their students to maximize institutional success. In this she promoted the work of Civitas Learning and their approach to using data to help universities to make better decisions about enrollment and retention.

For example, one of the things she talked about was using data to identify courses that promote or hinder student retention. I'm sure that some of what she described seemed "new" to some of the people in the room. Alas, what she was describing really isn't new at all. In my former life I was an enrollment management consultant, and we were doing this sort of thing in the late 1980s with our clients, using good old-fashioned mainframe computers and good old-fashioned multiple regression analysis.
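The core of the kind of analysis described here can be sketched in a few lines. This is a minimal illustration with invented records (the course names and outcomes are hypothetical), comparing raw per-course retention rates to the institution-wide baseline; a real analysis, then as now, would use regression to control for student background rather than raw rates:

```python
# Invented records, for illustration only: (course, retained_next_year)
records = [
    ("HIST101", True), ("HIST101", True), ("HIST101", False), ("HIST101", True),
    ("MATH090", False), ("MATH090", False), ("MATH090", True),
    ("ENGL101", True), ("ENGL101", True), ("ENGL101", True),
]

# Institution-wide retention rate serves as the baseline.
baseline = sum(r for _, r in records) / len(records)

# Group outcomes by course.
by_course = {}
for course, retained in records:
    by_course.setdefault(course, []).append(retained)

# Flag courses whose students are retained above or below the baseline.
for course, outcomes in sorted(by_course.items()):
    rate = sum(outcomes) / len(outcomes)
    direction = "above" if rate > baseline else "below"
    print(f"{course}: {rate:.0%} retained ({direction} the {baseline:.0%} baseline)")
```

Nothing here requires special software; the 1980s version of this work differed only in scale and in the machines it ran on.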

The only thing that seemed "new" here was the very nice visual display of those data. It's certainly much easier to see connections in the data if the connected data are all purple or red, or are scaled to point to those that seem more interesting than others. But, as Edward Tufte has been grumping for decades, these sorts of applications often violate the optimal "data-ink ratio" and obscure more data than they display.

I'm not disputing her larger point that we need to do a much better job of using data to make decisions on our campuses. And, don't get me wrong, I love a good data visualization. But we already have the data she's arguing for, and plenty of very qualified statisticians, economists, policy analysts, and others who can analyze those data in some very useful ways.

Her more useful point is that we have to be open to the use of data, as opposed to instinct, conventional wisdom, or urban legends about "what our students are like," to make decisions. So, to take my earlier example about the staff/faculty balance, I wonder what our data would tell us about the 10- or 15-year trend at Mason in that balance, and how whatever decisions we have made about it have helped or hindered us in achieving the goals we've set for ourselves.

This sort of question (and the willingness to be open to whatever the data say) is especially important at institutions like ours, not only because of the questions we face going forward from 2012, but also because we are in the midst of writing a new strategic plan for that "going forward." I agree completely with Walsh's point that we need to let the data tell us what they say, not what we want them to say, and then use those data to help us make the decisions we need to make. But we also need to remember that data are not everything. Intuition and institutional memory matter too.

 

New Data from D.C.

The U.S. government is beginning to post its vast collection of data sets online. At the moment, there are only 47 data sets posted at data.gov, and most of these are geological or weather-related. However, it won't be long (I'm told) before data of greater interest to historians begin to appear. I, for one, can't wait for census data to begin showing up on this site rather than having to rely on other, more cumbersome points of access to the census. My hope is that data.gov will eventually include not only the 2000 census, but all of the census data collected by the federal government. Talk about a treasure trove for historians!

Around the world, census authorities are posting more and more raw data, both in full and in various summary forms. At present, much of this data is the most recent information, but soon we can expect to see historical data sets as well. One thing I like about the data.gov data sets is that many are published in a variety of formats, including, for instance, Google Earth overlays. So, for example, if you want to know how many earthquakes there have been in any part of the world in the past seven days, you can download the file and take a peek. Here's a look at Alaska.

[Screenshot: data.gov earthquake data displayed as a Google Earth overlay over Alaska.]

But what if your interest was in changing patterns of infant mortality in Europe compared to levels of industrialization (say, steel production) over time? Once these data are available, enterprising historians and geographers and sociologists and economists will start to play with them, and instead of earthquakes, we'll be able to see graphical representations of the relationship between things like mortality and industrialization. Of course, this will require some rather unprecedented cooperation between social scientists who aren't used to talking to one another, but I suspect that a passion for data is something many of us share and will become a way to bridge our disciplinary divides.
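A first pass at that kind of question could be very simple. Here is a sketch with invented illustrative numbers (not real historical series): once mortality and industrialization data are available side by side, even a plain correlation coefficient, computed here from scratch, would give a starting point for the conversation.

```python
import math

# Invented numbers for illustration only: a rising industrialization proxy
# (steel output) alongside falling infant mortality.
steel_output = [1.2, 2.5, 4.1, 6.8, 9.0]       # hypothetical, million tons
infant_mortality = [180, 150, 120, 95, 80]     # hypothetical, per 1,000 births

def pearson_r(xs, ys):
    """Pearson correlation coefficient, computed from the definitions."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson_r(steel_output, infant_mortality)
print(f"r = {r:.2f}")  # strongly negative in this invented series
```

Correlation is only the crudest opening move, of course; the interesting historical work begins with explaining the relationship, not measuring it.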

Quantifying the Humanities

The rising importance of metrics for evaluation in higher education has more than a few of my friends and colleagues on edge. What will it mean, for instance, when colleges and universities see the same sorts of assessment data generated for the humanities that already exist in K-12 education? Will we see graduation exams in History or English? How does one quantify the many years spent researching and writing a book of history? How will these data be used?

While I think college faculty are right to ask probing questions about the quantification of their efforts in the classroom and in their research, I think it’s wrong-headed to assume that any and all attempts to quantify educational or scholarly endeavors are somehow an evil conspiracy to undermine our academic freedom and integrity.

For instance, read Jennifer Howard's very interesting article in the Chronicle of Higher Education from October 10, 2008 ("New Ratings of Humanities Journals Do More Than Rank — They Rankle"). For those of you without online access to the Chronicle, the story begins:

A large-scale, multinational attempt in Europe to rank humanities journals has set off a revolt. In a protest letter, some journal editors have called it “a dangerous and misguided exercise.” The project has also started a drumbeat of alarm in this country, as U.S.-based scholars begin to grasp the implications for their own work and the journals they edit.

I would submit that one implication is that academic CVs will be much easier to make sense of. This past year I was on a committee in our Center for Teaching Excellence charged with helping nominees for a state award navigate the process. My two charges were in the Psychology Department, and although I know nothing about the relative merits of various Psychology journals, I could quickly see which of their articles appeared in the more difficult-to-publish-in journals. Why? Because academic journals in Psychology publish data on the acceptance rates of articles. It was therefore obvious to me at a glance that an article published in a journal with an 11% acceptance rate was probably more notable than one in a journal with a 78% acceptance rate.

Only in the humanities have we been so resistant to any sort of quantification of results. Almost every other major disciplinary category — sciences, engineering, health sciences, social sciences — rates and ranks almost everything it does. And in many of these disciplines, college graduates are already subject to de facto graduation examinations administered by various licensing boards. So what makes the humanities so special?

Because I don’t think we are special enough to get a pass on quantification of effort, I was pleased to receive the announcement today that the Humanities Resource Center Online has gone live. A project of the American Academy of Arts and Sciences with some collaboration from organizations such as the National Endowment for the Humanities and the American Council of Learned Societies (among others), the HRC offers one-stop shopping for data on the humanities in the United States, much of it set in a global framework.

Want to know how much money was being invested in the humanities in a given year? Want to know about the academic preparation of high school history teachers? Want to know more about the participation of underrepresented groups in graduate programs in English? It’s all there. I applaud the work that has gone into this website and hope that as the years go by more and more data will be deposited there.

Why? Because I’m a historian and I believe in the value of evidence in arguments. The data on this site will make it possible to have much more informed conversations about what is happening (and just as importantly, what is not happening) in the humanities. So, for instance, when we complain that scholars in the humanities are underpaid relative to our peers in other disciplines, now we have the data to prove it. Or when we wonder why our majors seem to be less ethnically diverse than the rest of our student body, we can see how our local findings compare to national data sets.

All in all, I think the current iteration of this project is a great start and I look forward to its further elaboration in the years to come.