Yesterday I started a new thread on the undergraduate history curriculum. In the first post of this thread I posed a question about requirements in the major and what those requirements ought to look like. I also suggested that history majors themselves ought to have some input into the design of the major, given that it has a much greater influence on their lives than on ours.
Today I want to raise another important issue about the undergraduate history curriculum. One thing we know for sure about our undergraduate students is that they already live in a world pervaded by digital media. We also know from our own experiences that more and more historical content, historical analysis, and popular uses of history are appearing online. At some point in the not-too-distant future (once Google has been scanning books for a few more years), there will be more historical information online than there is even in the Library of Congress.
So I asked myself, what are history departments around the country doing to teach their majors digital skills?
Alas, the answer is apparently “nothing.”
To be sure, many faculty members around the country incorporate digital skills into their courses (all of my courses require students to blog and use wikis, for instance), so when I say “nothing,” what I really mean is that I could not find any evidence that history departments around the English-speaking world are requiring these skills. I searched the Internet using both Google and our own Syllabus Finder (a database of just under 915,000 syllabi, in all disciplines, that have been posted online), and here’s what I found.
Not even George Mason, where we mandate the learning of these skills in our PhD program, has an undergraduate digital media course in its catalog. My colleague Paula Petrik has offered a couple of sections of such a course under one of our course numbers that allows faculty wide latitude in the scope and content of the course, but we do not include a digital history course as either a requirement or a numbered course in the catalog. With all the digital historians here at the Center for History and New Media, if we aren’t doing it, I suppose it is no surprise that no one else is either.
To be fair, I did find two courses that would qualify as the kind of digital media history course that I am talking about. But I would submit that two courses (Representing the Real: Documenting U.S. History and History and Digital Media) are a pretty pitiful return on an hour’s worth of searching. I’m sure there are others out there; they just seem to be fairly well-hidden.
Where do we go from here? Given how important digital skills will be to our majors (whether they become historians, attorneys, schoolteachers, or anything else) and how central the digital world is becoming to the discipline of history, it is time for us to take a step back and examine our undergraduate curricula with these issues in mind. How do the learning objectives in our courses lead our majors to develop some reasonable set of digital skills? Who in our departments can teach a course in digital history, and if the answer is no one, whom can we support in a quest to develop that expertise? How should digital experience play a role in our hiring of new faculty? All of these are important questions that I hope will become part of the discussion of undergraduate history education at the place where the rubber meets the road: in our decisions about which courses make it into the catalog and which don’t.
You ask the best questions! Digital skills are essential, but I would hazard a guess that most faculty who have been out of grad school for a decade are largely self-taught (if they have digital skills at all). A shift such as the one you suggest is going to produce two sources of cognitive dissonance: first, if faculty acknowledge that these skills are important, why have they not learned them ‘properly’ (and for academics, that means some kind of professional training and recognition by the Powers That Be that such training is valued)? And second, such recognition cannot help but radically challenge their own teaching and research paradigms.
There are many, many historians in academia who regard teaching as an unwelcome distraction. Accountability, represented by the increasing emphasis on assessment, is resented and resisted. If historians and the profession itself continue to regard their primary function as research and teaching as a necessary evil, innovations such as assessable learning objectives, redefining programs to suit student needs, and integrating digital skills training are going to be met with hostility and dismissed as the ramblings of “bean-counters” and “education types.”
Interesting question, I’m glad you asked it. A lot lies behind the digital documents (scanned or born digital) that you find on a website such as those maintained by the National Archives and Records Administration (NARA) or a private-sector entity.
If your goal is to help students who may work in fields other than history, you might want to talk about the technical, legal, and ethical challenges they may face in dealing with electronic records. What is kept, what is disclosed, how metadata is preserved, and what happens to historical information that is born digital are all issues that many of your students will face. Some work environments will be friendlier to history than others.
I follow some of these issues by looking in on a records management listserv. Not every business entity treats records the same way. The continued existence of a business may depend on what happens to its records. In some companies, the existence of historical records might be viewed more as a business liability than as a business asset. How records are handled after creation often depends on how risks are assessed within a regulatory environment. This isn’t always easy and may confront corporate employees, some of whom majored in history, with difficult choices. (Lawyers are not the only ones who will face such issues.)
Looking back on my own undergraduate and graduate studies in history (which started in the late 1960s and lasted through the 1970s), I wish there had been a cross-cutting course available that provided some non-academic context on how people create, use or view knowledge and data. Given your goal, I believe that history majors these days would benefit from a discussion of data reliability, information management, law and business ethics.
I ended up working as an employee of the National Archives and Records Administration (NARA) with the White House records of Richard Nixon. I started my job there while I still was in grad school. Nothing in college prepared me for how something we historians love, a paper trail, may be regarded by a power player, or for how a former President may fight archivists’ efforts to apply legislative intent. (Congress had called for release of “the full truth” about Watergate as well as information of “general historical significance.”) I’ll never regret going to work at NARA. But on-the-job training in some of this, through tangling with a former President’s lawyers in the ways my colleagues and I did, can be brutal.
If your students go to work in a governmental setting, they may have to apply the Federal Records Act, the Presidential Records Act, or state laws and municipal regulations. I’ve never worked in academe but I imagine it is easier to address these public sector issues at a state university. For compliance reasons, a state university probably has a good records management and archiving system in place for its electronic and paper-based records.
I’ve always worked in the public sector. From what I’ve read, things can get complicated in the private sector. When the Wall Street Journal published an article on the value of corporate history in 1987, it quoted a Harvard business history professor who warned, “Lawyers are the enemies of history.” (“In Wake of Cost Cuts, Many Firms Sweep Their History Out the Door,” Wall Street Journal, December 21, 1987)
A corporate, governmental, or academic organization that values knowledge and views records retention as low-risk may provide a comfortable fit for a history major. It may be harder for such a person to work in an environment where the existence of records (beyond those required by Sarbanes-Oxley and other laws or regulations) may be seen as a corporate or political liability. (Think of any news story you’ve read about documents being shredded improperly, sometimes despite regulatory prohibitions.) I can’t stress strongly enough that students need to consider what environment best suits not just their financial goals but also their temperament and system of values, and to apply for jobs accordingly.
In a knowledge-based institution that faces little risk of consumer or customer lawsuits, the pressures are fewer. Records actually may be prized for how they preserve institutional knowledge. However, your students still may have to deal with metadata capture, access, and version control. (Not so easy, especially where photos are concerned. Without the negatives that exist for 35 mm film, and in the age of Photoshop, it may be much harder, years after a digital photo’s creation, to state with any certainty that it captured a person or scene as it really appeared.)
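One standard archival answer to that authenticity problem is fixity checking: record a cryptographic checksum when an item enters the archive, then recompute it periodically so that any later alteration, however small, is detectable. Here is a minimal sketch in Python; the file path and the recorded accession checksum are hypothetical, and only the standard library is used.

```python
import hashlib

def sha256_of(path: str) -> str:
    """Compute the SHA-256 checksum of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical values: the checksum recorded when the photo was
# accessioned, and the file as it exists on disk today.
recorded_at_accession = "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"
current = sha256_of("archive/photo_001.tif")

if current == recorded_at_accession:
    print("Bit-identical to the copy accessioned years ago.")
else:
    print("Altered or corrupted since accession.")
```

A checksum cannot prove the photo was truthful when it was taken, of course; it only proves the file has not changed since the archive received it.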
What about the information that shows who handled what and when? The transmittal slips and handwritten annotations found in old paper archival records now may be captured in the document history of an Electronic Records Management System and through electronic comment functions in Word or PDF files. As judges have noted in court, e-mail systems may include valuable information beyond the content of a message, such as its actual creation date and who opened it. Electronic records present many interesting forensic challenges.
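To make that concrete: even a desktop scripting language can pull the hidden layer out of a saved message. A minimal sketch, assuming a hypothetical message saved to disk as message.eml and using only Python’s standard library:

```python
from email import policy
from email.parser import BytesParser

# Parse a message saved in the standard .eml format.
with open("message.eml", "rb") as f:
    msg = BytesParser(policy=policy.default).parse(f)

# The header block carries forensic detail beyond the visible text:
print(msg["Date"])        # when the sender's client stamped the message
print(msg["Message-ID"])  # a unique identifier, useful for deduplication
for hop in msg.get_all("Received", []):
    print(hop)            # each mail server the message passed through, with timestamps
```

The Received headers alone can establish a timeline that a printed-out copy of the message body never reveals.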
How easy it is to preserve digital data depends in part on whether it is static or dynamic and how often it is subject to being overwritten. (When President Clinton left office, the National Archives preserved some portions of the external White House website as it appeared during his administration.) Historians also will have to sort out how records created in a hybrid environment (some born digital, some existing only in hard copy) fit together in terms of timelines, circulation for comment, etc. Of course, permanently valuable digital data has to be migrated or emulated to ensure that it continues to be machine-readable.
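As a toy illustration of migration, here is how one might re-save an image held in an aging format into a widely supported one so it remains readable as software changes. The file name is hypothetical, and the sketch relies on the third-party Pillow library; a real migration program would also verify the result and log the action.

```python
# Requires the third-party Pillow library (pip install Pillow).
from PIL import Image

# Open the original and re-save it in a newer, widely supported format.
with Image.open("photo_001.tif") as img:
    img.save("photo_001.png")
```

Note that migration, unlike simple copying, deliberately changes the bits, which is why archives document each migration step rather than expecting checksums to match across formats.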
Ideally, history majors should get a glimpse of the mix of legal, technical, and human factors (the latter actually being the most important) that surround the records we use to write history. This can be done in part by describing what happened at Enron and Arthur Andersen, by tracing the difficulties in opening Nixon’s records, by pointing to the impact on email retention of actions taken by Oliver North during Iran-Contra, or by considering the effect on corporate officials of the type of evidence introduced during the Clinton-era Microsoft litigation.
In summary, whether they plan to work in the private, public, or not-for-profit sector, in academe, in business, or in government, students need to consider, at least in passing, what can happen, as a consequence of ethical or unethical actions, during the life cycle of potentially historical records. You need to remind them also that scholars see only the records that are preserved and eventually made available to the public, either because the law permits it or, in the case of personal property, because someone donates it to a repository. (In the latter case, a deed of gift will control access to it.)
Discussing this, however briefly, might help students better appreciate what happens with records, whether they were created recently, in the age of the delete key, or back when secretaries still typed ribbon and carbon copies of documents.
Given the widespread prejudice in the academy against faculty blogging, I am wondering how one could integrate this into a history curriculum.
Actually, I wonder if this is a problem to be solved at the departmental level.
I’m not a faculty member, but I do deal with a lot of academic advising and curriculum questions. The University does require that all students (regardless of major) complete a computer ethics and proficiency course (usually IT103). There are dozens of sections of this course; what if someone proposed that at least one section be designed only for history majors? Other departments offer courses within their own discipline to meet this requirement (Administration of Justice and Government, for example), and history even has its own special section of ENGL 302 (intensive writing) for history majors only, so why not do the same with IT? IT&E could work with one of the department or CHNM historians in creating the content for the course while still meeting the university requirement.
Hi Laura:
This is a very interesting idea and one that deserves close scrutiny as we begin to think about changes in the undergraduate curriculum later this year. I like this one.
Mills