History, American Democracy, and the AP Test Controversy

Wilfred M. McClay is the G.T. and Libby Blankenship Professor in the History of Liberty at the University of Oklahoma. He has also taught at the University of Tennessee at Chattanooga, Tulane University, Georgetown University, and Pepperdine University, and he served for eleven years as a member of the National Council on the Humanities. His books include The Masterless: Self and Society in Modern America, The Student’s Guide to U.S. History, and Figures in the Carpet: Finding the Human Person in the American Past. He received his Ph.D. in history from Johns Hopkins University.

Historical study and history education in the United States today are in a bad way, and the causes are linked. In both cases, we have lost our way by forgetting that the study of the past makes the most sense when it is connected to a larger, public purpose, and is thereby woven into the warp and woof of our common life. The chief purpose of a high school education in American history is not the development of critical thinking and analytic skills, although the acquisition of such skills is vitally important; nor is it the mastery of facts, although a solid grasp of the factual basis of American history is surely essential; nor is it the acquisition of a genuine historical consciousness, although that certainly would be nice to have too, particularly under the present circumstances, in which historical memory seems to run at about 15 minutes, especially with the young.

No, the chief purpose of a high school education in American history is as a rite of civic membership, an act of inculcation and formation, a way in which the young are introduced to the fullness of their political and cultural inheritance as Americans, enabling them to become literate and conversant in its many features, and to appropriate fully all that it has to offer them, both its privileges and its burdens. To make its stories theirs, and thereby let them come into possession of the common treasure of its cultural life. In that sense, the study of history is different from any other academic subject. It is not merely a body of knowledge. It also ushers individuals into membership in a common world, and situates them in space and time.

This is especially true in a democracy. The American Founders, and perhaps most notably Thomas Jefferson, well understood that no popular government could flourish for long without an educated citizenry—one that understood the special virtues of republican self-government, and the civic and moral duty of citizens to uphold and guard it. As the historian Donald Kagan has put it, “Democracy requires a patriotic education.” It does so for two reasons: first, because its success depends upon the active participation of its citizens in their own governance; and second, because without such an education, there would be no way to persuade free individuals of the need to make sacrifices for the sake of the greater good. We now seem to think we can dispense with such an education, and in fact are likely to disparage it reflexively, labeling it a form of propaganda or jingoism. But Kagan begs to differ with that assessment. “The encouragement of patriotism,” he laments, “is no longer a part of our public educational system, and the cost of that omission has made itself felt” in a way that “would have alarmed and dismayed the founders of our country.”

Why has this happened? Some part of the responsibility lies within the field of history itself. A century ago, professional historians still imagined that their discipline could be a science, able to explain the doings of nations and peoples with the dispassionate precision of a natural science. But that confidence is long gone. Like so many of the disciplines making up the humanities, history has for some time now been experiencing a slow dissolution, a decline that now may be approaching a critical juncture. Students of academic life express this decline quantitatively, citing shrinking enrollments in history courses, the disappearance of required history courses in university curricula, and the loss of full-time faculty positions in history-related areas. But it goes much deeper than that. One senses a loss of self-confidence, a fear that the study of the past may no longer be something valuable or important, a suspicion that history lacks the capacity to be a coherent and truth-seeking enterprise. Instead, it is likely to be seen as a relativistic funhouse, in which all narratives are arbitrary and all interpretations are equally valid. Or perhaps history is useless because the road we have traveled to date offers us only a parade of negative examples of oppression, error, and obsolescence—an endless tableau of Confederate flags, so to speak—proof positive that the past has no heroes worthy of our admiration, and no lessons applicable to our unprecedented age.

This loss of faith in the central importance of history pervades all of American society. Gone are the days when widely shared understandings of the past provided a sense of civilizational unity and forward propulsion. Instead, argues historian Daniel T. Rodgers, we live in a querulous “age of fracture,” in which all narratives are contested, in which the various disciplines no longer take a broad view of the human condition, rarely speak to one another, and have abandoned the search for common ground in favor of focusing on the concerns and perspectives of ever more minute sub-disciplines, ever smaller groups, ever more finely tuned and exclusive categories of experience. This is not just a feature of academic life, but seems to be an emerging feature of American life more broadly. The broad and embracing commonalities of old are no more, undermined and fragmented into a thousand subcultural pieces.

This condition has profound implications for the academy and for our society. The loss of history, not only as a body of knowledge but as a distinctive way of thinking about the world, will have—is already having—dire effects on the quality of our civic life. It would be ironic if the great advances in professional historical writing over the past century or so—advances that have, through the exploitation of fresh data and new techniques of analysis, opened to us a more expansive but also more minute understanding of countless formerly hidden aspects of the past—were to come at the expense of a more general audience for history, and for its valuable effects upon our public life. It would be ironic, but it appears to be true.

As historian Thomas Bender laments in a recent article, gloomily entitled “How Historians Lost Their Public,” the growth of knowledge in ever more numerous and tightly focused subspecialties of history has resulted in the displacement of the old-fashioned survey course in colleges and universities, with its expansive scale, synthesizing panache, and virtuoso pedagogues. Bender is loath to give up any of the advances made by the profession’s ever more intensive form of historical cultivation, but he concedes that something has gone wrong: historians have lost the ability to speak to, and to command the attention of, a larger audience, even a well-educated one, that is seeking more general meanings in the study of the past. They have indeed lost their public. They have had to cede much of their field to journalists, who know how to write much more accessibly and are willing to explore themes—journalist Tom Brokaw’s celebration of “the greatest generation,” for example—that strike a chord with the public, but which professional historians have been trained to disdain as ethnocentric, triumphalist, or uncritically celebratory. Professional historians complain that such material lacks nuance and rigor, and is prone to repackage the past in terms that readers will find pleasing to their preconceptions. They may be right. But such works are at least being read by a public that is still hungry for history. The loss of a public for history may be due to the loss of a history for the public.

Instead, it seems that professional historiography is produced mainly for the consumption of other professional historians. Indeed, the very proposition that professional historiography should concern itself in fundamental ways with civic needs is one that most of the profession would find suspect, and a great many would find downright unacceptable—a transgression against free and untrammeled scholarly inquiry. Such resistance is understandable, since conscientious historians need to be constantly wary of the threat to their scholarly integrity posed by intrusive officials and unfriendly political agendas.

There can be no doubt that the professionalization of the field has brought a remarkable degree of protection for disciplinary rigor and intellectual freedom in the framing and pursuit of historical questions. But must abandonment of a sense of civic responsibility come in tandem with the professionalization of the field? This presents a problem, not only for the public, but for the study of history itself, if it can no longer generate a plausible organizing principle from its own resources.

Consider in this regard our startling incapacity to design and construct public monuments and memorials. Such edifices are the classic places where history and public life intersect, and they are by their very nature meant to be rallying points for the public consciousness, for affirmation of the body politic, past, present, and future, in the act of recollection and commemoration, and recommitment to the future. There is a profundity, approaching the sacramental, in the atmosphere created by such places, as they draw together generations of the living, the dead, and those yet unborn in a bond of mutuality and solidarity. The great structures and statuary that populate the National Mall in Washington, D.C.—such as the Lincoln Memorial and the Washington Monument—or the solemnity of Arlington National Cemetery, do this superbly well. There is a sense, too, that cemeteries honoring fallen soldiers of the Confederacy somehow deserve our general respect, even if the cause for which they fell does not. But these structures were a product of an earlier time, when the national consensus was stronger. Today, as illustrated by the endless deadlock over the design and erection of a memorial to Dwight D. Eisenhower in Washington, a drama that has become a fiasco, we seem to find the construction of monuments almost impossibly difficult. And in a different but not unrelated way, the sudden passion to cleanse the American landscape of any and all allusions to the Confederacy or slaveholding—a paroxysm more reminiscent of Robespierre than of Lincoln—also suggests the emergence of a public that is losing meaningful contact with its own history.

Why has this happened? In the case of the Eisenhower memorial, it happened because the work of designing the memorial was turned over to a fashionable celebrity architect who proved incapable of subordinating his monumental ego to the task of memorializing a great American hero. But more generally, it has happened because the whole proposition of revering and memorializing past events and persons has been called into question by our prevailing intellectual ethos, which cares little for the authority of the past and frowns on anything that smacks of hero worship or piety toward our forebears. The past is always required to plead its case before the bar of the present, where it generally loses. That ethos is epitomized in the burgeoning academic study of “memory,” a term that refers in this context to something vaguely suspect.

This is Part One of a multi-part series.

Reprinted by permission from Imprimis, a publication of Hillsdale College.