Eric Miller
In Quest of Intellectual Community
"As opposed to stupid history?"
That's what a student had asked when, this past November, I told my 19th-century U.S. survey class I was heading to New York City for an "intellectual history" conference. It was a reasonable question, reflecting understandable confusion. If intellectual history—the study of ideas, any ideas, in the flow of history—hasn't exactly disappeared amidst the profession's proliferating subfields, it's certainly been forced to go undercover. Try searching for a post in intellectual history and you'll search yourself right out of the academy.
So this conference was a spearhead of a larger effort, what one of its organizers calls a "quixotic attempt to invigorate" a once mighty field. There's nothing particularly quixotic, to academics at least, about summoning scholars to give and listen to papers on subjects of little interest to more than fifteen human beings outside of the room. But there is something very quixotic about such a conference being organized by a group of junior scholars from (mainly) second-tier institutions that nonetheless manages to attract a spectacularly high concentration of influential historians from élite institutions. To add to the novelty, the organization sponsoring the conference didn't exist even a year ago. What existed was a blog with the unadorned name "U.S. Intellectual History," itself not even five years old. Somehow the blog's architects had staged three annual conferences prior to this one.
It was the luminaries shining out from the program that attracted me—Jackson Lears (Rutgers), Pauline Maier (MIT), Eric Foner (Columbia), Dorothy Ross (Johns Hopkins), Michael Kazin (Georgetown), Daniel Rodgers (Princeton), and—it's no exaggeration to say—many others. Why were they there? Could it be that something new was emerging in history?
That's not just a cute question. In his widely debated 1988 book That Noble Dream: The "Objectivity Question" and the American Historical Profession, Peter Novick had surveyed the landscape and declared not only that objectivity was in crisis but that (as he titled his last chapter) "there was no king in Israel." The (truly) quixotic Victorian attempt to turn history into science, Novick was burdened to contend, had, after one hundred years of effort, led to sophisticated scholarship but little evidence of a past rendered with the clarity and (collective) confidence that, according to the profession's overwhelming epistemological consensus, science should have made possible—"the idea and ideal of objectivity" being, as Novick put it, "the rock on which the venture was constituted, its continuing raison d'être."
Still, despite the serious, compelling, and fundamental critique of Novick and others, the canons and traditions of scientistic history have retained their force, as Novick, in fact, predicted they would, postmodern contentions about the limits of reason proving no match for the institutional and ideological force of the profession itself. History in the modern American vein has its own history, and it tends to rough up those who mess with it.
Not surprisingly, then, the conference featured a stimulating mixture of focused intelligence and resourceful argumentation serving the usual ends with the usual means. Scholars incisively probed links between consciousness-raising groups and revolutionary politics, between the psychological category of self-hatred and liberal cosmopolitanism, between women's higher education and secularization. Gradually a sharpened sense of our historical moment emerged, as story piled upon story—that, for instance, of the Great Books guru Mortimer Adler, who both detested the New Left and feared for a world globalized under the aegis of capitalism. Or the story of The Lonely Crowd author and "qualitative liberal" David Riesman, whose respect for the "Protestant ethic" grew as liberals went left in the 1960s. Or the illuminating revelation that the American media in nearly all of its varieties not only denounced but in effect silenced any who questioned the Allies' adoption of "obliteration bombing" as a policy during World War II.
What did it all add up to? It was most basically a tradition at play, serious play. Even liberal modernity requires earthy sites of institutionalized ritual (think panel+respondent+Q&A, or wine and cheese receptions, or "business meetings"), although the underground suite of grey classrooms we huddled within at the Graduate Center of the City University of New York could hardly be described as "earthy"—dull windowless rooms lit in fluorescent white and cluttered with wires, screens, and plastic chairs.
This setting did, though, set the university nicely off from The Church of Our Savior just around the corner, which a friend and I stepped into over Friday's lunch. We were stunned into silence upon discovering a massive Christ Pantocrator staring at us from the darkened apse—a reminder that traditions of any kind don't survive without judgment peering down. At this conference, the senior scholars present—some genuinely iconic—were prepared to offer it. Leo Ribuffo, a looming, laughing presence who teaches at the George Washington University, issued, in a style that somehow blended Andy Rooney and Archie Bunker, a bemused rejoinder to the panelists' assumption that the modern self deserves to be taken as seriously as modern selves themselves tend to believe. The grey room roared throughout his remarks, lit for a moment with another kind of light.
But the University of Pennsylvania's Bruce Kuklick was anything but jocular in his judgments. In response to a panelist who on his view failed to probe with sufficient acuity the apocalyptic beliefs of early 20th-century Pentecostals, Kuklick declared that "we need more antipathy to the religious." In response to the historian who sympathetically told the story of the American critics of obliteration bombing, Kuklick charged historians to register "less antipathy" to American leaders. The Allied effort requires, apparently, the same reverence from historians now as it did from journalists then.
It was Kuklick's rejoinder to a third scholar, though, that exposed most sharply the epistemic fissures Novick had charted in That Noble Dream. After leaving unmistakable his evaluation of the moral deficiencies of any historian who is either soft on religion or hard on America, Kuklick lit into a historian who, in his view, had smuggled "normative" assumptions about civil religion into a paper tracing Richard John Neuhaus' evolving beliefs about war and the state.
Clearly it wasn't normativity itself that was the problem—Kuklick was nothing if not brimming with moral judgment, about the war, about history, about America itself. For that matter, the whole conference was swelling with normativity. One scholar severely and angrily denounced the absence of attention to the AIDS crisis in Daniel Rodgers' recent Age of Fracture, winner of the Bancroft Prize. Another thrilled an audience with his mention of having gotten "the shit beat out of him" by a cop during an Occupy Wall Street rally the night before. The closing plenary session assembled an impressive collection of eminent figures to discuss American exceptionalism, including the Pulitzer Prize-winner Eric Foner, who described it as "fundamentally hubris," a sign of being "closed-minded." Such a session and such a judgment make no sense apart from a professional self-identity premised on the notion of the scholar-as-citizen, dedicated to using history to build the republic: a normative commitment if there ever was one.
Kuklick's hang-up, it seems, wasn't with normativity. It was with method. Once we turn to investigation of the evidence itself, the old, still current, story goes, we turn off moral evaluation and don our scientific caps. We meet as disinterested equals in a true meritocracy, intelligence of argumentation the only arbiter.
If only language were so cooperative. Such scholars assume that a neutral, scholarly notion of civil religion—or equality, or democracy, or morality—exists. It does not.
Mary Kupiec Cayton, of Miami University, was one who dared, in a remarkable paper, to charge that something may be fundamentally compromised about this modern conception of not just historical method but rationality itself. Turning to the theoretical work of José Casanova, Talal Asad, and Charles Taylor, she suggested that the secular conception of history was actually rooted in a "faulty epistemology," one that has blinded historians to the centrality of religion in the formation of the American nation. She quickly clarified that she was offering this argument in the name not of religious conviction but of what she called a "civil ethic," concerned that the failure to understand secularity as itself something of a "colonial imposition" was leading American historians—of all people—to be "politically incorrect" in their treatment of not just religion but religious believers, past and present.
It happened that her respondent was the profession's keenest critic, Christopher Shannon of Christendom College, who applauded her searching attention to the "deep structures" of the profession's dominant historical narratives. In view of the incoherency, not to say illiberality, of the profession's epistemological tradition, Shannon, in an award-winning 2011 essay published in Historically Speaking, urges the profession to openly embrace diverging traditions at their deepest points of difference rather than forcing all to play by the (allegedly) neutral rules of liberal modernity. His solution to the crisis of knowledge: a proposal for a new "professional pluralism" that "would distinguish itself from its liberal predecessor by an explicit commitment to the pursuit of Truth—that is a truth beyond that which is empirically verifiable," thus making possible not a "monologue" about modernity but a "dialogue"—one premised on the notion that much is, truly, at stake.
Elisabeth Lasch-Quinn, a historian at Syracuse, if anything upped the stakes in her published response to Shannon's proposal. "Let us not forget," she warned, "all traditions are not equal. Welcoming all comers at the start of a conversation is different from ending with the same status for all. Given our current unraveling," she concluded, "we need a cultivation of judgment." At this conference she sounded this same theme. Before a packed room she expressed her own hopes for this fledgling organization, premised on her sense that its founding is one manifestation of what she takes to be, much to her surprised delight, a broader renewal of interest in ideas. If so, she urged, it may be that "intellectual communion" will be possible here—the enticing possibility that comes with shucking the "neutral stance" and believing that irony is not the end, that "discovery"—shared discovery—is indeed possible.
It's a striking judgment. It may just be the truth.
Eric Miller, professor of history at Geneva College, is the author of Hope in a Scattering Time: A Life of Christopher Lasch (Eerdmans).
Copyright © 2012 Books & Culture.