The Uses and Abuses of History
As so often is the case, the ways the public reacts to the work of historians have much to do with the issues of the time. In the late 1950s, Britain was going through a painful period of re-examination as it adjusted to its diminished importance in the world and its manifest social and economic problems at home. The Suez adventure of 1956 had been a costly disaster and, although the new Conservative prime minister, Harold Macmillan, made much of his nation’s special relationship with the United States, it was quite clear which country was the dominant partner. The empire was melting away; indeed, Macmillan had just made his famous speech about the wind of change blowing through Africa when he had to decide whether or not to let Frankland’s volume be published. World War II assumed ever greater importance as the glorious and gallant moment when all Britons pulled together and Britain was one of the Big Three powers. The mix of nostalgia and pride was neatly and unkindly caught by the satirical revue Beyond the Fringe in its sketch “The Aftermyth of War.” Frankland’s careful and clear examination of the bombing campaign and his revelations about the debates and disputes which had gone on at the time came as a dash of cold water.
Historians, the great philosopher of history R.G. Collingwood wrote in his autobiography, examine the past with a careful eye, even if it means exploding cherished myths: “So long as the past and the present are outside one another, knowledge of the past is not of much use in the present. But suppose the past lives on in the present; suppose, though encapsulated in it, and at first sight hidden beneath the present’s contradictory and more prominent features, it is still alive and active; the historian may very well be related to the non-historian as the trained woodsman is to the ignorant traveller.” That can often be intensely irritating when the historians raise qualifications and point to ambiguities. Do we really want to know that our great heroes, such as Winston Churchill, made silly mistakes? That there was and is a controversy over the effectiveness and morality of the World War II Allied bombing campaign against Germany? That John F. Kennedy suffered from a variety of illnesses and was dangerously dependent on painkillers? I think we do, not for prurient reasons but because a complex picture is more satisfying for adults than a simplistic one. We can still have heroes, still have views on the rights and wrongs of the past, and still be glad that it turned out in one way rather than another; but we have to accept that in history, as in our own lives, very little is absolutely black or absolutely white.
Historians, of course, do not own the past. We all do. But because historians spend their time studying history, they are in a better position than most amateurs to make reasoned judgments. Historians, after all, are trained to ask questions, make connections, and collect and examine the evidence. Ideally, they already possess a considerable body of knowledge and an understanding of the context of particular times or events. Yet, when they produce work that challenges deeply held beliefs and myths about the past, they are often accused of being elitist, nihilistic, or simply out of touch with that imaginary place, “the real world.” In the case of recent history, they are also told, as Noble Frankland was, that they cannot have an opinion if they were not there.
The idea that those who actually took part in great events or lived through particular times have a superior understanding to those who come later is a deeply held yet wrong-headed one. The recent dispute at Canada’s War Museum over the Allied bombing campaign has predictably brought charges that the historians who mounted the exhibit and those who supported it must defer to the views of the veteran airmen. Of course, said the National Post, “there is the issue of free expression and not caving into the sensitivities of every special interest group. Veterans, though, are not just any special interest group.…” I was one of the outside historians called in to evaluate the exhibit when the fuss started. (I supported the plaque and strongly advised the War Museum not to back down.) When my views became known, I started to get mail saying that I had no authority to comment on World War II because I was not part of it. And, as a woman, it was hinted, what could I know of things military anyway? True, I did not receive the email that one of my colleagues did: “The veterans have done more for our country and way of life, and shown more courage and dedication to duty, than you ever will. Since they were there, and you were not, it stands to reason that they should have the final say as to whether or not the plaque is fair.”
Being there does not necessarily give greater insight into events; indeed, sometimes the opposite is true. I lived through the Cuban Missile Crisis, for example, but at the time I knew only what was reported in the media. Like millions of others, I knew nothing of the intense debates in Washington and Moscow about how to handle the crisis. I had no idea that Kennedy had secret channels of communication with the Soviets or that the Soviets already had nuclear warheads in Cuba. I did not know that Fidel Castro was prepared to see his country destroyed if it brought Soviet victory in the Cold War closer. It was only much later, as the classified documents started to appear on both sides, that we got a much more detailed and comprehensive view of what was really happening. The same gap exists between the experiences of the veterans and the history of the bombing campaign. They knew what it was like to risk their lives flying over Germany, but they could not know about the debates in Whitehall or the impact of the bombs they dropped. That could only come with hindsight and much research and analysis.
Memory, as psychologists tell us, is a tricky business. It is true that we all remember bits of the past, often in vivid detail. We can recall what we wore and said on particular occasions, or sights, smells, tastes, and sounds. But we do not always remember accurately. Dean Acheson, the distinguished American statesman, once told the historian Arthur Schlesinger that he needed a strong martini after spending a morning on his memoirs. Acheson had been sketching out the run-up to Pearl Harbor and remembered vividly being in President Roosevelt’s office with the president and Cordell Hull, then secretary of state, on that fateful day in 1941 when the United States took a step closer to war with Japan by freezing Japanese assets: “The President was sitting at his desk; Cordell Hull was sitting opposite him; I was in a chair at the Secretary’s side,” he had written. The only trouble was that Acheson’s secretary had checked the records and found that Hull had not even been in Washington that day.
We mistakenly think that memories are like carvings in stone; once done, they do not change. Nothing could be further from the truth. Memory is not only selective, it is malleable. In the 1990s, there was much public concern and excitement about recovered memories. Authoritative figures published books and appeared in the media claiming that it was possible to repress completely memories of painful and traumatic events. Working with therapists, a number of patients discovered memories of such ghastly things as sexual abuse by their parents, cannibalism, satanic cults, and murder. Many families were destroyed and lives, both of the accusers and accused, ruined. Now that the panic has died down, we are ruefully admitting that there is no evidence at all that human beings repress painful memories. If anything, the memories remain particularly vivid. The “repressed memories” were fiction.
Researchers at the Biological Psychiatry Lab at McLean Hospital, affiliated with the Harvard Medical School, have recently conducted a research project into the repressed memory syndrome. Their interest was piqued by its sudden appearance in the late twentieth century. If the syndrome were hard-wired into the human brain, then surely there would be evidence of its occurrence down through history. They found examples in nineteenth-century literature but, although they offered rewards, they turned up no examples either in fiction or non-fiction before 1800. They concluded that “the phenomenon is not a natural neurological function, but rather a ‘culture-bound’ syndrome rooted in the nineteenth century.” The preoccupation of the Romantics with the supernatural and the imagination, as well as later work, most notably that of Sigmund Freud, on the subconscious predisposed us to believe that the mind can play extraordinary tricks on us.
We edit our memories over the years partly out of a natural human instinct to make our own roles more attractive or important. But we also change them because times and attitudes change over the years. In the early years after World War I, the dead were commemorated in France and Britain as fallen heroes who had fought to defend their civilization. It was only later as disillusionment about the war grew that the British and French publics came to remember them as the victims of a futile struggle. We also edit out of our memories what no longer seems appropriate or right. When I interviewed British women who had lived in India as part of the Raj, I always asked them what the relations between the British rulers and their Indian subjects were like. They all invariably told me that there was never any tension between the races and that the British never expressed racist views. Yet, we know from contemporary sources—letters, for example, or diaries—that many, perhaps most, of the British in India saw Indians as their inferiors.
We also polish our memories in the recounting. Primo Levi, who did so much to keep the memory of the Nazi concentration camps alive, warned, “A memory evoked too often, and expressed in the form of a story, tends to become fixed in stereotype … crystallized, perfected, adorned, installing itself in the place of the raw memory and growing at its expense.” As we learn more about the past, that knowledge can become part of our memory, too. The director of the Yad Vashem memorial to the Holocaust in Israel once said sadly that most of the oral histories that had been collected were unreliable. Holocaust survivors thought, for example, that they remembered witnessing well-known atrocities when in fact they were nowhere near the place where the events happened.
In the 1920s, the French sociologist Maurice Halbwachs coined the term collective memory for the things we think we know for certain about the past of our own societies. “Typically,” he wrote, “a collective memory, at least a significant collective memory, is understood to express some eternal or essential truth about the group—usually tragic.” So the Poles remember the partitions of their country—“the Christ among nations”—in the eighteenth century as part of their martyrdom as a nation. The Serbs remember the battle of Kosovo in 1389 as their defeat on earth but their moral victory in their unending struggle against Muslims. Often present-day concerns affect what we remember as a group. Kosovo acquired its particularly deep significance in the memory of the Serbs as they were struggling to become an independent nation in the nineteenth century. In earlier centuries, the battle was remembered as one incident in a much larger story. Collective memory is more about the present than the past because it is integral to how the group sees itself. And what that memory is can be and often is the subject of debate and argument where, in Halbwachs’s words, “competing narratives about central symbols in the collective past, and the collectivity’s relationship to that past, are disputed and negotiated in the interest of redefining the collective present.”
Peter Novick has argued forcefully in his book The Holocaust in American Life that for American Jews, the Holocaust became a central identifying feature of who they were only in the 1960s. In the years after World War II, few American Jews wanted to remember that their co-religionists had been victims. Jewish organizations urged their community to look to the future and not the past. It was only in the 1960s that attitudes began to change, partly, Novick argues, because victimhood began to acquire a more positive status and partly because the 1967 and then the 1973 war showed both Israel’s strength and its continuing vulnerability.
As the nineteenth-century Zionists began their bold project of recreating a Jewish state, they looked into Jewish history for symbols and lessons. They found, among much else, the story of Masada. In 73 A.D., as the Romans stamped out the last remnants of Jewish resistance to their rule, a band of some thousand men, women, and children held out on the hilltop fortress of Masada. When it became clear that the garrison was doomed, its leader, Elazar Ben-Yair, convinced the men that it was better to die than submit to Rome. The men first killed their women and children and then themselves. The story was recorded but did not assume importance for Jews until the modern age. Masada has been taken up as a symbol, not of submission to an inevitable fate but, rather, of the determination of the Jewish people to die if necessary in their struggle for freedom. In independent Israel, it became an inspiration and a site of pilgrimage for the Israeli military as well as for civilians. As a popular poem has it, “Never again shall Masada fall!” In recent years, as pessimism has grown in Israel over the prospects for peace with its neighbours, another collective memory about Masada has been taking shape: that it is a warning that Jews always face persecution at the hands of their enemies.
While collective memory is usually grounded in fact, it need not be. If you go to China, you will more than likely be told the story of the park in the foreign concession area of Shanghai that had on its gate a sign that read “Dogs and Chinese Not Admitted.” While it is true that the park was reserved for foreigners, insulting enough in itself, the real insult for most Chinese was their pairing with dogs. The only trouble is that there is no evidence the sign ever existed. When young Chinese historians expressed some doubts about the story in 1994, the official press reacted with anger. “Some people,” a well-known journalist wrote, “do not understand the humiliations of old China’s history or else they harbour sceptical attitudes and even go so far as to write off serious historical humiliation lightly; this is very dangerous.”
It can be dangerous to question the stories people tell about themselves because so much of our identity is both shaped by and bound up with our past history. That is why dealing with the past, in deciding on which version we want, or on what we want to remember and to forget, can become so politically charged.
We argue over history in part because it can have real significance in the present. We use it in a variety of ways: to mobilize ourselves to achieve goals in the future, to make claims, for land for example, and, sadly, to attack and belittle others. Examining the past can be a sort of therapy as we uncover knowledge about our own societies that has been overlooked or repressed. For those who do not have power or who feel that they do not have enough, history can be a way of protesting against their marginalization, or against trends or ideas they do not like, such as globalization. Histories that show past injustices or crimes can be used to argue for redress in the present. For all of us, the powerful and weak alike, history helps to define and validate us.
Who am I? is one question we ask ourselves, but equally important is, who are we? We obtain much of our identity from the communities into which we are born or to which we choose to belong. Gender, ethnicity, sexual preference, age, class, nationality, religion, family, clan, geography, occupation, and, of course, history can go into the ways that we define our identity. As new ways of defining ourselves appear, so do new communities. The idea of the teenager, for example, scarcely existed before 1900. People were either adults or children. In the twentieth century, in developed countries, children were staying in school much longer and hence were more dependent on their parents. The adolescent years became a long bridge between childhood and full adulthood. The market spotted an opportunity, and so we got special teenage clothes, music, magazines, books, and television and radio shows.
We see ourselves as individuals but equally as part of groups. Sometimes our group is small, an extended family perhaps, sometimes vast. Benedict Anderson has coined the memorable phrase “imagined community” for the groups, like nations or religions, that are so big that we can never know all the other members yet that still draw our loyalties. Feeling part of something, in our fluid and uncertain times, is comforting. If we are Christians, Muslims, Canadians, Scots, or gays, it implies that we belong to something larger, more stable, and more enduring than ourselves. Our group predated us and will presumably survive our deaths. When many of us no longer believe in an afterlife, that promises us a sort of immortality.
Nationalists, to take one example of the imagined communities, like to claim that their nation has always existed back into that conveniently vague area, “the mists of time.” The Anglican Church claims that, in spite of the break with Rome during the Reformation, it is part of an unbroken progression from the early church. In reality, an examination of any group shows that its identity is a process, not a fixed thing. Groups define and redefine themselves over time and in response to internal developments, a religious awakening perhaps, or outside pressures. If you are oppressed and victimized, as gays have been and still are in many societies, that becomes part of how you see yourself. Sometimes that leads to an unseemly competition for victimhood. American blacks have watched resentfully as the commemoration of the Holocaust has taken an ever greater place in American consciousness. Was not slavery, some have asked, just as great a crime?