

“To be open, in spite of the past”: Revisiting Putin’s words in the light of Russia’s war in Ukraine


On the night of 23 February 2022—formerly Soviet Army Day—Russia violated the territory of another sovereign post-Soviet state, independent Ukraine. These actions demonstrate Russia’s continued denial of the right to sovereign nationhood of former USSR member states, previously manifested in the Chechen wars (1994-96, 1999-2009), the occupation of South Ossetia and Abkhazia in 2008, and of Crimea in 2014. The war being waged by Russia has turned into a kind of looped Blitzkrieg on multiple, seemingly disconnected fronts, leading to the displacement of around 10 million people, roughly a quarter of Ukraine’s total population. Millions have limited or no access to basic goods such as water or electricity, and over 3.5 million have had to flee the country through humanitarian escape corridors repeatedly violated by Russian forces. Given Ukrainian resilience, a ‘Chechen’ scenario, with Grozny pulverised by Russia and reconstructed under a pro-Russian protectorate by 2006, is not a likely prospect for Kyiv, yet it is equally unclear what ‘deal’ could lead to a way out of the war. The International Criminal Court has been receiving tangible evidence of war crimes committed by Russian forces, including Putin-loyal Chechen units, on Ukrainian territory. In some cities, such as Mariupol, over 70% of residential buildings have been destroyed. Attacks, including with weapons such as vacuum bombs previously used by Russia in Syria, have targeted civilian infrastructure from health facilities and food storage to theatres and historical archives.

I am the kind of historian who thinks that words and ideas matter, even—perhaps especially—in times of the greatest violence. They matter, too, because they resonate long after shots have been fired, bodies burned and buried, new borders etched onto the map of Europe. But to disclose the true power of words, their capacity to affect audiences and divide them, to reveal truths as well as to obfuscate, they need to be put in relation to social actions and legal interventions.

Last year, on 22 June 2021, eighty years after Germany’s attack on the Soviet Union, Putin published an op-ed in the German liberal weekly Die Zeit. It appeared in German translation under the title “To be open, in spite of the past”. The piece concluded that Europe ‘simply cannot afford to lug around the weight of bygone misunderstandings, offences, conflicts and mistakes.’ It was time for the Europeans to leave old-fashioned Cold War institutions such as NATO behind and instead to work on ‘constructive interdependence’ for the sake of ‘security and prosperity’. This new aim should be secured by such initiatives as ‘Nord Stream’, and visions such as Charles De Gaulle’s greater Europe ‘from Lisbon to Vladivostok’, and Putin’s own idea of ‘creative cooperation’.

Sometime between last summer and now, the link to Putin’s Die Zeit article on kremlin.ru disappeared. (It is still accessible at Die Zeit). Has Putin changed his mind? It will remain a mystery, but we can still try to make sense of the meaning of the piece in the light of his subsequent writings.

Putin has been partial to long op-eds and increasingly long speeches on matters of history, and yet many people refuse to engage with his ideas in any detail. I understand why, as I too feel a kind of squeamish disgust whenever I find myself in a room where someone is blurting something out on Russian state TV. The bare essence of this material can be reduced to a single claim, made repeatedly to Russia’s home audiences, a claim with which the Russian executive sent its conscripts into this war that is never called that. This claim centres on the supposed ‘genocide’ of Russians in the areas of Ukraine no longer under Ukrainian control. It should be noted that even according to the data of the UN High Commissioner for Human Rights (OHCHR), which is likely a wild underestimation, the number of civilians who died in Ukraine in just one month after 24 February 2022 (1,035) is more than four times the total number of civilian deaths recorded by the OHCHR in the conflict zone of eastern Ukraine over five years, from 1 January 2017 to 31 December 2021 (253). Those who endorse the actions of Russia’s leadership, such as, most recently, the rectors of most major Russian universities, the leaders of orchestras at major Russian conservatories, or, most horrifyingly, those looking after children in hospices for palliative care, provide their own servile justifications for Russia’s criminal actions in Ukraine. Those who despise them, such as the close to two million Russians who have signed various petitions since the war started, let their names speak against his words. On 4 March 2022, the Russian government introduced the latest in a series of legislative packages aimed at curtailing dissent. It is now a criminal offence to call the actions of the Russian army in Ukraine a war rather than a ‘special operation’, punishable by up to fifteen years in a prison colony.
The remit of interpretation is so wide that there is now an entire legal advice industry elaborating on how to use social media ‘safely’ under the new legal regime. Punishments, moreover, are no longer merely symbolic: just a week before the war, a Russian teenager was sentenced to five years in a prison colony for trying to blow up a virtual FSB building in a video game. As a result, many of the millions of names on petitions against the war have been taken down or hidden by the petitions’ creators in order to protect the signatories.

But let’s go back to the op-ed from June 2021. What did Putin and his circle mean by being open ‘in spite of the past’? The internal dimension of this phrase only becomes clear in the context of recent legislative changes regarding historical interpretation. The Russian government not only seeks to absolve itself of responsibility for the Soviet state’s crimes against its own citizens, such as illegal imprisonment, execution or deportation, and mass famines; it actually seeks to silence the very memory of these offences.

Since 2017 the Russian Duma has passed no fewer than four laws which make it a criminal offence to utter certain historical judgments about the Russian past. These include proposing analogies between the Soviet Union and Nazi Germany, particularly with reference to the Molotov-Ribbentrop pact of 1939. Moreover, by November 2021 more than 100 organisations and over 60 individuals registered in Russia and abroad had been declared ‘foreign agents’ under Russian law. As a result, more than three decades of work documenting and describing crimes committed by the Soviet state against its own citizens and those of other states, undertaken by the network of NGOs called Memorial, has been stopped, the organisation ‘liquidated’, its offices raided.

Against all odds, like a refugee, the living memory of these injustices has been crossing the borders of post-Soviet Russia, defying regulation by means of national legislation or military intervention. While Russia’s FSB archives have been closing and attempts to investigate the persecution of one’s ancestors are now met with intransigence, Ukrainian KGB archives have opened, like those of the Stasi previously, making it easier to piece together the story of both countries’ Soviet past. Some of these Ukrainian archives, however, have now also been destroyed by Russian bombs.

In a television address of 21 February 2022, just a few days before the invasion, Putin made clear his assertion that Ukraine’s sovereignty was illegitimate because, as a nation-state, Ukraine was essentially a product of the Bolshevik imagination. The current secessionist movements, Putin argued, were part of the unintended boomerang effect of that Bolshevik nation-building. Secondly, Putin asserted that Ukraine’s government had no legitimacy because it was oligarchical and corrupt. ‘You want Decommunisation?’ he asked. ‘We will show you Decommunisation’.

It is worth delving a bit more into historical detail here, because the arguments made in this last prewar speech appear to have been put through some kind of counterfactual speech generator. Putin is basically saying that contemporary Russia is the rightful heir to the medieval Kievan Rus’, to the Russian empire under the Romanovs, and to the USSR. By contrast, Ukraine has no basis in premodern history and in its modern history was created by the Bolsheviks. You could, however, just as well use the same references to make the opposite argument, that Russia is a product of the Bolsheviks and Ukraine is the true heir both to Kievan Rus’ and the empire. In any case, neither outcome of such scholastic disputes should be used to extinguish the lives of thousands of innocent people.  

At the level of twentieth-century history alone, Putin’s statement about the Bolshevik origins of Ukraine is plainly wrong. Lenin’s essay ‘The Right of Nations to Self-Determination’ appeared in 1914, three years before the Bolsheviks seized power. It was a document of internal controversy within Russian Social Democracy, a controversy which also involved Rosa Luxemburg, who wanted to see the rights of minority nations protected against larger nations through robust federations. Lenin dismissed this view, asserting that effectively only national sovereignty could promote capitalism to the higher stage needed for revolution to succeed.

The Ukrainian People’s Republic emerged in the wake of the February revolution of 1917 and was thereby the first aspirant to post-imperial Russian sovereignty. A Ukrainian delegation negotiated with the Central Powers at Brest-Litovsk before the Bolsheviks hijacked the proceedings. The Bolsheviks, meanwhile, were not only internally divided over tactics and strategy at this time, but also did their utmost to undermine the emergence of Ukrainian sovereignty throughout the Civil War. Thus, chronologically, Ukrainian national sovereignty emerged before the Bolsheviks took over the Russian state from the failing Provisional Government.

Putin’s emphasis on the early twentieth century, moreover, deflected attention from the more relevant recent sources of Ukrainian sovereignty in the dissolution of the USSR. Its emergence was rooted in the same kind of constituent power as that of modern Russia itself, wielded by the Ukrainian nation as represented by the Ukrainian SSR, including Crimea, and declared on 16 July 1990, just over a month later than Russia, then reaffirmed by a referendum in 1991.

While the erasure of this uncomfortable past informs one dimension of Putin’s speeches, a second dimension is more emotive. Putin’s recent speeches convey the feeling of having been offended by the collective ‘West’. Putin’s critique of NATO has caught on particularly among the ‘progressive’ readership in the West. For broader audiences, he refers to the current military campaign both as an act of ‘Decommunisation’ (used ironically, as a nod to the recent depedestalisations of Lenin in Ukraine – after all, in his home country the Communist party has consistently been his main real political adversary, attracting around a third of voters’ sympathies) and Denazification (with angry righteousness, linking the history of collaborationism with the Nazi regime in Ukraine to the existence of far-right groups there today). For positive orientation, Putin points to international treaties such as the OSCE agreement at Astana of 2010, which tempered the ‘equal right to security’ of each state with an element of relativity by allowing military capability that is ‘commensurate with our legitimate individual or collective security needs’.

A third major line of Putin’s argumentation is the dual assertion that, on the one hand, not all nations are ultimately eligible for sovereignty, and on the other hand, some sub-national communities can be entrusted with self-determination. It may be surprising to note that this argument about unequal entitlements used to be rather widespread in liberal circles of late imperial Germany, advanced notably by Friedrich Meinecke – who, after the Second World War, ended up writing a bitter indictment of Germany’s path, titled The German Catastrophe.

At the last raid on Moscow Memorial, the police left the infamous letter ‘Z’ – the new sign of Russia’s interventionist actions against enemies at home and abroad – on its noticeboard. It is the same letter, apparently derived from the World War II-related phrase Za pobedu [for victory], which marks the so-called ‘special operation’ in Ukraine. Is it worth trying to find a name for the Russian political movement responsible for the catastrophe in Ukraine, such as Russian fascism, or Putinism? Fascism is often overused, or used too lightly, and for a long time Putin and his immediate circle lacked a common ideological core, while Putin’s own popularity had been steadily waning. For the time being, it may be more precise to call its supporters by the name they use themselves: Zetism, perhaps. This means that when a time of reckoning comes, neither lacking Russian citizenship nor performing a last-minute disavowal of Putin will exempt anyone from individual responsibility for these atrocities.

Dina Gusejnova is an Assistant Professor at the Department of International History at the London School of Economics. She has previously been Senior Lecturer in Modern History at Sheffield University and has taught at Queen Mary University of London, UCL, and at the University of Chicago. She is the author of European Elites and Ideas of Empire, 1917-57 (Cambridge University Press, 2016), and the editor of Cosmopolitanism in Conflict: Imperial Encounters from the Seven Years’ War to the Cold War (Palgrave, 2018). Her current research concentrates on the longer-term impact of the internment of scholars from continental Europe in Britain during the Second World War. In the current crisis, she is involved in managing mentoring opportunities for students, scholars and cultural workers from Ukraine or those fleeing political repression in Russia and Belarus, at https://neweurope.university/. At LSE, she is also a member of Ukraine Hub UK Academic Taskforce.

Image: Projection of a future publication on Russia’s predicament (Loosely based on Friedrich Meinecke, Die deutsche Katastrophe, Wiesbaden, 1946)


Euro 2020 ends as it began: as a political football


In the end football didn’t ‘come home’ – nor was it meant to. Euro 2020 was not designed as a celebration of England’s long association with the codified form of the game in the way the 1996 tournament had been. Instead, it was the ‘Eurovision’ of the now-disgraced former UEFA president Michel Platini. Set out in 2012 to celebrate the 60th anniversary of the competition, it spread games across a number of countries in a celebration of European footballing, if not political, unity.

Indeed, sporting events have a long history of being used to express political agendas, dating back to ancient times. The most famous modern example was the Nazification of the 1936 Berlin Olympics. Even the European Football Championship, in its former guise as the European Nations’ Cup, saw its first edition affected by political grandstanding: General Franco withdrew the Spanish team in 1960 when they were drawn to play his ideological enemies, the Soviet Union, at the height of the Cold War.

Meanwhile, war-time enmities have also played a part in previous competitions. Memories of the Second World War were a strong motivation for Dutch fans’ exuberant celebrations when they defeated West Germany in the 1988 semi-final in Hamburg, and even to this day the War is referenced by English fans each time they play Germany. More recently, the vicious Balkan Wars of the 1990s have made their presence felt: Yugoslavia was excluded from the 1992 competition due to UN sanctions, while Albania qualified for the 2016 tournament after being awarded three points from a qualifier against Serbia in Belgrade that was abandoned after violence erupted over tensions about Kosovo. The controversy in this tournament over Marko Arnautović’s goal celebration, which allegedly targeted North Macedonia players on account of their Albanian origins, shows that football remains an arena for highlighting the divisions from that period.

Platini’s vision never entirely came to pass, as Euro 2020 highlighted both the best and worst in sport’s capacity to bring people together and to reinforce divisions. No other country represented these complexities as clearly as the UK. Before the tournament, large elements within the Conservative establishment opposed England’s players taking the knee before matches to highlight discrimination not just in football but in wider society, where players continue to receive unacceptable racial abuse from those who hide behind the anonymity of social media, and even in sections of the established media. Home Secretary Priti Patel’s dismissal of the move as ‘gesture politics’ and her remark that people were ‘free to boo’ it, together with Tory MP Lee Anderson’s assertion that he would not support the team unless players desisted from the stance, were hardly consistent with leadership of a progressive society.

Meanwhile, there was rabid nationalism in a section of the Scottish press ahead of the final between Italy and England. The front-page mock-up of Italian coach Roberto Mancini dressed as William Wallace under the headline ‘Roberto You Are Our Only Hope’ was reminiscent of the worst jingoistic excesses of the English press over the past 40 years, especially in the light of the mutual respect engendered by the England v Scotland match in the group phase.

However, beyond taking the knee, the diverse nature of England’s squad, which acted with a complete unity of purpose, has seemingly generated a similar reaction across England, creating what the journalist and academic Sunny Hundal has described as ‘a new English identity: confident in its diversity and tolerance, illustrative patriotism’. Arguably, this has been most powerfully seen in the backlash of ordinary Englishmen and women against the vile abuse received by Marcus Rashford, Jadon Sancho and Bukayo Saka on account of their skin colour following their penalty shoot-out misses in the final. Large sections of the public have also seen through the crocodile tears of the Home Secretary regarding the team’s treatment, and have taken her to task for helping to create a climate that, as Tyrone Mings commented, served to ‘stoke the fire’ of racism. Hopefully, this will be a catalyst for real change.

Unfortunately, expressions of intolerance were not confined to England. The Hungarian FA was fined for the racist abuse of players by sections of the home support in Budapest, whilst UEFA begrudgingly allowed German and English captains Manuel Neuer and Harry Kane to wear rainbow armbands in their second-round clash, but refused permission for the Munich authorities to light up the Allianz Arena in those same rainbow colours. Confusingly, UEFA as an institution had already pledged its support for the LGBT community and Pride month, and had the players wear the ‘respect’ logo on their jerseys, but then took no action as rainbow flags were taken away from spectators in Baku.

In some respects, the European coming together envisaged by Platini did happen. The sight of Finnish fans chanting Christian Eriksen’s name in unison with their Danish hosts after the player suffered an on-field cardiac arrest was one of the most heart-warming moments of the past month. Moreover, in the face of COVID-19 travel restrictions, the presence of thousands of expatriate supporters behaving impeccably to support their teams in cities like London, Rome and Amsterdam was a vindication of the benefits of more than half a century of freedom of movement following the Treaty of Rome, rather than the narrow anti-immigrant discourse of Brexit.

Indeed, Italy’s final victory over England has been viewed with a degree of relief across Europe, especially in the light of the graceless and violent behaviour of too many of the English fans before, during and after the match. If Britain and Ireland’s bid to host the 2030 World Cup is to be successful, it will need to demonstrate that the British – the English in particular – can interact with their neighbours in a collaborative way that is founded on mutual respect and able to overcome international divisions in the post-Brexit era. However, sport will likely continue to be a political football.

Mark Orton is an independent researcher specialising in national identity and sport, who recently completed his PhD at De Montfort University with a thesis on Football and National Identity in Argentina, 1913–1978. You can find him on Twitter @MarkAOrton

Cover image: EURO 2020 – Final Crowds at Wembley Stadium before kick-off at 8PM London time, 11 July 2021, courtesy of Kwh1050 https://commons.wikimedia.org/wiki/File:EURO_2020_FINAL_Wembley_Stadium_London_11_July_2021_(j).jpg (Accessed 17 July 2021)


‘Violent affections of the mind’: The Emotional Contours of Rabies


Living through the Covid-19 pandemic has more than drummed home the emotional dimensions of diseases. Grief, anger, sorrow, fear, and – sometimes – hope have been felt and expressed repeatedly over the last year, with discussions emerging on Covid-19’s impact on emotions and the effect of lockdown on mental health.

But emotions have long attached themselves to diseases. Rabies – sometimes called hydrophobia – is a prime example.[i] In nineteenth-century Britain, France, and the United States, rabies stoked anxieties. Before the gradual and contested acceptance of germ theory at the end of the nineteenth century, some doctors believed that rabies had emotional causes.

For much of the nineteenth century, the theory that rabies generated spontaneously jostled with the theory that it was spread through a poison or virus. The spontaneous generation theory stressed the communality of human and canine emotions: rather than contagion through biting, it was emotional sensitivity that made both species susceptible to the disease.

A sensitive person prone to emotional disturbances was considered particularly at risk from external influences that might cause rabies to appear. “Violent affections of the mind, operating suddenly and powerfully on the nervous system” could in rare cases lead to rabies or, at the very least, exacerbate the symptoms in nervous patients, according to Manchester physician Samuel Argent Bardsley (who was more commonly known for promoting quarantine as a way of containing the disease).

For one Lancashire man, John Lindsay, the difficulty of feeding his family drove him to anxiety and despair, exacerbated by a bout of overwork and a lack of food. Fatigued, suffering from headaches, and fearing liquids, Lindsay remembered being bitten by a supposed mad dog some twelve years previously. Amidst violent spasms, visions of the black dog “haunted his imagination with perpetual terrors” and made recovery seem “hopeless.” With reluctance, Bardsley concluded that this was a case of spontaneous rabies. Emotional distress and an overactive imagination had caused and aggravated the disease.

During the mid-nineteenth century prominent London doctors argued that rabies was closely linked to hysteria and had emotional and imaginative origins, much to the chagrin of veterinarian William Youatt, the leading opponent of theories of spontaneous generation.[ii] In the 1870s alienists (otherwise known as psychiatrists) then lent greater intellectual credibility to theories of rabies’ emotional aetiology. They stressed the powerful sway that emotions and the mind held over individuals, especially in the enervating conditions of modern life.

Physician and prominent British authority on mental disorders Daniel Hack Tuke argued that disturbing emotions and images could create hydrophobic symptoms in susceptible individuals. Referencing Bardsley, and drawing on French examples, he argued that “such cases illustrate the remarkable influence exerted upon the body by what is popularly understood as the Imagination.” The very act of being bitten by a dog and the “fearful anticipation of the disease” was enough to spark rabies, even if the dog was not rabid. Even rational and emotionally-hardy doctors had reported suffering from hydrophobic symptoms when recalling the appalling scenes of distress during the examination and treatment of hydrophobic patients.[iii]

Tuke suggested that in some cases excitement or other forms of mental, emotional, and sensory overstimulation could activate the virus years after a bite from a rabid dog. He drew on a striking case from the United States, as reported by the Daily Telegraph in 1872. A farmer’s daughter had been bitten by a farm dog when choosing chickens for slaughter. The wound healed and no signs of rabies appeared until her wedding day two months later. The “mental excitement” of this life-changing event brought on a dread of water. After the ceremony she experienced spasms and “died in her husband’s arms.”

Tuke reproduced the newspaper’s view, and more generalized gendered assumptions about female emotional delicacy, that such “nervous excitement” had a profound influence on the “gentler” sex. In this case, her nerves were considered to have been exacerbated by the anticipation of the impending wedding night, which was often framed as an emotionally fraught sexual encounter.[iv]

Dr William Lauder Lindsay of the Murray Royal Asylum in Perth, Scotland, was another prominent proponent of the view that rabies was a predominately emotional disease. The disease, he argued, “is frequently, if not generally, the result of terror, ignorance, prejudice, or superstition, acting on a morbid imagination and a susceptible nervous temperament.” Under the sway of their overactive imagination, an individual could take on “canine proclivities,” such as barking and biting. In classist language, Lindsay argued that rabies showed the influence of mind over the body, especially in the “lower orders of the community.”[v]

The British alienists’ depiction of rabies as a predominately emotional disorder made its way across the Atlantic. In the mid-1870s Dr William A. Hammond, President of the New York Neurological Society and leading American authority on mental disorders, stated that the evidence from Europe suggested that heightened emotions might cause rabies in humans. More generally, New York physicians and neurologists debated whether or not individuals had died from actual rabies or fears of the disease, and discussed how fear might turn a bite from a healthy animal into death.[vi]

The alienists lent greater credibility to earlier theories that rabies anxieties could lead to imaginary or spurious rabies. Tuke asserted that fears of rabies could create an imaginary manifestation of the disease. “Hydrophobia-phobia” demonstrated clearly the “action of mind upon mind,” and was distinct from the “action of the mind upon the body” in those cases when emotional distress led to actual rabies.

Echoing Tuke, Lindsay identified women as a particular vector in triggering spurious rabies. He asserted that they spread rabies fears, as supposedly shown by an Irishwoman in Perth who had frightened her husband into believing he had rabies. For Lindsay, this was a classic case of spurious (or false) rabies, which required the rational and firm intervention of medical men, such as himself, to stamp out. But he felt himself fighting an unstoppable tide. For in America, as well as Britain, the press ignited fears and created spurious rabies in susceptible individuals.[vii]

Lindsay and Tuke believed that rabies could, in some cases, be transmitted by dogs to humans through biting and “morbid saliva.” But some doctors controversially argued that it was a purely emotional disease. Eminent Parisian doctor Édouard-François-Marie Bosquillon set the tone in 1802 when he confidently declared that rabies in humans was caused solely by terror. His observation that individuals were struck with hydrophobic symptoms, including “loss of reason” and “convulsive movements,” at the sight of a mad dog provided sufficient proof.

Horror-inducing tales of rabies, fed to children from a young age, created fertile conditions for the development of the disease, particularly in “credulous, timid and melancholic” people. Gaspard Girard, Robert White, William Dick, and J.-G.-A. Faugére-Dubourg developed this line of argument as the century progressed. And the theory had traction. In the 1890s, Philadelphian neurologist Charles K. Mills insisted that rabies was purely a disease of the nerves. Such theories were, however, contentious, and Tuke cautioned against those who asserted that rabies was solely an imaginary disease.[viii]

Nonetheless, these theories cemented rabies as an emotionally-fraught disease and reinforced the dangers of dog bites: even a bite from a healthy dog could trigger a lethal neurological reaction in the swelling ranks of anxious individuals.

Dr Chris Pearson is Senior Lecturer in Twentieth Century History at the University of Liverpool. His next book Dogopolis: How Dogs and Humans made Modern London, New York, and Paris is forthcoming (2021) with University of Chicago Press. He runs the Sniffing the Past blog and you can download a free Android and Apple smart phone app on the history of dogs in London, New York, and Paris. You can find Chris on Twitter @SniffThePastDog.


Cover image: ‘Twenty four maladies and their remedies’. Coloured line block by F. Laguillermie and Rainaud, ca. 1880. Courtesy of the Wellcome Collection, https://wellcomecollection.org/works/pysjar4f/images?id=mpqquvrh [accessed 25 March 2021].

[i] Contemporaries sometimes used “rabies” and “hydrophobia” interchangeably to refer to the disease in animals and dogs, but sometimes used “rabies” to refer to the disease in dogs and “hydrophobia” for humans. With the rise of germ theory at the end of the nineteenth century, “rabies” gradually replaced “hydrophobia.” For simplicity’s sake, I will use “rabies” to refer to the disease in humans and animals unless I quote directly from a historical source.

[ii] Samuel Argent Bardsley, Medical Reports of Cases and Experiments with Observations Chiefly Derived from Hospital Practice: To which are Added an Enquiry into the Origin of Canine Madness and Thoughts on a Plan for its Extirpation from the British Isles (London: R Bickerstaff, 1807), 238-50, 284, 290; “Hydrophobia”, The Sixpenny Magazine, February 1866; Neil Pemberton and Michael Worboys, Rabies in Britain: Dogs, Disease and Culture, 1830-2000 (Basingstoke: Palgrave Macmillan, 2013 [2007]), 61-3.

[iii] Daniel Hack Tuke, Illustrations of the Influence of the Mind Upon the Body in Health and Disease Designed to Elucidate the Action of the Imagination (Philadelphia: Henry C. Lea, 1873), 198-99, 207.

[iv] Tuke, Illustrations, 200-1; Daily Telegraph, 11 April 1872; Peter Cryle, “‘A Terrible Ordeal from Every Point of View’: (Not) Managing Female Sexuality on the Wedding Night,” Journal of the History of Sexuality 18, no. 1 (2009): 44-64.

[v] William Lauder Lindsay, Mind in the Lower Animals in Health and Disease, vol. 2 (London: Kegan Paul, 1879), 17; William Lauder Lindsay, “Madness in Animals,” Journal of Mental Science 17:78 (1871), 185; William Lauder Lindsay, “Spurious Hydrophobia in Man,” Journal of Mental Science 23: 104 (January 1878), 551-3; Pemberton and Worboys, Rabies, 96-7; Liz Gray, “Body, Mind and Madness: Pain in Animals in the Nineteenth-Century Comparative Psychology,” in Pain and Emotion in Modern History, ed. Rob Boddice (Basingstoke: Palgrave, 2014), 148-63.

[vi] “Hydrophobia: The Subject Discussed by Medical Men,” New York Times, 7 July 1874; Jessica Wang, Mad Dogs and Other New Yorkers: Rabies, Medicine, and Society in an American Metropolis, 1840-1920. (Baltimore: Johns Hopkins University Press, 2019), 150-1.

[vii] Tuke, Illustrations, 198-99; Lindsay, “Spurious Hydrophobia in Man,” 555-6, 558.

[viii] Lindsay, Mind in the Lower Animals, 176; Édouard-François-Marie Bosquillon, Mémoire sur les causes de l’hydrophobie, vulgairement connue sous le nom de rage, et sur les moyens d’anéantir cette maladie (Paris: Gabon, 1802), 2, 22, 26; Vincent di Marco, The Bearer of Crazed and Venomous Fangs: Popular Myths and Delusions Regarding the Bite of the Mad Dog (Bloomington: iUniverse, 2014), 141-47; Pemberton and Worboys, Rabies, 64; Tuke, Illustrations, 198-99; Wang, Mad Dogs, 151-2.


What’s in a Special Relationship?


The recent decision by US President Donald Trump to remove some American troops from Germany has brought much consternation to the international community. One interesting twist that has found its way into the conversation occurred when Antony Blinken, policy advisor to presidential candidate and former Vice President Joe Biden, commented that the move weakened NATO and harmed Germany, ‘our [America’s] most important ally in Europe.’ Many on both sides of the Atlantic gasped at this comment, but none more so than those in the United Kingdom. The truth of the matter is – and this may come as a shock to some – that the United States has never seen the Anglo-American relationship as special. Yes, there are cultural and linguistic commonalities, but when it comes to foreign policy, the United States’ view of Britain and Europe does not match that of an Anglo-American ‘special relationship’.

It would be fair to say that Winston Churchill’s consistent message of a Special Relationship between Great Britain and the United States has ingrained the phrase in the minds of most citizens of both countries. Nevertheless, from a governmental and policy position, it has traditionally been a one-sided relationship. American leaders have rarely used the phrase and even more rarely acted on it, to the point that former German Chancellor Helmut Schmidt is reported to have said that the ‘British claim to have a special relationship with the US, but if you mention this in Washington, no one knows what you are talking about.’ This idea was reinforced during the Brexit debates when US President Barack Obama stated that the UK would find itself at the back of the queue in US trade negotiations. The last fifty years provide a clearer understanding of how the US views the ‘Special Relationship.’

It would also be fair to say that since the end of the Second World War, US foreign policy has focused on a strong Europe. The ‘Special Relationship,’ as a purely Anglo-American relationship, is very much a British view. This does not mean that the US has not valued, or does not value, Britain. What is often forgotten, intentionally or not, is the importance of Europe to US foreign and trade policy since 1945. During the Second World War, the US and Britain, along with the Soviet Union, stood side-by-side to defeat the Axis. Once the war was over, and the Cold War began, the relationship between the US and Britain changed. What began as a strategic and military partnership during the Second World War quickly morphed into a relationship between two unequal partners. Despite Britain’s continually diminishing status, US presidents from Truman to Clinton understood the value of working with the British to meet US foreign policy goals.[1]

Nevertheless, US presidents have also focused on a strong Europe. Successive US presidents supported British involvement in different European projects. Dwight D. Eisenhower as Supreme Allied Commander Europe and later as President was firm in his belief that any plan to defend Europe required a British commitment to the continent. As such, he continually pushed Churchill, and later Eden and Macmillan, to take a more active role in NATO and the European Economic Community, which they eventually did.

The collapse and break-up of the Soviet Union in 1991 left US leaders believing they did not need multilateral alliances. The US was and is, after all, the lone superpower. Since this time, presidents from both parties have chosen to ‘go it alone.’ In the meantime, Britain failed to stop its slide away from world power status. True, London remains one of the world’s great financial centers, but as a nation Britain no longer has the military power to be more than a limited partner on the world stage. There is no more shocking example of how far Britain’s defense capabilities have fallen than the fact that the Royal Navy is now smaller than Pakistan’s navy and only slightly larger than Qatar’s, and the Royal Air Force is about the size of the Brazilian air force.[2]

Under George W. Bush and Barack Obama, it appeared that the US was moving closer to Germany as its leading partner in European issues. This was not a new position, per se, and it was not a result of Germany’s military prowess (it, too, is struggling to maintain a large and functioning force) but of its economic power. The US position since 1945 has been to forge a durable transatlantic link between the US and Europe.[3] At the beginning of the twenty-first century, Germany had the fourth-largest economy in the world, with a GDP more than $1 trillion larger than that of Britain. What is often overlooked in all of the discussion about America pulling closer to Germany and further away from Britain, or about the withdrawal of US troops from Germany, is Europe’s importance to the US.

A look at the Bank of England’s Quarterly Bulletin provides an idea of how important Europe is to the US relative to the UK: America’s most trusted trade partners are still the United Kingdom and Europe. As 2020 rolls towards its last quarter, Germany is feeling angst about its own special relationship with the US. While the US president drives that anxiety, a reversal of roles may be in the offing. With US politics becoming less reliable in recent years, Europe might decide to no longer rely on the US and ‘go it alone,’ just as the US did in the 1990s. However, with reports that Johnson’s government is secretly ‘desperate’ for a Biden victory in hopes of a revived comprehensive trade plan, the chances of a Europe without the US seem small. In light of Brexit, the UK might think about how the US has historically viewed the special relationship. For the US, the relationship that is and has always been special has been with Europe – a Europe that includes Britain.

Justin Quinn Olmstead is currently Associate Professor of History and Director of History Education at the University of Central Oklahoma, with a concurrent appointment as affiliate faculty in the College of Arts and Humanities at Swansea University, Wales, with responsibility for doctoral research supervision. He has edited two books, Reconsidering Peace and Patriotism during the First World War (Palgrave Macmillan, 2017) and Britain in the Islamic World: Imperial and Post-Imperial Connections (Palgrave Macmillan, 2017). Dr. Olmstead has also published The United States’ Entry into the First World War: The Role of British and German Diplomacy (Boydell & Brewer, 2018), and he contributed a chapter on the impact of military drones on foreign affairs to The Political Economy of Robots (Palgrave Macmillan, 2018). Currently, he is the Assistant Editor for The Middle Ground Journal, Treasurer and Director of Membership for Britain and the World, and president-elect of the Western Conference on British Studies. Justin undertook his PhD at the University of Sheffield — you can find him on Twitter @OlmsteadJustin

Cover image: NATO 3-cent 1952 U.S. stamp, issued at the White House on April 4, 1952, honored the North Atlantic Treaty Organization (NATO). https://commons.wikimedia.org/wiki/File:NATO_3c_1952_issue_U.S._stamp.jpg [Accessed 11 August 2020].

[1] Melvyn P. Leffler, A Preponderance of Power: National Security, the Truman Administration, and the Cold War (Stanford: Stanford University Press, 1992), p. 61.

[2] https://britainandtheworld.org/news/2020/6/4/batw-announces-a-virtual-roundtable

[3] Timothy Andrews Sayle, Enduring Alliance: A History of NATO and the Postwar Global Order (Ithaca: Cornell University Press, 2019), p. 3.


Delight, Dismay and Disbelief: Reactions to the Death of Hitler, 75 Years Ago


It is 75 years since Adolf Hitler committed suicide in his Berlin bunker. His death continues to generate considerable public interest thanks to both continuing forensic discoveries about his biological remains, and the persistence of outlandish tales of his postwar survival. While no serious historian believes in the latter, it is worth considering how confused reporting of Hitler’s fate in spring 1945 created a climate ripe for the flourishing of such legends.

The first formal declaration of Hitler’s death came late on the evening of 1 May 1945 via a radio broadcast by Grand Admiral Karl Dönitz. Sombre music and drum rolls gave way to the momentous announcement: ‘our Führer, Adolf Hitler, has fallen. In the deepest sorrow and respect, the German people bow’. It was, proclaimed Dönitz, a ‘hero’s death’, Hitler falling in battle while fighting valiantly against the ‘Bolshevik storm’.

‘Hitler Dead’ screamed countless international headlines the next day. The bold, dramatic and matter-of-fact statement left little room for ambiguity. Hitler had met his end, National Socialism was vanquished and the Second World War was effectively over. The Daily Herald printed a caricature of a burning Nazi emblem under the slogan ‘WAStika’. The cover of Time magazine simply struck Hitler’s face out with a large red cross.

The media’s response to Hitler’s passing was predominantly one of intense relief. ‘The whole building cheered’, recalled Karl Lehmann, a member of the BBC Monitoring unit. Numerous editorials depicted it as a moment of universal liberation – ‘a terrible scourge and force of evil has been removed’, declared the Lancashire Daily Post.[1] The sense of catharsis continued into the VE Day celebrations a few days later when the burning of Hitler’s effigy typically formed the high point of the UK’s festivities.

In the midst of this jubilation, however, there was widespread uncertainty about the precise cause of death. Dönitz’s talk of Hitler ‘falling’ in battle filled the first wave of international news reports, but many of the accompanying editorials urged caution about accepting this at face value. There was suspicion that either the Nazis were exaggerating the circumstances of his demise to foster a ‘Hitler legend’, or that they were peddling an entirely false narrative to distract from his retreat from the scene. Questioned on the matter during a White House press conference, President Harry S. Truman insisted that he had it ‘on the best authority possible’ that Hitler was, indeed, dead – but conceded there were no details yet as to how he died.

The press were right to question the death-in-battle scenario invented in the Dönitz broadcast. Stationed in Flensburg, over 270 miles away from the death scene, the Admiral was reliant upon information fed to him by colleagues in the Führerbunker, namely Propaganda Minister Joseph Goebbels and Head of the Party Chancellery Martin Bormann. The pair had already delayed sending definitive news of Hitler’s passing, prompting Dönitz to misdate the fatal moment to the afternoon of 1 May, rather than 30 April. They also neglected to supply details of what, exactly, had occurred, leaving Dönitz to fill in the gaps for himself. As it transpired, he was not the only person speculating on Hitler’s fate.

A United States-made propaganda forgery of a Nazi German stamp: the portrait of Hitler is rendered as a skull and, instead of “German Reich”, the stamp reads “Lost Reich”. Produced by Operation Cornflakes, U.S. Office of Strategic Services, circa 1942, https://commons.wikimedia.org/wiki/File:Futsches-Reich-Briefmarke-UK.jpg [accessed 29 April 2020]

The Western Allies, anxious to puncture martyrdom myths before they could take hold, swiftly countered Dönitz’s heroic imagery by reviving rumours of Hitler’s previously failing health. The Soviets, meanwhile, denounced reports of Hitler’s death as a ‘fascist trick’ to conceal his escape from Berlin. Even when reports of a Hitler suicide emerged from 3 May, debate continued as to whether the Nazi leader had shot himself or taken cyanide – poison being perceived by the Soviets as a particularly cowardly (and thus eminently appropriate) way out for Hitler.

What, though, did the general public make of all this? Within hours of the Dönitz broadcast, the New York Times and the social research organisation Mass Observation were gauging reactions across Manhattan and London respectively. At first, the news appeared anticlimactic; people who had longed for this moment felt disoriented, numb or empty now it was finally upon them. As the implications sunk in, Hitler’s death raised optimism that the war might finally be over, but dashed hopes that the public would see him brought to justice. ‘Too bad he’s dead’, mused one young New Yorker, ‘he should have been tortured’.[2]

The overwhelming reaction to news of Hitler’s demise, though, was one of disbelief. Some sceptics perceived the whole affair as a Nazi ruse, with Hitler just waiting to ‘pop out again when we aren’t looking’. Others foreshadowed modern-day accusations of ‘fake news’, directing their cynicism towards the contradictory explanations printed in the Allied press for Hitler’s demise. Mistrust of Nazi propaganda was also, understandably, common with one Londoner reflecting, ‘I don’t believe he died fighting. They just said that to make it seem more – you know – the way he’d have wanted people to think he died… I think personally he’s been out of the way for a long time now.’[3]

Ultimately, the competing versions of Hitler’s death ensured that the timing and cause of his demise became quite fluid within the public imagination. This, together with initial Soviet refusals to disclose the recovery of an identifiable corpse outside the bunker, created a vacuum in which all manner of rumours could take root. By contrast, the death of Benito Mussolini was commonly regarded with satisfaction because the deliberate display of his body rendered it an indisputable fact. It was only in 2000 that images of Hitler’s jaw (alongside a fragment of skull erroneously attributed to him) were publicly exhibited in Moscow, demonstrating how documenting the truth about his fate has proved a protracted process, and explaining why the Nazi leader has managed to remain so ‘alive’ in public discussion for all these years.

Caroline Sharples is Senior Lecturer in Modern European History at the University of Roehampton.  Her research focuses on memories of National Socialism, representations of the Holocaust and perpetrator commemoration. She is currently writing a cultural history of the death of Adolf Hitler. You can find her on Twitter @carol1ne_louise.

Cover image: Adolf Hitler, prior to 1945.

[1] Lancashire Daily Post, ‘Hitler’s Exit’ (2 May 1945), p.2.

[2] New York Times, ‘City Takes Report of Death in Stride’ (2 May 1945), p.9.

[3] Mass Observation Archive, University of Sussex, Topic Collection 49/1/1: ‘Hitler Indirects’, Hampstead, 2 May 1945.


Human Rights and the COVID-19 Lockdown


The speed with which we have given up some of our most basic rights and freedoms in the face of an incurable epidemic would be noteworthy, if it were not also such a cliché. Everyone has seen films in which the rights-bearing body of an individual becomes a disease-vector, and ultimately little more than toxic waste to be placed under rigorous cordon sanitaire, if not summarily obliterated. The mediocre Tom Clancy techno-thriller Executive Orders (1996) had the USA fight off a weaponised Ebola attack, with only conniving political opportunists moaning about rights, as the pragmatic authorities intoned the legal pabulum “the Constitution is not a suicide-pact!”[i]

Less entertainingly, it is also very nearly a truism of real-life commentary that the inequality with which “rights” are distributed in good times is multiplied in bad ones. While the virus itself may not discriminate, as we have been repeatedly advised, it seems to be having a disproportionate impact in the ethnic-minority communities of major Western nations, while the economic effects of lockdown are, of course, more violently traumatic the closer one is to the margins of society.

Human rights are supposedly universal and unconditional. But the protections they claim to offer have always proven flimsy and threadbare in practice. One reason for this is that the evolution of rights-language in the last three centuries is in fact frequently about two other things: firstly, an idea of grounded, foundational rectitude which has only partially shifted from theological to “scientific” underpinnings, and secondly, the doctrine of state sovereignty, historically entangled with the assertion of national identity. In the way they are used in practice in the world, “human rights” are frequently a cover for assertions and practices that entirely contradict their supposed premise of individual autonomy and security.

Human rights began their modern life as “natural rights”, an offshoot of centuries of European intellectual debate about the existence and contours of “natural law”. Understood, implicitly and explicitly, as a function of the fact of an ordered and purposive divine creation, and of the sovereign state as a component of such an order, rights retained their theological tinge very clearly into the Age of Enlightenment. The US Declaration of Independence invoked the “laws of nature and of nature’s God” as its foundation, spoke of the trinity of life, liberty and the pursuit of happiness as rights “endowed by their Creator” upon men, and appealed to “the Supreme Judge of the world” for validation. Thirteen years later, the French declared the “natural and imprescriptible rights of man” at the heart of a document they decreed to be proclaimed “in the presence and under the auspices of the Supreme Being”.

The French declaration of 1789 also placed the imagined rights-bearing individual in a complex and ultimately subordinated relationship to the other rising force of the era, in stating that “The principle of all sovereignty resides essentially in the nation”, and that “Law is the expression of the general will.” Across the declaration’s seventeen articles, although “every citizen” has the “right” to participate in lawmaking, the law itself – the encoded power of the nation-state – stands above anyone’s “liberty, property, security, and resistance to oppression” (the four enumerated natural rights).[ii]

The modern sovereign nation-state that increasingly took shape in the 1800s was built on claims of inherent superiority that displaced divinity with reason, but were no less, and sometimes more, discriminatory as a result. In France, even before the Revolution had transitioned into Napoleon’s dictatorship, the savants of the new National Institute had taken up the reins of scientific leadership dropped by the abolished royal academies of the old order. Alongside scholars of the sciences and literature, equal prominence was given to practitioners of the “moral and political sciences”.

One of the supposedly great truths that these scholars enunciated, for a country now explicitly referring to itself as “the Great Nation”, was that such a nation, while naturally superior to others, also contained many – multitudes indeed – who did not measure up, individually, to that greatness. France’s leading intellectuals quite deliberately defined the egalitarian republicanism to which they were sworn as something that required, in practice, a rigorous hierarchical division between the fully-enlightened and able elite, and the majority, still seeking to pull themselves out of the mire of the past, who could only expect to be led, gently but firmly, for the foreseeable future.

The legacy of the early nineteenth-century approach to the superiority of rational knowledge has been the creation of waves of ideological thinking, predicated on the foundational entitlement of those who know better to dominate and manipulate the common herd. Over the past two centuries, ideologies from fascism to Marxism-Leninism, via the imperial liberalism that dominated Anglo-American and French public life, have used claims about their superior understanding of past, present, and future to claim the right to forcibly remake humanity for the collective good, using the overwhelming power of the state.

When the founders of the United Nations produced a Universal Declaration of Human Rights in 1948, they proposed to endow all people with a remarkably wide-ranging set of entitlements. The first clause of Article 25 states:

Everyone has the right to a standard of living adequate for the health and well-being of himself and of his family, including food, clothing, housing and medical care and necessary social services, and the right to security in the event of unemployment, sickness, disability, widowhood, old age or other lack of livelihood in circumstances beyond his control.

A noble aim, perhaps, but also a staggering act of hypocrisy on the part of France and the UK, ruling over swathes of subjugated and impoverished empire, the USSR, winding up to launch the antisemitic persecution of the so-called “Doctors’ Plot”, and the USA, mired in racial segregation and discrimination. The ultimate paradox of the notion of individual “rights” is that, if they are violated by a higher power, only a yet-higher and more righteous power can set matters straight. It is easy to believe such a power can exist, much harder to identify it in practice.

The past six decades have seen repeated and ever more elaborate international covenants binding states to growing portfolios of rights that purportedly command respect. Yet, where are we? Half of the world’s ten largest countries – more than 3 billion people in those five states alone – are ruled by demagogues and autocrats.[iii] The UN’s “Human Rights Council”, founded in 2006, is a rotating talking-shop of forty-seven states which to date has never failed to include some of the world’s most notorious human-rights abusers in its membership.

Sitting in our homes, in a world which has, with the best intentions, summarily crushed many of our most fundamental everyday freedoms, we might legitimately wonder whether all discussion of “human rights” remains in the shadow of its pre-modern origins. We have, mostly, displaced the notion of divinely-ordained absolute sovereignty with more modern ideas, but we may well have given the sovereign nation and the state that embodies it almost as much power, while gaining in return little real regard for the individuals whose rights it supposedly protects.

David Andress is Professor of Modern History at the University of Portsmouth, and author of many works on the era of the French Revolution. He edited the Oxford Handbook of the French Revolution (2015), and has recently published Cultural Dementia (2018) and The French Revolution: a peasants’ revolt (2019).

Cover image: The universal declaration of human rights, 10 December 1948.

[i] The short-lived 2004 BBC show Crisis Command grimly demonstrated what might happen if a plague outbreak in the UK was not mercilessly stamped out, and to hell with rights.

[ii] According to the canonical text, the law may constrain liberty, in a whole number of ways, if behaviour troubles “the public order established by law”; it may overrule people’s own understanding of both security and resistance to oppression, for “any citizen summoned or arrested in virtue of the law shall submit without delay, as resistance constitutes an offence.” It may even, in the text’s final article, take away property, despite this being reiterated as “an inviolable and sacred right”, as long as due forms are followed and compensation paid. And what those are, of course, will be determined by the law.

[iii] In 2005 the UN invented the doctrine of a collective “Responsibility to Protect” human rights in other states. In 2015 the Saudi government invoked its “responsibility” to “protect the people of Yemen and its legitimate government” in launching the savage and near-genocidal campaign that continues to this day.
