

The Monarchy and the Next Great Depression


It has been an oft-quoted refrain since the coronavirus pandemic arrived in Europe: along with much of the rest of the world, Britain and the continent face a looming recession on a scale that hasn’t been witnessed since the 1930s. The first half of this inauspicious decade saw a collapse in overseas investment and profits, a rapid rise in unemployment, and yawning financial uncertainty for ordinary people.

Across the globe, the Great Depression also threw up challenges to democracies and some didn’t survive. The spectre of far-right nationalism, feeding on the misery of the masses, rose once again to undermine the spirit of international cooperation and optimism that had come to define the 1920s.

Britain’s political system, though, while certainly tested by the economic downturn, remained remarkably resilient to the kinds of forces that swept away Taisho Japan, Weimar Germany, and the Second Spanish Republic. British democracy – if it can be labelled as such – had been longer in the making and its political institutions were more robust than those of the aforementioned countries. But one organization often ignored by historians and political scientists, which played a key role in helping to maintain at least the appearance of order and stability in these difficult years, was the House of Windsor.

What exactly did the crown do and what might the current monarchy learn from the lessons of the 1930s in adjusting to a period that may one day be referred to as the Second Great Depression?

Beginning in the years immediately before the first world war, King George V and his courtiers carefully enlarged the sphere of royal altruism so that it touched more working-class people’s lives than ever before. This formed part of a conscious effort to promote social cohesion in a period marked by a surge in class conflict.

Royal philanthropy grew in importance on the home and western fronts between 1914 and 1918 and, in the wake of the economic slump that followed the Wall Street Crash of 1929, the Windsors increased their efforts to help those subjects who they deemed most in need of attention. For example, the royals set up relief funds for unemployed men and their families who had, often overnight, lost breadwinner wage packets.

Historian Frank Prochaska sees the 1930s as key to the emergence of what he terms a ‘welfare monarchy’. Since 1917, courtiers had worried about the allure of communism among what they perceived as a politically restless and unreliable proletariat. Driven by renewed fears of revolution in the early 1930s, the Windsors used philanthropy to cultivate closer ties with working-class communities in the hope that it would reduce feelings of disaffection and thus help to ensure the maintenance of the status quo.

Unfortunately for the current royal family, dealing with the crisis that lies ahead will require a more concerted effort from the UK’s central government, one that will likely leave little room for philanthropic endeavour. It is also imperative that the royals avoid entangling themselves with contentious policies that might otherwise undermine the monarchy’s claims to political impartiality.

One such area no longer deemed to be taboo or politically contentious (which the monarchy has thus leapt on) is Britain’s mental health. The 2008 economic crisis led to a programme of austerity which saw government reduce real-terms spending on mental health services, and into this gap popped Princes William and Harry.[1]

We can be sure that palace courtiers are already searching for similar gaps in the current government’s Covid-19 recovery programme that younger members of the royal family can look to fill through new kinds of public service, thus ensuring their own meaningful survival – as was the case in the 1930s.

It was the Great Depression that also led the king to deliver the first ever Christmas broadcast to his peoples in Britain and across the empire in 1932. Again, the aim was to offer words of reassurance and comfort at a time of great difficulty. And it seems that he largely succeeded: the concern George V communicated for his people in his radio messages strengthened the emotional bonds that connected many of them to him and this, in turn, ensured their loyalty to the throne upon which he sat, and to the royal democracy over which he presided.

Since the coronavirus arrived on these shores, we’ve already had two such messages from Elizabeth II, where she has sought to offer encouragement to her people and bring the British nation – however fleetingly – together as one.

As we move into what seems to be an increasingly uncertain future, we will hear much more from the Windsors as they attempt to invoke a spirit of national unity and togetherness. But at the same time the royals must ensure that such sentiments do not become banal through repetition, and that messages imploring solidarity do not ignore the inequalities separating the lives of the privileged from those of the ordinary people who will suffer most from joblessness, cuts in public spending, and tax increases.

Finally, the downturn of the 1930s saw George V and his kin take on more direct roles in trying to stabilize Britain’s economic and political systems. Younger royals carved out roles as trade emissaries promoting new economic relationships with South American countries while also acting as advocates for an older system of imperial preference.

There have recently been calls for the return of a royal yacht that could transport the Windsor family across the world so that they can help ‘Global Britain’ forge new trading relationships. Given the cost to the taxpayer, these suggestions will likely fall on deaf ears, but that is not to say that the royals cannot work to try to improve the nation’s economic prospects by greasing the wheels of international diplomacy. We can expect many more visits of foreign dignitaries to Buckingham Palace and trips by a royal contingent led by Prince Charles to regions of the world deemed strategically important to the UK’s trading future.

Perhaps the most significant step taken by George V during the Great Depression was when he controversially oversaw the creation of a National Government in 1931 in order to restore confidence in Britain’s shaky finances. He succeeded, but this event split the Labour Party and destroyed its electoral chances.

It seems unlikely that a twenty-first-century monarch would risk involving themselves in an episode as politically explosive as this; it is also doubtful whether they would even be able to, given the reduction in royal prerogative powers. But the last couple of years have taught us that we should never say never when it comes to British politics.

The UK’s uncodified constitution enables flexibility when it comes to the precise role played by the crown in affairs of state. If the monarch and their advisors were to arrive at the view that the government in power was no longer representing the interests of the public it was elected to serve, then it is possible to imagine that the palace could apply pressure on the leader of such an administration to step aside so that someone else might do a better job.

For now, we wait apprehensively to see how painful the coming recession will be, along with how many people’s livelihoods are destroyed as businesses close and the inevitable job losses follow. The monarchy has always had to search out new roles in order to justify its position in British society. While the next Great Depression will bring with it many challenges, it will also create opportunities for the House of Windsor to reinvent itself again as we move into a post-pandemic world.

Dr Ed Owens is a historian, royal commentator and public speaker. His recent publication, The Family Firm: Monarchy, Mass Media and the British Public, 1932-53, is the first book in the New Historical Perspectives series, a new publishing initiative for early career researchers in collaboration with the Royal Historical Society, the Institute for Historical Research and the University of London Press. For queries please contact edowens@live.com or tweet to @DrEdOwens.

Cover image: The royal Christmas broadcast became an annual tradition because King George V wanted to reach out to his people in new ways during the difficult years of the Great Depression. The King delivering his Christmas broadcast, 1934. https://en.wikipedia.org/wiki/George_V#/media/File:Royal_broadcast,_Christmas_1934_(Our_Generation,_1938).jpg [Accessed 12 June 2020].

[1] Not only did the princes speak more openly about their mental wellbeing, they also set up new initiatives and promoted the work of existing charities to help people in need. The strategy was twofold: keep the monarchy relevant to people’s current concerns; and plug a hole left by government. Britain’s mental health will worsen as the nation finds itself beset by another financial crisis. It remains to be seen whether the current government takes a more urgent interest in this area thus potentially rendering royal patronage obsolete, or whether it continues in the tradition of the post-2010 administrations that left the UK’s mental health crisis to be dealt with by a patchwork of underfunded charities and royal-led organizations.

 


The State and the Pandemic: Spain and the 1889-1890 Flu


COVID-19 has brought the so-called Spanish Flu of 1918 sharply into the collective consciousness, but it was not the first worldwide pandemic to be faced by the modern state.

In the winter of 1889, a new type of flu came to Europe. Although it had originated in China, it became known as ‘Russian Flu’ because, in November, newspapers – including those in Spain – reported that large numbers of people had fallen ill in St. Petersburg. It would take less than a month to reach Madrid.

With greatly improved transport links, unsurprisingly, it was suspected that the number of people travelling was responsible for its rapid spread. However, recent research has emphasised ‘that the important predictor of the speed of the pandemic is not the absolute numbers of passengers travelling between cities but the “connectedness” of the network of cities’.[1] In other words, it only took a small number of people to spread the flu so quickly across an increasingly interconnected continent.

There had been flu outbreaks in 1836/7 and 1848 but these were little remembered and, in 1889, the Spanish authorities were disastrously slow to react. Despite the press tracking its seemingly inevitable arrival, no preparations had been made. In fact, the flu had probably been circulating undetected for weeks before the government acknowledged it on 17 December. The consequences of this inaction are difficult to establish but, in a recent study, Sara García Ferrero suggests that 65% of all 6,180 deaths in Madrid in the nine weeks that followed can be attributed to the flu.[2] In Barcelona, as many as 52,000 caught the disease.[3]

Understanding of virology was in its infancy and early reports focussed on whether it was in fact flu or, perhaps, dengue fever. Even making allowance for this, official messaging was confused and, initially, the threat was played down. The Provincial Health Board of Madrid met the same afternoon as the government’s acknowledgment to discuss their response; it was remarkably sanguine. La Iberia reported that they had confirmed the presence of ‘a disease, with epidemic characteristics, of the flu or a severe cold, in a very benign form.’ This is particularly surprising considering that, for weeks, the newspapers had been carrying reports of the large numbers taken seriously ill elsewhere. Even more worrying, though, was their contradictory assertion that the ‘disease is not spread by contagion.’[4]

This may have been a deliberate attempt by state functionaries to manage the public reaction to the outbreak and there is further evidence of this phenomenon elsewhere. In Reus, for example, the authorities ordered that church bells no longer be rung for the dead to avoid spreading fear among the population.[5] It was, however, a difficult balance to strike. The endorsement of ‘cures’, such as ¡Pum! (Bang!) – a punch of rum and bitter orange – may have done more harm than good.

Some of the more concrete measures taken were also strikingly modern. Primary schools were closed and the Christmas holiday was extended for older students. A 250-bed field hospital was constructed at the present-day School of Engineering, off the Paseo de la Castellana in Madrid. What is particularly notable about these actions is that they were the same as those that had been taken elsewhere. Then, as now, there appeared to be an international consensus about the contours of state intervention. Nevertheless, although such intervention may have slowed the spread, it failed to stop it completely.

The authorities did nothing to limit public gatherings, perhaps for fear of economic damage, but this inaction still came at a cost. On 22 December, La Correspondencia de España reported that as many as 600 soldiers of the Madrid garrison had fallen ill. Despite this, there were signs that a type of social distancing was happening intuitively. People decided to avoid public spaces; streets, shops and cafés were largely deserted, and theatres closed (though only because of high levels of sickness among the performers).[6]

The longer-term, chronic impoverishment of the Spanish state meant that its capacity for a more exhaustive response was limited. Even the field hospital had to rely at least in part on private donations.[7]

The effects of the pandemic itself also significantly disrupted the provision of public services. Predictably, doctors were particularly vulnerable to catching the flu, but there were also high sickness rates among state officials. Paradoxically, though, some of this disruption served to limit the spread of the virus. Sickness rates among transport workers, for example, disrupted tram and railway services, involuntarily restricting the movement of people.

While these restrictions and relative wealth helped shield the middle class, the poor were disproportionately affected. This was plainly because of overcrowding and poor sanitation, but also because the state’s penetration was weakest in the most deprived areas. The measures the authorities introduced had little effect on the lives of the residents there. In a quandary with sad parallels today, many had little choice but to risk their health and continue to go out to work.

The flu of 1889-90 was nothing like as deadly as COVID-19, but there are remarkable similarities in the Spanish state’s response. Despite advances in understanding, most countries made similar early mistakes during the current pandemic to those Spain made then. In both cases, this can partly be explained by a lack of scientific knowledge about the threat, but most decisions are also political ones, with intended and unintended consequences.

Eventually the measures were lifted, but only late in January and only once the death rate had returned to normal. In 1890 the lessons had been learned; it remains to be seen whether they will be in 2020, and whether they will be remembered more enduringly this time.

Dan Royle is an historian of nineteenth-century Spain. His PhD at the University of Sheffield is on 1848.

Cover Image: Plaza Mayor (ca. 1890), Memoria de Madrid

[1] Alain-Jacques Valleron, ‘Transmissibility and geographic spread of the 1889 influenza pandemic’, in Proceedings of the National Academy of Sciences of the U.S.A. 107/19 (2010), pp.8778–8781.

[2] Sara García Ferrero, ‘La gripe de 1889-1890 en Madrid’, Ph.D. thesis (Universidad Complutense de Madrid, 2017), p.452.

[3] Bogumiła Kempińska-Mirosławska and Agnieszka Woźniak-Kosek, ‘The influenza epidemic of 1889–90 in selected European cities – a picture based on the reports of two Poznań daily newspapers from the second half of the nineteenth century’, in Medical Science Monitor 19 (2013), pp.1131–1141.

[4] ‘Noticias’, in La Iberia (18 December 1889), p.2.

[5] Quoted in Ferrero, ‘La gripe de 1889-1890’, p.38.

[6] La Correspondencia de España (22 December 1889), p.3; Ferrero, ‘La gripe de 1889-1890’, p.43.

[7] ‘Boletín sanitario’, in El Día (28 December 1889), p.1.


COVID-19, ‘Big Government’, and the Prohibition of Alcohol: Crisis as a Transnational Moment for Social Change


Throughout history, crises have often led to enormous social and economic reform as policymakers are forced to come up with new ways to meet unexpected demands. As Walter Scheidel argues in his book, The Great Leveller (2017), mass violence has been the primary impetus for the decline of inequality throughout world history, most recently with the Second World War serving as a watershed in relation to increased government spending on social programmes in many of its participating states. Although a crisis of a very different nature, the current coronavirus pandemic has also brought about similar shifts, with governments running huge budget deficits to protect jobs and counteract the threat of a looming recession caused by travel restrictions and lockdowns.

We also witness cases where governments experiment with creative solutions to crises that stretch across borders, as is the case with the current global pandemic. For a variety of reasons, a small handful of countries have resorted to banning the sale of intoxicants. One of the most debated aspects of South Africa’s lockdown has been its prohibition on the sale of alcohol and cigarettes, intended to reduce hospital admissions and secure beds for COVID-19 patients. Admissions have dropped by two-thirds due to reductions in alcohol-related violence and accidents, but such draconian measures have also meant the rise of black-market trade and the near-collapse of the country’s proud wine industry.

The sale of alcohol was also banned on the Caribbean island of Sint Maarten, a constituent country of the Netherlands, and in Nuuk, the capital of Greenland, over its role in exacerbating incidents of domestic violence that came with the lockdown. In Thailand, the prohibition on alcohol was put in place to prevent the spread of the virus at social gatherings. In each setting, such policies were deemed drastic but necessary, carefully implemented for their advantages in tackling a variety of health concerns whilst also considering their clear downsides.

Although the circumstances were entirely different, the First World War was also a moment when similarly harsh controls were imposed on the sale of alcohol across the world. France and Russia were the first to institute bans on absinthe and vodka, respectively, due to concerns over their impact on wartime efficiency. Countries in which anti-alcohol temperance movements were already influential also implemented tough restrictions of varying severity. Although the production and sale of alcohol had already been banned in different local jurisdictions in Canada and the United States, national prohibition came to fruition in both countries because of the war. Alcohol was not banned in Britain, but the country nevertheless instituted far-reaching controls on the distribution of drink under the Central Control Board (CCB), established in 1915 to enforce higher beverage duties and shorter closing hours in pubs.

In almost every instance, it was the context of the war that spurred the move towards instituting these tough restrictions. Temperance activists in North America had been pushing for a national prohibition for decades, but the conditions of the war, such as the rise of anti-German sentiment directed towards German-American breweries such as Anheuser-Busch, brought the federal implementation of prohibition to the forefront of national politics. In Britain, part of the CCB’s responsibility was the nationalisation of pubs and off-licences situated in parts of the country that were of strategic importance to the war effort.

These contexts directly parallel what we’re seeing in South Africa and Thailand, where extraordinary circumstances necessitated extraordinary countermeasures. However, there is also an important difference that must be stressed: while current lockdown prohibitions are merely temporary, most advocates of prohibitions and controls a century ago believed that such measures were to be permanent, based on their view that there were no advantages to permitting the existence of ‘demon drink’ in society. The ban on the distillation of vodka instituted under Imperial Russia in 1914 was maintained after the October Revolution and was not scrapped until after Lenin, himself an ardent prohibitionist, died in 1924. Yet, within the British context, the First World War effectively reshaped alcohol licensing for several generations, as high beverage duties and shorter opening hours were mostly preserved into the interwar and postwar eras.

These cases highlight the broader implications of social and economic reforms that are being implemented today. Right-wing governments in both Britain and Japan have approved record levels of government spending in the form of economic aid and stimulus. As Bernie Sanders ended his bid for the Democratic nomination in April 2020, politicians of both the left and the right debated the federal implementation of universal healthcare and paid sick leave in light of the public health crisis. Most recently, the Spanish government announced a €3 billion universal basic income scheme to stimulate the pandemic-hit economy through increased consumer spending. A columnist for The Washington Post was clearly onto something when he declared that ‘there are no libertarians in foxholes’.

It is, however, decidedly too early to predict the long-term impacts of COVID-19 and whether these will lead to what many hope to be a reversal of the neoliberal reforms that have dominated economics since the 1970s. One cannot forget that the ‘Keynesian Resurgence’ in stimulus spending during the Financial Crisis of 2007-08 was immediately followed by the tragedy of the Eurozone Crisis and the traumas of austerity measures that devastated the public sectors of Greece, Spain, Italy, Britain, and so on. Despite that, the impact of abrupt changes in undermining the status quo should not be underestimated, as we saw with the global ‘wave’ of alcohol prohibitions a century before. History, therefore, is an apt reminder of how crises are moments when ‘radical’ reforms that were previously only imagined can eventually become reality.

Ryosuke Yokoe is a historian of medicine, science, and public health, presently affiliated with the University of Sheffield as an honorary research fellow. He recently completed a PhD on the medical understandings of alcohol and liver disease in twentieth-century Britain. You can find him on Twitter @RyoYokoe1.

Cover image: Array of liquor bottles, courtesy of Angie Garrett, https://www.flickr.com/photos/smoorenburg/3312808594/ [accessed 28 May 2020].


Delight, Dismay and Disbelief: Reactions to the Death of Hitler, 75 Years Ago


It is 75 years since Adolf Hitler committed suicide in his Berlin bunker. His death continues to generate considerable public interest thanks to both continuing forensic discoveries about his biological remains, and the persistence of outlandish tales of his postwar survival. While no serious historian believes in the latter, it is worth considering how confused reporting of Hitler’s fate in spring 1945 created a climate ripe for the flourishing of such legends.

The first formal declaration of Hitler’s death came late on the evening of 1 May 1945 via a radio broadcast by Grand Admiral Karl Dönitz. Sombre music and drum rolls gave way to the momentous announcement: ‘our Führer, Adolf Hitler, has fallen. In the deepest sorrow and respect, the German people bow’. It was, proclaimed Dönitz, a ‘hero’s death’, Hitler falling in battle while fighting valiantly against the ‘Bolshevik storm’.

‘Hitler Dead’ screamed countless international headlines the next day. The bold, dramatic and matter-of-fact statement left little room for ambiguity. Hitler had met his end, National Socialism was vanquished and the Second World War was effectively over. The Daily Herald printed a caricature of a burning Nazi emblem under the slogan ‘WAStika’. The cover of Time magazine simply struck Hitler’s face out with a large red cross.

The media’s response to Hitler’s passing was predominantly one of intense relief. ‘The whole building cheered’, recalled Karl Lehmann, a member of the BBC Monitoring unit. Numerous editorials depicted it as a moment of universal liberation – ‘a terrible scourge and force of evil has been removed’, declared the Lancashire Daily Post.[1] The sense of catharsis continued into the VE Day celebrations a few days later when the burning of Hitler’s effigy typically formed the high point of the UK’s festivities.

In the midst of this jubilation, however, there was widespread uncertainty about the precise cause of death. Dönitz’s talk of Hitler ‘falling’ in battle filled the first wave of international news reports, but many of the accompanying editorials urged caution about accepting this at face value. There was suspicion that either the Nazis were exaggerating the circumstances of his demise to foster a ‘Hitler legend’, or that they were peddling an entirely false narrative to distract from his retreat from the scene. Questioned on the matter during a White House press conference, President Harry S. Truman insisted that he had it ‘on the best authority possible’ that Hitler was, indeed, dead – but conceded there were no details yet as to how he died.

The press were right to question the death-in-battle scenario invented in the Dönitz broadcast. Stationed in Flensburg, over 270 miles away from the death scene, the Admiral was reliant upon information fed to him by colleagues in the Führerbunker, namely Propaganda Minister Joseph Goebbels and Head of the Party Chancellery Martin Bormann. The pair had already delayed sending definitive news of Hitler’s passing, prompting Dönitz to misdate the fatal moment to the afternoon of 1 May, rather than 30 April. They also neglected to supply details of what, exactly, had occurred, leaving Dönitz to fill in the gaps for himself. As it transpired, he was not the only person speculating on Hitler’s fate.

A United States-made propaganda forgery of a Nazi German stamp. Hitler’s portrait is rendered as a skull; instead of “German Reich” the stamp reads “Lost Reich”. Produced by Operation Cornflakes, U.S. Office of Strategic Services, circa 1942, https://commons.wikimedia.org/wiki/File:Futsches-Reich-Briefmarke-UK.jpg [accessed 29 April 2020]

The Western Allies, anxious to puncture martyrdom myths before they could take hold, swiftly countered Dönitz’s heroic imagery by reviving rumours of Hitler’s previously failing health. The Soviets, meanwhile, denounced reports of Hitler’s death as a ‘fascist trick’ to conceal his escape from Berlin. Even when reports of a Hitler suicide emerged from 3 May, debate continued as to whether the Nazi leader had shot himself or taken cyanide – poison being perceived by the Soviets as a particularly cowardly (and thus eminently appropriate) way out for Hitler.

What, though, did the general public make of all this? Within hours of the Dönitz broadcast, the New York Times and the social research organisation Mass Observation were gauging reactions across Manhattan and London respectively. At first, the news appeared anticlimactic; people who had longed for this moment felt disoriented, numb or empty now it was finally upon them. As the implications sunk in, Hitler’s death raised optimism that the war might finally be over, but dashed hopes that the public would see him brought to justice. ‘Too bad he’s dead’, mused one young New Yorker, ‘he should have been tortured’.[2]

The overwhelming reaction to news of Hitler’s demise, though, was one of disbelief. Some sceptics perceived the whole affair as a Nazi ruse, with Hitler just waiting to ‘pop out again when we aren’t looking’. Others foreshadowed modern-day accusations of ‘fake news’, directing their cynicism towards the contradictory explanations printed in the Allied press for Hitler’s demise. Mistrust of Nazi propaganda was also, understandably, common, with one Londoner reflecting, ‘I don’t believe he died fighting. They just said that to make it seem more – you know – the way he’d have wanted people to think he died… I think personally he’s been out of the way for a long time now.’[3]

Ultimately, the competing versions of Hitler’s death ensured that the timing and cause of his demise became quite fluid within the public imagination. This, together with initial Soviet refusals to disclose the recovery of an identifiable corpse outside the bunker, created a vacuum in which all manner of rumours could take root. By contrast, the death of Benito Mussolini was commonly regarded with satisfaction because the deliberate display of his body rendered it an indisputable fact. It was only in 2000 that images of Hitler’s jaw (alongside a fragment of skull erroneously attributed to him) were publicly exhibited in Moscow, demonstrating how documenting the truth about his fate has proved a protracted process, and explaining why the Nazi leader has managed to remain so ‘alive’ in public discussion for all these years.

Caroline Sharples is Senior Lecturer in Modern European History at the University of Roehampton.  Her research focuses on memories of National Socialism, representations of the Holocaust and perpetrator commemoration. She is currently writing a cultural history of the death of Adolf Hitler. You can find her on Twitter @carol1ne_louise.

Cover image: Adolf Hitler, prior to 1945.

[1] Lancashire Daily Post, ‘Hitler’s Exit’ (2 May 1945), p.2.

[2] New York Times, ‘City Takes Report of Death in Stride’ (2 May 1945), p.9.

[3] Mass Observation Archive, University of Sussex, Topic Collection 49/1/1: ‘Hitler Indirects’, Hampstead, 2 May 1945.


Human Rights and the COVID-19 Lockdown


The speed with which we have given up some of our most basic rights and freedoms in the face of an incurable epidemic would be noteworthy, if it were not also such a cliché. Everyone has seen films in which the rights-bearing body of an individual becomes a disease-vector, and ultimately little more than toxic waste to be placed under rigorous cordon sanitaire, if not summarily obliterated. The mediocre Tom Clancy techno-thriller Executive Orders (1996) had the USA fight off a weaponised Ebola attack, with only conniving political opportunists moaning about rights, as the pragmatic authorities intoned the legal pabulum “the Constitution is not a suicide-pact!”[i]

Less entertainingly, it is also very nearly a truism of real-life commentary that the inequality with which “rights” are distributed in good times is multiplied in bad ones. While the virus itself may not discriminate, as we have been repeatedly advised, it seems to be having a disproportionate impact in the ethnic-minority communities of major Western nations, while the economic effects of lockdown are, of course, more violently traumatic the closer one is to the margins of society.

Human rights are supposedly universal and unconditional. But the protections they claim to offer have always proven flimsy and threadbare in practice. One reason for this is that the evolution of rights-language in the last three centuries is in fact frequently about two other things: firstly, an idea of grounded, foundational rectitude which has only partially shifted from theological to “scientific” underpinnings, and secondly, the doctrine of state sovereignty, historically entangled with the assertion of national identity. In the way they are used in practice in the world, “human rights” are frequently a cover for assertions and practices that entirely contradict their supposed premise of individual autonomy and security.

Human rights began their modern life as “natural rights”, an offshoot of centuries of European intellectual debate about the existence and contours of “natural law”. Understood, implicitly and explicitly, as a function of the fact of an ordered and purposive divine creation, and of the sovereign state as a component of such an order, rights retained their theological tinge very clearly into the Age of Enlightenment. The US Declaration of Independence invoked the “laws of nature and of nature’s God” as its foundation, spoke of the trinity of life, liberty and the pursuit of happiness as rights “endowed by their Creator” upon men, and appealed to “the Supreme Judge of the world” for validation. Thirteen years later, the French declared the “natural and imprescriptible rights of man” at the heart of a document they decreed to be proclaimed “in the presence and under the auspices of the Supreme Being”.

The French declaration of 1789 also placed the imagined rights-bearing individual in a complex and ultimately subordinated relationship to the other rising force of the era, in stating that “The principle of all sovereignty resides essentially in the nation”, and that “Law is the expression of the general will.” Across the declaration’s seventeen articles, although “every citizen” has the “right” to participate in lawmaking, the law itself – the encoded power of the nation-state – stands above anyone’s “liberty, property, security, and resistance to oppression” (the four enumerated natural rights).[ii]

The modern sovereign nation-state that increasingly took shape in the 1800s was built on claims of inherent superiority that displaced divinity with reason, but were no less, and sometimes more, discriminatory as a result. In France, even before the Revolution had transitioned into Napoleon’s dictatorship, the savants of the new National Institute had taken up the reins of scientific leadership dropped by the abolished royal academies of the old order. Alongside scholars of the sciences and literature, equal prominence was given to practitioners of the “moral and political sciences”.

One of the supposedly great truths that these scholars enunciated, for a country now explicitly referring to itself as “the Great Nation”, was that such a nation, while naturally superior to others, also contained many – multitudes indeed – who did not measure up, individually, to that greatness. France’s leading intellectuals quite deliberately defined the egalitarian republicanism to which they were sworn as something that required, in practice, a rigorous hierarchical division between the fully-enlightened and able elite, and the majority, still seeking to pull themselves out of the mire of the past, who could only expect to be led, gently but firmly, for the foreseeable future.

The legacy of the early nineteenth-century approach to the superiority of rational knowledge has been the creation of waves of ideological thinking, predicated on the foundational entitlement of those who know better to dominate and manipulate the common herd. Over the past two centuries, ideologies from fascism to Marxism-Leninism, via the imperial liberalism that dominated Anglo-American and French public life, have used claims about their superior understanding of past, present, and future to claim the right to forcibly remake humanity for the collective good, using the overwhelming power of the state.

When the founders of the United Nations produced a Universal Declaration of Human Rights in 1948, they proposed to endow all people with a remarkably wide-ranging set of entitlements. The first clause of Article 25 states:

Everyone has the right to a standard of living adequate for the health and well-being of himself and of his family, including food, clothing, housing and medical care and necessary social services, and the right to security in the event of unemployment, sickness, disability, widowhood, old age or other lack of livelihood in circumstances beyond his control.

A noble aim, perhaps, but also a staggering act of hypocrisy on the part of France and the UK, ruling over swathes of subjugated and impoverished empire, the USSR, winding up to launch the antisemitic persecution of the so-called “Doctors’ Plot”, and the USA, mired in racial segregation and discrimination. The ultimate paradox of the notion of individual “rights” is that, if they are violated by a higher power, only a yet-higher and more righteous power can set matters straight. It is easy to believe such a power can exist, much harder to identify it in practice.

The past six decades have seen repeated and ever-more elaborate forms of international covenants binding states to increasing portfolios of rights that purport to demand respect. Yet, where are we? Half of the world’s ten largest countries – more than 3 billion people in those five states alone – are ruled by demagogues and autocrats.[iii] The UN’s “Human Rights Council”, founded in 2006, is a rotating talking-shop of forty-seven states which to date has never failed to include some of the world’s most notorious human-rights abusers in its membership.

Sitting in our homes, in a world which has, with the best intentions, summarily crushed many of our most fundamental everyday freedoms, we might legitimately wonder whether all discussion of “human rights” remains in the shadow of its pre-modern origins. We have, mostly, displaced the notion of divinely-ordained absolute sovereignty with more modern ideas, but we may well have given the sovereign nation and the state that embodies it almost as much power, while gaining in return little real regard for the individuals whose rights it supposedly protects.

David Andress is Professor of Modern History at the University of Portsmouth, and author of many works on the era of the French Revolution. He edited the Oxford Handbook of the French Revolution (2015), and has recently published Cultural Dementia (2018) and The French Revolution: a peasants’ revolt (2019).

Cover image: The universal declaration of human rights, 10 December 1948.

[i] The short-lived 2004 BBC show Crisis Command grimly demonstrated what might happen if a plague outbreak in the UK was not mercilessly stamped out, and to hell with rights.

[ii] According to the canonical text, the law may constrain liberty, in a whole number of ways, if behaviour troubles “the public order established by law”; it may overrule people’s own understanding of both security and resistance to oppression, for “any citizen summoned or arrested in virtue of the law shall submit without delay, as resistance constitutes an offence.” It may even, in the text’s final article, take away property, despite this being reiterated as “an inviolable and sacred right”, as long as due forms are followed and compensation paid. And what those are, of course, will be determined by the law.

[iii] In 2005 the UN invented the doctrine of a collective “Responsibility to Protect” human rights in other states. In 2015 the Saudi government invoked its “responsibility” to “protect the people of Yemen and its legitimate government” in launching the savage and near-genocidal campaign that continues to this day.


Reading Between the Lines: What Can Testimonies of Former Slaves Tell Us about their Relationships with their Former Mistresses?


The testimonies of formerly enslaved women reveal a great deal about their experiences and the relationships they formed with their former white mistresses (a term used for female slaveholders in antebellum America). My SURE project, supervised by Rosie Knight, sought to compare the testimonies of formerly enslaved women in Virginia and South Carolina recorded in the WPA Slave Narratives Collection. Comparing these states reveals the varying factors that influenced slave-mistress relations, and the weight each held in doing so. These two regions contrasted greatly in a number of ways, including economic circumstances, slaveholding sizes and geographical location, which in turn influenced the relationships formed between enslaved women and their mistresses.

The WPA interviews have been a hotly debated source of testimony: they provide valuable insight into the experiences of formerly enslaved people from their own perspectives, but they were also heavily influenced by the context of the 1930s. Many participants were suffering in poverty during the Great Depression, which may have encouraged more nostalgic recollections of childhoods characterised by greater economic security.

Moreover, the rule of Jim Crow may have meant that participants were intimidated by their white interviewers, and indeed some expressed reluctance to say too much or ‘the worse’, as one interviewee put it. In cases such as these, their silences may be the most revealing aspect of their testimonies. From analysing these interviews, three key themes come to the fore: violence, material well-being and religion. However, the nature and extent of the influence of such factors were subject to regional variations.

The violence experienced by enslaved women was heavily dictated by regional circumstances, and greatly influenced both the relationships formed and the perceptions constructed of the mistress. Slaveholdings were generally smaller in Virginia than in South Carolina, meaning mistresses would often beat and whip slaves themselves, whereas on the larger slaveholdings of South Carolina, overseers usually inflicted violence upon slaves.

The personal dimension of such violence played a key role in shaping how mistresses were remembered by slaves later in life. For example, Henrietta King (VA) recalled the brutal violence she experienced at the hands of her mistress for stealing a peppermint candy when she was a child, explaining: “See dis face? See dis mouf all twist over here so’s I can’t shet it? See dat eye? All raid, aint it? … Well, ole Missus made dis face dis way.” She went on to describe her former mistress as “a common dog.”[1]

In contrast, former slaves in South Carolina tend to recall their former mistresses as justified in their violence toward them, and appear less resentful, perhaps influenced by the relatively good material conditions and religious teachings they were provided with. Victoria Adams, for example, recalled: “De massa and missus was good to me but sometime I was so bad they had to whip me.”[2]

The booming slave economy of South Carolina meant enslaved people often experienced better material conditions, and the larger size of slaveholdings meant enslaved people had greater opportunities to form stable family units and networks of kinship than in Virginia, where familial separation was common due to interstate slave-trading and the tendency for smaller slaveholdings. The better conditions in South Carolina may have led to less direct resistance, and thus less violence from their mistresses. Economic decline in Virginia meant slaves often lived in abhorrent living conditions and were provided little, if anything, to eat, which led to attempts to escape or steal food.

Such conditions shaped perceptions of former mistresses, as expressed by Henrietta King:  “In de house ole Missus was so stingymean dat she didn’t put enough on de table to feed a swaller.”[3] Such a testimony illustrates the ways in which the material conditions of slaves influenced their perceptions of their mistresses, both during their enslavement and retrospectively. Moreover, located further north, Virginia slaves were more likely to reach the free states, and so may have more readily engaged in direct resistance and efforts to escape.

In South Carolina, where conditions were better, interviewees tended to remember their former mistresses as domestic and motherly women. For example, Granny Cain described her mistress as “the best white woman I know of — just like a mother to me, wish I was with her now.”[4]

Viewing the nostalgic recollections of former slaves within the context of the Great Depression allows us to understand how interviewees may have recalled their experiences in slavery in survival terms, as a time in which they may have had greater economic security. Fear of bad-mouthing former slaveholders, again, may have also played a role in such recollections. Moreover, many interviewees were children during slavery, and so may have had less harsh experiences and fewer responsibilities than their mothers or older siblings.

Religion also proved to be a significant survival strategy in the experiences of enslaved women, both providing comfort and, in some cases, strengthening connections with their slaveholders. In Virginia, enslaved people appear to have received religious instruction mainly via the church and with little input from their mistress, while in South Carolina, religion and its instruction played a key role in slave-mistress relations. This led to enslaved people associating their mistress with what she taught — as pious, good and even a saviour in some cases. Josephine Stewart, for example, described one of her former mistresses as “a perfect angel, if dere ever was one on dis red earth.”[5]

The relationships formed between enslaved women and their mistresses can therefore be seen as greatly influenced by regional and economic variations across slaveholdings. The most important influences included: the violence enslaved people were subjected to, especially if this was at the hands of the mistress; the material well-being of slaves; and religious instruction. The variation of testimonies across the South points to the value of a comparative framework, signifying how experiences of enslaved women were not the same across the region and cannot be generalised. Understanding the influence regional variations had upon the experiences of enslaved people and the relationships they formed with their mistresses not only enables us to place these testimonies and their experiences in historical context, but also helps us avoid making generalisations about a topic so sensitive and complex.

Lydia Thomas is a final-year History undergraduate at the University of Sheffield. She completed the Sheffield Undergraduate Research Experience (SURE) researching the relationships formed between enslaved women and their white female slaveholders. She focused on antebellum Virginia and South Carolina to explore how variations in regional circumstances, such as economy and slaveholding size, influenced the relationships formed and testimonies of formerly enslaved women.

Cover image: A close up of an old map of the USA, featuring Virginia and South Carolina. https://unsplash.com/photos/HA0Rgl-ISko [Accessed 24 March 2020].

[1] Henrietta King cited in Charles L. Perdue, Jr., Thomas E. Bardon and Robert K. Phillips (eds), Weevils in the Wheat: Interviews with Virginia Ex-Slaves (Charlottesville, 1976), p. 190

[2] Victoria Adams, Federal Writers’ Project: Slave Narrative Project, South Carolina, 14.1, pp. 10-11

[3] Henrietta King cited in Charles L. Perdue, et al., Weevils in the Wheat, p. 190

[4] Granny Cain, Federal Writers’ Project: Slave Narrative Project, South Carolina, 14.1, p. 166

[5] Josephine Stewart, Federal Writers’ Project: Slave Narrative Project, South Carolina, 14.4, p. 152. It is important to reiterate the influence of the context on such testimonies — positive recollection may have been utilised as a means of avoiding conflict with interviewers; Mistresses also often utilised religious instruction as a form of manipulation and control, especially within the large slave-holdings of the low country, presenting themselves in a position of authority and as an agent in the salvation of the slaves
