The State and the Pandemic: Spain and the 1889-1890 Flu

COVID-19 has brought the so-called Spanish Flu of 1918 sharply into the collective consciousness, but it was not the first worldwide pandemic to be faced by the modern state.

In the winter of 1889, a new type of flu arrived in Europe. Although it had originated in China, it was dubbed ‘Russian Flu’ because, in November, newspapers – including those in Spain – reported that large numbers of people had fallen ill in St. Petersburg. It took less than a month to reach Madrid.

Given greatly improved transport links, it was unsurprisingly suspected that the sheer number of people travelling was responsible for the flu’s rapid spread. However, recent research has emphasised ‘that the important predictor of the speed of the pandemic is not the absolute numbers of passengers travelling between cities but the “connectedness” of the network of cities’.[1] In other words, it only took a small number of people to spread the flu so quickly across an increasingly interconnected continent.

There had been flu outbreaks in 1836–7 and 1848, but these were little remembered and, in 1889, the Spanish authorities were disastrously slow to react. Despite the press tracking its seemingly inevitable arrival, no preparations had been made. In fact, the flu had probably been circulating undetected for weeks before the government acknowledged it on 17 December. The consequences of this inaction are difficult to establish but, in a recent study, Sara García Ferrero suggests that 65% of the 6,180 deaths in Madrid over the nine weeks that followed can be attributed to the flu.[2] In Barcelona, as many as 52,000 caught the disease.[3]

Understanding of virology was in its infancy and early reports focussed on whether it was in fact flu or, perhaps, dengue fever. Even making allowances for this, official messaging was confused and, initially, the threat was played down. The Provincial Health Board of Madrid met the same afternoon as the government’s acknowledgment to discuss its response; its tone was remarkably sanguine. La Iberia reported that it had confirmed the presence of ‘a disease, with epidemic characteristics, of the flu or a severe cold, in a very benign form.’ This is particularly surprising considering that, for weeks, the newspapers had been carrying reports of the large numbers taken seriously ill elsewhere. Even more worrying, though, was the contradictory assertion that the ‘disease is not spread by contagion.’[4]

This may have been a deliberate attempt by state functionaries to manage the public reaction to the outbreak and there is further evidence of this phenomenon elsewhere. In Reus, for example, the authorities ordered that church bells no longer be rung for the dead to avoid spreading fear among the population.[5] It was, however, a difficult balance to strike. The endorsement of ‘cures’, such as ¡Pum! (Bang!) – a punch of rum and bitter orange – may have done more harm than good.

Some of the more concrete measures taken were also strikingly modern. Primary schools were closed and the Christmas holiday was extended for older students. A 250-bed field hospital was constructed at the present-day School of Engineering, off the Paseo de la Castellana in Madrid. What is particularly notable about these actions is that they were the same as those that had been taken elsewhere. Then, as now, there appeared to be an international consensus about the contours of state intervention. Nevertheless, although such intervention may have slowed the spread, it failed to stop it completely.

The authorities did nothing to limit public gatherings, perhaps for fear of economic damage, but this inaction still came at a cost. On 22 December, La Correspondencia de España reported that as many as 600 soldiers of the Madrid garrison had fallen ill. Despite this, there were signs that a form of social distancing was happening intuitively. People chose to avoid public spaces; streets, shops and cafés were largely deserted, and theatres closed (though only because of high levels of sickness among the performers).[6]

The longer-term, chronic impoverishment of the Spanish state meant that its capacity for a more exhaustive response was limited. Even the field hospital had to rely at least in part on private donations.[7]

The effects of the pandemic itself also significantly disrupted the provision of public services. Predictably, doctors were particularly vulnerable to catching the flu, but there were also high sickness rates among state officials. Paradoxically, though, some of this disruption served to limit the spread of the virus. Sickness rates among transport workers, for example, disrupted tram and railway services, involuntarily restricting the movement of people.

While these restrictions and relative wealth helped shield the middle class, the poor were disproportionately affected. This was partly because of overcrowding and poor sanitation, but also because the state’s penetration was weakest in the most deprived areas: the measures the authorities introduced had little effect on the lives of the residents there. In a quandary with sad parallels today, many had little choice but to risk their health and continue to go out to work.

The flu of 1889-90 was nothing like as deadly as COVID-19, but there are remarkable similarities in the Spanish state’s response. Despite advances in understanding, most countries made early mistakes during the current pandemic similar to those Spain made then. In both cases, this can partly be explained by a lack of scientific knowledge about the threat, but most decisions are also political ones, with intended and unintended consequences.

The measures were eventually lifted, but only late in January and only once the death rate had returned to normal. In 1890 the lessons had been learned; it remains to be seen whether they will be in 2020, and whether they will be remembered more enduringly this time.

Dan Royle is an historian of nineteenth-century Spain. His PhD at the University of Sheffield is on 1848.

Cover Image: Plaza Mayor (ca. 1890), Memoria de Madrid

[1] Alain-Jacques Valleron, ‘Transmissibility and geographic spread of the 1889 influenza pandemic’, in Proceedings of the National Academy of Sciences of the U.S.A. 107/19 (2010), pp.8778–8781.

[2] Sara García Ferrero, ‘La gripe de 1889-1890 en Madrid’, Ph.D. thesis (Universidad Complutense de Madrid, 2017), p.452.

[3] Bogumiła Kempińska-Mirosławska and Agnieszka Woźniak-Kosek, ‘The influenza epidemic of 1889–90 in selected European cities – a picture based on the reports of two Poznań daily newspapers from the second half of the nineteenth century’, in Medical Science Monitor 19 (2013), pp.1131–1141.

[4] ‘Noticias’, in La Iberia (18 December 1889), p.2.

[5] Quoted in García Ferrero, ‘La gripe de 1889-1890’, p.38.

[6] La Correspondencia de España (22 December 1889), p.3; García Ferrero, ‘La gripe de 1889-1890’, p.43.

[7] ‘Boletín sanitario’, in El Día (28 December 1889), p.1.

COVID-19, ‘Big Government’, and the Prohibition of Alcohol: Crisis as a Transnational Moment for Social Change

Throughout history, crises have often led to enormous social and economic reform as policymakers are forced to come up with new ways to meet unexpected demands. As Walter Scheidel argues in his book, The Great Leveller (2017), mass violence has been the primary impetus for the decline of inequality throughout world history, most recently with the Second World War serving as a watershed in relation to increased government spending on social programmes in many of its participating states. Although a crisis of a very different nature, the current coronavirus pandemic has also brought about similar shifts, with governments running huge budget deficits to protect jobs and counteract the threat of a looming recession caused by travel restrictions and lockdowns.

We are also witnessing governments experiment with creative solutions to crises that stretch across borders, as with the current global pandemic. For a variety of reasons, a handful of countries have resorted to banning the sale of intoxicants. One of the most debated aspects of South Africa’s lockdown has been its prohibition on the sale of alcohol and cigarettes, intended to reduce hospital admissions and secure beds for COVID-19 patients. Admissions have dropped by two-thirds thanks to reductions in alcohol-related violence and accidents, but such draconian measures have also meant the rise of a black-market trade and the near-collapse of the country’s proud wine industry.

The sale of alcohol was also banned on the Caribbean island of Sint Maarten, a constituent country of the Netherlands, and in Nuuk, the capital of Greenland, over its role in exacerbating incidents of domestic violence that came with the lockdown. In Thailand, the prohibition on alcohol was put in place to prevent the spread of the virus at social gatherings. In each setting, such policies were deemed drastic but necessary, implemented for their advantages in tackling a variety of health concerns while their clear downsides were also weighed.

Although they were instituted under entirely different circumstances, similarly harsh controls were imposed on the sale of alcohol across the world during the First World War. France and Russia were the first to institute bans on absinthe and vodka, respectively, due to concerns over their impact on wartime efficiency. Countries in which anti-alcohol temperance movements were already influential also implemented tough restrictions of varying severity. Although the production and sale of alcohol had already been banned in various local jurisdictions in Canada and the United States, national prohibition came to fruition in both countries because of the war. Alcohol was not banned in Britain, but the country nevertheless instituted far-reaching controls on the distribution of drink under the Central Control Board (CCB), established in 1915 to enforce higher beverage duties and shorter closing hours in pubs.

In almost every instance, it was the context of the war that spurred the move towards instituting these tough restrictions. Temperance activists in North America had been pushing for a national prohibition for decades, but the conditions of the war, such as the rise of anti-German sentiment directed towards German-American breweries such as Anheuser-Busch, brought the federal implementation of prohibition to the forefront of national politics. In Britain, part of the CCB’s responsibility was the nationalisation of pubs and off-licenses situated in parts of the country that were of strategic importance to the war effort.

These contexts directly parallel what we’re seeing in South Africa and Thailand, where extraordinary circumstances necessitated extraordinary countermeasures. However, there is also an important difference that must be stressed: while current lockdown prohibitions are merely temporary, most advocates of prohibitions and controls a century ago believed that such measures were to be permanent, based on their view that there were no advantages to permitting the existence of ‘demon drink’ in society. The ban on the distillation of vodka instituted under Imperial Russia in 1914 was maintained after the October Revolution and was not scrapped until after Lenin, himself an ardent prohibitionist, died in 1924. Yet, within the British context, the First World War effectively reshaped alcohol licensing for several generations, as high beverage duties and shorter opening hours were mostly preserved into the interwar and postwar eras.

These cases highlight the broader implications of the social and economic reforms being implemented today. Right-wing governments in both Britain and Japan have approved record levels of government spending in the form of economic aid and stimulus. As Bernie Sanders ended his bid for the Democratic nomination in April 2020, politicians of both the left and the right debated the federal implementation of universal healthcare and paid sick leave in light of the public health crisis. Most recently, the Spanish government announced a €3 billion universal basic income scheme to stimulate the pandemic-hit economy through increased consumer spending. A columnist for The Washington Post was clearly onto something when he declared that ‘there are no libertarians in foxholes’.

It is, however, decidedly too early to predict the long-term impacts of COVID-19 and whether these will lead to what many hope will be a reversal of the neoliberal reforms that have dominated economics since the 1970s. One cannot forget that the ‘Keynesian Resurgence’ in stimulus spending during the Financial Crisis of 2007-08 was immediately followed by the tragedy of the Eurozone Crisis and the traumas of austerity measures that devastated the public sectors of Greece, Spain, Italy, Britain, and others. Despite that, the impact of abrupt changes in undermining the status quo should not be underestimated, as we saw with the global ‘wave’ of alcohol prohibitions a century before. History, therefore, is an apt reminder that crises are moments when ‘radical’ reforms that were previously only imagined can become reality.

Ryosuke Yokoe is a historian of medicine, science, and public health, presently affiliated with the University of Sheffield as an honorary research fellow. He recently completed a PhD on the medical understandings of alcohol and liver disease in twentieth-century Britain. You can find him on Twitter @RyoYokoe1.

Cover image: Array of liquor bottles, courtesy of Angie Garrett, https://www.flickr.com/photos/smoorenburg/3312808594/ [accessed 28 May 2020].

Delight, Dismay and Disbelief: Reactions to the Death of Hitler, 75 Years Ago

It is 75 years since Adolf Hitler committed suicide in his Berlin bunker. His death continues to generate considerable public interest thanks to both continuing forensic discoveries about his biological remains, and the persistence of outlandish tales of his postwar survival. While no serious historian believes in the latter, it is worth considering how confused reporting of Hitler’s fate in spring 1945 created a climate ripe for the flourishing of such legends.

The first formal declaration of Hitler’s death came late on the evening of 1 May 1945 via a radio broadcast by Grand Admiral Karl Dönitz. Sombre music and drum rolls gave way to the momentous announcement: ‘our Führer, Adolf Hitler, has fallen. In the deepest sorrow and respect, the German people bow’. It was, proclaimed Dönitz, a ‘hero’s death’, Hitler falling in battle while fighting valiantly against the ‘Bolshevik storm’.

‘Hitler Dead’ screamed countless international headlines the next day. The bold, dramatic and matter-of-fact statement left little room for ambiguity. Hitler had met his end, National Socialism was vanquished and the Second World War was effectively over. The Daily Herald printed a caricature of a burning Nazi emblem under the slogan ‘WAStika’. The cover of Time magazine simply struck Hitler’s face out with a large red cross.

The media’s response to Hitler’s passing was predominantly one of intense relief. ‘The whole building cheered’, recalled Karl Lehmann, a member of the BBC Monitoring unit. Numerous editorials depicted it as a moment of universal liberation – ‘a terrible scourge and force of evil has been removed’, declared the Lancashire Daily Post.[1] The sense of catharsis continued into the VE Day celebrations a few days later when the burning of Hitler’s effigy typically formed the high point of the UK’s festivities.

In the midst of this jubilation, however, there was widespread uncertainty about the precise cause of death. Dönitz’s talk of Hitler ‘falling’ in battle filled the first wave of international news reports, but many of the accompanying editorials urged caution about accepting this at face value. There was suspicion that either the Nazis were exaggerating the circumstances of his demise to foster a ‘Hitler legend’, or that they were peddling an entirely false narrative to distract from his retreat from the scene. Questioned on the matter during a White House press conference, President Harry S. Truman insisted that he had it ‘on the best authority possible’ that Hitler was, indeed, dead – but conceded there were no details yet as to how he died.

The press were right to question the death-in-battle scenario invented in the Dönitz broadcast. Stationed in Flensburg, over 270 miles away from the death scene, the Admiral was reliant upon information fed to him by colleagues in the Führerbunker, namely Propaganda Minister Joseph Goebbels and Head of the Party Chancellery Martin Bormann. The pair had already delayed sending definitive news of Hitler’s passing, prompting Dönitz to misdate the fatal moment to the afternoon of 1 May, rather than 30 April. They also neglected to supply details of what, exactly, had occurred, leaving Dönitz to fill in the gaps for himself. As it transpired, he was not the only person speculating on Hitler’s fate.

A United States propaganda forgery of a Nazi German stamp: Hitler’s portrait is rendered as a skull and, instead of “German Reich”, the stamp reads “Lost Reich”. Produced by Operation Cornflakes, U.S. Office of Strategic Services, circa 1942, https://commons.wikimedia.org/wiki/File:Futsches-Reich-Briefmarke-UK.jpg [accessed 29 April 2020]

The Western Allies, anxious to puncture martyrdom myths before they could take hold, swiftly countered Dönitz’s heroic imagery by reviving rumours of Hitler’s previously failing health. The Soviets, meanwhile, denounced reports of Hitler’s death as a ‘fascist trick’ to conceal his escape from Berlin. Even when reports of a Hitler suicide emerged from 3 May, debate continued as to whether the Nazi leader had shot himself or taken cyanide – poison being perceived by the Soviets as a particularly cowardly (and thus eminently appropriate) way out for Hitler.

What, though, did the general public make of all this? Within hours of the Dönitz broadcast, the New York Times and the social research organisation Mass Observation were gauging reactions across Manhattan and London respectively. At first, the news appeared anticlimactic; people who had longed for this moment felt disoriented, numb or empty now it was finally upon them. As the implications sunk in, Hitler’s death raised optimism that the war might finally be over, but dashed hopes that the public would see him brought to justice. ‘Too bad he’s dead’, mused one young New Yorker, ‘he should have been tortured’.[2]

The overwhelming reaction to news of Hitler’s demise, though, was one of disbelief. Some sceptics perceived the whole affair as a Nazi ruse, with Hitler just waiting to ‘pop out again when we aren’t looking’. Others foreshadowed modern-day accusations of ‘fake news’, directing their cynicism towards the contradictory explanations for his death printed in the Allied press. Mistrust of Nazi propaganda was also, understandably, common, with one Londoner reflecting, ‘I don’t believe he died fighting. They just said that to make it seem more – you know – the way he’d have wanted people to think he died… I think personally he’s been out of the way for a long time now.’[3]

Ultimately, the competing versions of Hitler’s death ensured that the timing and cause of his demise became quite fluid within the public imagination. This, together with initial Soviet refusals to disclose the recovery of an identifiable corpse outside the bunker, created a vacuum in which all manner of rumours could take root. By contrast, the death of Benito Mussolini was commonly regarded with satisfaction because the deliberate display of his body rendered it an indisputable fact. It was only in 2000 that images of Hitler’s jaw (alongside a fragment of skull erroneously attributed to him) were publicly exhibited in Moscow, demonstrating how documenting the truth about his fate has proved a protracted process, and explaining why the Nazi leader has managed to remain so ‘alive’ in public discussion for all these years.

Caroline Sharples is Senior Lecturer in Modern European History at the University of Roehampton.  Her research focuses on memories of National Socialism, representations of the Holocaust and perpetrator commemoration. She is currently writing a cultural history of the death of Adolf Hitler. You can find her on Twitter @carol1ne_louise.

Cover image: Adolf Hitler, prior to 1945.

[1] Lancashire Daily Post, ‘Hitler’s Exit’ (2 May 1945), p.2.

[2] New York Times, ‘City Takes Report of Death in Stride’ (2 May 1945), p.9.

[3] Mass Observation Archive, University of Sussex, Topic Collection 49/1/1: ‘Hitler Indirects’, Hampstead, 2 May 1945.

Human Rights and the COVID-19 Lockdown

The speed with which we have given up some of our most basic rights and freedoms in the face of an incurable epidemic would be noteworthy, if it were not also such a cliché. Everyone has seen films in which the rights-bearing body of an individual becomes a disease-vector, and ultimately little more than toxic waste to be placed under rigorous cordon sanitaire, if not summarily obliterated. The mediocre Tom Clancy techno-thriller Executive Orders (1996) had the USA fight off a weaponised Ebola attack, with only conniving political opportunists moaning about rights, as the pragmatic authorities intoned the legal pabulum “the Constitution is not a suicide-pact!”[i]

Less entertainingly, it is also very nearly a truism of real-life commentary that the inequality with which “rights” are distributed in good times is multiplied in bad ones. While the virus itself may not discriminate, as we have been repeatedly advised, it seems to be having a disproportionate impact in the ethnic-minority communities of major Western nations, while the economic effects of lockdown are, of course, more violently traumatic the closer one is to the margins of society.

Human rights are supposedly universal and unconditional. But the protections they claim to offer have always proven flimsy and threadbare in practice. One reason for this is that the evolution of rights-language in the last three centuries is in fact frequently about two other things: firstly, an idea of grounded, foundational rectitude which has only partially shifted from theological to “scientific” underpinnings, and secondly, the doctrine of state sovereignty, historically entangled with the assertion of national identity. In the way they are used in practice in the world, “human rights” are frequently a cover for assertions and practices that entirely contradict their supposed premise of individual autonomy and security.

Human rights began their modern life as “natural rights”, an offshoot of centuries of European intellectual debate about the existence and contours of “natural law”. Understood, implicitly and explicitly, as a function of the fact of an ordered and purposive divine creation, and of the sovereign state as a component of such an order, rights retained their theological tinge very clearly into the Age of Enlightenment. The US Declaration of Independence invoked the “laws of nature and of nature’s God” as its foundation, spoke of the trinity of life, liberty and the pursuit of happiness as rights “endowed by their Creator” upon men, and appealed to “the Supreme Judge of the world” for validation. Thirteen years later, the French declared the “natural and imprescriptible rights of man” at the heart of a document they decreed to be proclaimed “in the presence and under the auspices of the Supreme Being”.

The French declaration of 1789 also placed the imagined rights-bearing individual in a complex and ultimately subordinated relationship to the other rising force of the era, in stating that “The principle of all sovereignty resides essentially in the nation”, and that “Law is the expression of the general will.” Across the declaration’s seventeen articles, although “every citizen” has the “right” to participate in lawmaking, the law itself – the encoded power of the nation-state – stands above anyone’s “liberty, property, security, and resistance to oppression” (the four enumerated natural rights).[ii]

The modern sovereign nation-state that increasingly took shape in the 1800s was built on claims of inherent superiority that displaced divinity with reason, but were no less, and sometimes more, discriminatory as a result. In France, even before the Revolution had transitioned into Napoleon’s dictatorship, the savants of the new National Institute had taken up the reins of scientific leadership dropped by the abolished royal academies of the old order. Alongside scholars of the sciences and literature, equal prominence was given to practitioners of the “moral and political sciences”.

One of the supposedly great truths that these scholars enunciated, for a country now explicitly referring to itself as “the Great Nation”, was that such a nation, while naturally superior to others, also contained many – multitudes indeed – who did not measure up, individually, to that greatness. France’s leading intellectuals quite deliberately defined the egalitarian republicanism to which they were sworn as something that required, in practice, a rigorous hierarchical division between the fully-enlightened and able elite, and the majority, still seeking to pull themselves out of the mire of the past, who could only expect to be led, gently but firmly, for the foreseeable future.

The legacy of the early nineteenth-century approach to the superiority of rational knowledge has been the creation of waves of ideological thinking, predicated on the foundational entitlement of those who know better to dominate and manipulate the common herd. Over the past two centuries, ideologies from fascism to Marxism-Leninism, via the imperial liberalism that dominated Anglo-American and French public life, have used claims about their superior understanding of past, present, and future to claim the right to forcibly remake humanity for the collective good, using the overwhelming power of the state.

When the founders of the United Nations produced a Universal Declaration of Human Rights in 1948, they proposed to endow all people with a remarkably wide-ranging set of entitlements. The first clause of Article 25 states:

Everyone has the right to a standard of living adequate for the health and well-being of himself and of his family, including food, clothing, housing and medical care and necessary social services, and the right to security in the event of unemployment, sickness, disability, widowhood, old age or other lack of livelihood in circumstances beyond his control.

A noble aim, perhaps, but also a staggering act of hypocrisy on the part of France and the UK, ruling over swathes of subjugated and impoverished empire, the USSR, winding up to launch the antisemitic persecution of the so-called “Doctors’ Plot”, and the USA, mired in racial segregation and discrimination. The ultimate paradox of the notion of individual “rights” is that, if they are violated by a higher power, only a yet-higher and more righteous power can set matters straight. It is easy to believe such a power can exist, much harder to identify it in practice.

The past six decades have seen repeated and ever-more elaborate forms of international covenants binding states to increasing portfolios of rights that purport to demand respect. Yet, where are we? Half of the world’s ten largest countries – more than 3 billion people in those five states alone – are ruled by demagogues and autocrats.[iii] The UN’s “Human Rights Council”, founded in 2006, is a rotating talking-shop of forty-seven states which to date has never failed to include some of the world’s most notorious human-rights abusers in its membership.

Sitting in our homes, in a world which has, with the best intentions, summarily crushed many of our most fundamental everyday freedoms, we might legitimately wonder whether all discussion of “human rights” remains in the shadow of its pre-modern origins. We have, mostly, displaced the notion of divinely-ordained absolute sovereignty with more modern ideas, but we may well have given the sovereign nation and the state that embodies it almost as much power, while gaining in return little real regard for the individuals whose rights it supposedly protects.

David Andress is Professor of Modern History at the University of Portsmouth, and author of many works on the era of the French Revolution. He edited the Oxford Handbook of the French Revolution (2015), and has recently published Cultural Dementia (2018) and The French Revolution: a peasants’ revolt (2019).

Cover image: The universal declaration of human rights, 10 December 1948.

[i] The short-lived 2004 BBC show Crisis Command grimly demonstrated what might happen if a plague outbreak in the UK was not mercilessly stamped out, and to hell with rights.

[ii] According to the canonical text, the law may constrain liberty, in a whole number of ways, if behaviour troubles “the public order established by law”; it may overrule people’s own understanding of both security and resistance to oppression, for “any citizen summoned or arrested in virtue of the law shall submit without delay, as resistance constitutes an offence.” It may even, in the text’s final article, take away property, despite this being reiterated as “an inviolable and sacred right”, as long as due forms are followed and compensation paid. And what those are, of course, will be determined by the law.

[iii] In 2005 the UN invented the doctrine of a collective “Responsibility to Protect” human rights in other states. In 2015 the Saudi government invoked its “responsibility” to “protect the people of Yemen and its legitimate government” in launching the savage and near-genocidal campaign that continues to this day.

Reading Between the Lines: What Can Testimonies of Former Slaves Tell Us about their Relationships with their Former Mistresses?

The testimonies of formerly enslaved women reveal a great deal about their experiences and the relationships they formed with their former white mistresses (a term used for female slaveholders in antebellum America). My SURE project, supervised by Rosie Knight, sought to compare the testimonies of formerly enslaved women in Virginia and South Carolina recorded in the WPA Slave Narratives Collection. Comparing these states reveals the varying factors that influenced slave-mistress relations, and the relative weight each carried. The two regions contrasted greatly in a number of ways, including economic circumstances, slaveholding sizes and geographical position, which in turn influenced the relationships formed between enslaved women and their mistresses.

The WPA interviews have been a hotly debated source of testimony, providing valuable insight into the experiences of formerly enslaved people from their own perspectives, but also heavily influenced by the context of the 1930s. Many participants were suffering in poverty during the Great Depression, which may have encouraged more nostalgic recollections of childhoods characterised by greater economic security.

Moreover, Jim Crow rule may have meant participants were intimidated by their white interviewers; some indeed expressed reluctance to say too much or ‘the worse’, as one interviewee put it. In cases such as these, their silences may be the most revealing aspect of their testimonies. From analysing these interviews, three key themes come to the fore: violence, material well-being and religion. However, the nature and extent of their influence was subject to regional variation.

The violence experienced by enslaved women was heavily dictated by regional circumstances, and greatly influenced both the relationships formed and the perceptions constructed of the mistress. Slaveholdings were generally smaller in Virginia than in South Carolina, meaning mistresses would often beat and whip slaves themselves, whereas on the larger slaveholdings of South Carolina, overseers usually inflicted the violence.

The personal dimension of such violence played a key role in shaping how mistresses were remembered by slaves later in life. For example, Henrietta King (VA) recalled the brutal violence she experienced at the hands of her mistress for stealing a peppermint candy when she was a child, explaining: “See dis face? See dis mouf all twist over here so’s I can’t shet it? See dat eye? All raid, aint it? … Well, ole Missus made dis face dis way.” She went on to describe her former mistress as “a common dog.”[1]

In contrast, recollections of former slaves in South Carolina tend to recall their former mistresses as justified in their violence toward them, and appear less resentful, perhaps influenced by the relatively good material conditions and religious teachings they were provided. Victoria Adams, for example, recalled: “De massa and missus was good to me but sometime I was so bad they had to whip me.”[2]

The booming slave economy of South Carolina meant enslaved people often experienced better material conditions, and the larger slaveholdings gave them greater opportunities to form stable family units and networks of kinship than in Virginia, where familial separation was common due to interstate slave-trading and the tendency towards smaller slaveholdings. The better conditions in South Carolina may have led to less direct resistance, and thus less violence from mistresses. Economic decline in Virginia meant slaves often lived in abhorrent conditions and were provided little, if anything, to eat, which led to attempts to escape or steal food.

Such conditions shaped perceptions of former mistresses, as expressed by Henrietta King:  “In de house ole Missus was so stingymean dat she didn’t put enough on de table to feed a swaller.”[3] Such a testimony illustrates the ways in which the material conditions of slaves influenced their perceptions of their mistresses, both during their enslavement and retrospectively. Moreover, located further north, Virginia slaves were more likely to reach the free states, and so may have more readily engaged in direct resistance and efforts to escape.

In South Carolina, where conditions were better, interviewees tended to remember their former mistresses as domestic and motherly women. For example, Granny Cain described her mistress as “the best white woman I know of — just like a mother to me, wish I was with her now.”[4]

Viewing the nostalgic recollections of former slaves within the context of the Great Depression allows us to understand how interviewees may have recalled their experiences in slavery in survival terms, as a time in which they may have had greater economic security. Fear of bad-mouthing former slaveholders, again, may have also played a role in such recollections. Moreover, many interviewees were children during slavery, and so may have had better experiences and fewer responsibilities than their mothers or older siblings.

Religion also proved to be a significant survival strategy in the experiences of enslaved women, both providing comfort and, in some cases, strengthening connections with their slaveholders. In Virginia, enslaved people appear to have received religious instruction mainly via the church and with little input from their mistress, while in South Carolina, religion and its instruction played a key role in slave-mistress relations. This led to enslaved people associating their mistress with what she taught — as pious, good and even a saviour in some cases. Josephine Stewart, for example, described one of her former mistresses as “a perfect angel, if dere ever was one on dis red earth.”[5]

The relationships formed between enslaved women and their mistresses can therefore be seen as greatly influenced by regional and economic variations across slaveholdings. The most important influences included: the violence enslaved people were subjected to, especially if this was at the hands of the mistress; the material well-being of slaves; and religious instruction. The variation of testimonies across the South points to the value of a comparative framework, signifying how experiences of enslaved women were not the same across the region and cannot be generalised. Understanding the influence regional variations had upon the experiences of enslaved people and the relationships they formed with their mistresses not only enables us to place these testimonies and their experiences in historical context, but also helps us avoid making generalisations about a topic so sensitive and complex.

Lydia Thomas is a final-year History undergraduate at the University of Sheffield. She completed the Sheffield Undergraduate Research Experience (SURE) researching the relationships formed between enslaved women and their white female slaveholders. She focused on antebellum Virginia and South Carolina to explore how variations in regional circumstances, such as economy and slaveholding size, influenced the relationships formed and testimonies of formerly enslaved women.

Cover image: A close up of an old map of the USA, featuring Virginia and South Carolina. https://unsplash.com/photos/HA0Rgl-ISko [Accessed 24 March 2020].

[1] Henrietta King cited in Charles L. Perdue, Jr., Thomas E. Bardon and Robert K. Phillips (eds), Weevils in the Wheat: Interviews with Virginia Ex-Slaves (Charlottesville, 1976), p. 190.

[2] Victoria Adams, Federal Writers’ Project: Slave Narrative Project, South Carolina, 14.1, pp. 10-11.

[3] Henrietta King cited in Charles L. Perdue, et al., Weevils in the Wheat, p. 190.

[4] Granny Cain, Federal Writers’ Project: Slave Narrative Project, South Carolina, 14.1, p. 166.

[5] Josephine Stewart, Federal Writers’ Project: Slave Narrative Project, South Carolina, 14.4, p. 152. It is important to reiterate the influence of the context on such testimonies: positive recollection may have been utilised as a means of avoiding conflict with interviewers. Mistresses also often utilised religious instruction as a form of manipulation and control, especially within the large slaveholdings of the low country, presenting themselves in a position of authority and as an agent in the salvation of the slaves.

‘The Great Australian Silence’: Sexual Violence in Australian History

Like many settler colonies with evolving frontiers, Australia has had a continuous undercurrent of sexual violence running through its history. From the first establishment of European settlements, forced sexual relations perpetrated by white settlers have remained relatively unspoken about in recollections of the Australian frontier experience, regardless of the victim’s race.

The term ‘the Great Australian Silence’ was first coined in a 1968 lecture delivered by anthropologist W.E.H. Stanner. Stanner utilised the term to address the manner in which certain critical areas of Indigenous and non-Indigenous history, including invasion, dispossession and massacres, had generally been ignored by Australian historians as part of a long-term structural trend, otherwise known as the ‘cult of forgetfulness’.[1]

Although scholarship has evolved over the past two decades to address certain aspects of ‘the Great Australian Silence’ – a silence which undeniably excludes or minimises the prevalence of sexual violence perpetrated by white settlers, predominantly against Aboriginal women – it has centred on massacres, genocide and child removal, with no substantial historiography of sexual violence.

Consequently, it has been historically-set works of fiction that have been most effective in drawing public and academic attention to the relationship between the frontier, frontier violence and sexual violence. These include the efforts of John Hillcoat and Kim Scott, whose works The Proposition and Benang: From the Heart will be briefly examined in this post, as well as the works of other contemporaries such as Kate Grenville (The Secret River, 2005) and Phillip Noyce (Rabbit-Proof Fence, 2002).

Although Scott and Hillcoat investigate these ideas in slightly different contexts, namely sexual violence towards white women in nineteenth-century frontier Queensland in Hillcoat’s The Proposition, and sexual violence towards Aboriginal women in Western Australia from European arrival through to the twentieth century in Scott’s Benang, they both attempt to highlight sexual violence as intrinsic to the frontier experience.

These two texts, when compared, emphasise differing aspects of colonial sexual violence. Hillcoat, in depicting the raped white colonial woman, presents sexual violence as a threat to the ideal of white nationhood; whereas Scott, in showing interracial sexual violence between settlers and Indigenous women, presents sexual violence as necessary for the survival of the white Australian nation.

In The Proposition, sexual violence is a vital and indivisible aspect of the film; indeed, “women’s bodies, or the violation of white women’s bodies to be exact, are called upon as both the motivation and means of resolving the proposition propelling the film”.[2]

The crime that motivates the proposition driving the film is especially horrific as it involves the rape and murder of the pregnant Eliza Hopkins, who embodied the future of the white nation. Furthermore, the place of sexual violence in relation to the frontier is emphasised in the penultimate scene at the Stanley homestead, in which Martha, wife of Constable Stanley, is the victim of an attempted rape.

In this regard, Hillcoat draws substantial attention to the place of sexual violence against white women on the Australian frontier. In comparing The Proposition and Benang, the role of race is important to note, and here both creators offer a nuanced insight into how sexual violence is presented in colonial Australia depending on the race of the violated woman. Rape is deemed a crime in The Proposition, arguably the worst crime that can be committed in such a society, whereas in Benang it is either an unacknowledged, un-criminalised consequence of the wider, also unacknowledged, crime of mass murder, or merely taken as an accepted aspect of colonisation.

The sexual violence against Indigenous women in Benang does not serve to drive the plot of the novel; instead, it supplements and further highlights the violence faced by the Nyoongar people under white settlement. Furthermore, Scott highlights how sexual violence is intrinsic to other brutal and silenced aspects of colonisation, namely the eugenicist ideals held by figures such as A. O. Neville, which subsequently motivated the mass removal of Indigenous children.

The most prominent instances of rape are committed by Ern Scat, a Scotsman who legitimises his constant rape of his two Nyoongar wives as part of his eugenicist attempts to “breed out the colour”. For Scott, sexual violence and the expression of colonial hegemonic masculinity are depicted as a necessary part of colonisation, via the likening of the bodies of Aboriginal women to the land they are dispossessed from.

Indeed, Ern’s first experience of the Aboriginal camps is a memory overwhelmed by sexual violence, as he remembers “the first night. The dirt on his bare knees, and how she turned her head away as her body took his thrusts”. Shifting between Sandy One’s mother being the product of rape, to the intrinsic place of rape after the massacre of Indigenous groups, through to Ern’s exploits, Benang details how sexual violence towards Aboriginal women is a continual and substantial feature of Australian history.

In comparing Hillcoat’s The Proposition and Scott’s Benang, one can see how historically-set texts have been vital in attempting to address the national silence regarding the place of sexual violence in Australian history.

It is worth noting that these examinations of sexual violence are done from the perspective of male creatives, and although they are successful in opening dialogue about ‘the Great Australian Silence’ regarding sexual violence in the history of the Australian frontier, texts by women, particularly Indigenous women, could offer further insights and perspectives into the relationship between sexual violence and Australian history.

Yet undeniably, Hillcoat and Scott both succeed in starting to challenge the silence and unspeakability regarding historical sexual violence in Australia, and thus offer a foundation for further discussion and research from a myriad of different perspectives. Ultimately, both texts work to render sexual violence in Australian history speakable, as it should be.[3]

Zoe Smith is a history and literature student at the Australian National University, with a specialisation in gender history and feminist theory. Having just completed a semester of study with the University of Sheffield History Department, she will be completing her third year of study this year, with full intentions of doing further research into sexual violence on the Australian frontier via an honours thesis and a PhD. You can find Zoe on Twitter @ZoeASmith4

Cover image: View of Millstream-Chichester National Park, Australia. The barren landscape is suggestive of the cultural silence discussed in the blog. Courtesy of Gypsy Denise. https://commons.wikimedia.org/wiki/File:Millstream_National_Park,_Pilbara,_Western_Australia.jpg [Accessed 4 February 2020].

[1] For more information on Stanner and the ‘Great Australian Silence’, see Andrew Gunstone, ‘Reconciliation and “The Great Australian Silence”’ in R. Eccleston, N. Sageman, and F. Gray (eds), The Refereed Proceedings of the 2012 Australian Political Studies Association Conference, (Melbourne, 2012).

[2] Tanya Dalziell, ‘Gunpowder and Gardens: Reading Women in The Proposition’, Studies in Australasian Cinema, 3.1 (2009), p. 122.

[3] The ideas and research presented in this blog post are featured in and further extended upon in an upcoming article due to be published in March by the Australian National University Undergraduate Research Journal. Interested readers will be able to access the article here: http://studentjournals.anu.edu.au/index.php/aurj
