
British History

Churchill and the Prof: Putting ‘declinism’ at the heart of the system


In October 1951, after a six-year absence, Winston Churchill was returned to office as British Prime Minister at the head of a Conservative government. His earlier premiership, for which he is most famous, had been at the head of a wartime coalition, appointed rather than elected. His 1951-1955 term – which Anthony Seldon once described as his ‘Indian Summer’ – is less familiar to a contemporary audience.

Yet Churchill’s second term nonetheless played a key role in shaping the contemporary political landscape, not least in terms of the politics of ‘declinism’ – the school of commentary and historical writing that constructs Britain’s story in terms of its ‘decline’ as a world power, often for highly politicised reasons. It is almost a banality to note that Britain’s status in international affairs was a preeminent concern for Churchill; in the course of 1951-1955 – four years otherwise seen by many historians as unremarkable in political terms – the questions of global power and of science and technology policy would, for Churchill, become inextricably linked.

On his return to office, Churchill invited his old friend and ally Frederick Lindemann – by now Lord Cherwell – to resume the ministerial office of Paymaster-General which he had held during the Second World War. Cherwell, as much as Churchill, is a controversial figure: serving as Churchill’s scientific adviser during the Second World War, his calculations, as Madhusree Mukerjee has shown, played a key role in the Bengal Famine of 1943.

In the 1950s, Cherwell – an eminent physicist and Oxford professor – used his government and academic roles to lobby strongly for a British equivalent to the Massachusetts Institute of Technology (MIT). This was anchored in his conviction that Britain’s great power status was contingent on a radical reshaping of its science and technology policy – and of its education system. Himself a graduate of the Berlin Technical University, where he had studied under Walther Nernst, he was dismissive of much of British scientific and technological education.

He consistently used his relationship with Churchill to push for reform – often against the tide of broader intellectual opinion. Academic thinking on curricula in post-war Britain was increasingly focused on interdisciplinarity, partly in reaction to what was seen as ‘excessive specialisation’ (and thus, so the argument ran, moral bankruptcy) in the German institutions to which Britain’s civic universities in particular were indebted. Cherwell, however, was unrepentant in his defence of such institutions. In a 1952 memorandum to the Prime Minister, he argued that it was ‘essential to train a new race of technologists to effect a [‘minor’ pencilled out] revolution in our industrial outlook, because it is vital for our survival that productivity should rapidly increase’. To this end, Cherwell wanted three new technological universities.

He was an advocate of what C. P. Snow would come to call the ‘two cultures’ thesis of a rift between the Arts and the Sciences. In a debate in the House of Lords in 1957, shortly before he died and some years out of office, Cherwell decried the lack of basic scientific knowledge he found amongst ‘Arts men’, and argued pointedly that a lack of ‘technologists’ placed Britain at a disadvantage to Russia.

Cherwell had made the same arguments for years; in a typescript of his notes for a speech at a conference in 1950, the same themes can be found. In 1954, after he had left office but whilst Churchill was still Prime Minister, he had – in Churchill’s words – ‘warned that…the Russians were getting ahead not only of us, but of the Americans’. Churchill recounted this in a letter to Harold Macmillan during the latter’s tenure as Prime Minister, pleading for more spending on technological education.

These warnings had an impact on Churchill, both in office and out of it. In 1954, Churchill appointed Sir David Eccles as Minister of Education, the first minister ‘to assume educational expenditure was economic investment’, in the words of former civil servant Maurice Kogan. In his first meeting with his Parliamentary Education Committee, Eccles argued that ‘education was the basis of the defence of freedom.’

For Eccles, taking office towards the end of Churchill’s term, the link between education, economic growth and, in turn, global power was axiomatic. Under Churchill’s successor, Sir Anthony Eden, Eccles would promote a ‘public sector’ of higher education – the Colleges of Advanced Technology – which would be advertised in emphatically geopolitical terms.

Churchill, in his retirement, would spend time lobbying for the Churchill Technological Trust, which sought to build a College at Cambridge in his name, intended to focus on scientific and technological education. Cherwell’s influence on him – the duo had been dubbed ‘Churchill and the Prof’ – had been profound. Their partnership in the 1950s, building on their wartime collaboration, helped construct educational expansion in Britain in ‘techno-nationalist’ terms, to borrow David Edgerton’s phrase. They were not alone in this, but the peculiar emphases in STEM discourses in contemporary British politics owe at least a little to the crystallisation of declinist ideas about scientific and technological education which took place during, and after, Churchill’s peacetime government.

These ideas – equating science and technology with British power and the possibilities of influence – live on, and offer a distinctively British framing to broader debates over higher education policy, marketisation, neoliberalism and techno-nationalism which span party lines and, as Edgerton has shown, illuminate broader questions as to the nature of the British ‘nation’ in the post-war era.

Dr Mike Finn is Senior Lecturer in History and Director of Liberal Arts at the University of Exeter. A historian of modern and contemporary British history, his doctoral research focused on the political economy of higher education in post-war England. His most recent book is British Universities in the Brexit Moment: Political, economic and cultural implications (2018).

 


Jonas Salk turns 105: Some thoughts on lessons from history


Jonas Salk would have been 105 today, 28 October. He is remembered as the inventor of the polio vaccine who, when asked how much money he stood to make, declared: ‘There is no patent. Could you patent the sun?’

Of course, “it’s more complicated than that”. Salk was part of a multi-national, multi-agency project to develop prophylactics. Without the use of “his” injectable vaccine and the oral vaccine developed by rivals on the other side of the iron curtain, humanity would not be on the verge of eliminating polio. (For more on that story, see the excellent book by Dora Vargha.)[1] And one of the reasons Salk didn’t patent the vaccine was that it was unpatentable.

But let’s not be uncharitably pedantic. It is, after all, his birthday.

In the wake of recent reports of resurgent infectious diseases – including polio – vaccination is back in the news. (If, indeed, it ever went away.) Matt Hancock, the UK’s Secretary of State for Health and Social Care, has suggested the government might consider mandatory vaccination. Public health experts have cautioned against this, using (in part) historical evidence. In the nineteenth century, compulsory vaccination generated a well-organised, vocal and occasionally violent anti-vaccination movement,[2] the effects of which still haunt Britain’s public health authorities.

Public health has taken its lessons from high-profile examples of crisis – smallpox, pertussis or measles to name but three.[3] But not all problems come from rejection of vaccines. With polio in the 1950s, the problem was the government’s inability to meet demand.

Salk’s vaccine (yes, we’ll give him credit here – after all, contemporaries referred to the inactivated poliomyelitis vaccine simply as “Salk”) became commercially available in 1955. The British government announced with great fanfare that it would provide the vaccine for free to all children and young adults. There was clear demand for it. This invention – in the same vein as space exploration and penicillin – was a marker of modernity, the power of science to solve once-intractable problems.

Unfortunately, there was not enough to go around. In 1955, a manufacturing defect at Cutter Laboratories resulted in the accidental infection of hundreds of American children. As a result, the British banned American imports and chose to use domestic factories to produce a “safer” form of the vaccine.[4] But Britain didn’t have the capacity to produce enough doses in time. Shortages drew complaints from the British press and parents, and – despite the demand – few registered for the vaccine because of the long waiting lists and inconvenience.

As proof of the demand for the vaccine – despite the Cutter incident – local authorities were swamped with requests when high-profile cases made the news. The death of the professional footballer Jeff Hall showed that even fit, young people could be affected, and created a surge in the numbers of younger adults presenting themselves and their children for the jab. In the ensuing shortages, the health minister blamed people for their apathy – if they’d just done as they were told, when they were told, the government could have distributed the vaccine better over the course of the year. This did not go down well as a public relations exercise.

This crisis was eventually overcome through the introduction of the oral polio vaccine in the early 1960s. Because it was taken on a sugar cube, parents were much more willing to present their children. It was a quick process that could be done anywhere; it didn’t hurt (though its taste left something to be desired); and it could be manufactured so easily, and in such volume, that there was no need to wait around for months for the next batch to become available.

Of course, all historical circumstances are different. Anti-vaccination literature is certainly more visible than it was in the 1950s. Populations are more mobile. The immediate memory – even fear – of certain infectious diseases has faded.

At the same time, the intriguing part of this history – at least to this historian – is not why people don’t vaccinate their kids. It’s why so many do.[5] The vast majority of children receive some form of vaccination – upwards of 95 per cent – even if they do not always complete every course in the recommended time frame.

The great improvements in vaccination rates over the past 70 years have come from better administration. Easier-to-administer vaccines. More-robust procedures for following up on missed appointments. Advertising. Having local health professionals answer the specific questions and concerns individual parents might have. Following up with patients who might otherwise slip through the surveillance of public health authorities (such as those who do not speak English, regularly change addresses, have other acute social care needs). All these things required resources which have been squeezed significantly since public health was reintegrated into already-struggling local authorities.

It would be unwise for a historian to say that this is the cause of the problems, or that extra funding will provide a magic-bullet solution.

It is, however, worth reminding ourselves that crises in vaccination policy are not new. We have experienced them before. And not all of them have been due to a lack of demand or fear of a particular vaccine. The 1950s polio example shows us that more practical issues can be at play, and that the public and its collective behaviour are not necessarily at the root of them.

Gareth Millward is a Wellcome Research Fellow at the Centre for the History of Medicine at the University of Warwick. He has worked on the history of disability, public health, vaccination and most recently sick notes. His book Vaccinating Britain was published in January 2019 by Manchester University Press.

[1] Dora Vargha, Polio across the Iron Curtain: Hungary’s Cold War with an Epidemic (Cambridge: Cambridge University Press, 2018).

[2] Nadja Durbach, Bodily Matters: The Anti-Vaccination Movement in England, 1853–1907 (Durham: Duke University Press, 2005).

[3] Stuart Blume, Immunization: How Vaccines Became Controversial (London: Reaktion, 2017).

[4] Hannah Elizabeth, Gareth Millward and Alex Mold, ‘“Injections-While-You-Dance”: Press advertisements and poster promotion of the polio vaccine to British publics, 1956-1962’, Cultural and Social History 16:3 (2019): 315-36.

[5] Gareth Millward, Vaccinating Britain: Mass Vaccination and the Public Since the Second World War (Manchester: Manchester University Press, 2019), p. 1.


Rex Britanniae? – The national identity of King Edward I in four maps


‘Is it the end of the world?’ asked one thirteenth-century Welsh poet when English forces stormed into Wales in 1277. The key instigator was King Edward I, whose campaigns of 1277-1307 were fundamental to how Scottish, Welsh and English people identified themselves. By mapping Edward’s movements, we can investigate how he promoted a singular British national identity – a Rex Britanniae – and see how debates over sovereignty and self-belonging in the thirteenth and fourteenth centuries were remarkably similar to those of the twenty-first century; calls for Scottish and Welsh independence are certainly no novelty.

Wales

A key element of Edward’s approach to kingship was Arthurianism, particularly in Wales. Taking advantage of a popular thirteenth-century obsession, Edward saw himself as the mythical king’s successor. [1] That he and Queen Eleanor attended the reinterment of the bones of ‘Arthur’ and ‘Guinevere’ just months after defeating the Welsh was no coincidence. Whatever its plausibility, this was symbolic, a truly political performance; Edward sought to embody Arthur as legitimate overlord of Wales.

Yet ideological mastery was insufficient. Magnificent castles such as Beaumaris and Harlech were catalysts for Anglicisation, visually documenting the physical and metaphorical permanence of English conquest. [2] As Map 1 shows, these castles were also strategically significant – Rhuddlan as an army base, Caernarfon as a supply centre. The administrative role of castles – the castle boroughs – was the machinery behind Rex Britanniae. Through the Statute of Wales (issued at Rhuddlan Castle, 1284), an explicitly English administrative cadre was imprinted onto the boroughs.

Often called ‘Anglicisation from above’ and ‘the first colonial institution’, the Statute never aimed to create unity between English and Welsh law. [3] It represented the tenacity of Edward’s Arthurian ‘United Kingdom’ ideology and the battle over sovereignty, which climaxed in large-scale, coordinated attacks on Edwardian castles, most notably in 1294. [4]

Map 1 – Edward I’s Welsh campaigns. I plotted the London-St. David’s route, encompassing various major castles. © Charlotte Tomkins

Scotland

While Wales had no single ruler yet was relatively ethnically homogenous, Scotland was politically united under the Scottish kings, despite being a melting pot of Brittonic, Gaelic and Viking elements, further complicated by the English-speaking population inhabiting south-eastern Scotland. [5] Yet the division between England and Scotland was not primarily cultural but political, and fluid – something made remarkably clear in Map 2. Here, a tiny Scotland is presented as a “land beyond the sea”, connected to an English mainland only by Stirling Bridge.

Map 2 – Matthew Paris’ Map of Britain (c. 1250).

Yet no bridge would stop Edward. Arthurianism was most powerful regarding Wales, but it was certainly not irrelevant for Scotland – a multiplicity of Arthurian myths existed. The hereditary right of Arthur’s successors to rule Britain in its entirety was central to Arthurian territorial ideologies. Viewing the Anglo-Scottish border as a purely internal division, Edward used this ideology to try to absorb Scotland, like Wales, into the inalienable royal fisc – the Crown’s taxation and revenue source.

Castles were again essential in realising this, and the castles of Scotland and Northern England became key battlegrounds between the two realms. Tensions escalated from 1290, partly over Edward’s determination that all royal fortresses come under his custody. What was for Edward a logical step to ensure Scotland’s security was, for the Scots, a denial of sovereignty. These castles were the ‘instruments of raw power’, whose loss was catastrophic. [6]

The border-lands saw some of the most vicious attempts to subdue Scotland: for example, the 1296 bloodbath at the Battle of Berwick. Edward captured the castle, massacred the townspeople, and garrisoned the fortress – so, what did this mean for Rex Britanniae? Tensions with Scotland’s king, John Balliol, were already fraught, but Edward’s actions, combined with the withdrawal of his support for Balliol’s claim to the Scottish throne, resulted in Balliol’s formal renunciation of his oath to Edward.

Map 3 – Edward I’s Scottish campaigns. Note the importance of the border-lands as key entry, exit and fortification points. © Charlotte Tomkins

Berwick was a watershed moment, where Balliol confronted the hard truth that his enthronement was only ever temporary; where Rex Britanniae was performed and destroyed as Edward gambled with Balliol’s loyalty in order to reduce Scotland to a vassal-state of England. [7] The revolts from 1296 – which Map 3 shows Edward’s efforts to put down – demonstrate that a shared Scottish identity was being strengthened. [8]

The King in Motion

Edward’s attempts to realise a Rex Britanniae depended on impregnable, looming castles – but they also depended on almost ceaseless movement. These campaigns can be drawn on conventional static maps. But by plotting his known locations and dynamically projecting them onto a map, we can see – for the first time – the rhythm of Edward’s movements and their consequences, watching as he gathers forces and builds a consensus in England before striking north and west.
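How might such a dynamic projection be built? The sketch below is a minimal illustration in Python, assuming a hand-compiled table of dated sightings with coordinates; the places, dates and coordinates here are invented for the example and are not the project’s actual dataset.

```python
# A minimal sketch: plotting a royal itinerary as a chronological path.
# The records below are illustrative only, not the SURE project's data.
import pandas as pd
import matplotlib.pyplot as plt

itinerary = pd.DataFrame(
    [
        ("1296-03-01", "Newcastle", -1.61, 54.97),
        ("1296-03-30", "Berwick-upon-Tweed", -2.01, 55.77),
        ("1296-06-10", "Edinburgh", -3.19, 55.95),
        ("1296-08-22", "Berwick-upon-Tweed", -2.01, 55.77),
    ],
    columns=["date", "place", "lon", "lat"],
)
itinerary["date"] = pd.to_datetime(itinerary["date"])
itinerary = itinerary.sort_values("date")  # chronological order gives the route

fig, ax = plt.subplots(figsize=(5, 7))
ax.plot(itinerary["lon"], itinerary["lat"], "-o", color="firebrick")
for _, row in itinerary.iterrows():
    # Label each stop with its place name and the month of the visit
    ax.annotate(f'{row["place"]}\n{row["date"]:%b %Y}',
                (row["lon"], row["lat"]), fontsize=8)
ax.set_xlabel("Longitude")
ax.set_ylabel("Latitude")
ax.set_title("An itinerary plotted in date order (illustrative data)")
plt.show()
```

Animating the same points frame by frame (for instance with matplotlib’s FuncAnimation) would produce the kind of ‘moving image’ effect described above.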

Map 4: click to access moving images and all data

Viewing Edward’s reign in this innovative way allows us not only to visualise his efforts to become the Rex Britanniae, but also to begin to quantify his movements, highlighting the importance of warfare and conquest throughout Edward’s reign. The statistics below elaborate on the movements in Map 4:

Figure 1 – During peacetime years alone, Edward spent 116 days in Berwick-upon-Tweed, surpassed only by Windsor (157) and Westminster (956), underlining its strategic, ideological and political importance. (J. E. Crockford, ‘The Itinerary of Edward I of England: Pleasure, Piety and Governance’ (Turnhout, 2016), p. 245.)
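For readers curious how such figures are derived, the fragment below shows one possible way to total days per location from a table of dated stays; the column names and numbers are invented for the example, not taken from Crockford’s itinerary data.

```python
# Totalling days spent per location from dated stays (illustrative data).
import pandas as pd

stays = pd.DataFrame(
    [
        ("Westminster", "1301-01-04", "1301-02-10"),
        ("Windsor", "1302-05-01", "1302-05-20"),
        ("Berwick-upon-Tweed", "1303-03-02", "1303-03-30"),
    ],
    columns=["place", "arrived", "departed"],
)
for col in ("arrived", "departed"):
    stays[col] = pd.to_datetime(stays[col])

# Length of each stay in days, then summed per place
stays["days"] = (stays["departed"] - stays["arrived"]).dt.days
totals = stays.groupby("place")["days"].sum().sort_values(ascending=False)
print(totals)  # e.g. Westminster 37, Berwick-upon-Tweed 28, Windsor 19
```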

Edward’s death in 1307 ended any hope of achieving Arthur’s united Britain. Edward’s success was not linear; while the Anglicisation of Wales was long-lasting, the Scottish conquest ultimately failed. What we can say, however, is that Edward’s ideology of a United Kingdom remained influential even after his death; people are still debating the very concept now.

Charlotte Tomkins is in her final year of an undergraduate history degree at the University of Sheffield, with ambitions to continue the subject at MA and PhD level. She recently completed the Sheffield Undergraduate Research Experience (SURE), using databases and cartography to examine the links between the itinerary of Edward I and his pursuit of a single kingdom – a Great Britain and a United Kingdom – under his kingship. She focused on how castles were at the heart of Edward’s vision, and on how debates over British national identity are not merely contemporary; they have been heated since the medieval period and earlier.

[1] D. Jones, The Plantagenets: The Kings Who Made England (London, 2012), p. 98.

[2] R. R. Davies, ‘Edward I and Wales’ in T. Herbert and G. E. Jones (eds), Edward I and Wales (Cardiff, 1988), p. 1.

[3] Ibid., p. 2; Jones, The Plantagenets, p. 314; M. Prestwich, Edward I (London, 1988), pp. 205-206.

[4] M. Morris, Castle: A History of the Buildings that shaped Medieval Britain (London, 2012), p. 134.

[5] M. Morris, Edward I: A Great and Terrible King (London, 2009), p. 241.

[6] Ibid., pp. 236-237.

[7] P. Parker, History of Britain in Maps (Glasgow, 2017), p. 26.

[8] P. Traquair, Freedom’s Sword: Scotland’s Wars of Independence (London, 1998), p. 13.


A Great British Welcome? Unlearned lessons from the Kindertransport


British collective memory largely recognises the Kindertransport (Children’s Transport) policy as a point of national pride, believed by many to be ‘the zenith of…interwar international humanitarianism.’[1] The policy was instituted – albeit reluctantly – by Chamberlain’s Conservative government as a reaction to Kristallnacht (the Night of Broken Glass) of November 1938, when Jewish homes, businesses, and buildings were ransacked across Germany in an act of extreme racial hatred. The Kindertransport allowed 10,000 Jewish children to take refuge in Britain to escape the persecution of the Third Reich.

Although the policy appears noble and humanitarian on the surface, there were several caveats. Firstly, the parents of these young Jewish refugees were not permitted to accompany them. This was due to the British government’s fears of developing a ‘Jewish problem in the United Kingdom’: this supposed ‘problem’ being that too many Jewish people taking refuge in the country would lead to overpopulation and employment issues.[2] Therefore, only allowing unaccompanied children to take refuge was believed to be ‘more palatable to British public sensibilities’.[3]

Furthermore, Jewish refugees were only allowed to enter the country on the condition that their escape from racial discrimination would not be a ‘financial burden on the public’: £50 per child was required to cover safe passage and housing on arrival. In an attempt to exonerate the former government from its questionable treatment of Jewish refugees, the current government – via The National Archives’ educational entry on the Kindertransport – provides the rather tenuous excuse that ‘few households could pay the sum…required’. This comes off as little more than apologia for inaction towards racial discrimination, and is akin to our current government’s own treatment of refugees in the present day.

Ultimately, it was up to Jewish organisations and benefactors to foot the bill themselves to ensure the safety of these children.[4] Support was quite limited at the start of the programme, with the intake of refugees only gaining further traction among Gentile (non-Jewish) groups once they learned that the refugees ‘were not all Jewish.’[5] Those who did manage to escape via the Kindertransport were not all guaranteed hospitality and care on arrival, with many being placed in refugee camps and youth hostels. Max Dickson, a German-Jewish child of the Kindertransport, recorded his experiences in refugee camps in his memoirs: ‘[There was] No one to tuck you in and give you a hug or say “I love you”. I think many of us cried ourselves to sleep those first three months.’[6] Another Kindertransport refugee, Bob Kirk, was separated from his parents in Hanover in May 1939. Kirk, along with 200 other children on his train, was led to believe that their parents would be joining them in England once their papers had been approved: ‘My parents were so intent on not making it seem like a parting that they didn’t include anything which might suggest we wouldn’t see each other again.’ Kirk’s parents were deported to Riga in 1941 and never returned. Regular discrimination, fear, and loneliness were all part and parcel of the life of a Jewish refugee, and one may already begin to draw significant parallels with the poor treatment experienced by contemporary refugees.

Many continue to perpetuate an idea of the Second World War as an almost-biblical battle between good and evil, with Britain acting as the righteous ‘saviours’ of Jews under threat from Nazism. This approach makes for a compelling narrative but is a gross misrepresentation of reality. Contrary to popular nostalgia, the notion of British war-time humanitarianism in relation to refugees is questionable at best and offensively sanitised at worst. Whilst on the proverbial ‘right side of history’ in opposing the horrors of National Socialism, the British government was indifferent, if not outright hostile, towards Jewish refugees – an attitude which is, depressingly, echoed in the current government’s own treatment of refugees.

In popular British culture, many prefer to select specific examples of British humanitarianism and ascribe them to the nation at large. This is clear in the case of Sir Nicholas Winton, dubbed ‘the British Oskar Schindler’ for his instrumental role in evacuating 669 children from Prague. Winton, however, was arguably the exception rather than the rule, and not representative of the British population.[7] Despite this, Theresa May used his story in her resignation speech in May 2019; May recalled that Winton, a long-time constituent of hers in Maidenhead, had given her some advice prior to his death, supposedly telling her that ‘Compromise is not a dirty word. Life depends on compromise.’ This quote is rather unusual, as it is completely incongruous with Winton’s actions during the Second World War. Lord Alf Dubs, himself a refugee of the Kindertransport and one of the 669 saved by Winton, believed May’s words to be ‘an insult’ to Winton’s character:

‘What [Winton] demonstrated was not compromise. What he demonstrated was tenacity of purpose, a determination to battle with the British government, to battle with the Nazis, to do what he had to do…She’s using a man who is absolutely iconic for the wonderful things he did and the lives he saved…to justify compromise. That seems to me quite wrong, and a bit of an abuse.’

Despite May’s questionable anecdote, actions speak louder than words. A year after Winton’s death, May (alongside 293 other MPs) voted to turn away 3,000 unaccompanied child refugees from Syria.

Indifference and hostility towards refugees continues to be an issue. Of course, our politicians, pundits, and popular figures will happily deploy the Second World War, selecting instances of humanitarianism where convenient, while failing (or choosing not) to see the other parallels between past and present.

Today, politicians are increasingly taking harder lines against refugees to win votes. Prime Minister Boris Johnson recently vowed to ‘crack down’ on those who ‘abused [the UK’s] hospitality’, hoping it would ‘restore public faith’ in the British immigration system. Johnson has also pledged to ‘make all immigrants speak English’, stating that ‘too often there are parts of our country…where English is not spoken by some people as their first language and that needs to be changed’.

Johnson’s and May’s attitudes clearly demonstrate that the current British establishment has learned very little, if anything, from the Kindertransport. The passing of the Kindertransport policy in 1938 was, of course, partially a positive action for the government to take; this does not mean, however, that the negative aspects should be ignored. The government should be ashamed of its role in the Kindertransport, but through the power of historical revisionism and compelling narrative, that role has been sanitised and falsely idealised: the government is remembered as the driving force behind this humanitarian effort, rather than a roadblock against it. This has effectively given contemporary politicians a free pass to continue treating refugees with contempt, whilst still claiming the likes of Winton where convenient.

This self-congratulatory revisionism of so-called ‘British humanitarianism’ must be challenged, and those who continue to peddle such history for political gain must be held to account. Government actions, no matter how positive they may seem on the surface, should not be blindly praised without digging a little deeper first.

Owen A. Jones is a final-year History undergraduate at the University of Sheffield. He recently completed the Sheffield Undergraduate Research Experience (SURE), conducting research on anti-Semitism and Jewish refugees of war during the early twentieth century. His research also examines relevant parallels to the present-day refugee crisis and Britain’s continued treatment of refugees. You can find him on Twitter @OwenAdamJones.

[1] L. E. Brade and R. Holmes, ‘Troublesome Sainthood: Nicholas Winton and the Contested History of Child Rescue in Prague, 1938–1940’, History and Memory 29.1 (2017), p. 5.

[2] B. Wasserstein, Britain and the Jews of Europe, 1939-1945 (Oxford, 1979), pp. 10-11.

[3] Brade and Holmes, ‘Troublesome Sainthood’, p. 5.

[4] C. Holmes, John Bull’s Island: Immigration and British Society, 1871–1971 (London, 1988), p. 142.

[5] Ibid., p. 143.

[6] M. Dickson, The Memories of Max Dickson formerly Max Dobriner (Sheffield, 2010), pp. 6-7.

[7] Brade and Holmes, ‘Troublesome Sainthood’, p. 5.

Image Credit: ‘“The Children of the Kindertransport”, Hope Square, Liverpool Street Station, London.’ (Licence: CC BY-SA 2.0, https://creativecommons.org/licenses/by-sa/2.0/), available at: https://www.flickr.com/photos/locosteve/15535288254/in/photostream/.


‘Maybe it’s medieval?’ – Comparing Modern TV and Film against the Medieval Morality Play


How many times have you watched a TV show or film and thought the narrative seemed vaguely familiar? From the classic ‘boy meets girl’ rom-com story arc, to the theory that The Lion King is just a rip-off of Shakespeare’s Hamlet, I think it’s fair to say that stories often repeat themselves.

I recently completed a SURE (Sheffield Undergraduate Research Experience) project; these projects give undergraduate students an opportunity to research an area of special interest. I chose to look at morality plays and what they tell us about the impact of the Black Death on medieval life and culture.

I found that the approaches taken by these medieval playwrights were not too dissimilar from the techniques used by modern screenwriters. By exploring two literary techniques – tragicomedy and the ‘shoulder angel’ – I compared these medieval morality plays with modern film and television to understand how their differences reflect the cultural climates of their periods.

Although these plays are an established part of academic study, the narrative of a late-15th-century morality play is not generally well known. This popular type of play told the story of ‘mankind’, a single character who represented a typical individual living in medieval society, following his birth, life and final salvation on his day of judgement. Most of the plays I studied originated from East Anglia, but there were other plays from cities such as Chester and York that also dealt with similar themes of religion and death.

The plays were intentionally metaphorical, their purpose being to reassure those living in the aftermath of the Black Death, whose cultural impact lasted for centuries after its slow decline in the 1350s. Everyone witnessed death at close quarters, losing family members and friends. Thus, these plays aimed to educate their audience, showing that people would reach heaven if they led a Christian life.

Comedy and Death: A match made in heaven?

‘Mankind’ (thought to have been written 1465-1470) was considered one of the most popular morality plays of the medieval period, probably because entertainment and comedy took centre stage over its educational directive.

Similarly, ‘Bruce Almighty’ (2003) – a film about a man who believes he can do a better job than God and in response is gifted omnipotent power by God himself – is an example of a film that pushes a moral message, whilst being comedic.

The morality play argues that repentance of sins gave people a measure of control over life, because it gave them control over their afterlife.

Where ‘Mankind’ teaches an audience to repent of their sins and live a moral life, ‘Bruce Almighty’ teaches an audience to take control of their own lives, and not to expect others (such as God) to fix their problems. During a time when life and death were extremely unpredictable, Christianity provided a comforting solution by suggesting that, through repenting of their sins, people could control their life even in death by ensuring their path to heaven.

Specifically, in its use of comedy, ‘Mankind’ makes a mockery of the main character during his fall into sin. This juxtaposition of comedy and darker themes is seen in another Jim Carrey film, ‘The Truman Show’ (1998). If you took the comedy (or Jim Carrey) out of the film, it would just be harrowing: an hour and forty-three minutes of watching a man have his entire reality taken away from him, only to find out it was a moneymaking scheme.

The use of comedy in both modern films and morality plays helps to keep an audience engaged; death was a heavy theme for medieval people, and by using comedy the playwrights could more successfully communicate their moral message.

The Perseverance of the ‘Shoulder Angel’

The plot device of the ‘shoulder angel’ is most commonly seen today when a good character and an evil character each attempt to persuade the protagonist down a certain path.

This technique was also used in another morality play, ‘The Castle of Perseverance’ (1440), with its cast of 15 good and bad characters. More recently, the technique has been seen in the Amazon original ‘Good Omens’ (2019), whose two protagonists are an angel and a devil.

Although there are some clear differences in the narratives, both stories rest on the idea that good and evil are both present in the world, and that it is our own choices that lead us down a good or evil path. During the 15th century, this may have provided reassurance: the plays appear to demonstrate to their audiences that they were in control of their lives, despite the mysterious and unstoppable figure of death – made ever-present by the Black Death – looming over them.

The idea of being guided by angels is another technique seen in some of the most recognisable films, often ones cemented into our Christmas traditions, like ‘It’s a Wonderful Life’ (1946). In many ways, it appears like a modern-day morality play, as it teaches the audience to be aware of the impact they have on the world, as well as on the people within it.

Firstly, both the play and the film follow one man’s entire life. Secondly, both feature the character of an angel who shows the protagonist how his good deeds have affected the world, allowing him to see the importance of his own life.

The most important part of a morality play was its happy ending. This, too, demonstrates the legacy of the Black Death: even when a character reached death, it was shown as a rite of passage in which they were ultimately forgiven for their sins.

Similarly, the end of ‘It’s a Wonderful Life’ shows the protagonist gaining a greater understanding and appreciation of life, ending on the note that he will live his life differently – is that not exactly what the authors of the morality plays wanted their audiences to do after the curtain closed?

In summary, I think it’s important to note that all storytelling has a message. The message of morality plays – to live a Christian life – may not be entirely relevant to the majority of people today, but the general sentiment of living with an awareness of mortality, and therefore with purpose and accountability for our actions, is a concept audiences can still relate to.

Thus, the recurrence of these similar tropes suggests that the stories we choose to tell today may not be so dissimilar from those written 500 years ago. Despite huge differences in values and material conditions, the similarities deserve serious study too.

Natalya Edwards is a History undergraduate student at the University of Sheffield. She recently completed the Sheffield Undergraduate Research Experience, which provides undergraduate students an opportunity to research an area of special interest in order to gain insight and experience for postgraduate research. In her project she chose to look at morality plays and what they can tell us about the impact of the Black Death on medieval life and culture.


Why Naomi Wolf misinterpreted evidence from the Old Bailey Online


Readers may be aware of the recent furore over Naomi Wolf’s misinterpretation of Old Bailey trial evidence in her book, Outrages: Sex, Censorship, and the Criminalization of Love, in support of her argument that executions for sodomy increased at the Old Bailey in the second half of the nineteenth century. Wolf cited in particular the case of Thomas Silver, tried for ‘an unnatural offence’ in 1857, where the Old Bailey Online gives the punishment sentence as ‘Death Recorded’. As Richard Ward, quoted in the Guardian, notes, this term meant the opposite of what Wolf thought. First introduced in 1823, the term ‘death recorded’ was used in cases where the judge wished to record a sentence of death, as he was legally required to do, while at the same time indicating his intention to pardon the convict. In fact, if Wolf had followed the ‘related sources’ link from this trial on the Old Bailey website to the associated Digital Panopticon (a massive collection of criminal justice data which traces the lives of Old Bailey convicts following their convictions), she would have seen that following his conditional pardon Silver was sentenced to penal servitude for three years and, two and a half years later, was released on a prison licence (an early form of parole).

There are two lessons to be learned from this debacle: first, about the use of online historical sources; and second, about how the English judicial system worked in the nineteenth century, in particular the differences between courtroom sentences and the actual punishments convicts received.

The first point is obvious, but needs to be made. When searching for evidence online, it is all too easy to pay insufficient attention to context and to fail to follow up links to related information. Nowhere in this very short trial report (censored, like all Old Bailey trial reports of sodomy after 1785) does the word ‘executed’ appear. In fact, this webpage has included, since March 2018 (which admittedly may have been late in Wolf’s research), a link at the top of the page (as noted above) to the Digital Panopticon, which provides evidence of Silver’s actual imprisonment and early discharge. And the ‘Historical Background’ pages on the Old Bailey website include an explanation of the differences between courtroom sentences and the actual punishments convicts received.

This is the second, historically significant, point: convicts at the Old Bailey frequently did not receive the punishments to which they were sentenced.  While this general point has been known in its broad outlines for some time, it is a central research theme of Simon Devereaux’s recent project, Capital Convictions at the Old Bailey, 1730-1837, which tracked down the actual penal outcomes for all capital convicts up to 1837, and a key research finding of the Digital Panopticon project, which researched the outcomes of all defendants convicted at the Old Bailey between 1780 and 1870. This research demonstrates that convicts sentenced to death and transportation often did not experience these punishments, and even those sentenced to imprisonment remained in prison for less time than prescribed by their sentence. The English judicial system was permeated by the exercise of judicial discretion, in which judges, the Home Office, and even penal officers shaped actual punishments to meet the perceived significance and circumstances of each case.

In the case of the death penalty, there was a long-term decline in the proportion and number of convicts who were actually executed. The proportion of Old Bailey capital convicts executed fell from 43.5% in the 1780s to 10.4% between 1810 and 1837, by which point reforms to the penal code had led to a sharp reduction in capital offences: after 1837, the only offences punishable by death were murder, infanticide, wounding, rape, treason, robbery, burglary, arson, and sodomy. In practice, however, the only Old Bailey convicts actually executed after 1837 were murderers (and even 40% of these were pardoned).

The penal environment of those convicted of sodomy in the second half of the nineteenth century was thus one in which the death penalty was not a realistic possibility. Other forms of persecuting homosexuality certainly remained, and may have even worsened, but the cruellest punishments were confined to an earlier period.  The criminal persecution of sodomy has a long history.  As the Homosexuality page on the Old Bailey Online indicates, there were waves of prosecutions throughout the eighteenth century, leading to executions and near-death maulings by crowds on the pillory.  This continued into the next century: a search of the Digital Panopticon database shows that fourteen men convicted of sodomy were executed between 1803 and 1835.  Fortunately, however, punishment by execution and the pillory stopped there: James Pratt and John Smith, both executed on 27 November 1835, were the last men convicted of sodomy at the Old Bailey to meet this horrible fate.

I’d like to thank Tim Hitchcock, Sharon Howard and Richard Ward for their contributions.

Robert Shoemaker is Professor of Eighteenth-Century British history. His main interests lie in social and cultural history, particularly urban history, gender history, and the history of crime, justice, and punishment, and in the use of digital technologies in historical research. He is co-director, with Professor Tim Hitchcock at the University of Hertfordshire and Professor Clive Emsley of the Open University, of the Old Bailey Proceedings Online, which created a fully searchable edition of the entire run of published accounts of trials which took place at the Old Bailey from 1674 to 1913, and, with Hitchcock, London Lives, 1690-1800: Crime, Poverty and Social Policy in the Metropolis, a fully searchable edition of 240,000 manuscript records and fifteen datasets which makes it possible to compile biographies of eighteenth-century Londoners.
