
Medical Humanities

COVID-19, ‘Big Government’, and the Prohibition of Alcohol: Crisis as a Transnational Moment for Social Change


Throughout history, crises have often led to enormous social and economic reform as policymakers are forced to come up with new ways to meet unexpected demands. As Walter Scheidel argues in his book The Great Leveller (2017), mass violence has been the primary impetus for the decline of inequality throughout world history, most recently with the Second World War serving as a watershed for increased government spending on social programmes in many of the participating states. Although a crisis of a very different nature, the current coronavirus pandemic has brought about similar shifts, with governments running huge budget deficits to protect jobs and counteract the threat of a looming recession caused by travel restrictions and lockdowns.

We have also witnessed governments experimenting with creative solutions to crises that stretch across borders, as with the current global pandemic. For a variety of reasons, a small handful of countries have resorted to banning the sale of intoxicants. One of the most debated aspects of South Africa’s lockdown has been its prohibition on the sale of alcohol and cigarettes, intended to reduce hospital admissions and secure beds for COVID-19 patients. Admissions have dropped by two-thirds thanks to reductions in alcohol-related violence and accidents, but such draconian measures have also meant the rise of a black-market trade and the near-collapse of the country’s proud wine industry.

The sale of alcohol was also banned in Sint Maarten, a Caribbean constituent country of the Netherlands, and in Nuuk, the capital of Greenland, because of alcohol’s role in exacerbating incidents of domestic violence during lockdown. In Thailand, the prohibition on alcohol was put in place to prevent the spread of the virus at social gatherings. In each setting, such policies were deemed drastic but necessary, implemented for their advantages in tackling a range of health concerns even as their clear downsides were acknowledged.

Although they arose under entirely different circumstances, similarly harsh controls were imposed on the sale of alcohol across the world during the First World War. France and Russia were the first to institute bans on absinthe and vodka, respectively, due to concerns over their impact on wartime efficiency. Countries in which anti-alcohol temperance movements were already influential also implemented tough restrictions of varying severity. Although the production and sale of alcohol had already been banned in various local jurisdictions in Canada and the United States, national prohibition came to fruition in both countries because of the war. Alcohol was not banned outright in Britain, but the country nevertheless instituted far-reaching controls on the distribution of drink under the Central Control Board (CCB), established in 1915 to enforce higher beverage duties and shorter closing hours in pubs.

In almost every instance, it was the context of the war that spurred the move towards these tough restrictions. Temperance activists in North America had been pushing for national prohibition for decades, but wartime conditions, such as the rise of anti-German sentiment directed at German-American breweries like Anheuser-Busch, brought the federal implementation of prohibition to the forefront of national politics. In Britain, part of the CCB’s responsibility was the nationalisation of pubs and off-licences situated in parts of the country that were of strategic importance to the war effort.

These contexts directly parallel what we’re seeing in South Africa and Thailand, where extraordinary circumstances necessitated extraordinary countermeasures. However, there is also an important difference that must be stressed: while current lockdown prohibitions are merely temporary, most advocates of prohibitions and controls a century ago believed that such measures were to be permanent, based on their view that there were no advantages to permitting the existence of ‘demon drink’ in society. The ban on the distillation of vodka instituted under Imperial Russia in 1914 was maintained after the October Revolution and was not scrapped until after Lenin, himself an ardent prohibitionist, died in 1924. Yet, within the British context, the First World War effectively reshaped alcohol licensing for several generations, as high beverage duties and shorter opening hours were mostly preserved into the interwar and postwar eras.

These cases highlight the broader implications of the social and economic reforms that are being implemented today. Right-wing governments in both Britain and Japan have approved record levels of government spending in the form of economic aid and stimulus. As Bernie Sanders ended his bid for the Democratic nomination in April 2020, politicians of both the left and the right debated the federal implementation of universal healthcare and paid sick leave in light of the public health crisis. Most recently, the Spanish government announced a €3 billion universal basic income scheme to stimulate the pandemic-hit economy through increased consumer spending. A columnist for The Washington Post was clearly onto something when he declared that ‘there are no libertarians in foxholes’.

It is, however, decidedly too early to predict the long-term impacts of COVID-19 and whether these will lead to what many hope to be a reversal of the neoliberal reforms that have dominated economics since the 1970s. One cannot forget that the ‘Keynesian Resurgence’ in stimulus spending during the Financial Crisis of 2007-08 was immediately followed by the tragedy of the Eurozone Crisis and the traumas of austerity measures that devastated the public sectors of Greece, Spain, Italy, Britain, and others. Despite that, the impact of abrupt changes in undermining the status quo should not be underestimated, as we saw with the global ‘wave’ of alcohol prohibitions a century before. History, therefore, is an apt reminder that crises are moments when ‘radical’ reforms that were previously only imagined can eventually become reality.

Ryosuke Yokoe is a historian of medicine, science, and public health, presently affiliated with the University of Sheffield as an honorary research fellow. He recently completed a PhD on the medical understandings of alcohol and liver disease in twentieth-century Britain. You can find him on Twitter @RyoYokoe1.

Cover image: Array of liquor bottles, courtesy of Angie Garrett, https://www.flickr.com/photos/smoorenburg/3312808594/ [accessed 28 May 2020].


Dawson’s ‘Big Idea’: The Enduring Appeal of the Primary Healthcare Centre in Britain


May 2020 marks the centenary of the publication of the Interim Report of the Consultative Council on the Future of Medical and Allied Services, popularly known as the Dawson report after its principal author, Lord Dawson of Penn.[i] The report, commissioned in 1919 by the newly established Ministry of Health, outlined a plan to bring together existing services funded by national health insurance, local authorities, and voluntary bodies in a coherent and comprehensive healthcare system. The final report was never published, being consigned to oblivion by a worsening economy and changed political climate. Though cautiously welcomed by professional leaders, Dawson’s plan was condemned by a hostile press as grandiose and unaffordable.[ii] However, recent NHS policy directives regarding Integrated Care Systems show that the principal task which Dawson’s group had set itself, that of successfully integrating primary, secondary and ‘allied’ health services, is one with which NHS leaders are still grappling today.[iii]

Lord Dawson of Penn, courtesy of the British Medical Association archive

Central to Dawson’s plan, and its most revolutionary idea, was the creation of a network of ‘primary health centres’ (PHCs) in each district in which general practitioners (GPs) could access diagnostic, surgical, and laboratory facilities for their patients and which would also house infant welfare and maternity services, facilities to promote physical health, and space for administration, records, and postgraduate education. GPs and other professionals would see and treat patients at PHCs, referring only complex cases to specialists at secondary care centres (essentially district hospitals) located in large towns, while patients needing the most specialized treatment would be referred to regional teaching hospitals with attached medical schools. This ‘hub and spoke’ model is one to which recent generations of NHS health planners have returned time and again, seemingly unaware of its antecedents.

A firm believer in teamwork, Dawson hoped that collaborative use of PHCs by GPs would encourage group practice and multi-disciplinary working. But the individualistic nature of general practice at that time meant GPs remained wary of his ideas, despite the fact that examples of PHCs already existed in Gloucestershire and in Scotland, and many of the facilities they were meant to comprise could be found in GP-run cottage hospitals and Poor Law infirmaries.[iv] Experiments with architect-designed health centres in the 1920s and 1930s failed to elicit a major change in professional or governmental attitudes.[v] In 1948 the NHS brought public, voluntary and local authority hospitals under state control, but in its early years the promise of new PHCs remained largely unrealised.[vi] Proprietorial traditions and fear of local government control led to a mushrooming of purpose-built, GP-owned practice premises between the late 1960s and 1990s, independently of the local authority-owned health centres for which there was a major building programme in the 1970s.[vii]

Illustration of a Primary Health Centre, from the Dawson Report, courtesy of the BMA archive

Although by the late twentieth century the Dawson report had largely been forgotten, interest in PHCs resurfaced in the early 2000s with a major investment in primary healthcare facilities through the establishment of Local Improvement Finance Trusts (LIFT). These were a form of private finance initiative designed to provide state-of-the-art community health and social care hubs housing GP practices and other services. Unfortunately, LIFT buildings proved more expensive than anticipated and their facilities, intended to promote the transfer of work from secondary to primary care, were often underutilised.[viii] While these were being constructed, the Labour health minister Lord Darzi announced the establishment of a number of ‘polyclinics’ bearing a close resemblance to Dawson’s PHC idea. However, the Darzi Centres that were established were either mothballed or repurposed, condemned as an expensive ‘white elephant’ by professional leaders.[ix]

In the last few years a ‘quiet revolution’ has been taking place in the NHS in England involving attempts to dismantle the financial and institutional barriers between primary, secondary and community care created by the internal market. Its byword, ‘Integration’, echoes Dawson’s overriding goal and the ‘hub and spoke model’ he advocated is now well established. Meanwhile, the pressures of unending demand have forced GPs to collaborate as healthcare providers in locality groups called Primary Care Networks (PCNs). Though guidance on these is not prescriptive, some PCNs have adopted the idea of a community ‘hub’ housing shared diagnostic and treatment facilities much as Dawson had envisaged.[x]

While the full impact of COVID-19 on our struggling health services is still unknown, the abiding necessity for all parts of the NHS to collaborate, communicate and mutually support each other during this crisis underlines the value and relevance of Dawson’s vision of integrated services. It remains to be seen if, in its aftermath, his ‘big idea’ of ubiquitous multi-purpose PHCs will come any closer to being realised.

Chris Locke is a fourth-year PhD student in the History Department at the University of Sheffield. His research is focused on the political consciousness of British GPs and their struggle for professional self-determination in the early twentieth century.

Cover image: LIFT-built Primary Care Centre, Retford, Nottinghamshire, photographed by the author.

[i] Interim Report of the Consultative Council on the Future of Medical and Allied Services, Cmd. 693 (HMSO, 1920). For an account of the origins and significance of the report, see Frank Honigsbaum, The Division in British Medicine (London, 1979), chapters 6-12.

[ii] The British Medical Association’s blueprint for health services reform, A General Medical Service for the Nation (1930) and the report by Political and Economic Planning, The British Health Services (1937) both referenced the Dawson report, and it clearly influenced the Beveridge report, Social Insurance and Allied Services (1942).

[iii] https://www.kingsfund.org.uk/publications/making-sense-integrated-care-systems (last accessed 3 April 2020)

[iv] The report referenced the hub and spoke model of healthcare facilities overseen by Gloucestershire County Council’s Medical Officer of Health, Dr J Middleton Martin. Commentators also noted similarities with Sir James McKenzie’s Primary Care Clinic in St Andrews and Trade Union-run Medical Aid Institutes in South Wales.

[v] Jane Lewis and Barbara Brookes, ‘A Reassessment of the Work of the Peckham Health Centre, 1926-1951’, Health and Society, 61:2 (1983), pp. 307-350; for Finsbury Health Centre, see A. B. Stewart, ‘Health Centres of Today’, The Lancet, 16 March 1946, pp. 392-393.

[vi] For one exception, see R. H. Parry et al., ‘The William Budd Health Centre: the First Year’, British Medical Journal, 15 March 1954, pp. 388-392.

[vii] BMA General Practitioners Committee guidance: The Future of GP Practice Premises (Revised 2010)

[viii] Nottinghamshire Local Medical Committee, NHS LIFT in Nottinghamshire (Nottingham, 1997)

[ix] Peter Davies, ‘Darzi Centres: an expensive luxury the UK can no longer afford?’, British Medical Journal, 13 November 2010, 341: c6237.

[x] https://www.england.nhs.uk/primary-care/primary-care-networks/ (last accessed 3 April 2020)

 


Jonas Salk turns 105: Some thoughts on lessons from history


Jonas Salk would have been 105 today, 28 October. He is remembered as the inventor of the polio vaccine who, when asked how much money he stood to make, declared: ‘There is no patent. Could you patent the sun?’

Of course, “it’s more complicated than that”. Salk was part of a multi-national, multi-agency project to develop prophylactics. Without the use of “his” injectable vaccine and the oral vaccine developed by rivals on the other side of the iron curtain, humanity would not be on the verge of eliminating polio. (For more on that story, see the excellent book by Dora Vargha.)[1] And one of the reasons Salk didn’t patent the vaccine was that it was unpatentable.

But let’s not be uncharitably pedantic. It is, after all, his birthday.

In the wake of recent reports of resurgent infectious diseases – including polio – vaccination is back in the news. (If, indeed, it ever went away). Matt Hancock, the UK’s Secretary of State for Health and Social Care, has suggested the government might consider mandatory vaccination. Public health experts have cautioned against this, using (in part) historical evidence. In the nineteenth century, compulsory vaccination generated a well-organised, vocal and occasionally violent anti-vaccination movement,[2] the effects of which still haunt Britain’s public health authorities.

Public health has taken its lessons from high-profile examples of crisis – smallpox, pertussis or measles to name but three.[3] But not all problems come from rejection of vaccines. With polio in the 1950s, the problem was the government’s inability to meet demand.

Salk’s vaccine (yes, we’ll give him credit here – after all, contemporaries referred to the inactivated poliomyelitis vaccine simply as “Salk”) became commercially available in 1955. The British government announced with great fanfare that it would provide the vaccine for free to all children and young adults. There was clear demand for it. This invention – in the same vein as space exploration and penicillin – was a marker of modernity, the power of science to solve once-intractable problems.

Unfortunately, there was not enough to go around. In 1955, a manufacturing defect at Cutter Laboratories resulted in the accidental infection of hundreds of American children. As a result, the British banned American imports and chose to use domestic factories to produce a “safer” form of the vaccine.[4] But Britain didn’t have the capacity to produce enough doses in time. Shortages prompted complaints from the British press and parents, and – despite the demand – few registered for the vaccine because of the long waiting lists and inconvenience.

As proof of the demand for the vaccine – despite the Cutter incident – local authorities were swamped with requests when high-profile cases made the news. The death of the professional footballer Jeff Hall showed that even fit, young people could be affected, and it created a surge in the number of younger adults presenting themselves and their children for the jab. In the ensuing shortages, the health minister blamed people for their apathy – if they’d just done as they were told, when they were told, the government could have better distributed the vaccine over the course of the year. This did not go down well as a public relations exercise.

This crisis was eventually overcome through the introduction of the oral polio vaccine in the early 1960s. Because it was taken on a sugar cube, parents were much more willing to present their children. It was a quick process that could be done anywhere; it didn’t hurt (though its taste left something to be desired); and it could be manufactured so easily, and in such volume, that there was no need to wait around for months for the next batch to become available.

Of course, all historical circumstances are different. Anti-vaccination literature is certainly more visible than it was in the 1950s. Populations are more mobile. The immediate memory – even fear – of certain infectious diseases has faded.

At the same time, the intriguing part of this history – at least to this historian – is not why people don’t vaccinate their kids. It’s why so many do.[5] The vast majority of children receive some form of vaccination – upwards of 95 per cent – even if they do not always complete every course in the recommended time frame.

The great improvements in vaccination rates over the past 70 years have come from better administration. Easier-to-administer vaccines. More-robust procedures for following up on missed appointments. Advertising. Having local health professionals answer the specific questions and concerns individual parents might have. Following up with patients who might otherwise slip through the surveillance of public health authorities (such as those who do not speak English, regularly change addresses, have other acute social care needs). All these things required resources which have been squeezed significantly since public health was reintegrated into already-struggling local authorities.

It would be unwise for a historian to say that this is the cause of the problems, or that extra funding will provide a magic-bullet solution.

It is, however, worth reminding ourselves that crises in vaccination policy are not new. We have experienced them before. And not all of them have been due to a lack of demand or fear of a particular vaccine. The 1950s polio example shows us that more practical issues can be at play, and that the public and its collective behaviour are not necessarily at the root of them.

Gareth Millward is a Wellcome Research Fellow at the Centre for the History of Medicine at the University of Warwick. He has worked on the history of disability, public health, vaccination and most recently sick notes. His book Vaccinating Britain was published in January 2019 by Manchester University Press.

[1] Dora Vargha, Polio across the Iron Curtain: Hungary’s Cold War with an Epidemic (Cambridge: Cambridge University Press, 2018).

[2] Nadja Durbach, Bodily Matters: The Anti-Vaccination Movement in England, 1853–1907 (Durham: Duke University Press, 2005).

[3] Stuart Blume, Immunization: How Vaccines Became Controversial (London: Reaktion, 2017).

[4] Hannah Elizabeth, Gareth Millward and Alex Mold, ‘“Injections-While-You-Dance”: Press advertisements and poster promotion of the polio vaccine to British publics, 1956-1962’, Cultural and Social History 16:3 (2019): 315-36.

[5] Gareth Millward, Vaccinating Britain: Mass Vaccination and the Public Since the Second World War (Manchester: Manchester University Press, 2019), p. 1.


PIP, Parity, and the Past: why history matters

String galvanometer and human electrocardiogram

Few would deny that living with a mental health condition today often means living with stigma and limited support or access to services. It has also become recognised that these issues do not affect people living with a physical health condition in the same way, leading to calls for ‘parity of esteem’ from charities such as Mind. [1]

Parity of esteem is best understood as valuing mental health equally with physical health, and in 2015 a government taskforce was created to achieve this. [2]

Nonetheless, disparity was recently brought into sharp focus by researchers at the University of York, who revealed significant differences in the allocation of Personal Independence Payment (PIP) to people who have a mental health condition, in comparison to people living with physical health conditions such as diabetes.[3]

PIP was introduced as part of the 2012 Welfare Reform Act, and supports people aged 16 to 64 who are living with long-term health conditions or disabilities. [4]

The York researchers cited assessors’ “informal observation” of appearance and body language when making eligibility decisions as a potential cause of this disparity.

History, however, provides useful insight when attempting to understand how, rather than just why, such disparity between the mental and the physical may emerge in welfare contexts.

This is exemplified by the work of Rhodri Hayward, who traced the emergence and uses of the concept of the ‘unconscious’ in early twentieth-century British primary healthcare. [5]

In this comprehensive book, Hayward’s focus on how the unconscious facilitated the interrogation of insurance or compensation claims in the wake of early twentieth-century welfare legislation is particularly compelling.

Hayward defined the unconscious as the belief that there is “some sort of inner agent which records our experience and organizes its embodiment” which is beyond our control. [6]

In the early twentieth century, the passage of the Workmen’s Compensation Acts (1897, 1900, and 1906) and the National Health Insurance Act (1911) offered a new scheme of sick pay and remuneration for the working population of Britain. These welfare policies set in motion significant changes in primary care. Within the doctor/patient relationship, the interests of the patient changed: they became a claimant seeking financial compensation or insurance, not just medical treatment.

This legislation thus also stimulated a wave of insurance and compensation claims from the working population. Contemporaries lamented the economic and social implications of this increase, highlighting that a situation had been created where “any experience of sickness was bound up with the possibility of unearned reward.” [7]

The unconscious was vital to navigating and disciplining these complexities, and identifying malingerers. Crucially, the use of this concept allowed claims to be assessed without creating an oppositional relationship between the doctor and the claimant.

Moreover, contemporary medical professionals identified a group of “unconscious malingerers” whose “symptoms may be founded on fact, but are mostly imaginary.” [8] Such claimants continued to seek compensation long after their “real physical disabilities” had disappeared. [9]

Technological developments in electrophysiology facilitated the interrogation of a claim: by detecting the electrical currents produced by the heart, the ‘galvanometer’ was believed to reveal the “unspoken attractions and intentions of an investigative subject.” [10] As the unconscious became quantifiable and measurable, its acceptance and use were solidified.

Hayward also sheds light on the place of the unconscious today, as he suggested that we may now have entered an “age of cosmetic psychiatry”, where psychological health is understood as within our control. [11]

In this new age, we are encouraged to shape our identities through an eclectic package of pharmaceutical and therapeutic treatments such as anti-depressants or mindfulness courses.

If we accept this shift, it is important to question what psychological concepts have or will replace those such as the ‘unconscious’ as a means of understanding the health, characters, and lives of others and ourselves. It is moreover useful to consider how these concepts may operate, discipline, or discriminate in a welfare context, such as a PIP assessment.

In his work, Hayward demonstrated how the unconscious shaped insurance and compensation administration. Married with new language and developing electrophysiological technology, this concept supported the interrogation, investigation, and assessment of claimants, and most importantly, the detection of malingerers. The acceptance and meaning of the unconscious was in turn shaped and reinforced by the language and practice which grew up around it.

By analysing the use of this concept, Hayward demonstrated that it is possible to grasp why some people were granted insurance or compensation, and why others were not. His contentions and approach are therefore useful when trying to understand the current enduring and damaging disparity between mental and physical health, which has been highlighted by researchers and evidenced in PIP assessments.

Hayward’s work provides us with a useful template to analyse how, and therefore to understand why, people with mental health conditions are currently losing their welfare entitlement to PIP. His contentions should force us to question how current psychological concepts continue to facilitate and shape the decision-making process and outcome for PIP claimants, and whether these concepts have a role to play in disparity.

There are no simple answers to why parity of esteem continues to be so elusive in practice. This blog hopes, however, to have presented some useful tools to begin to ask the right questions.

Kate McAllister is a first year PhD student at the University of Sheffield’s Department of History. Her research is funded by the Wellcome Trust, and aims to contextualise the current parity of esteem agenda, demonstrating that although this concept has shaped policy for over a century, implementing it in practice has recurrently failed. To navigate the complexities of this issue, her thesis focuses on the outbreak of Epidemic Encephalitis in Sheffield during the 1920s and 1930s.

 

[1] https://www.mind.org.uk/information-support/your-stories/valuing-mental-and-physical-health-equally/#.XFVyJS10dQI

[2] https://www.england.nhs.uk/mental-health/taskforce/

[3] https://www.theguardian.com/society/2019/jan/22/mentally-ill-people-more-at-risk-losing-benefits-study-shows

[4] https://www.gov.uk/government/publications/2010-to-2015-government-policy-welfare-reform/2010-to-2015-government-policy-welfare-reform

[5] Rhodri Hayward, The Transformation of the Psyche in British Primary Care, 1870-1970 (London, 2014)

[6] Hayward, Transformation of the Psyche, p. xi

[7] Hayward, Transformation of the Psyche, p. 37

[8] Hayward, Transformation of the Psyche, p. 36

[9] Hayward, Transformation of the Psyche, p. 36

[10] Hayward, Transformation of the Psyche, p. 42

[11] Hayward, Transformation of the Psyche, p. 130


The Guinea Pigs of Oakholme Road: Pacifism and Medical Research in Wartime Sheffield


At 4.30am on Saturday 8 March 2008, South Yorkshire Police arrived at Oakholme Hall, a 30-bed student residence in Broomhill, Sheffield, and began dispersing the 300-strong crowd gathered outside. As the Sheffield Telegraph reported later that week, what had started as a low-key house party had, due to some unwisely chosen privacy settings on Facebook, been gate-crashed by “hundreds of drunken revellers”.

The ensuing fracas, which resulted in ten arrests, nine on-the-spot fines, and numerous complaints from local residents, led the University of Sheffield’s Pro-Vice-Chancellor, Professor Paul White, to denounce those students who would “bring the good name of the university… into disrepute” and to threaten expulsion for those who continued to flout rules of conduct. In response to White’s comments, Students’ Union President Mark Willoughby stressed that the party was an isolated incident and instead pointed to those students who conscientiously contributed to the local community, including “over 1,000 [who] are involved in voluntary work across the city.”

Willoughby’s appeal to voluntary work in an attempt to rehabilitate the tarnished reputation of Sheffield’s student population in 2008 provided a fortuitous call-back to the little-known place of Oakholme Road in the history of medicine and warfare. It was next door to Oakholme Lodge, at 18 Oakholme Road, that the Sorby Research Institute (SRI) was founded in December 1940. Although today merely another student hall, during the Second World War the building functioned as a site of unprecedented medical experimentation on human volunteers drawn from Sheffield’s community of pacifists and conscientious objectors (COs). Over the following six years, these ‘human guinea pigs’ would subject their bodies to infectious diseases, deficient diets, shipwreck simulations, stab wounds, and even bouts of malaria and scurvy. 1

To understand why pacifists would volunteer for these unpleasant tasks, it is necessary to consider the ambiguous position of COs in 1940s Britain. Whereas the well-publicised brutality inflicted on COs during the First World War generated a great deal of sympathy and solidarity, the comparative tolerance shown to their successors in 1939 caused something of an existential crisis for many in the pacifist community about how best to serve humanity and resist war. 2

This anxiety was particularly pronounced among young, university-age pacifists who increasingly rejected overly ‘intellectual’ and ‘academic’ forms of protest and instead promoted more practical, grounded, and physical kinds of war work such as agricultural labour, humanitarian relief, and medical aid. As well as being spurred on by their political beliefs, this drive towards more taxing kinds of labour was shaped by the mockery and scorn often directed towards university-educated pacifists by military tribunals and the local press. Comments regarding the application of Richard Charles Clarke, a 20-year-old student at the University of Sheffield, for exemption from military service, were typical. “You are receiving your education from the State, and you are not prepared to do anything in return,” the tribunal chairman concluded, before registering Clarke for military service against his wishes. 3

From this perspective, serving as a ‘human guinea pig’ made perfect sense: it offered the young, eager pacifist a form of labour that was constructive and humanitarian, but that at the same time involved painful and unpleasant trials through which they could prove their bravery and commitment. The SRI’s experiments therefore offered a rare opportunity to improve their standing within the local community from mere tolerance to (at least grudging) respect.

It was with this hope in mind that volunteers signed up for the first major experimental programme at the SRI: a series of trials designed to investigate the transmission of scabies, an infectious skin disease caused by parasitic mites which had been rising in incidence since the late 1930s. 4 These experiments required the volunteers to adopt a range of transgressive behaviours: wearing dirty military uniforms, sleeping naked between soiled bedsheets, and even sharing beds with infected soldiers. By presenting these unusual labours as vital to the protection of national health, the volunteers were able to overcome suspicion and distrust about their CO status to secure praise from local newspapers, gain sympathy from tribunal panels, and even reconcile with previously estranged family members.

Many of these benefits were short-lived, however. In the later years of the war, a shift towards less ‘exciting’ nutritional experiments, which largely required volunteers to adopt monotonous diets for months and even years at a time, restricted the SRI’s capacity to transform maligned pacifists into unlikely wartime heroes. 5 As such, when the SRI closed in February 1946 to make way for “a student hostel”, many of the volunteers returned to their pre-war lives with little more to show for their efforts than disrupted careers, diminished finances, and compromised bodies. Nevertheless, for a short time, the house on Oakholme Road provided a space where a young, marginalised group could remake its public image against a backdrop of hostility and suspicion. Future party-throwers, take note.

David Saunders is a PhD student at the Centre for the History of the Emotions at Queen Mary University of London. His research focuses on medical experimentation and the politics of citizenship in wartime Britain.

Notes:

  1. For an overview of the SRI, see Kenneth Mellanby, Human Guinea Pigs (London: Victor Gollancz Ltd., 1945).
  2. See Martin Ceadel, Pacifism in Britain 1914-1945: The Defining of a Faith (Oxford: Clarendon Press, 1980), 301-305.
  3. “Pacifist Tells Tribunal He Loves Hitler,” Sheffield Telegraph, 24 November 1939, p.6.
  4. See Kenneth Mellanby, Scabies (London: Oxford University Press, 1943).
  5. See E.M. Hume and H.A. Krebs, Vitamin A Requirement of Human Adults: An Experimental Study of Vitamin A Deprivation in Man (London: His Majesty’s Stationery Office, 1949); W. Bartley, H.A. Krebs and J.R.P. O’Brien, Vitamin C Requirement of Human Adults: A Report by the Vitamin C Subcommittee of the Accessory Food Factors Committee (London: His Majesty’s Stationery Office, 1953).