
Medical Humanities

Jonas Salk turns 105: Some thoughts on lessons from history


Jonas Salk would have been 105 today, 28 October. He is remembered as the inventor of the polio vaccine who, when asked how much money he stood to make, declared: ‘There is no patent. Could you patent the sun?’

Of course, “it’s more complicated than that”. Salk was part of a multi-national, multi-agency project to develop prophylactics. Without the use of “his” injectable vaccine and the oral vaccine developed by rivals on the other side of the Iron Curtain, humanity would not be on the verge of eliminating polio. (For more on that story, see the excellent book by Dora Vargha.)[1] And one of the reasons Salk didn’t patent the vaccine was that it was unpatentable.

But let’s not be uncharitably pedantic. It is, after all, his birthday.

In the wake of recent reports of resurgent infectious diseases – including polio – vaccination is back in the news. (If, indeed, it ever went away.) Matt Hancock, the UK’s Secretary of State for Health and Social Care, has suggested the government might consider mandatory vaccination. Public health experts have cautioned against this, using (in part) historical evidence. In the nineteenth century, compulsory vaccination generated a well-organised, vocal and occasionally violent anti-vaccination movement,[2] the effects of which still haunt Britain’s public health authorities.

Public health has taken its lessons from high-profile examples of crisis – smallpox, pertussis or measles, to name but three.[3] But not all problems come from rejection of vaccines. With polio in the 1950s, the problem was the government’s inability to meet demand.

Salk’s vaccine (yes, we’ll give him credit here – after all, contemporaries referred to the inactivated poliomyelitis vaccine simply as “Salk”) became commercially available in 1955. The British government announced with great fanfare that it would provide the vaccine for free to all children and young adults. There was clear demand for it. This invention – in the same vein as space exploration and penicillin – was a marker of modernity, the power of science to solve once-intractable problems.

Unfortunately, there was not enough to go around. In 1955, a manufacturing defect at Cutter Laboratories resulted in the accidental infection of hundreds of American children. As a result, the British banned American imports and chose to use domestic factories to produce a “safer” form of the vaccine.[4] But Britain didn’t have the capacity to produce enough doses in time. Shortages prompted complaints from the British press and parents, and – despite the demand – few registered for the vaccine because of the long waiting lists and inconvenience.

As proof of the demand for the vaccine – despite the Cutter incident – local authorities were swamped with requests whenever high-profile cases made the news. The death of professional footballer Jeff Hall showed that even fit, young people could be affected, and it created a surge in the number of younger adults presenting themselves and their children for the jab. In the ensuing shortages, the health minister blamed people for their apathy – if they’d just done as they were told, when they were told, the government could have better distributed the vaccine over the course of the year. This did not go down well as a public relations exercise.

This crisis was eventually overcome through the introduction of the oral polio vaccine in the early 1960s. Because it was taken on a sugar cube, parents were much more willing to present their children. It was a quick process that could be done anywhere; it didn’t hurt (though its taste left something to be desired); and it could be manufactured so easily, and in such volume, that there was no need to wait around for months for the next batch to become available.

Of course, all historical circumstances are different. Anti-vaccination literature is certainly more visible than it was in the 1950s. Populations are more mobile. The immediate memory – even fear – of certain infectious diseases has faded.

At the same time, the intriguing part of this history – at least to this historian – is not why people don’t vaccinate their kids. It’s why so many do.[5] The vast majority of children receive some form of vaccination – upwards of 95 per cent – even if they do not always complete every course in the recommended time frame.

The great improvements in vaccination rates over the past 70 years have come from better administration. Easier-to-administer vaccines. More-robust procedures for following up on missed appointments. Advertising. Having local health professionals answer the specific questions and concerns individual parents might have. Following up with patients who might otherwise slip through the surveillance of public health authorities (such as those who do not speak English, regularly change addresses, or have other acute social care needs). All these things required resources, which have been squeezed significantly since public health was reintegrated into already-struggling local authorities.

It would be unwise for a historian to say that this is the cause of the problems, or that extra funding will provide a magic-bullet solution.

It is, however, worth reminding ourselves that crises in vaccination policy are not new. We have experienced them before. And not all of them have been due to a lack of demand or fear of a particular vaccine. The 1950s polio example shows us that more practical issues can be at play, and that the public and its collective behaviour are not necessarily at the root of them.

Gareth Millward is a Wellcome Research Fellow at the Centre for the History of Medicine at the University of Warwick. He has worked on the history of disability, public health, vaccination and most recently sick notes. His book Vaccinating Britain was published in January 2019 by Manchester University Press.

[1] Dora Vargha, Polio across the Iron Curtain: Hungary’s Cold War with an Epidemic (Cambridge: Cambridge University Press, 2018).

[2] Nadja Durbach, Bodily Matters: The Anti-Vaccination Movement in England, 1853–1907 (Durham: Duke University Press, 2005).

[3] Stuart Blume, Immunization: How Vaccines Became Controversial (London: Reaktion, 2017).

[4] Hannah Elizabeth, Gareth Millward and Alex Mold, ‘“Injections-While-You-Dance”: Press advertisements and poster promotion of the polio vaccine to British publics, 1956–1962’, Cultural and Social History 16:3 (2019): 315–36.

[5] Gareth Millward, Vaccinating Britain: Mass Vaccination and the Public Since the Second World War (Manchester: Manchester University Press, 2019), p. 1.


PIP, Parity, and the Past: why history matters

String galvanometer and human electrocardiogram

Few would deny that living with a mental health condition today often means living with stigma, limited support, and restricted access to services. It has also become recognized that these issues do not affect people living with a physical health condition in the same way, thus leading to calls for ‘parity of esteem’ from charities such as Mind. [1]

Parity of esteem is best understood as valuing mental health equally with physical health, and in 2015 a government taskforce was created to achieve this. [2]

Nonetheless, disparity was recently brought into sharp focus by researchers at the University of York, who revealed significant differences in the allocation of Personal Independence Payment (PIP) to people who have a mental health condition, in comparison to people living with physical health conditions such as diabetes.[3]

PIP was introduced as part of the 2012 Welfare Reform Act, and supports people aged 16 to 64 who are living with long-term health conditions or disabilities. [4]

The York researchers cited assessors’ “informal observation” of appearance and body language when making eligibility decisions as a potential cause of this disparity.

History, however, provides useful insight when attempting to understand how, rather than just why, such disparity between the mental and the physical may emerge in welfare contexts.

This is exemplified by the work of Rhodri Hayward, who traced the emergence and uses of the concept of the ‘unconscious’ in early twentieth-century British primary healthcare. [5]

In this comprehensive book, Hayward’s focus on how the unconscious facilitated the interrogation of insurance or compensation claims in the wake of early twentieth-century welfare legislation is particularly compelling.

Hayward defined the unconscious as the belief that there is “some sort of inner agent which records our experience and organizes its embodiment” which is beyond our control. [6]

In the early twentieth century, the passage of the Workmen’s Compensation Acts (1897, 1900, and 1906) and the National Insurance Act (1911) offered a new scheme of sick pay and remuneration for the working population of Britain. These welfare policies set in motion significant changes in primary care. In the doctor/patient relationship, the interests of the latter changed: patients became claimants seeking financial compensation or insurance payments, not just medical treatment.

This legislation thus also stimulated a wave of insurance and compensation claims from the working population. Contemporaries lamented the economic and social implications of this increase, highlighting that a situation had been created where “any experience of sickness was bound up with the possibility of unearned reward.” [7]

The unconscious was vital to navigating and disciplining these complexities, and identifying malingerers. Crucially, the use of this concept allowed claims to be assessed without creating an oppositional relationship between the doctor and the claimant.

Moreover, contemporary medical professionals identified a group of “unconscious malingerers” whose “symptoms may be founded on fact, but are mostly imaginary.” [8] Such claimants continued to seek compensation long after their “real physical disabilities” had disappeared. [9]

Technological developments in electrophysiology facilitated the interrogation of claims: by detecting electrical currents produced by the heart, the ‘galvanometer’ was believed to reveal the “unspoken attractions and intentions of an investigative subject.” [10] Once the unconscious became quantifiable and measurable, its acceptance and use were solidified.

Hayward also sheds light on the place of the unconscious today, as he suggested that we may now have entered an “age of cosmetic psychiatry”, where psychological health is understood as within our control. [11]

In this new age, we are encouraged to shape our identities through an eclectic package of pharmaceutical and therapeutic treatments such as anti-depressants or mindfulness courses.

If we accept this shift, it is important to question what psychological concepts have replaced, or will replace, those such as the ‘unconscious’ as a means of understanding the health, characters, and lives of others and ourselves. It is moreover useful to consider how these concepts may operate, discipline, or discriminate in a welfare context, such as a PIP assessment.

In his work, Hayward demonstrated how the unconscious shaped insurance and compensation administration. Married with new language and developing electrophysiological technology, this concept supported the interrogation, investigation, and assessment of claimants, and most importantly, the detection of malingerers. The acceptance and meaning of the unconscious was in turn shaped and reinforced by the language and practice which grew up around it.

By analyzing the use of this concept, Hayward demonstrated that it is possible to grasp why some people were granted insurance or compensation, and why others were not. His contentions and approach are therefore useful when trying to understand the current enduring and damaging disparity between mental and physical health, which has been highlighted by researchers and evidenced in PIP assessments.

Hayward’s work provides us with a useful template to analyse how, and therefore to understand why, people with mental health conditions are currently losing their welfare entitlement to PIP. His contentions should force us to question how current psychological concepts continue to facilitate and shape the decision-making process and outcome for PIP claimants, and whether these concepts have a role to play in disparity.

There are no simple answers to why parity of esteem continues to be so elusive in practice. This blog hopes, however, to have presented some useful tools to begin to ask the right questions.

Kate McAllister is a first year PhD student at the University of Sheffield’s Department of History. Her research is funded by the Wellcome Trust, and aims to contextualise the current parity of esteem agenda, demonstrating that although this concept has shaped policy for over a century, implementing it in practice has recurrently failed. To navigate the complexities of this issue, her thesis focuses on the outbreak of Epidemic Encephalitis in Sheffield during the 1920s and 1930s.

 

[1] https://www.mind.org.uk/information-support/your-stories/valuing-mental-and-physical-health-equally/#.XFVyJS10dQI

[2] https://www.england.nhs.uk/mental-health/taskforce/

[3] https://www.theguardian.com/society/2019/jan/22/mentally-ill-people-more-at-risk-losing-benefits-study-shows

[4] https://www.gov.uk/government/publications/2010-to-2015-government-policy-welfare-reform/2010-to-2015-government-policy-welfare-reform

[5] Rhodri Hayward, The Transformation of the Psyche in British Primary Care, 1870–1970 (London: 2014)

[6] Hayward, Transformation of the Psyche, p. xi

[7] Hayward, Transformation of the Psyche, p. 37

[8] Hayward, Transformation of the Psyche, p. 36

[9] Hayward, Transformation of the Psyche, p. 36

[10] Hayward, Transformation of the Psyche, p. 42

[11] Hayward, Transformation of the Psyche, p. 130


The Guinea Pigs of Oakholme Road: Pacifism and Medical Research in Wartime Sheffield


At 4.30am on Saturday 8 March 2008, South Yorkshire Police arrived at Oakholme Hall, a 30-bed student residence in Broomhill, Sheffield, and began dispersing the 300-strong crowd gathered outside. As the Sheffield Telegraph reported later that week, what had started as a low-key house party had, due to some unwisely chosen privacy settings on Facebook, been gate-crashed by “hundreds of drunken revellers”.

The ensuing fracas, which resulted in ten arrests, nine on-the-spot fines, and numerous complaints from local residents, led the University of Sheffield’s Pro-Vice-Chancellor, Professor Paul White, to denounce those students who would “bring the good name of the university… into disrepute” and threaten expulsion for those who continued to flout rules of conduct. In response to White’s comments, Students’ Union President Mark Willoughby stressed that the party was an isolated incident and instead pointed to those students who conscientiously contributed to the local community, including “over 1,000 [who] are involved in voluntary work across the city.”

Willoughby’s appeal to voluntary work in an attempt to rehabilitate the tarnished reputation of Sheffield’s student population in 2008 provided a fortuitous call-back to the little-known place of Oakholme Road in the history of medicine and warfare. It was next door to Oakholme Lodge, at 18 Oakholme Road, that the Sorby Research Institute (SRI) was founded in December 1940. Although today merely another student hall, during the Second World War the building functioned as a site of unprecedented medical experimentation on human volunteers drawn from Sheffield’s community of pacifists and conscientious objectors (COs). Over the following six years, these ‘human guinea pigs’ would subject their bodies to infectious diseases, deficient diets, shipwreck simulations, stab wounds, and even bouts of malaria and scurvy.[1]

To understand why pacifists would volunteer for these unpleasant tasks, it is necessary to consider the ambiguous position of COs in 1940s Britain. Whereas the well-publicised brutality inflicted on COs during the First World War generated a great deal of sympathy and solidarity, the comparative tolerance shown to their successors in 1939 caused something of an existential crisis for many in the pacifist community about how best to serve humanity and resist war.[2]

This anxiety was particularly pronounced among young, university-age pacifists who increasingly rejected overly ‘intellectual’ and ‘academic’ forms of protest and instead promoted more practical, grounded, and physical kinds of war work such as agricultural labour, humanitarian relief, and medical aid. As well as being spurred on by their political beliefs, this drive towards more taxing kinds of labour was shaped by the mockery and scorn often directed towards university-educated pacifists by military tribunals and the local press. Comments regarding the application for exemption from military service by Richard Charles Clarke, a 20-year-old student at the University of Sheffield, were typical. “You are receiving your education from the State, and you are not prepared to do anything in return,” the tribunal chairman concluded, before registering Clarke for military service against his wishes.[3]

From this perspective, serving as a ‘human guinea pig’ made perfect sense: it offered the young, eager pacifist a form of labour that was constructive and humanitarian, but at the same time offered painful and unpleasant trials through which they could prove their bravery and commitment. The SRI’s experiments, therefore, offered a rare opportunity to improve their standing within the local community from mere tolerance to (at least grudging) respect.

It was with this hope in mind that volunteers signed up for the first major experimental programme at the SRI: a series of trials designed to investigate the transmission of scabies, an infectious skin disease caused by parasitic mites which had been rising in incidence since the late 1930s.[4] These experiments required the volunteers to adopt a range of transgressive behaviours: wearing dirty military uniforms, sleeping naked between soiled bedsheets, and even sharing beds with infected soldiers. By presenting these unusual labours as vital to the protection of national health, the volunteers were able to overcome suspicion and distrust about their CO status to secure praise from local newspapers, gain sympathy from tribunal panels, and even reconcile with previously estranged family members.

Many of these benefits were short-lived, however. In the later years of the war, a shift towards less ‘exciting’ nutritional experiments, which largely required volunteers to adopt monotonous diets for months and even years at a time, restricted the SRI’s capacity to transform maligned pacifists into unlikely wartime heroes.[5] As such, when the SRI closed in February 1946 to make way for “a student hostel”, many of the volunteers returned to their pre-war lives with little more to show for their efforts than disrupted careers, diminished finances, and compromised bodies. Nevertheless, for a short time, the house on Oakholme Road provided a space where a young, marginalised group could remake its public image against a backdrop of hostility and suspicion. Future party-throwers, take note.

David Saunders is a PhD student at the Centre for the History of the Emotions at Queen Mary University of London. His research focuses on medical experimentation and the politics of citizenship in wartime Britain.

Notes:

  1. For an overview of the SRI, see Kenneth Mellanby, Human Guinea Pigs (London: Victor Gollancz Ltd., 1945).
  2. See Martin Ceadel, Pacifism in Britain 1914–1945: The Defining of a Faith (Oxford: Clarendon Press, 1980), 301–305.
  3. “Pacifist Tells Tribunal He Loves Hitler,” Sheffield Telegraph, 24 November 1939, p.6.
  4. See Kenneth Mellanby, Scabies (London: Oxford University Press, 1943).
  5. See E.M. Hume and H.A. Krebs, Vitamin A Requirement of Human Adults: An Experimental Study of Vitamin A Deprivation in Man (London: His Majesty’s Stationery Office, 1949); W. Bartley, H.A. Krebs and J.R.P. O’Brien, Vitamin C Requirement of Human Adults: A Report by the Vitamin C Subcommittee of the Accessory Food Factors Committee (London: His Majesty’s Stationery Office, 1953).