

COVID-19, ‘Big Government’, and the Prohibition of Alcohol: Crisis as a Transnational Moment for Social Change


Throughout history, crises have often led to enormous social and economic reform as policymakers are forced to come up with new ways to meet unexpected demands. As Walter Scheidel argues in his book, The Great Leveller (2017), mass violence has been the primary impetus for the decline of inequality throughout world history, most recently with the Second World War serving as a watershed in relation to increased government spending on social programmes in many of its participating states. Although a crisis of a very different nature, the current coronavirus pandemic has also brought about similar shifts, with governments running huge budget deficits to protect jobs and counteract the threat of a looming recession caused by travel restrictions and lockdowns.

We have also seen governments experiment with creative solutions to crises that stretch across borders, as is the case with the current global pandemic. For a variety of reasons, a small handful of countries have resorted to banning the sale of intoxicants. One of the most debated aspects of South Africa’s lockdown has been its prohibition on the sale of alcohol and cigarettes, intended to reduce hospital admissions and secure beds for COVID-19 patients. Admissions have dropped by two-thirds owing to reductions in alcohol-related violence and accidents, but such draconian measures have also meant a rise in black-market trade and the near-collapse of the country’s proud wine industry.

The sale of alcohol was also banned on the Caribbean island of Sint Maarten, a constituent country of the Kingdom of the Netherlands, and in Nuuk, the capital of Greenland, because of drink’s role in exacerbating the incidents of domestic violence that came with the lockdown. In Thailand, the prohibition on alcohol was put in place to prevent the spread of the virus at social gatherings. In each setting, such policies were deemed drastic but necessary, their advantages in tackling a variety of health concerns carefully weighed against their clear downsides.

Although instituted under entirely different circumstances, the First World War was also a moment when similarly harsh controls were imposed on the sale of alcohol across the world. France and Russia were the first to institute bans on absinthe and vodka, respectively, owing to concerns over their impact on wartime efficiency. Countries in which anti-alcohol temperance movements were already influential also implemented tough restrictions of varying severity. Although the production and sale of alcohol had already been banned in various local jurisdictions in Canada and the United States, national prohibition came to fruition in both countries because of the war. Alcohol was not banned in Britain, but the country nevertheless instituted far-reaching controls on the distribution of drink under the Central Control Board (CCB), established in 1915 to enforce higher beverage duties and shorter closing hours in pubs.

In almost every instance, it was the context of the war that spurred the move towards instituting these tough restrictions. Temperance activists in North America had been pushing for national prohibition for decades, but wartime conditions, such as the rise of anti-German sentiment directed at German-American breweries like Anheuser-Busch, brought the federal implementation of prohibition to the forefront of national politics. In Britain, part of the CCB’s responsibility was the nationalisation of pubs and off-licences situated in parts of the country that were of strategic importance to the war effort.

These contexts directly parallel what we’re seeing in South Africa and Thailand, where extraordinary circumstances necessitated extraordinary countermeasures. However, there is also an important difference that must be stressed: while current lockdown prohibitions are merely temporary, most advocates of prohibitions and controls a century ago believed that such measures were to be permanent, based on their view that there were no advantages to permitting the existence of ‘demon drink’ in society. The ban on the distillation of vodka instituted under Imperial Russia in 1914 was maintained after the October Revolution and was not scrapped until after Lenin, himself an ardent prohibitionist, died in 1924. Within the British context, too, the First World War effectively reshaped alcohol licensing for several generations, as high beverage duties and shorter opening hours were mostly preserved into the interwar and postwar eras.

These cases highlight the broader implications of the social and economic reforms being implemented today. Right-wing governments in both Britain and Japan have approved record levels of government spending in the form of economic aid and stimulus. As Bernie Sanders ended his bid for the Democratic nomination in April 2020, politicians of both the left and the right debated the federal implementation of universal healthcare and paid sick leave in light of the public health crisis. Most recently, the Spanish government announced a €3 billion universal basic income scheme to stimulate the pandemic-hit economy through increased consumer spending. A columnist for The Washington Post was clearly onto something when he declared that ‘there are no libertarians in foxholes’.

It is, however, decidedly too early to predict the long-term impacts of COVID-19 and whether these will lead to what many hope to be a reversal of the neoliberal reforms that have dominated economics since the 1970s. One cannot forget that the ‘Keynesian Resurgence’ in stimulus spending during the Financial Crisis of 2007-08 was immediately followed by the tragedy of the Eurozone Crisis and the traumas of austerity measures that devastated the public sectors of Greece, Spain, Italy, and Britain, among others. Despite that, the impact of abrupt changes in undermining the status quo should not be underestimated, as we saw with the global ‘wave’ of alcohol prohibitions a century before. History, therefore, is an apt reminder that crises are moments when ‘radical’ reforms that were previously only imagined can eventually become reality.

Ryosuke Yokoe is a historian of medicine, science, and public health, presently affiliated with the University of Sheffield as an honorary research fellow. He recently completed a PhD on the medical understandings of alcohol and liver disease in twentieth-century Britain. You can find him on Twitter @RyoYokoe1.

Cover image: Array of liquor bottles, courtesy of Angie Garrett, https://www.flickr.com/photos/smoorenburg/3312808594/ [accessed 28 May 2020].


Human Rights and the COVID-19 Lockdown


The speed with which we have given up some of our most basic rights and freedoms in the face of an incurable epidemic would be noteworthy, if it were not also such a cliché. Everyone has seen films in which the rights-bearing body of an individual becomes a disease-vector, and ultimately little more than toxic waste to be placed under rigorous cordon sanitaire, if not summarily obliterated. The mediocre Tom Clancy techno-thriller Executive Orders (1996) had the USA fight off a weaponised Ebola attack, with only conniving political opportunists moaning about rights, as the pragmatic authorities intoned the legal pabulum “the Constitution is not a suicide-pact!”[i]

Less entertainingly, it is also very nearly a truism of real-life commentary that the inequality with which “rights” are distributed in good times is multiplied in bad ones. While the virus itself may not discriminate, as we have been repeatedly advised, it seems to be having a disproportionate impact in the ethnic-minority communities of major Western nations, while the economic effects of lockdown are, of course, more violently traumatic the closer one is to the margins of society.

Human rights are supposedly universal and unconditional. But the protections they claim to offer have always proven flimsy and threadbare in practice. One reason for this is that the evolution of rights-language in the last three centuries is in fact frequently about two other things: firstly, an idea of grounded, foundational rectitude which has only partially shifted from theological to “scientific” underpinnings, and secondly, the doctrine of state sovereignty, historically entangled with the assertion of national identity. In the way they are used in practice in the world, “human rights” are frequently a cover for assertions and practices that entirely contradict their supposed premise of individual autonomy and security.

Human rights began their modern life as “natural rights”, an offshoot of centuries of European intellectual debate about the existence and contours of “natural law”. Understood, implicitly and explicitly, as a function of the fact of an ordered and purposive divine creation, and of the sovereign state as a component of such an order, rights retained their theological tinge very clearly into the Age of Enlightenment. The US Declaration of Independence invoked the “laws of nature and of nature’s God” as its foundation, spoke of the trinity of life, liberty and the pursuit of happiness as rights “endowed by their Creator” upon men, and appealed to “the Supreme Judge of the world” for validation. Thirteen years later, the French declared the “natural and imprescriptible rights of man” at the heart of a document they decreed to be proclaimed “in the presence and under the auspices of the Supreme Being”.

The French declaration of 1789 also placed the imagined rights-bearing individual in a complex and ultimately subordinated relationship to the other rising force of the era, in stating that “The principle of all sovereignty resides essentially in the nation”, and that “Law is the expression of the general will.” Across the declaration’s seventeen articles, although “every citizen” has the “right” to participate in lawmaking, the law itself – the encoded power of the nation-state – stands above anyone’s “liberty, property, security, and resistance to oppression” (the four enumerated natural rights).[ii]

The modern sovereign nation-state that increasingly took shape in the 1800s was built on claims of inherent superiority that displaced divinity with reason, but were no less, and sometimes more, discriminatory as a result. In France, even before the Revolution had transitioned into Napoleon’s dictatorship, the savants of the new National Institute had taken up the reins of scientific leadership dropped by the abolished royal academies of the old order. Alongside scholars of the sciences and literature, equal prominence was given to practitioners of the “moral and political sciences”.

One of the supposedly great truths that these scholars enunciated, for a country now explicitly referring to itself as “the Great Nation”, was that such a nation, while naturally superior to others, also contained many – multitudes indeed – who did not measure up, individually, to that greatness. France’s leading intellectuals quite deliberately defined the egalitarian republicanism to which they were sworn as something that required, in practice, a rigorous hierarchical division between the fully-enlightened and able elite, and the majority, still seeking to pull themselves out of the mire of the past, who could only expect to be led, gently but firmly, for the foreseeable future.

The legacy of the early nineteenth-century approach to the superiority of rational knowledge has been the creation of waves of ideological thinking, predicated on the foundational entitlement of those who know better to dominate and manipulate the common herd. Over the past two centuries, ideologies from fascism to Marxism-Leninism, via the imperial liberalism that dominated Anglo-American and French public life, have used claims about their superior understanding of past, present, and future to claim the right to forcibly remake humanity for the collective good, using the overwhelming power of the state.

When the founders of the United Nations produced a Universal Declaration of Human Rights in 1948, they proposed to endow all people with a remarkably wide-ranging set of entitlements. The first clause of Article 25 states:

Everyone has the right to a standard of living adequate for the health and well-being of himself and of his family, including food, clothing, housing and medical care and necessary social services, and the right to security in the event of unemployment, sickness, disability, widowhood, old age or other lack of livelihood in circumstances beyond his control.

A noble aim, perhaps, but also a staggering act of hypocrisy on the part of France and the UK, ruling over swathes of subjugated and impoverished empire, the USSR, winding up to launch the antisemitic persecution of the so-called “Doctors’ Plot”, and the USA, mired in racial segregation and discrimination. The ultimate paradox of the notion of individual “rights” is that, if they are violated by a higher power, only a yet-higher and more righteous power can set matters straight. It is easy to believe such a power can exist, much harder to identify it in practice.

The past six decades have seen repeated and ever-more elaborate forms of international covenants binding states to increasing portfolios of rights that purport to demand respect. Yet, where are we? Half of the world’s ten largest countries – more than 3 billion people in those five states alone – are ruled by demagogues and autocrats.[iii] The UN’s “Human Rights Council”, founded in 2006, is a rotating talking-shop of forty-seven states which to date has never failed to include some of the world’s most notorious human-rights abusers in its membership.

Sitting in our homes, in a world which has, with the best intentions, summarily crushed many of our most fundamental everyday freedoms, we might legitimately wonder whether all discussion of “human rights” remains in the shadow of its pre-modern origins. We have, mostly, displaced the notion of divinely-ordained absolute sovereignty with more modern ideas, but we may well have given the sovereign nation and the state that embodies it almost as much power, while gaining in return little real regard for the individuals whose rights it supposedly protects.

David Andress is Professor of Modern History at the University of Portsmouth, and author of many works on the era of the French Revolution. He edited the Oxford Handbook of the French Revolution (2015), and has recently published Cultural Dementia (2018) and The French Revolution: a peasants’ revolt (2019).

Cover image: The universal declaration of human rights, 10 December 1948.

[i] The short-lived 2004 BBC show Crisis Command grimly demonstrated what might happen if a plague outbreak in the UK was not mercilessly stamped out, and to hell with rights.

[ii] According to the canonical text, the law may constrain liberty, in a whole number of ways, if behaviour troubles “the public order established by law”; it may overrule people’s own understanding of both security and resistance to oppression, for “any citizen summoned or arrested in virtue of the law shall submit without delay, as resistance constitutes an offence.” It may even, in the text’s final article, take away property, despite this being reiterated as “an inviolable and sacred right”, as long as due forms are followed and compensation paid. And what those are, of course, will be determined by the law.

[iii] In 2005 the UN invented the doctrine of a collective “Responsibility to Protect” human rights in other states. In 2015 the Saudi government invoked its “responsibility” to “protect the people of Yemen and its legitimate government” in launching the savage and near-genocidal campaign that continues to this day.


Dawson’s ‘Big Idea’: The Enduring Appeal of the Primary Healthcare Centre in Britain


May 2020 marks the centenary of the publication of the Interim Report of the Consultative Council on the Future of Medical and Allied Services, popularly known as the Dawson report after its principal author, Lord Dawson of Penn.[i] The report, commissioned in 1919 by the newly established Ministry of Health, outlined a plan to bring together existing services funded by national health insurance, local authorities, and voluntary bodies in a coherent and comprehensive healthcare system. The final report was never published, being consigned to oblivion by a worsening economy and changed political climate. Though cautiously welcomed by professional leaders, Dawson’s plan was condemned by a hostile press as grandiose and unaffordable.[ii] However, recent NHS policy directives regarding Integrated Care Systems show that the principal task which Dawson’s group had set itself, that of successfully integrating primary, secondary and ‘allied’ health services, is one with which NHS leaders are still grappling today.[iii]

Lord Dawson of Penn, courtesy of the British Medical Association archive

Central to Dawson’s plan, and its most revolutionary idea, was the creation of a network of ‘primary health centres’ (PHCs) in each district in which general practitioners (GPs) could access diagnostic, surgical, and laboratory facilities for their patients and which would also house infant welfare and maternity services, facilities to promote physical health, and space for administration, records, and postgraduate education. GPs and other professionals would see and treat patients at PHCs, referring only complex cases to specialists at secondary care centres (essentially district hospitals) located in large towns, while patients needing the most specialized treatment would be referred to regional teaching hospitals with attached medical schools. This ‘hub and spoke’ model is one to which recent generations of NHS health planners have returned time and again, seemingly unaware of its antecedents.

A firm believer in teamwork, Dawson hoped that collaborative use of PHCs by GPs would encourage group practice and multi-disciplinary working. But the individualistic nature of general practice at that time meant GPs remained wary of his ideas, despite the fact that examples of PHCs already existed in Gloucestershire and in Scotland, and many of the facilities they were meant to comprise could be found in GP-run cottage hospitals and Poor Law infirmaries.[iv] Experiments with architect-designed health centres in the 1920s and 1930s failed to elicit a major change in professional or governmental attitudes.[v] In 1948 the NHS brought public, voluntary and local authority hospitals under state control, but in its early years the promise of new PHCs remained largely unrealised.[vi] Proprietorial traditions and fear of local government control led to a mushrooming of purpose-built, GP-owned practice premises between the late 1960s and 1990s, independently of the local authority-owned health centres for which there was a major building programme in the 1970s.[vii]

Illustration of a Primary Health Centre, from the Dawson Report, courtesy of the BMA archive

Although by the late twentieth century the Dawson report had largely been forgotten, interest in PHCs resurfaced in the early 2000s with a major investment in primary healthcare facilities through the establishment of Local Improvement Finance Trusts (LIFT). These were a form of private finance initiative designed to provide state-of-the-art community health and social care hubs housing GP practices and other services. Unfortunately, LIFT buildings proved more expensive than anticipated, and their facilities, intended to promote the transfer of work from secondary to primary care, were often underutilised.[viii] While these were being constructed, the Labour health minister, Ara Darzi (Lord Darzi), announced the establishment of a number of ‘polyclinics’, bearing a close resemblance to Dawson’s PHC idea. However, the Darzi Centres that were established were either mothballed or repurposed, being condemned as an expensive ‘white elephant’ by professional leaders.[ix]

In the last few years a ‘quiet revolution’ has been taking place in the NHS in England involving attempts to dismantle the financial and institutional barriers between primary, secondary and community care created by the internal market. Its byword, ‘integration’, echoes Dawson’s overriding goal, and the ‘hub and spoke’ model he advocated is now well established. Meanwhile, the pressures of unending demand have forced GPs to collaborate as healthcare providers in locality groups called Primary Care Networks (PCNs). Though guidance on these is not prescriptive, some PCNs have adopted the idea of a community ‘hub’ housing shared diagnostic and treatment facilities much as Dawson had envisaged.[x]

While the full impact of COVID-19 on our struggling health services is still unknown, the abiding necessity for all parts of the NHS to collaborate, communicate and mutually support each other during this crisis underlines the value and relevance of Dawson’s vision of integrated services. It remains to be seen if, in its aftermath, his ‘big idea’ of ubiquitous multi-purpose PHCs will come any closer to being realised.

Chris Locke is a fourth-year PhD student in the History Department at the University of Sheffield. His research focuses on the political consciousness of British GPs and their struggle for professional self-determination in the early twentieth century.

Cover image: LIFT-built Primary Care Centre, Retford, Nottinghamshire, photographed by the author.

[i] Interim Report of the Consultative Council on the Future of Medical and Allied Services, Cmd. 693 (HMSO, 1920). For an account of the origins and significance of the report, see Frank Honigsbaum, The Division in British Medicine (London, 1979), chapters 6-12.

[ii] The British Medical Association’s blueprint for health services reform, A General Medical Service for the Nation (1930) and the report by Political and Economic Planning, The British Health Services (1937) both referenced the Dawson report, and it clearly influenced the Beveridge report, Social Insurance and Allied Services (1942).

[iii] https://www.kingsfund.org.uk/publications/making-sense-integrated-care-systems (last accessed 3 April 2020)

[iv] The report referenced the hub and spoke model of healthcare facilities overseen by Gloucestershire County Council’s Medical Officer of Health, Dr J. Middleton Martin. Commentators also noted similarities with Sir James Mackenzie’s primary care clinic in St Andrews and the Trade Union-run Medical Aid Institutes in South Wales.

[v] Jane Lewis and Barbara Brookes, ‘A Reassessment of the Work of the Peckham Health Centre, 1926-1951’, Health and Society, 61:2 (1983), pp. 307-350. For Finsbury Health Centre, see A. B. Stewart, ‘Health Centres of Today’, The Lancet, 16 March 1946, pp. 392-393.

[vi] For one exception, see R. H. Parry et al., ‘The William Budd Health Centre: the First Year’, British Medical Journal, 15 March 1954, pp. 388-392.

[vii] BMA General Practitioners Committee guidance: The Future of GP Practice Premises (Revised 2010)

[viii] Nottinghamshire Local Medical Committee, NHS LIFT in Nottinghamshire (Nottingham, 1997)

[ix] Peter Davies, ‘Darzi Centres: an expensive luxury the UK can no longer afford?’, British Medical Journal, 13 November 2010, 341; c6237.

[x] https://www.england.nhs.uk/primary-care/primary-care-networks/ (last accessed 3 April 2020)

 


Locating Women in the History of India’s Emergency (1975-1977)


Image: Prime Minister Indira Gandhi addressing a (female) audience in Delhi, 1 March 1977 (Socialist India, 5 March 1977)

More than forty years on from India’s State of Emergency (1975-1977), we are beginning to understand the many ways in which women supported, resisted and experienced this critical period in India’s history.

Forty-two years ago today, on 21 March 1977, India’s State of Emergency collapsed. The Janata Party, a coalition of anti-Emergency opposition groups, defeated Indira Gandhi’s Congress Government at the polls. Gandhi had imposed the Emergency in June 1975, responding to rising opposition and a legal challenge to her position. Her government censored the press, arrested opposition party members and activists, suspended elections and undertook controversial socioeconomic programmes, including coercive sterilisation and aggressive slum clearance. This is now a well-traversed history, and recently there has been a burgeoning of scholarship analysing these events. But the role of women in relation to all aspects of the regime has not commanded sufficient attention.

This is particularly striking for several reasons. A female leader who drew heavily on gendered narratives like Bharat Mata (Mother India) presided over this regime, mobilising such imagery to defend the Emergency’s legitimacy. In one instance, Gandhi stated:

We felt that the country had developed a disease and if it is to be cured soon, it has to be given a dose of medicine. However dear a child may be, if the doctor has prescribed bitter pills for him, they have to be administered for his cure… Now, when a child suffers, the mother suffers too. Thus we were not very pleased to take this step. But we see it worked (Socialist India 15 November 1975).

In 1975 India participated in the UN’s International Women’s Year (IWY) celebrations and the government’s Committee on the Status of Women in India published its report Towards Equality. One of the Emergency’s most infamous policies, coercive sterilisation in the name of family planning, is an issue that has been at the fore of feminist activism and scholarship. Although the Emergency is widely acknowledged as a catalyst for the contemporary women’s movement in India, there has been little attention to women’s activism or experiences during it.

My doctoral research revealed the myriad ways in which women were key to the articulation and implementation of Emergency measures. Depictions of women’s support for the regime were integral to pro-Emergency propaganda. The Congress Party used women-dominated photographs to represent support for the regime, even describing the Emergency as akin to the IWY, as ‘yet another significant event for the welfare of women in this country’ because of its imposition of ‘law and order’ (Socialist India, 21 August 1976). Contrary to such claims, and despite perceptions of the Emergency’s sterilisation policies as a vasectomy programme, my research revealed the negative implications of these policies for women, particularly the impact that the focus on sterilisation had on the Mother and Child Health programme. Women were often at the forefront of families’ attempts to negotiate the Emergency’s many coercive measures. As one man put it, because of the financial pressures the authorities placed on his family, ‘my wife had to get sterilised.’

Women were not simply victims of the Emergency’s repressive measures, nor symbols utilised in the Congress’s pro-Emergency narratives. Women were active in resistance and organised protests against the Emergency. Underground literature offers glimpses of this, recording how in December 1975 Jayawantiben Mehta, Ahilya Rangnekar and Kamal Desai led groups of women protestors in Mumbai as part of an organised Satyagraha (non-violent resistance) campaign. Documentation from Maharashtra’s prisons shows that state authorities there arrested over 500 women during this period for such activities. Once in prison, women cultivated lively cultures of resistance, continuing to protest and maintaining connections with the underground resistance movement. Those who escaped arrest, such as the teacher and later Janata Party Secretary Pushpa Bhave, continued to organise protests and to shelter those participating in underground resistance in their own homes.

The Janata Government that took office in March 1977 had the lowest number of women in parliament. As feminist activist and scholar Dr Ranjana Kumari, who was active in underground activism as a student in Delhi, told me in an interview, ‘there were a lot of women who were very, very active’, but ‘they were all pushed aside post-Emergency… so many of them not even recognised, not even written about, it is sad’. This marginalisation of women in post-Emergency politics has contributed to the absence of their voices and stories from this history. My doctoral research begins to address this gap, but forty-two years on, there is still much work to be done.

Gemma Scott completed an AHRC funded PhD at Keele University in 2018. Her research examines the history of India’s State of Emergency (1975-1977), focusing particularly on women’s activism during this period and women’s experiences of Emergency measures. In 2015, she was an AHRC International Placement Scheme Fellow at the Library of Congress, Washington DC, and in 2016/17 she held a Scouloudi Foundation Doctoral Fellowship at the Institute of Historical Research, University of London. She is currently working as Engagement, Partnerships and Impact Development Officer at Keele University.
