A few weeks before writing this blog, I was launching a book that I had recently published. A former PhD student of mine and someone I consider a friend handed me a copy of my book to sign – and I could not remember her name. It was a moment of acute embarrassment that quickly turned to terror: was this the beginning of the end? Is this what early-onset Alzheimer’s feels like? The moment passed; you will be pleased to know that my brain has been working just fine ever since. But it was a reminder of how the word ‘Alzheimer’s’ has become a central metaphor for our fear of ageing and decline. Today, the specter of dementia is like a cough in a Victorian novel: a harbinger of death.
For a disease that is so present in our collective nightmares, there is a surprising lack of understanding of its nature. The most famous literary depiction of dementia is in Shakespeare’s As You Like It, where old age appears as ‘a second childishness and mere oblivion; sans teeth, sans eyes, sans everything’. My own interest in dementia, though, was sparked by a film called Black Daisies for the Bride, written by the poet Tony Harrison and directed by Peter Symes. It was broadcast on British television on 30 June 1993, coinciding with National Alzheimer’s Day, and was widely acclaimed, winning the 1994 Prix Italia for best documentary as well as that year’s award for best drama at the Mental Health Media Awards. The film focuses on the lives of five Yorkshire women residing in the Alzheimer’s ward of High Royds Hospital, an old Victorian asylum outside Leeds. Uniquely, Harrison attempts to provide viewers with (highly mediated) access to the inner as well as outer worlds of women living with late-stage Alzheimer’s. In the film, Maria Tobin, Muriel Prior, Kathleen Dickenson, Muriel Allen, and Irene Parker retain their humanity, as inflected through the lens of their unique pasts (including former careers as an opera singer and a therapist), their desires and preferences (an obsession with cleanliness, for instance), their interactions with others (their marriages), and their environments – including the hospital’s dark and prison-like wards. The film’s soundtrack is disorientating – the rattling of trolleys, doors being locked and unlocked, footsteps, moans and screams, Muriel Allen repeating ‘I love you, I love you, I love you’, and Maria Tobin (an opera singer in her early life) trilling a high-A note. Imprisonment is visceral as much as metaphorical: ‘Death’s got the only door code’, viewers are told. At one point in the film, the women come to life when entertained by a jovial man singing ‘Oh, You Beautiful Doll’, a painful parody of their current lives. One of the women’s therapists muses on the fact that Muriel Allen was also a therapist in her younger life. Today, though, she observes, Allen is ‘Now beyond all forms of therapy and if Alzheimer’s doesn’t spare a lifetime professional carer and denies a mind of Muriel’s kind... no-one’s free’.

The film is a reflection on selfhood, loss, and humanity’s shared vulnerability. It draws attention to the difficulties experienced by people living with severe cognitive impairments: are they heard; do they fully ‘exist’ in the minds of others? But the film also elicits anxiety in viewers for other reasons: these five women are being presented as entertainment on television. By witnessing their fragmented minds and bodies, are we guilty of being voyeurs of their confusion and pain? We might defensively insist on the importance of making highly impaired people visible and heard, but at what cost to their dignity? Is it even ethical to listen to their disjointed ramblings and vocalisations? After all, they are real women, and their advanced dementia rules out informed consent. Are people with dementia even part of our collective humanity? If they are, what does this mean in terms of our interactions with them? And, as I hinted at the very start of this blog, what happens when ‘they/them’ becomes a future ‘I/me’?
These are important questions. As with all my blogs in this series on the ‘Cultural History of Disease’, I believe that history provides useful ways to think about worlds of pain and dis-ease. Historical frames of meaning help us to think anew about how we might create better futures. As historian Jesse F. Ballenger has argued, ‘without a sense of history, without the ability to construct a coherent narrative linking the present to the past as well as to the future, public discourse on Alzheimer’s will itself be confused, disorientated’. In other words, dementia is a subject ‘good to think’.
Thinking with dementia is difficult. Thinking with people who are living with such impairments is also demanding because our hypercognitive culture silences cognitive alterities. There is no mystery about why dementia is so frightening. Dementia fuses cognitive degeneration, chronicity, and incurability – a most anxiety-ridden triad. Today, memory has become the shibboleth of modernity. It is central to our sense of selfhood. Once the plaques (protein deposits) and tangles (twisted fibers) in the brain have choked off memories, where is the self located? The body may be present – but stripped of its social identity. Personhood is often assumed to depend on the ability to recognize oneself in history and to place oneself, others, and the environment into a coherent narrative. That ancient Delphic injunction has never been more central: ‘know thyself’.
But we cannot turn away from dementia or people living with dementia: it is as emblematic of the late twentieth and twenty-first centuries as hysteria was of the nineteenth. Worldwide, more than 50 million people are living with dementia. Since the incidence of Alzheimer’s Disease doubles for every five-year period after the age of 65, an ageing population means rising numbers of people living with dementia. This has resulted in almost apocalyptic predictions about a ‘tsunami’ of dementia and the fearful rise of this ‘demon in our midst’. In 2021, the World Health Organization declared dementia the seventh leading cause of death among all diseases. It is one of the major causes of disability and dependency among older people worldwide.
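To see what this doubling claim implies, here is a minimal worked equation – a back-of-the-envelope sketch, assuming only the five-year doubling pattern just stated, with $I_{65}$ (my own notation) standing for the annual incidence rate at age 65:

$$I(65 + 5k) \approx I_{65} \cdot 2^{k}, \qquad \text{so that} \quad I(85) \approx I_{65} \cdot 2^{4} = 16\, I_{65}.$$

On this rough model, an 85-year-old is roughly sixteen times more likely than a 65-year-old to develop the disease in a given year – which is why even modest gains in longevity translate into steep rises in the number of people living with dementia.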
Admittedly, these statistics are deceptive. After all, ‘dementia’ is a name for a series of symptoms. The term comes from the Latin ‘de’ (out of), ‘mens’ (mind), and ‘-ia’ (state of): a state of being out of one’s mind. Dementia (of which Alzheimer’s is the most prevalent form) is a cluster of symptoms that includes memory loss, impairments in everyday activities, emotional volatility, impaired ability to understand or produce speech, and difficulties in recognition. It presents itself in very different ways; one might almost say it is ‘unique to the individual sufferer’. Furthermore, the diagnosis has always been contested: it rests on ‘tipping points’, the moments when ‘normal’ forgetfulness (forgetting a single name) is judged to have become memory ‘loss’ (forgetting a great deal).
Even today, there are scientific uncertainties around what constitutes Alzheimer’s Disease. Some people with symptoms of Alzheimer’s have brains that lack the typical plaques and tangles, while pathological signs can be found in some healthy brains. In other words, although amyloid build-up is associated with neurodegeneration, there are people with significant deposits of amyloid but no negative symptoms; many people with plaques, tangles, and cell loss never develop Alzheimer’s. The one thing most people with dementia share is old age. Uncertainty about the ‘disease model’ of Alzheimer’s has led anthropologist Philip B. Stafford to ask,
What would it mean to entertain the possibility that age is the cause of our decline? It would mean sacrificing control over an element of nature we have been led to believe is subject to our influence. It would mean acquiescing to the suzerainty [sic] of death, an action not taken easily by our culture which seeks to prolong life at all costs.
Nevertheless, the disease model has persisted, in part because it reassuringly contends that dementia is something that can, ultimately, be controlled and even eradicated.
For those of you who have followed all my blogs, it won’t surprise you at all that there are vast disparities in who is more likely to develop dementia. For example, in the U.S., the prevalence of Alzheimer’s is twice as high among African Americans and Hispanics as among non-Hispanic whites. Despite such differences in risk, most research into dementia is carried out on white people. In part, this is because minoritized people are more anxious about becoming involved in medical trials – a legitimate concern given the historical role of physicians in justifying racial inferiority and experimenting on Black bodies. Minoritized people also experience the disease differently. Because they are diagnosed at later stages of the disease, their symptoms tend to be more severe, meaning that they are also less able to make decisions about what form of management best suits their needs. Their symptoms are also often downgraded by being called the ‘old timer’s disease’ or, in Hispanic communities, ‘the craziness’ (or ‘el loco’). Although dementia is undoubtedly raced, sociologist Maria Zubair warns that the ‘implicit white conceptual, ideological and political underpinnings’ of most dementia research ‘reinforce and legitimise the racial status quo at the expense of racialised Others’. She is dismayed by the way ‘ethnicity’ is conceptualized, observing that ‘framing… the minority ethnic equality-of-access issue largely in terms of minority ethnic “culture” and “cultural difference”’ too often results in
the use of a deracialised language. This silences and makes invisible the critical role of ‘race’ and racialised social locations and positionings in creating inequalities for minority ethnic persons.
A similar argument is made by ‘crip-of-colour’ critics such as feminist disability scholar Jina B. Kim, who urges people to ‘hold racism, illness, and disability together, to see them as antagonists in a shared struggle, and to generate a poetics of survival from that nexus’. Intersectional analyses tie together disability, gender, class, racialization, age, and so on to make the point that overlapping, minoritizing identities dramatically increase a person’s risk of neglect and abuse, as well as of being subjected to physical and pharmaceutical restraints.
Before we continue this exploration of dementia in its social and cultural aspects, what is its medical or scientific history? The clinical signs of dementia were known to the ancient Greeks, although even Hippocrates did not list it as a mental disorder, regarding it as a normal aspect of aging. In the first and second centuries, physicians in the Roman world such as Celsus and Galen catalogued it under terms such as ‘morosis’. Well into the eighteenth century, it was spoken about as ‘madness’ or ‘melancholia’. It was as much a legal category as a medical one, dominated by judgments about the validity of wills and debates about competency (to marry or inherit, for example). It was only with the growth of the materialistic sciences that dementia was identified as a pathology of the brain. In 1797, the French physician Philippe Pinel first used the term ‘démence’ in this clinical sense; in 1838, one of his pupils, Jean-Étienne Esquirol, distinguished age-related dementia from imbecility. They described dementia as a disability that affects ‘discernment’ as well as ‘intellectual ability’, observing that it resulted from ‘brain diseases’ and caused the person to ‘lose joyfulness’. Scientists such as Erasmus Darwin and Franz Josef Gall went further: their interest was in cerebral localization, making a link between mind and brain. Camillo Golgi’s invention of nerve staining in the 1870s was another important milestone, since it enabled scientists to develop a morphology of the brain.
In public discourse, though, dementia is colloquially known as ‘Alzheimer’s’, after the German neuropathologist Alois Alzheimer. On 3 November 1906, he gave a lecture entitled ‘On a Peculiar, Serious Disease Process of the Cerebral Cortex’, which focused on Auguste Deter, a patient he had first examined in 1901, when she was 51 years old. Deter was seriously unwell: she could not remember things; she suffered delusions and hallucinations; and she was agitated, sometimes aggressive, irrationally jealous, and psychologically distressed. In his case notes from 1901, Alzheimer recorded that Auguste Deter
sits on the bed with a helpless expression. What’s your name? Auguste. Last name? Auguste. What is your husband’s name? Auguste, I think. Your husband? Ah, my husband. She looks as if she didn’t understand the question. Are you married? To Auguste. Mrs D? Yes, yes, Auguste D.... At lunch she eats cauliflower and pork. Asked what she is eating she answers spinach.... The patient is not able to progress in writing and repeats, I have lost myself.
On 8 April 1906, at the age of 55, Deter died of septicemia and pneumonia, having lost virtually all cognitive ability. A postmortem of her brain revealed ‘plaques, neurofibrillary tangles, and arteriosclerotic changes’. For Alzheimer, this was not a different disease from dementia but rather an atypical case of senile dementia found in a relatively young person. It was Emil Kraepelin (the founder of modern scientific psychiatry) who named the disease after Alzheimer when, in 1910, he published a revised version of his famous textbook. For Kraepelin, what distinguished Alzheimer’s Disease from the ‘normal’ pathological processes associated with aging was the age at which symptoms appeared – and evidence of brain pathology suggested that it was a disease. Interestingly, though, the coinage of the term ‘Alzheimer’s Disease’ attracted very little comment amongst physicians of the time. It was not even mentioned in the numerous obituaries after Alois Alzheimer’s death in 1915. But this was the start of a shift that was to move attention away from individual patients and towards macroscopic and microscopic analyses of brain tissue.
Brain pathology dominated research until the period between the 1930s and the 1950s, when some physicians began arguing for more psychosocial approaches. In 1945, for example, anthropologist Leo W. Simmons published The Role of the Aged in Primitive Society, which illustrated the very different ways aged members of society were treated. He argued that the elderly were more integrated within pre-industrial cultures, and in ways that facilitated their flourishing. In other words, senility was universal, but responses to it were culturally contingent. Simmons contended that ‘modern civilization’ had ‘disrupted’ the ‘time-tested adaptations of the aged’ and had ‘perhaps even regressed in its solution of the problem of successful aging’.
American psychiatrist David Rothchild took such arguments further, reframing ideas about dementia away from brain pathology and towards its social origins. Rothchild argued that physicians needed to explore the social contexts in which people developed specific symptoms. Compulsory retirement, increased leisure, and the disintegration of the family (which meant that many older people were socially isolated) were blamed for its debilitating symptoms. Gerontologists began addressing questions relating to the personal and social ‘adjustment’ of people as they aged; they identified problems associated with weak or even non-existent social support networks. Gendered and classed expectations were explored, leading in some instances to the erroneous assumption that the most prominent sufferers of dementia would be white, middle-class men, because they were assumed to be most affected by compulsory retirement and lack of familial ties (in fact, two-thirds of people with dementia are women, and, as we have seen, older people-of-colour are twice as likely to have dementia as older white people). In 1953, Maurice Linden and Douglas Courtney developed these arguments in an article in the American Journal of Psychiatry. For them, senility was ‘largely a cultural artifact’ since it was a consequence of ‘attitudinal alterations’. They lamented that ‘little place is found in a mobile and aggressive society... for individuals in the postreproductive [sic] phase of life’. Similarly, for David Wilson, writing two years later in the same journal, ‘lonesomeness, lack of responsibility, and a feeling of not being wanted all increased the restricted view of life which in turn leads to restricted blood flow.’
From the 1970s, dementia increasingly became a topic of concern, due to a heightened focus on public health, a greater confidence in biomedical interventions (including pharmaceutical ones), and widening acknowledgement about the impact of lifestyles on health. Most important, though, was a growing awareness of dramatic shifts in human longevity. Between 1900 and 2000, life expectancy for Americans at birth jumped from fifty to seventy-five years – that is, a 50 per cent increase.
A new generation of physicians sought to capitalize on these changes. Senility was rapidly becoming medicalized. In her analysis of articles listed in The Reader’s Guide to Periodical Literature, Laura Davidow Hirshbein found that between 1900 and 1924, around 80 per cent of articles listed under the heading ‘old age’ were written by or about older people; from 1925 to 1932, about half were by or about older people; while from 1932 to 1941, 70 per cent were written by or about professionals. What this indicates is the increasing power of medical specialism over knowledge about, and interventions in, the lives of older people.
This medicalization was not inevitable. Initially, geriatrics struggled to be recognized. The problem was not only the perceived lower status of elderly patients but also the fact that, in the nineteenth and early twentieth centuries, medical specialisms tended to be organized around ‘body systems’ (orthopedics, cardiology, or gynecology, for instance). In contrast, geriatrics was concerned with the maintenance of health at a particular stage of life. To counter this objection, proponents drew comparisons between geriatrics and pediatrics. This was the point made in a 1909 article in the New York Medical Journal by the New York physician Ignaz Leo Nascher, in which he coined the term ‘geriatrics’ (from ‘geras’, meaning old age, and ‘iatrikos’, meaning ‘relating to the physician’). Nascher maintained that old age was ‘a physiological entity as much so as the period of childhood’. While ‘childhood has received special attention by physicians and a special branch of medicine has been assigned to it’, the same should be the case with old age. It, too, has ‘an individuality of its own as clearly defined as childhood, with anatomical features, physiological functions, diseases’. Since old age requires ‘treatment differing from maturity’, old age and senility should be assigned to a ‘special branch of medicine’.
Comparisons between pediatrics and geriatrics continued throughout the specialism’s early decades. Malford Wilcox Thewlis, co-founder in 1942 of the American Geriatrics Society, insisted upon the connection in the 1941 edition of The Care of the Aged (Geriatrics). He reminded readers that
As more people live to a riper age, there will be an increasing need of focusing on their needs and their scientific care. It is now recognized that the diseases of old age (geriatrics) require special attention just as diseases of children (pediatrics) do.
A couple of years later, Marjory W. Warren, the ‘high priestess of geriatrics’, made a similar comment, noting that when she was a medical student, the specialism of pediatrics was unheard of. She recalled that children were ‘too often nursed in adult wards (there being no special wards set apart for them), and too often junior medical and nursing staff were considered all that was necessary for their care’. Similarly, she continued, ‘To-day much the same attitude is shown towards the aged or the chronic sick – a class which includes the majority of elderly folk’. While gerontologists urged people to supplement rhetoric about the ‘Century of the Child’ with that of the ‘Century of the Aged’, commercial firms sought to capitalize on the relationship by (to take one example) marketing baby-foods (such as Gerber’s) to elderly people.
From the 1960s, electron microscopic studies of senile plaques and neurofibrillary tangles shifted attention back to brain pathology. An important researcher in this field was neurologist Robert Katzman, who argued that the distinction between senile dementia and Alzheimer’s presenile dementia was untenable: both should be called Alzheimer’s Disease. In a 1976 article entitled ‘The Prevalence and Malignancy of Alzheimer Disease. A Major Killer’, Katzman contended that
Alzheimer disease and senile dementia are a single process and should, therefore, be considered a single disease. Both Alzheimer disease and senile dementia are progressive dementias with similar changes in mental and neurological status that are indistinguishable by careful clinical analysis.
He called upon physicians to ‘drop the arbitrary age distinction’ based on whether a person with symptoms was younger or older than sixty-five, and call both ‘Alzheimer disease’.
By the 1980s, the term Alzheimer’s was widespread in popular culture as well as medicine. Anxieties about an aging population, the popularization of medical debates, and investments by pharmaceutical industries were important. But the rise of the caregivers’ movement (which focused attention on carers rather than patients) was equally important, along with a growing awareness of the public health costs of dementia (as early as the 1980s, Alzheimer’s was ranked among the ten most common causes of death in the U.S.). Public interest was also encouraged by the exponential growth of professional services with an interest in distinguishing ‘frail’ from ‘confused’ older people. The energetic advocacy of numerous national Alzheimer’s societies and other organizations amplified awareness. In the U.S., the most important of these institutions was the National Institute on Aging (NIA), founded in 1974 as part of the U.S. Department of Health, Education and Welfare. Money for research dramatically increased. Within a decade, the institute had funded ten research centres dedicated to exploring the disease. Between 1976 and 1989, U.S. federal funds for research on Alzheimer’s increased from $4 million to $123 million – a rise of almost 3,000 per cent. The institute promoted the use of the term ‘Alzheimer’s Disease’, rather than ‘senility’ or the broader ‘dementia’. As one fundraiser for the NIA admitted, ‘The name of the game is “Alzheimer”…. You can sell Alzheimer’. Advocates sought to raise the profile of degenerative aging processes by drawing links between dementia and the ‘war’ on the infectious disease of polio (see my blog on polio). The NIA contended that Alzheimer’s Disease, like polio, was not a ‘normal’ aspect of aging but a disease that required pathological investigation and demanded a ‘cure’. The activities of the NIA drew attention to the fundamental tension between clinical research needs, based on a disease model, and social needs, based on a public health model.
Amongst the wider public, debates about dementia are fundamentally tied to ageist prejudices. Of course, ageism was not new. In An Essay Concerning Human Understanding (1689), philosopher John Locke argued that without language and consciousness, people were mere ‘idiots’, similar to ‘brutes’ or even ‘monsters’. People with dementia were not only presumed to have no voice but, when they clearly did have one, were regarded as profoundly irritating and dismissible. As George Miller Beard, a New York neurologist, put it in Legal Responsibility in Old Age (1874),
Men die as trees die, slowly and frequently at the top first. As the moral and reasoning faculties are the highest, most complex, and most delicate development of human nature, they are the first to show signs of cerebral disease; when they begin to decay in advanced life[,] we are generally safe in predicting that, if neglected, other faculties will sooner or later be impaired.
The decline occurs in different ways, depending on the person. One person with dementia, Beard maintained, might
become peevish, another avaricious, another misanthropic, another mean and tyrannical, another exacting and querulous, another sensual, another cold and cruelly conservative, another vain and ambitious, and others simply lose their moral enthusiasm, or their moral courage, or their capacity of resisting temptation and enduring disappointment.
He projected a depressing and damaging image of people living with dementia as tyrannical, decadent, and perverted.
Such prejudices were explicitly tackled in 1969 when Robert N. Butler (psychiatrist, gerontologist, and the first director of the National Institute on Aging) coined the term ‘ageism’. In an article in The Gerontologist, Butler lamented that ageism was not only a common form of discrimination but was also one of the most acceptable forms. He wanted the term ‘senility’ to be abolished, arguing that it was simply a ‘wastebasket term’ that served to stigmatize the elderly. In his book entitled Why Survive? Being Old in America (1975), Butler described old age in America as a ‘tragedy’. He reflected that
Few of us like to consider it [old age] because it reminds us of our own mortality. It demands our energy and resources, it frightens us with illness and deformity, it is an affront to a culture with a passion for youth and productive capacity. We are so preoccupied with defending ourselves from the reality of death that we ignore the fact that human beings are alive until they are actually dead. At best, the living old are treated as if they were already half dead.
He was writing in the 1970s; the stigma has not receded. Today, there is an extraordinarily high level of shame associated with dementia, with sufferers accused of being a ‘burden’ on their families and communities. People with dementia are routinely infantilized. In some communities, Alzheimer’s is blamed on the ‘evil eye’ (or ‘el mal de ojo’) or ‘bad blood’. People living with dementia are seen as ‘old and mad’ well before any of their unique features and feelings are noticed. Only rarely does this cause outrage, as when the biopic The Iron Lady was released, showing a confused and reclusive Margaret Thatcher suffering from dementia. The Daily Telegraph deemed the film exploitative and ‘insulting’. However, this response was fundamentally political: it was Thatcher’s conservative legacy that was being defended, as much as the dignity of the woman herself. More commonly, Butler was correct when he observed that ageism is ‘the most acceptable’ form of discrimination. It is often very explicit. The back cover of Guy Lushin’s A Living Death: Alzheimer’s in America (1990) informed readers that people with dementia
steal. They shoplift. They’re violent. They ‘expose’ themselves in public. They’re verbally abusive. They lie. And they don’t know any better. Meet some of the 4 million Americans who have Alzheimer’s Disease.
The stigma is such that people even debate whether or not people with dementia should be given life-prolonging feeding tubes, antibiotics, and surgery. Is the person who has lost a sense of ‘selfhood’ truly alive? Are their lives ‘meaningful’? These are major questions, made even more fraught by the introduction of genetic testing, meaning that a person might know they ‘have dementia’ long before the appearance of any symptoms. What are their lives worth? Should the care of wealthy white people with dementia in ‘the west’ be devolved to places like Baan Kamlangchay in Thailand? As gender scholar Katerina Kolárová explains, such centres exemplify the ‘global effects of the neo-liberal ideology of privatisation of the social and the public’ that have resulted in cuts to public health provisions and welfare, but which ‘draw upon colonial tropes and legacies of white supremacy’. Are the lives of people living with dementia worth anything? This was the whispered question during the Covid-19 pandemic. People living with dementia faced profound isolation and restrictions on movement that were even more severe than those imposed on the so-called ‘able-bodied’. In the ‘economy of abandonment’, they suffered disproportionately high death rates. Grievability, it turned out, was linked to rationality and the ‘meaning-making self’.
The stigma is such that people in the early stages of the disease may engage in elaborate plots to hide their deteriorating memory. Many attempt to conceal their diagnosis from friends and family, fearful that if it became known they would lose their jobs, friends, and status. They use ‘post-it’ notes and electronic ‘reminders’. In the words of Alice LaPlante in her novel Turn of Mind, people living with Alzheimer’s learn to ‘laugh when others laugh, look serious when they do. When people ask do you remember you nod some more. Or frown at first, then let your face light up in recognition’.
Such widespread dismissals are not only due to fear but also to the contempt shown to people who violate ideals of self-reliance, self-control, and self-mastery. Arguably, this has become even more true since the 1990s, with a new emphasis on the ‘Third Age’ – that is, people in their sixties and older who are fit, healthy, and cognitively alert. Ironically, the prominence of ‘healthy aging’ or ‘active ageing’ may further stigmatize older people who are frail and forgetful. Being a ‘successful’ elderly person becomes a highly policed form of ‘self labour’.
Due to the dialectical interplay between neurological and psychosocial factors, stigma is extremely damaging. Social psychologist and activist Tom Kitwood argues that the labelling of people as ‘demented’ is a form of ‘malignant social psychology’: it helps to produce what it purports to describe.
Kitwood provides a searing account of the everyday insults people living with dementia face. He explores routine instances of treachery (that is, dishonesty used to enforce compliance), disempowerment (resulting in de-skilling), infantilization, intimidation, labelling (which becomes a self-fulfilling prophecy), stigmatization, outpacing (placing people with dementia in contexts where others fail to adopt an appropriate pace of speech), invalidation (being ignored), banishment, and objectification (being treated as ‘dead matter’). Added to these abuses is institutionalization. It is a classic case of what historian Louise Hide has called ‘cultures of harm in institutions of care’.
As I have argued in relation to other diseases in this series of blogs, stigmatization is often augmented by the use of pernicious metaphors. Language is important: metaphors not only describe ways of seeing other people but affect the way people see themselves and are both seen and treated by others. Is the person with dementia nothing more than a ‘computation device running gradually and inexorably amok’, or are their symptoms acting like a ‘series of circuit breakers in a large house, flipping off one by one’? The most common metaphor for people living with dementia is that their life is a ‘funeral without end’. They are experiencing a ‘second childhood’ or enduring ‘a death before death’. They are described as an ‘empty shell’, the ‘living dead’ who are ‘drifting towards unbeing’. Dementia is personified as a ‘mind robber’ who ‘plunders memories’. In the words of one author, the disease ‘quietly loots the brain, nerve cell by nerve cell, like a burglar returning to the same house each night’. Carers speak of the ‘long goodbye’. As Daniel George, Erin Whitehouse, and Peter Whitehouse have argued in their article entitled ‘Asking More of Our Metaphors’,
As with HIV, the idioms of warfare so prevalent in the Alzheimer’s field have emphasized fear and anxiety while channeling resources away from prevention, care, and other approaches not premised on amyloid ‘toxicity’…. We should seek greater humanity in our metaphors. Instead of prosecuting a ‘war’ that many if not most experts regard as fundamentally unwinnable, we might shift expectations from an absolute ‘cure’ or ‘prevention’ to the more realistic ‘postponement’ of the more debilitating effects of brain aging that can be achieved by modifying known biological, psychosocial, and environmental risk factors.
Of course, this will require dramatic, even revolutionary, shifts in social environments, as well as a redistribution of resources, the unequal distribution of which is detrimental to health, including the health of brains. We also need to think more about the ‘entanglements’ (the term is anthropologist Margaret Lock’s) between aging brains and social factors, such as poverty.
Of course, there have been major attempts to improve the lives of people with dementia. The ‘health politics of anguish’ is a powerful one. In 1959, even Life magazine featured images of old age on its cover and lamented that people with dementia and other chronic diseases were being ‘stored away like vegetables’. But such images can be problematic: people with dementia are required to be abject subjects. Kitwood recalled an occasion when photographs of people with dementia at a day centre were commissioned in an attempt to raise funds. The photographs were rejected by the agency, however, on the grounds that the clients did not show the disturbed and agonized characteristics that people with dementia ‘ought’ to show. In other words,
The failure of the photographic exercise, from the standpoint of the agency, was a measure of the success of the day centre from the standpoint of the clients. Here was a place where men and women with dementia were continuing to live in the world of persons, and not being downgraded into the carriers of an organic brain disease.
Other examples of the ‘health politics of anguish’ involve its commercialization. In the ‘Celebrity Champion’ section of the Alzheimer’s Society’s website, celebrities can be seen promoting #EndAlz clothing. Famous ‘faces’ of Alzheimer’s include Rita Hayworth (some speculate that her diagnosis was publicized partly to draw attention away from her alcoholism), former President Ronald Reagan, singer Glen Campbell, and (on a fictive level) Julianne Moore, whose portrayal of a woman suffering from the disease in Still Alice earned her an Academy Award.
Medical professionals have also been active in seeking to ameliorate the lives of people with dementia. In the 1940s, for example, Marjory Warren was particularly effective in the context of British hospitals and other care institutions. She emphasized the fact that even those with chronic diseases and dementia were ‘individualists’, meaning that their specific preferences and needs should be taken into account. Rather than being housed in workhouse infirmaries or poorly equipped municipal hospitals, people living with dementia should be given ‘home-like’ accommodation in dedicated geriatric units. Wards needed to be decorated; comfortable furniture, reading lamps, and recreations were essential. Similarly, in the 1980s, nurse Sally Gadow emphasized the importance of avoiding paternalistic or condescending attitudes in order to create environments in which people with dementia could exercise self-determination.
Notably, the rise of ‘critical gerontology’ has been important in shifting attitudes towards sufferers. By maintaining that people living with dementia suffer more from losing their ‘standing in the world’ than from losing their minds, critical gerontologists have spurred ideas about just how ‘good it is to think’ with dementia. They draw attention to the joy that many people with dementia feel when dancing, listening to music or creating art, being massaged, exposed to pleasant smells, gardening, and interacting with children, animals, or lifelike robots (such as the interactive seal, Paro). The ‘personhood movement’, pioneered by people like Kitwood, emphasizes tactics for living full lives with the symptoms of dementia. It urges people to focus on the ‘capacities of the feeling person and not only on his or her losses’. The key words in ‘critical gerontology’ are dignity, respect, and empathy.
More recently, attention has shifted to the ‘embodied self’. Public health specialist Pia Kontos draws attention to ‘bodily knowledge’, arguing that ‘the self is not exclusively constituted by cognition but involves bodily knowledge’. Drawing on non-verbal communication and pre-reflexive, phenomenological selfhood, Kontos wants us to discard the old mind/body dichotomy in favour of a model of personhood that places emphasis on embodiment and ‘acknowledges that capacities, senses, and experiences of bodies are central to the exercise of human agency’. The body, she insists, should be ‘treated as a source, and not just a product of subjectivity’. In her words, ‘understanding agency as being located also in the body, and not just in the mind as a separate entity from the body, underscores how the body itself has creative and intentional capacity.’ This is why people like the expressionist painter Willem de Kooning, who did not know the day of the week and could not sign his name, continued painting extraordinary works of art almost until his death at the age of 92. As Kontos puts it, ‘Despite the loss of memory, the body itself still possess[es] a life force that continues to engage with the world’.

This is also what literary critic John Bayley gestures towards in his controversial memoir about Iris Murdoch, entitled Iris (1998; film version 2001). He noticed that although Murdoch could no longer form ‘coherent sentences’ or ‘remember where she is, or has been’, when asked to sign copies of her book she knew exactly what to do, carefully writing her name. Murdoch also re-enacted acts of ‘caring’ for her husband. In other words, there is a bodily memory – a knowledge of actions performed time and time again, to which Murdoch returned without conscious knowledge of ‘what’ or ‘why’.
A related point is made by Nicholas Jenkins in ‘Dementia and the Inter-Embodied Self’ (2014), when he writes
The inter-embodied self does not require a unified or coherent narrative in order to thrive. On the contrary, our inter-embodied selves may be fruitfully conceptualised as montages; polyphonic repertoires of voices and experiences that co-exist in dialogical relationship to one another; constantly updating, constantly changing.
For such thinkers, memory is ‘interactive’. They present a vision of being-human that emphasizes non-cognitive, affective, and inter-relational aspects of thriving.
Such approaches to people living with dementia have been augmented by the writings of dementia patients and their carers. In 1988, J. Bernlef (the pseudonym of Hendrik Jan Marsman) published Out of Mind, the first (to my knowledge) book of fiction told from a dementia patient’s point of view. The first book-length autobiography was Robert Davis’s My Journey into Alzheimer’s Disease (1989). Such accounts are a reminder that not all is ‘lost’ in the dementia journey. For example, in Losing My Mind (2002), Thomas DeBaggio laments his ‘shrinking vocabulary’ and reflects on the fact that he has ‘only a few years before I become a hatstand’, but he also contends that his ‘childhood wounds healed, and my dreams were less haunted. Imagination soon replaced anxiety and I learned early its lonely, lovely powers’. Importantly, the subtitle of his memoir is An Intimate Look at Life with Alzheimer’s – in other words, it remains a celebration of life, albeit a life that has taken an unexpected turn. Or, in the words of Richard Taylor in his Alzheimer’s from the Inside Out (2007), ‘the fact that I have a disease affecting my memory and cognitive processes does not make me any less an adult or any more like a child.... I am not a child. Even if sometimes I act like one, check me out – I AM NOT A CHILD!’ Of course, people with dementia who write their memoirs are not your ‘average’ patients; and there is always the risk of being Pollyannaish about a devastating illness. However, these writers are acutely aware that they will shortly no longer be fully cognitive beings. Like the narratives of breast cancer survivors we looked at earlier and, in another blog, those of people living with HIV/AIDS, they speak from the kingdom of the ill with an emotional urgency that must not only be heard and respected, but also responded to.
My point is simple: people are much more than their memories. This point was nicely expressed by queer literary theorist Jennifer Eun-Jung Row in 2022, when she argued that
Until recently, scholarship in disability studies has emphasised that disability is an object – an object of inquiry, of social, medical, or legal studies of deformity or aberrance. These approaches endeavor to probe the origins of, correct, cure, or even eradicate disability. Although well-intentioned, these approaches can unknowingly perpetuate and reinforce hierarchies of ableism – the belief that abled bodyminds [sic] are superior to disabled ones.
Listening to those who are living with dementia reminds us of the need to develop models of selfhood that are liberated from cognitive ability. In her 1993 memoir entitled Living in the Labyrinth, Diana Friel McGowin asks,
If I am no longer a woman, why do I still feel I’m one? If no longer worth holding, why do I crave it? If no longer sensual, why do I still enjoy the soft texture of satin and silk against my skin? If no longer sensitive, why do moving song lyrics strike a responsive chord in me? My every molecule seems to scream out that I do, indeed, exist, and that existence must be valued by someone!
While society defines Friel McGowin solely in terms of a lack or loss, she insists on her continued sensuality and feelings, as well as her insight that the self is an ongoing process – malleable and relational. As Tom Kitwood and Kathleen Bredin put it, there is an advantage in seeing personhood in ‘social rather than individual terms’: it can provide ‘an exemplary model of interpersonal life, an epitome of how to be human’. Embracing cognitive alterities is good for all of us.
If this blog interests you, check out my books, What It Means to be Human and The Story of Pain: From Prayer to Painkillers.