Dementia, global health, and policy.

Photo by slon_dot_pics on Pexels.com

When we think of dementia we often think of the elderly, probably in care homes, possibly our own family. Most of these images reflect dementia as it manifests in high-income countries. The truth of the matter is that an estimated 58% of people with dementia live in low- and middle-income countries (LMIC). To add insult to injury, only an estimated 10% of dementia research focuses on people living with the condition in LMIC (Prince et al., 2008).

Many people in LMIC go undiagnosed, because dementia is often seen as a normal part of ageing and those living with it are frequently stigmatised. Seeing dementia as a normal part of ageing is not, in itself, limited to LMIC: many people in high-income countries seem to regard it as a kind of inevitability, and stigma certainly still exists in the West too. Many people who are diagnosed say they feel isolated, that their lives are over, and that others stop treating them as fully human. In extreme cases in certain parts of the world, elderly people with dementia, primarily women, are accused of witchcraft and ostracised, and in the worst cases even killed.

Because dementia will affect more and more people as populations age, organisations such as the World Health Organisation (WHO) and Alzheimer’s Disease International treat it as a global health priority. Global health is a relatively new concept with a rather vague definition, but it is generally taken to mean the promotion of equality in access to, and quality of, healthcare worldwide. Global mental health has a similar remit, only focused on the psyche rather than (just) the physical body. Although these ideals are lofty, and the UN declared healthcare a human right (article 25 of the UN Declaration of Human Rights; I suggest you give the whole thing a read), there is always the looming possibility of Westerners imposing their own views on others; some would even go as far as to call it a form of neocolonialism. That does not, however, mean that I think we should throw out the baby with the bathwater.

Global (mental) health can be directed by the people of the Global South, with the Global North and its institutions (such as the UN) aiding and advocating on their behalf. This means employing, and listening to, local people who will take cultural sensitivities into account: examples include Chief Kiki Laniyonu Edwards, who works to tackle the stigma of dementia in Nigeria; Zimbabwean grandmothers offering therapy; Benoit Ruratotoye, a Congolese psychologist trying to tackle violence against women and, in particular, to help the spouses of women raped as a weapon of war come to terms with what happened; and Women for Afghan Women. It means adjusting our diagnostic manuals and criteria so that they are relevant and valid within each country’s specific cultural context. It means working together in the spirit of true equality, seeing the people of LMIC/the Global South not as people we need to convert, but as people we can work with for the benefit of us all.

Returning to the lack of basic research on dementia and its manifestations in LMIC: I think it is important for us as scientists to be aware of our own biases and our tendency to extrapolate our Western experience to everybody else. It is important for researchers in LMIC to have the funds and means to conduct studies on the manifestation of not only dementia but also other mental health issues in their own populations and cultures. And it is vital that Western universities collaborate, not as superiors but as true and equal partners, with the desire to bring about equality in access to and quality of healthcare.

I believe that basic scientists, local experts, global health professionals and policymakers would be best served by working together. Issues such as dementia and depression manifest differently in different cultural contexts. It is vital that policy is made on the basis of scientific knowledge, local knowledge, cultural sensitivity and a genuine commitment to equality of access to, and quality of, healthcare. Perhaps the UN as an institution, and certainly its human rights declaration, is too optimistic or idealistic for a world full of violent realities. But it is most certainly the kind of hope and optimism we need on this blue planet we all share. Combining our knowledge (both Western and non-Western, scientific and traditional), using our privileges for good, and looking beyond our own bubble without superiority is the only way we can get closer to the lofty aspiration of a truly equitable world in which human rights, including the right to healthcare, are respected.


Brains have bodies.

Photo by meo on Pexels.com

I’m a neuroscientist. Or, well, an aspiring one (let’s not get into the philosophical discussion of when someone can call themselves a scientist; that’s a whole other post for another day). When I mention this, people automatically assume a lot of things, the most common assumptions being incredible intelligence and perhaps a lack of social life. To some people, however, science also carries a connotation of distance: of the ivory tower; of something experts do over ‘there’ that has no bearing on the average person’s life in the ‘real world’. In the very worst case, people might assume malevolence — that you’re in the pocket of Big Pharma, pushing vaccinations, or part of China’s elaborate plot to make US manufacturing non-competitive by creating (and apparently recruiting the scientific community to proselytise) the concept of ‘global warming’. The musings of the Leader of the Free World aside, I think there is some merit to the claim that science is at times far removed from the ‘real world’. Sometimes that is a good thing, and sometimes it can cloud our judgment.

In February 2017, The Atlantic published an article about advances in the technologies used in neuroscience, and the criticism some scientists have of how these techniques are used. These critics warn of the spectre of reductionism looming over our quest to understand the brain. John Krakauer and colleagues (2017) published an article in Neuron discussing the problem of reductionism in neuroscience. They argue that advances in technology have created a class of researchers who are well versed in the novel techniques but have a tendency to disregard the organism: behaviour, development and evolution are treated as secondary to the neural circuits and the exciting new technologies. As mentioned in The Atlantic article, wanting to include behavioural research is at times viewed with scepticism in the neuroscientific community, underpinned by the idea that behavioural research is the sole domain of psychology. This disregards the fact that the lines between psychology and neuroscience are often much blurrier than people give them credit for (not to mention that inferring behaviour from circuits seems to get the order of inquiry backwards). Basic biomedical research into conditions such as autism spectrum disorder (ASD) can also run the risk of disregarding the voices of the autistic community, who have called for conditions such as ASD, previously classed simply as ‘disorders’, to be seen instead as variants of normal human behaviour (see more on neurodiversity here).

Neuroscience is hardly the only life science that runs the risk of forgetting the human component of research or treatment. Medicine is well known for occasionally treating patients as their illnesses and conditions rather than as human beings. One suggested reason is the need to distance oneself from the patient, and consequently from individual responsibility for what happens to them. In his book Do No Harm, Mr Henry Marsh hypothesised that a practice as common in neurosurgery as shaving a patient’s head might have its origin in dehumanising the patient in order to make it easier for the surgeon to operate. However understandable it may be to distance oneself and avoid becoming emotionally attached to patients, it is surely possible to do so without veering into the territory of dehumanisation.

Recently, Ed Yong looked in The Atlantic at the ethics of a study that resurrected the extinct horsepox virus. It is more important, the argument goes, to push the boundaries and expand knowledge, even if that at times comes at the expense of ethics or concern for global consequences. The scientific quest for knowledge is an honourable one; however, in my opinion, scientists cannot disregard ethics, or the consequences for humans and the environment, in pursuit of it.

I doubt anybody expects scientists to make these ethical decisions on their own, or to constantly think of every possible consequence of their research. However, I believe all of the aforementioned cases highlight the importance of communication beyond the (biomedical and/or scientific) community: with ethicists, psychologists, government and, importantly, the public. If we want government, the public, and our colleagues in the humanities to respect science and its place in society, then we have to be more responsible as a community. In the life sciences in particular, it is important to avoid reductionism and to remember that most of the research we do will affect people. We cannot recklessly sacrifice our humanity in the quest for knowledge, consequences be damned. Science is not removed from society, and if we want the public to believe us when we say that, we will have to act as if we believe it ourselves.