On Medical Research.

“Medicine is a social science, and politics is nothing more than medicine on a large scale.” Those are the words of the German physician and father of social medicine, Rudolf Virchow. Even now, all this time later, his words still resonate, particularly during the pandemic, when stark differences between groups seem to grow ever more obvious, whether within countries or between countries and regions. How can we reassert the importance of social environment and context in a hyper-biomedical world?

Applying the lens of social medicine seems most obvious and intuitive in infectious diseases and public health. As mentioned before, the pandemic has placed a spotlight on how interconnected we really are. Talk of community transmission, public health and safety measures such as social distancing and masking, and the increased public attention to how infectious diseases spread through marginalised communities and among people working low-income jobs are all prime examples of how medicine functions as a social science. The pandemic has shown us that our communities are more lattice than silo; more multigenerational than age-segregated. But the framework of addressing social causes and their impact on disease can, and should, reach further: into our world of fundamental and particularly translational science.

I have lamented the blinders that are pervasive in (bio)medical research before. Personally, my interest lies in translational research, which aims to bridge the gap between basic research (research focused mostly on increasing knowledge about a natural phenomenon) and clinical research. If you are going to be the bridge between the bench and the clinic, it is imperative that you have a good sense of the community, even though that connection might be a little less intuitive than the one you have to the bench.

Dementia in general, and Alzheimer’s disease in particular, is a good case study of why those at the bench need to be in tune with what is happening in the clinic and, by extension, the community. Alzheimer’s disease has many social determinants, including education, access to health care and social isolation. Research in the United States suggests that African Americans and Hispanic Americans have higher rates of Alzheimer’s disease than their white counterparts, yet have a harder time getting a diagnosis. UK-based estimates show a similar delay in getting an Alzheimer’s diagnosis in BAME (Black, Asian and Minority Ethnic) communities. When factoring in these social and ethnic differences in disease, it is imperative that we do not default to biological determinism, but instead take these findings together in a way that is already common in the social sciences, humanities and public health: the picture is complex, combining lower socio-economic status, oppression and structural inequalities with cultural practices and the environment. The reality is that if we are going to translate bench science to the clinic, we need to factor in all of the community-based determinants of health outcomes and give them serious consideration.

In my view, integrating the ‘social’ into social medicine (and social medical research) is twofold: on the one hand, the traditional approach of factoring in the social determinants of health; on the other, being in constant dialogue with the communities most affected by our research. There are ways to keep the scientists at the bench engaged with the clinic and the raison d’être of their medical research. Alzheimer’s Society, a UK-based charity that funds research into dementia and aims to improve quality of life for those living with dementia and their carers, regularly brings volunteers together with the researchers it funds to discuss the work. Community engagement is not solely science communication, but genuine interest in, and cooperation with, the people our research most affects.

I think the adage ‘people’s health in people’s hands’, a slogan from the People’s Health Movement, is relevant to translational science as well. If translating science from the bench to the bedside is something we care about, we have to engage with the communities we are supposed to be working to help. That means we have to take a holistic approach: one that takes the social science nature of medicine, lab or clinic, into consideration, and one that is deeply rooted in the community. This will not only make our research more accurate and grounded in reality, but will also empower people to be active agents in the treatment of, and research into, the diseases that affect them and their loved ones. It will mean that people can truly take their health into their own hands.


The Argentine-Cuban revolutionary and doctor Ernesto “Che” Guevara, in an address to fellow doctors, pointed out that it is important not to approach people and communities from a place of charity, but with a sense of solidarity. He said: “We should not go to the people and say, ‘Here we are. We come to give you the charity of our presence, to teach you our science, to show you your errors, your lack of culture, your ignorance of elementary things.’ We should go instead with an inquiring mind and a humble spirit to learn at that great source of wisdom that is the people”.
I think this is an important mindset not just for the doctors working in the clinic, and not just for factoring in the social determinants of health, but certainly also for those of us trying to make a difference through translational medical research. An inquiring mind and a humble spirit truly go a long way.


A Covid vaccine mustn’t be hoarded.


On July 20, researchers at Oxford University’s Jenner Institute released preliminary Phase I data in The Lancet on the immune response to their vaccine candidate, ChAdOx1 nCoV-19. These findings are encouraging and bring a glimmer of hope that a vaccine might be found to prevent (severe) COVID-19, the disease caused by the virus SARS-CoV-2. On the same day, the World Health Organisation (WHO) cautioned the world that indigenous peoples in the Americas, the current epicentre of the pandemic, are particularly vulnerable to the virus and its severe ramifications. This only strengthens the urgency with which we must avoid hoarding a potential vaccine or treatment for COVID-19, keeping it away from the most vulnerable in the world.

As we have seen over the last months, this virus and the disease it causes do not hit every one of us equally. The epidemic’s epicentre has shifted from China to Europe, and is currently in the Americas. Many vulnerable people have borne the brunt of the pandemic, with the burden of mortality mainly shouldered by minoritised and racialised communities in Europe and the United States, by key workers in general (people from minoritised and racialised communities are also more likely to be frontline workers), and by those from lower socio-economic backgrounds. As noted above, Dr Tedros, the Director General of the WHO, has recently warned that indigenous communities in the Americas are currently most at risk of suffering the effects of the Covid surges throughout the continent. Presently, the spike in SARS-CoV-2 infections among recently contacted indigenous peoples in the Amazon has raised alarm. Furthermore, although some countries with weaker health systems have seemingly been able to relatively contain the virus, the pandemic has nonetheless been a terrible strain, especially in countries that are also still dealing with other communicable disease outbreaks, such as recent Ebola and measles outbreaks.

Recently, the United States bought up most of the world’s supply of Gilead’s remdesivir, which, other than the drug dexamethasone, is currently the only hopeful candidate treatment for COVID-19. Even though the evidence for remdesivir is so far limited, and the cheap drug dexamethasone seems more promising at the time of writing, the move by the United States sets a worrying precedent.

As I have stated so many times on this blog, health is a human right. To ensure accessibility and equity in healthcare, we have to act accordingly. When countries with relatively strong healthcare systems and the scientific infrastructure to research and produce vaccines and medicines to prevent or treat COVID-19 end up distributing, or even hoarding, these vaccines and treatments for their own populations, there is a strong possibility that disadvantaged countries, many of whose disadvantages were incurred through a history of colonialism and extractive capitalist exploitation, will be left holding the metaphorical baby. Within these countries, the poorest and those made most vulnerable (including indigenous peoples) will suffer the most. Beyond vaccine hoarding, the selling of vaccines or treatments for profit by pharmaceutical companies will also disadvantage the world’s poorest and those in (mainly) Global South countries. Moreover, there are concerns that neocolonial approaches to vaccine and medicine testing will end up using the African continent as a testing ground.

Dr Tedros has reiterated in the daily briefing that a potential vaccine should be a public good. It must be continually emphasised that access to healthcare is a basic human right. Many countries already struggle to reach their most vulnerable communities and provide them with appropriate healthcare, and while the pandemic has exposed the vulnerability of all of our health systems, some countries and some people will be more disadvantaged than others. It is imperative that countries with more advanced health systems do not return to an ‘every man for himself’ mentality, but act in the spirit of solidarity.


A vaccine or treatment must be freely accessible to all people. The importance of healthcare as a human right must underpin every step our governments take moving forward. The pandemic has shown us that in an increasingly connected world, our health systems are really only as strong as the weakest link. In a neoliberal capitalist world it is increasingly common to see everything, including our human rights, through the lens of profit margins and winners and losers. Austerity, the privatisation of healthcare, and growing inequality have a direct impact on global and public health. We cannot, then, in good conscience apply the ‘logic’ of the market to a global pandemic in which many vulnerable people are needlessly suffering and losing their lives. A post-Covid world could – indeed should – be one where healthcare is accessible, health is treated as a human right, and our approach to global and public health is one of internationalism and solidarity. What better way to usher in that new world than to use these principles as the way out of the pandemic? What better way to increase equality and access to health, and to diminish the possibly catastrophic effects of the next pandemic, than to work together to make vaccines and treatments freely accessible? It is not just a nice thought; I would go as far as to say that this is our moral duty. The time for complacency is over and the time for solidarity is now.

Quarantine Foodies.


I am a self-professed foodie. Unsurprisingly, the lockdown/quarantine life has been the perfect time for me to fulfil my New Year’s resolution and spend more time cooking and baking, experimenting with new recipes, and sharing the fruits of my labour with my neighbours. It seems that I am far from the only person who has used these uncertain times to refine their cooking and baking skills. Many people have been sharing their new, delicious (and more or less successful) creations on social media, though the endless stream of sourdough breads found on timelines and explore pages has drawn ire from some people as well. For our species, cooking and eating are far more than mere nourishment, which is a rarity in the animal kingdom. So why is it exactly that humans have developed such a foodie culture?

During this time of quarantine I, too, have started creating my own sourdough starter. Her name is Prof Marie Curyeast.

Chimpanzees and gorillas, even though they are our closest living relatives, have a rather monotonous diet compared to us. Drs Karina Fonseca-Azevedo and Suzana Herculano-Houzel, Brazilian neuroscientists, have researched the possible reasons we humans have such large brains compared to our other primate cousins. Their research suggests that our large brains, relative both to our bodies and to our primate cousins’, might be a result of having learnt how to use fire to cook. As brains are energy-intensive organs, many raw-food-eating primates have to spend more of their time eating than humans do. Moreover, a primate living off raw food with a brain of our size would probably spend most of their waking hours eating. Perhaps, then, cooking, which lets us take in many more calories in one go than consuming solely raw foods would, has been an evolutionary trade-off, making our bodies smaller, our brains larger, and our feeding time more special.
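To get a feel for the energy argument, here is a minimal back-of-envelope sketch in Python. The neuron count and per-neuron energy cost are commonly cited ballpark figures, but the foraging rates and the body’s baseline cost are illustrative assumptions invented for this example, not numbers from Fonseca-Azevedo and Herculano-Houzel’s actual model:

```python
# Back-of-envelope sketch of the feeding-time argument.
# All numbers are rough, illustrative assumptions for the sake of
# the example, not values from the researchers' paper.

NEURONS_HUMAN = 86e9          # ~86 billion neurons in a human brain (ballpark)
KCAL_PER_BILLION_NEURONS = 6  # assumed daily energy cost per billion neurons
BODY_KCAL_PER_DAY = 1500      # assumed daily cost of running the rest of the body

KCAL_PER_HOUR_RAW = 250       # assumed calories gathered per hour on a raw diet
KCAL_PER_HOUR_COOKED = 450    # cooking increases digestible energy per hour of eating

def feeding_hours(kcal_per_feeding_hour: float) -> float:
    """Hours per day needed to cover the brain + body energy budget."""
    brain_kcal = (NEURONS_HUMAN / 1e9) * KCAL_PER_BILLION_NEURONS
    total_kcal = brain_kcal + BODY_KCAL_PER_DAY
    return total_kcal / kcal_per_feeding_hour

print(f"Raw diet:    {feeding_hours(KCAL_PER_HOUR_RAW):.1f} h/day of feeding")
print(f"Cooked diet: {feeding_hours(KCAL_PER_HOUR_COOKED):.1f} h/day of feeding")
```

Even with these generous assumptions, a raw diet pushes daily feeding time towards eight hours or more, while cooked food roughly halves it; that, in essence, is the trade-off the research points to.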

Dr Julie Mennella, amongst others, has done research on how children develop preferences for certain foods. Her research suggests that children innately have a preference for sweet things and a dislike of bitter things, as a result of our evolutionary history. Furthermore, her research suggests that we learn about our food preferences through exposure in utero and, for breastfed babies, through breastmilk. Later on, babies on solid food can learn to like foods they initially dislike through repeated exposure. In the Season 2 episode of Babies about the senses, Dr Mennella rightly points out that for human beings, ‘food is much more than a source of nutrients or a source of calories. It gives us pleasure.’


Evolutionarily, there might be a reason we have started to cook more, and science gives us some perspective on why we have developed such a foodie culture as a species. But food means so much more to us on a personal, individual or cultural level than evolution alone can explain. The science suggests that we are connected to and involved in our family’s or culture’s food traditions from when we are very little, our tastes ever evolving with wider exposure to different foods. Cooking can tie us to our family histories and make us feel connected to those who have gone before us. We can find a kind of grounding and deep humanness in plunging our hands into dough or salad. Food is a way in which we show love and care, and through which we maintain social connections with our friends and families. Our recipes are what we pass down to the children in our lives. So perhaps our culinary endeavours during this unsure time are just ways for us to grasp at something seemingly fundamental to our humanity – where we can see cooking and baking as a form of community, connectedness and comfort, even in darkness and uncertainty. The real test will be whether we keep up our cravings for nostalgia, comfort and connectedness now that lockdowns are easing and people are trying to find a new sense of normality in abnormal times. But one truth remains: food, and with it foodie culture, is integral to human nature, in good times and in bad!

There is power in a union.


In September of this year, many university workers in The Netherlands started rising up against the abominable conditions that academics, non-academic university workers, and students alike face in academia. Many joined unions. In the United Kingdom, academics in November of this year once again decided to picket, protesting their precarious contracts and pay and calling for the protection of pensions, with many students picketing in solidarity. Indeed, it is clearer than ever that the majority of university staff, academic or otherwise, as well as PhD students, are workers too, and that love for the field does not preclude exploitation by bosses or dire working conditions. The fight between labour and capital, within academia and wider society alike, has escalated. Workers across the spectrum are uniting and unionising. In the words of Joe Hill (by way of Billy Bragg): there is power in a union.

The ivory tower – the hallowed halls of academia – is often seen through rose-tinted glasses. To many within it, the work they do is indeed something they are passionate about, and it often doesn’t feel like work at all. But the privilege of doing something you love, and the hallowed halls themselves, have blinded many within and without to the decline in working conditions amongst its dwellers. Many academics are overworked. This state of continual working is caused by, amongst other things, the scarcity of permanent academic positions, precarious contracts, the pressure to ‘publish or perish’, and the internalisation of the neoliberal instinct to see each other as competition. We have normalised working in uncertain conditions on temporary contracts. PhD students – who aren’t actually students but workers too – suffer from mental health issues, and there are ever-slimming chances of getting a job within the hallowed halls that many of us hold dear. However much we might want to pretend that we are not the same as workers in other sectors, the facts suggest otherwise.

Union membership in The Netherlands has been declining, a trend also seen in countries like the United States and the United Kingdom. At the same time, workers across the spectrum have seen a decline in their working conditions, an increase in precarious contracts, and stagnating wages. The decline in union membership, and in some cases in unions’ bargaining power, has not come out of nowhere but is the result of systematic policies. Many people, especially young people, may therefore believe that unions are an anachronism with no relevance to the life of a modern worker. I believe nothing could be further from the truth. Organised labour gave us the five-day workweek and many other gains. Both in academia and outside of it, the need for workers to organise against exploitative bosses and bad government policy is more pertinent than it has been in a very long time.


If we are going to organise against austerity, against the dismantling of our universities and the gutting of public services, against precarious contracts – if we are going to organise against the exploitation of our labour by capital – we, within academia’s ‘hallowed halls’, are going to have to understand that we are not somehow better than other workers, or the exception to the rule. The time has come to stand in solidarity with other workers who are striking, picketing, speaking up and fighting for better conditions, whether they are health workers, teachers, transport workers or others. We can change the system not just for ourselves, but for each other, with each other.

I believe that academia and its promises are worth saving. There is such passion for its reform among many of its workers, academic or otherwise, because we value education, research, and curiosity. In this moment of awakening to our conditions, and to those of our fellow workers across different sectors, it is imperative that we stand in solidarity with each other and use our collective bargaining power to make life better for everyone involved. Trade unions have a great history of fighting for, protecting, and winning our rights. They are not obsolete; they can help us progress. Solidarity between workers of all stripes is the only thing that will move us forward. There is power in a union, indeed.

You can’t eat money.


Greta Thunberg, the 16-year-old climate activist whom I have talked about on this blog before, seems to bring out the worst in a (mostly) male cohort across Europe. In The Netherlands, Britain and Italy, mocking the teenager and her autism seems to be a national sport. Grown men being threatened by a child who has been speaking truth to power is baffling to many of us, but let those voices not deter us from her message, which is still incredibly pertinent.

When we talk about climate change, we often think about middle-class environmentalists and focus intently on individual sins: your carbon footprint is too high, you should be vegan, you should not take long showers, you should not use plastic bags or bottles, and ideally you really ought to drive an electric car and not have more than two children (or even better: have no children at all!). Individual acts to counter climate change are always welcome, but I am afraid the focus on personal sinfulness does little to challenge the systemic sins of large corporations, the current economic system, and the collective ‘climate delaying’ by our governments. In addition, the narrative surrounding climate change all too easily points the finger at the Global South, historically far from the worst polluter, rather than confronting the West’s longstanding intransigence. This has to change.

Just to be clear, individual choices to live a sustainable lifestyle are absolutely necessary. We are all responsible for the planet, because this is, to quote Pope Francis, Our Common Home after all. However, the scale of the change needed to avert climate catastrophe far exceeds what can be achieved by simple individual changes in lifestyle. Given that 70% of greenhouse gas emissions since 1988 have been produced by just 100 companies, it seems clear that our individual changes alone are not going to cut it. What is needed is radical, systemic change that comes from the top. As Vox reporter David Roberts said (and I’m paraphrasing): at some point we will have to stop signing resolutions and producing reports, and actually start implementing policy changes.

Leaving it up to the market seems, to some, a rational response to the climate catastrophe that awaits us. But in a globalised world full of consumers, where companies are the main polluters and the fossil fuel lobby still holds sway over decisions made by our governments, it seems to me that the problem might be the unsustainable capitalist system we have now. If we want to steer clear of the absolute worst-case scenario, we will have to do something drastic, and possibly eschew capitalism altogether. Saying that, understandably, evokes memories of the 20th century’s brutal dictatorships, but there is very little preventing us from creating a system that is democratic and does not worship the market or place it above human dignity and the survival of our planet and species. What is needed is the will and the vision of (young) ambitious policymakers and politicians who are not in the pocket of the industries that are commodifying our human experiences and ‘our common home’. We need to think beyond the current frameworks, and that includes looking beyond the unsustainable economic system we have grown so accustomed to.

Alanis Obomsawin, the Abenaki filmmaker, once said:

When the last tree is cut, the last fish is caught, and the last river is polluted; when to breathe the air is sickening, you will realize, too late, that wealth is not in bank accounts and that you can’t eat money.

To effect the actual change needed on a large scale, we need to move beyond the highly atomised perspective of personal sins and individual carbon footprints. The sheer scale of the task ahead of us, the task we, as stewards of the earth, are burdened with, requires a radical approach. It requires the fundamental, and collective, overhaul of our current economic and political system. It needs to go beyond goodwill, treaties and pledges, towards radical policy changes. This will mean that we all have to chip in, and more than anything that the largest fossil fuel and transport corporations will need to be taxed heavily. The fact of the matter is that we can’t eat money. Our common home is more than a commodity to be passed between hands, or a resource to be continually exploited at the expense of the world’s poorest, sickest and youngest. When children are dying of asthma induced by air pollution, and when people around the world are dying because of extreme weather, it is clear we have to do something. It means listening to the scientists, to the young and ambitious politicians, and to young activists like Greta (who deserves far better than mockery of her passion for the planet and of her autism). Right now it is not yet too late. But how much longer can we say that?

Climate change, anti-vaxxers, and ‘alternative facts’.


The last days of February were somewhat of a shock to the system of humans and nature alike, as temperatures rose to 21 degrees Celsius in Kew and many parts of Northwestern Europe saw April-like weather conditions. Whilst most people were enjoying the sun by lunching in parks, eating ice cream, and chatting on terraces, something about these scenes was equally unsettling. Clearly these temperatures are far from normal, and enjoying them felt an awful lot like the famous meme in which Jay-Z bobs his head to music with an anxious expression on his face.

The Intergovernmental Panel on Climate Change (IPCC) recently released a report detailing what will happen if the Earth warms by 1.5 degrees Celsius or by 2 degrees, and issued stark warnings. The World Health Organisation (WHO) has warned of the health risks associated with climate change and increased air pollution; unsurprisingly, the poorest people in low-income countries, who are far from the biggest polluters, will bear the brunt of the detrimental effects of climate change. In August 2018, 16-year-old Greta Thunberg became somewhat of a celebrity when she started the school strikes for climate action that have lately become a phenomenon across Europe. Expected petulance from the adults in the room aside, climate change has been firmly on the table. Whether through Greta and her age cohort’s school strikes, the IPCC’s reports, or Alexandria Ocasio-Cortez’s “Green New Deal”, climate change is finally being talked about in earnest. The interesting phenomena that come with this increased attention to climate change are climate change deniers and climate delayers.

Climate delayers (thanks for coining the term, AOC) are the climate change deniers’ more respectable cousins. These are people who are aware of the devastation of climate change, but are reluctant to support or enact reform of laws and regulations drastic enough to make a meaningful difference in reversing, or more realistically lessening, the devastation that awaits us and our progeny. These are often politicians who will say that they are already doing more than they should, and who expediently postpone any major changes long enough that dealing with the issue falls to the next administration. Climate change deniers are the people (like the US president) who have an absolute commitment to denying all the scientific evidence for global warming and climate change, and who are hostile to any measures taken to mitigate its effects. This outright denial of the evidence is an interesting phenomenon. As we all know, countering climate change denial with facts or insults does not help change people’s minds; in fact, people might become even more entrenched and double down on their views (a reaction driven by cognitive dissonance). It is easy to believe that many climate-change-denying politicians have some kind of vested interest in maintaining the status quo, but the reasons why members of the general public might not believe in climate change are less obvious and more disparate. These reasons range from misinformation (‘alternative facts’, if you’re Kellyanne Conway) to a lack of knowledge of what global warming entails or what the consensus really is.

This brings me on to anti-vaxxers. Anti-vaxxers are often young, middle-class parents who have chosen not to vaccinate their children because they believe vaccines cause autism. Let me make it absolutely crystal clear: vaccines DO NOT cause autism. There is no evidence for this, and Andrew Wakefield is a disgraced physician who lost his licence because of his shoddy and unethical study. Those are the facts, based on countless scientific studies, but relaying those facts will probably cause cognitive dissonance in someone whose strongly held belief is that vaccines are a Big Pharma conspiracy that will endanger their child.

Recently, there have been countless measles outbreaks across the world, with, for example, the European Centre for Disease Prevention and Control identifying vaccination coverage that is suboptimal for creating herd immunity as the culprit. Many of the people who fell victim to these outbreaks were young children who were too young to be vaccinated (and whom, just like the immunocompromised, herd immunity is meant to protect). Just as with climate change, there seem to be people who are aware of the function and effectiveness of vaccines, but think parents’ right to choose trumps public health. An argument can be made that if we are proponents of liberalism, individual liberty is of prime importance. Even so, John Stuart Mill, the father of liberalism, proposed that “The only purpose for which power can be rightfully exercised over any member of a civilised community, against his will, is to prevent harm to others”. Indeed, I would argue that governments being more diligent about getting at least 95% of the population vaccinated, perhaps even making vaccines mandatory for those who aren’t immunocompromised, fits perfectly within Mill’s harm principle. Particularly when there are so many people who suffer, without any choice in the matter, because of other parents’ decision not to vaccinate their children. The most vulnerable in society bear the brunt of this, again not unlike with climate change. Indeed, the WHO has named vaccine hesitancy as one of the biggest threats to global health. Whilst low- and middle-income countries, despite challenges, do their best to get vaccinations to the most vulnerable populations, there are instances of Westerners reintroducing preventable diseases to countries that had finally got a handle on them, and even exporting the fearsome stories of what might happen to vaccinated children. A recent example of exported disease is that of the French family who reintroduced measles to a measles-free Costa Rica. Situations like this have an eerily colonialist feel. But I digress…
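As an aside, the 95% figure is not arbitrary; it follows from the textbook approximation for the herd immunity threshold. Here is a minimal sketch in Python, assuming the standard formula HIT = 1 − 1/R0 and commonly cited R0 ranges, and ignoring real-world complications such as imperfect vaccine efficacy and non-random mixing:

```python
# Herd immunity threshold (HIT) under the textbook approximation
# HIT = 1 - 1/R0, where R0 is the basic reproduction number: the
# average number of people one case infects in a fully susceptible
# population. The R0 values below are commonly cited ranges, not
# precise estimates.

def herd_immunity_threshold(r0: float) -> float:
    """Fraction of the population that must be immune to halt sustained spread."""
    return 1.0 - 1.0 / r0

for disease, r0 in [
    ("seasonal influenza", 1.5),
    ("measles (low end)", 12.0),
    ("measles (high end)", 18.0),
]:
    print(f"{disease:20s} R0 = {r0:4.1f} -> HIT ≈ {herd_immunity_threshold(r0):.0%}")
```

Because measles is so contagious, the threshold lands at roughly 92–94% immunity, and since no vaccine is perfectly effective, coverage targets sit at the top of that range; a few percentage points of hesitancy are enough to reopen the door to outbreaks.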

Whether it is climate scepticism or anti-vaxxers, how we change people’s minds is a difficult question to answer. It is clear that questioning parents’ love for their children, or insulting climate sceptics, is not working. Perhaps, in the case of climate change deniers, Greta and her peers’ strategy of striking until the grownups finally listen is a good idea. Maybe we should all be lobbying politicians, striking, and signing petitions, at least so the policymakers end up doing something; the rest of the population might follow. When it comes to those who firmly believe in the anti-vaccination movement, it is important that we try to tackle these harmful untruths with evidence and understanding. Moreover, public health officials ought to do a better job of educating the population, governments ought to be more diligent in tackling misinformation, and journalists should stop inviting ‘both sides’ to create a sense of false balance. Scepticism is not a bad trait. Indeed, even a dose of scepticism towards established science is not necessarily a bad thing. But it is up to critical citizens to find factual and truthful answers to their questions, based on research and scientific evidence, and to understand why it is that scientists have reached consensus over something. I promise you, scientists do not reach consensus easily. So, I leave you with some sage advice: please, don’t believe everything you read on Facebook.

Dementia, global health, and policy.


When we think of dementia we often think of the elderly, probably in care homes, possibly in our own family. Most of our visions are of dementia’s manifestation in high-income countries. The truth of the matter is that an estimated 58% of people with dementia live in low- and middle-income countries (LMIC). To add insult to injury, it has been estimated that only 10% of research into dementia is focused on people living with it in LMIC (Prince et al., 2008).

Many people in LMIC will go undiagnosed, as dementia is often seen as a normal part of ageing and those suffering from it are often stigmatised. Seeing dementia as a normal part of ageing is not in itself limited to LMIC, as many people in high-income countries at the very least seem to think it is a kind of inevitability. Moreover, stigma certainly still exists in the West. Many people who are diagnosed indicate that they feel isolated, that their lives are over, and that people stop treating them as fully human. In extreme cases, in certain parts of the world, elderly people with dementia, primarily women, are accused of witchcraft and ostracised, and in the worst cases even killed.

Because dementia is a disorder that will affect more and more people as populations, on average, get older and older, it is seen by organisations such as the World Health Organisation (WHO) and Alzheimer’s Disease International as a global health priority. Global health is a relatively new concept with a relatively vague definition, but is generally taken to mean the promotion of equality in access to, and quality of, healthcare worldwide. Global mental health has a similar remit, only focused on the psyche rather than (just) the physical body. Although these ideals are lofty, and the UN declared healthcare a human right (article 25 of the UN Declaration of Human Rights; I suggest you give the whole thing a read), there is always the looming possibility of Westerners imposing their own views on others; some would even go as far as to say it is a form of neocolonialism. That does not, however, mean that I think we should throw out the baby with the bathwater. Global (mental) health can be directed by the people of the Global South, with the Global North and its institutions (such as the UN) aiding and advocating on their behalf. This means employing, and listening to, locals who will take cultural sensitivities into account (examples include Chief Kiki Laniyonu Edwards, who works to tackle the stigma of dementia in Nigeria; Zimbabwean grandmothers offering therapy; Benoit Ruratotoye, a Congolese psychologist trying to tackle violence against women and, particularly, to help the spouses of women raped as a weapon of war come to terms with what happened; and Women for Afghan Women). It means adjusting our diagnostic manuals and criteria so that they are relevant and valid within a country’s specific cultural context. It means working together in the spirit of true equality, seeing the people of LMIC and the Global South not as people we need to convert, but as people we can work with for the benefit of us all.

Returning to the lack of basic research on dementia and its manifestations in LMIC: I think it is important, as scientists, to be aware of our own biases and of our tendency to extrapolate and apply our Western experience to everybody else. It is important for researchers in LMIC to have the funds and means to conduct studies on the manifestation not only of dementia but also of wider mental health issues in their own people and their own cultures. It is vital that Western universities collaborate, not as superiors but as true and equal partners with the desire to bring about equality in access to and quality of healthcare.

I am of the belief that basic scientists, local experts, global health professionals and policymakers would be best served by working together. Issues such as dementia and depression manifest differently in different cultural contexts. It is vital that policy is made on the basis of scientific knowledge, local knowledge, cultural sensitivity and a genuine belief in promoting equality of access to, and quality of, healthcare. Perhaps the UN as an institution, and certainly its human rights declaration, is too optimistic or idealistic in a world full of violent realities. But it is most certainly the kind of hope and optimism we need on this blue planet we all share. Combining our knowledge (both Western and non-Western, scientific and traditional), using our privileges for good, and looking beyond our own bubble without superiority is the only way we can get closer to the lofty dreams and aspirations of a truly equitable world in which human rights, including the right to healthcare, are respected.

Brains have bodies.


I’m a neuroscientist. Or, well, an aspiring one (let’s not get into the philosophical discussion of when someone can call themselves a scientist; that’s a whole other post for another day). When I mention this, people automatically assume a lot of things, the most common assumptions being incredible intelligence and perhaps a lack of a social life. To some people, however, science also carries a connotation of distance; of the ivory tower; of something experts do over ‘there’ that has no bearing on the average person’s life in the ‘real world’. In the very worst case, people might assume malevolence: that you’re in the pay of Big Pharma to propagate vaccinations, or part of China’s elaborate plot to make US manufacturing non-competitive by creating (and apparently recruiting the scientific community to proselytise) the concept of ‘global warming’. The musings of the Leader of the Free World aside, I think there is some merit to the claim that science is at times far removed from the ‘real world’. Sometimes that is a good thing, and sometimes it can cloud our judgment.

In February 2017, The Atlantic published an article about advances in the technologies used in neuroscience, and the criticism some scientists have of how these techniques are used. These critics warn of the spectre of reductionism looming over our quest to understand the brain. John Krakauer and colleagues (2017) published an article in Neuron discussing the problem of reductionism in neuroscience. They postulate that the advances in technology have created a class of researchers who are well versed in the novel techniques, but who have a tendency to disregard the organism: behaviour, development and evolution are treated as secondary to neural circuits and the exciting new technologies. As mentioned in The Atlantic article, wanting to include behavioural research is at times viewed with scepticism in the neuroscientific community, with an underlying apprehension that behavioural research is the sole domain of psychology. However, this disregards the fact that the lines between psychology and neuroscience are often much blurrier than people give them credit for (not to mention that inferring behaviour from circuits seems to get things the wrong way round). Basic biomedical research into disorders such as autism spectrum disorder (ASD) can at times run the risk of disregarding the voices of the autistic community, who have called for conditions such as ASD, previously simply classed as ‘disorders’, to be seen instead as variants of normal human behaviour (see more on neurodiversity here).

Neuroscience is hardly the only life science that runs the risk of forgetting the human component of research or treatment. Medicine is famous for occasionally treating patients as their illnesses and conditions rather than as human beings. One suggested reason for this is the need to distance oneself from the patient, and consequently from individual responsibility for what happens to them. In his book Do No Harm, Mr Henry Marsh hypothesised that a practice as common in neurosurgery as shaving a patient’s head might have its origin in dehumanising the patient in order to make it easier for the surgeon to operate. However understandable it may be to distance oneself and avoid getting emotionally attached to patients, it is surely possible to do that without veering into the territory of dehumanisation.

Recently, Ed Yong, in The Atlantic, looked at the ethics of a virus study that resurrected the extinct horsepox virus. It is more important, the argument goes, to push the boundaries and expand knowledge, even if that comes at times at the expense of ethics or concern for global consequences. The scientific quest for knowledge is an honourable one; however, in my opinion, scientists cannot disregard ethics or the consequences to humans and the environment in its pursuit.

I doubt anybody expects scientists to make these ethical decisions on their own, or to constantly think of all the possible consequences of their research. However, I believe all of the aforementioned cases highlight the importance of communication outside of the (biomedical and/or scientific) community: with ethicists, psychologists, government, and, importantly, the public. If we want government, the public, and our colleagues in the humanities to respect science and its place in society, then we have to be more responsible as a community. In the life sciences in particular, it is important to avoid reductionism and to remember that most of the research we do will affect people. We cannot recklessly sacrifice our humanity in the quest for knowledge, consequences be damned. Science is not removed from society, and if we want the public to believe us when we say that, we will have to act as if we believe it ourselves.