Preventing a dangerous game of hide-and-seek in medical trials

Helen S reveals how the pharmaceutical industry hides unfavourable results from medical trials. She warns of the risks to human health, and proposes how we can make medical research more robust and trustworthy

Have you ever questioned the little pills prescribed by your doctor? I had not, until I began working on this article – and the truth is, we know less than we should about them. It is scary to think that, though these medications are supposed to heal us when we are feeling poorly, in reality that is not always the case.

Clinical trials are experiments or observations done for clinical research that compare the effects of one treatment with another. They may involve patients, healthy people, or both. Some are funded by pharmaceutical companies, and some are funded by the government. I will mainly focus on the phenomenon of hiding negative data in industry-funded trials.

Research published in 2005 by Evelyne Decullier, Senior Research Fellow at Hospices Civils de Lyon, compared registered trials that had failed with those that had succeeded, and examined which appeared in medical journals and the academic literature. The team found that only half of trials are ever published, and that positive results are 2.5 times more likely to be published than negative ones.
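To see why that 2.5-times figure matters, here is a small, purely illustrative calculation. The function and all of its numbers (the share of genuinely positive trials, the publication chance for a negative trial) are hypothetical assumptions, not figures from the study – the point is only to show how a publication-rate gap skews what readers of the literature see.

```python
# Illustrative only: hypothetical numbers showing how publication bias
# distorts the mix of results a reader of the journals actually sees.

def published_mix(n_trials, frac_positive, p_publish_negative, ratio=2.5):
    """Share of *published* trials that are positive, given that a
    positive result is `ratio` times more likely to be published
    than a negative one."""
    positives = n_trials * frac_positive
    negatives = n_trials * (1 - frac_positive)
    pub_pos = positives * min(1.0, ratio * p_publish_negative)
    pub_neg = negatives * p_publish_negative
    return pub_pos / (pub_pos + pub_neg)

# Suppose half of all trials are genuinely positive and a negative
# trial has a 30% chance of being published (made-up figures):
share = published_mix(1000, 0.5, 0.3)
print(f"{share:.0%} of published trials are positive")  # roughly 71% here
```

Even though the underlying evidence is split 50/50, the published record looks strongly positive – which is exactly the distortion the article describes.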

Now, you might say, ‘how can those trials possibly affect me or other ordinary people?’ Well, read on…

Why this matters for your health

Lorcainide is an anti-arrhythmic heart drug that was tested in clinical trials in the 1980s. The results showed that patients given lorcainide were far more likely to die than patients who weren’t. But those results were not published until ten years later, and in the meantime doctors had been prescribing the drug to patients. According to Dr Ben Goldacre, director of the DataLab at Oxford University, it has been estimated that more than 100,000 people who had taken lorcainide died in America as a result. And lorcainide is not an isolated case: similar things may be happening in other clinical trials, for drugs such as anti-depressants or cancer treatments.

The lack of transparency can also affect decisions on government spending. From 2006 to 2013, the UK government was advised to buy a drug called Tamiflu, which was supposed to reduce pneumonia and deaths caused by influenza. The government went on to spend £424 million stockpiling the drug. But when systematic reviewers tried to gather up all the trials that had been done on Tamiflu, they realised the government had only seen a small number of them. They battled for years to get the trials from the drug company, and when they finally got all of them, they found that Tamiflu was not effective enough to justify so large a cost. If companies continue to withhold trials, similar expensive mistakes will be repeated, putting volunteers, patients and doctors in danger.

Pharmaceutical companies have failed us, so what about the law? In America, medical trials held in the US must be registered and their results submitted within one year of the trial finishing. However, when scientists looked back at the data in 2015, they found that only 20% of trials had submitted and reported their results.

Industry-funded research is not the outright villain in this situation. Discoveries are more likely to occur in this type of research (Adam, 2005; Adam, Carrier, & Wilholt, 2006; Carrier, 2004). And thanks to industry funding, scientists are under less pressure to present something directly linked to real-world use than they are in public or government-funded projects (Carrier, 2011). As we all know, new technologies all start with discoveries.

Finding remedies

Here are some suggestions from scientists for improving the current situation: increasing transparency, increasing reproducibility and, most achievable of all, more effective criticism (Elliott, 2018). It is important to acknowledge that criticism doesn’t always need to be negative. Though the agencies usually responsible for evaluation can be limited by a variety of factors, such as understaffing or political issues, “they can get more involved in designing the safety studies performed by industry in specific cases,” suggests philosopher of science Kevin Elliott. (A safety study is a study carried out after a medicine has been authorised, to obtain further information on its safety or to measure the effectiveness of risk-management measures.)

Luckily, we have the technology in our hands. AlphaFold is leading the scene: it has made some amazingly accurate predictions of the 3D shapes of proteins, meaning scientists can use it to facilitate the design of stable proteins. It can also help make sense of X-ray data when determining crystal structures; before AlphaFold was invented, determining the structure of a protein for structure-based drug design could take three to four years. Now a prediction can be in front of you in less than an hour.

Everyone is different: some people have allergies, and some drugs simply do not work for some people. To avoid these situations, technologies such as AI could personalise your prescription. From DNA information sent to your pharmacy, AI could work out the dosage and the drug most suitable for you. Remarkably, the 3D-printed “polypill” combines all the personalised medication you need in a day into a single pill.

Hopefully, it is now a little easier to understand the importance of transparency in clinical testing. Trial results were never just numbers – they are directly linked to the lives of millions. Pharmaceutical companies were not simply hiding data – they were hiding the deaths of volunteers and patients, and the money families wasted on more expensive but less effective treatments. There must, without doubt, be serious consequences when companies don’t follow regulations. I believe there is hope if scientists use technology effectively and a better research environment is created for future generations.

Should prisoners on Death Row be accepted as Organ Donors?

Isobel, a Year 10 pupil at WHS, assesses the ethics and logistics of accepting death row prisoners as organ donors.

Disclaimer: This piece is based on the US death row and does not highlight my own views on capital punishment.

From a utilitarian standpoint, there may appear to be a simple answer to this question: organ donation should be permitted, because there is a global shortage of transplantable organs and those in dire health are unable to receive the medical care they need. However, as more research is done, numerous practical and ethical barriers arise. One country that already utilises organs from death row inmates is China. Reports state that more than 5,000 prisoners are executed in China annually, and organs are harvested for transplantation from suitable prisoners. These prisoners are executed via a temporal gunshot wound and are declared dead secondary to execution. They are not declared brain dead, which causes many ethical headaches, because the physicians removing the organs are then put in the position of executioner. This brief case study begins to highlight some of the major arguments against organ donation from death row prisoners.

Image via Pixabay

The numerous practical barriers surrounding organ procurement from death row prisoners begin to pile up on closer inspection. The first issue is the low yield of transplantable donor organs from these prisoners, due to a potentially high likelihood of alcohol or drug abuse. Whilst this may be a stereotype, these factors can drastically impact the quality of the organs being donated.

For example, alcoholism is the leading cause of liver disease in the US because heavy drinking can cause irreversible cirrhosis. Approximately 10-20% of heavy drinkers develop this disease, and it is the ninth leading cause of death in the US, killing around 35,000 people a year. Prisoners in long-term facilities will not live on nutrient-rich diets, will most likely be malnourished because of the (often) poor-quality food they have consumed, and will not get adequate exercise to keep organs such as the heart and lungs strong. These factors could also impact the quality of their organs for transplantation.

The second practical barrier preventing condemned prisoners from being organ donors is the logistics on the day of execution. The surgeon performing the operation cannot kill the patient by removing their organs, as this breaches the Hippocratic oath of ‘do no harm’. The patient must already be dead or pronounced brain dead before being put under general anaesthesia, because once the transplant team cross-clamps the aorta, performs the cardiectomy and takes the patient off the ventilator, the patient is declared dead. Many physicians’ groups, including the American Medical Association, have prohibited physician participation in state executions on ethical grounds.

Looking through a utilitarian lens, the death of an organ donor means dozens of lives saved, and the donation exists simply to help those suffering from end-stage organ disease, not for any ulterior motive. The two documents that set out the rules around organ donation in the US are the National Organ Transplant Act of 1984 and the Uniform Anatomical Gift Act. Neither explicitly prohibits organ donation by death row inmates, which means there is no law preventing it from happening. The National Organ Transplant Act states that organ donation cannot be made for ‘valuable consideration’, including an exchange of money, material benefit, or a shortened sentence. This would not be an issue for death row inmates, as they have already been condemned until the end of their life and have little access to wider society.

Christian Longo went public with his wish to donate his organs as a condemned prisoner and joined the organisation G.A.V.E. (Gifts of Anatomical Value from Everyone). He came up with the idea himself, so there is no fear of coercion, and he approached the New York Times with his story on a voluntary basis. There have been 14 other publicised instances of death row inmates and their lawyers seeking the opportunity to donate organs; all were denied on the grounds of current knowledge on the matter. As support for capital punishment begins to dim, public sympathy for those on death row is increasing, and the conversation about a prisoner’s ability to choose one good action in the world before their execution is becoming ever louder.

When a person is incarcerated, many of their free rights no longer apply. This can heighten the ethical arguments around organ donation, or simply make them too confusing to comprehend. Two seemingly opposite arguments illustrate this: fear of coercion (which insinuates that death row inmates are not being adequately protected) and the intention to preserve the morality of capital punishment (which implies that inmates’ rights are given too much protection).

There is a fine line between coercion and free choice when the choice is made in a heavily pressurised environment like a prison, and the emotional stress on the donor can be intense because of the need to make things right. Also, the patient accepting the donation should be notified that the organs they are receiving came from a person on death row; those who oppose capital punishment are then forced to choose between their life and their personal morals. Many say that the idea of capital punishment is to achieve retribution and deterrence in society, and the act of donation is not consistent with either of these aims. Making a hero of the person at the end of their life could have a detrimental impact on the family and friends of the victim of the prisoner’s crime. For many, the death of the perpetrator of their pain can bring closure and an end to the cycle of grief; to see them glorified in their last days could have the opposite effect.

Image via Pixabay

It is important to consider the impact that organ donation from death row prisoners will have on the overall practice. The number of potential organs recovered from condemned prisoners would be small. The conceivable stigma that would be attached to organ donation from its coupling with execution could lead to decreases in donation rates. This may especially be true within certain minority groups.

Any notion that groups of people were receiving increased numbers of death sentences to provide organs for the rest of society would clearly make it difficult to attempt to obtain consent for altruistic donation from these groups.

Overall, the bad outweighs the good, so although it may seem like an easy solution to a difficult problem, donation from death row inmates would cause more problems than it could hope to solve.

Antimicrobials in agriculture: a short introduction

Leah in Year 12 explores why the use of antimicrobials in commercial livestock agricultural medicine is such a contested issue.

The World Health Organization (WHO) has declared Antimicrobial Resistance (AMR) one of the top 10 public health threats facing humanity[1]. While the use of antimicrobials has proved a huge boost to the livestock industry, in terms of faster growth in animals, the concern is that more resistance in the food chain risks diminishing how effective antimicrobials will be in the human population.

Antimicrobials are agents used to kill microorganisms that may pose a threat to humans and animals. They include antibiotics and antibacterial medication for bacterial infections, antifungal medication for fungal infections, and antiparasitic medication for parasites.

Resistance, or AMR, develops when antimicrobials are not used for their full course, so the weakest strains of bacteria are killed but the strongest survive. The strongest strains then have the chance to adapt to the environment; over time, the alleles that confer resistance become more common in the population, and the antimicrobials become ineffective.
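The selection process described above can be sketched with a toy model. All the numbers here are hypothetical assumptions chosen for illustration, not empirical figures: the idea is simply that if resistant bacteria survive treatment more often than susceptible ones, the resistant fraction of the population climbs generation by generation.

```python
# Toy model of selection for resistance. Each generation, resistant
# bacteria survive an incomplete course of treatment at a higher rate
# than susceptible ones, so the resistant share of survivors grows.
# Survival rates and starting frequency are illustrative assumptions.

def resistant_fraction(p0, surv_resistant, surv_susceptible, generations):
    p = p0
    for _ in range(generations):
        r = p * surv_resistant          # resistant survivors
        s = (1 - p) * surv_susceptible  # susceptible survivors
        p = r / (r + s)                 # resistant fraction next generation
    return p

# Start with 1% resistance; resistant cells survive treatment 90% of
# the time, susceptible cells only 20% of the time:
for gen in (0, 5, 10, 20):
    print(gen, round(resistant_fraction(0.01, 0.9, 0.2, gen), 3))
```

Even from a 1% starting point, resistance dominates within a handful of generations in this sketch, which is why incomplete courses of treatment are so dangerous.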

As shown in the graph below[2], antimicrobial use in farming actually decreased by 45% from 2015 to 2019, but it has since picked up again, despite the issue becoming more widespread.

Antimicrobials are often used in both meat production and animal-product farming, such as dairy, with two main objectives: promoting growth and preventing disease.

The prevention of disease in livestock is the less contested issue, as it is well understood that livestock share living spaces, food, and water. Generally, if one animal contracts a disease, the whole flock is at risk due to the proximity. Antimicrobials can help break this link.

However, the WHO takes a strong view of antimicrobials as growth agents.[3] As stated in its guidelines on reducing antimicrobial resistance, the organization believes that ‘antimicrobials should not be used in the absence of disease in animals.’[4] Growth promotion works by helping convert the feed farmers give their livestock into muscle, causing more rapid growth. The quicker an animal reaches slaughter weight, the quicker it can be sent to an abattoir for profit. For example, a 44kg lamb – around the heaviest a lamb gets – produces 25kg of meat, so farmers want their lambs to reach 44kg to get the most money from each one.
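The lamb example above can be made concrete by reproducing the arithmetic. The yield-as-a-share-of-live-weight figure below (loosely, a "dressing percentage") is computed only from the article's own two numbers; nothing else is assumed.

```python
# Reproducing the article's example: a 44 kg lamb yielding 25 kg of meat.
# The yield share is simply meat weight divided by live weight.
live_weight_kg = 44
meat_yield_kg = 25

yield_pct = meat_yield_kg / live_weight_kg * 100
print(f"{yield_pct:.0f}% of the lamb's live weight becomes meat")  # about 57%
```

This is why every extra kilogram of growth matters economically: roughly 57% of any weight gained translates directly into saleable meat under the article's figures.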

Image via Pixabay

Over 400,000 people die from foodborne diseases each year. With the rise in antimicrobial resistance, these diseases will start to become untreatable. The most common pathogens transmitted through livestock and food to humans are non-typhoidal Salmonella (NTS), Campylobacter, and toxigenic Escherichia coli. Livestock can carry all of these pathogens, so they spread easily.

The WHO has been addressing AMR since 2001 and is advocating for it to become a more widely acknowledged issue. In some countries, the use of antimicrobials is already controlled; the US Food and Drug Administration (FDA) has been acting on the matter since 2014 because of the risk to human health.

Antimicrobial resistance is a contested issue because, much as a variety of governing bodies recognise AMR as a problem, farmers rely on their livestock for their livelihoods and are therefore often driven by profit to ensure an income. Evidence on antimicrobial resistance has always been hard to collect, which makes it harder to prove to these farmers that it is a growing issue.


References


[1] World Health Organization, WHO guidelines on use of medically important antimicrobials in food-producing animals, 7 November 2017, https://www.who.int/foodsafety/areas_work/antimicrobial-resistance/cia_guidelines/en/ Accessed: 24 April 2021

[2] World Health Organization, Antimicrobial resistance in the food chain, November 2017, https://www.who.int/foodsafety/areas_work/antimicrobial-resistance/amrfoodchain/en/ Accessed: 24 April 2021

[3] World Health Organization, Ten threats to global health in 2019, https://www.who.int/news-room/spotlight/ten-threats-to-global-health-in-2019 Accessed: 24 April 2021

[4] Farm Antibiotics – Presenting the Facts, Progress across UK farming, https://www.farmantibiotics.org/progress-updates/progress-across-farming/ Accessed: 24 April 2021

Invention through desperation – military medical advancements


Jessica, Year 13, explores military medical advancements in recent conflicts, discussing their impact and whether the nature of war acts as an inspiration for innovation.

In 2001, the conflict in Afghanistan began, continuing until a majority of British troops withdrew in the final months of 2014. During these years, 6,386 British personnel were injured, with 28 fatalities, leaving the survival rate at 99.6%.

This was unheard of in previous wars and a major success story for military medicine. However, the injuries and trauma suffered by soldiers during this period increasingly involved haemorrhaging and amputations caused by gunshot wounds and IEDs (improvised explosive devices – crude, unconventional homemade bombs). IEDs cause extensive blood loss, which has been attributed to 50% of combat deaths since the Second World War. For these soldiers to survive, military medicine had to change in order to preserve life and limb. Three major advancements in military trauma medicine all arose from the need to find solutions to the new injuries that personnel and medics were now witnessing.

The first is haemostatic dressings. During the Afghanistan conflict, two new dressings were developed: XSTAT and QuikClot powder, which contain components such as fibrinogen and thrombin that catalyse the natural coagulation response. XSTAT uses 92 medical sponges in a pocket-sized injector to pack an open wound and halt bleeding within fifteen seconds, increasing the chance of survival and holding pressure until the patient can reach a medical centre. The sponges also contain a molecule that is visible on X-ray, to ensure all of them are removed later and so prevent infection.

Secondly, there was a development of the traditional tourniquet. A tourniquet is a constricting or compressing device used to control venous and arterial blood flow to a portion of an extremity for a period of time; it works by creating pressure equal to or higher than the patient’s systolic blood pressure. The single-hand tie tourniquet is a development of the original tourniquet used by army medics, which had to be applied by the medic and was therefore only carried by them. Because patients could not apply their own tourniquets, crucial time and blood were lost while the medic reached the injured individual, reducing the chance of survival and increasing the complexity of the treatment and injuries. This led to the Combat Application Tourniquet (CAT), introduced into the US Army in 2005. It was the first single-hand tie tourniquet, allowing soldiers to treat their own injuries immediately until the medic could attend and provide more advanced care. The tourniquet distributes pressure over a greater area, which reduces the underlying tissue and nerve damage and prevents the tissue from becoming ischaemic (deprived of blood supply) while remaining effective. This decrease in the time before a tourniquet is applied has cut the mortality rate from haemorrhaging by 85%.

A third category of advancement is in the use of blood and the way it is transported. Blood and blood products, such as platelets, are crucial in the treatment of haemorrhaging and amputations. However, for blood to be viable for transfusion, it must be kept in a cool, constant environment – far from the natural one in Afghanistan. This was previously a significant disadvantage and contributed to the low survival rates for haemorrhaging, but it improved with the development of the blood container. The Golden Hour mobile blood container stores up to four units of blood and platelets at the required temperatures of six and two degrees Celsius respectively, for 72 hours, without electricity, batteries or ice. Crucially, this enabled blood to be brought forward to the battlefield rather than stored at the field hospital.

The environment of the military and the nature of its role means that trauma medicine needs to evolve to deal with the style of injuries it is experiencing: invention through desperation. However, it is important that the care not only reflects the immediate treatment of the patient but also considers their long-term care to ensure they can achieve a high quality of life post-conflict.

What would happen if there was no stigma around mental illness?


Emily, Year 12, explores why there is a stigma around mental illnesses, how we can get rid of this stigma, and what effect the stigma has on society.

Mental illness is not just one disorder – many people know that – but what they don’t understand is quite how expansive the list of disorders is. As young girls, we are taught about anxiety, body dysmorphic disorder, depression, addiction, stress, and self-harm, but the likelihood is that we know – from personal experience, through friends and family, or even social media – that many more mental illnesses exist: for example, bipolar disorder, obsessive-compulsive disorder, schizophrenia, autism and ADHD. Chances are, we all know someone with a mental illness, whether we know it or not – and most of the time these people function in the same way as people without one. So why is there such a stigma around mental illness, and how can we get rid of it?

When the AIDS epidemic started in the early 1980s, the disease mainly affected minority groups who already faced criticism; it made the patients virtual pariahs, until advocacy groups and communities protested to expand awareness and pressured the U.S. government to fund research into the disease and its cure. In only seven years, scientists were able to identify the Human Immunodeficiency Virus (HIV) as the cause of AIDS, create the ELISA test to detect HIV in the blood, and establish azidothymidine (AZT) as the first antiretroviral drug to help those suffering from HIV/AIDS. This is a prime example of how public awareness can lead science to push the boundaries of knowledge and find treatments. And along with eliminating symptoms, treatments also eliminate stigma, as more and more people learn about the disease. So why can’t this be the case for mental illness?

In a time when science wasn’t breaking new boundaries every day and knowledge wasn’t being distributed properly, it is easy to see why those with such complicated illnesses were feared and stigmatised. Now, however, when the greatest barrier is access to treatments rather than the science, and education about the subject is as good as it has ever been, it is hard to see why there is still such shame in having these illnesses.

But what if there were no stigma? We would have early identification and intervention in the form of screening mechanisms in primary care settings – GP, paediatric, obstetric and gynaecological clinics and offices, as well as schools and universities. The goal would be to screen those who are at risk of, or are having symptoms of, mental illness and engage them in self-care and treatment before the illness severely affects their brains and lives. We would also have community-based comprehensive care for those in more advanced stages of illness, supporting people who are unable to care for themselves and who may otherwise end up homeless, in jail or in mental hospitals.

For example: victims of trauma would be treated for PTSD, along with any physical injuries, while still in hospital, before symptoms occurred and the patient could hurt themselves or others; first responders would routinely receive preventative and decompression treatments rather than waiting to see who did or did not show symptoms; mothers would be treated for pre- and post-partum depression as part of pre- and post-natal check-ups instead of waiting until they potentially harmed themselves or their baby; and children with learning disabilities would be identified early so they could get cognitive training and emotional support, preventing counterproductive frustration over something they cannot control.

Medical economists have shown that this method of proactive mental healthcare would actually reduce the cost of delivering it. It would also relieve emotional stress for patients and their families, ease the financial burden of treatment, and reduce the occurrence of many prevalent social problems. We all know about the mass shootings that occur regularly, and a great number of these crimes have been perpetrated by young males with an untreated mental illness that had presented symptoms long before the crime was committed – not that I am excusing their behaviour in any way.

As a worldwide community, we must be able to recognise mental illness for what it is – a medical condition that can be treated, whether with behavioural or cognitive therapy or with medication. To dissolve the stigma, we must be involved, ask questions, be kind, be compassionate, and make it our own business. There is only so much science can do if people are not willing to accept the help they are offered – they need to want to get better. That will only happen if we all help to make it known that having a mental illness is not a bad thing, that it is treatable, and that those who have one are no different from anyone else.

Nanotechnology and its future in medicine – 07/09/18

Maya (Year 11), discusses the uses of nanotechnology in medicine, thinking about how far it has come and helped doctors. She also considers the dangerous aspects of using such small technology and the future benefits it may bring.

Technology in medicine has come far, and with it the introduction of nanotechnology. Nanotechnology is the manipulation of structures and properties at the atomic and molecular level – the scale involved is around one-billionth of a metre. The technology has many uses, such as electronics, energy production and medicine, and is valuable for its diverse applications. Nanotechnology is useful in medicine because of its size: it interacts with biological molecules of the same proportions or larger. It is a valuable new tool being used for research and for combatting various diseases.

In medicine, nanotechnology is already being used in a wide variety of areas, the principal one being cancer treatment. In 2006, a report issued by NanoBiotech Pharma stated that developments related to nanotechnology would mostly be focused on cancer treatments. Thus, drugs such as Doxil, used to treat ovarian cancer, use nanotechnology to evade the immune system, enabling the drug to be delivered to the disease-specific areas of the body. Nanotechnology is also helping in neuroscience, where European researchers are currently using it to carry electrical activity across the dead brain tissue left behind by strokes and illnesses. The initial research was carried out to obtain a more in-depth analysis of the brain and to create more bio-compatible grids (a piece of technology that surgeons place in the brain to find where a seizure has taken place). The result is more sophisticated than previous technologies and, when implanted, does not cause as much damage to existing brain tissue.

Beyond cancer treatment and research, nanotechnology is used in many areas of medicine, from appetite control to medical tools, bone replacement and even hormone therapy. Nano-sized particles are enhancing new bone growth, and there are even wound dressings containing nanoparticles that provide powerful microbial resistance. With these new developments we are revolutionising the field of medicine, and with further advancements we will be able to treat diseases as soon as they are detected.

Scientists are hoping that in the future nanotechnology can go even further and replace chemotherapy altogether: gold and silica nanoparticles would bind to the mutated cells in the body, and infra-red lasers would then heat up the gold particles and kill the tumour cells. This application would be beneficial because it would reduce the risk of damage to surrounding cells, which the laser would affect far less than chemotherapy does.

In other areas, nanotechnology is developing further in diagnostics and medical data collection. Doctors would be able to look for the damaged genes associated with particular cancers and screen tumour tissue faster and earlier than before. This process involves nano-scale devices being distributed through the body to detect chemical changes. Quantum dots can also be used in an external scan of a patient’s DNA, which is then sequenced to check whether they carry a particular debilitating gene – a quicker and easier method for doctors to check in detail whether a patient has contracted an illness or disease. Furthermore, nanotechnology will give doctors a more in-depth analysis and understanding of the body than x-rays and scans can provide.

While this is a great start for nanotechnology, little is still known about how some of the technology might affect the body. Insoluble nanoparticles, for example, could carry a high risk of building up in organs, as they cannot diffuse into the bloodstream. And because nanoparticles are so small, there is no controlling where they go: they might enter cells and even their nuclei, which could be very dangerous for the patient. The House of Lords Science and Technology Committee has reported concerns about nanotechnology’s effects on human health, stating that sufficient research has not been conducted on “understanding the behaviour and toxicology of nanomaterials” and that it has not been given enough priority, especially given the speed at which nanotechnology is being produced.

Nanotechnology is advancing medical treatment at a rapid rate, with new innovative technologies approved each year to help combat illnesses and diseases. Whilst more research needs to be conducted, the application of nano-medicine promises benefits with real potential to be valuable. Given the great burden that conditions like cancer, Alzheimer’s, HIV and cardiovascular disease impose on current healthcare systems, nanotechnology could revolutionise healthcare with its advanced techniques as it progresses.

@Biology_WHS