Is technology advancement really eating away at your future job?

Charlotte, Year 10, looks into the impact advancements in technology will have on future job opportunities.

Will technology only aggravate inequality, or provide healthier societies?

The technology-driven world we live in is full of thrilling and stimulating possibilities for our future. However, it is sure to pose countless challenges along the way.

Space tourism, digital "reincarnation" of people through AI, edible water blobs (the most exciting of them all!) and self-driving cars are some of the many developments expected in the future. But with all these startling products being created, some challenges are inevitably posed.

A major concern is jobs. Our jobs. The thing we will be relying on for income and a more comfortable lifestyle, the thing our whole education is aimed towards, the thing the economy relies on through the collection of taxes. Careers play a huge role in everyone's lives and in the economy, but how on earth could this amazing technology that is advancing us so much have a negative impact on the economy and your future?

I’m sure you have heard this many times before, and the biggest answer is simply: automation. Here are some figures to demonstrate how much will change – 9 out of 10 jobs will require digital skills, in 10 years’ time 50% of jobs will be changed by automation, and in 2025, humans will account for only 58% of total task hours, meaning the machines’ share will rise to 42% from the current 29%[1]. These staggering figures could be read as a worrying consequence of technological advancement, with automation consuming our jobs and picking away at our futures. However you have perceived those numbers, let me assure you that all of the foreboding figures can easily be outweighed by the fascinating possibilities of what is to come.

Examples include the following:

  • Unexpected industries will boom, not just the predicted boom of the IT industry; these include healthcare, veterinary science, social assistance, engineering, geology and history;
  • The share of women in the workforce is projected to reach 47.2% in 2024, and the number of men in the workforce is expected to slightly decrease to 52.8% in 2024;[2]
  • 85% of the jobs that will exist in 2030 haven’t even been invented yet.[3]

Personally, the last statistic excites me the most with the possibilities that are to come and will impact us. What jobs will be invented? How will they be invented? Who will invent them?

So, no matter how many articles and reports you see in the future about this topic, there are many positives that will override things reported as potential negatives. Change might be coming, as we have seen with the development of the internet over the last 40 years, but that does not mean that people will lose the ability to train, learn and adapt to use these new technologies in their day-to-day work. Creativity, critical thinking and complex problem solving – all things that automation currently finds challenging – have been identified as the top soft skills required by companies in 2020, and it is these areas which we need to promote in our learning.[4]

If you take one thing out of this brief article, let it be that creativity and your limitless imagination are the passport to the future.


[1] See https://www.weforum.org/press/2018/09/machines-will-do-more-tasks-than-humans-by-2025-but-robot-revolution-will-still-create-58-million-net-new-jobs-in-next-five-years/

[2] See https://core.ac.uk/download/pdf/85140562.pdf

[3] See https://www.linkedin.com/pulse/85-jobs-exist-2030-havent-been-invented-yet-leo-salemi#:~:text=According%20to%20a%20report%20published,t%20even%20been%20invented%20yet.

[4] See https://www.prca.org.uk/Creativity-is-the-number-one-skill-2020#:~:text=Creativity%20was%20identified%20by%20LinkedIn,’Future%20of%20Jobs’%20study.

Should prisoners on Death Row be accepted as Organ Donors?

Isobel, a Year 10 pupil at WHS, assesses the ethics and logistics of accepting death row prisoners as organ donors.

Disclaimer: This piece is based on the US death row and does not highlight my own views on capital punishment.

From a utilitarian standpoint, there may appear to be a simple answer to this question: organ donation should be permitted because there is a global shortage of transplantable organs and those in poor health are unable to receive the medical care they need. However, as more research is done, numerous practical and ethical barriers arise. One country that already utilises organs from death row inmates is China. Reports state that more than 5,000 prisoners are executed in China annually, and organs are harvested for transplantation from suitable prisoners. These prisoners are executed via a gunshot wound to the temple and are declared dead secondary to execution. They are not declared brain dead, which causes many ethical headaches because the physicians removing the organs are then put in the position of executioner. This brief case study begins to highlight some of the major arguments against organ donation from death row prisoners.

Image of surgery via Pixabay

The numerous practical barriers surrounding organ procurement from death row prisoners pile up on closer inspection. The first issue is the potentially low yield of transplantable donor organs from these prisoners, owing to higher rates of alcohol or drug abuse. Whilst this risks being a stereotype, these factors can drastically impact the quality of the organs being donated.

For example, alcoholism is the leading cause of liver disease in the US because heavy drinking can cause irreversible cirrhosis. Approximately 10-20% of heavy drinkers develop this disease and it is the ninth leading cause of death in the US, killing around 35,000 people a year. Prisoners in long-term facilities will not live on nutrient-rich diets, will most likely be malnourished because of the (often) poor-quality food they have consumed, and will not get adequate exercise to maintain strong organs such as hearts and lungs. These reasons could also impact the quality of their organs for transplantation.

The second practical barrier preventing condemned prisoners from being organ donors is the logistics on the day of execution. The surgeon performing the operation cannot kill the patient by removing their organs, as this breaches the Hippocratic oath of ‘do no harm’. The patient must already be dead, or pronounced brain dead, before the organs are removed: once the transplant team cross-clamps the aorta, performs the cardiectomy and takes the patient off the ventilator, the patient is then declared dead. Many physicians’ groups, including the American Medical Association, have prohibited physician participation in state executions on ethical grounds.

Looking through a utilitarian lens, the death of an organ donor means dozens of lives saved, and the donation is there simply to help those suffering from end-stage organ disease, not for any ulterior motive. The two documents that set out the rules around organ donation in the US are the National Transplant Act of 1984 and the Uniform Anatomical Gift Act. Neither of these documents explicitly prohibits organ donation by death row inmates, which means there is no law preventing it from happening. The National Transplant Act states that organ donation cannot be made for ‘valuable considerations’, including an exchange of money, material benefit, or a shortened sentence. This would not be an issue for death row inmates, as they have already been condemned until the end of their life and have little access to wider society.

Christian Longo went public with his idea to donate his organs as a condemned prisoner and joined the organisation G.A.V.E (Gifts of Anatomical Value from Everyone). He came up with the idea himself, so there is no fear of coercion, and he approached The New York Times with his story on a voluntary basis. There have been 14 other publicised instances of death row inmates and their lawyers seeking the opportunity to donate their organs. They were denied on the grounds of current knowledge on the matter. As support for capital punishment begins to dim, the public’s sympathy for those on death row is increasing. The conversation surrounding a prisoner’s ability to choose to do one good action in the world before their execution is becoming ever louder.

When a person is incarcerated, many of their free rights no longer apply. This can heighten the ethical arguments considered in organ donation, or make them simply too confusing to comprehend. Two seemingly opposite arguments represent this: fear of coercion (which insinuates that death row inmates are not being adequately protected) and the intention to preserve the morality of capital punishment (which insinuates that death row inmates’ rights are given too much protection).

There is a fine line between coercion and free choice when the choice is made in a heavily pressurised environment like a prison. The emotional stress on the donor can be intense because of the need to make things right. Also, the patient accepting the donation should be notified that the organs they are receiving came from a person on death row. Those who oppose capital punishment are then forced to choose between their life and their personal morals. Many say that the purpose of capital punishment is to achieve retribution and deterrence in society. The act of donation is not consistent with either of these aims. Making a hero of the person at the end of their life could have a detrimental impact on the family and friends of the victims of the prisoner’s crime. For many, the death of the perpetrator of their pain can bring closure and an end to the cycles of grief. To see them glorified in their last days could have the opposite effect.

Image via Pixabay

It is important to consider the impact that organ donation from death row prisoners will have on the overall practice. The number of potential organs recovered from condemned prisoners would be small. The conceivable stigma that would be attached to organ donation from its coupling with execution could lead to decreases in donation rates. This may especially be true within certain minority groups.

Any notion that groups of people were receiving increased numbers of death sentences to provide organs for the rest of society would clearly make it difficult to attempt to obtain consent for altruistic donation from these groups.

Overall, the bad outweighs the good so although it may seem like an easy solution to a difficult problem, donation from death row inmates would cause more problems than it could hope to solve.

Coronavirus and the economy


Lily in Year 13 wrote this article just before the start of Lockdown 2 in November 2020 in the UK. As we now gradually come out of Lockdown 3 some 7 months later, how much of the article rings true?

As of 23rd March 2020 the UK was placed under lockdown and has been moving in and out of lockdowns and restrictions ever since. This is likely to cause an economic slowdown and possibly plunge the UK into recession over the coming years, because when people are in lockdown, consumption and aggregate demand in the UK are likely to fall. There will be impacts on workers and on the UK’s supply of goods, both domestically and from abroad, which is also likely to harm the economy. Overall, the UK and its economy are likely to suffer as a result of Covid-19.

Possible impacts

Firstly, aggregate demand in the UK will be hit by the virus: when people are quarantined they are unable to go out and spend, especially as all shops selling non-essential goods, along with bars and restaurants, have recently been told to close, meaning spending in these sectors is no longer possible. This will be compounded by the low consumer confidence currently present, as in uncertain times people save their money as a safety net so that they will be able to support themselves.

Consumption patterns will also change. Goods with an income elasticity of demand between 0 and 1 are likely to see little change in demand, as these goods are classed as necessities. Some might even see a rise in demand in the short term as people panic buy (think back to scenes of toilet roll panic buying in April 2020).

However, goods with an income elasticity over one are likely to see a decrease in demand, as these are classed as luxuries, so people won’t prioritise purchasing these items when economic conditions are uncertain. This means that overall consumption will decrease, as people don’t have the opportunity to spend as much out shopping or on luxuries and are also likely to be more cautious with their spending by nature. This fall in consumption will be exacerbated if people’s incomes are negatively impacted, for example if they were previously working in the gig economy, perhaps on zero-hour contracts or with irregular, situation-based income.
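The elasticity reasoning above can be sketched in code. This is only an illustration (the function names and figures below are hypothetical, not from the article): income elasticity of demand (YED) is the percentage change in quantity demanded divided by the percentage change in income, and goods are classed as necessities (YED between 0 and 1) or luxuries (YED over 1) as described.

```python
def income_elasticity(pct_change_demand, pct_change_income):
    """YED = % change in quantity demanded / % change in income."""
    return pct_change_demand / pct_change_income

def classify(yed):
    """Classify a good by its income elasticity, per the categories above."""
    if yed < 0:
        return "inferior good"
    elif yed <= 1:
        return "necessity"  # demand changes little as incomes change
    else:
        return "luxury"     # demand falls sharply when incomes fall

# Hypothetical figures: incomes fall 10%; bread demand falls 2%,
# while holiday-flight demand falls 25%.
bread = classify(income_elasticity(-2, -10))     # YED 0.2 -> "necessity"
flights = classify(income_elasticity(-25, -10))  # YED 2.5 -> "luxury"
```

On these (made-up) numbers, bread behaves as a necessity and flights as a luxury, matching the pattern the article describes in a downturn.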

These people would be relying on their savings or help from the government for their money, meaning their spending and consumption are likely to fall. To encourage more spending, the Bank of England has dropped interest rates to a new record low of 0.1%, to encourage people to go out and spend instead of save, as they will receive very little gain from letting their money sit in a bank account. Also, if demand falls whilst supply stays constant, the prices of goods and services are likely to fall. This can be seen through the drop in the price of flights as airlines suffer a shortage of demand. For example, a flight from California to London that would previously have cost $1,000 can now be purchased for as little as $246. However, this drop in prices is likely to have a limited effect on demand because of the current uncertainty, which also means low interest rates are likely to have a limited effect in changing people’s behaviour. Recent travel bans will also limit the impact of lower prices in the airline industry, since people are unable to take advantage of them when they cannot travel or will be deterred by quarantine times. This means that overall aggregate demand and consumption in the UK are likely to fall, partly because there will be fewer opportunities for people to spend, but mostly because of consumer uncertainty. This will negatively impact the UK economy.
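A quick arithmetic check on the flight example, using the article's figures (the helper name is mine):

```python
def pct_change(old_price, new_price):
    """Percentage change from an old price to a new price."""
    return (new_price - old_price) / old_price * 100

# California-to-London fare: previously $1,000, now $246.
fare_drop = pct_change(1000, 246)  # about -75%, i.e. a 75% fall
```

So the quoted fares correspond to roughly a three-quarters cut in price, which makes the limited demand response described above all the more striking.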

Aggregate supply in the UK will also be impacted by the virus, affecting costs along with exports if less is being produced domestically. A fall in domestic supply could result in cost-push inflation: if demand for some products remains steady, prices would have to rise to make up for the limited stock or production caused by Covid-19 restraints on supply. This fall in supply could be caused by workers having to self-isolate and being unable to go to work, meaning a company cannot operate at full capacity, causing an inward shift of its PPF.

Companies might also have trouble receiving stock from other countries whose production has been impacted by the virus, which could prevent production in domestic businesses. This is likely to hit the manufacturing industry especially hard, as it relies on parts from abroad (such as the car company Jaguar, which is now running out of parts it would usually ship from China[1]) and on people coming to work, as it is very difficult to work from home if you work in a factory. This effect on supply could be reduced because the UK has a flexible labour market, meaning people can easily move from job to job, so businesses won’t be hit as hard by the shock because resources (people) can be reallocated more easily than in France, where the labour market is very inflexible.

This effect on supply also depends on how flexible the product markets are: if a company could switch supplier there would be minimal effect, especially if the new supplier was located domestically instead of abroad. There might also be a time lag if producers have stockpiled, meaning it would take longer to run out of stock, although this is still likely to happen in the long term. In the long run this outbreak could cause the LRAS curve to shift inwards if there is less investment, as companies will be working harder to keep afloat rather than investing or spending on R&D, meaning less will be invested in the country’s long-term productivity.

This can be seen through the fall in the share prices of the FTSE 100: on 9th March 2020 the average price of shares fell 8%[2], the worst day since the 2008 financial crisis, with £144bn wiped off its combined value. These top 100 companies were likely to be some of the largest investment spenders. However, as the UK’s economy is mainly based on financial services, many people can work from home, meaning domestic supply issues may not affect the economy as much as in countries that rely heavily on manufacturing, such as China and Germany; the Chinese economy shrank by 6.8% in the first quarter of 2020, its first contraction since 1976. The government could also reduce the effect on domestic supply by subsidising companies so they are able to invest in the technologies they need and keep production lines running; or, as interest rates are low, companies could take out a loan to invest. This means there are options for companies to try to uphold supply and investment. However, there is still the underlying issue of people not being able to go to work because of self-isolation. This will have a huge impact on supply in the UK, and therefore on the economy: if products aren’t being made they cannot be consumed, and if they aren’t being consumed profits will fall, leaving companies with less money to invest, impacting supply in the short term along with productivity in the long term.

Overall, the impact of the virus is going to be wide-ranging across the world and in the UK, impacting both supply and demand. The fall in supply in the short term will be slightly counteracted by the fall in aggregate demand: if both curves shift inwards there will be a new equilibrium point of production, and cost-push inflation is likely to be limited. However, the nation’s productivity and output will decrease, which means the UK’s GDP is likely to fall significantly, plunging the UK into a recession. The British government is spending more to combat some of the impacts; however, this is unlikely to cover the full economic impact and will lead to a rise in the government budget deficit as a result. This also makes it likely that we will see the government following a policy of austerity over the coming years, meaning people in the UK and the UK’s economy are likely to be hit hard by this crisis. However, as the virus is a global pandemic, it is likely that its impact will be mirrored across the rest of the world.[3]

2021 Update

The UK Government introduced the furlough scheme to support workers and businesses who were unable to run as normal owing to the impact of the virus. Up to April 2021 this has cost over £61 billion[4], with 4.7 million jobs impacted. Government spending is at the highest figure ever seen outside of periods of war.[5]

With increasing numbers of the population having now been vaccinated against the virus, and the recently announced reopening of restaurants from 17 May 2021[6], there is a feeling that we are gradually moving out of the crisis and that normality is becoming closer.

However, the vast spending seen since the crisis started in January 2020 will almost certainly mean that a return to a pre-pandemic life will be a challenge and that further austerity will be required to settle the books. The need to ‘level up’ the country, as announced by the Prime Minister on 11 May 2021[7], will be central to life as we know it for many years to come. Whether this can be achieved will be subject to criticism and debate for the foreseeable future.


[1] See https://www.independent.co.uk/news/business/news/coronavirus-jaguar-land-rover-suitcases-supply-chain-factories-china-a9343336.html

[2] See https://www.theguardian.com/business/2020/mar/09/ftse-plunges-to-below-6000-amid-global-coronavirus-sell-off-oil#:~:text=9%20March%202020%20Fears%20of,into%20an%20official%20bear%20market.

[3] See https://www.ons.gov.uk/economy/grossdomesticproductgdp/articles/coronavirusandtheimpactonoutputintheukeconomy/december2020#:~:text=6.,declined%20by%209.9%25%20in%202020.&text=GDP%20measured%20by%20the%20output,growth%20of%201.4%25%20in%202019. The impact on the UK economy is a 9.9% fall in GDP over the course of the year.

[4] Table and reference from https://www.statista.com/statistics/1122100/uk-cost-of-furlough-scheme/

[5] See https://www.bbc.co.uk/news/business-52663523

[6] See https://www.telegraph.co.uk/politics/2021/05/12/covid-lockdown-roadmap-new-rules-may-17-dates-when-end/

[7] See https://inews.co.uk/news/politics/queens-speech-2021-boris-johnson-pledges-to-harness-spirit-of-lockdown-as-he-sets-out-uks-covid-recovery-995558

Antimicrobials in agriculture: a short introduction

Leah in Year 12 explores why the use of antimicrobials in commercial livestock agricultural medicine is such a contested issue.

The World Health Organization (WHO) has declared Antimicrobial Resistance (AMR) as one of the Top 10 public health threats facing humanity[1]. While the use of antimicrobials has proved a huge boost to the livestock industry, in terms of faster growth in animals, the concern is that higher concentration of AMR in the food chain risks diminishing how effective antimicrobials will be in the human population.

Antimicrobials are agents used to kill unwanted microorganisms by targeting the ones that may pose a threat to humans and animals. These include antibiotics and antibacterial medication for bacterial infections, antifungal medication for fungal infections and antiparasitic medication for parasites.   

Resistance, or AMR, develops when antimicrobials are not used for their full course, so the weakest strains of bacteria are killed but the strongest ones survive. This means that the strongest strains then have the chance to adapt to the environment. Over time these variant alleles become more common in the population and the antimicrobials will become ineffective.  

As shown in the graph below[2], antimicrobial use in farming actually decreased by 45% from 2015 to 2019, but has since picked up again despite the issue becoming more widespread.

Antimicrobials are used often in the production of both meat and animal product farming such as dairy cows, with two main objectives: promoting growth and preventing disease.  

The prevention of disease in livestock is less of a contested issue, as it is well understood that livestock share living spaces, food, and water. Generally, if one animal contracts a disease then the whole flock is at risk due to proximity. Antimicrobials can help break this link.

However, the WHO has a strong viewpoint on antimicrobials as growth agents.[3] As stated in its guidelines on reducing antimicrobial resistance, the organization believes that ‘antimicrobials should not be used in the absence of disease in animals.’[4] Growth promotion works by helping convert the feed that farmers give their livestock into muscle, causing more rapid growth. The quicker an animal reaches slaughter weight, the quicker it can be sent to an abattoir for profit. For example, a 44kg lamb (the heaviest a lamb typically reaches) produces 25kg of meat, so farmers want their lambs to reach 44kg so that they can get the most money from each one.

Image via Pixabay

Over 400,000 people die from foodborne diseases each year. With the rise in antimicrobial resistance, these diseases will start to become untreatable. The most common pathogens transmitted through livestock and food to humans are non-typhoidal Salmonella (NTS), Campylobacter, and toxigenic Escherichia coli. Livestock can carry all of these pathogens, so they can spread easily.

The WHO has been addressing AMR since 2001 and is advocating for it to become a more widely acknowledged issue. In some countries, the use of antimicrobials is already controlled. The US Food and Drug Administration (FDA) has been acting on this matter since 2014 because of the risk to human health.

Antimicrobial resistance is a contested issue because, as much as AMR is a problem with a variety of governing bodies behind it, there will always be the point that farmers rely on their livestock for their livelihoods, meaning they are often driven by profit to ensure an income. Evidence on antimicrobial resistance has always been hard to collect, which makes it harder to prove to these farmers that it is a growing issue.


References


[1] World Health Organization, WHO guidelines on use of medically important antimicrobials in food producing animals, 7 November 2017, https://www.who.int/foodsafety/areas_work/antimicrobial-resistance/cia_guidelines/en/
Accessed:  24th April 2021

[2] World Health Organization, Antimicrobial Resistance in in the food chain, November 2017, https://www.who.int/foodsafety/areas_work/antimicrobial-resistance/amrfoodchain/en/
Accessed:  24th April 2021

[3] World Health Organization, Ten threats to global health in 2019, https://www.who.int/news-room/spotlight/ten-threats-to-global-health-in-2019 Accessed:  24th April 2021

[4] Farm Antibiotics – Presenting the Facts, Progress across UK farming, https://www.farmantibiotics.org/progress-updates/progress-across-farming/
Accessed:  24th April 2021

Impact study: the spread of imported disease in Australia and New Zealand

Sophia (Year 13) looks at how European colonialism spread disease to Australia and New Zealand.

Although the tragedies brought by the actions of colonisers, such as slavery, wars and other abysmal treatment of native populations, caused many deaths, one of the biggest killers was the introduction of new diseases to which natives had no immunity, owing to their previous isolation from the European invaders.

Image from Pexels

Between 1200 and 1500, Europe itself suffered several pandemics, due to the growth of unsanitary cities (creating the perfect environment for infection) and to increasing contact with the Old World, such as through Mongol and Turkish invasions, which exposed Europe to major disease outbreaks. For example, between 1346 and 1351 the Black Death killed off about a third of Europe’s population. Relatively disease-hardened populations emerged from this in Europe: although local epidemics continued after 1500, none were really as bad as those of the years before; rather, the worst epidemics after this date were in colonised nations. Here I will focus on the colonisation of Australia and New Zealand, with their different native peoples (the Aborigines and the Maori) and the different effects of disease on each.

New Zealand

Imported diseases began to impact many Maori from the 1790s, including viral dysentery, influenza, whooping cough, measles, typhoid, venereal diseases, and the various forms of tuberculosis. Missionaries and observers reported massive death rates and plummeting birth rates. However, unlike in the Americas and Australia, there is a strong chance that the deaths resulting from foreign disease were widely exaggerated.

Rather, such exaggeration labelled the Maori a dying race (a view which persisted to 1930), which helped to project the British Empire into New Zealand in 1840. One of the reasons the effect of disease was probably smaller here was simply the distance from Europe to New Zealand: it was a natural quarantine. The trip took four months or more, meaning that the sick either died or recovered; either way, they were often no longer infectious on arrival. Therefore, the most pernicious European diseases (malaria, bubonic plague, smallpox, yellow fever, typhus and cholera) did not manage to transfer to New Zealand.

Image by Dan Whitfield via Pexels

Another factor which fostered the gross magnification of the demise of the Maori was the comparison of birth rates; missionary families were extremely large: the fourteen couples who went to New Zealand before 1833 had 114 children. Therefore, it was easy to amplify the decline in Maori birth rates into something far more serious than it was. Estimates of the Maori population at the point of contact with Europeans are very unreliable, in most cases wild guesses, and this also allows the effect of disease to be misjudged. For example, one estimate for 1769 based upon archaeological science gives an estimated pre-contact birth rate of 37 per thousand per year and a death rate of 39[1], implausible given that a population whose deaths consistently outpaced its births could not have sustained itself before contact. However, more moderate calculations suggest an average decline of 0.3% per year between 1769 and 1858[2]. Therefore, although the Maori population did suffer as a result of these diseases, there is a tendency to exaggerate this, to portray them as a ‘weaker’ people and a dying race, allowing for easier colonisation.
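To see what the moderate estimate implies overall, the 0.3% average annual decline can be compounded over the 89 years from 1769 to 1858. A quick sketch (the helper name is mine; the rate is the article's cited estimate):

```python
def remaining_fraction(annual_decline, years):
    """Fraction of a population remaining after compounding a
    constant annual rate of decline for the given number of years."""
    return (1 - annual_decline) ** years

# 0.3% average annual decline over 1769-1858 (89 years).
frac = remaining_fraction(0.003, 1858 - 1769)  # roughly 0.77
```

That is, the moderate figures imply the Maori population fell to around three-quarters of its pre-contact size over the whole period: a real decline, but far from the catastrophic collapse the 'dying race' narrative suggested.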

Australia

Although Australia was initially discovered by the Dutch, it was a fleet of British ships which arrived at Botany Bay in January 1788 to establish a penal colony[3].  European disease spread to areas of Australia, even before Europeans had reached those parts. For example, there was a smallpox epidemic near Sydney in 1789, wiping out around half of the Aborigines there.[4] 

Photo by Damon Hall from Pexels

Some historians claim that this was acquired through contact with Indonesian fishermen in the far north and then spread; others argue that the outbreak was likely a deliberate act by British marines when they ran out of ammunition and needed to expand their settlement. Indeed, colonial thinking at the time unfortunately placed Europeans as the ‘superior race’; a book written by William Westgarth in 1864 on the colony of Victoria included: ‘the case of the Aborigines of Victoria confirms…it would seem almost an immutable law of nature that such inferior dark races should disappear’. Therefore, as with New Zealand, description of the natives as a ‘dying race’ was an important tool for colonisation, meaning the purposeful introduction and spread of some diseases is not too hard to believe.

Smallpox spread between Aboriginal communities, reappearing in 1829-30; according to one record killing 40-60% of the Aboriginal population.[5]  In addition, during the mid-late 19th century, many Aborigines in southern Australia were forced to move to reserves; the nature of many of these institutions enabled disease to spread quickly and many even began to close down as their populations fell.

Conclusion

Although one must be wary of statistics about native mortality rates in both countries, given the European tendency to exacerbate the decline in native populations, it is fair to say that the decline in Aboriginal populations was much higher than that of the Maori in New Zealand, although wars also massively contributed to this.

While roughly 16.5% of the New Zealand population is Maori, only 3.3% of Australians are Aboriginal, and it is safe to say that disease influenced this to some extent. So why was there such a difference between the effects of disease in these countries, seemingly close together and both colonised by the British? A very large reason was smallpox; this was by far the biggest killer in Australia, but never reached New Zealand. The nature of the existing native communities was also important: there were 200-300 different Aboriginal nations in Australia, all with different languages, but the Maori were far more united, and often seen as a more ‘advanced’ society, and therefore were never forcibly placed into reserves, which is where a lot of the spread of disease took place.

In addition, events in New Zealand occurred much later than Australia, after slavery had been outlawed, meaning there was a slightly more humanitarian approach, and there is less evidence to suggest purposeful extermination of the Maori. This is not to discount any injustices suffered by the Maori; indeed, many did die from European disease, and in both cases the native populations were treated appallingly and were alienated from their land.

The influence of European disease was overwhelmingly more powerful in Australia. However, one must approach statistics about the effect of disease on native peoples with caution, as Europeans tended to exaggerate this area to portray such peoples as ‘dying races’, a device often used to support colonisation.


Bibliography

Ian Pool, Te Iwi Maori (New Zealand: Oxford University Press), 1991

James Belich, Making Peoples (New Zealand: Penguin Books), 1996

John H. Chambers, A Traveller’s History of New Zealand and the South Pacific Islands (Great Britain: Phoenix in association with the Windrush Press), 2003


[1] cited in Ian Pool, Te Iwi Maori (New Zealand: Oxford University Press), 1991, 35

[2] Ibid., 56

[3] Wikipedia cites Lewis, Balderstone and Bowan (2006), 25

[4] Judy Campbell, Invisible Invaders: Smallpox and other Diseases in Aboriginal Australia 1780-1880 (Australia: Melbourne University Press), 2002, 55

[5] Wikipedia cites Richard Broome, Arriving (1984), 27

What is a random number and what is the random number generator?


Sungmin in Year 13 looks at random numbers, explaining what they are and how they are relevant to our lives – from encrypted passwords to how games are programmed.

Random number in real life

As public and private data networks proliferate, it becomes increasingly important to protect the privacy of information. Random numbers are a core component of this: they underpin the security of computer systems, from encryption keys to passwords. They matter elsewhere too – in computer games, for example, random numbers ensure that the same decisions can lead to different outcomes each time you play. At an arcade, many games rely on randomness: falling objects follow different patterns each time, so that no one can predict when to catch them. Otherwise, some players would be able to calculate exactly how an object will fall, and the game would quickly lose its appeal.

Games like these, and many other situations, require randomness to be unpredictable – but not all randomness is equal. When private data must be encrypted, a true random number plays a significant role and must be used; when we are programming games, on the other hand, either true random numbers or pseudo-random numbers will do. As you might have noticed, there are two types of random number, separated by how they are generated: the true (or real) random number and the pseudo-random number.

Definitions of ‘random number’

There are several different ways to define the term ‘random number’. First of all, it is a number which is generated for, or as part of, a set exhibiting statistical randomness: a numeric sequence is said to be statistically random when it contains no recognisable patterns or regularities.
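A crude way to see what ‘no recognisable patterns’ demands is to run a simple frequency count over a sequence. The Python sketch below is purely illustrative (the function name and tolerance are hypothetical, not part of any standard randomness test suite):

```python
from collections import Counter

def frequency_check(sequence, tolerance=0.2):
    """Crude probe: is each value's observed count within `tolerance`
    of the uniform expectation? (Illustrative only - real randomness
    test suites apply many complementary statistical tests.)"""
    counts = Counter(sequence)
    expected = len(sequence) / len(counts)
    return all(abs(c - expected) / expected <= tolerance
               for c in counts.values())

skewed = [1] * 90 + [2] * 10
print(frequency_check(skewed))     # False: clearly non-uniform

patterned = [1, 2] * 50
print(frequency_check(patterned))  # True, yet 1,2,1,2,... is obviously
# not random: balanced counts alone cannot detect a repeating pattern.
```

Note how the perfectly regular sequence passes the frequency check: equal counts are necessary but nowhere near sufficient, which is exactly why the definition demands no recognisable patterns of any kind.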

Secondly, a random number can be defined as an element of a random sequence obtained from a stochastic process. A stochastic (or random) process is “a collection of random variables that is indexed by some mathematical set, meaning that each random variable of the stochastic process is uniquely associated with an element in the set.” Moreover, in algorithmic information theory a random number belongs to an algorithmically random sequence. An algorithm is a “process or set of rules to be followed in calculations or other problem-solving operations, especially by a computer”, and an algorithmically random sequence is an infinite sequence of binary digits that appears random to any algorithm. The notion applies analogously to sequences over any finite alphabet (e.g. decimal digits).

Random sequences are the key object of study in algorithmic information theory, a merger of information theory and computer science that concerns itself with the relationship between computation and the information content of computably generated objects, such as strings or other data structures. Random numbers can also be described simply as the outputs of a random number generator. Cryptography – “the art and science of keeping messages secure” – defines a random number slightly differently again: there, above all, a random number must be impossible for an adversary to predict.

Different types of random number generators

True random number generators:

  • True random number generator (TRNG)
  • Hardware random number generator (HRNG)

Pseudo-random number generators:

  • Pseudo-random number generator (PRNG)
  • Linear congruential generator (LCG)
  • Random number generator in C++

The major difference between pseudo-random number generators (PRNGs) and true random number generators (TRNGs) is easier to understand if you compare computer-generated random numbers to rolls of a dice. A PRNG generates numbers using a mathematical formula or a precalculated list, akin to someone rolling a dice many times and writing down all the outcomes: whenever you ask for a random number, you get the next roll on the list. The numbers appear random, but they are in fact predetermined by the formula and the starting value. True random number generators, by contrast, work by “getting a computer to actually roll the dice — or, more commonly, use some other physical phenomenon that is easier to connect to a computer than is a dice”.
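The ‘precalculated list’ idea can be made concrete with a linear congruential generator, the classic PRNG mentioned in the list above. The Python sketch below uses the widely cited Numerical Recipes constants purely for illustration; production code should rely on a library generator, not this:

```python
# Minimal linear congruential generator (LCG): each new number is
# computed deterministically from the previous one via
#     state = (A * state + C) mod M
# Constants are the well-known Numerical Recipes values (illustrative).
A, C, M = 1664525, 1013904223, 2**32

def lcg(seed, n):
    """Return the first n pseudo-random numbers for a given seed."""
    state, out = seed, []
    for _ in range(n):
        state = (A * state + C) % M
        out.append(state)
    return out

# The same seed always reproduces the same "rolls of the dice":
# the sequence merely looks random - it is entirely predetermined.
print(lcg(42, 5) == lcg(42, 5))  # True
print(lcg(42, 5) == lcg(43, 5))  # False: different seed, different list
```

This is exactly the dice-list analogy: once the seed is fixed, every ‘roll’ that will ever come out of the generator is already determined.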

Comparisons of PRNGs and TRNGs
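In everyday programming the contrast shows up directly. Python’s `random` module, for instance, is a PRNG (a Mersenne Twister), so seeding it reproduces the whole sequence – fine for games and simulations – while the `secrets` module draws on the operating system’s entropy pool, the kind of source security-sensitive code should use. A small sketch:

```python
import random
import secrets

# PRNG: seeding makes the "dice rolls" fully reproducible.
random.seed(123)
first = [random.randint(1, 6) for _ in range(5)]
random.seed(123)
second = [random.randint(1, 6) for _ in range(5)]
print(first == second)  # True: same seed, same sequence - good for games,
                        # unacceptable for passwords or keys

# OS entropy: unpredictable values for tokens, passwords and keys.
token = secrets.token_hex(16)   # 32 hex characters
print(len(token))               # 32
```

The rule of thumb follows the article’s two categories: reproducible pseudo-randomness for games and testing, operating-system entropy whenever an attacker must not be able to guess the next value.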

As you go about your day today, consider how random numbers are being used to support your daily activities – from the passwords we use to the apps on our phones and the devices and programmes we engage with. Maths is all around us.

Capitalising on eco-anxiety: inside the world of ‘greenwashing’

Vera (Year 13) looks at the issues surrounding ‘greenwashing’ – where false or misleading claims about the eco-friendly nature of a product are made to support sales.

 

Have you ever been enticed by rough brown packaging or images of green fields and butterflies while shopping? It’s only natural to be lured by the green statements on packaging such as ‘all natural’ or ‘eco-friendly’. But have you ever stopped to think about what these statements truly mean?

It is easy to paint a company as ‘eco-friendly’ with a skilled hand in marketing and an assumption that the consumer will not look into the claims plastered on their advertisements. This is terribly harmful for the environment and is summarised by a term known as ‘greenwashing’.

Greenwashing is the practice of a company making claims about its environmental impact that are either misleading or false; a commercial sleight of hand to distract its eco-conscious consumers from its true environmental impact. The term was coined by Jay Westerveld in 1983 while he was on a student research trip to Samoa. He had stopped by Fiji to surf and, while sneaking into the Beachcomber resort to steal fresh towels, he saw a note to customers telling them to refuse new towels to protect the local reefs. He found the claim ironic, since the resort was at the time expanding into local ecosystems yet had painted itself as environmentally conscious. Westerveld and a fellow student later wrote an essay in a literary magazine and gave a name to the resort’s practice. The term had entered the Oxford English Dictionary by 2000.

It would be best to explain greenwashing with an example. One early case is that of DuPont in 1989. DuPont is an American chemicals company – previously the world’s largest in terms of sales. A 1989 advertising campaign announced new double-hulled oil tankers. The ad shows clapping dolphins and other marine animals in a crystal-blue sea, with Beethoven’s Ode to Joy playing in the background, as the company claims to be ‘safeguarding the environment’ – all while DuPont was the single largest corporate polluter in the United States.

To give a more recent example (and my personal favourite) we have H&M, a popular Swedish fast fashion brand. The company launched its ‘H&M Conscious’ campaign in 2010 and it has continued to develop ever since. Some of its products now sport a green tag to signal that they are sustainably sourced. To qualify for this special tag, the garment must contain at least 50% sustainable material, such as organic cotton or recycled polyester; an exception is made for recycled cotton, of which only 20% must be used, due to quality constraints. H&M also has a textile collection programme in many of its stores: if you donate clothes to the collection bin you are eligible for a £5 voucher on your next purchase of £25 or more. All of this makes H&M look like the posterchild of fast fashion making a positive environmental impact. But the company makes these claims whilst maintaining 52 micro-seasons per year, perpetuating the cycle of fast fashion. Most of the clothes it takes in for donation are not recycled but instead sent to developing countries without the infrastructure to deal with toxic waste. The £5 voucher merely encourages more consumerism and more wastage of fabrics and other resources. Its green tag clothing has a fairly low baseline, but since few consumers are going to look in detail at the material of their clothing, it serves as a successful flag for do-good shoppers. I see H&M not as the posterchild of eco-conscious fast fashion, but as the posterchild of greenwashing.

Lastly, we have the ethical greenwashers. Fiji Water sells expensive bottled water in Instagram-friendly packaging. One particular ad campaign used the voice of a young girl saying “Fiji water is gift from nature to us, to repay our gift of leaving it completely alone. Bottled at the source, untouched by man. It’s Earth’s finest water”. Beautiful choral music plays in the background as images of grand green mountains and lush forests pan across the screen. These claims are made despite plastics taking many hundreds of years to degrade, and they ignore the carbon emissions of shipping water from Fiji across the world. Adding to this, the WHO states that 47% of Fijians don’t have access to clean, safe drinking water. The brand’s story appeals to its customers’ moral conscience, allowing – nay, encouraging – them to buy bottled water. Yet Fiji Water does not show the consumer the devastating lack of access to drinking water across the Fijian islands.

And why would they? It would hardly be an effective business model to say to your customers: ‘buy this clothing – even though you are effectively burning through the Amazon by doing so’ or ‘buy our water – even though most of Fiji can’t drink it’. Greenwashing works. By the early 1990s, polling showed that companies’ environmental records influenced the majority of consumer purchases. A 2015 Nielsen poll showed that 66% of global consumers were willing to pay more for environmentally sustainable products. Amongst millennials, the percentage rose to 72%.

Making changes to reduce the environmental impact of consumer products is always a good thing – even small changes. I support H&M’s campaign to use more recycled fabrics, and I believe that if DuPont’s double-hulled oil tankers truly reduced environmental damage on a continuing basis, then the company should absolutely go ahead with the idea.

What becomes dangerous, however, is when these companies take to social media feeds and billboards to boast their incredible environmental achievements. It is dangerous because it only encourages the consumer to purchase more of the damaging product, offsetting any improvements they may have made. It is also misleading to the busy customer who does not have the time or resources to look into every environmental claim a company makes. Innumerable people have willingly spent more on what they assume to be eco-friendly products, when their claims could be entirely baseless.

Greenwashing often comes with noble intentions, but the consequences may not always be noble. As a consumer, it is important to remain wary of a company’s potential motives when you hear eco-jargon. It is even more important for companies to hold themselves accountable for the claims they make about their environmental impact. Greenwashing is a dangerous habit, but it can easily be defeated with transparency and a little research.

 


Sources:

https://www.theguardian.com/sustainable-business/2016/aug/20/greenwashing-environmentalism-lies-companies

https://www.goingzerowaste.com/blog/how-to-tell-if-youre-being-greenwashed

Greenwashing: A Fiji Water Story

To what extent are imperialism and the cultural narrative of the ‘leave’ campaign linked?

Annabel (Year 13) looks at the impact of the British imperial history on the evolving relationship between the UK and the EU.

The Leave Campaign’s bus - From TheTimes.co.uk

There is an argument to suggest that Euroscepticism, which has been a major part of our political narrative since the 1960s, has an imperialist undertone to it; as decolonisation came to a close, Euroscepticism rose up in its place. There is certainly room for this argument in today’s political climate as similarities can be drawn between the two ideas from an ideological point of view. Nevertheless, the ‘Leave’ campaign has a greater level of complexity to it than merely an overwhelming desire to return to days of imperialist superiority in the 19th century.

Firstly, British imperialism is an incredibly complex area of interest, and the reasoning for empire building changed dramatically from the initial stages of adventure and exploration to the Empire’s largest extent in 1919, when it added 1.8 million square miles and 13 million subjects to its existing territory under the Treaty of Versailles.[1] The notion that British imperialism can be associated with a single motivation throughout the entire existence of the Empire is simply too simplistic. How then can we link imperialism to the motivations behind the ‘Leave’ campaign?

There are some commonalities throughout the British Empire’s existence that can be found and therefore associated (or not as the case may be) with the Eurosceptic narrative. Without question there are consistent undertones of British superiority throughout the time as metropole in one of the largest empires in history. Colonialism was associated initially with a desire to explore, and then claim, foreign lands. Humanitarian justification, through Social Darwinism and then increasingly through a motivation to decolonise, was an important aspect of imperialism. Above all, the competition between European neighbours, also imperialist powers at the time, was a key aspect of the British Empire and this is where the possible connection to Euroscepticism can be found.

The British Empire in 1919 – From WashingtonPost.com

The British relationship with the EU has been complex from the outset, and whether membership should be granted to the UK was heavily debated throughout the 1960s. Britain’s desire to have a special relationship with the EEC, due to Commonwealth trade, meant it was rejected by the EEC twice in the 1960s. The French president at the time, Charles de Gaulle, determined that the British had a “deep-seated hostility” to any European project.[2] The hostility that de Gaulle mentions could reference the peripheral location of Britain and its historical competitiveness with European nations which, as previously mentioned, was a key aspect of British, and indeed European, imperialism. There is therefore arguably a compatibility between a reluctance to be part of the EU and the anti-European narrative of the British Empire.

 

The “deep-seated hostility” that De Gaulle mentioned could suggest that there is perhaps an unconscious bias of the British population against any collaborative effort amongst European countries. Bernard Porter argues in his work The Absent-Minded Imperialists that the British population was largely unaware of the impact of Empire on British society and held a more subconscious affiliation with its principles as opposed to a direct support of the motivations.[3] There are two possible consequences of his argument in relation to the EU, that the imperialist subconscious merely drifted away from the British cultural narrative, or that there remains a subconscious affiliation with the principles of British isolationism and European competition in the British population. It is undoubtedly difficult to pinpoint which one it is, but it is nonetheless interesting to consider how far the European relationship has been impacted by the British Empire.

From VoteWatch.EU

The British relationship with the EU has always been complex; Britain was not one of the 11 countries to join the Eurozone in 1999 and only joined the EEC in 1973, long after the ECSC was formed in 1951 to prevent Franco-German conflict. The economic narrative of the EU was a key one in the ‘Leave’ campaign, as seen on the bus above, but arguably the campaign was much more about cultural identity than the economic relationship between the UK and the EU. The tones of placing internal British priorities above those of regionalist policies in the EU could be seen to hold an aspect of British isolationism, which was a key pillar of British imperialism in competition with other imperial European powers.

Ultimately, while there are certainly correlating elements between the narratives of imperialism and that of the ‘Leave’ campaign, it is incredibly difficult to pin down how far it is a conscious decision. There is, perhaps, an “absent-minded” aspect to the narrative that has retained some of the colonial narratives present in the days where the Empire placed Britain as a leading world power. Therefore, the desire to return to a powerful place, as was the case of Britain as an imperial power, might have provided a sub-conscious motivation for the desire to break away from historically rival European countries.


 

Bibliography:

Murphy, R., Jefferies, J. and Gadsby, J., Global Politics for A Level, Phillip Allen Publishing, 2017

Porter, B. The Absent-Minded Imperialists, Oxford University Press, 2006

Porter, B. The Lion’s Share: A History of British Imperialism 1850-2011, Routledge Publishing 2012

[1] Ferguson, Niall (2004b). Empire: The Rise and Demise of the British World Order and the Lessons for Global Power. Basic Books. ISBN 978-0-465-02329-5.

[2] “1967: De Gaulle says ‘non’ to Britain – again”. BBC News. 27 November 1976. Retrieved 9 March 2016.

[3] The Absent-Minded Imperialists, Bernard Porter (2004)

Immunology: a brief history of vaccines

Sienna (Year 11) looks at the history of immunisation, from variolation to vaccination, exploring some of the topics around this important science.

History of Immunisation:

Variolation: 

While vaccination is considered quite a modern medical procedure, it has its roots in more ancient history. In China there are records of a procedure to combat smallpox as early as the year 1000. This was called variolation: pus was taken from a patient with a mild case of smallpox and given to another person, who would usually develop a less dangerous form of the disease, prompting an immune response that protected them thereafter. The method became established around the world, being practised in Africa, Turkey and England in the 1700s, and laid the foundations for the work of Edward Jenner, who is considered the ‘father of vaccination’.

Later in the 1700s, the USA learned of variolation from slaves who arrived from Africa already inoculated. Though a remarkable feat for the time, it wasn’t without risk: immunity was achieved through direct exposure to the virus, so patients could still die from it – as happened with King George III’s son and countless slaves. However, the risk of dying from variolation was far smaller than the risk of catching and dying from smallpox, so the procedure remained popular despite the danger.

 

Origin of the first widely accepted vaccination: 

Vaccination, as we know it in modern terms, was first established in 1796 by Edward Jenner, a scientist and fellow of the Royal Society in London. Seeing how much of a problem smallpox was at that time (and for most of history before it), Jenner was interested in improving on the process of variolation to tackle the disease.

He was inspired by something he had heard as a child from a dairymaid: “I shall never have smallpox for I have had cowpox. I shall never have an ugly pockmarked face.” This inspired him later in life to carry out an experiment in which he inoculated an eight-year-old boy with cowpox. He recorded that the boy felt slightly ill for around 10 days after the procedure, but was afterwards completely fine. When the boy was injected with active smallpox material a few months later, he showed no symptoms of the disease, and Jenner concluded his experiment had been a success.

After writing up his findings, Jenner decided to name the new procedure vaccination as the Latin for cowpox is ‘vaccinia’. His paper was met with a mixed reaction from the medical community. Despite this, vaccination began gaining popularity due to the activity of other doctors such as Henry Cline, a surgeon whom Jenner had talked closely with.

Due to the success of the procedure, especially compared to variolation, by the turn of the century (just a few short years after Jenner ran his experiment) vaccination could be found in almost all of Europe, and was particularly concentrated in England. The success of Jenner’s work is outstanding: by 1840 vaccination had replaced variolation as the main weapon against smallpox, so much so that variolation was prohibited by an Act of Parliament. The disease that had ripped so mercilessly through the world for centuries was finally declared eradicated in 1977 by the World Health Organisation (WHO) – perhaps more than Jenner, who died in 1823, could ever have hoped his discovery would achieve.

Edward Jenner: 

Image via Pexels

As well as being an undeniable force for good in the world, Jenner was a remarkable person on a smaller scale too. Despite low supplies at times, he would send his inoculation to anyone who asked for it – medical associates, friends and family, even strangers. Later in his life, he even set up his ‘Temple of Vaccinia’ in his garden, where he vaccinated the poor free of charge. Despite the opportunity, Jenner made no attempt to profit from his work, viewing his invention instead as a contribution to science and to humanity – and this was perhaps vital to the speed at which the vaccine and the vaccination process spread.

Modern Vaccinations: 

Nowadays vaccinations have changed – not in principle but in the nitty-gritty science – as we have come to understand more about how our immune system works. Jenner’s inoculant has been adapted to suit different diseases, with vaccines containing either a very mild strain of a virus with similar spike proteins, a dead strain of the virus, or even the isolated spike protein itself, enabling the body to recognise the pathogen without being exposed to its dangers.

Introducing the body to the same spike proteins found on the harmful pathogen is in essence how vaccination works. The body recognises these spike proteins as foreign and so sends phagocytes (a type of white blood cell) to destroy them, and lymphocytes to create antibodies and activate an immune response. This is why, a few days after vaccination, there may be some discomfort or a slight fever – the body is fighting off those spike proteins.

While the spike proteins are being destroyed, the body creates memory cells. These are the most important part of the vaccination procedure: if the body is exposed to the actual, more dangerous pathogen in the future, the memory cells will recognise its spike protein and trigger a secondary immune response, producing antibodies sooner and in much greater quantity. Secondary immune responses are far more effective, and often the person will never show any symptoms of the disease, with the pathogens destroyed within a matter of days.

Viral Vector Vaccines:

These are an example of the exciting advances in vaccination. In this type of vaccine, such as the COVID-19 vaccine developed in the UK by Oxford University, DNA from the actual virus is inserted into an adenovirus (a deactivated virus that acts as a carrier into our bodies), causing the antigens of the actual virus to be displayed on the adenovirus. These can then trigger a strong immune response without the actual virus ever entering the body. This is an effective way to ensure memory cells against the virus are created, and it contributes to the Oxford vaccine’s high reported efficacy.

mRNA Vaccines:

The other exciting new adaptation is the mRNA vaccine, used in some of the COVID-19 vaccines. The mRNA is essentially a set of instructions for the body to make the spike protein of the pathogen itself, rather than the protein being cultivated in a laboratory and then put into the vaccine; after that, the response is exactly the same. This allows the vaccine to be produced more quickly and to be more effective. However, due to its newer and more complicated nature, it is more expensive to produce and must be stored at very low temperatures because of mRNA’s unstable nature. This causes logistical issues with storage and distribution, and is why the DNA-based vaccine has been hailed as the best option for low-income developing countries that do not have the facilities to store mRNA vaccines; DNA vaccines can be stored at fridge temperature, as DNA is far more stable than mRNA thanks to its double-helix structure. This novel type of vaccine was developed by two Turkish immigrants living in Germany who, like Jenner, thought outside the box to improve human health in the race against time to find an effective vaccine. They have been enormously successful, with the mRNA vaccine displaying 95% effectiveness against COVID-19 from seven days after the second dose is administered.

Image via Pexels

Controversies of vaccinations:

During this pandemic, there has been widespread appreciation of how vital vaccines will be in controlling the spread of COVID-19. However, the voices of sceptics, often amplified by social media, have found a more prominent platform for their opinions. They distrust vaccination for a variety of unfounded reasons. One is the claim that vaccinations are really a way for the government to implant chips into its citizens; not only does this theory ignore the historic science of vaccination, but logistically the needle would need to be far wider and the resulting puncture wound far more noticeable.

The autism study:

Unfortunately, even though a 1998 article by Andrew Wakefield was quickly shown to be based on unfounded evidence, it continues to resurface among sceptics as an argument against vaccines, falsely claiming a link between autism and the MMR vaccine. Wakefield not only used just 12 children to test his hypothesis – far too small a group to draw any reliable conclusion – but was also struck off the UK medical register over the paper. His study was retracted, and his hypothesis has been rejected by the medical community through subsequent research and publication. The amplification of this fraudulent study has been cited as a reason for a decline in uptake of the MMR vaccination and the subsequent small outbreaks of measles.

Development of COVID-19 vaccines:

For some, the speed with which the COVID-19 vaccines have been developed – under a year, compared to more standard research times of up to a decade – is cause for scepticism.

However, this is not because corners have been cut; rather, it is due to the immense funding and equipment given to scientists, as well as the sheer number of people working on the vaccines, to prioritise their development. Phase I, II and III human trials are used and assessed extensively for how the vaccine performs across a diverse range of age groups, races, body types and pre-existing health conditions, as well as to measure accurately the body’s exact immune response – the antibodies and cells produced – and the efficacy and safety of the drug. This is then reviewed again by the regulatory bodies: the Medicines and Healthcare products Regulatory Agency for the UK, the European Medicines Agency for the EU and the Centers for Disease Control and Prevention for the USA.

The World Health Organisation listed ‘vaccine hesitancy’ as one of the top ten threats to global health in 2019, and it will play a crucial role in how quickly life can return to normal following the COVID-19 pandemic. Vaccinations are humanity’s biggest weapon against the pandemic; they are, in the words of Sir David Attenborough, ‘a great triumph of medicine’. Although there has been recent news about mutations of the virus, it is important to remember that this is completely to be expected. The recent talk of the South African, UK and Brazilian variants has been due to small changes in the spike protein of the virus which have affected its transmissibility. Tests are currently being run, but early signs show that the vaccines are still effective against these mutations.

Even in the worst-case scenario, the vaccines can be adapted in a matter of weeks or months, and the government is preparing for a situation in which a COVID-19 vaccine has to be given annually to those at high risk, similar to the current flu vaccine. It comes as a relief that finally, in the wake of such a disruptive and terrible pandemic, there is light at the end of the tunnel and a reason to look forward to better days, knowing that every day more people are receiving these game-changing vaccinations.


Sources:

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1200696/

https://www.historyofvaccines.org/timeline/all

https://www.britannica.com/science/variolation

https://www.nhs.uk/news/medication/no-link-between-mmr-and-autism-major-study-finds/

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4944327/

https://www.bmj.com/content/371/bmj.m4826

https://www.independent.co.uk/news/uk/home-news/david-attenborough-anti-vax-ignorance-covid-b1797083.html

https://www.nature.com/articles/d41586-020-02989-9

Is geothermal energy the answer to our climate problems?

Lucy in Year 10 looks at issues surrounding climate change and the damage our current ways of living are having on the planet. Might geothermal energy offer the UK, and the world, a solution for us to clean up our act?

We are in the midst of a climate crisis; the UK government has recently made a commitment to achieve net zero emissions by 2050 to help stop further damage to the environment. The burning of fossil fuels to generate power is a significant contributor to the UK’s greenhouse gas emissions, so the use of renewable energy sources is critically important to meeting this commitment to achieve net zero emissions. There are already many established sources of renewable energy, such as wind, solar and tidal power, but geothermal energy might be an unexpected solution to the UK’s problems.

Geothermal energy: a solution to a cleaner future?
Picture from https://www.britannica.com/science/geothermal-energy

Geothermal energy uses the natural heat from within the Earth’s crust to heat water and create steam. This steam then powers a turbine in much the same way as a fossil-fuel power station, with the key difference that the heat comes from the Earth rather than from burning coal, oil or gas. So, like other forms of renewable energy, geothermal energy produces far less CO2 than fossil fuels do.

The key advantage geothermal energy offers over many other forms of renewable energy is consistency. Solar cells and wind turbines rely on climate and weather conditions to operate, which means that the amount of energy they produce varies and can be unreliable. Geothermal energy doesn’t have that problem: a geothermal plant delivers a steady output regardless of the weather. The problems caused by inconsistent energy provision have already been seen; only weeks after setting a new wind power generation record, a breezeless day in January 2021 resulted in a shift back to fossil-fuelled power and a tenfold surge in spot energy prices.[1]
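To put the consistency point into numbers, here is a rough sketch comparing the annual energy delivered by two plants of the same nameplate capacity. The capacity factors used (around 30% for onshore wind, around 85% for geothermal) are assumed, illustrative figures, not measurements from any specific plant:

```python
# Illustrative comparison: same nameplate capacity, different capacity factors.
# The 10 MW capacity and the capacity factors are assumed figures for illustration.

HOURS_PER_YEAR = 8760

def annual_output_mwh(capacity_mw, capacity_factor):
    """Energy delivered in a year = capacity x hours x capacity factor."""
    return capacity_mw * HOURS_PER_YEAR * capacity_factor

wind_mwh = annual_output_mwh(10, 0.30)        # wind: output rises and falls with the weather
geothermal_mwh = annual_output_mwh(10, 0.85)  # geothermal: runs near-constantly

print(f"Wind (10 MW, CF 0.30):       {wind_mwh:,.0f} MWh/year")
print(f"Geothermal (10 MW, CF 0.85): {geothermal_mwh:,.0f} MWh/year")
```

On these assumed figures, the geothermal plant delivers nearly three times the annual energy of a wind farm with the same headline capacity, which is why consistency matters as much as peak output.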

Geothermal energy is currently in the news due to a recent announcement to build the first ever geothermal plant in the UK, in Porthtowan, Cornwall. It will produce enough energy to power 10,000 homes.[2] So, why don’t we build them everywhere?[3]
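As a back-of-envelope check of what “powering 10,000 homes” implies, we can convert that figure into an average power output. The household consumption figure below (roughly 3,700 kWh of electricity per year for a typical UK home) is an assumed, typical value; actual usage varies widely:

```python
# Rough sanity check of "enough energy to power 10,000 homes".
# 3,700 kWh/year per home is an assumed typical UK figure, for illustration only.

HOMES = 10_000
KWH_PER_HOME_PER_YEAR = 3_700
HOURS_PER_YEAR = 8760

annual_demand_gwh = HOMES * KWH_PER_HOME_PER_YEAR / 1e6               # GWh per year
average_power_mw = HOMES * KWH_PER_HOME_PER_YEAR / HOURS_PER_YEAR / 1000  # MW, averaged over the year

print(f"Annual demand: {annual_demand_gwh:.0f} GWh")
print(f"Average power: {average_power_mw:.1f} MW")
```

This works out to an average output of only a few megawatts, which is modest next to a large fossil-fuel station but a useful, steady contribution from a single site.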

While geothermal energy does have significant benefits, it also comes with its own set of problems. The most prominent of these is that very specific characteristics of the Earth’s crust are needed to superheat the steam and power the turbines. Unlike Iceland, which sits on a tectonic plate boundary, the UK has few such locations, and some will unfortunately be in populous areas, where the negative aesthetics of a power station would outweigh its benefits. Another worrying fact about geothermal plants is that their construction, and the drilling of geothermal wells into the Earth’s surface, have caused several earthquakes over the past decade (for example, the magnitude-5.5 earthquake in Pohang, South Korea, in 2017). While this is less of a risk for the geologically more stable UK, it is still a factor to be considered. I would hasten to add that this risk is smaller than that of CO2 emissions from fossil fuels or the toxic clean-up of a nuclear power station!

While geothermal plants are undoubtedly an effective and positive use of the Earth’s natural resources to create a sustainable and consistent supply of energy, the problems that their construction and capabilities raise mean that they could never become the sole provider of the UK’s energy. However, their existence and use could greatly aid the UK in our battle against greenhouse gases. Geothermal energy cannot solve the climate problem alone, but it should definitely be part of the UK’s, and the world’s, response to the threat that is the climate crisis.



REFERENCES

[1] https://www.thetimes.co.uk/article/the-energy-answer-is-not-blowin-in-the-wind-xbntdm6pv

[2] https://www.newscientist.com/article/mg24032000-300-supercharged-geothermal-energy-could-power-the-planet/

[3] Check out https://cornishstuff.com/2019/09/11/successful-drilling-at-uks-first-deep-geothermal-plant-in-cornwall/ to see the new Geothermal Plant take shape