Does Great Britain need to move on from the Second World War?

Rosie, Year 11, shares her recent WimTalk with us, discussing issues surrounding the way Britain remembers its past to shape its future.

September 2nd, 1945, Tokyo Bay. On the deck of the American battleship USS Missouri, the Japanese Instrument of Surrender was signed by representatives of Japan, the United States, China, the United Kingdom, the USSR, Australia, Canada, France, the Netherlands and New Zealand. World War Two was officially over. The ceremony aboard USS Missouri lasted 23 minutes, and yet the impact of what it represented rings on today, almost 75 years later.

Now, in 2020, Great Britain has not moved on from the Second World War – far from it. Everywhere in Britain, wartime memorials and museums can be found, remembering the half a million soldiers and civilians who lost their lives. Most British people have a relative who fought in or experienced the war, and there are few who would not recognise the phrase ‘We shall fight on the beaches’ from Churchill’s most famous speech. And this prominent remembrance is not confined to the older generations: it is an integral part of every child’s education too. Hundreds of books, TV programmes, podcasts and films have documented the war with great success – even recently. The modern economy, too, remembers the war, with Britain making the final war loan payment to the United States only 14 years ago, in 2006. Overall, the memory of the Allied victory in the Second World War – “our Finest Hour” – inspires the national sense of pride in our military history that has become a rather defining British characteristic.

But the question is: why does Great Britain cling on to the Second World War more than any other nation involved? And is this fixation justified, or is it time to move on?

One perspective is that the British view of the Second World War is bound to be different because of geography. The triumph of a physically small island nation prevailing in war is something we can celebrate and take pride in. For other nations involved – larger landlocked countries with shifting borders – this is less easy. Germans today, for example, are less inclined to look back, not only because of the radical changes in society since the Third Reich or the lack of a victory to celebrate, but also because modern Germany is physically different to the earlier Germany of the Kaisers, Weimar, Hitler and the divided states of the Cold War. Instead, Germany today looks forward, not backwards, which some would argue has allowed it to become the economic giant on the world stage that it now is.

And that’s another thing – how much has Britain changed since the Second World War? Of course, it has modernised along with the rest of the world: politically, economically and physically. But so many of the institutions present in 1939 remain. Our democratic government, our monarchy, our military and our traditions have survived the test of worldwide conflict twice in one century, the collapse of the British Empire and the Cold War, in a way that those of France, Spain and Italy have not.


The Second World War was a clear clash of good vs bad – peace vs aggression. Britain was not directly attacked by Hitler but stepped up to honour a promise to defend Poland against invasion for the greater good. Remembering the Second World War makes Britain proud of these national values: had Chamberlain not roused himself from his policy of appeasement and committed Britain to the sacrifice of money, empire and life, and had Churchill not fortified the nation’s most important alliance with Roosevelt, the world would certainly be a very different place today. And so, if a nation’s psyche comes from the values and institutions it possesses that have stood up throughout history, is it really any wonder Brits take pride in looking back?

On the other hand, perhaps after so many years it’s time to recognise that we are not, in fact, the same Britain that we were in 1945. In 1944, the British economist John Maynard Keynes spoke at the famous Bretton Woods conference. He said that the Allies had proven they could fight together, and now it was time to show they could also live together. In achieving this, a genuine ‘brotherhood of man’ would be within reach. At this conference, the IMF and World Bank were created, soon followed by the UN, to promote peace and prevent the kind of economic shocks that led to war in the first place. But at the same time, these organisations were a convenient way for the main Allied powers to solidify their power and privileges. Since then, a European has always headed the IMF, and an American the World Bank. The UN Security Council is dominated by its five permanent members, whose privileged position, some say, is nothing but a throwback to the distribution of power on the world stage in 1945. By clinging on to the war, are we really clinging on to the idea that Britain is still a leading power, and that modern economic giants such as Germany and Japan do not deserve to disrupt the power structure of 1945? We pour so much money into Britain’s defence budget to maintain this powerful status – into remembered threats and sometimes archaic strategies: submarine warfare, aerial dogfights and manned bombers. The Second World War was certainly a catalyst for change across the globe. Perhaps now, Britain’s inability to let go of these old power ideals and designated roles of nations prevents us from achieving the ‘brotherhood of man’ that, in 1944, Keynes dared to dream of.

We are told that the value of history is to ‘learn a lesson’ to prevent us from repeating the same mistakes again. But there is an argument to say that this concept is a consistent failure. So many conflicts around the world seem to be caused by too much remembering: refreshing tribal feuds, religious division, border conflicts, expulsions and humiliations. Doesn’t remembering cause Sunni to fight Shia or Hindu to fight Muslim? Is it memory that maintains dispute in the Balkans, the Levant, Mesopotamia? Perhaps the emotion sparked by remembering the details of our past is better left in history when it has the capability to spark aggression, conspiracy theories and irrational anger. Today’s politics of identity seem provocative enough without being fuelled by history, so perhaps we should heed Jorge Luis Borges who wrote: ‘The only vengeance and the only forgiveness is in forgetting’. This advice has been proven to work over time – Nelson Mandela’s philosophy in 1990s South Africa was to focus on ‘truth and reconciliation’ and draw a line under his country’s recent history – closure. Can Britain not find closure on the 20th century?

What I can conclude is that there are two perspectives to take on this question: there are some who hold on to our history as a lesson for the future, as a reminder of the importance of peace and of action for the greater good, and who will never be able to forget the Second World War because of the core British values that it represents. And then there are those who think it is time to let go of the past and adapt our nation’s values to suit our current position in the quickly changing world that we live in. And so, the only question I have left to ask is: which are you?

Did the Great Depression influence the response to the 2008 Financial Crisis?

Lauren, Year 13, discusses whether the Great Depression influenced the response to the 2008 Financial Crisis.

During the Great Depression, wages were cut for workers, which led to a reduction in demand. This resulted in the bankruptcy of thousands as the stock market went into free fall after the Wall Street Crash. Between 1929 and 1932 more than 100,000 businesses went bankrupt, and around 11,000 banks stopped trading. When these banks shut down, savers lost all of their money and could no longer buy consumer goods. This reduction in demand resulted in the redundancy of many workers, ultimately creating a further decline in the level of aggregate demand. Thus, the economy entered a downward spiral.

President Hoover interpreted the Depression as a normal business downturn rather than a deep, systemic crisis. Consequently, when an attempt to take action was made, it came too late. Following the Great Depression, regulations were altered and economic policies restructured all across the world. The economic system was redesigned to avoid a repeat of this disaster, and levels of government spending were increased.

After the Great Depression, it was often assumed that there would not be another economic downturn of such major proportions: it was believed that the lessons learned then could be applied to any future crisis, protecting the future from such economic turmoil. However, perhaps those lessons were not enough to fend off the Financial Crisis of 2008, by which time non-bank institutions – far less regulated than commercial banks where lending was concerned – had risen to prominence.

Several measures were put into place in order to alleviate the effects of the crisis. In the USA, the Federal Reserve extended emergency loans, and the US even tried a Keynesian fiscal stimulus in early 2008 to ‘jump-start’ the economy, but this was not successful enough because the stimulus was too small, at only about 1% of GDP.
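
To put that figure in perspective, here is a back-of-envelope sketch (assuming, for illustration, US GDP of roughly $14.7 trillion in 2008 and a stimulus package of roughly $150 billion, the approximate size of the Economic Stimulus Act of 2008):

$$ \frac{\$150 \text{ billion}}{\$14{,}700 \text{ billion}} \approx 0.01 = 1\% \text{ of GDP} $$

An injection of this size was marginal relative to the economy as a whole, which is consistent with the argument that the stimulus was too small to reverse the downturn.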

A matter of interest to many economists is how the crisis was dealt with in the UK under the Labour government, which continued to spend significantly in the immediate aftermath of the Financial Crisis. This helped to ease the initial impact by softening the economic downturn, but the rate at which the national debt was shooting up was dangerous.


The coalition government slashed public spending after 2010, damaging public services and holding back economic recovery after the crisis. Although it would have been wrong to ignore the huge government deficit inherited from Labour, it could be suggested that Osborne should not have cut spending on infrastructure and capital to such a degree, as this spending would have allowed the UK to invest to boost productivity.

In conclusion, had the enormous intervention by governments not happened, the impact of the Financial Crisis would have been significantly greater. It can therefore be argued that the Great Depression did, to some extent, influence the response to the 2008 Financial Crisis, as it persuaded governments to intervene quickly and at great expense in order to avoid a repeat of the events of the 1930s. However, these policies were arguably not as successful as envisaged, due to the complexity of new financial instruments.

GROW 2.0 – Being Human in an AI World

On Saturday 21st September we host our second Grow Pastoral Festival. The theme for this year is an examination of what it is to be human in a machine age. What questions should we be asking about the way technology affects our lives and what are our hopes for the future? More specifically, how will our young people develop and grow in a fast-paced, algorithmically driven society and what might education look like in the future?

 
In the morning session Professor Rose Luckin and Professor Robert Plomin will give keynote addresses and then talk with our Director of Digital Learning & Innovation, Rachel Evans.
Prof Luckin specialises in how AI might change education; Prof Plomin has recently published Blueprint, a fascinating read about genetics and education. We can’t wait to talk about how education might get personalised, and how that change might affect our experience of learning.

In the afternoon we’ll dive into some provocative debate with Natasha Devon, Hannah Lownsbrough and Andrew Doyle, addressing questions of identity, wellbeing and community in an online age with our own Assistant Head Pastoral, Ben Turner.

So what kind of questions are in our minds as we approach this intellectually stimulating event? Ben Turner brings a philosophical approach to the topic.


Is our ever-increasing reliance on machines and subscription to the ‘universal principles of technology’[1] eroding our sense of empathy, compassion, truth-telling and responsibility?



Our smartphones give us a constant connection to an echo-system that reflects, and continuously reinforces, our individual beliefs and values. Technology has created a world of correlation without causation, where we understand what happened and how it happened but never stop to ask why it happened. Teenagers are understandably susceptible to an ecosystem of continuous connection, urgency and instant gratification. It is these values that they now use to access their world and that tell them what is important in it.

Are tech giants like Amazon, Google and Facebook creating a monoculture that lacks empathy for its surroundings? If we all become ‘insiders’ within a technology-dominated society, pushing instant buttons for everything from batteries to toilet roll, are we losing the ability to see things from a fresh perspective? By raising children in a world of instant access and metropolitan monism, are we creating only insiders: young people who will never gain the ability to step back and view what has been created in a detached way? How, as parents, schools and communities, do we keep what is unique while embracing the virtues of technological innovation?

Is social media destroying our free will?

If you are not a determinist, you might agree that free will has to involve some degree of creativity and unpredictability in how you respond to the world. That your future might be more than your past. That you might grow, you might change, you might discover. The antithesis of that is when your reactions to the world are locked into a pattern that, by design, makes you more predictable – for the benefit of someone or something else. Behaviourism, developed in the early 20th century, is built on collecting data on every action of a subject in order to change something about their experience, often using punishment or reward to enact the change. Is social media, through its algorithms, gratification systems and FOMO, manipulating our actions and eroding our free will?

Social media is pervasive in its influence on the beliefs, desires and temperaments of our teenagers, and you do not have to be a determinist to see that this will lead to a disproportionate level of control over their actions. Does social media leave our young people with no alternative possibilities: locked in a room, not wanting to leave, but ignorant of the fact that they cannot?

Is social media the new opium of the masses?

Social media has changed the meaning of life for the next generation. The shift in human contact from physical interactions to arguably superficial exchanges online is having a well-documented detrimental effect not only on individual young people but also on the very fabric and makeup of our communities.

In addition to the ongoing concerns about privacy, electoral influence and online abuse, it is becoming increasingly obvious that social media has all the qualities of an addictive drug. Psychologists Daria Kuss and Mark Griffiths wrote a paper finding that the “negative correlates of (social media) usage include the decrease in real life social community participation and academic achievement, as well as relationship problems, each of which may be indicative of potential addiction.”[2]

That is not to say that everyone who uses social media is addicted. However, the implications of ‘heavy’ social media usage by young people increasingly paint an unpleasant picture. Analysis of the UK Millennium Cohort Study by researchers at the University of Glasgow found that 28% of girls between 13 and 15 surveyed spent five hours or more on social media, double the number of boys surveyed who admitted the same level of usage. Moreover, NHS Digital’s survey of the mental health of children and young people in England[3] found that 11 to 19 year olds with a “mental disorder” were more likely to use social media every day (87.3%) than those without a disorder (77%), and were more likely to be on social media for longer. Rates of daily usage also varied by type of disorder: 90.4% of those with emotional disorders, for example, used social media daily.

Panel Discussion

However, there is more to this than just the causal link between the use and abuse of social media and poor mental health. With the march of technology in an increasingly secular world, are we losing our sense of something greater than ourselves? Anthony Seldon calls this the “Fourth Education Revolution”, but as we embrace the advances and wonders of a technologically advanced world, do we need to be more mindful of what we leave behind? Da Vinci, Michelangelo and other Renaissance masters not only worked alongside religion but were also inspired by it. Conversely, Marx believed religion to be the opium of the people. If social media is not to be the new opium, we must find a place for spirituality in our secular age. Even if we are not convinced by a faith, embracing the virtues of a religious upbringing – namely inclusivity, compassion and community – seems pertinent in these turbulent times; because if we do not, then very quickly the narcissistic immediacy and addictive nature of social media will fill the void left in our young people’s lives, becoming the addictive drug that Marx forewarned against.


References:

[1] Michael Bugeja, Living Media Ethics: Across Platforms, 2nd ed., 2018

[2] Daria J. Kuss and Mark D. Griffiths, ‘Online Social Networking and Addiction – A Review of the Psychological Literature’, US National Library of Medicine, 2011

[3] NHS Digital, Mental Health of Children and Young People in England, November 2018

China: Should we be worried?

Sofia, Year 13, discusses whether the increasing power of China is something that should be concerning the global community.

China is increasingly becoming a hot topic amongst economists as we see the developing influence it has on the western world. We are seeing a new form of colonialism – neo-colonialism – whereby China, as the second largest economy in the world, holds significant power over other countries. One would expect this to extend only over lower income countries; however, China is even beginning to shape the West’s markets and economies, and even to exert political control.

It is evident that many African countries increasingly depend on China as a trading partner: China-Africa trade was worth $10.5 billion in 2000, $40 billion in 2005 and $166 billion in 2011. China is currently Africa’s largest trading partner, having surpassed the US in 2009. However, dependency on China extends more deeply than trade. China has been providing many African countries with loans in the form of top-down development projects. One example is a $3.2 billion railway in Kenya, running 300 miles from Nairobi to Mombasa – a journey faster than a train trip over the equivalent distance from Philadelphia to Boston. China has also built a $526 million dam in Guinea and a $475 million light rail system in Ethiopia, the first of its kind in sub-Saharan Africa. These infrastructure projects are effectively loans, but extremely risky ones, with low or no interest, and often much of the money is never fully paid back. This suggests that China is not investing in these projects for economic benefit, but to gain leverage over a country – political leverage it can deploy in votes at the UN, such as those involving the governance of Taiwan or China’s allies such as North Korea.

In the most recent vote involving condemnation of North Korea, only 12 out of the 54 countries in Africa voted against China’s ally. It has also been found that if a country recognises Taiwan (which China claims as part of its territory) as a country in its own right, it receives 2.7 fewer Chinese infrastructure loans a year. Furthermore, if an African country votes overwhelmingly along with China in the UN General Assembly, it receives 1.8 more infrastructure projects a year. This shows that China increasingly controls these vulnerable countries’ economies as well as their political views.
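
Read as a rough rule of thumb, the two findings above can be combined into a single illustrative relationship (a sketch of the reported effects, not the underlying study’s actual model – note that the first coefficient was reported for loans and the second for projects):

$$ \text{expected Chinese-financed projects per year} \;\approx\; \beta_0 \;-\; 2.7 \times \mathbb{1}[\text{recognises Taiwan}] \;+\; 1.8 \times \mathbb{1}[\text{votes with China at the UN}] $$

where $\beta_0$ is a country’s baseline level of Chinese financing and each indicator switches the corresponding penalty or reward on or off.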

However, this is not only the case in low-income countries such as those in Africa; in recent years we have seen China using a similar technique to gain more influence over Europe. China is the EU’s largest source of imports, accounting for 20.3% of them in 2015. China has also invested heavily in Europe, arguably for profit; however, some projects could also be for political influence, even though European economies are significantly larger than those in Africa. Greece and Hungary worked together to prevent Europe from condemning China over a tribunal’s ruling against its conduct in the South China Sea. China has also recently invested half a billion euros into the Greek port of Piraeus and the Belgrade–Budapest railway. China has also been seen to drive a wedge between the UK and the USA by decreasing trade between the two and siding with Europe on matters concerning climate change, and to exploit links with certain countries to make foreign policy hard in areas such as human rights.

It is clear China is having an increasing influence on countries everywhere, which is increasingly leading to the loss of democracy on the international stage. Countries should be wary of this growing influence and should decrease their dependency on the super-power.

Exploring Sri Lanka – 28/09/18


Serrena in Year 11 discusses the geography and struggles of Sri Lanka that she learnt about and witnessed when she was on a school trip there this summer.

Environmental Geography:

Sri Lanka is a teardrop-shaped island, located in the Indian Ocean with different climatic conditions across this small country. Its coastal areas are around 0–30m above sea level whilst its central highlands are 300–500m above sea level with the highest point in Sri Lanka, Pidurutalagala, at 2524m above sea level. These differing altitudes result in different climates: in the coastal areas there is hotter weather with more convectional rainfall, whereas the central highlands are cooler with more relief rainfall.  The cold, wet weather of the central highlands has resulted in an area called Nuwara Eliya being referred to as “Little England”.

Sri Lanka’s climate is also influenced by monsoons: the northeast monsoon (December to February), and the southwest monsoon (May to September). When the land heats up and low pressure is caused by the rising, hot air, cooler wind from the ocean is drawn in and brings with it heavy rainfall for the country. In Sri Lanka, the rivers naturally flow more towards the southern, wet zone but there has been human intervention to divert the flow of rivers towards the north of the country where there are dams.

The Sinharaja Rainforest:

The Sinharaja Forest Reserve is a national park and a biodiversity hotspot in the southwest of Sri Lanka. It has been designated a Biosphere Reserve and World Heritage Site by UNESCO. There are 211 species of woody trees and lianas so far identified in the reserve, 66% of which are endemic; 20 of Sri Lanka’s 26 endemic birds are found here, as well as half of Sri Lanka’s endemic mammals and butterflies.

Human Impacts:

Lots of plants are cut illegally – the agarwood plant is fragrant and is stolen for use in cosmetics and perfumes, venivel creepers are taken for medicinal uses and the rattan plant is stolen for furniture. Precious gems are also illegally extracted from the site for jewellery. Pesticides left by people trying to grow certain plants to steal them lead to bioaccumulation. Contractors open up routes to facilitate logging operations and, although no felling is permitted within 1.6km of the reserve boundary, this renders the reserve more accessible to illicit timber operations. The planting of Honduran mahogany Swietenia macrophylla along abandoned logging trails as an enrichment species leads to the displacement of natural species, especially as it is a prolific seed producer.

The future of the site:

There are concerns over the future of the conservation of the site and its biodiversity as, despite its status as a UNESCO World Heritage Site, many people are illegally damaging it. Additionally, if global warming leads to more erratic and shorter monsoons, the plants will receive less water for photosynthesis, which subsequently leads to less respiration and less growth.

Economic Geography:

Sri Lanka’s island status and ports allow it to have good trade relations with other countries. Its main exports are rubber, tea and coconut and its smaller exports are spices as well as minerals and gems.

Tourism:

Jobs in Sri Lanka are becoming increasingly based in the tertiary sector as the country develops. There is Chinese investment in Sri Lanka: the government has taken a loan from the Chinese government to build a new harbour, airport and highway, as well as a new artificial island in Colombo called Port City. This new artificial island will provide jobs for locals as well as bring in tourist revenue as new hotels are built.

A growing tourist industry in recent years has allowed the country to recover after its development was hindered by a civil war. However, despite this growth, upon visiting a hotel school on our trip and talking to the young men there who aspire to work in hotels, we discovered that the vast majority of them want to work abroad. This may be due to a desire to see how other hotel industries work, but Sri Lanka faces a larger issue of skilled workers leaving the country for higher salaries in the West.

1.2 million Sri Lankans work abroad and send money back home, and economic pressures can also force women to work abroad. Women in rural areas who struggle to support their families have far fewer opportunities for employment in Sri Lanka than in regions such as the Middle East; for example, there are 1.5 million Asian domestic workers in Saudi Arabia.

Social Geography:

Education:

Education is free in Sri Lanka and, as a result, the country has a 94% literacy rate. However, there are only university places for around 10% of students. This scarcity, and the competition for places in higher education it creates, leads many parents to feel forced to send their children to private tuition to give them the best chance possible of getting into university. The issue is exacerbated by the fact that school finishes at around lunchtime in Sri Lanka and children are unoccupied in the afternoon, a time when many school teachers run their own private tuition. In a country where the average wage is USD $12,768, this leaves bright children whose parents cannot afford tuition at a disadvantage compared to students whose parents have a larger disposable income. Parents who feel compelled to allocate part of their income to their child’s tuition face an opportunity cost: a better chance of university for their child, or the financial means to afford a better quality of life for their family.

Culture:

There are people of many different religions in Sri Lanka who peacefully co-exist. As religious studies is a compulsory subject up until 16 in the Sri Lankan schooling system, students are more tolerant and the religious tensions of the civil war are less likely to resurface.

Polwathatha Eco lodge:

This eco lodge encourages ecotourism:

  • It produces its own tea and coffee
  • It has a community produce section and employs many locals to give them a source of income
  • It collects polyethylene plastic and gives it to a facility in Digana where the plastic is recycled
  • It gives kitchen waste to the wild pigs, disposing of the waste while maintaining biodiversity
  • To maintain local culture and show tourists a non-westernised, authentic Sri Lankan experience, it provides homestays for visitors

During our community stay in Digana, we saw that local women have small plots of land where they can grow crops and sell them to gain a source of income and become independent, furthering the emancipation of women in rural areas of Sri Lanka.

When we visited a roadside rural restaurant, we realised that many people in rural areas lead sustainable lifestyles: their low income means they aim to have minimal waste. For example, coconuts are fully grated inside to provide food for the restaurant whilst the husk is used for building thatched roofs.

Evaluation

I found this trip visiting rural areas to be a humbling experience for someone like me who lives in a busy city, for a few reasons:

  1. The resourcefulness of Sri Lankans, which makes up for their lack of technological advancements in comparison to Western, developed nations;
  2. The resilience of the people we met in the face of adversity: there is no social benefit system, yet tsunamis have caused vast devastation in Sri Lanka in the past two decades, and a civil war lasting 25 years disrupted the lives of thousands of Sri Lankans, displacing an estimated 800,000 people;
  3. The incomparable hospitality of our host families, drivers and everybody we met. There is a huge sense of community in rural Sri Lanka that left a lasting impact on my outlook in life.

Nanotechnology and its future in medicine – 07/09/18

Maya (Year 11) discusses the uses of nanotechnology in medicine, thinking about how far it has come and how it has helped doctors. She also considers the dangers of using such small technology and the future benefits it may bring.

Technology in medicine has come far, and with it the introduction of nanotechnology. Nanotechnology is the manipulation of structures and properties at the atomic and molecular scale; a nanometre is one-billionth of a metre. The technology has many uses, in electronics, energy production and medicine, and its value lies in this diverse application. Nanotechnology is useful in medicine because of its size: it interacts with biological molecules of the same proportion or larger. It is a valuable new tool that is being used for research and for combatting various diseases.

In medicine, nanotechnology is already being used in a wide variety of areas, the principal area being cancer treatment. In 2006, a report issued by NanoBiotech Pharma stated that developments related to nanotechnology would mostly be focused on cancer treatments. Drugs such as Doxil, used to treat ovarian cancer, use nanotechnology to evade the immune system, enabling the drug to be delivered to the disease-specific areas of the body. Nanotechnology is also helping in neuroscience, where European researchers are currently using the technology to carry electrical activity across dead brain tissue left behind by strokes and illnesses. The initial research was carried out to get a more in-depth analysis of the brain and to create more bio-compatible grids (a piece of technology that surgeons place in the brain to find where a seizure has taken place). The result is more sophisticated than previous technologies and, when implanted, causes less damage to existing brain tissue.

Beyond cancer treatment and research, nanotechnology is used in many areas of medicine, from appetite control to medical tools, bone replacement and even hormone therapy. Nano-sized particles are enhancing new bone growth, and there are even wound dressings containing nanoparticles that provide powerful microbial resistance. It is with these new developments that we are revolutionising the field of medicine, and with more advancements we will be able to treat diseases as soon as they are detected.

Scientists are hoping that in the future nanotechnology can go even further and replace chemotherapy altogether: fighting cancer by using gold and silica nanoparticles that bind to the mutated cells in the body, then using infra-red lasers to heat up the gold particles and kill the tumour cells. This application would reduce the risk of damage to surrounding cells, as the laser would not affect them as much as chemotherapy does.

In other areas, nanotechnology is developing further in diagnostics and medical data collection. Using this technology, doctors would be able to look for the damaged genes associated with particular cancers and screen tumour tissue faster and earlier than before. This process involves nano-scale devices being distributed through the body to detect chemical changes. Quantum dots can also be used in an external scan of a patient’s DNA, which is then sequenced to check whether they carry a particular debilitating gene, providing a quicker and easier method for doctors to check in detail if a patient has contracted any illnesses or diseases. Furthermore, doctors will be able to gain a more in-depth analysis and understanding of the body by use of nanotechnology than is possible with x-rays and scans.

While this is a great start for nanotechnology, there is still little known about how some of the technology might affect the body. Insoluble nanoparticles, for example, carry a high risk of building up in organs, as they cannot diffuse into the bloodstream. And because nanoparticles are so small, there is no controlling where they go: they may enter cells and even their nuclei, which could be very dangerous for the patient. The House of Lords Science and Technology Committee has reported concerns about the effects of nanotechnology on human health, stating that sufficient research has not been conducted on “understanding the behaviour and toxicology of nanomaterials” and that this has not been given enough priority, especially given the speed at which nanotechnology is being produced.

Nanotechnology is advancing medical treatment at a rapid rate, with new innovative technologies approved each year to help combat illnesses and diseases. Whilst more research needs to be conducted, nano-medicine promises benefits with real potential. Overall, with the great burden that conditions like cancer, Alzheimer’s, HIV and cardiovascular disease impose on current healthcare systems, nanotechnology will revolutionise healthcare with its advanced techniques as it progresses.

@Biology_WHS 

How far can fashion trends be considered to be dictated by the social and political climate?

Alice Lavelle (Y13) looks into how fashion taste can be shaped by different trends in social and political thinking.

In this February’s Vogue there was an article by Ellie Pithers ascribing the sudden popularity of the jagged hemline, among both designers and consumers, to the current uncertain political climate, post-Brexit and post-Trump. Pithers claimed, with support from the Preen designer Thea Bregazzi, that the sudden interest in the more bohemian, asymmetrical hem was a representation of people’s confusion and uncertainty following both Britain leaving the EU and Trump being elected president. Pithers further highlighted how this trend of rollercoaster hemlines can be linked to the fluctuating value of the pound and, more generally, the uncertain economic climate, citing the climbing hemlines of the prosperous twenties and sixties and the ankle-grazing skirts of the poorer thirties as her evidence. How far this can be considered true, or rather an overzealous journalist reading too far into an otherwise trivial catwalk trend, is of course debatable.

However, I would argue that this link between fashion and politics is not only accurate in today’s changing social climate, but one that can be seen throughout history – and, when considering this idea, one name immediately springs to mind: Jackie Kennedy. The First Lady was a style icon within the United States throughout her husband’s presidency, with the clothes and styles she wore immediately copied by designers up and down the country. What the women of the time who looked to the First Lady for inspiration did not know, however, was that her beautifully designed gowns and brightly coloured skirt suits were in fact designed in response to changing US political policies. Following the McCarthyist era of the 1950s, the United States was pushing to reinvent itself as a progressive, self-believing nation, and Jackie’s traditional yet simultaneously cosmopolitan ensembles, with a hint of European influence at the hands of Hollywood designer Oleg Cassini, were essentially a well-crafted response to the country’s growing global presence.

Looking further back at iconic moments in the history of fashion, it becomes more and more evident that the garments which have shaped the way we dress today were themselves shaped by the political climate they were created within. Take Christian Dior’s ‘New Look’, the long-skirted, cinch-waisted silhouette that reinvented feminine dress, created in 1947 in response to the more liberal society emerging after the Second World War. Or Paco Rabanne’s metal disc dress of 1966 – favouring experimentation over practicality, this design embodied the hopes of the emerging European society.

In terms of designers creating garments as a response to the social climate, you have Rudi Gernreich’s topless dress in the early 70s, showing the still-persistent objectification of the female form, rapidly followed by Bill Gibb’s eclectic, romantic collection in 1972 that paved the way for the ‘hippie movement’ within design, and the debut of Diane von Furstenberg’s iconic wrap dress in 1973 – a garment that became synonymous with female empowerment within the workplace, a statement of society’s changing attitude towards women. The speed with which these popular styles changed and evolved is a further representation of how the fashion industry responded to, for example, the changing attitudes towards women in the workplace, again showing how intrinsically linked fashion and political trends are.

And this concept, as explained by Pithers, is relevant today beyond the sudden popularity of rollercoaster hemlines. The spring shows in September all indicated that the previous androgynous styles of autumn/winter were out, and feminine florals and chiffon were back, this time with an edge of female empowerment. Models walked the Dior catwalk in white T-shirts emblazoned with the slogan ‘We should all be feminists’, taken from the title of an essay by the Nigerian-born Chimamanda Ngozi Adichie – a bold statement from the newly appointed first female head of the iconic fashion house, Maria Grazia Chiuri. This surge of feminism across the spring/summer shows was again more than just a trivial fashion trend; it was an embodiment of the rising power of women in the workplace and within politics – with Hillary Clinton at that time still a potential president of the US.

And it is these trends, the jagged hemlines and cinched waists, that eventually get filtered down through the high-street stores and into our wardrobes – meaning the clothes that we wear, either to make a statement or purely because they are comfortable, are essentially just a physical representation of our current uncertainty towards our political climate in a post-Brexit, post-Trump universe.

Follow the WHS DT department on Twitter.

Historical Guilt: Sorry seems to be the hardest word

Wimbledon High History

By Millie McMillan, Year 12.

The debate surrounding the question of historical guilt is a controversial and emotionally fraught one.

For the campaigners who lobby governments to admit their involvement and express their repentance for past actions, an apology is a necessary step towards reconciliation. For the government, and many citizens, an official apology seems like a misguided action, conceding financial liability and provoking backlash from citizens. This question can be explored through the example of the British Empire: regarded as a symbol of colonial oppression by its critics, and as a nostalgic reminder of bygone British power by others.

“The United Kingdom is one of the few countries in the European Union that does not need to bury its 20th century history.”

This statement, allegedly tweeted by Liam Fox (Secretary of State for International Trade) in March 2016, exposes many of the problems surrounding the issue of historical guilt, for two reasons. Firstly, the word ‘bury’ is intriguing, implying the general desire of nations to hide or suppress their own questionable historical actions by refusing to address the significance and impacts of them. Whilst this is not at all surprising, it does raise questions about many nations not accepting accountability for their actions, and whether this is the best way to approach a future of reconciliation. Moreover, this statement exemplifies how many citizens see British imperial history in a ‘nostalgic’ light. However, whilst one can disagree with the sentiment expressed, it is the wider repercussions of such attitudes that are almost more alarming.

The question lies not with whether Britain should bury her history, but why it is perceived that nations need to bury theirs in the first place.

You may have personal grievances with the way in which Fox chooses to view the British Empire, yet even disregarding this fact, his statement is a symbol of a wider culture of glorifying historical achievements, whilst vehemently refusing to acknowledge those which we would rather forget. We feel the need to bury our morally ambivalent actions, producing a warped view of historical events.

Surely it is this very approach – of sweeping historical misdeeds under the carpet, of equating ‘forgetting’ with ‘forgiving’ – that is most detrimental?

This question of historical guilt has yet another facet – it is not only about whether we should apologise, but also whether we are able to. The generations of today had no input into past actions, and therefore an apology is either a reconciliatory mark of separation from past mistakes, or an ‘empty’ gesture, with little significance or substance behind it. If an apology is defined as an expression of one’s regret at having wronged another, then it seems not only counterintuitive but disingenuous to deliver an apology for an action that you didn’t commit.

Nevertheless, if we choose to view an apology as a necessary step towards changing attitudes and actions, an opportunity for education and a sign of mutual respect, then it becomes far more significant and long-lasting. A strained apology is hollow and superficial; a sincere apology offers solace and closure, tinged with optimism for a future encompassing different approaches and education.

Tony Blair’s 2006 expression of “deep sorrow” is the closest to an apology for the activities of the empire that the British Government has released thus far. Meanwhile, in other cheery news, a poll in 2014 revealed that 59% of those surveyed believed the British Empire was “something to be proud of”. It is not that the British Empire was solely a negative influence, but it is this perception of it being a source of ‘pride’ to generations that had no involvement in its creation or demise that seems somewhat confusing.

It is indicative of a flaw in the way in which the education system chooses to portray British history, glossing over the barbaric aspects of British rule and igniting a misplaced sense of patriotism amongst those eager to listen.

The question of whether countries should apologise for their actions remains, and will likely be a contentious issue for many years to come.

It is certain that we can no longer continue to ‘forget’ the events of the past. This approach achieves nothing, except fostering a culture of ignorance and misguided ‘pride’. A reformed approach to national education regarding the perceptions of Britain’s past is surely the best way in which historical guilt can be repented. An apology is mere words; however, the longevity of an informed population with changed mindsets, who no longer believe their homeland infallible, is undoubtedly more significant.

Let us not feel the ‘need to bury’ the mistakes of the past, and instead use enhanced knowledge and wisdom of our history to create a better future.

“British policy towards India changed completely from 1857-76.” How far do you agree?

Wimbledon High History

By Ellie Redpath, Year 12.

The Indian Mutiny of 1857-8 resulted in a change in British policy towards India: from an idealistic view, with hopes that India would one day become civilised enough under British rule to self-govern, to one of resigned moral duty coupled with a heightened awareness of the need to cement the security of the British Raj. However, it did not result in the complete eradication of the previous policies employed under Company rule. When policy is defined as the changes made by the British government with regard to the military system and administrative government of India, and the changes to economic strategy, it becomes apparent that the policies were altered in order to avoid provoking a revival of violence through the imposition of Western ideology on the indigenous people. Normality for the Indian people remained largely the same as before the Mutiny; these policies were introduced solely as insurance that the events of the Mutiny would never be repeated.

The changes to the administrative government of India implemented after the Mutiny can ostensibly be seen as drastic, yet in reality they resulted in little change other than to consolidate the restriction of the power of the indigenous people. An Indian Civil Service was created and the Governor General was renamed the Viceroy, creating an illusion of the upheaval of the East India Company’s governance. Yet despite the change in title, the new Viceroy of India was in fact the same man who had been Governor General, Charles Canning, and he largely took on the same role as before 1857. The only tangible alteration was that he worked for the Government rather than the Company. Moreover, the Indian Civil Service was mainly composed of white British men, and whilst indigenous people were not prohibited from joining, the entrance tests were held in London, making entry near impossible; this had not changed even several decades later, in 1905, when a mere 5% of its members were men from Bengal. The creation of the Civil Service therefore only served to strengthen the administrative control of the British over the Indians by limiting how much influence Indians had over their own government. Another ostensible change introduced by the British government was the return of authority to the indigenous rulers of the princely states, a reversal of Dalhousie’s Doctrine of Lapse. While this appeared to be an extreme shift from Britain’s pre-Mutiny policy, the Princes overwhelmingly complied with British legislation, and the restoration of their power made little difference to everyday life; the British government gave back their former entitlements solely because it appeared to be respecting tradition. A considerable amount of bitterness had developed in recently annexed states such as Oudh, so this difference in policy was expected to help pacify the indigenous people and prevent future uprisings. Ultimately, the British changes to the administrative rule of India were not as severe for the majority as they might seem at first glance, and were made principally to cement British rule and influence in the subcontinent.

Britain’s modifications to the structure of the Indian military were slightly more radical because it was sepoys in the East India Company’s army who had begun the Mutiny, so to avoid a repeated occurrence and confirm that Britain held power over the army, it was necessary for Britain to change its military organisation in a more extreme fashion than it had changed administrative or economic policies. In order to prevent the recurrence of a similar incident, the religions and castes of the regiments were mixed to cut off a sense of unity against the British. This was intended to avert a situation like that of the Brahmin caste before the Mutiny – members of the elite Brahmin caste were forbidden to travel across the sea, yet this custom was often overlooked or ignored by British generals, leaving them to harbour resentment against the British. In addition to this, eighty-four percent of regiments in Bengal (where much of the resistance had originated) were replaced, in order to defuse any remaining tension in the area between the sepoys and their white officers. The number of British officers supervising a sepoy regiment was increased, and weapons were left under British control when not being used directly in battle, to ensure that any violence that broke out amongst sepoys would not immediately endanger the British generals. However, whilst more changes were enacted with regard to the Indian military than in Britain’s administrative or economic policy, they were almost all made with the objective of inhibiting the escalation of future conflicts between sepoys and their officers into full-scale revolutions. The statement could be made that because sepoys were treated with greater respect after the Mutiny, Britain’s aim was not to assert control over the Indian troops or remain distant from them, but rather to foster amiable relations between officers and their soldiers; yet this was another strategy used by Britain to create an illusion of interpersonal respect to avoid further provocation of the indigenous peoples. Hence the military strategies of the British towards India only changed significantly because they were the most relevant in preventing the recurrence of a mutiny.

The changes to British economic policy towards India were not a complete reversal of policy under the East India Company, yet again the changes that were made were directed towards attempting to curb the economic progress and industrial independence of the indigenous people to secure British control over India. The British built over 3000 miles of railway after 1857, a vast distance compared to the mere 288 miles built under Company rule. This development, whilst not entirely new – railway lines, albeit over short distances, had already existed before the Mutiny – simultaneously benefitted British trade, as it allowed the British to transport their goods over greater distances, increasing their wealth relative to the Indian economy, and allowed British troops to reach and crush any uprisings in remote areas much more quickly than they would otherwise have been able to. While one could argue that developing and promoting industry in remote areas was an equally important reason for the construction of railways, and thus that their purpose was not to consolidate the British Raj, Britain’s economic policies actually intended to hinder India’s industrial growth. The recently introduced policy of free trade made it far easier for Britain to bombard India with inexpensive British-manufactured goods, for which India would often have provided the raw materials. For example, India produced raw cotton for export to Britain, yet its textiles industry was crushed by imports of cheaper British cloth. India’s economic development was hence restrained: it remained reliant on exports of raw materials to Britain, but had no protected market in which to sell its own manufactured goods, so its own industry could not flourish when faced with British competition. Britain was therefore kept economically superior to India, securing its power over the country, whilst India was kept dependent on British trade for its economy to survive, strengthening its ties to Britain. Therefore, Britain’s economic policy somewhat changed after the Mutiny, with the addition of railways to hasten the transportation of troops and the import of British manufactured goods to limit Indian industry; however, because railways had first been developed by the East India Company, the adjustments were made only for the purpose of security over the region and were not so extreme that one could state that policy changed completely.

To conclude, the Indian Mutiny resulted in Britain altering its policy on India from that of forced Westernisation, with the ultimate aim of India achieving self-government, to one primarily focused on retaining British control and security in the subcontinent. However, outside of this shift in emphasis, little was changed, for life itself was not made radically different for the indigenous people; instead, the differences were precautionary, to avert the recurrence of brutality and ensure Britain remained the dominant power in India.