The Creole Origins of the Chemise de la Reine

Written by: Phoebe Clayton

If you’ve spent enough time around me, you will have heard about the infamous 18th-century dress, the so-called ‘Chemise de la Reine’. To explain, a ‘chemise’ was a woman’s undergarment, worn directly against the skin under a set of stays or as a nightgown, and usually made of fine white material. In 1783, Marie Antoinette (the ‘reine’ of the time) was painted wearing a dress that loosely resembled a ‘chemise’; the portrait was displayed at the Salon de Paris in the Louvre. The gown sparked outrage due to its perceived informality and its nonconformity with the highly structured aesthetic of traditional court gowns. It was unlike anything worn by the French aristocracy before. But although named after the queen, the ‘Chemise de la Reine’ was not invented by Marie Antoinette. So, where did it come from?

 Marie Antoinette en gaulle, Élisabeth Louise Vigée Le Brun, 1783

The dress itself was made by her dressmaker, Rose Bertin, who adapted it from the clothes of white women in the West Indies, who had themselves appropriated the style from women of colour. The gown first came to Paris in the form of a fashion plate published in 1779 depicting a woman dressed ‘in the Creole style’. In fact, Antoinette herself refers to the gown as ‘la Robe à la Créole’ in her diaries, suggesting a direct awareness of the colonial cultural origins of the dress.

The term ‘creole’ refers to ‘a person of mixed European and black descent, especially in the Caribbean’, implying the inherently multi-racial context of the dress’s origins. The dress was thus likely first worn by women of colour, and made of undyed madras material, which was already widely imported to both West Africa and the Caribbean at this time, as it was light and well-suited to hot, tropical climates. Two black women wearing white, flouncy gowns strikingly reminiscent of Antoinette’s ‘chemise’ are featured in a Brunias painting from 1770, and two other women of colour wearing the style appear in a similar painting of his from c.1780. Agostino Brunias, who actively documented colonial life in the Caribbean in his art, later depicts white, black and creole women all wearing similar loose, white, chemise-type dresses in his ‘Linen Market’, 1780.

Free West Indian Dominicans, Agostino Brunias, c.1770

This series of visual evidence makes clear the gradual appropriation of ‘chemise’-style dresses by white women – likely for reasons of practicality as well as a more hostile jealousy of black beauty and style. One can ascertain the latter from the increasing number of sumptuary laws (legislation controlling what certain demographics could or could not wear) to which enslaved and formerly enslaved people were subject during this period in the Caribbean, suggesting that white colonist elites felt threatened by the fashion and expression of black communities and thus restricted it to the best of their ability.

However, while the adoption of ‘chemise’-type gowns by white women in the West Indies is a clear appropriation and attempt at mimicry of black fashion, Marie Antoinette’s motivations in donning the style are harder to discern. Although she shows clear awareness of the apparent ‘creole’ origins of the dress, the general contemporary and modern consensus is that the queen was instead imitating a romanticised, pastoral, ‘shepherdess’ style, trying to emulate the perceived idyllic simplicity of a rural lifestyle. For reasons obvious to anyone with a passing awareness of 18th-century France (think: economic crisis, famine and widespread destitution), such an imitation was met with a decidedly ill reception, causing offence at the queen’s naive ignorance of, and apathy towards, the struggles of her own subjects.

After Antoinette was painted in her controversial rendition of the gown, it immediately became known as the ‘Chemise de la Reine’ and quickly caught on, gaining popularity amongst upper-class women in France, England and wider Europe. The style caused a seismic shift in 18th-century women’s clothing and, as the 1790s dawned, sent fashion careening straight into the Regency period. The white, gauzy fabric finely gathered beneath the bust, the puffed sleeves, square neckline and simple skirt all became foundational staples of women’s fashion for the next 40 years. It had a truly transformative impact. And although the gown travelled far from its birthplace in the Caribbean, it is important to acknowledge the black and Creole origins of the Chemise de la Reine and to recognise their monumental influence on an entire century of Western women’s fashion.


DuPlessis, R. (2019). Sartorial Sorting in the Colonial Caribbean and North America. The Right to Dress: Sumptuary Laws in a Global Perspective, c.1200–1800, pp.346–372.

Halbert, P. (2018). Creole Comforts and French Connections: A Case Study in Caribbean Dress. [online] The Junto.

Peterson, J. (2020). Robe en Chemise or Chemise a la Reine – Pattern 133. [online] Laughing Moon Merc.

Square, J.M. (2021). Culture, Power, and the Appropriation of Creolized Aesthetics in the Revolutionary French Atlantic. [online] Small Axe Project.

Van Cleave, K. (2021). On the Origins of the Chemise à la Reine. [online] Démodé Couture.

Whitehead, S. (2021). À la Creole, en chemise, en gaulle: Marie Antoinette and the Dress that Sparked a Revolution. [online] Retrospect Journal.

The Kirtle: The Original Dress

[Early 17th-century kirtle, ‘The Lute Player’, Orazio Gentileschi]

Phoebe C in Year 12 explores the evolution of the kirtle, from its origins to its more modern form: the dress.

A kirtle is a one-piece garment that was popular in Western Europe from the early Middle Ages up to the 17th century. Mentions of the kirtle date back to as early as the 10th century[1], and painted depictions survive from throughout the 17th century[2]. The kirtle was initially worn by both men and women (although men’s kirtles are often referred to as ‘cotehardies’ in modern scholarship, they are fundamentally the same garment[3]), but men’s fashion gradually shifted away, towards the shirts and trousers we see in menswear today. However, the basic concept of the kirtle still survives in modern-day dresses, making it perhaps the most influential garment in the entire history of Western fashion.

[Kirtle from 15th century manuscript]

The kirtle was the first Western ‘dress’, so to speak. Although clothing had previously also consisted of one long garment draped over the whole body, the kirtle was made to fit around the human body, rather than being wrapped and manipulated with folds and belts until it fit. Additionally, it needed no extra closures such as pins or brooches, as previous garments had done. Examples of the kirtle’s predecessors include Roman togas (and other robes), as well as the Anglo-Saxon peplos[4]. These garments were loose-fitting, tended to be made of just one large rectangle (or two) of fabric pinned or tied around the body, and were often sleeveless. The kirtle was also one of the first Western garments to require no extra undergarments (although kirtles were often worn in combination with other garments for warmth or practicality anyway). They had sleeves and were long enough to cover the legs as well, circumventing the need for several additional items of clothing and instead combining them all into one.


Kirtles started off as both under- and outerwear, and it wasn’t uncommon to wear both an under-kirtle and an over-kirtle. The under-kirtle would have been made from a cheaper fabric that could survive frequent washing, whereas the over-kirtle would be finer or decorated in some way. Kirtles were usually woollen; however, linen was sometimes used depending on environment and availability[5]. As time went on, the upper classes progressed to wearing kirtles only as undergarments, whereas the lower classes used them as main outer garments for much longer[6].

Of course, since the popularity of the kirtle lasted over half a millennium, some evolution and change in style was inevitable. Early kirtles were loose-fitting and had no waist seams. By the 15th century, however, they were skin-tight, and they then evolved to consist of separate skirts that were pleated or gathered before being attached to the bodice[7]. Kirtles were closed using lacing along the front, sides or back of the garment, although earlier examples have no lacing at all: since early kirtles were loose-fitting and had relatively wide necklines, lacing was unnecessary, as they could simply be put on over the head. Later, tighter kirtles also acted as a kind of prototype for the corset; they provided the kind of support we now receive from a modern bra.

[Kirtle with front lacing]

For example, here is a reconstruction of a 14th-century kirtle that I made a few years ago, versus the 17th-century one I made this summer.

[self-made 14th century kirtle & self-made 17th century kirtle]

Notice the stark difference in styles. Although, visually, they appear more different than similar, the fundamental and defining feature of the kirtle remains: a one-piece skirted garment.

Eventually, the kirtle fell out of fashion among all classes in favour of separate skirts and bodices. By the 18th century, there are barely any depictions of kirtles in art, even among lower-class and rural communities. While this development was inevitable, since aristocratic fashion had long abandoned the kirtle in favour of separate skirts and working-class fashion has always followed upper-class fashion (just at a delay of several decades), practicality was likely a major factor in this evolution. Combining a skirt with a separate jacket or bodice (still worn over a shift) allowed the wearer more freedom in what they wore and saved unnecessary washing. It reduced the total number of garments someone needed in order to have a varied and adaptable wardrobe. It was also cheaper and quicker to make smaller individual pieces as required, rather than entire dresses. Additionally, as structural undergarments became more commonplace (providing bra-like support and shaping the torso, whether in the form of early stays, a pair of jumps, or boning sewn directly into the bodice), the need for tight-fitting kirtles as supportive garments declined.

[example of separate skirt/jacket outfits worn by women of different classes. Woman reading a letter, Gabriel Metsu, c.1665-67]

The dress did come back into mainstream fashion eventually (after over a century, and even later in upper-class fashion). Although it looks unrecognisable next to its medieval ancestor, the concept of the dress as a one-piece flowing garment originated with the kirtle in European fashion. Modern dresses also share this heritage, although construction techniques and style conventions have progressed significantly. In fact, the closest modern-day equivalents to the kirtle are various forms of European folk dress. The legacy of the kirtle lives on through garments such as the German Dirndl, a type of folk dress based on rural Alpine clothing of the 16th–18th centuries[8]. The modern Dirndl bears a striking resemblance to the 17th-century kirtle, and it is fair to presume that one was based on the other, seeing as the kirtle was a very common working-class garment throughout Western Europe (including the Alpine region) at the time. Although it seems modern society has largely forgotten about the kirtle, its lasting impact is undeniably still evident in fashion today.

[modern dirndl example]

[1] Anglo-Saxon Female Clothing: Old English Cyrtel and Tunece, Donata Bulotta

[2] ‘A Peasant Family at Meal-time’, c.1665, Jan Steen


[4] Dress in Anglo-Saxon England, Gale R. Owen-Crocker





Why ideas matter: the calculated uses of British ‘civilisation’ in Africa

Josie M, Transition Representative on the WHS Student Leadership Team, explores how ideologies are constructed in order to justify atrocity, in relation to Britain’s exploitation and colonisation of Africa

Throughout history, and particularly in relation to the British Empire, Britain’s international dominance has been obtained and developed through the enforcement of British beliefs and culture. When Africa became the object of Britain’s desire in the late 17th century, as an integral part of the transatlantic slave trade, the continent’s apparent lack of ‘civilisation’ – as determined by Britain’s own definition – was used to justify the horrendous treatment of enslaved Africans, and was later instrumental in gaining public support for African colonisation.

This reflects the wider phenomenon of how the ‘western’ notion and interpretation of what constitutes a respectable civilisation has been severely damaging to the periphery during colonisation and European land acquisition. I am taking the term ‘western’ to refer to ideas that have been popularised in economically developed areas in Europe; and ‘periphery’ to refer to countries that are viewed as less economically developed nations with ‘poor communications and sparse populations’: ‘Defined in geographical or sociological terms, the centre represents the locus of power and dominance and importantly, the source of prestige, while the periphery is subordinate’ (Mayhew, 2009).

The British Empire and Commonwealth

A widely circulated definition describes civilisation as a complex society, concerned with so-called civilised things: money, art, law, power, culture, organised belief systems, education, hierarchy, trade and agriculture (2022). This proposed definition contains only a limited range of categories for observation, and 17th- and 18th-century Britain used similar definitions to decide what constituted a respectable civilisation. When British explorers and leaders arrived in Africa, they found peoples and cultures that operated very differently to the commercial towns and cities of Britain. Technological advancement within Britain meant that emerging industrialisation in towns and cities was considered a major manifestation of civilisation, so Africa’s lack of these particular trademarks led to the continent being branded ‘Darkest Africa’ – shorthand for the idea that its people were savage, brutal and incapable of governing themselves.

As a result of this racist ideology, African people were portrayed as uncivilised and inferior to the white British classes who sought to rule and profit from them. In Africa, the cradle of humanity, kingdoms and tribes had been evolving for thousands of years, all the while developing rich histories and cultures that oversaw daily life. However, this vibrant tapestry of language, music, art, customs, trade and religion was not seen as such – in the eyes of the British, the tribal system and Africa’s lack of advanced weaponry and technology meant that it was Africans’ ‘rightful place’ to be subservient to the colonising powers, and slavery was a means of achieving this submission.

The shifting uses of ideology

The development of the transatlantic slave trade and the eventual European colonisation of Africa are heavily intertwined, with the racist ideology developed during the slave-trading period resurfacing and being used to justify colonisation. Across Europe, Christianity had long been associated with civilised society and regarded as the pinnacle of world religion. Christian teachings were used to justify the poor treatment of enslaved people and their forced removal from their homelands. Propaganda that Africans “worshipped the devil, practiced witchcraft, and sorcery”, among other evils, was rife, these practices directly opposing many values held by religious Europeans. As a result, one of the foremost Christian missions was invoked: to evangelise and spread the word of God.

Pseudo-scientific race theories were also beginning to emerge at this time, suggesting that black races were genetically inferior to white races and so God required that they serve their white masters. Christianity’s evangelical mission was utilised in justifying the removal of African people from their ‘devil worshipping’ cultures and bringing the ‘heathens’ to Christian lands where they could be saved by the Gospel and brought into the light, thereby spreading the faith and achieving one of the primary objectives of the religion.

These race theories and evangelising missions were later turned on their head when Britain abolished the slave trade on 1 May 1807. The trade did not stop instantaneously, and Britain continued to expand its economic horizons through increased trade with India and the Far East, in order to maximise its global reach. By the later 1800s, colonial expansion into Africa became the new object of European interest, and the ‘Scramble for Africa’ formally began with the Berlin Conference of 1884–85.

Cecil Rhodes (1892)

During African colonisation, the popular Christian mission shifted from justifying the ferrying of enslaved people to Christian countries to deliver them liberation through the Word of God, to directly opposing slavery and all the evils associated with the practice. This reversal of common Christian beliefs about slavery was employed by British leaders to gain support for direct British involvement in Africa. David Livingstone, an extremely popular and influential explorer and missionary at the time, called for a worldwide crusade to defeat the slave trade controlled by Arabs in East Africa. The British were thus able to turn the tide of belief by establishing their own moral authority on the issue: by creating another enemy in the Arabs, they could present their quest for land acquisition as ethical, because increasing their involvement in Africa and fuelling colonisation supposedly assisted in the eradication of slavery – a system they themselves had exacerbated enormously.

Livingstone’s ‘three Cs’ – Commerce, Christianity and Civilisation – were then employed as the main objectives of British administration in Africa. In this way, Britain portrayed itself as an innocent party merely extending to Africa a great opportunity to modernise in the same way it had. However, this was not the reality of the situation: the true motives behind imperial expansion, competition and profit, were often disguised behind this veil of apparent moral authority. The plight of many African nations that suffered at the hands of European expansion was then blamed on the nations themselves, on their own ‘savage’ ways, when in fact European powers were instrumental in causing many of the problems of corruption, instability and poverty that persist as legacies of colonisation today.

In conclusion, the western definition of civilisation was warped and used by the British to justify a campaign of control and submission throughout Africa. This method of obtaining control allowed Britain to profit and develop on an immense scale, whilst the African nations that Britain occupied had their natural resources exploited, their people dehumanised, and their cultures and ways of life demonised as savage and barbaric. In many cases, Christianity was used as a means to justify these actions because it was seen as such a pinnacle of civilisation by the Europeans, and was believed to go hand-in-hand with respectable society.

Over 12 million African people were forcibly removed from their homelands and sold into slavery during the transatlantic trade, and millions more suffered as a result of colonisation and extortionate land acquisition by global powers. Critical in enabling human suffering and exploitation on such a massive scale was the damaging western definition of civilisation, which resulted in extraordinary pseudo-scientific race theories being used to justify horrific actions.


Oxford reference: Core-periphery, (Azaryahu (2008) Soc. & Cult. Geog. 9, 4).

Olusoga, David: The roots of European racism lie in the slave trade, colonialism – and Edward Long, The Guardian

David, Dr Saul: Slavery and the ‘Scramble for Africa’, BBC History

St John’s College, University of Cambridge: The Scramble for Africa.

Raypole, Crystal: A Saviour No One Needs: Unpacking and Overcoming the White Saviour Complex, Healthline

Christian History: Why did so many Christians support slavery?, Christianity Today

Farmer, Alan: The British Empire c1857-1967, Access to History, Hodder Education, 2018

Oxford AQA History: The British Empire c1857-1967, A Level and AS, Oxford University Press, 2015

Why being a great linguist means broadening your horizons beyond the exam

WHS Linguistica Club

WHS Head of French and Mandarin, Claire Baty, extols the crucial, intrinsic importance for linguists of broadening their cultural and imaginative horizons, and discusses two school initiatives to support this – Linguistica magazine and its associated club, Linguistica and Friends

My MFL colleagues and I are currently busy proof-reading articles for the summer edition of the department’s Linguistica magazine. Each term, as the deadline for submissions comes and goes, I feel a sense of curiosity tinged with apprehension. I am excited to read the fruits of students’ efforts beyond the language classroom but I can’t escape the underlying worry that they may not feel sufficiently impassioned to actually submit articles for publication. Why is that?

Linguistica was created to be more than just a magazine – it is a space to explore language learning and the myriad opportunities this affords. Fortunately, post-Covid, our classrooms have once again become inspiring, collaborative spaces where students can assimilate new language through role plays and can put their heads together, literally, to work out the rules of a new grammatical structure. Whilst rote learning of vocabulary and grammar rules is important, language learning is, and should be, much more than this. An understanding of the music, film, fashion, food, history, politics, literature and geography of the country is just as significant as being able to use the words correctly.

It is this cultural understanding, coupled with strong syntactical awareness, that ultimately creates an expert communicator. In a world that is increasingly driven by technology, it is our ability as human beings to empathise and communicate with each other that will become the most important 21st century skill. Linguistica is a platform for our students to engage with the cultural, social and political world of the country they are studying.

Students learning about the Hanfu

This term our ‘Linguistica and Friends’ club has whole-heartedly embraced the STEAM+ ethos by inviting other departments to deliver workshops, seminars and lectures exploring the interplay between their subjects and MFL. Our aim is to enrich our students’ understanding of the world around them. We have encouraged them to ask big questions which force them to make connections between their subjects, such as:

  • How does Maths help me with translation in a foreign language? 
  • Does learning Latin mean I am better at French?
  • If we all spoke the same language would there be less conflict in the world? 
  • What helps me understand people better – learning their language or learning their history?
  • Science has nothing to do with languages: discuss.
  • Is computer code a language? 

We have enticed them to see things through a different lens. Ultimately no discipline can exist in isolation and learning a language really does entail learning a whole other perspective on the world.

Why does this matter?

The WHS Civil Discourse programme has as its core aim for our students “to be truly flexible, robust and open in their thinking, and for the world to re-awaken itself to the notion of real debate and discussion, based on authentic encounters between enquiring hearts and minds”. Exploring topics we thought we understood from a new perspective allows for nuanced thinking and offers access to opinions which differ from our own.

We all start out with a ‘blik’, or worldview, informed by our upbringing, circumstances and personal experiences. Our ‘blik’ tells us how to interpret the world, and we then choose to embrace the facts that support our ‘blik’ whilst selectively ignoring or explaining away those that go against it (R. M. Hare, in his response to Antony Flew in their ‘Theology and Falsification’ symposium). Our job as teachers is to challenge a student’s ‘blik’ by offering them diverse ways to engage with subject material outside of the classroom. To stride out into the world, our students need to be able to see that world and how concepts connect within it. This was exactly the aim of ‘Linguistica and Friends’ this term, when we offered sessions designed to show the connections between subjects that students, in KS3 at least, often see as disparate.

But why do I worry our students won’t engage? Why am I concerned they won’t be as excited as I am about the opportunity to spend my lunchtime considering the flaws of a translation of the New Testament? As teachers, we can see the value of interconnected thinking; we are excited by the opportunity to engage with the big picture, and we are frustrated by how exam specifications can thwart and potentially diminish a student’s desire to explore. For the students, however, “c’est l’arbre qui cache la forêt” (it’s the tree that hides the forest): the demands of exams can hinder true scholarship, taking away the passion and the willingness to engage and explore just for the fun of it.

An Introduction to Semitic Languages

And this is precisely why Linguistica matters. It is in this co-curricular space that we can open our students’ minds to new concepts, encourage them to challenge their pre-existing ideas without the judgement of an exam. Here they can discover their passions, find out who they are and what inspires them.

So look out for this term’s edition of Linguistica, which will be published in hard copy before the summer holidays. It will showcase the creative and eloquent writing of our fantastic MFL students, who have had success in all manner of competitions. You can find out more about how our students engaged with the inspiring ‘Linguistica and Friends’ workshops, as well as the big questions considered by Years 8 and 9. Here is a flavour of what they explored.

  • The interplay between Maths and language exemplified by the deciphering work done at Bletchley Park during WW2
  • How textiles and fashion are inextricably linked to culture and history, as demonstrated by traditional Chinese Hanfu
  • The use of Greek in the New Testament: symbolism and translation. How the meaning of a text is not separate from the language in which it is written.
  • Furthering our understanding of scientific concepts by exploring the derivation of scientific words and their language of origin.
  • The role of cognates, body language and demonstration when making sense of a language you don’t speak. (Loom weaving in Italian.)
  • How Semitic languages fit into the European languages we commonly learn in school.
  • How the use of language in popular film could be used as a way of raising awareness of languages at risk of dying out. With a focus on Polynesian languages and the Disney film Moana.
  • The recent presidential elections in France and how language can be used to persuade, convince and influence.

How does mapping help to create a fictional world?

Ruby L, Deputy Head Girl, explores the significance of maps within literature, and how they help imaginatively guide both readers and writers.

Many famous literary works started off as a blank piece of paper and an idea for a fictional world. J.R.R. Tolkien produced three maps [1] and six hundred place names for his ‘Lord of the Rings’ trilogy, which became one of the bestselling series in history with over 150 million copies sold worldwide [2]. He is one of many successful authors to utilise the practice of cartography in the establishment of a fantasy land, along with Robert Louis Stevenson, who wrote ‘Treasure Island’ with the inspiration of a hand-drawn map; and C.S. Lewis, who invented Narnia. But why is this technique so popular and why does it make for more developed novels and fruitful book sales?

As Holly Lisle reveals, the process of literary map-making is an extensive and varied one. Authors generally depict a country or a full land map, rather than a city or street, to generate a complete view of the world they are creating and its geography. Once borders have been established, the addition of features such as mountain ranges, forests and cities fills the world with purpose and starts to create a realistic-looking artefact. Even mistakes can benefit the plot and narrative: if extra lines are drawn accidentally, or a town has been placed far from any others, there is space for artistic licence to work these into the story. An abandoned trail could have been deserted after a guerrilla warfare group used it in an ambush, and the isolated town could be where the country’s justice system excommunicates criminals as punishment [3].

But why wouldn’t the author simply write, and skip this sketching? The answer is simple: this physical expression of the world inside the author’s head is invaluable when delving deeper into the story’s background. The writer can use their map to discover more about the land they have pictured, which is the main luxury of using cartography to complement literature. Even a simple structure like the borders of the land prompts questions about why each line was laid in that precise place. Was there dispute or war over territory? How are foreign relations between this country and its neighbour, and how does this impact the everyday lives of the citizens? Does a potential lack of security give rise to a totalitarian state in which inhabitants cannot cross the threshold to leave? Questions like these help the author to contextualise the history of the world they are creating, which makes for a more three-dimensional setting. It helps us to understand their message in relation to their world’s history and landscape (political and social as well as physical), and in this respect cartography is undoubtedly important for the production of a fantasy world from an author’s perspective.

A hand-drawn ‘Annotated map of Middle-earth’ by British author J. R. R. Tolkien (Photo Daniel Leal-Olivias/AFP/Getty Images)

With the market for novels becoming more competitive, readers gravitate towards stories with an easily visualisable world and deeply considered, nuanced characters. Although there are many techniques which can achieve this, mapping is a simple way to produce ‘evidence’ that the fictional land exists, as maps imply the realism of the author’s creation [4]. It adds another layer of credibility to the novel, because we want to believe in what has been put in front of us. By human nature we are inclined to read for escapism, and suspension of disbelief is a huge part of what draws us into a narrative, so producing artefacts becomes very useful. This is what makes book sales soar for fantasy novels, as they carry us away from the sometimes mundane real world. The illusion of reliability from a seemingly genuine source encourages us to engage with the text more deeply.

J.R.R. Tolkien’s work is a clear example of how map-making benefits both the author and the reader of a fictional tale. He wrote in a letter to the novelist Naomi Mitchison in 1954: ‘I wisely started with a map and made the story fit (generally with meticulous care for distances). The other way about lands one in confusions and impossibilities, and in any case, it is weary work to compose a map from a story.’ [1] Tolkien drew detailed maps of what would become Middle-earth, and even invented detailed languages and names, before creating a plot. Based on his remarks, we can see that having a map before a narrative is not a defect but a delight, as successful exploration of possible characters and storylines can only come from detailed research and prior thought as to the setting. Not only was Tolkien’s cartography useful to him in devising a plot, it was widely appreciated by readers of his books worldwide. The literary critic Tom Shippey writes that his maps are “extraordinarily useful to fantasy, weighing it down as they do with repeated implicit assurances of the existence of the things they label, and of course of their nature and history too” [1].

It is no wonder that fantasy books containing careful cartography are so popular and successful, then. They are sure to thrive as long as humans continue to need exploration and escapism.


[1] Tolkien’s maps. (2020, October 21). Retrieved November 12, 2020.

[2] The Lord of the Rings. (2020, November 05). Retrieved November 12, 2020.

[3] Maps Workshop – Developing the Fictional World through Mapping. (2019, April 16). Retrieved November 12, 2020.

[4] Grossman, L. (2019, October 02). Why We Feel So Compelled to Make Maps of Fictional Worlds. Retrieved November 12, 2020.

Does money actually grow on trees?

Alexia P., Head Girl, analyses the historic and future impact of trees on the economy.

‘Money doesn’t grow on trees’. A cliché I’m sure most people will have heard when they were younger, when they had no understanding of the true value of money. However, is this cliché wrong – are there economic benefits to trees?

As of 2020, there are approximately 3.04 trillion trees on the planet, made up of 60,065 different species. Their uses vary, from being made into something tangible, such as paper or furniture, to providing intangible services, such as storing carbon or retaining nutrients in biomass to help farmers grow crops. Although their uses may have changed over time, trees have always been a vital part of our economy, in ways that may not at first be apparent.
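To put that tree count in perspective, a quick back-of-the-envelope calculation gives the number of trees per person. (The world population figure of roughly 7.8 billion in 2020 is an assumption supplied for illustration; it does not appear in the article itself.)

```python
# Rough arithmetic based on the article's tree count.
# ASSUMPTION: a world population of ~7.8 billion (2020) is supplied
# here for illustration only.
total_trees = 3.04e12      # ~3.04 trillion trees (figure quoted above)
world_population = 7.8e9   # assumed 2020 world population

trees_per_person = total_trees / world_population
print(f"Roughly {trees_per_person:.0f} trees per person")  # ≈ 390
```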

Photo by zhang kaiyv from Pexels

Let’s jump back in time. The year is 1690, and the global dominance of the British Empire is growing. In Britain, most of the population works in the primary sector, particularly in agriculture, growing trees to help build houses or to trade for an animal to increase the household’s income. As timber and fruits were traded amongst farmers, incomes rose. However, as more villages were established, land that was previously forest was cleared of trees, and the supply started to diminish. The navy – at the time the biggest in the world – relied on timber for its ships; to continue expanding the fleet, Britain had to look further abroad. Ships travelled to America, India and Europe to gain resources, power, and the valuable influence needed to create trading alliances that are still in place today. This extra money and these resources gave Britain an advantage when the Industrial Revolution arrived in 1760, allowing a quick and smooth transition to a new, more efficient way of life that asserted Britain further as a global power and further boosted its economy. And all of this stemmed from a reliance on trees and their resources, without which the roots of our economy would not stand today.

However, as countries have developed, their reliance on single resources and tangible products has decreased, particularly in ‘advanced’ countries, in favour of services and jobs in the tertiary and quaternary sectors. As a result, primary industries such as timber production have steadily declined.

But trees still play a vital part in the growth of our economy today. In LIDCs and EDCs, such as Brazil, logging and the mass production of wood have become part of the economy. Although the industry is environmentally frowned upon, it has an estimated worth of $200 billion annually, allowing many developing countries that produce this material to put money into developing infrastructure and technology. The benefits are not only national. In some societies, such as parts of Indonesia, wood has been used as currency on a local scale, allowing people to trade it for farm animals or clothes, encouraging economic activity in smaller villages that may lack reliable national trading routes. Paper, furniture and fuel are just some of the other ways in which trees have become so heavily relied upon in people’s lives, with few substitutes for the valuable resources they provide.

Photo by mali maeder from Pexels

However, the rate at which tree resources are exploited is becoming too high. In the quest to become economically developed, forest sustainability has been forgotten. Rising tropical deforestation rates account for a loss of biodiversity and a reduction in carbon uptake, affecting further tree growth in surrounding areas as nutrients are removed.

There have been recent attempts, however, to preserve trees and rainforests. A recent study by the Yale School of Forestry and Environmental Studies estimated that rainforests store around 25% of the world’s carbon, with the Amazon alone storing 127 billion tons. Releasing these stores as greenhouse gases would heavily intensify the enhanced greenhouse effect, changing the balance of the Earth’s ecosystems.
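The scale of the Amazon figure can be sketched numerically. If the 127 billion tons is read as stored carbon, the mass of CO₂ that would be released follows from the molar-mass ratio of CO₂ to carbon (44/12). This is a hedged illustration of the arithmetic only, not a claim about the Yale study’s methodology:

```python
# Convert a stock of stored carbon into the equivalent mass of CO2.
# The factor 44/12 is the molar mass of CO2 (44 g/mol) divided by
# that of carbon (12 g/mol) -- a standard chemistry conversion.
def carbon_to_co2(carbon_tons: float) -> float:
    return carbon_tons * 44 / 12

amazon_carbon = 127e9  # 127 billion tons, the figure quoted above
co2_if_released = carbon_to_co2(amazon_carbon)
print(f"~{co2_if_released / 1e9:.0f} billion tons of CO2")  # ≈ 466
```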

Sustainable income from trees is becoming more common, particularly in countries where deforestation rates are highest. In Bangladesh, where the fuel industry relies on wood for 81% of its supply, the logging industry has been encouraged to collect dead trees, wood waste and prunings rather than felling ever larger sections of forest. This still allows for an income, whilst ensuring trees remain part of the ecosystem. Furthermore, there has been a global effort to move away from the use of wood entirely. Renewable energy, such as solar power, makes up 26% of global energy use and is expected to rise to 45% by 2045. Although this means the use of trees in the economy will decline, it opens up new income sources, such as eco-tourism, which encourages more environmentally aware holidays; for example, the Samasati lodge in Costa Rica. The lodge collects rainwater instead of piping water in; it is built on stilts rather than on the ground so as not to disrupt run-off water to rivers; and it blends in with its surroundings so as not to disturb local wildlife – making holidays more environmentally sustainable whilst still taking economic advantage of trees.
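The projected rise in renewables from 26% to 45% can also be expressed as an implied compound annual growth rate of that share. The calculation below assumes the 26% figure refers to 2020 (an assumption, since the article does not state a base year):

```python
# Implied compound annual growth rate (CAGR) of the renewable share.
# ASSUMPTION: the 26% figure is taken as the 2020 value.
start_share, end_share = 0.26, 0.45
years = 2045 - 2020  # assumed 25-year window

cagr = (end_share / start_share) ** (1 / years) - 1
print(f"Implied growth of the share: {cagr * 100:.1f}% per year")  # ≈ 2.2%
```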

‘Money doesn’t grow on trees’. Well, since 2016 in the UK, it hasn’t. Our banknote system changed from paper to plastic, showing the progression from a society that once relied on a single product to a new, man-made source. This neatly represents our economy today and its declining reliance on trees: what was once the root of our economy may soon become a thing of the past.

Is globalisation a new phenomenon?

Andrea T, Academic Rep, looks at the nature of globalisation and whether, in the context of our history, we can consider it a ‘new phenomenon’.

Globalisation is an ever-present force in today’s society. Scholars at all levels debate the extent of its benefits and attempt to discern what life in a truly globalised world would entail. But where did it all begin? A comparison of the nature of colonisation and globalisation aids our understanding of this phenomenon’s true beginning, yet no clear conclusion has been reached. This leads us to the matter of this essay, an attempt at answering the age-old question: “Is globalisation a new phenomenon?” Though there are striking similarities between colonisation and globalisation, I do not believe we can see them as one and the same. Due to the force and coercion that characterised colonisation’s forging of global cultural connectivity, and the limitations of colonial infrastructure, we cannot consider it true globalisation. Therefore, though imperfect, the globalisation of the modern world is its own new phenomenon.

Before I can delve into the comparison of colonisation and globalisation, we must first gain a common understanding of the characteristics of both. There is no set definition of globalisation, though most definitions portray it as an agglomeration of global culture, economics and ideals. Some also allude to an ‘interdependence’ of various cultures and an end goal of homogeneity. (One could certainly debate whether this reduction of national individuality is truly a desirable goal, but that is sadly not the purpose of this essay.) Furthermore, for the purpose of this argument, homogenisation is taken on the basis of equality: an equal combination of cultures forming a unique global identity. And the focus of this essay will be the sociological aspects of globalisation, as opposed to the nitty-gritty of the economics.

Though we are far from a truly homogeneous world, we certainly see aspects of it in the modern day. With an increase in international travel and trade, catalysed by the rise of technology and international organisations, we have seen the emergence of mixed cultures and economies. Take, for example, the familiar ‘business suit’. Though it is seen as more of a western dress code, all around the globe officials and businesspeople alike don a suit to work, making them distinctly recognisable. One might, however, consider how truly universal this article of clothing is. Its origins lie in the 17th-century French court, with a recognisable form of the ‘lounge suit’ appearing in mid-19th-century Britain, establishing it firmly as a form of western dress. Later, with its rise to popularity in the 20th century (as international wars brought nations closer), we see the suit and many other western trends adopted across the globe (see picture below). Considering the political atmosphere of the time, and the seeming dominance of the West, we may doubt that the adoption of the suit was an act of mutually shared culture. And yet we see the ways in which the suit was altered as it passed to different cultures. Take the zoot suit, associated with black jazz culture, or the incorporation of the Nehru jacket’s mandarin collar (of Indian origin) into the suits popularised by the Beatles. Though it remains largely western, these small cultural adaptations show how something can be universalised and slowly evolve towards homogenisation. In this way, a symbol as simple as the suit can be representative of a globalising world.

This is also where we start to see the link between colonisation and globalisation form. Trade formed an essential part of each colonial empire – most notably, the trade of textiles. Through the takeover of existing Indian trade (India in fact accounted for 24% of world trade prior to its colonisation), British-governed India exported everything from gingham to tweed, and had a heavy influence on the style of society’s elite, which took inspiration from traditional Indian methods of clothes-making. Furthermore, the reach of the business suit can be seen as early as Gandhi’s arrival in Britain to study law, dressed in the latest western trends. However, though the two certainly share characteristics, we must consider the intent behind this blend of culture. The ideal of globalisation suggests an equality that is not echoed in colonisation. Gandhi did not wear western styles out of appreciation of British fashion trends; he knew that it was far easier to assimilate if you looked and acted the same. Similarly, the influence of Indian dress on British dress was not born of appreciation either, but of exploitation. Therefore, though the sharing of culture is present in both globalisation and colonisation, one cannot consider them the same, owing to the underlying intent. Furthermore, as the intent in modern-day globalisation is in some ways similarly exploitative, one cannot consider the world truly globalised, but rather globalising, through a process one could still consider a new phenomenon.

Another aspect of globalisation we can consider is the role of the media. McLuhan, a 20th-century Canadian professor, explored this by proposing the idea of a ‘global village’ that would be formed with the spread of television. His theories went hand-in-hand with the ideas surrounding ‘time-space compression’ that have come about through travel and media. And McLuhan was right: with a newly, instantaneously connected world we have become more globalised. With the presence of international celebrities, worldwide news and instant messaging we have the ability to share culture and creed, and though far from homogeneous we can certainly see small aspects of a global culture beginning to form. Given this dependence of globalisation on technology, it is hard to view colonisation as early-stage globalisation. But one distinguishing link can be made. One could argue that the infrastructure implemented for trade routes served as the technological advancement of the imperial age. Much as air travel does today, the creation of the Suez Canal and the building of railways made it easier to traverse the globe. This further catalysed open trade and contact between different nation states, one of the most recognisable traits of globalisation. However, despite this, the trade routes did not improve communication anywhere near the level we see today, and the impact technology has had on the connectivity of our globe is too alien to colonisation for the two to be considered the same. In terms of interconnectivity, the form of globalisation we see today is entirely novel, and though they have the same underlying features, the difference between the two remains like that of cake and bread.

Another aspect of globalisation we can consider is the spread of religion. Religion is an incredibly important aspect of a country’s culture, having defined law and leadership for hundreds of years. The American political scientist Huntington explored religion and globalisation in his work ‘The Clash of Civilisations’ (1996), in which he put forward the following thesis: due to religio-political barriers, globalisation will always be limited.

But events have challenged this. There has been a rapid spread of religion around the world due to the newfound (relative) ease of migration and the access to faith-related information through the internet. From London (often dubbed a cultural ‘melting-pot’) to Reykjavik (rather the opposite), we see mosques and other religious institutions cropping up. With religion no longer tied to geography, we see the homogenising effect of globalisation. This is also, to some extent, echoed in colonisation. During the years of the British Empire, colonisation followed a common narrative of the white saviour. Missionaries preached a new and better way of life, supposing that the application of Christian morals and values would help develop the ‘savage’ indigenous tribes. This attempt at integrating western Christian culture into the cultures present across Africa and Asia shows an early attempt at a homogenised culture. However, though there was certainly some success in the actions of the missionaries (as seen in the establishment of many churches across South Africa), the aggressive nature of this once again contradicts the fairness implied in the concept of a homogeneous culture, and globalisation remains a new phenomenon.

One cannot dispute that colonisation does share a number of characteristics with globalisation. From free trade to new infrastructure to the mixing of culture through religion and fashion, we can certainly see aspects of a globalising world. And yet the forceful intent of the homogenisation of cultures seen in the colonial era, removes it from being the true interconnectivity of nations. This is not to say that the world today is free of this intent, but the way in which our world today is globalising is approaching the ideal of globalisation more closely than colonisation ever did, and there is a distinct enough difference between the two that one cannot consider colonisation to truly be an early-stage globalisation. Furthermore, the world today relies so heavily on technology as a facilitator of globalisation that any notion of globalisation in the 19th century cannot be considered one and the same. Therefore, the globalisation of our day and age can be considered its own new phenomenon.


“Cultural Globalization.” Encyclopædia Britannica, Encyclopædia Britannica, Inc.

“Globalization Is a Form of Colonialism.” GRIN.

“Globalization versus Imperialism.” Hoover Institution.

Steger, Manfred. “2. Globalization and History: Is Globalization a New Phenomenon?” Very Short Introductions Online, Oxford University Press.

“What Is Globalization?” PIIE, 26 Aug. 2021.

Maddison, Angus. Development Centre Studies – The World Economy: Historical Statistics. OECD Publishing, 25 Sep. 2003.

Chertoff, Emily. “Where Did Business Suits Come from?” The Atlantic, Atlantic Media Company, 23 July 2012.

What progress has been made this year towards creating a diverse curriculum at WHS?

WHS Classroom

Miss Emily Anderson, Head of History at WHS, evaluates the progress of the diversity in the curriculum working party since September, and reflects on our next steps.

It has been both a challenge and a privilege to have been leading the working party examining diversity in the curriculum since the Autumn Term. Ensuring that our curriculum is fit for purpose – empowering our students to be active citizens of the world in which they live, and reflecting both their own identities and those of the people they will live and work alongside in their local, national and global communities – could not be a more vital part of our work as teachers: individually, in departments, and as a whole school. Such a curriculum would simultaneously support our students, ensure they feel that they belong in the WHS community, and empower them to understand and champion diversity in their lives beyond school. The curriculum is not a fixed entity, and its constant re-evaluation is, to my mind, one of the most challenging and important parts of our professional lives as teachers.

As members of the school community will be aware from his letters and assemblies, in the autumn Deputy Head Pastoral Ben Turner asked staff, as part of our commitment to systemic change, to scrutinise three different areas of our work as a school in order to better inform our future direction. Alongside our scrutiny of the curriculum, colleagues have been looking at our recruitment of students and staff and how we reach out to a broader and more diverse range of communities, and at our work with our students beyond the curriculum, in our pastoral, super-curricular and extra-curricular contexts.

WHS Partnerships

Examining the curriculum were staff from the arts, sciences and humanities, bringing a variety of perspectives. I wanted to make an ambitious but absolutely necessary distinction from the outset – that we cannot approach the curriculum by diversifying what is already there, but need to create a curriculum that is inherently diverse. We discussed the need to broaden our collective understanding of different identities (the GDST’s Undivided work has been very valuable in this regard), and to model open, honest and often difficult dialogue. The difficulties of the process of change were also considered, especially the transition from an old to a new curriculum, and the fear of being labelled knee-jerk or tokenistic until it became embedded and normal. This is, however, no excuse for not trying. Doing nothing is not an option. Three areas for evaluation emerged for us to take to departments:

  1. The day-to day – teachers’ understanding about different types of diversity, our use of language and resources in the classroom, encouraging more challenging and reflective discussions in the classroom.
  2. The medium term – creating a diverse curriculum at WHS – looking again at KS3, and evaluating our choices at KS4 and KS5 to identify more diverse lines of enquiry or exemplars in existing specifications, or opportunities to move to other boards.
  3. The bigger picture – joining the growing national conversation with exam boards to make changes to GCSEs and A Levels to better reflect diverse identities, critically evaluating the cultural assumptions and frameworks through which our knowledge is formed and which privilege certain identities over others, to problematise and ultimately change these in our teaching.

The reflections that came back from discussions at department level showed that much carefully considered planning is being undertaken across departments, in terms of the individuals whose voices are heard through study of their work, the enquiries that are planned to broaden our students’ horizons and the pedagogical implications of how we create an environment in which diverse identities can be recognised and understood.  

My own department (History) are completely reconceiving our curriculum. My colleague, Holly Beckwith, wrote a beautiful rationale for this in WimTeach last year which I would highly recommend reading.[1] We have been preparing for major curriculum change for a number of years, first by trialling experimental enquiries to pave the way, such as a new Y9 enquiry on different experiences of the First World War. Our choice of a unit on the British Empire c.1857–1967 at A Level also helps. It is a unit whose framework could, if taught uncritically, be problematic in terms of what it privileges, but it enables us at least to explore, understand and challenge such power structures, and to give voice to some of the people the Empire oppressed, through the study of historical scholarship.[2] It also facilitates changes further down the school, as it demands significant contextual knowledge about societies across the world before the age of European imperialism. Now, we are in a position to put in place major and increasingly urgently needed changes for September 2021 at Year 7 and Year 10, which will lead to a transformed KS3 and KS4 curriculum over the next three years.

To pivot back to the whole-school context, I also met with student leaders from each year group, who had collated ideas from their peers to feed back. These were put wonderfully articulately and thoughtfully, were often critical, and unsurprisingly revealed a great appetite for change. As teachers and curriculum designers, there is a balance to be struck here between taking students’ views into account and creating coherent, robust curricula in which knowledge and conceptual thinking build carefully as students progress up the school – areas of study cannot simply be swapped in and out. As I have alluded to above, for example, we start sowing the seeds of contextual understanding for GCSE and A Level at Y7. Furthermore, this process will take time, as meaningful change always does, and so managing expectations is also something we must consider. In and of itself, modelling the process of systemic change is a valuable lesson for our students, so this must be seen as an opportunity to demonstrate it.

So far, this process of evaluation has prompted profound and necessary reflection by teachers not only on what we teach in the classroom, but on how our own understandings of our disciplines have been conditioned by our experiences and educations. As well as educating our students, we are also continually educating ourselves, often unlearning old ideas. There is still a significant way to go in creating the inherently diverse curriculum we are aiming for, and I look forward to continuing to challenge and be challenged as we work together as a community to, ultimately, try to do right by our students and our world.



[2] Akala, Natives, London, Two Roads, 2019; R. Gildea, Empires of the Mind: The Colonial Past and the Politics of the Present, Cambridge, Cambridge University Press, 2019; P. Gopal, Insurgent Empire, London, Verso, 2019.

What do we know about Ötzi?

For her entry for the Oxford German Olympiad, Caroline (Year 12) researched the famous Stone Age glacier mummy found in the Ötztal Alps. Her entry is translated below.

Ötzi goes by many names – the Man from the Tisenjoch, the Man from the Hauslabjoch, the Iceman… The glacier mummy from the Alps is world-famous. His discovery advanced our knowledge of the Stone Age while proving how little of it was actually known. He was found on 19 September 1991, when a pair of hikers stumbled across his corpse – a shocking discovery. Only his head, shoulders and part of his back protruded from the surface of the ice, so it was hard to tell what had just been uncovered. The couple initially took him for a hiker who had met with an accident, but after thorough examination it became clear that this person had not died recently. His tools were found right beside him, including an ancient axe that belonged to his kit. It was this axe that made it possible to date Ötzi himself, for its typology was around 4,000 years old!

Interest in this figure grew extremely quickly – even extreme mountaineers such as Reinhold Messner, the first to climb Mount Everest without supplemental oxygen, began guessing Ötzi’s age. Messner estimated that he had died more than 2,000 years earlier, and thanks to further investigations, led by the professor of archaeology Konrad Spindler, his theory was confirmed. Astonishing and questionable as it seemed, Ötzi’s age was estimated at around 5,300 years – so he came from the Stone Age! Some 600 individual examinations followed to find out more about him; at the time of his death Ötzi was about 46 years old, 1.60 metres tall, and weighed 50 kilograms. This indicated an active, intensive lifestyle, although later research proved that not everything was in order with his body. His medical record was extensive – arthritis, worms, dental problems, and blackened lungs, possibly from the smoke of communal fires. He also suffered from serious heart disease, and according to analysis was genetically predisposed to such illnesses. This discovery called some medical principles into question – how could a lean, athletic man contract such a disease at all? Heart disease was stereotypically regarded as a consequence of unhealthy diet, smoking, excess weight and lack of exercise – and Ötzi fitted none of these descriptions. Furthermore, it was discovered that Ötzi was lactose intolerant, so he could not tolerate milk; this, however, was down to a genetic mutation rather than his lifestyle.

Beyond his illnesses, and everything discovered from inside his body, Ötzi also had striking external features (60 tattoos, for example!) and even wore mountain shoes stuffed with grass. The leather of his clothing came from both wild and domesticated animals (five species in total), along with grass and fur garments to keep him warm. These impressive and inventive Stone Age technologies, illustrated by his own equipment, ensured his survival despite his dangerous surroundings in the Alps. He carried his most important items in a belt pouch – flint tools for cutting and tinder fungus – as well as a dagger and a bow and arrows. But even his outstanding equipment could not, in the end, protect him from death. It was recognised early on that an arrow was lodged in his back – this is probably how he died. But this discovery posed an unsolvable question: who killed Ötzi, and what was the motive?

Examinations of Ötzi’s body have also established connections to the present day – according to samples from 3,700 male inhabitants of the surrounding region, 19 of them share a genetic line with him. Further studies could uncover even more relatives in the future. Today Ötzi himself resides in the Südtiroler Archäologiemuseum, in environmental conditions similar to those at the site where he was found. His preservation will be worthwhile – there is still more to discover. Although we know exactly what he ate before his death, what he looked like, and what kind of voice he may have had, future technological advances could make much more clear. However, some things will remain unknown forever: the identity of his murderer, who he was travelling with, and the circumstances of his death. For everything beyond the scientific treasure trove of his body and his equipment remains irretrievably lost in the Stone Age.

It is therefore indisputable that this surprising discovery has revolutionised the world of medicine. A single man has put the most advanced technologies, such as radiocarbon dating and computed tomography, to the test, and still harbours unsolved mysteries of another time.

Life in the GDR

Jemima (Year 10) explored life in the former German Democratic Republic for her entry for the ISMLA original writing competition. Her entry is translated below.

I decided to write about the Berlin Wall and the division of Germany because I have been interested in them ever since I visited the Mauermuseum in Berlin when I was younger.

After the Second World War, Germany was divided into four zones, administered by France, the United Kingdom, the United States and the Soviet Union. The Soviet zone became the GDR (East Germany). More than a quarter of the East German population left the country between 1945 and 1961. To stop this brain drain, the East German government, backed by the Soviet Union, built the Wall.

The GDR had almost no industrial capacity, so resources were limited. Supply fluctuated, and some people had to wait 10 to 12 years to buy a new car. About one in six of the population were involved with the Stasi, the secret police, which spied on almost everyone.

Paula Kirby, who came from Great Britain, taught English in Dresden from 1985 to 1987, and I will recount her impartial experiences of life in the GDR. She said that despite the feeling of bleakness, there were also advantages. Everyone had work, rents were low, and there were good schools and state childcare. Everyone led a simple life, because there was no point in trying to earn more money, so people could concentrate on family and friends. But this lifestyle also brought many negative consequences.

Many people were frustrated by the lack of personal freedom and the endless bureaucracy. East Germans were cut off from the rest of the world. There was no freedom to travel; they were not allowed to watch Western television programmes; the Stasi tapped their telephones and every letter was censored. Kirby said that most East Germans wanted more freedom.

In 1989 there were great political changes in Eastern Europe and many protests in Germany, which brought about the fall of the Wall. This was the first step towards the reunification of Germany on 3 October 1990. Unfortunately, in Kirby’s view, reunification resembled a takeover of East Germany by West Germany. Although many of the GDR government’s actions were inexcusable, many East Germans still long for some aspects of their former lives.