Why do babies in medieval art look like mini adults?

Helena, Year 10, looks at the different influences on medieval and Renaissance art, and how this changed the portrayal of children and babies in art.

Last summer, I visited the Uffizi Gallery in Florence, which is full of amazing Italian art from the medieval and Renaissance periods. Whilst there, I found it amusing that all of the babies in the earlier artwork look less like babies and more like old men, such as in Madonna and Child by Bonaventura Berlinghieri, painted between about 1260 and 1270, or in Paolo Veneziano’s Madonna with Child, painted in 1333. At first, I thought perhaps these artists had just never actually seen a baby, or couldn’t paint them; however, these odd-looking babies were entirely intentional.

Above: Madonna and Child, Berlinghieri
Above: Madonna with Child, Veneziano

Most medieval babies were depictions of Jesus

In the medieval period, most portraits of children and babies were commissioned by the church, which greatly limited the range of subjects to Jesus and a few other babies in the Bible. At the time, portrayals of Jesus were heavily influenced by the idea of the homunculus, which translates from Latin as ‘little man’. Medieval theologians believed that Jesus was born perfect and unchanging, and this was reflected in the artwork of the period: he is often painted with the features of a wise old man. Over time, this homuncular, adult-looking Jesus became the norm, and artists depicted all babies in the same way.

Medieval artists were less interested in realism

This unrealistic way of painting the baby Jesus reflected a much wider trend in medieval art: unlike Renaissance artists, medieval painters were far less interested in naturalism and tended to lean towards expressionistic conventions. This can be seen in both of the paintings above, where, like Jesus, the Madonna does not look very realistic.

How the Renaissance changed medieval conventions

Non-religious art flourished

During the Renaissance, Florence’s middle class prospered, and art was used for more purposes than the decoration of churches. Unlike in the medieval period, when common or even middle-class people were rarely portrayed in art, during the Renaissance more people could afford to commission art and portraits. As portraiture expanded, people did not want their own children and babies to look like homunculi, so realistic, cuter babies became the standard. Eventually, even Jesus began to be depicted as the more cherub-like baby we would recognise today.

Renaissance idealism changed

During the Renaissance period, artists became more interested in naturalistic and realistic painting styles, in contrast to the more expressionistic style used by earlier medieval artists. There was a new interest in observing the natural world, and this extended to babies and children as well as adults.

Children were viewed as innocents

In this period, a transformation in the way children were viewed was underway. Instead of being seen as tiny adults, babies were thought to be born without sin or knowledge and were therefore innocent. This change in adult attitudes was reflected in artwork, as babies began to look much cuter, younger and more realistic than before.

It’s probably a good thing that post-Renaissance attitudes to children have prevailed, as I think we can all agree homunculus babies are not the prettiest!

Is contemporary architecture threatening London’s historic skyline?

Walkie Talkie Building

Maddie, Year 13, considers whether modern buildings are ruining London’s skyline, weighing the advantages and disadvantages of modern projects.

London’s historic architecture is one of our greatest assets – culturally, socially and economically. It lies at the heart of London’s identity, distinctiveness and success, yet it is at risk of being badly and irrevocably damaged. More than 70 tall towers are currently being constructed in London alone, prompting fears from conservation bodies and campaigners that the capital’s status as a low-rise city is being sacrificed in a dash by planners to meet the demand for space and by developers to capitalise on soaring property prices.
There have been many examples of tall buildings that have had a lasting adverse impact through being unsuitably located, poorly designed, inappropriately detailed and badly built and managed. Take, for example, the so-called ‘Walkie Talkie’ building, whose poor design concentrated the sun’s rays and melted parts of cars on the streets below. And recently, yet another skyscraper has been proposed in the Paddington area to the west of central London. The 224m-high Paddington Tower, costing £1bn, would be the fourth tallest in the capital and the first of such scale in that part of London. A building of this scale in this location threatens harm to many designated heritage assets across a wide geographical area, including listed buildings, registered historic parks and conservation areas.

London Bridge

However, some people think that cities face a choice of building up or building out, asserting that there is nothing wrong with a tall building if it gives back more than it takes from the city. An example of a building that achieves this is the £435 million Shard, which attracted major redevelopment to the London Bridge area. So, is this a way for London to meet rising demand and accommodate growing numbers of residents and workers?

Well, planning rules are in place to make sure that London strikes the right balance, ensuring tall buildings not only make a positive contribution to the capital’s skyline but also deliver much-needed new homes for Londoners, as well as workspace for the 800,000 new jobs expected over the next 20 years. Furthermore, tall contemporary buildings can represent “the best of modern architecture” and encourage young architects to think creatively and innovatively, making London a hub for budding talent. It also means that areas with run-down or badly designed features have the chance to be well designed, improving users’ day-to-day lives whilst also benefiting the local landscape.


The protected viewpoints of the city of London. Do skyscrapers threaten these?

Overall, I think that in a cosmopolitan and growing capital city, London needs contemporary architecture to embody its spirit of innovation. However, this needs to be achieved in a considered and managed way so as not to ruin the historic skyline we already have.


Was Hitler’s greatest mistake at Dunkirk?

Georgia, Year 13, explores the British retreat at Dunkirk and argues that Hitler’s greatest mistake was at this point in the war.

Dunkirk

Dunkirk was the climactic moment of one of the greatest military disasters in history. From May 26 to June 4, 1940, an army of more than three hundred thousand British soldiers was essentially chased off the mainland of Europe, reduced to an exhausted mob clinging to a fleet of rescue boats while leaving almost all of its weapons and equipment behind for the Germans to pick up. The British Army was crippled for months, and had the Royal Air Force and Royal Navy failed, Germany might well have conducted its own D-Day, giving Hitler the keys to London. Yet Dunkirk was a miracle, and not due to any tactical brilliance from the British.

In May 1940, Hitler was on track to a decisive victory. The bulk of the Allied armies were trapped in pockets along the French and Belgian coasts, with the Germans on three sides and the English Channel behind. With a justified lack of faith in their allies, Britain began planning to evacuate from the Channel ports. Though the French would partly blame their defeat on British treachery, the British were right. With the French armies outmanoeuvred and disintegrating, France was doomed. And really, so was the British Expeditionary Force. There were three hundred thousand soldiers to evacuate through a moderate-sized port whose docks were being destroyed by bombs and shells from the Luftwaffe. Britain would be lucky to evacuate a tenth of its army before the German tanks arrived.

Yet this is when the ‘miracle’ occurred, and it did not come in the form of an ally at all. Instead, it came from the leader of the Nazis himself. On May 24th, Hitler and his high command hit the stop button. Much to the dissatisfaction of his tank generals, the panzer columns, which could very easily have sliced like scalpels straight through to Dunkirk, were halted. The German plan now was for the Luftwaffe to pulverise the defenders until the slower-moving infantry divisions caught up to finish the job. It remains unclear why Hitler issued the order. It is possible that he was worried that the terrain was too muddy for tanks, or perhaps he feared a French counterattack. Hitler later claimed, at the end of the war, that he had allowed the British Expeditionary Force to get away simply as a gesture of goodwill, to encourage Prime Minister Winston Churchill to make an agreement with Germany that would allow it to continue its occupation of Europe. Whatever the reason, while the Germans dithered, the British moved with a speed they would rarely display again for the rest of the war.

It was not just the Royal Navy that was mobilised. From British ports sailed yachts, fishing boats and rowing boats; anything that could float was pressed into service.

Under air and artillery fire, the motley fleet evacuated 338,226 soldiers. As for Britain betraying its allies, 139,997 of those men were French soldiers, along with Belgians and Poles. Even so, the evacuation was incomplete. Some 40,000 troops were captured by the Germans. The Scotsmen of the 51st Highland Division, trapped deep inside France, were encircled and captured by the 7th Panzer Division commanded by Erwin Rommel. The British Expeditionary Force did save most of its men, but almost all its equipment—from tanks and trucks to rifles—was left behind.

In spite of this, the British could and would continue to view the evacuation of Dunkirk as a victory. Indeed, the successful evacuation gave Britain a lifeline to continue the war. In June 1940, neither America nor the Soviet Union was at war with the Axis powers. With France gone, Britain and its Commonwealth partners stood alone. Had Britain capitulated to Hitler or signed a compromise peace that left the Nazis in control of Europe, many Americans would have been dismayed – but not surprised.

Hitler’s greatest mistake was giving the British public enduring hope, ruling out any chance of them suing for peace. He gave them an endurance that was rewarded five years later on May 8, 1945, when Nazi Germany surrendered. A British writer whose father fought at Dunkirk wrote that the British public were under no illusions after the evacuation. “If there was a Dunkirk spirit, it was because people understood perfectly well the full significance of the defeat but, in a rather British way, saw no point in dwelling on it. We were now alone. We’d pull through in the end. But it might be a long grim wait…”

Trying not to PEE on your paragraphs…

Holly Beckwith, Teacher of History and Politics at WHS, explains how the History and English departments are using a small-scale action research project to try and rethink the way in which analytical writing is taught at Key Stage 3.

The age-old question for history teachers: how do we get our pupils to produce effective written analysis? It is a question we regularly grapple with as a department. Constructing and sustaining arguments is at the centre of what we do as Historians and analytical writing is thus at the core of our teaching of the discipline. But it has not always been an easy task for history practitioners to get pupils to achieve this, even over a whole key stage.

Through published discourse, history teachers have explored the ways in which we can teach pupils to produce argued causal explanations in writing (Laffin, 2000; Hammond, 2002; Chapman, 2003; Counsell, 2004; Pate and Evans, 2007; Fordham, 2007).  Extended writing has been seen as an important pedagogical tool in developing pupils’ causal reasoning as it necessitates thinking about the organisation, arrangement and relative importance of causes.

In 2003, History teacher Mary Bakalis theorised pupils’ difficulty with writing as a difficulty with history. She posited that writing is both a form of thinking and a tool for thinking and, therefore, that historical understanding is shaped and expressed by writing. Rather than viewing writing as a skill that one acquires through history, Bakalis saw writing as part of the process of historical reasoning and thinking. Through an analysis of her own Year 7 pupils’ essays, she noticed that pupils had often failed to see the relevance of a fact in relation to a question. She realised that pupils thought that history was merely an activity of stating facts rather than using facts to construct an argument.

As a solution to similar observations in pupils’ writing, history teachers have used various forms of scaffolding to help pupils construct arguments. This includes the well-known PEE tool, which was advocated by genre theorists and cross-curricular literacy initiatives such as those put forward by Wray and Lewis (1994), and has since been used widely in History and English departments nationwide, including ours at Wimbledon High. The concept of PEE (point, evidence, explanation) is simple and therefore a helpful tool for teaching paragraph structure. It gives pupils security in knowing how to organise their knowledge on a page.

Figure 1: PEE – Point, Evidence, Explanation

But while PEE in theory offers a sound approach to structuring extended writing in history, it has been criticised for unintentionally removing important steps in historical thinking. Fordham, for example, noticed that the use of such devices in his practice meant there was too much ‘emphasis on structured exposition [which] had rendered the deeper historical thinking inaccessible’ (Fordham, 2007). Pate and Evans similarly argued that ‘historical writing is about more than structure and style; the construction of history is about the individual’s reaction to the past’ (Pate and Evans, 2007). Too much emphasis on the construction of the essay, rather than on the nuances of an argument or an engagement with other arguments, can therefore create superficial success, as Fordham argues. Further problems were identified by Foster and Gadd (2013), who theorised that generic writing-frame approaches such as the PEE tool were having a detrimental effect on pupils’ understanding and deployment of historical evidence in their history writing.

After reflecting as a department on this research conducted by history teachers, we began to consider that encouraging our pupils to use structural devices to support their historical writing may not be very purposeful if divorced from getting them to see the function and role of argument in the discipline of history itself. Through discussions with the English department, who have also used the PEE tool in their teaching, we realised we shared similar concerns.

Not satisfied with simply holding these concerns, we decided to do something about it and have since embarked on a piece of action research with the English department. Action research is interested in finding solutions to problems to produce better outcomes in education, and involves a continual cycle of planning, action, observation and reflection, as Figure 2 below illustrates.

We started the first cycle of our small-scale research last term, teaching analytical writing to classes using two different lesson sequences: one which teaches pupils PEE and one which omits it.

We then compared the writing produced by these classes to identify any noticeable differences and structured our reflections around four questions:

1. How has the experience of teaching and learning been different to previous experience, and why?
2. How have students responded to the new method?
3. How far has the intervention resulted in a different approach to analytical writing so far?
4. What are our next steps – what went well, and what needs adjusting?

Figure 2: The action research spiral (Wilson, 2017, p. 113)

Thus far, the comparisons have allowed us to make some tentative observations. Whilst these do not yet show an established pattern, there does seem to be a greater sense of originality and creativity in some of the non-PEE responses. Pupils seemed to produce more free-flowing ideas and to make more spontaneous links between those ideas, showing a higher quality of thinking. In addition, a few of the participating teachers noticed that their questioning became more tailored to developing the ideas and thinking of the pupils they taught, rather than getting them to write something particular. However, others noticed that pupils were already well versed in PEE, so the change in approach may have had less of an effect, and some pupils seemed to feel less secure with a freeform structure. To encourage the more positive effects, our next cycle of teaching will experiment with different ways of planning essays that help pupils organise their ideas more visually, and will focus on developing our questioning to build on the higher-quality thinking we noticed in some classes.

The first research cycle has thus been a worthwhile collaborative reflection on our teaching practice in the pursuit of improving our pupils’ historical and literary analysis. It has given us some insights which we’re looking to develop further as we head into the second term of the academic year.

References

Bakalis, M. (2003). ‘Direct teaching of paragraph cohesion’ Teaching History 110.
Chapman, A. (2003). ‘Camels, diamonds and counterfactuals: a model for teaching causal reasoning’ Teaching History 112.
Counsell, C. (2004). History and Literacy in Year 7: Building the lesson around the text. Abingdon: Hodder Education.
Fordham, M. (2007). ‘Slaying dragons and sorcerers in Year 12: in search of historical argument’ Teaching History 129.
Foster, R. and Gadd, S. (2013). ‘“Let’s play Supermarket ‘Evidential’ Sweep”: developing students’ awareness of the need to select evidence’ Teaching History 152.
Hammond, K. (2002). ‘Getting year 10 to understand the value of precise factual knowledge’ Teaching History 109.
Laffin, D. (2000). ‘My essays could go on for ever: using Key Stage 3 to improve performance at GCSE’ Teaching History 99.
Pate, J. and Evans, G. (2007). ‘Does scaffolding make them fall? Reflecting on strategies for causal argument in Years 8 and 11’ Teaching History 128.
Wray, D. and Lewis, M. (1994). Working with Writing Frames: Developing Children’s Non-Fiction Writing. Scholastic.

Speaking in tongues: why reconstruct a language we don’t even know existed? – 09/11/18

Anna (Year 13) looks back to our earliest beginnings as a civilisation in the Indo-European world, discovering that there is only one route to the reconstruction of Indo-European culture that offers any hope of reliability: language.

Swedish, Ukrainian, Punjabi, and Italian. To many of us, these languages are as different and distinct as they come. But it has been discovered that, in the same way that dogs, sheep and pandas have a common ancestor, languages can also be traced back to a common tongue. Thus, Dutch is not merely a bizarrely misspelled version of English, and the similarities between our languages run far deeper than Latin words simply being imported into native dialects in the Middle Ages.

In the twelfth century, an Icelandic scholar concluded that Englishmen and Icelanders ‘are of one tongue, even though one of the two [tongues] has been changed greatly, or both somewhat.’ He went on to say that the two languages had ‘previously parted or branched off from one and the same tongue’. In doing so, he noticed the common genetic inheritance of our languages, and introduced the model of a tree of related languages which later came to dominate how we look at the evolution of the Indo-European languages. We call this ancestral language Proto-Indo-European: a language spoken by the ancestors of much of Europe and Asia between approximately 4,500 and 2,500 B.C.

The Indo-European Family Tree

But what actually is it? Well, let me start simply. Consider the following words: pedis, ποδος (pronounced ‘podos’), pada, foot. They all mean the same thing (foot) in Latin, Ancient Greek, Sanskrit and English respectively. You will notice, I hope, the remarkable similarity between the first three words. English, on the other hand, sticks out slightly. Yet it has exactly the same root as the other three. If I were to go back to Gothic, one of the earliest recorded Germanic languages, you may perhaps notice a closer similarity: fotus. Over time, a pattern emerges: the letter p corresponds to an f, and the letter d to a t. This is just one example of many: it is these sound laws that led Jacob Grimm to develop his law.

Grimm’s law is a set of statements, named after Jacob Grimm, which describes the systematic sound correspondences between the Germanic languages and the other Indo-European languages. Certainly, single words may be borrowed from one language into another (like cliché, from French, or magnum opus, from Latin), but it is extremely unlikely that an entire grammatical system would be. Therefore, the similarities between modern Indo-European languages can be explained as the result of a single ancestral language diverging into its various daughter languages. And although we can never know what it looked like written down, we can estimate what it sounded like: using regular sound laws such as Grimm’s law, linguists can reconstruct not only individual words, but also sentences and even stories.
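To make the idea of a regular sound correspondence concrete, here is a minimal, purely illustrative sketch in Python (my own toy example, not something from the article or from real comparative linguistics tools): it applies a handful of Grimm’s-law-style consonant shifts to simplified, Latin-like spellings to produce rough Germanic-looking forms. The shift table, the word list and the function name are hypothetical simplifications; real sound change depends on position, stress and neighbouring sounds, and real reconstruction compares many daughter languages at once.

```python
# Toy illustration of regular sound correspondences (in the spirit of Grimm's law).
# A deliberate simplification: this is NOT how historical linguists actually work,
# it only shows the idea that whole classes of sounds shift together.

SHIFTS = {
    "p": "f",   # Latin pedis  ~ English foot
    "t": "th",  # Latin tres   ~ English three
    "k": "h",   # Latin cornu  ~ English horn
    "d": "t",   # Latin decem  ~ English ten
    "g": "k",   # Latin genu   ~ English knee
}

def apply_shifts(word: str) -> str:
    """Replace each consonant according to the shift table, leaving other letters alone."""
    return "".join(SHIFTS.get(letter, letter) for letter in word)

if __name__ == "__main__":
    for latin_like in ["ped", "tres", "kornu", "dekem"]:
        print(f"{latin_like:>6} -> {apply_shifts(latin_like)}")
```

Running it prints crude forms like ped -> fet, tres -> thres, kornu -> hornu and dekem -> tehem, which is the point: it is the regularity of such correspondences across whole vocabularies, not individual lookalike words, that allows an unrecorded ancestor language to be reconstructed at all.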

In 1868, German linguist August Schleicher used reconstructed Proto-Indo-European vocabulary to create a fable in order to hear some approximation of PIE. Called “The Sheep and the Horses”, the short parable tells the story of a shorn sheep who encounters a group of unpleasant horses. As linguists have continued to discover more about PIE, this sonic experiment continues, and the fable is periodically updated to reflect the most current understanding of how this extinct language would have sounded when it was spoken some 6,000 years ago. Since there is considerable disagreement among scholars about PIE, no single version can be considered definitive: Andrew Byrd, a University of Kentucky linguist, joked that the only way we could know for sure what it sounded like is if we had a time machine.

The earliest version read as follows:

(The audio of a later version, read by Andrew Byrd, can be found at the following link: https://soundcloud.com/archaeologymag/sheep-and-horses)

Here is the fable in English translation:

Though seemingly nonsensical, it is definitely exciting, and when you take a metaphorical microscope to it, you can notice similarities in words and grammar, particularly with Latin and Ancient Greek. What is the point, though, in reconstructing a language no longer spoken?

Firstly, the world wouldn’t be what it is today had it not been for the Indo-Europeans. If you’re reading this article, chances are that your first language is an Indo-European language, and it’s also very likely that all of the languages you speak are Indo-European languages. Given how powerfully language shapes the range of thoughts available for us to think, this fact exerts no small influence on our outlook on life and therefore, by extension, on our actions.

Secondly, though, as a society we are fascinated by our history, perhaps because examining our roots (to continue the tree metaphor) can help us understand where we may be headed. Although many archaeologists are hesitant to trust linguistic data, by gaining an insight into the language of the PIE world we can make inferences about its speakers’ culture and, in turn, learn more about our own. One such example is Hartwick College archaeologist David Anthony’s discovery of a mass of sacrificed dog and wolf bones in the Russian steppes. By consulting historical linguistics and ancient literary traditions to better understand the archaeological record, he and his team found that historical linguists and mythologists have long linked dog sacrifice to an important ancient Indo-European tradition: the roving youthful war band (known as a ‘koryos’ in reconstructed PIE). This tradition, in which young men became warriors in a winter sacrificial ceremony, could help explain why Indo-European languages spread so successfully. Previous generations of scholars imagined hordes of Indo-Europeans on chariots spreading their languages across Europe and Asia at the point of the sword. But Anthony thinks Indo-European spread instead by way of widespread imitation of Indo-European customs, which included, for example, feasting to establish strong social networks. The koryos could simply have been one more feature of Indo-European life that other people admired and adopted, along with the languages themselves.

In this way, we can learn about the customs of our prehistoric ancestors. Indo-European studies remains relevant because, as powerfully as the Indo-European world has influenced our modern social structure and thought, there are also many ways in which its worldview is strikingly different from our own. Studying it gives you that many more perspectives to draw from in creating your own worldview.

National Historical Museum Stockholm: A bronze Viking plate from the 6th century A.D. depicts a helmeted figure who may be the god Odin dancing with a warrior wearing a wolf mask.

Glamour and Hedonism: Why the American Jazz Age Still Intrigues Us

Laura (Year 11) explores what makes the Jazz Age a significant time in America’s history and how it has been preserved through music and literature.

The American Jazz Age, or the “Roaring Twenties”, brings to mind many images of feathers, flapper dancers and flamboyance. As the 1920s were characterised by rapid stock market expansion, successful Americans spent more, and flaunted their wealth, throwing extravagant parties. Reminders of the era cannot be avoided, as it inspires fashion, films and music of today. F. Scott Fitzgerald’s 1925 novel The Great Gatsby captured the essence of the time and offers a paradigm of the Jazz Age. When Baz Luhrmann took on the challenge of adapting it for film, the result made $353.6 million at the box office, as audiences were captivated by the romance of the period.

Whilst the 1920s saw people move away from the austere and unpromising life of the Great War years, they also brought new changes and difficulties with them. This new America had lost faith in its organisation and structure, having become disillusioned by war and patriotism. The parties and indulgence reflected a newfound individualism as traditional values were left behind. Many were critical of the more frivolous lifestyle in cities, as ideas of morality seemed to shift. Prohibition, the 1920 ban on alcohol, seemed only to encourage more drinking in clandestine speakeasies, and organised crime and bribery were rife. But the era was also characterised by modernisation and greater liberation, especially for women. The 19th Amendment was ratified in 1920, giving women the vote, and social changes followed as women in the workplace became more of a norm and gender roles were questioned. Even fashion became more liberating as shorter skirts and shorter hairstyles grew popular.

The jazz music that fuelled the parties of the rich and powerful in 1920s America first came from the African-American communities of New Orleans and had its origins in blues. With its freer, improvisational style, it broke musical norms just as social conventions were being dismantled in America. With better music recording during the mid-1920s, this new style spread quickly, and radio broadcasting allowed even more rapid popularisation of the genre, as it reached people of all ages and classes. Although the US was still a place of deep-rooted racism and xenophobia, and many conservatives feared the influence of “the devil’s music”, jazz’s popularity was a step towards better inclusion in American society. When Luhrmann made his adaptation of The Great Gatsby, the music was a key element of the film: modern hip hop and traditional jazz were both part of the soundtrack, cleverly blending music that evoked the era with new music that allows the modern audience to experience what it was like to listen to something completely new and unheard. Luhrmann said that “the energy of jazz is caught in the energy of hip-hop”. Check out the jazz playlist on the Music Department Spotify here.

Ernest Hemingway, Gertrude Stein and F. Scott Fitzgerald are among the authors that have helped to preserve the excitement and intensity of the Jazz Age in their writing and are part of the “Lost Generation” writers, who came of age during the Great War. Main themes in their writing included the opulence and wealth of the 1920s, but also the damaging effects of hedonism and disillusionment. Idealised versions of the past are often seen in writing of the era, reflecting on how the indulgence and enjoyment was overwhelming and even put individuals out of touch with reality. Fitzgerald describes one of Jay Gatsby’s parties:

“The lights grow brighter as the earth lurches away from the sun, and now the orchestra is playing yellow cocktail music, and the opera of voices pitches a key higher. Laughter is easier minute by minute, spilled with prodigality, tipped out at a cheerful word.”

The giddy description shows an uncomfortable confusion of the senses, as the narrator, Nick Carraway, discovers the exciting city life. However, Fitzgerald also reveals a world damaged by war: the “valley of ashes” in the novel represents the effects of industrialisation and modernisation on the less wealthy, and the social inequality of the time. Carraway, having served in the First World War, notes that Jordan Baker had an “erect carriage which she accentuated by throwing her body backward at the shoulders like a young cadet”; his vision is clouded by his experiences of war. The literature of the Jazz Age endures because it not only shows the glamour and thrill of the period, but also offers sobering reflections on the price of the new lifestyle.

The sparks of wealth and excitement of the Roaring Twenties were stamped out abruptly by the Wall Street Crash of October 1929. As the terrible poverty of the Great Depression began, Fitzgerald wrote “Echoes of the Jazz Age”, recalling the earlier, more prosperous times.

“It bore him up, flattered him and gave him more money than he had dreamed of, simply for telling people that he felt as they did, that something had to be done with all the nervous energy stored up and unexpended in the War.”

It is no surprise that the Jazz Age has aged so well. The excitement and romance of the period has captivated readers and audiences, and this formative period of American history is not forgotten.

Castles: architecture and story.

Wimbledon High History

Daisy (Y12) explores the significance of the castles that are dotted all over the British Isles, arguing that we should look beyond their architectural genius to study the stories behind them.

Castles are undoubtedly the most important architectural legacy of the Middle Ages. In terms of scale and sheer number, they dwarf every other form of ancient monument and dominate the British landscape. What’s more, the public has an enduring love affair with these great buildings, since they play an intrinsic role in our heritage and culture: over 50 million people pay a visit to a British castle each year.

Arundel Castle, West Sussex

The period between the Normans landing at Pevensey in 1066 and that famous day in 1485 when Richard III lost his horse and his head at Bosworth (consequently ushering the Tudors and the Early Modern period into England) marks a rare flowering of British construction. Whilst the idea of “fitness for purpose” was an important aspect of medieval architecture, the great castles of the era demonstrate that buildings had both practical and symbolic uses.

One of the most iconic forms of medieval castle was the Motte and Bailey, which came hand in hand with the Norman conquest of 1066. The Bayeux Tapestry eloquently depicts the Norman adventurers landing at Pevensey and hurrying to occupy nearby Hastings. The men paused on the Sussex coast and are said to have indulged in an elaborate meal at which they discussed their plans for occupying the highly contested country. The caption of the Tapestry notes “This man orders a castle to be dug at Hastings”, which is followed by a rich scene illustrating a group of men clutching picks and shovels as they head off to start their mission: castles were the Normans’ first port of call. In the years that followed, castle-building was an intense and penetrating campaign, with one Anglo-Saxon chronicler stating that the Normans “built castles far and wide throughout the land, oppressing the unhappy people, and things went ever from bad to worse”. It was vital for William to establish royal authority over his newly conquered lands, and so the country saw the rapid construction of potentially more than 1,000 Motte and Bailey castles. With these buildings, speed was of the essence, so the Bailey’s defences were made of wood, while the Keep, which sat on top of the Motte, was relatively small and, at least at first, usually also built of timber.

A drawing displaying a Motte and Bailey Castle.

Despite the fact that these castles obviously served a defensive purpose, with the elevation of the Motte giving the Norman noble a panoramic view of the surrounding countryside, their symbolic purpose was arguably of greater significance. The erection of such a large number of castles altered the geopolitical landscape of the country for ever and made sure that the Norman presence would be felt and respected. Figuratively, the raised Keep acted as a physical manifestation of the Normans’ dominance over the Anglo-Saxons, its upwards gradient effortlessly representing the authority and superiority of the lord, the fundamental relationship of the feudal system. Additionally, as many academics emphasise, the fact that castles were often located to command road and river routes for defensive purposes meant that their possessors were also well placed to control trade, and could thus both exploit and protect mercantile traffic.

Another key development in castle building occurred approximately a century after the Battle of Hastings, during the reign of Henry II. Prior to Henry’s accession, England had been burdened by civil war and a period known as the “anarchy” under his predecessor, King Stephen. Taking advantage of the confusion and lawlessness of Stephen’s reign, the barons had become fiercely independent. They had not only issued their own coinage but also built a significant number of adulterine castles, and had unlawfully appropriated a large portion of the royal demesne. As a result, in order to re-establish royal authority, Henry set about demolishing these illegal castles en masse, expending some £21,500 in the process and further highlighting his supremacy. Thus, as seen under the Normans, castles were again used as a means of establishing royal authority. From here the British landscape was significantly altered for a second time, which, in a period lacking efficient communication and technologies, would have been a highly visible, emblematic change impacting all members of society, from the richest of the gentry to the everyday medieval peasant.

As a result, whilst it is important to appreciate the architectural styles and physical construction of medieval castles, I believe it is vital to acknowledge their symbolic nature, and to appreciate how, through the introduction of such fortresses, people’s lives would change forever.

Follow @History_WHS on Twitter.

Japan: a culture to die for? Cultural attitudes to suicide in Japan and the West

Wimbledon High History

Following YouTuber Logan Paul’s video filmed in Aokigahara, one of Japan’s suicide hotspots, Japan’s extremely high suicide rate has featured increasingly in Western news. In this article, Jess Marrais aims to explore possible historical and traditional reasons behind both Japanese and Western attitudes towards suicide.

The world of YouTube and social media crossed over into mainstream media on 1st January 2018 following a video uploaded by popular YouTuber Logan Paul. Paul and a group of friends, while travelling around Japan, decided to film a video in ‘Aokigahara’, a forest at the base of Mt Fuji, famous as the second most popular suicide location in the world. The video, which has since been taken down, showed graphic images of an unknown man who had recently hanged himself, and Paul and the rest of his party were shown joking about and trivialising the forest and all that it represents.

Unsurprisingly, Paul received a lot of backlash, as did YouTube for its lack of response regarding the video itself. The whole situation has restarted a discussion about Japanese suicide rates, both online and in mainstream media sources such as the BBC.

In the discussions surrounding the problem, I fear that little has been said in the UK about the cultural attitudes in Japan towards suicide, and how drastically they conflict with the historical beliefs entrenched in our own culture.

In Christianity, suicide is seen as one of the ultimate sins: to kill oneself is to play God, to decide when a soul should leave the Earth, and it breaks one of the Ten Commandments (‘Thou shalt not murder’). Historically, those who died by suicide were forbidden a Christian funeral or burial, and it was believed that their souls would have no access to heaven. As a result, it makes sense that in Christian countries suicide is frowned upon. We in the West view the high suicide rate in Japan, and other East Asian countries, through our own cultural understanding, while in actual fact the problem should be seen within the context of the cultural and historical setting of the countries themselves.

In Japan, the history of the samurai plays a large role in attitudes towards suicide. The samurai (military nobility) dominated early Japan, and they lived by the code of ‘Bushido’: moral values emphasising honour. One of the core practices of Bushido was ‘seppuku’: should a samurai lose in battle or bring dishonour to his family or shogun (feudal lord), he was expected to kill himself by slitting open his stomach with his own sword in order to regain his, and his family’s, honour in death. Due to the prominent role the samurai played in Japanese society, this idea of killing oneself to regain honour seeped into all aspects of society, thanks to personal and familial honour being a central part of Japanese values, even today.

More recently, this warrior attitude to death can be seen in the famous ‘kamikaze’ pilots of the Second World War: pilots who purposefully crashed their planes, killing themselves and destroying their targets (usually Allied ships). These pilots were typically young, and motivated by the prospect of bringing honour to their family and Emperor in death. During the war, 3,682 kamikaze pilots died, spurred on by the samurai code of Bushido.

In the modern day, suicide is seen by many in Japan as a way of taking responsibility. Suicide rates in Japan soared after the 2008 financial crash, reaching their peak at the end of the 2011 economic year. Current statistics suggest around 30,000 Japanese people of all ages take their own lives each year, as opposed to around 6,600 per year in the UK. Increasing numbers of Japan’s ageing population (those over 65) are turning to suicide to relieve their family of the burden of caring for them. There are even cases of unemployed men killing themselves to enable their family to claim their life insurance, in contrast to the UK, where suicide typically prevents life insurance from being claimed. Despite the end of the samurai era and the Second World War, the ingrained mentality of honour drives thousands of people in Japan to end their own lives, motivated not only by desperation, but also by the desire to do the right thing.

If anything can be taken away from this, it is to view stories and events from the cultural context within which they occur. While suicide is a tragic occurrence regardless of the country/culture in which it happens, social pressures and upbringing can – whether we are aware of it or not – influence a person’s actions. If this lesson can be carried forward to different cultures and stories, we will find ourselves in a world far more understanding and less judgemental than our current one.

Follow History Twitter: @History_WHS

Suicide hotlines:

  • PAPYRUS: support for teenagers and young adults who are feeling suicidal – 0800 068 41 41


Historical Guilt: Sorry seems to be the hardest word

Wimbledon High History

By Millie McMillan, Year 12.

The debate surrounding the question of historical guilt is a controversial and emotionally fraught conversation.

For the campaigners who lobby governments to admit their involvement in, and express their repentance for, past actions, an apology is a necessary step towards reconciliation. For the government, and many citizens, an official apology seems like a misguided action, conceding financial liability and provoking a backlash from citizens. This question can be explored through the example of the British Empire: regarded as a symbol of colonial oppression by its critics, and as a nostalgic reminder of bygone British power by others.

“The United Kingdom is one of the few countries in the European Union that does not need to bury its 20th century history.”

This statement, allegedly tweeted by Liam Fox (Secretary of State for International Trade) in March 2016, exposes many of the problems surrounding the issue of historical guilt, for two reasons. Firstly, the word ‘bury’ is intriguing, implying the general desire of nations to hide or suppress their own questionable historical actions by refusing to address their significance and impact. Whilst this is not at all surprising, it does raise questions about many nations not accepting accountability for their actions, and about whether this is the best way to approach a future of reconciliation. Moreover, this statement exemplifies how many citizens see British imperial history in a ‘nostalgic’ light. However, whilst one can disagree with the sentiment expressed, it is the wider repercussions of such attitudes that are arguably more alarming.

The question lies not with whether Britain should bury her history, but why it is perceived that nations need to bury theirs in the first place.

You may have personal grievances with the way in which Fox chooses to view the British Empire, yet even disregarding this fact, his statement is a symbol of a wider culture of glorifying historical achievements, whilst vehemently refusing to acknowledge those which we would rather forget. We feel the need to bury our morally ambivalent actions, producing a warped view of historical events.

Surely it is this very approach – of sweeping historical misdeeds under the carpet, of equating ‘forgetting’ with ‘forgiving’ – that is most detrimental?

This question of historical guilt has but another facet – it is not only about whether we should apologise, but also if we are able to. The generations of today had no input into past actions, and therefore an apology is either a reconciliatory mark of separation from past mistakes, or an ‘empty’ gesture, with little significance or substance behind it. If an apology is defined as an expression of one’s regret at having wronged another, then it seems not only counterintuitive, but disingenuous to deliver an apology for an action that you didn’t commit.

Nevertheless, if we choose to view an apology as a necessary step towards changing attitudes and actions, an opportunity for education and a sign of mutual respect, it becomes far more significant and long-lasting. A strained apology is hollow and superficial; a sincere apology offers solace and closure, tinged with optimism for a future encompassing different approaches and education.

Tony Blair’s 2006 expression of “deep sorrow” is the closest to an apology for the activities of the empire that the British Government has released thus far. Meanwhile, in other cheery news, a poll in 2014 revealed that 59% of those surveyed believed the British Empire was “something to be proud of”. It is not that the British Empire was solely a negative influence, but it is this perception of it being a source of ‘pride’ to generations that had no involvement in its creation or demise that seems somewhat confusing.

It is indicative of a flaw in the way in which the education system chooses to portray British history, glossing over the barbaric aspects of British rule and igniting a misplaced sense of patriotism amongst those eager to listen.

The question of whether countries should apologise for their actions remains, and will likely be a contentious issue for many years to come.

It is certain that we can no longer continue to ‘forget’ the events of the past. This approach achieves nothing, except fostering a culture of ignorance and misguided ‘pride’. A reformed approach to national education regarding perceptions of Britain’s past is surely the best way in which historical guilt can be addressed. An apology is but mere words; however, the longevity of an informed population with changed mind-sets, who no longer believe their homeland is infallible, is undoubtedly more significant.

Let us not feel the ‘need to bury’ the mistakes of the past, and instead use enhanced knowledge and wisdom of our history to create a better future.

The long and winding road: how factual recall tests can effectively support linear examination courses

Wimbledon High History

By Emily Anderson, Head of History.

Think back, if you can, to your own History studies at school, whether these were months, years or perhaps decades ago. For most, the content covered becomes, over time, increasingly hard to recall. My current grasp of the French Revolution, for example, which I studied at AS Level, is embarrassingly basic now, almost 15 years later, as it is something I have rarely had need to revisit. At Parents’ Evening, parents smile wryly at vague memories of the Corn Laws or the Spinning Jenny (not meaning to undermine their importance, but their ubiquity in the collective memory of British adults is truly extraordinary) and voice envy at the breadth of opportunities available in the current History curriculum.

Instead, it is the broad conceptual understanding of, say, the nature of power, as well as the skills, that remains – and it is these that lie at the heart of the purpose of History education for our department here at WHS. Empowering our students to participate in the academic discourse of History is our core aim, enabling them to engage critically with the world around them in their future lives. It is, however, impossible to participate in this discourse without what has been termed ‘fingertip knowledge’ as well as more conceptual ‘residual knowledge’: to secure meaningful progress in History, both need to be developed (Counsell, 2000). As argued recently in Teaching History, where dialogue around cognitive psychology is increasingly evident, ‘fluent access to a range of types of knowledge is what enables historians to participate in some of the more sophisticated forms of historical discourse’ (Fordham, 2017).

Recent changes to A Levels (AL) have brought into focus how we secure this fingertip knowledge. The nature of the new linear exams means there is a demand for a greater volume of content to be retained over a longer period of time. The importance of detail is evident both from reviewing past papers and from our experience of examining at AL last summer.

To approach this, we reflected on our experience of nurturing fingertip as well as residual knowledge at GCSE, where the linear model is, of course, long established, as is our practice of setting factual recall tests at the end of each topic. Our evaluation of the latter is below:

Advantages:

  • It is classic retrieval practice, which results in stronger storage and retrieval strength (Fordham, 2017).

  • It encourages an extra stage of revision early in the course before more high stakes testing kicks in for mocks and terminal exams, reducing the pressure on Year 11.

  • It helps lead to great results (above 75% A* in the past three years).

Disadvantages:

  • Our tests were much too challenging – becoming notorious amongst our students and sapping morale.

  • They were no longer fit for purpose – pupils would never need to recall such specific detail, especially after the reform of the CIE IGCSE Paper 4 in 2015 which removed such questions.

Therefore, we have changed the structure of our tests to open-ended questions. At IGCSE these are in the style of 4-mark recall questions. At AL I am experimenting with questions taking the form ‘cite two pieces of evidence which could be used to support an argument that…’, or similar. To try to tackle the issue of relevant but vague answers, I have awarded bonus marks at AL for detail, to encourage both a conscious choice in selecting evidence (as pointed out by Foster & Gadd (2013)) and in-depth revision. All tests are now out of a uniform mark – 20 – to encourage comparison across topics and at different stages of the two years.

Furthermore, we have used the new AL structure to rethink when we test, in order to support maximum recall over the two years. Here, we currently have two approaches: retaining end of topic testing at GCSE in order to keep the advantages identified above, but utilising spaced tests at AL (the benefits of which are argued by, amongst others, Laffin (2016) and Fordham (2017)) by revising and testing existing knowledge on a topic before the next stage of it is covered. This lends itself particularly well to the unit on the British Empire from c1857-1967: in the past few weeks, my Year 13 class have sat tests on the increasing independence of the Dominions and on India, both in the period from c1867-1918, before studying inter-war developments. Students then complete their own corrections, consolidating the learning and identifying areas for development. During the revision period at AL, they can also undertake the same test several times citing different evidence. My 2017 cohort had, at their own suggestion, a star chart to record how many times they had undertaken a test for each area of the course, broadening their evidence base each time.

Whilst I hope that this gives a snapshot of the department’s current and very fledgling thinking, I would be mortified if it was taken to show that we are overly focussed on factual recall testing in the department. We are not. Tests of course never can and never will be the ‘be all and end all’ in terms of assessing student progress, but approaching them critically can only be a good thing.

References and further reading

Counsell, C. (2000). Historical knowledge and skills: a distracting dichotomy. In James Arthur and Robert Phillips (eds.), Issues in history teaching (pp. 54-71). London: Routledge.

Fordham, M. (2017). Thinking makes it so: cognitive psychology and history teaching. Teaching History, 166, 37-43.

Foster, R., & Gadd, S. (2013). Let’s play Supermarket ‘Evidential’ Sweep: developing students’ awareness of the need to select evidence. Teaching History, 152, 24-29.

Laffin, D. (2016). Learning to like linear? Some ideas for successful introduction of the new A Levels. Historical Association Conference workshop.