Historical Guilt: Sorry seems to be the hardest word


By Millie McMillan, Year 12.

The debate surrounding historical guilt is controversial and emotionally fraught.

For the campaigners who lobby governments to admit their involvement in past actions and express their repentance, an apology is a necessary step towards reconciliation. For the government, and many citizens, an official apology seems like a misguided action, conceding financial liability and provoking backlash from citizens. This question can be explored through the example of the British Empire: regarded as a symbol of colonial oppression by its critics, and as a nostalgic reminder of bygone British power by others.

“The United Kingdom is one of the few countries in the European Union that does not need to bury its 20th century history.”

This statement, allegedly tweeted by Liam Fox (Secretary of State for International Trade) in March 2016, exposes many of the problems surrounding the issue of historical guilt, for two reasons. Firstly, the word ‘bury’ is intriguing, implying the general desire of nations to hide or suppress their own questionable historical actions by refusing to address their significance and impact. Whilst this is not at all surprising, it does raise questions about why many nations do not accept accountability for their actions, and whether this is the best way to approach a future of reconciliation. Secondly, the statement exemplifies how many citizens see British imperial history in a ‘nostalgic’ light. One can disagree with the sentiment expressed, but it is the wider repercussions of such attitudes that are perhaps more alarming.

The question lies not with whether Britain should bury her history, but why it is perceived that nations need to bury theirs in the first place.

You may have personal grievances with the way in which Fox chooses to view the British Empire, yet even setting these aside, his statement is a symbol of a wider culture of glorifying historical achievements whilst vehemently refusing to acknowledge those we would rather forget. We feel the need to bury our morally ambivalent actions, producing a warped view of historical events.

Surely it is this very approach, of sweeping historical misdeeds under the carpet and of equating ‘forgetting’ with ‘forgiving’, that is most detrimental?

This question of historical guilt has another facet – it is not only about whether we should apologise, but also whether we are able to. The generations of today had no input into past actions, and therefore an apology is either a reconciliatory mark of separation from past mistakes, or an ‘empty’ gesture with little significance or substance behind it. If an apology is defined as an expression of one’s regret at having wronged another, then it seems not only counterintuitive but disingenuous to deliver an apology for an action that you did not commit.

Nevertheless, if we choose to view an apology as a necessary step towards changing attitudes and actions, an opportunity for education and a sign of mutual respect, it becomes far more significant and long-lasting. A strained apology is hollow and superficial; a sincere apology offers solace and closure, tinged with optimism for a future encompassing different approaches and education.

Tony Blair’s 2006 expression of “deep sorrow” is the closest to an apology for the activities of the empire that the British Government has offered thus far. Meanwhile, in other cheery news, a poll in 2014 revealed that 59% of those surveyed believed the British Empire was “something to be proud of”. It is not that the British Empire was solely a negative influence, but this perception of it as a source of ‘pride’ to generations that had no involvement in its creation or demise seems somewhat confusing.

It is indicative of a flaw in the way in which the education system chooses to portray British history, glossing over the barbaric aspects of British rule and igniting a misplaced sense of patriotism amongst those eager to listen.

The question of whether countries should apologise for their actions remains, and will likely be a contentious issue for many years to come.

It is certain that we can no longer continue to ‘forget’ the events of the past. This approach achieves nothing except fostering a culture of ignorance and misguided ‘pride’. A reformed approach to national education regarding perceptions of Britain’s past is surely the way in which historical guilt can best be addressed. An apology is mere words; the longevity of an informed population with changed mindsets, who no longer believe their homeland is infallible, is undoubtedly more significant.

Let us not feel the ‘need to bury’ the mistakes of the past, and instead use enhanced knowledge and wisdom of our history to create a better future.

Behind Closed Doors: The secret worlds within us…

By Rahi Patel, Year 12.

Have you ever wondered where the common phrase ‘gut feeling’ stems from, or how this meandering tube of smooth tissue could be related to such complexities as emotions?

For centuries Ayurvedic medicine (an ancient Indian branch of medicine) has regarded the gut as the centre of our wellbeing; in modern medical practice, however, this once revered organ has been pushed to the side to make way for the big players: the brain and the heart. Yet recent developments in the medical field are beginning to provide evidence for this ancient theory, showing us that our ‘gut feelings’ truly are significant.

In order to understand this rather counter-intuitive principle we must first establish the functions of our brain: it is the centre of conception and movement, the organ that led us to the discovery of electricity, and the organ that helps us to coordinate the complexities of standing for Mrs Lunnon in a Monday morning assembly. Although we have created a strong link between the term ‘self’ and our brains, exploration of this underrated organ, the gut, suggests there may be more to ‘ourselves’ than what lies behind the eyes.

Our guts possess a multitude of nerves found uniquely in this part of the body. This immediately poses the question: why would such an elaborate and complicated system be needed if the sole purpose of the gut were to create a pleasant journey for food to move from the mouth through to the colon, accompanied by the occasional release of gaseous sounds?

So the scientists amongst us must all be wondering where the evidence is to support these new claims. Well, several experiments have been conducted around the world highlighting the importance of our gut with regard to mental well-being.

The ‘forced swimming test’ is a commonly used experiment to assess the effectiveness of antidepressants. A mouse is placed in a basin of water too deep for it to stand in, so it is forced to swim; mice with depressive tendencies give up swiftly, while those with greater motivation persevere. Most antidepressants tend to increase the time a mouse swims for before giving up. One scientist, John Cryan, decided to feed half the mice with Lactobacillus rhamnosus, a bacterium widely known to be beneficial for gut health. Impressively, the mice with enhanced gut health not only swam for longer, but their blood also contained significantly lower levels of stress hormones.

Cooperation between the gut and the brain, via the vagus nerve, is thus proving to be a promising field for the treatment of mental disorders and diseases. The gut is our largest sensory organ, so it makes sense for the brain to form a close relationship with it to create a detailed image of the state of our bodies, given that ‘knowledge is power’. This understanding is helping to shed light on complex neurological conditions such as depression, as scientists are now aware that there is more to the ‘self’ than the brain, questioning the philosophical proposition of ‘I think therefore I am’… maybe we should adapt this to ‘I eat, then I think, therefore I am’.

Lining the labyrinth of organs known as the gut are approximately 100 trillion bacteria (weighing around 2kg), eagerly waiting to help us break down and assimilate the billions of particles that enter our bodies each day. They also help to produce new vitamins; for example, sauerkraut is significantly higher in vitamins than plain cabbage.

Not only do our bacteria increase the nutrient value of our food, they also advise us on the foods that we should be eating – a perplexing idea, I know! But what we eat is a matter of life and death for our friendly cohabitants, so it only makes sense for them to influence our choices. In order to trigger a craving the brain must be accessed, which is a tough feat considering the armour-like meninges and the blood-brain barrier. Bacteria can synthesise molecules small enough to access our brains, such as the amino acids tyrosine and tryptophan, which are converted into dopamine and serotonin within the brain. Of course this is not the only way in which cravings materialise, but it is far easier to influence our brain with bacteria than with genes, which may help to pave the way for future treatments for diseases such as hypertension and diabetes.

So next time you wonder why you’re craving a tortilla or your favourite brie, just eat it, since 95% of serotonin (the happiness hormone) is produced by the gut and we all now know the significance of ‘gut feelings’ for our well-being!

The long and winding road: how factual recall tests can effectively support linear examination courses


By Emily Anderson, Head of History.

Think back, if you can, to your own History studies at school, whether these were months, years or perhaps decades ago. For most, the content covered becomes, over time, increasingly hard to recall. My current grasp of the French Revolution, for example, which I studied at AS Level, is embarrassingly basic now, almost 15 years later, as it is something I have rarely had need to revisit. At Parents’ Evening, parents smile wryly at vague memories of the Corn Laws or the Spinning Jenny (not meaning to undermine their importance, but their ubiquity in the collective memory of British adults is truly extraordinary) and voice envy at the breadth of opportunities available in the current History curriculum.

Instead, it is the broad conceptual understanding of, say, the nature of power, as well as the skills, that remain, and it is these which lie at the heart of the purpose of History education for our department here at WHS. Empowering our students to participate in the academic discourse of History is our core aim, enabling them to engage critically with the world around them in their future lives. It is, however, impossible to participate in this discourse without what has been termed ‘fingertip knowledge’ as well as more conceptual ‘residual knowledge’: to secure meaningful progress in History, both need to be developed (Counsell, 2000). As argued recently in Teaching History, where dialogue around cognitive psychology is increasingly evident, ‘fluent access to a range of types of knowledge is what enables historians to participate in some of the more sophisticated forms of historical discourse’ (Fordham, 2017).

Recent changes to A Levels (AL) have brought how we secure this fingertip knowledge into focus. The nature of the new linear exams means that a greater volume of content must be retained over a longer period of time. The importance of detail is evident both from reviewing past papers and from our experience of examining at AL last summer.

To approach this, we reflected on our experience of nurturing fingertip as well as residual knowledge at GCSE, where the linear model is, of course, long established, as is our practice of setting factual recall tests at the end of each topic. Our evaluation of the latter is below:

Advantages:

- It is classic retrieval practice, which results in stronger storage and retrieval strength (Fordham, 2017).
- It encourages an extra stage of revision early in the course, before more high-stakes testing kicks in for mocks and terminal exams, reducing the pressure on Year 11.
- It helps lead to great results (above 75% A* in the past three years).

Disadvantages:

- Our tests were much too challenging – they became notorious amongst our students and sapped morale.
- They were no longer fit for purpose – pupils would never need to recall such specific detail, especially after the reform of the CIE IGCSE Paper 4 in 2015, which removed such questions.

Therefore, we have changed the structure of our tests to open-ended questions. At IGCSE these are in the style of 4-mark recall questions. At AL I am experimenting with questions taking the form ‘cite two pieces of evidence which could be used to support an argument that…’, or similar. To try to tackle the issue of relevant but vague answers, I have awarded bonus marks at AL for detail, to encourage both a conscious choice in selecting evidence (as pointed out by Foster & Gadd (2013)) and in-depth revision. All are now out of a uniform mark – 20 – to encourage comparison across topics and at different stages of the two years.

Furthermore, we have used the new AL structure to rethink when we test, in order to support maximum recall over the two years. Here, we currently have two approaches: retaining end of topic testing at GCSE in order to keep the advantages identified above, but utilising spaced tests at AL (the benefits of which are argued by, amongst others, Laffin (2016) and Fordham (2017)) by revising and testing existing knowledge on a topic before the next stage of it is covered. This lends itself particularly well to the unit on the British Empire from c1857-1967: in the past few weeks, my Year 13 class have sat tests on the increasing independence of the Dominions and on India, both in the period from c1867-1918, before studying inter-war developments. Students then complete their own corrections, consolidating the learning and identifying areas for development. During the revision period at AL, they can also undertake the same test several times citing different evidence. My 2017 cohort had, at their own suggestion, a star chart to record how many times they had undertaken a test for each area of the course, broadening their evidence base each time.

Whilst I hope that this gives a snapshot of the department’s current and very fledgling thinking, I would be mortified if it were taken to show that we are overly focussed on factual recall testing. We are not. Tests never can and never will be the ‘be all and end all’ in terms of assessing student progress, but approaching them critically can only be a good thing.

References and further reading

Counsell, C. (2000). Historical knowledge and skills: a distracting dichotomy. In J. Arthur & R. Phillips (Eds.), Issues in history teaching (pp. 54-71). London: Routledge.

Fordham, M. (2017). Thinking makes it so: cognitive psychology and history teaching. Teaching History, 166, 37-43.

Foster, R., & Gadd, S. (2013). Let’s play Supermarket ‘Evidential’ Sweep: developing students’ awareness of the need to select evidence. Teaching History, 152, 24-29.

Laffin, D. (2016). Learning to like linear? Some ideas for successful introduction of the new A Levels. Historical Association Conference workshop.

The Rapid Growth of Artificial Intelligence (AI): Should We Be Worried?

By Kira Gerard, Year 12.

“With artificial intelligence we are summoning a demon.” – Elon Musk

In 2016, Google’s AI group, DeepMind, developed AlphaGo, a computer program that beat world champion Lee Sedol at the complex board game Go. Last month, DeepMind unveiled a new version, AlphaGo Zero, which mastered the game in only three days with no human help, being given only the basic rules to start with. While previous versions of AlphaGo trained on thousands of games played by human professionals, this new iteration learns by playing games against itself, quickly surpassing the abilities of its earlier forms. Over 40 days of learning by itself, AlphaGo Zero overtook all other versions of AlphaGo, arguably becoming the best Go player in the world.
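
To make ‘learning by playing games against itself’ a little more concrete, here is a deliberately simplified, purely illustrative sketch of a self-play training loop in Python. It is not DeepMind’s code: the toy game, the random move choice and the empty learning step are hypothetical placeholders for the neural-network-guided search and training that AlphaGo Zero actually uses.

# Purely illustrative sketch (not DeepMind's code): a generic self-play loop.
# The toy game, random move choice and no-op learning step stand in for
# AlphaGo Zero's neural-network-guided search and training.
import random

WIN_LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8), (0, 3, 6),
             (1, 4, 7), (2, 5, 8), (0, 4, 8), (2, 4, 6)]

def legal_moves(board):
    return [i for i, v in enumerate(board) if v == 0]

def winner(board):
    for a, b, c in WIN_LINES:
        if board[a] != 0 and board[a] == board[b] == board[c]:
            return board[a]          # +1 or -1 has won
    return 0                         # no winner (yet, or a draw)

def choose_move(board, policy):
    # Placeholder policy: random play stands in for the tree search
    # guided by the current neural network.
    return random.choice(legal_moves(board))

def self_play_game(policy):
    # The current policy plays both sides of a noughts-and-crosses game,
    # recording every position, player and move for later learning.
    board, player, history = [0] * 9, 1, []
    while winner(board) == 0 and legal_moves(board):
        move = choose_move(board, policy)
        history.append((list(board), player, move))
        board[move] = player
        player = -player
    return history, winner(board)

def update_policy(policy, history, result):
    # Placeholder learning step: the real system trains its network to predict
    # the search's chosen moves and the eventual result of each game.
    return policy

policy = None
for iteration in range(1000):        # play yourself, then learn, repeat
    history, result = self_play_game(policy)
    policy = update_policy(policy, history, result)

Even in this toy form, the structure mirrors the point above: no human games appear anywhere in the loop; the only data the program learns from is data it generates itself.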

Artificial intelligence is defined as a branch of computer science that deals with the simulation of intelligent behaviour in computers, allowing machines to imitate human behaviour in highly complex ways. Simple AI systems are already widespread, from voice-recognition software such as Apple’s Siri and the Amazon Echo, to video game AI that has become much more complex in recent years. It plays a key role in solving many problems, such as helping with air traffic control and fraud detection.

However, many people are concerned that the continued advancement of artificial intelligence could lead to computers that are able to think independently and can no longer be controlled by us, bringing about the demise of civilisation and life as we know it. In 2014 Elon Musk, the tech entrepreneur behind innovative companies such as Tesla and SpaceX, stated in an interview at MIT that he believed artificial intelligence (AI) is “our biggest existential threat” and that we need to be extremely careful. In recent years, Musk’s view has not changed, and he still reiterates the fear that has worried humanity for many years: that we will develop artificial intelligence powerful enough to surpass the human race entirely and become wholly independent.

As demonstrated in a multitude of sci-fi movies – 2001: A Space Odyssey, The Terminator, Ex Machina, to name a few – artificial intelligence is a growing concern among us, with the previously theoretical concept becoming more and more of a reality as technology continues to advance at a supremely high pace. Other figures, such as Stephen Hawking and Bill Gates, have also expressed concern about the possible threat of AI, and in 2015 Hawking and Musk joined hundreds of AI researchers in sending a letter urging the UN to ban the use of autonomous weapons, warning that artificial intelligence could potentially become more dangerous than nuclear weapons.

This fear that AI could become so powerful that we cannot control it is a very real concern, but not one that should plague us with worry. The current artificial intelligence we have managed to develop is still very basic in comparison to how complex a fully independent AI would need to be. AlphaGo’s lead researcher, David Silver, stated that because no human data was used, “we’ve removed the constraints of human knowledge and it is able to create knowledge itself”. This is an astonishing advancement, and signals huge improvements in the way we are developing artificial intelligence, bringing us a step closer to producing a multi-functional general-purpose AI. However, AlphaGo Zero’s technology can only work with tasks that can be perfectly simulated in a computer, so highly advanced actions such as making independent decisions are still out of reach. Although we are on the way to developing AI that matches humans at a wide variety of tasks, there is still a lot more research and development needed before advanced AI becomes commonplace.

The artificial intelligence we live with every day is very useful to us, and can be applied in a variety of ways. As addressed by Mr Kane in last week’s WimTeach blog, technology has an increasing role in areas such as education, and we are becoming ever more reliant on it. Artificial intelligence is unquestionably the next big advancement in computing, and as Elon Musk stated in a recent interview: “AI is a rare case where I think we need to be proactive in regulation instead of reactive… by the time we are reactive in regulation it is too late.” As long as we learn how to “avoid the risks”, as Hawking puts it, and ensure that we regulate the development of such technologies as closely as we can, our fears of a computer takeover and the downfall of humanity need never become reality.

Classic Chemistry Clips – The Beauty of the Practical

By Anthony Kane, Teacher of Chemistry.

Chemistry is, fundamentally, a very exciting and dynamic subject.

Part of the reason for this is the practical work we undertake – this takes two main forms, the class practical and the teacher demonstration.

When thinking about chemistry demonstrations, most students (past and present) will think of bangs, explosions and fire – all good things, but all over rather quickly. Some of you might remember, as I do, the disappointment when a teacher got you excited for a demonstration, only to watch it fizzle and sputter, followed by their despondent “it wasn’t supposed to do that…”

Imagine if we could replay, in slow motion, our favourite demos, to watch the magic of reality unfold frame by frame. Imagine always being able to see the demonstration clearly, regardless of where you were in the class. Imagine if we had a backup in case a demonstration, for whatever reason, went awry. Imagine if we were teaching a different topic entirely, and felt that now would be a wonderful time to illustrate our point with a display, but there was no time to throw it together. (Imagine if you wanted to show all your friends really cool science videos…)

These were the ideas that I had in mind when I started recording demonstrations during lessons at Wimbledon High School. Since then I have put together a catalogue of over twenty videos of common classroom demonstrations, and played them countless times. Using our Windows Surface Laptops, and connecting wirelessly to our SmartBoards, I am able to project what I am recording while it is being recorded.

The advantages are huge. Twenty students cannot all see one small beaker on a desk, but project it to the room and they can all get a perfect view. Sometimes the eye is not quick enough, or we blink, but with a video, we can go back and watch it again. We can slow it down, we can analyse frame by frame, and our learning is richer for it.

“Boom” goes the thermite.

Another aspect of the videos that I think particularly embodies the spirit of learning here at Wimbledon High is the sheer joy that students find in watching these demonstrations. “Ooh”s and “Ah”s are just as gratifying on recording as they are the first time you listen to them live in a lesson. One of our stated aims here at Wimbledon High is to nurture curiosity and a sense of wonder, and listening to some of the clips below, I hope you would agree that we are doing just that.

“Woah!”

“WOAAAAH!”

Where does this leave the future of chemical education? I think that the next logical step would be to record the methods of class practicals – so that these videos can be distributed to students in advance and set as required viewing for the lesson. This would empower students to feel more confident with their equipment, to have more time in the lesson to gather data, and to have more belief in their own abilities as scientists, encouraging their independence as learners. It would also prepare them well for scientific disciplines at university, which often require you to familiarise yourself with pre-lab exercises before entering a laboratory.

This is also a promising avenue for developing school partnerships. These videos are broadly applicable to many chemistry curricula, and we are fortunate at Wimbledon High to have excellent facilities and lab technicians. Sharing the fruits of our chemical labour is quick, easy, and importantly very beneficial to the education of others. I have already begun sharing my collection with another school and look forward to increasing their reach as time goes on.

Science is a practical discipline, and chemistry is a particularly visual subject. By offering students more opportunities to experience its beauty, we open them up to a world of possibilities – an exciting pathway to a deeper understanding of the universe, a subject both big and small, with deep history and philosophy, heroes and villains – and we instil in them a lifelong appreciation for nature.

Developing WIMlevels and a new model of assessment

By Paul Murphy, Deputy Head Academic.

Perhaps Plato’s desire to ensure an expert mariner sails the ship in which you travel is a more striking illustration of the need to appoint and trust experts to do their business than pointing out that, when it comes to how children learn, it is the teacher who is best placed to deliver students from the metaphorical storms they must weather. Although applying a gentle rhetorical massage to a critique of the character of democracy in fourth-century BCE Greece is probably poor soil from which to begin an explanation of how Wimbledon High School has re-visited its own model of education, assessment and academic support, it captures the essence of our basic approach: in lieu of an accessible, clear and viable set of examination criteria and grade boundaries (which in any case differentiate, rather than provide a guide for how to educate), we as a common room turned to each other, to pedagogical expertise and to our (extensive) experience to decide how best to support our girls throughout their time on Mansel Road.

Cherie Blair, a champion (albeit self-proclaimed) of the under-educated, noted in a speech on the subject that “someone with 4 A grades at A-Level from [a famous Public School] may look good on paper…but push a bit harder and often you get the impression they have learned to pass exams rather than think for themselves”. Although I risk being (and indeed am being) highly reductive, it is my firm belief that learning to pass examinations, although a valuable skill (the most valuable in terms of future earnings, beside inheritance), really only teaches you to do precisely that: pass examinations. To consider the Junior perspective, SATs do not help GCSEs, which in turn offer little skill-based progression to A-Level alone. Data shows us that students who do well at GCSE tend to take well to A-Level. This does not mean GCSE is good preparation for the higher discipline; both are differentiating measures, and so it is likely that doing well at one measure of capacity and intellect will see you do well at another. The same is true of the jump from A-Level to degree, at least in terms of skills (I should note that some studies do link outstanding A-Level performance to first-class degrees, and that I, of course, write generally and for emphasis). Examinations do prove that learning has occurred, and are a basic requirement of universities and employers, so we had them keenly in our focus as we developed our model, but they were certainly not the focal point. Outstanding examination results are intended to be the happy by-product of focussed, considered, subject-specific and synoptic education (not the oxymoron it might at first appear).

My (internal) starting point when opening discussions with Heads of Department and staff was the work of Piaget (now a rather unfashionable educational philosopher, despite his respected grounding in child psychology). Piaget found, in the 1920s, that children’s power of reasoning was not flawed after all. In areas where children lacked life experience as a point of reference, they logically used their imagination to compensate. He additionally concluded that factual knowledge should not be equated with intelligence and “good” decision-making.

Over the course of his six-decade career in child psychology, Piaget also identified four stages of mental development. “Formal operations”, the fourth and final stage, involves 12-to-15-year-olds forming the ability to think abstractly, with more complex understandings of logic and cause and effect. This is when he considered the brain to be at its most plastic in terms of learning beyond mere knowledge (later theorists have not, in my view, successfully challenged this developmental stage), though, of course, as I noted above, he felt knowledge was still essential for positive outcomes. I was keen, therefore, that our system of assessment, our schemes of work and our developmental model should more consciously build undergraduate skills, concepts and modes of working from Year 7. There were, of course, many of these elements in existing assessment models and schemes of work, but we needed greater clarity and accuracy (and indeed conviction) about what such skills were, and how they could be developed, taught and assessed over a seven-year period in each subject discipline (until our education system’s conception of subjects as disparate areas of study subsides, subject-specific skills will be the way of things in the United Kingdom).

So, the first step was to communally identify our goals, which was relatively straightforward. In a meeting with a key team of HoDs and SMT members, we thrashed out the key aims we would like to use to frame our assessment policy. Of course, like all good discussions, concord was neither complete nor decisive (and, like all chairs of such discussions, I am conscious my own starting point will have coloured the outcome), as our thoughts will be subject to change and amendment as greater understanding comes forward. We settled on two themes. First, our key idea would be the pursuit of scholarship, with an “end-goal” of providing every student with the tools and skills to thrive at a top university, conservatoire or other tertiary institution (our context precludes an immediate focus on work at 18 for most). Second, each department would draft its own set of progressive criteria, describing in detail the “threshold concepts” that mark the distinctive steps in understanding each subject more fully and completely, and using their extensive experience to explain to parents, girls and themselves what these moments were, which skills a girl was currently able to use, and which she was working towards next on the ladder to becoming a capable undergraduate. As such, the skills required in Year 7 had to be mindful of ensuring the skills required at university were developing in the right way, and our highest “progress levels” are beyond the requirements of GCSE and A-Level respectively.

A good “threshold concept” example (elaborated for all HoDs in a session we held with Ian Warwick, an educational consultant who focusses on the academic development of highly able pupils) is the moment at which a student of English Literature first recognises that the characters are fabrications, and that the author deliberately writes to create and develop them. Without this step, analysing literature is at best comprehending the narrative of a story; with it, a world of opportunity opens. We tasked all HoDs to work with their departments to find all such steps and progressions which students undergo during their secondary and further education, and to stage these progressions in a table which demonstrates them. An example (English Literature) is at Appendix A. A note must be made here of the elasticity and dedication of the staff involved in the development process; to hold this close a microscope to your methods of assessment is difficult and challenging in the current political climate, where examination pressure can so easily trump educational goals.

A two-year process was devised for the development of these threshold-concept progress tables, with a view to the new model being adopted in Years 7-10 from September 2017, and across the whole school from September 2018. The first part of the model has been drafted and implemented, with our first (internal) reporting assessment scheduled for October. The model breaks progress down further into these threshold concepts and skills progressions, with separate descriptors for “skills” and “concepts and ideas”, so that girls, parents and teachers can all clearly identify and track the progress of a student with accuracy and confidence, whilst also showing students what they need to do next in order to progress. The rationale for a dual-descriptor approach (see again Appendix A) was based on both practical evidence (a similar model is already in use, and has proven very successful, at the flagship Harris Westminster Sixth Form) and educative and psychological theory, where the ability to understand and the ability to do remain distinct concepts (see Naglieri, Goldstein or notably Brooks (in Psychology Today)) that require acknowledgement, assessment and explanation in their own right. Each threshold has been standardised using internal moderation, cross-reference with standards in the reformed GCSEs being undertaken in various subjects (our A-Level draft is pending) and also by relying heavily on the pedagogical knowledge and experience of the Wimbledon High staff. Departmental meetings remain the epicentre of good teaching and learning, and it is from them, in combination with educational theory, that this system was devised.

The system has also sought to allow departments the freedom to devise schemes of work in a way which keeps subject-specific skills at the core of our academic offering. The model moves away from collective assessment weeks and towards a fluid style of assessment, where teachers’ overall opinions of a pupil’s progress are combined with punctuated and careful written assessments that allow pupils to display and develop skills beyond those expected for their age range, without sacrificing the need for clear, identifiable points of progress. MidYIS (despite its inaccuracies, it remains the best available baseline data from a test scenario) forms the basis of our initial projections for pupil progress on our scale, but it is by no means the main driver over time, as yearly pupil targets will be clear, fluid, subject-specific and, most importantly, highly individual. Progress up our various “WimLevels” will be tracked half-termly, without the need for cumbersome reporting systems, and we hope that it will focus our girls on the simplest goal in self-improvement: which step must I take next to get better? Our Assistant Head, Performance, has devised a specific flight path for each girl’s projected progress, both intra-year and year-on-year, which can be amended based on achievement should the demon MidYIS be proven too miserly a tool.
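
Purely by way of illustration (the flight path is a planning tool rather than a piece of software, and the level scale, the six half-termly checkpoints and the numbers below are all invented for this sketch), the arithmetic behind such a projection, and the half-termly comparison against it, might look something like this in Python:

# Hypothetical sketch of a 'flight path': interpolate expected levels from a
# baseline projection to a yearly target, then compare recorded levels to them.
def flight_path(baseline, target, checkpoints=6):
    # Expected level at each of six half-termly checkpoints in the year.
    step = (target - baseline) / checkpoints
    return [round(baseline + step * (i + 1), 1) for i in range(checkpoints)]

def review(recorded, expected):
    # Report the gap at each half-term, keeping the focus on the question
    # "which step must I take next to get better?"
    return [(ht, a, e, round(a - e, 1))
            for ht, (a, e) in enumerate(zip(recorded, expected), start=1)]

# An invented Year 7 pupil projected to move from level 2.0 to 3.2 this year.
expected = flight_path(2.0, 3.2)
recorded = [2.2, 2.4, 2.5, 2.9, 3.0, 3.3]    # levels recorded each half-term
for half_term, actual, target, gap in review(recorded, expected):
    print(f"Half-term {half_term}: recorded {actual}, expected {target}, gap {gap:+}")

The real model, of course, layers professional judgement on top of any such projection, which is exactly why the flight path can be amended where the baseline proves too miserly.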

The finished product means that all girls, parents and staff will receive a clear, robust message about the skills they have developed and the concepts they have learned, every half-term and in every subject. It will inform scheme-of-work planning, assessment, intervention, tracking and teaching, setting our goals as classroom practitioners based on mastery and excellence in the subjects we are teaching, with fantastic examination results little more than a by-product which proves that we are ensuring our girls are always learning and developing academically in the best possible way.

Mr Paul Murphy

Deputy Head (Academic)

19th October 2017


“British policy towards India changed completely from 1857-76.” How far do you agree?


By Ellie Redpath, Year 12.

The Indian Mutiny of 1857-8 resulted in a change in British policy towards India from an idealistic view, with the hope that India would one day become civilised enough under British rule to self-govern, to one of resigned moral duty coupled with a heightened awareness of the need to cement the security of the British Raj. However, it did not result in the complete eradication of the previous policies employed under Company rule. When policy is defined as the changes made by the British government with regard to the military system and administrative government of India, and the changes to economic strategy, it becomes apparent that policies were altered in order to avoid provoking a revival of violence through the imposition of Western ideology on the indigenous people. Normality for the Indian people remained largely the same as before the Mutiny; these policies were introduced solely as insurance that the events of the Mutiny would never be repeated.

The differences to the administrative government of India implemented after the Mutiny can ostensibly be seen as drastic, yet in reality resulted in little change other than to consolidate the restriction of the power of the indigenous people. An Indian Civil Service was created and the Governor General renamed the Viceroy, creating an illusion of the upheaval of the East India Company’s governance. Yet despite the change in title, the new Viceroy of India was in fact the same man who had been Governor General, Charles Canning, and he largely took on the same role as before 1857. The only tangible alteration was that he worked for the Government rather than the Company. Moreover, the Indian Civil Service was composed mainly of white British men, and whilst indigenous people were not prohibited from joining, the entrance tests were held in London, making entry near impossible; this had barely changed several decades later in 1905, when a mere 5% of its members were men from Bengal. The creation of the Civil Service therefore only served to strengthen the administrative control of the British over the Indians by limiting how much influence Indians had over their own government. Another ostensible change introduced by the British government was the return of authority to the indigenous rulers of the princely states, a reversal of Dalhousie’s Doctrine of Lapse. While this appeared to be an extreme shift from Britain’s policy pre-Mutiny, the Princes overwhelmingly complied with British legislation, and the restoration of their power made little difference to everyday life; the British government gave back their former entitlements solely because doing so appeared to respect tradition. A considerable amount of bitterness had developed in recently annexed states such as Oudh, so this difference in policy was expected to help pacify the indigenous people and prevent future uprisings. Ultimately, the British changes to the administrative rule of India were not as severe for the majority as they might seem at first glance, and were made principally to cement British rule and influence in the subcontinent.

Britain’s modifications to the structure of the Indian military were slightly more radical, because it was sepoys in the East India Company’s army who had begun the Mutiny; to avoid a repeated occurrence and to confirm that Britain held power over the army, it was necessary for Britain to change its military organisation in a more extreme fashion than it changed administrative or economic policies. In order to prevent the recurrence of a similar incident, the religions and castes of the regiments were mixed to cut off any sense of unity against the British. This was intended to avert a situation like that of the Brahmin caste before the Mutiny – members of the elite Brahmin caste were forbidden to travel across the sea, yet this custom was often overlooked or ignored by British generals, leaving them to harbour resentment against the British. In addition to this, eighty-four per cent of regiments in Bengal (where much of the resistance had originated) were replaced, in order to defuse any remaining tension in the area between the sepoys and their white officers. The number of British officers supervising a sepoy regiment was increased, and weapons were left under British control when not being used directly in battle, to ensure that any violence that broke out amongst sepoys would not immediately endanger the British generals. However, whilst more changes were enacted in regard to the Indian military than in Britain’s administrative or economic policy, they were almost all made with the objective of inhibiting the escalation of future conflicts between sepoys and their officers into full-scale revolutions. The claim could be made that because sepoys were treated with greater respect after the Mutiny, Britain’s aim was not to assert control over the Indian troops or remain distant from them, but rather to foster amiable relations between officers and their soldiers; yet this was another strategy used by Britain to create an illusion of interpersonal respect and avoid further provocation of the indigenous peoples. Hence the military strategies of the British towards India only changed significantly because they were the most relevant in preventing the recurrence of a mutiny.

The changes to British economic policy towards India were not a complete reversal of policy under the East India Company; yet again, the changes that were made were directed towards curbing the economic progress and industrial independence of the indigenous people to secure British control over India. The British built over 3000 miles of railway after 1857, a vast distance compared to the mere 288 miles built under Company rule. This development, whilst not entirely new – railway lines, albeit over short distances, had already existed before the Mutiny – simultaneously benefitted British trade, as it allowed the British to transport their goods over greater distances, increasing their wealth relative to the Indian economy, and allowed British troops to reach and crush any uprisings in remote areas much more quickly than they would otherwise have been able to. While one could argue that developing and promoting industry in remote areas was an equally important reason for the construction of railways, and thus that their purpose was not to consolidate the British Raj, Britain’s economic policies actually intended to hinder India’s industrial growth. The recently introduced policy of free trade made it far easier for Britain to flood India with inexpensive British-manufactured goods, for which India would often have provided the raw materials. For example, India produced raw cotton for export to Britain, yet its textiles industry was crushed by imports of cheaper British cloth. India’s economic development was hence restrained: it remained reliant on exports of raw materials to Britain, but had no protected market in which to sell its own manufactured goods, so its own industry could not flourish when faced with British competition. Britain was therefore kept economically superior to India, securing its power over the country, whilst India was kept dependent on British trade for its economy to survive, strengthening its ties to Britain. Therefore, Britain’s economic policy changed somewhat after the Mutiny, with the addition of railways to hasten the transportation of troops and the import of British manufactured goods to limit Indian industry; however, because railways had first been developed by the East India Company, the adjustments were made only for the purpose of security over the region, and were not so extreme that one could state that policy changed completely.

To conclude, the Indian Mutiny resulted in Britain altering its policy on India from one of forced Westernisation, with the ultimate aim of India achieving self-government, to one primarily focused on retaining British control and security in the subcontinent. However, outside of this shift in emphasis, little was changed, for life itself was not made radically different for the indigenous people; instead, the differences were precautionary, made to avert the recurrence of brutality and ensure Britain remained the dominant power in India.