Lucia Flaherty, Teacher of English, reviews the podcast ‘Trialled and Tested’, in which Jamie Scott and Alex Quigley explore how students must learn to verbalise the process of metacognition early.
‘Metacognition is intuitive […] We just need to give it a language’ – Alex Quigley
This week, in a bid to think about metacognition off screen, I have been listening to the podcast ‘Trialled and Tested’. In the first episode, Jamie Scott and Alex Quigley explore what metacognition and self-regulation are and how they can be implemented in the classroom. There was more food for thought in the podcast than a review can cover, so I’ve focused on what resonated most with me: the language we can use to talk about what metacognition looks like in the classroom.
Alex Quigley is quick to note the value of metacognition with the impressive statistic that it can provide ‘7 months of additional progress in 12 months’ when students use metacognitive strategies effectively. The problem is that a surprising number of students are rather poor at metacognitive skills. Consider the default revision method (used even by university students) of reading over and highlighting notes, when this has been shown to be a very ineffective strategy.[1]
To help solve this, Quigley believes that students must start metacognition early and learn the language to verbalise what is an intuitive process. To this end, he defines a three-stage process that he refers to as ‘metacognitive regulation’.[2] It is simply:
Plan
Monitor
Evaluate
These are things we do in our daily lives such as planning to take an earlier bus so that we are not anxious about being late to work. We monitor what the traffic is like and whether we should change to walking instead. We then evaluate whether our journey was a success. Did we arrive on time? Would we take that bus again?
This is a process that both teachers and students go through in lessons all the time, but Quigley says that the trick is to verbalise it. He noted how the same process looked in ‘the best Art lesson he ever saw’.[3]
Plan: The teacher verbalises the planning process by introducing the task and discussing the strategies needed to draw a self-portrait. What tools should we use? Why is a pencil best? How did I prepare for this drawing?
Monitor: The teacher would model a self-portrait and monitor what he was doing to create the art in real time. What shapes are being used? How should the pencil be held? How did I know where to start?
Evaluate: At the end, students and teacher evaluated the drawing. What are the successes? What would you change? Was the process clear? Did you struggle, or was it seamless?
Coming from the land of teacher training that talked in ‘starters’, ‘objectives’, ‘main activity’ and ‘plenary’, I rather prefer Quigley’s language for the process of learning and how to structure a lesson that puts metacognition at the heart of it.
Lucia Flaherty
[1] Jeffrey D. Karpicke, Andrew C. Butler & Henry L. Roediger III (2009) Metacognitive strategies in student learning: Do students practise retrieval when they study on their own?, Memory, 17:4, 471-479, DOI: 10.1080/09658210802647009
Spring Focus: Metacognition – Computer Science Skills
In this gem, I will be looking at the thinking skills that are taught as part of the Computer Science curriculum and the ways in which they are taught. I hope that by sharing our ideas, we can start to think of problem solving as a set of skills used across a range of subjects. Metacognitive skills are key to the study of computer programming. When encountering a new task, novice programmers are likely to concentrate on the superficial details of the problem, failing to break it down into manageable sub-tasks and trying to solve the whole problem in one go. We often see this in our lessons, and I’d be really interested to hear if any other colleagues encounter similar issues or use similar skills in their subjects.
Metacognition Skill 1: Decomposition
Decomposition is the process of breaking a large problem down into progressively smaller “chunks”, making it easier to solve. By the time they complete the GCSE course, students should be comfortable with these steps. In order to promote this at GCSE, students develop this skill in three ways:
At the start of the course:
After introducing the concept of decomposition, students are asked to create an overview of the parts of their favourite board game. This gets them to take an algorithm (the set of steps defined by the rules) and think about it in a different way.
Further on in their learning, the class will be asked to attempt a decomposition diagram, working collaboratively to spot the key components of the problem. This work is not marked, nor do they have to follow a set format; it simply acts as their plan for the task.
Finally, at the end of a project, the class is given a solution prepared by the teacher. Their task is then to reverse-engineer the decomposition diagram, so that they can follow the thought process used and begin to do it by themselves in the future.
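Decomposition looks much the same once the students reach actual code. The sketch below is purely illustrative (the scoring rules and function names are made up, not taken from the course materials): a “large” problem is solved as a composition of small sub-task functions rather than in one go.

```python
# Illustrative sketch of decomposition: "report a player's board-game
# score" broken into sub-tasks, each solved by its own small function.

def roll_total(rolls):
    """Sub-task 1: add up the dice rolls."""
    return sum(rolls)

def apply_bonus(total, landed_on_bonus):
    """Sub-task 2: double the total if the player hit a bonus square."""
    return total * 2 if landed_on_bonus else total

def format_score(name, score):
    """Sub-task 3: present the result."""
    return f"{name}: {score} points"

def report_score(name, rolls, landed_on_bonus):
    # The original "large problem" is now just the sub-solutions composed.
    return format_score(name, apply_bonus(roll_total(rolls), landed_on_bonus))

print(report_score("Asha", [3, 5, 2], True))  # Asha: 20 points
```

Reverse-engineering a decomposition diagram from a solution like this is simply reading the call structure from the top-level function downwards.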
Metacognition Skill 2: Abstraction
Abstraction is the skill of removing unnecessary detail, allowing the programmer to focus on the important parts of the problem. A famous example of this is the London Tube map: Harry Beck realised that the geographical positions of the stations were unimportant, so his map focused on the order of stations and highlighting interchanges, using only approximate locations (compare a geographically accurate Tube map and see how much harder it is to follow).
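Beck’s insight can be sketched in data-structure form (the coordinates below are illustrative placeholders, not real survey data): the geographic detail is discarded, and only what a rider needs survives.

```python
# Full geographic detail (illustrative coordinates):
stations_geo = [
    ("Westminster", 51.501, -0.125),
    ("St James's Park", 51.499, -0.134),
    ("Victoria", 51.496, -0.144),
]

# The abstraction: throw the coordinates away and keep only what a rider
# needs -- the order of stations and which of them are interchanges.
line = [name for name, _lat, _lon in stations_geo]
interchanges = {"Westminster", "Victoria"}

print(" -> ".join(line))  # Westminster -> St James's Park -> Victoria
```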
In this activity, students are paired, with one partner blindfolded. The partner who can see is given a photograph (of a bird, for example) and has to get the blindfolded “artist” to recreate the picture as accurately as possible. The results are often comical, occasionally hilarious and always excite some sort of comment. After a couple of iterations, the class is asked to reflect and discuss how they made it easier to describe the image to their partner. Many of them will respond with ideas such as “I told her to draw a circle the size of a 10p”, and this can lead us into the concept.
Metacognition Skill 3: Mental Mapping
In creating larger software projects, it’s important to consider how users will interact with the solution; the user will create a mental map of software, giving them an idea of where they are, where they need to go and the way back to the beginning. The class are asked to close their eyes and count the number of windows in their house (some of the numbers shocked me when I first asked this in a private school). After asking for their responses and writing them on the board, they are asked to forget about the number and to describe the process they went through. Were they inside or outside? Which room did they start in if they were inside? Did they fly around the outside? This allows us to explore the idea that they have a mental model or map of their house in their heads. This can be broadened out into directions to their nearest train station or supermarket. Then we look at the steps involved in performing everyday computer tasks, such as writing a letter in Word. Using these examples, the students then design their solution.
Why these ideas are useful…
By introducing the skill in a non-technical and familiar situation to begin with, we can avoid overwhelming the pupils with new terminology
Instead of this being something new that the students feel they have to acquire, we can give them the idea that these are skills that they already possess and with practice can develop
It allows them to develop their confidence in the face of unknown problems and to draw out the similarities between tasks
Although these are Computer Science examples, they can be applied to other subjects:
Planning a project or research task by splitting it into easy-to-achieve tasks
Describing concepts to others in a simple and concise way
Claire Boyd, Head of Junior School, reflects on the process that brought about the inception of Adventum, the new Junior School philosophy-led academic curriculum.
Education, like so many other areas of life, is not immune to the comings and goings of fashions and trends. What is en vogue one decade can be reviled the next. When I was qualifying to teach back in the early 00s, my evangelical tutors waxed lyrical about ‘The Literacy and Numeracy Hour’, the silver bullet, as they saw it, for guaranteeing educational success in classrooms across the country.
When it was launched in 1998, this highly prescriptive, minute-by-minute approach to teaching daily Maths and English lessons provoked the then-Education Secretary, David Blunkett, to promise to resign in 2002 unless “80% of 11-year olds met the expected level in their end of Key Stage 2 SATs tests”[1]. Alas, by 2010, when I was mentoring new teachers through their training myself, the tide had turned – rather unceremoniously – against the Literacy and Numeracy Hour, and nothing as rigid and straitened as that has earned a trainee teacher their stripes since.
Just a few moments scrolling through the most popular Edu-Twitter accounts today will lead you to believe that frequent retrieval practice, regular low-stakes testing and knowledge organisers hold the key to success, just as Blunkett’s beloved Literacy Hour did twenty years ago.
When it comes to deciding how to craft a curriculum imbued with the integrity, longevity and depth to withstand the test of time (or at least see a good few cohorts reap its benefits), you need something that will not only deliver exceptional educational outcomes but will also stand resolute as other trends come and go around it. Between September 2019 and January 2021, this preoccupation loomed large over my team and me, as we sought to overhaul our curriculum and breathe new life into what we teach and how we teach it, as well as, most importantly, consider why we teach what we teach.
Launched to our pupils at the start of the Spring Term 2021, Adventum (named in tribute to the spirit of adventure that rests at the heart of the Junior School) is the net result of this process in action. Over the course of four terms, we went from asking ourselves where the value lay in what we had been teaching and which aspects were delivering excellent outcomes to what we wanted for the next generation of our Junior School learners.
Wimbledon High – Reception Class
Our curriculum building process began at the end, rather than the beginning, by considering what we wanted the legacy of our curriculum to be. What did we want our pupils to take away with them when they finished seven years engaged in our bespoke curriculum and its related lessons? It was by no means an easy question to answer, and we worked through a range of iterations of legacy statements before asserting that we aim to instil in our learners a love of wisdom, integrity of thought and the social awareness to act with compassion, confidence and agency, leaving our girls filled with a desire to grapple with and overcome the challenges presented by the world in which they are growing up.
With this in place, we then felt a close and immediate connection with the potential a philosophy-led curriculum could provide. Exploring existing research on philosophy-driven curricula drove us to agree emphatically with Lipman that “every subject seems easier to learn when its teaching is infused with the open, critical spirit and logical characteristic of philosophy.”[2] It is only by fostering a curriculum that elevates thinking above the transmission of knowledge that we will truly equip the young minds in our care with the ability to use the knowledge and skills they acquire to contribute meaningfully to shaping the world around them.
When considered alongside the capabilities of our eager learners, Adventum began to take shape around a foundation of provocative thinking, intellectual disruption, critical questioning and increasing levels of self-knowledge. Rather than being tied too closely to a means of delivering content over time in an efficient and sufficient manner, we worked hard to look for ways that the discovery of knowledge and skills could be fused together to help strengthen connections and schema building, whilst responding naturally to the innate predisposition all children have for asking questions, for challenging and for seeking out possibility. We looked for a practical way to take the structure and progression of the National Curriculum – in which we recognise inherent value – and align it closely with a programme which gives space and breadth for the thinking, contemplation and sequence of discoveries that relate directly to reasoning; there is indeed “no point in teaching children logic if one does not at the same time teach them to think logically.”[3]
So, half a term into the implementation of Adventum, what are our girls experiencing? Each sequence of lessons is rooted in a philosophical question that provides a focus for the learning that term. The questions are posed simply, yet are designed to provoke perplexity of thought while engagement levels are high.
Adventum begins by providing an introduction to metaphysics (understanding ourselves), moving through to develop an understanding of aesthetics (appreciating the natural world) and culminating with the complexities of ethics (wrangling with the moral dilemmas of life). This term sees Reception wonder what makes a good character, Year 3 ask if colour plays a part in our identity, and Year 6 consider who decides the status quo around us. With the humanities, science, art and music interwoven into the exploration of these questions, high-quality and ambitious texts provide the important context required to interrogate the big questions being asked of our bright minds. Where the aim of philosophy writ large is to cultivate excellence in thinking, Adventum has been crafted to spur our girls on to examine what it is to think historically, musically and scientifically.
Whilst we do not expect Adventum to exist in a pedagogical vacuum, unchallenged and unaffected by progress in education and child development, it is hard not to feel that the provenance of the thinking that has gone before sets us in good stead. So here is to the adventure of asking big questions of big minds and inspiring big thinking from Early Years onwards.
References:
[1] After the Literacy Hour: May the Best Plan Win, Centre for Policy Studies, 2004, p. 1.
[2] M. Lipman, Philosophy Goes to School, Temple University Press, 1988, p. 4.
Alex in Year 13 writes a short introduction to foreign aid, highlighting some of the successes and problems that can appear from the charitable act of giving.
Instilled in us from a young age is the principle that we should help those in extreme need. And what could be simpler? From charity mufti days to bake sales, this idea underpins social behaviour in our modern day. It is indeed this principle that drives support for foreign aid.
In the words of Roger Riddell, ‘The belief that aid is a ‘good thing’ is sustained by the assumption that the resources or skills that aid provides do make a difference to those being assisted’. However, the impact of such aid on recipient countries is not always as positive as it may initially appear.
As the effects of climate change increase the frequency and severity of natural disasters, we often see foreign aid spent in emergency form. Altruism of this kind is uncontested, as in the short term these humanitarian responses are overwhelmingly positive. However, it is with sustained aid that potential problems arise.
Over time, foreign aid has expanded from small beginnings to become a large and complex global enterprise. Development cooperation (as foreign aid is also called) is now established as an integral part of international relations, with donor countries expected to contribute at a UN target rate of 0.7% of their gross national income. For the UK, this sum stood at £14.6 billion in 2018. As can be seen from the charts below, few countries meet this target.
Net development assistance by country (total million US$ in 2015)
However, if we compare the data above with giving measured as a percentage of GDP, a rather different picture emerges:
For many people, this huge economic contribution to foreign aid and development is a triumph in the world of humanitarianism and society as a whole. An ethical theory linked closely with the topic of foreign aid is utilitarianism. To put it simply, this is the notion that ‘moral life should be guided by the objective of trying to achieve the maximum happiness for the greatest number of people.’
As stated by Paul Streeten, ‘a dollar redistributed from a rich man to a poor man detracts less utility than it adds, and therefore increasing the sum total of utility’. This argument is comprehensible and easy to wrap your head around, which explains why foreign aid is so often short-sightedly seen as a win-win situation.
Unfortunately, this is not always the case. There has been evidence of several key factors that can inhibit the aggregate impact of foreign aid.
The first problem arising with aid is its potential for misuse. Additional resources in the hands of potentially corrupt governments are significant impediments to optimum utilization of funds. This is because the fungibility of aid could enable the financing of non-developmental projects against the interest of the population. Hence, aid itself has, in some cases, the perverse ability to create negative effects on recipient economies.
Secondly, there are limits associated with aid and a country’s absorptive capacity. As the volume of aid increases, it is subject to diminishing marginal utility. In basic terms, the effect is as if I gave you one chocolate bar that you enjoyed consuming. And perhaps a couple more wouldn’t do you any harm… but once I’ve given you 100 chocolate bars, each individual bar’s worth has decreased along the way. In this way, it can be seen that after a certain point (called the absorptive capacity threshold), providing more aid becomes completely ineffective.
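The chocolate-bar intuition can be made concrete with a toy calculation. The sketch below assumes logarithmic utility, a textbook stand-in for diminishing returns rather than a claim about any real economy: each successive unit adds strictly less than the one before it.

```python
import math

# Toy model of diminishing marginal utility, assuming log utility
# (an illustrative assumption, not a model of real aid flows).
def utility(chocolate_bars):
    return math.log(1 + chocolate_bars)

# Extra utility gained from each successive bar:
marginal = [utility(n + 1) - utility(n) for n in range(5)]
print([round(m, 3) for m in marginal])  # each gain is smaller than the last
```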
Finally, fluctuations in aid inflows are external shocks to vulnerable economies, which plan expenditures based on promised aid commitments. When a highly dependent country’s aid is not given in full, this can damage future growth prospects significantly.
From all this, we can gather that the future of aid-giving and its associated policies may need modifying to ensure aid is given and used in the most efficient and appropriate ways possible, enabling it to help those who are most in need.
Spring Focus: Metacognition – students selecting and organising the whole class revision plan
Teaching and learning Gem #29: Planning the Revision Process/Logging Progress
In this gem, I will be taking you through the way in which we use the girls’ own confidence ratings to plan the revision and teaching schedule in Computer Science, as well as promoting the idea of tackling your weakest topics first.
This Friday Gem was, in part, gifted to me several years ago on a course. The Chief Examiner for Computer Science at the time (pre-Govian A-levels) claimed that it should be possible for a student to fully revise for the A-level in a single hour, as long as they prioritised their revision effectively. Although I never did subscribe to that timeframe, I noted that students often simply start at the beginning of the specification and wade their way through to the end, rather than targeting the trickiest topics before fatigue sets in!
First Review
After the Computer Science exam classes have finished the specification (this is usually just after Autumn Half Term), they have a single lesson where they are asked to give their gut reaction to the topics on the syllabus, in order to inform our planning of revision topics going forward.
They are provided with a grid containing all of the spec points from the syllabus, and a booklet full of revision questions which they can use as a stimulus for discussion. Working collaboratively, they discuss the specification points, look at the questions and rate their confidence on each topic (a score out of 5) by completing their column in the table.
Why it’s useful…
Taking these numerical snapshots of the students’ confidence lets the students:
Understand their areas of strength and weakness
Discuss the topics and practice exam questions with their peers, to further their understanding
Feel more confident about the approaching assessment, as they look at more examination style questions and understand the types of questions and skills required
Find reassurance when all of their peers rate a topic with a low score
It also allows us to put the scores in a spreadsheet:
We can calculate an average student understanding for each topic
Sorting the syllabus from lowest to highest average, we plan our revision lessons to tackle those topics which the students are most concerned about first
We can also take an average per student and use this to identify anyone who needs a pep talk or who may need extra support.
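The spreadsheet step above can be sketched in a few lines of Python. The topics, student names and scores below are made up purely for illustration; the point is the two averages described above: per topic (to set the revision running order, weakest first) and per student (to spot who needs support).

```python
# Illustrative confidence grid: rows are spec topics, columns are
# students, cells are confidence ratings out of 5.
scores = {
    "Binary arithmetic": {"Amy": 4, "Bella": 5, "Cara": 3},
    "Recursion":         {"Amy": 2, "Bella": 3, "Cara": 2},
    "Networking":        {"Amy": 3, "Bella": 4, "Cara": 4},
}

# Average confidence per topic, lowest first: the revision running order.
revision_order = sorted(scores, key=lambda t: sum(scores[t].values()) / len(scores[t]))
print(revision_order)  # ['Recursion', 'Networking', 'Binary arithmetic']

# Average per student, to spot anyone who may need extra support.
students = sorted({name for ratings in scores.values() for name in ratings})
for name in students:
    avg = sum(ratings[name] for ratings in scores.values()) / len(scores)
    print(f"{name}: {avg:.2f}")
```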
Towards the End of Revision
The class comes back to the table and we repeat the process. Students are able to see their progress, having hopefully driven all of their confidence scores higher, which should help prove to them that their hard work has paid off.
Have you ever walked into a classroom and made an initial judgment which you can’t seem to amend? Perhaps when we make initial observations, we are comparing two things and judging their similarities. If our judgments are distorted by perception, how can we be sure that our decision making is having a positive impact on teaching and learning? This is why it is so important for us to think first about why we think the way we do. Not only will this reflection allow us to consider how we come to make judgments, but it will also make us factor in the unknown in our decision making.
The Undoing Project – Michael Lewis
On each round of a game, 20 marbles are distributed at random among five children: Alan, Ben, Carl, Dan, and Ed. Consider the following distribution:
Type I:  Alan 4, Ben 4, Carl 5, Dan 4, Ed 3
Type II: Alan 4, Ben 4, Carl 4, Dan 4, Ed 4
In many rounds of the game, will there be more results of type I or type II?[1]
If you have spent a moment looking at the example above, I wonder whether you thought about why you chose type I or type II. What are we doing when we make judgments? How do we take pieces of information, process them, and come to a decision or judgment?
For one or more answers, I recently read The Undoing Project by Michael Lewis in which he tracks the careers and lives of two of the greatest psychologists, Danny Kahneman and Amos Tversky.
The above table is taken from chapter 6 of Lewis’ book, ‘The Mind’s Rules’. Questions such as ‘when and where is human judgment likely to go wrong?’, ‘why do people often say that they were doing one thing when they were actually doing another?’ and ‘what are people doing when they judge probability?’ are among those Kahneman and Tversky try to tackle. In their paper Subjective Probability: A Judgment of Representativeness,[2] Kahneman and Tversky attempt to ‘demonstrate people make predictable and systematic errors in the evaluation of uncertain events’. If nothing else, this should get you thinking about thinking.

Part of their approach comes from the premise that when people make judgments, they compare whatever they are judging to some model in their minds. “Our thesis is that, in many situations, an event A is judged to be more probable than an event B whenever A appears more representative than B.”[3] So, take a look again at the above example. Do you know why you chose type I or type II? If you think that the uneven distribution of type I is more likely than all the children receiving four marbles each, then think again. Just because type II “appears too lawful to be the result of a random process…”[4] it doesn’t mean it is wrong. This is something worth thinking about: “if our minds can be misled by our false stereotype of something as measurable as randomness, how much might they be misled by other, vaguer stereotypes?”[5]
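Kahneman and Tversky’s claim about the marble game can in fact be checked exactly. Each of the 5^20 ways of dealing 20 marbles to five children is equally likely, so a split’s probability is proportional to its multinomial coefficient. A short sketch (not from the book itself):

```python
from math import factorial, prod

def ways(counts):
    """Number of equally likely deals producing exactly this split
    (the multinomial coefficient)."""
    return factorial(sum(counts)) // prod(factorial(c) for c in counts)

type_i  = ways([4, 4, 5, 4, 3])   # Alan..Ed receive 4, 4, 5, 4, 3
type_ii = ways([4, 4, 4, 4, 4])   # every child receives exactly 4

print(type_ii / type_i)  # 1.25 -- the "too lawful" even split is MORE likely
```

The uniform split comes out 25% more probable than the specific uneven one, which is exactly why the representativeness stereotype of “randomness” misleads us.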
Throughout the book there are questions raised about our understanding of how hard it is to know anything for sure. Kahneman himself favoured Gestalt psychology which sought to explore the mysteries of the human mind. The central question posed by Gestalt psychologists was, ‘how does the brain create meaning?’ Look at the two parallel lines below.[6] Are you really going to insist that one line is longer than the other?
If perception has the power to overwhelm reality in such a simple case, how much power might it have in a more complicated one?
For those of you of a more medical persuasion, you may prefer chapter 8, which tracks the impact Kahneman and Tversky had on Dr. Don Redelmeier, an internist-researcher. Working at Sunnybrook, Canada’s largest trauma centre, he says, “You need to be so careful when there is one simple diagnosis that instantly pops into your mind that beautifully explains everything all at once. That’s when you need to stop and check your thinking.”[7] This is not to say that the first thing that comes into our mind is wrong, but because it was in our mind, we become more certain of it. How costly may this be in school life? This, I think, is highlighted by the example of a maths problem, in which we can check our answers to see if we have erred. Applied to education more broadly, it raises an interesting thought: “…if we are fallible in algebra, where the answers are clear, how much more fallible must we be in a world where the answers are much less clear?”[8] This is certainly a book to read from cover to cover, even if it doesn’t give you all the answers as to why we should be careful to think about thinking.
With ‘slowing down’ a key part of our wellbeing strategy of ‘Strong Body, Strong Mind’, our Director of Studies, Suzy Pett, looks at why slowing down is fundamental from an educational perspective, too.
So often, the watch words of classroom teaching are ‘pace’ and ‘rapid progress’. I’m used to scribbling down these words during lesson observations, with a reassuring sense that I’m seeing a good thing going on. And I am. We want lessons to be buzzy, with students energised and on their toes. We want them to make quick gains in their studies. But is it more complex than this?
The more I think about it, the more I am convinced that ‘slow and deep’ should be the mantra for great teaching and learning. I’m not suggesting that lessons become sluggish. But, we need to jettison the idea that progress can happen before our very eyes. And, with our young people acclimatised to instant online communication, now more than ever do we need our classrooms – virtual or otherwise – to be havens of slow learning and deep thinking. Not only is this a respite from an increasingly frenetic world, but it is how students develop the neural networks to think in a deeply critical and divergent way.
What I love most in the classroom is witnessing the unfurling of students’ ideas. This takes time. I’m not looking for instant answers or quick, superficial responses. I cherish the eking out of a thought from an uncertain learner, or hearing a daring student unpack the bold logic of her response. Unlike social media, the classroom is not awash with snappy soundbites, but with slow, deep questioning and considered voices. As much as pacey Q&A might get the learning off to a roaring start, lessons should also be filled with gaps, pauses and waiting. You wouldn’t rush the punch line of a joke. So it’s the silence after posing a question that has the impact: it gifts students the time for deep thinking. In lessons, we don’t rattle along the tracks; we stop, turn around and change direction. We revisit ideas, and circle back on what needs further exploration. This journey might feel slower, but learning isn’t like a train timetable.
But what does cognitive science say about slow learning? Studies show that learning deeply means learning slowly.[1] I’m as guilty as anyone of feeling buoyed by a gleaming set of student essays about the poem I have just taught. But don’t be duped by this fool’s gold. Immediate mastery is an illusion. Quick-gained success has only short-term benefits. Instead, learning that lasts is slow in the making. It requires spaced practice, regularly returning to that learning at later intervals. The struggle of recalling half-forgotten ideas from the murky depths of our brains helps them stick in the long-term memory. But this happens over time and there is no shortcut.
Interleaving topics also helps with this slow learning. Rather than ploughing through a block of learning, carefully weaving in different but complementary topics does wonders. The cognitive dissonance created as students toggle between them increases their conceptual understanding. By learning these topics alongside each other, students’ brains work out the nuances of their similarities and differences. The friction – or ease – with which they make connections allows learners to arrange their thoughts into a broader, more complex network of ideas. It will feel slower and harder, but it will be worth it for the more flexible connections of knowledge in the brain. It is with flexible neural networks that our students can problem solve, be creative, and make cognitive leaps as new ideas come together for a ‘eureka’ moment.
Amidst the complexity of the 21st century, these skills are at a premium. With a surfeit of information bombarding us and our students from digital pop-ups, social media and 24-hour news, the danger is that we seek the quick, easy-to-process sources.[2] This is a cognitive and cultural short circuit, with far-reaching consequences for the individual’s capacity for critical thinking. With the continual rapid intake of ideas, the fear is a rudderlessness of thought for our young people.[3]
And yet, peek inside our classrooms, and you will see the antidote to this in our deep, slow teaching and learning.
Sources: [1] David Epstein, Range (London: Macmillan, 2019), p. 97.
[2] Maryanne Wolf, Reader, Come Home (New York: HarperCollins, 2018), p. 12.
Sienna (Year 11) looks at the history of immunisation, from variolation to vaccination, exploring some of the topics around this important science.
History of Immunisation:
Variolation:
While vaccination is considered quite a modern medical procedure, it has its roots in more ancient history. In China there are records of a procedure to combat smallpox as early as the year 1000. This was called variolation: pus was taken from a patient with a mild case of smallpox and given to another person, who would then develop a less dangerous form of the disease than they might otherwise have caught, prompting an immune response that acted to prevent the disease. The method became established around the world, in use in Africa, England and Turkey in the 1700s, and it paved the way for the work of Edward Jenner, who is considered the ‘father of vaccination’.
Later in the 1700s, the USA learned of it from slaves who arrived already inoculated from Africa. Though a remarkable feat for the time, it wasn’t without risk: because immunity was achieved by direct exposure to the virus, inoculated patients could still die from it – as happened with King George III’s son and countless slaves. However, the risk of dying from variolation was far smaller than the risk of catching and dying from smallpox, so variolation remained popular despite the dangers.
Origin of the first widely accepted vaccination:
Vaccination, as we know it in modern terms, was first established in 1796 by Edward Jenner. He was a scientist and fellow of the Royal Society in London. Seeing how much of a problem smallpox was at that time (and for most of history prior to then), Jenner was interested in improving the process of variolation to tackle smallpox.
He was inspired by something he had heard as a child from a dairymaid: “I shall never have smallpox for I have had cowpox. I shall never have an ugly pockmarked face.” This inspired him later in life to carry out an experiment in which he inoculated an eight-year-old boy with cowpox. He recorded that the boy felt slightly ill for around ten days after the procedure but was completely fine afterwards. When the boy was injected with active smallpox material a few months later, he showed no symptoms of the disease, and Jenner concluded that his experiment had been a success.
After writing up his findings, Jenner named the new procedure vaccination, from ‘vaccinia’, the Latin name for cowpox. His paper met with a mixed reaction from the medical community. Despite this, vaccination began gaining popularity thanks to the efforts of other doctors such as Henry Cline, a surgeon with whom Jenner had talked closely.
Thanks to the success of the procedure, especially compared with variolation, vaccination could be found across almost all of Europe by the turn of the century – just a few short years after Jenner’s experiment – and was particularly concentrated in England. The success of Jenner’s work is outstanding. By 1840 vaccination had so thoroughly replaced variolation as the main weapon against smallpox that the British Parliament prohibited variolation by law. The last naturally occurring case of the disease that had ripped so mercilessly through the world for centuries was recorded in 1977, and in 1980 the World Health Organisation (WHO) formally declared smallpox eradicated – perhaps more than Jenner could ever have hoped his discovery would achieve.
Edward Jenner:
Image via Pexels
As well as being an undeniable force for good in the world, Jenner was a remarkable person on a smaller scale too. Despite low supplies at times, he would send his inoculant to anyone who asked for it – medical associates, friends and family, even strangers. Later in his life he set up his ‘Temple of Vaccinia’ in his garden, where he vaccinated the poor free of charge. Despite the opportunity, Jenner made no attempt to profit from his work, viewing his invention instead as a contribution to science and to humanity – and this was perhaps vital to the speed at which the vaccine and the vaccination process spread.
Modern Vaccinations:
Nowadays vaccinations have changed – not in principle but in the nitty-gritty science – as we have come to understand more about how our immune system works. Jenner’s inoculant has been adapted for different diseases: modern vaccines may contain a weakened (attenuated) strain of a virus, a killed strain, or even an isolated surface protein such as the spike protein, enabling the body to recognise the pathogen without being exposed to its dangers.
Introducing the body to the same spike proteins found on the harmful pathogen is, in essence, how vaccination works. The body recognises these spike proteins as foreign and so sends phagocytes (a type of white blood cell) to destroy them, and lymphocytes to create antibodies and activate an immune response. This is why, a few days after vaccination, there may be some discomfort or a slight fever: the body is fighting against those spike proteins.
While the spike proteins are being destroyed, the body creates memory cells. These are the most important part of the vaccination procedure: if the body is exposed to the actual, more dangerous pathogen in the future, the memory cells will recognise its spike proteins and mount a secondary immune response, producing antibodies sooner and in much greater quantities. Secondary immune responses are far more effective, and often the person will never show any symptoms of the disease, the pathogens being destroyed within a matter of days.
Viral Vector Vaccines:
These are an example of the exciting advances in vaccination. In this type of vaccine, such as the COVID-19 vaccine developed in the UK by Oxford University, DNA from the actual virus is inserted into an adenovirus (a deactivated virus that acts as a carrier, delivering the viral DNA to our bodies), causing the antigens of the actual virus to develop on the adenovirus. These antigens can then trigger a strong immune response without the actual virus ever entering the body. This is an effective way to ensure that memory cells for the virus are created, and it contributes to the Oxford vaccine’s high reported efficacy.
mRNA Vaccines:
Another exciting adaptation is the mRNA vaccine, used in some of the COVID-19 vaccines. The mRNA is essentially a set of instructions telling the body how to make the spike protein of the pathogen itself, rather than the protein being cultivated in a laboratory and then injected; from that point on, the immune response is exactly the same. This allows the vaccine to be produced more quickly and to be more effective.
However, because of its newer and more complicated nature, the mRNA vaccine is more expensive to produce and, owing to mRNA’s instability, must be stored at very low temperatures. This causes logistical problems with storage and distribution, and it is why the DNA-based vaccine has been hailed as the best option for low-income developing countries that lack the facilities to store mRNA vaccines. DNA vaccines can be stored at fridge temperature because DNA, with its double-helix structure, is far more stable than mRNA.
This novel type of vaccine was developed by two scientists of Turkish heritage living in Germany who, like Jenner, thought outside the box to improve human health in the race against time to find an effective vaccine. They have been enormously successful, with the mRNA vaccine showing 95% effectiveness against COVID-19 seven or more days after the second dose is administered.
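The ‘95% effectiveness’ figure is a measure of relative risk reduction: it compares the rate of infection in the vaccinated group of a trial with the rate in the unvaccinated (placebo) group. As a rough illustration only – the case counts below are made-up round numbers, not the actual trial figures – the calculation looks like this:

```python
# Illustrative sketch of how vaccine efficacy is calculated from trial data.
# Efficacy = 1 - (attack rate in vaccinated group / attack rate in placebo group)

def vaccine_efficacy(cases_vaccinated, n_vaccinated, cases_placebo, n_placebo):
    """Relative risk reduction from case counts and group sizes."""
    attack_rate_vaccinated = cases_vaccinated / n_vaccinated
    attack_rate_placebo = cases_placebo / n_placebo
    return 1 - attack_rate_vaccinated / attack_rate_placebo

# Hypothetical trial: equal groups of 20,000 people, with 8 cases among
# the vaccinated and 160 among the placebo group.
efficacy = vaccine_efficacy(8, 20_000, 160, 20_000)
print(f"{efficacy:.0%}")  # prints "95%"
```

So ‘95% effective’ does not mean that 5% of vaccinated people caught the disease; it means vaccinated trial participants were about 95% less likely to catch it than unvaccinated ones.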
Image via Pexels
Controversies of vaccinations:
During this pandemic there has been widespread appreciation of how vital vaccines will be in controlling the spread of COVID-19. However, the voices of skeptics, often amplified by social media, seem to have found a more prominent platform. They distrust vaccination for a variety of unfounded reasons. One is the claim that vaccinations are really a way for the government to implant chips into its citizens; not only does this theory ignore the long-established science of vaccination, but logistically the needle would need to be far wider and the resulting puncture wound far more noticeable.
The autism study:
Unfortunately, even though a 1998 article by Andrew Wakefield was quickly shown to be based on unfounded evidence, it continues to resurface among skeptics arguing against vaccines, falsely claiming a link between autism and the MMR vaccine. Wakefield used only 12 children to test his hypothesis – far too small a group to draw any reliable conclusion – and he was later struck off the UK medical register over the paper. His study was retracted and disproven, and his hypothesis has been rejected by the medical community through subsequent research and publication. The amplification of this fraudulent study has been cited as a reason for a decline in uptake of the MMR vaccination and the subsequent small outbreaks of measles.
Development of COVID-19 vaccines:
For some, the speed with which the COVID-19 vaccines were developed – under a year, compared with a more typical research timeline of up to a decade – is itself a cause for skepticism.
However, this speed is not the result of cutting corners; rather, it is due to the immense amount of funding and equipment given to scientists, and the sheer number of people working on the vaccines, to prioritise their development. In Phases I, II and III, human trials are used to assess extensively how the vaccine works across a diverse range of age groups, ethnicities, body types and pre-existing health conditions, and to measure accurately the body’s immune response – the antibodies and cells produced – along with the efficacy and safety of the drug. This is then reviewed again by the regulatory bodies: the Medicines and Healthcare products Regulatory Agency for the UK, the European Medicines Agency for the EU and the Food and Drug Administration for the USA.
The World Health Organisation listed ‘vaccine hesitancy’ as one of the top ten threats to global health in 2019, and overcoming it will play a crucial role in how quickly life can return to normal after the COVID-19 pandemic. Vaccinations are humanity’s biggest weapon against the pandemic; they are, in the words of Sir David Attenborough, ‘a great triumph of medicine’. And although there has been recent news about mutations of the virus, it is important to remember that mutation is completely to be expected. The recently reported South African, UK and Brazilian variants involve small changes in the spike protein of the virus which affect its transmissibility. Tests are currently being run, but early signs show that the vaccines remain effective against these variants.
Even in the worst-case scenario, the vaccines can be adapted in a matter of weeks or months, and the government is preparing for a situation in which a COVID-19 vaccine has to be given annually to those at high risk, much like the current flu vaccine. It comes as a relief that finally, in the wake of such a disruptive and terrible pandemic, there is light at the end of the tunnel and a reason to look forward to better days ahead, knowing that every day more people are receiving these game-changing vaccinations.
Spring Focus: Metacognition – students driving their own learning through reflection
Teaching and learning Gem #28 – exam/assessment wrapper
Lots of us are promoting metacognition in the self-reflective reviews we are setting for students following the Spring Assessments. By reflecting on their own performance, we are encouraging students to think about their skills/understanding and become self-regulated learners.
I’m aware that for self-reflection to work, students need to take it seriously and realise its impact, rather than pay lip-service to it. We can help them do this in the way we approach this sort of task. Additionally, the first minute of this video is great at helping students realise that self-reflection is an important part of life for all sorts of people: it’s not just something that happens in the classroom.
Right now, there is lots of great practice going on around the school, so I thought I’d share five different approaches from five departments to give a flavour:
Flipgrid for powerful, verbal self-reflection (Claire Baty)
Claire used Flipgrid as a way for students to send her a video of their self-reflection. This was quick to set up and powerful in its impact. Using a moderated Flipgrid board meant that students couldn’t see each other’s video reflections, so it felt like a personal one-to-one discussion with their teacher. Claire could then easily video a response back to the student using the platform. Claire says, “I am convinced that verbalising their self-reflection helps students to clarify their ideas and take on board their own advice more readily. I think they give more thought to something they have to say out loud than they would if I’d just asked them to jot down their ideas on OneNote.” Here were her instructions posted on Flipgrid.
NB: on a technical note, if you set up a moderated board and then want students to rewatch their video submission and see any video feedback from the teacher, they need to go to my.flipgrid.com. Watch out for a video about this from Claire.
Redrafting with students noting why they are redrafting (Judith Parker)
Giving students the time to redraft is an invaluable metacognitive process. This is a slow, deep activity that cannot be rattled off quickly – it’s well worth the lesson or homework time it takes. Judith asked students to engage with their assessment responses and think carefully about how to improve their own work. She increased the metacognitive challenge by asking students to note down why they had chosen to redraft a particular section. Making their thought processes clear to themselves helps them drive their own learning.
Students categorising the questions into skill type and reviewing their performance in these different skills (Clare Roper)
This is one part of a self-reflection worksheet that students complete on OneNote. By identifying and categorising the skills in each question, Clare is asking students to think in a structured way about strengths and to identify for themselves next steps in their learning. Spotting patterns in their performance makes clear to students how to approach further learning, and helps them see the sorts of skills they need to employ in future assessments/tests.
Microsoft Forms for targeted reflection on specific skills/questions (Suzy Pett)
A questionnaire of focussed, self-reflection questions can be created using Microsoft Forms. Of course, these questions could easily be completed by students in OneNote, too.
And here is another example of a self-review for students at KS3 (Steph Harel)
I really like this metacognitive question on the worksheet below: “If you could go back in time before the assessment due date, what advice would you give yourself?” Encouraging a ‘self-dialogue’ is really valuable: the more students can ‘talk’ to themselves about what they are doing, the better.
Mrs Efua Aremo, a Design & Technology Teacher at WHS, explores whether a ‘human-centred design’ approach can help us deliver solutions which are effective in meeting local and global needs.
A World full of Need
It is impossible to adequately describe the profound losses experienced over the past 12 months. There are the more measurable losses such as employment, finance and health but then there are also the relational losses caused by isolation and tragic bereavements. It has been a brutal year for many, and the impact of the pandemic has been acutely felt by the most vulnerable.
When we are confronted with such needs both locally and internationally, we desire to help in any way we can, as Mr Keith Cawsey observed in his December article. However, it doesn’t take long to discover that people have many different types of needs and there are many different types of help we might provide.
What do we need?
“I need blue skies, I need them old times, I need something good…”
Those words from singer-songwriter Maverick Sabre powerfully capture the sense of longing many of us feel – for simple things like sunshine, and for more intangible things we struggle to name.
One of the most popular ways of categorising human needs was introduced by Abraham Maslow in 1943; it is known as Maslow’s hierarchy of needs.
Sometimes, efforts to provide help do not achieve the intended result. For example, in 2010, a US aid agency installed 600 hand pumps to supply clean water for rural households in northern Mozambique. The aim was to help the women and girls who travelled long distances to collect contaminated water from wells and rivers. The aid agency imagined that the pumps would save time, improve health conditions and empower the women to start small businesses. However, these water pumps were not used by most of the people in the community. What went wrong?
Helping those who are different to ourselves
Though the desire to help others is always to be commended, it can often be accompanied by wrong assumptions which hinder our ability to help effectively. This is especially true when we are seeking to help people from a different economic status, ethnicity or culture to our own.
It is tempting to assume we know what people need, especially if they have basic physiological needs which are not being met. However, even in his original paper, Maslow acknowledged that human beings are more complex than the tidy logic of his hierarchy suggests. He recognised that the lower-order needs do not need to be completely satisfied before the higher-order needs become important. Therefore, when helping the neediest people in society, we need to get to know them beyond their basic needs.
Recognising this fact is key to understanding what went wrong with the water pumps in Mozambique. The aid agency seems to have stereotyped the rural women as passive, needy people and so failed to ask their opinion about where best to locate the new pumps. They focussed their attention on providing access to clean water but did not account for the fact that the original water sites were “important social spaces where women exchanged information, shared work, socialized their children, and had freedom outside the home.” The new sites lacked the privacy, shade and areas for laundry and bathing which the women valued, and so the new water pumps were rejected.
Thankfully, we can learn from experiences like this to devise better ways of helping people in need.
Human-Centred Design: A Better Way?
“In order to get to new solutions, you have to get to know different people, different scenarios, different places.”
Human-centred design (also known as ‘design thinking’) is an approach to problem-solving which involves partnering with those in need of help to deliver the solutions which most benefit them. It involves “building deep empathy with the people you’re designing for… as you immerse yourself in their lives and come to deeply understand their needs.”
The Elements of Human-Centred Design
But this does not mean that those who are being helped are only consulted at the start of the process. Human-centred design is a non-linear collaborative process which involves back-and-forth communication between those helping and those needing help. Together they produce many design iterations until they find a solution which best suits those who need it. It is obvious how this approach might have led to better results in Mozambique.
Human-centred design involves looking beyond the needs of the people we wish to help and acknowledging their full humanity: appreciating their culture, discovering what they value, and recognising how they might contribute to meeting their own needs.
Taking a more human-centred approach enabled Jerry Sternin from Save the Children to successfully deal with the problem of severe malnutrition amongst children in rural Vietnam in the 1990s. Previous attempts had relied on aid workers providing resources from outside the affected communities – these methods proved unsustainable and ineffective.
Sternin discovered that despite their poverty, some mothers were managing to keep their children healthy. So he sought to learn from them and discovered what they were doing differently from their neighbours: they were feeding their children smaller meals multiple times a day rather than the conventional twice daily. They were also adding to these meals freely available shellfish and sweet potato greens even though other villagers did not deem these appropriate for children.
By empowering the mothers to train other families in these practices, Sternin was able to help the community help itself. Malnutrition in northern Vietnam was greatly reduced through implementing this effective, empowering and sustainable local solution.
The Wonderbag
Wonderbag by Conasi.eu, CC BY-NC 3.0[iii]
Another sustainable design solution is the Wonderbag, a non-electric slow-cooker. Once a pot of food has been brought to the boil and placed in the foam-insulated Wonderbag, it will continue to cook (without the need for additional heat) for up to 12 hours. This product was developed in South Africa to address the problems caused by cooking indoors on open fires. It has vastly improved the lives of the women who use it: cooking with the Wonderbag uses less fuel and water, improves indoor air quality, and frees up time which many girls and women have invested in education or employment, or in starting their own businesses. Local women use their sewing skills to customise the Wonderbags with their own cultural designs.
Human-Centred Design at WHS
Year 9 WHS Design Students
In Year 9, design students at WHS are tasked with designing assistive devices for clients with disabilities. One of the first things they need to do is get to know their users; seeing beyond their disabilities and discovering who they are, what they love, and what they hate.
One pupil found that her client, who suffers from benign tremors, loves to paint but hates having to use massive assistive devices because they draw too much attention to her. The pupil is currently developing a discreet product which will help her client paint again, meeting her needs for esteem and self-actualisation.
Helping Others in this Time of Need
In the midst of a global pandemic and in its aftermath, we will encounter people in need of both emergency relief and longer-term development assistance. Perhaps by adopting a human-centred design approach, we will be able to help others in ways which are effective, sustainable, and which recognise the beautifully complex humanity of those in need.
Keith Cawsey, “What Has COVID Taught Us about Our Relationships with Others?,” WimTeach, 10 December 2020, http://whs-blogs.co.uk/teaching/covid-taught-us-relationships-others/.
Maverick Sabre, I Need (Official Video), 2011, https://www.youtube.com/watch?v=GZNtticFI60.
Abraham H. Maslow, A Theory of Human Motivation, 1943, http://psychclassics.yorku.ca/Maslow/motivation.htm.
Joshua Seong, Maslow’s Hierarchy of Needs, 2020, https://www.verywellmind.com/what-is-maslows-hierarchy-of-needs-4136760.
Timothy Keller, Generous Justice (London: Hodder & Stoughton, 2010); Oxfam GB, “How We Spend Your Money,” n.d., https://www.oxfam.org.uk/donate/how-we-spend-your-money/.
Emily Van Houweling, Misunderstanding Women’s Empowerment (Posner Center, 2020), https://posnercenter.org/catalyst_entry/misunderstanding-womens-empowerment/.
Emily Van Houweling, Misunderstanding Women’s Empowerment.
Emi Kolawole, Stanford University d.school cited in IDEO.org, The Field Guide to Human-Centered Design: Design Kit, 2015, 22.
IDEO.org, “What Is Human-Centred Design?,” Design Kit, n.d., https://www.designkit.org/human-centered-design.
Monique Sternin, “The Vietnam Story: 25 Years Later,” Positive Deviance Collaborative, n.d., https://positivedeviance.org/case-studies-all/2018/4/16/the-vietnam-story-25-years-later.
Jerry Sternin and Robert Choo, “The Power of Positive Deviancy,” Harvard Business Review, 1 January 2000, https://hbr.org/2000/01/the-power-of-positive-deviancy.