Should the periodic table be turned upside down?

Chemistry beakers

Isabel, Y10, explores the comprehensibility of Dmitri Mendeleev’s traditional periodic table and asks whether flipping it by 180˚ would make it more accessible to younger children and enhance how it is used for learning.

The Periodic Table is an important symbol in Chemistry, and since Dmitri Mendeleev published his periodic system in 1869 its overall layout has remained essentially the same for 150 years; but could turning it 180˚ make important concepts easier to understand, especially when teaching younger children?

This year has been designated the International Year of the Periodic Table, celebrating Mendeleev’s arrangement, which has become the standard way of organising the elements. However, some scientists, such as Martyn Poliakoff and his team, have started to question how comprehensible it is. After extensive research, they decided to flip the traditional arrangement upside down, so that the information is more understandable and intuitively ordered.

The research team argues that this presentation is more helpful and has many benefits. Firstly, when the table is flipped, properties of the elements such as atomic mass and proton number increase from bottom to top, which makes more numerical sense. Secondly, it represents the Aufbau principle more accurately, which states that electrons fill up ‘shells’ from low to high energy. Finally, when young children are trying to learn from the table, the elements most relevant to them are located towards the bottom, making its use quicker and more accessible. Therefore, in lessons, students will not have to look all the way to the top of the table to find the right information.
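
As a rough, illustrative sketch (my own, not part of the Nature Chemistry paper), the short snippet below builds a highly simplified table of the first three periods and prints it both ways round; in the inverted view the atomic numbers increase from bottom to top, the same direction in which the Aufbau principle fills electron shells.

```python
# Illustrative sketch only: a simplified periodic table of the first three
# periods, printed traditionally and then inverted (as in Poliakoff's proposal)
# so that atomic number increases from bottom to top.

periods = [
    ["H", "He"],                                     # period 1 (Z = 1-2)
    ["Li", "Be", "B", "C", "N", "O", "F", "Ne"],     # period 2 (Z = 3-10)
    ["Na", "Mg", "Al", "Si", "P", "S", "Cl", "Ar"],  # period 3 (Z = 11-18)
]

def show(rows, title):
    print(title)
    for row in rows:
        print("   " + " ".join(f"{symbol:>2}" for symbol in row))
    print()

show(periods, "Traditional: atomic number increases top to bottom")
show(list(reversed(periods)), "Inverted: atomic number increases bottom to top")
```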

Above: Inverted Periodic Table. Source: University of Nottingham
Above: Traditional Periodic Table

However, when I compared the two versions of the periodic table myself, I found that the traditional form made more sense to me for several reasons. For example, with both versions I found my eyes drawn to the top row of elements, so it did not matter that the elements I use the most were on the bottom row of the inverted table. This could be put down to force of habit, so I also asked my 10-year-old brother to look at the two versions and tell me where his eyes went. He immediately pointed to the top of both, and when I asked him why he said that top to bottom ‘is the way you read’, so the properties make more sense going down from top to bottom. He also seemed to prefer the traditional table, commenting that it was like a ‘pyramid’ in the way the numbers were arranged and was a much clearer way to display the elements.

Whilst some may argue that the arrangement of the table is more effective upside down, for me the traditional version of the periodic table works just as well. Testing this idea with a larger group would allow different models to be tried, to see whether any of them make the periodic table easier for younger learners to understand.


References:

Martyn Poliakoff et al., Nat. Chem., 2019, https://www.nature.com/articles/s41557-019-0253-6

 

Is contemporary architecture threatening London’s historic skyline?

Walkie Talkie Building

Maddie, Year 13, considers whether modern buildings are ruining London’s skyline, weighing the advantages and disadvantages of modern projects.

London’s historic architecture is one of our greatest assets – culturally, socially and economically. It lies at the heart of London’s identity, distinctiveness and very success, yet it is at risk of being badly and irrevocably damaged. More than 70 tall towers are currently being constructed in London alone, prompting fears from conservation bodies and campaigners that the capital’s status as a low-rise city is being sacrificed in a dash by planners to meet the demand for space and by developers to capitalise on soaring property prices.
There have been many examples of tall buildings that have had a lasting adverse impact through being unsuitably located, poorly designed, inappropriately detailed and badly built and managed. One example is the so-called ‘Walkie Talkie’ building, whose poor design concentrated the sun’s rays and melted parts of cars on the streets below. And recently there has, yet again, been another proposed skyscraper, this time in the Paddington area to the west of central London. The 224m-high Paddington Tower, costing £1bn, would be the fourth tallest in the capital and the first of such scale in that part of London. A building of this scale in this location threatens harm to many designated heritage assets across a wide geographical area, including listed buildings, registered historic parks and conservation areas.

However, some people think that cities face a choice of building up or building out, asserting that there is nothing wrong with a tall building if it gives back more than it receives from the city. An example of a building that has achieved this is the £435 million Shard, which attracted major redevelopment to the London Bridge area. So, is this a way for London to meet rising demand to accommodate growing numbers of residents and workers?

Well, planning rules are in place to make sure that London achieves the correct balance, ensuring tall buildings not only make a positive contribution to the capital’s skyline but also deliver much-needed new homes for Londoners, as well as workspace for the 800,000 new jobs expected over the next 20 years. Furthermore, tall contemporary buildings can represent “the best of modern architecture”, and they encourage young architects to think creatively and innovatively, making London a hub for budding architects. It also means that areas with run-down or badly designed features have the chance to be well designed, improving users’ day-to-day lives whilst also benefiting the local landscape.


The protected viewpoints of the city of London. Do skyscrapers threaten this?

Overall, I think that in a cosmopolitan and growing capital city, London needs contemporary architecture, to embody its spirit of innovation. However, this needs to be achieved in a considered and managed way so as not to ruin the historic skyline we already have.

 

Toward the Unknown Region: how do we impart the skills and knowledge required for students to be successful in careers that currently do not exist?

Future of Jobs 2

Toward the Unknown Region[1] – Mr. Nicholas Sharman, Head of Design & Technology, looks at whether integrating STEAM into the heart of a curriculum develops the skills required for careers that do not currently exist.

The world of work has always been an evolving environment. However, this has never been more pertinent than now; according to the World Economic Forum, 65% of students entering primary school today will be working in jobs that do not currently exist[2].

As educators, this makes our job either extremely difficult, pointless or (in my view) one of the most exciting opportunities we have been faced with in the nearly 200 years since the introduction of the Victorian education system. The idea of relying solely on a knowledge-based education system is becoming outdated and will not allow students to integrate into an entirely different world of work. Automation and Artificial Intelligence will make manual and repetitive jobs obsolete, changing the way we work entirely. Ask yourself this: could a robot do your job? The integration of these developments is a conversation all of its own, and one for a future post.

So, what is STEAM and why has it become so prominent in the UK education system?

The acronym STEM was (apparently) coined in 2001 by scientific administrators at the U.S. National Science Foundation (NSF)[3]. The ‘A’, representing the Arts, was added later, creating Science, Technology, Engineering, Arts and Maths. Since the introduction of STEM-based curriculums in the US, the initiative has grown exponentially across the globe, with the UK education system adopting the concept.

So why STEAM and what are the benefits? STEAM education is far more than just sticking subject titles together. It is a philosophy of education that embraces teaching skills and subjects in a way that resembles the real world. More importantly, it develops the skills predicted to be required for careers that currently do not exist. What are these skills and why are they so important?

Knowledge vs Skills

When we look at education systems from around the world, three stand out. Japan, Singapore and Finland have all been cited as countries that have reduced the size of their knowledge curriculum. This has allowed them to make space to develop skills and personal attributes. Comparing this to the PISA rankings, these systems are within the top five in the world and, in Singapore’s case, ranked number one[4].

I am sure we cannot wholly attribute this to a skills-focused curriculum; however, it does raise the question – what skills are these schools developing and how much knowledge do we need?[5],[6]

  1. Mental Elasticity – having the mental flexibility to think outside of the box, see the big picture and rearrange things to find a solution.
  2. Critical Thinking – the ability to analyse various situations, considering multiple solutions and making decisions quickly through logic and reasoning.
  3. Creativity – robots may be better than you at calculating and diagnosing problems; however, they are not very good at creating original content, thinking outside the box or being abstract.
  4. People Skills – the ability to learn how to manage and work with people (and robots), having empathy and listening.
  5. SMAC (social, mobile, analytics and cloud) – learning how to use new technologies and how to manage them.
  6. Interdisciplinary Knowledge – understanding how to pull information from many different fields to come up with creative solutions to future problems.
Future of Jobs graph
The Future of Jobs Report by the World Economic Forum showing the pace of change in just 5 years

All of the above skills are just predictions. However, the list clearly highlights that employers will be seeking skill-based qualities, and these will keep changing as future jobs develop and materialise. So do we need knowledge at all?

Well, of course we do – knowledge is the fundamental element required to use the above skills successfully. However, as educators, we need to strike a balance, making sure our students understand how important these skills will be to them in a future where an exam grade based on knowledge alone could be irrelevant to employers.

What subjects promote these skills?

As a Technologist, I believe there has never been a more important time for promoting and delivering the Design & Technology curriculum. The subject has for too long been misrepresented and had a stigma hanging over it due to previous specifications and people’s experiences; comments such as ‘so you teach woodwork then?’ really do not do justice to the subject.

With the introduction of the new curriculum, allowing students more opportunity to investigate and build these future skills, the subject has never been more relevant. Looking at the list of promoted skills, I cannot think of another subject that not only promotes these skills but also actively encourages their integration into every lesson. Do not get me wrong: all subjects are equally important. Design & Technology, though, is a subject that is able to bring them all into real-world scenarios. If we think about the knowledge that is developed in Science, for example – where students can look at material properties and their effect on the user’s experience – or in Religious Studies, where different signs, symbols or even colours can have different meanings in different cultures, affecting the design of a fully inclusive product, they can all be related to Design and Technology in one way or another.

Comparing the Design & Technology curriculum to the future skills list, we can break down the different skills it develops. It encourages mental elasticity by challenging students’ ideas and concepts, thinking differently to solve current, real-life problems. It allows students to develop critical thinking by challenging their knowledge and understanding, ensuring they develop the ability to solve problems through investigation, iteration and failure, ultimately building resilience. It goes without saying that the subject not only encourages creativity but also allows students to challenge concepts and ideas through investigating and questioning. Furthermore, it teaches the concept of ‘design thinking’ and collaborative working, allowing students to develop people skills and to understand how people work, interact and think, enhancing empathy and understanding. As technology progresses the subject follows suit, permitting students to implement and understand how new and emerging technologies are embedded, not only in the world of design but also in the social, moral and environmental effects they create. Lastly, and probably most importantly, the subject teaches interdisciplinary knowledge. I like to describe Design & Technology as a subject that brings knowledge from all areas of the curriculum together: the creativity and aesthetics of Art, the application of Maths when looking at anthropometrics, tolerances or even ratios, how Religious Studies can inform and determine designs, how Science informs and allows students to apply theory, or even the environmental impact that Geography can show. I could go on and explain how every subject influences Design & Technology in one way or another; more importantly, it shows how we need to look at a more cohesive and cross-curricular curriculum. When this happens, the future skills are inherently delivered in a real-world application.

Looking back at the question at the start of this article, we can start to see why having the concept of STEAM at the heart of a school environment is so important. However, it is not good enough just to ‘stick’ subjects together; there has to be a bigger picture where knowledge and skills are stitched together like a finely woven tapestry. Ideally, we would look to the primary education system, removing subject-specific lessons and developing co-teaching, with learning taking place through projects that bring elements from all subjects into cohesive wholes; teachers would become facilitators of learning, delivering knowledge not in a classroom but in an environment that allows more autonomous research and investigation. However, until the exam system changes, this is not going to happen fully.

So what more could we be doing? I believe we should be focusing on more cross-curricular planning, developing the application of skills and using knowledge to enhance learning. By developing a curriculum centred around a STEAM approach, we can start to develop the skills required for our students and the careers of the future.


References: 

[1] See https://whitmanarchive.org/published/LG/1891/poems/245 for the text to Ralph Vaughan Williams’ piece for choir and orchestra entitled ‘Toward the Unknown Region’
[2] https://www.weforum.org/reports/the-future-of-jobs-report-2018
[3] https://www.britannica.com/topic/STEM-education
[4] http://www.oecd.org/pisa/
[5] https://www.weforum.org/focus/skills-for-your-future
[6] https://www.crimsoneducation.org/uk/blog/jobs-of-the-future

Learning: Back to the Future

‘Hamlet 360: Thy Father’s Spirit’ - a virtual reality film combining traditional Shakespeare with modern VR technology

Mrs Jane Lunnon, Head of WHS, looks at the impact of digital learning on education, linking this to recent examination reforms at GCSE and A Level.

Imagine this: you are watching a production of Hamlet online. Gertrude is betraying her son, Ophelia is going mad. Claudius is hiding things and Hamlet is doing (or rather, not doing) his thing.  And you, the viewer, are not only watching this on your computer, you are also, right there, in the show, a reflection in a gilded mirror – daubed with blood and looking pretty ropey. (Your part is the ghost of Old Hamlet.)

And so, you are there and not there. You can see yourself – as watched and watcher.  How brilliant, how extraordinary, how game-changing is that? This is happening, right now. In the US, the Commonwealth Shakespeare Company has teamed up with Google: VR technology paired with great creativity, enabling viewers to inhabit the text – to literally become part of it.  That’s what’s happening in learning today.[1]

 


 

[1] See https://www.nytimes.com/2019/01/25/theater/hamlet-virtual-reality-google.html

And it’s not just some exotic, transatlantic experiment.  The impact of technology on the way we learn is seminal and astonishing. In our last staff meeting, our Director of Innovation and e-learning (imagine even having such a job title in a school ten years ago) was heralding the arrival of a brand new set of VR headsets. As a school, we adopted BYOD (bring your own device) several years ago and this, when combined with the headsets and Google Expeditions, means that our pupils can journey to Africa, to Jerusalem, to Tudor England, to the inside of a black hole, to the inside of their own bodies… The impact on our students, when they do, is immediate and palpable. It’s not just gimmicks and game-playing; this is sentient, dynamic, visual learning in ways those of us who became excited by the potential of PowerPoint in the late 1990s could barely have imagined.

But the technological revolution in education is not just about the flashy, painting-with-coloured-light sort of stuff (although it’s very hard not to get terribly excited by all of that). As a Microsoft Showcase school, we have adopted wholesale software like Microsoft Teams (useful baskets to keep all our meeting, lesson and admin resources), OneNote (seamless collaborative working and library spaces) and OneDrive (shared document folders). Like many schools, we have found that the truly revolutionary and transformative development in education IT was the Cloud and the way it has made accessing and sharing learning seamless and straightforward. The learning environment is no longer just the classroom or the library. It is now, quite simply, everywhere: in the playground, on the bus, at your mate’s house, in the kitchen… and this has made a real difference to the way children learn and the way we all teach. My Year 7 English students, for example, work online using their class Team. They do their homework in their own folders stored in that Team basket and I can then mark it (using the clever pen that writes on the screen) as soon as they do it. That means that I can see at once if they are not quite getting the point about enjambment or the impact of verse form on the meaning of a poem – and I can adapt my next lesson plan accordingly.  No more waiting around for a week for the work to be done, the books to come in and the homework to be marked. So nothing radical there; just more efficiency, more pace, more targeted planning. Which of course leads to more opportunity for stretch and fun and better outcomes all round.

We are not simply operating as advertisers for Microsoft products here, although earlier this term we were thrilled to find ourselves acting as a SW London outpost for the BETT Conference, with 40 or so delightful Swedish educators joining us, keen to find out what we were doing and how we were doing it. I suspect that’s the largest number of Swedes we have entertained in this building at any one time in the 140-year history of the school!  It was a real pleasure to share our experiences, to learn what they are doing and to celebrate together the range, power and versatility which technology has brought into the classroom and beyond it.

And this is important because technology doesn’t just allow us to do things in a more colourful or more efficient way. It also, clearly, changes the way that children approach learning. Much of their work in the classroom, for example, is collaborative. It is as much about team-building and communication, about effective listening, careful research and powerful articulation of ideas, as it is about the causes of the First World War, or how to integrate fractions. The skills our world now requires (as the Hamlet example above suggests) are not just technical expertise and versatility, nor simply the acquisition and application of key facts, analytical thinking and problem solving, but creative flair: the ability to connect and link ideas, fields of knowledge and curriculum areas, often in surprising, unexpected ways. And then there’s the capacity to communicate all this persuasively and effectively, both in person and on paper. These are the skills necessary for a dynamic, technological, connected and highly protean workplace, and it matters that our young people are encouraged to develop them in school.

That’s why we are developing our STEAM programme so enthusiastically at WHS. Our STEAM Room, staffed by scientists in residence (SiRs), is not just the base for our girls to engage in scientific research and inquiry (with external partners as well as internally); it is also a symbol of our cross-curricular approach. The job of our SiRs is to facilitate inter-disciplinary connections. (RS meets Science when Year 7s try to make the dyes in Joseph’s dream-coat, English meets Psychology when A Level English students engage in the psychological exploration of the characters in ‘To The Lighthouse’, and Geography, Physics and Technology combine when Year 9s design wind turbines… the list goes on.)

Facility with all of this, the ability to think flexibly, imaginatively and with resilience and integrity when confronted with tough problems: this feels like the urgent pedagogical focus for us now, and it feels like the best way to prepare our children for the future. I had the great good fortune of hearing Sophie Hackford speak at the GDST Summit last summer[2]. Sophie is a Futurist (which strikes me as one of the best job titles ever). Her job is to look at trends, projections and the dreams of techno-enthusiasts everywhere, work out what is likely to be coming next – and then advise government and anyone else who will listen. She described a world in which fake and real blend imperceptibly, where the world becomes our screen and we become computers, where space is our playground and our new hang-out; a world where asteroids could be bought and mined and Mars could be inhabited. All alarming and deliberately provocative perhaps, but also exciting and reflective of the urge to think differently and to imagine the hitherto unimaginable. This, again, is what the future requires of us.

What it doesn’t need, I feel sure, is for our children to show that they can sit in rows of desks and write, on paper, with a pen, regurgitating facts they have carefully learnt, for three hours at a time. And yet that, of course, is what our examination system currently requires our children to do. And indeed, has done, to a greater or lesser extent, for the last hundred years or so. Learn this, commit it to memory, show me you’ve done so by writing it out on paper. How absolutely extraordinary, that in a world which has made so much progress and right in the middle of a technological revolution, here we are, still fundamentally assessing our students’ talent and achievements at school, with a pen, paper and serried rows of desks.

We might, perhaps, take comfort from the fact that there has been significant reform in our exam system recently. More academic rigour has been brought in at A Level and at GCSE.  And yes,  A Levels and GCSEs are new(ish) – more rigorous, fatter – the modules you can endlessly resit are gone, so is the huge emphasis on coursework. They have, indeed, been reformed. But reform is not revolution. These specifications, these exams, this assessment system is not a radical re-think for a new(ish) century. It’s not even a radical re-think for the old century. These exams are not modern – as those of us who are old enough to remember the very old O Levels and A Levels can testify. Indeed, it’s all there, as it always was: little or no coursework, significant emphasis on learned material, assimilation of key facts and the ability to remember and apply those facts in writing, to time, in big exam halls with your entire cohort sitting around you, using (mostly) a pen. There’s not much there that we don’t recognise. Indeed, not much that we wouldn’t recognise if we went back to when our parents were young. Perhaps there’s more rigour, but in the context of Sophie Hackford and the Google school of innovation and reform, it feels more like rigor mortis than bracing, academic stretch and dynamic aspiration for our young people in a new century.

[2] See https://www.gdst.net/article/gdst-summit-new-frontiers-equipping-girls-future

So, let’s not wonder (along with Hamlet) “why yet [we] live, to say this thing’s to do”. The assessment of our children need not be a tragedy if we can find ways to prepare them for examinations that require them to think and act differently and which make as much use as possible of the amazing new technological tools at our disposal. There are, indeed, “more things in heaven and earth than are dreamt of in [our] philosophy”. Time to embrace them, I think.

This article was first published in Independent Education Today

The Brain Chemistry of Eating Disorders

Jo, Year 13, explores what is happening chemically inside the brains of those suffering from eating disorders and shows how important this science is to understanding these mental health conditions.

An eating disorder is any of a range of psychological disorders characterised by abnormal or disturbed eating habits. Anorexia is defined as a lack or loss of appetite for food and an emotional disorder characterised by an obsessive desire to lose weight by refusing to eat. Bulimia is defined as an emotional disorder characterised by a distorted body image and an obsessive desire to lose weight, in which bouts of extreme overeating are followed by fasting, self-induced vomiting or purging. Anorexia and bulimia are often chronic and relapsing disorders, and anorexia has the highest death rate of any psychiatric disorder. Individuals with anorexia and bulimia are consistently characterised by perfectionism, obsessive-compulsiveness and dysphoric mood.

Dopamine and serotonin function are integral to both of these conditions; how does brain chemistry enable us to understand what causes anorexia and bulimia?

Dopamine

Dopamine is a compound present in the body as a neurotransmitter; it is primarily responsible for feelings of pleasure and reward and in turn influences our motivation and attention. It has been implicated in the symptom pattern of individuals with anorexia, specifically in relation to the mechanisms of reinforcement and reward involved in anorexic behaviours, such as restricting food intake. Dysfunction of the dopamine system contributes to characteristic traits and behaviours of individuals with anorexia, including compulsive exercise and the pursuit of weight loss.

In people suffering from anorexia, dopamine release is stimulated by restricting food to the point of starvation. People feel ‘rewarded’ by severely reducing their calorie intake, and in the early stages of anorexia the more dopamine that is released, the more rewarded they feel and the more reinforced restricting behaviour becomes. In bulimia, dopamine serves as the ‘reward’ and ‘feel good’ chemical released in the brain when overeating. Dopamine ‘rushes’ affect people with both anorexia and bulimia, but for people with anorexia it is starving that releases dopamine, whereas for people with bulimia it is binge eating.

Serotonin

Serotonin is responsible for feelings of happiness and calm – too much serotonin can produce anxiety, while too little may result in feelings of sadness and depression. Evidence suggests that altered brain serotonin function contributes to the dysregulation of appetite, mood and impulse control seen in anorexia and bulimia. High levels of serotonin may result in heightened satiety, meaning it is easier to feel full. Starvation and extreme weight loss decrease levels of serotonin in the brain. This results in temporary alleviation of negative feelings and emotional disturbance, which reinforces anorexic symptoms.

Tryptophan is an essential amino acid found in the diet and is the precursor of serotonin, meaning it is the molecule required to make serotonin. Theoretically, binging behaviour is consistent with reduced serotonin function, while anorexia is consistent with increased serotonin activity. So decreased tryptophan levels in the brain, and therefore decreased serotonin, increase bulimic urges.

Conclusions

Distorted body image is another key concept to understand when discussing eating disorders. The area of the brain known as the insula is important for appetite regulation and also for interoceptive awareness, which is the ability to perceive signals from the body like touch, pain and hunger. Chemical dysfunction in the insula, a structure in the brain that integrates the mind and body, may lead to distorted body image, which is a key feature of anorexia. Some research suggests that some of the problems people with anorexia have with body image distortion can be related to alterations in interoceptive awareness. This could explain why a person recovering from anorexia can draw a self-portrait of their body that is typically three times its actual size. Prolonged untreated symptoms appear to reinforce the chemical and structural abnormalities seen in the brains of those diagnosed with anorexia and bulimia.

Therefore, in order not only to understand but also to treat anorexia and bulimia, it is essential to look at the brain chemistry behind these disorders, so that we can better understand how to go about treating them successfully.

 

As teachers, do we need to know about big data?

Clare Roper, Director of Science, Technology and Engineering at WHS, explores the world of big data. As teachers, should we be aware of big data, and why? What data is being collected on our students every day? And, equally relevant, how could we increase awareness of the almost unimaginable possibilities that big data might expose our students to in the future?

The term ‘big data’ was first included in the Oxford English Dictionary in 2013, where it was defined as “extremely large data sets that may be analysed computationally to reveal patterns, trends, and associations.”[1] In the same year it was listed by the UK government as one of the eight great technologies that now receive significant investment, with the aim of ensuring the country is a world leader in innovation and development.[2]

‘Large data sets’ with approximately 10,000 data points in a spreadsheet have recently been introduced into the A Level Mathematics curriculum, but ‘big data’ is on a different scale entirely, with the amount of data expanding at such speed that it cannot be stored or analysed using traditional methods. In fact, it is predicted that between 2012 and 2020 the global volume of data will increase exponentially from 4.4 zettabytes to 44 zettabytes (i.e. 44 × 10²¹ bytes)[3], and data scientists now talk of ‘data lakes’ and ‘dark data’ (data that you do not know about).

But should we be collecting every piece of data imaginable in the hope it might be useful one day, and is that even sustainable or might we be sinking in these so-called lakes of data? Many data scientists argue that data on its own actually has no value at all and that it is only when it is analysed in context that it becomes valuable. With the introduction of GDPR in the EU, there has been a lot of focus on data protection, data ethics and the ownership and security of personal data.

At a recent talk at the Royal Institution, my attention was drawn to the bias that exists in some big data sets. Even our astute Key Stage 3 scientists will be aware that if the data you collect is biased, then inevitably any conclusions drawn from it will at best be misleading and more likely meaningless. The same premise applies to big data. The example given by Maja Pantic from the Samsung AI Lab in Cambridge referred to facial recognition, and the cultural and gender bias that currently exists within some of the big data behind the related software – but this is only one of countless examples of bias within big data on humans. With more than half the world’s population online, digital data on humans makes up the majority of the phenomenal volume of big data generated every second. Needless to say, those people who are not online are not included in this big data, and therein lies the bias.
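
To make that premise concrete, here is a minimal sketch (my own toy numbers, not Pantic’s data): if we estimate the average age of a population using only the people who happen to be online, and older people are less likely to be online, the ‘big data’ answer comes out systematically too low.

```python
import random

# Minimal sketch with assumed toy numbers: estimate the average age of a
# population, but only sample people who are "online" -- the same kind of
# bias that creeps into big data built from digital footprints.

random.seed(0)
population = [{"age": random.randint(5, 90),
               "online": random.random() < 0.55}   # just over half are online
              for _ in range(100_000)]

# In this toy model, older people are much less likely to be online.
for person in population:
    if person["age"] > 65:
        person["online"] = random.random() < 0.2

true_mean = sum(p["age"] for p in population) / len(population)
online_ages = [p["age"] for p in population if p["online"]]
online_mean = sum(online_ages) / len(online_ages)

print(f"True average age:           {true_mean:.1f}")
print(f"Average age of online only: {online_mean:.1f}  (biased downwards)")
```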

There are many examples in science where the approach to big data collection has been different from that used for data on humans (unlike us, chemical molecules do not generate an online footprint by themselves), and new fields in many sciences are advancing because of big data. Weather forecasting and satellite navigation rely on big data, and new disciplines have emerged, including astroinformatics, bioinformatics (boosted even further recently thanks to an ambitious goal to sequence the DNA of all life – the Earth BioGenome Project), geoinformatics and pharmacogenomics, to name just a few. Despite the fact that the term ‘big data’ is too new to be found in any school syllabi as yet, here at WHS we are already dabbling in big data (e.g. the MELT project, IRIS with Ark Putney Academy, Twinkle Orbyts, UCL with Tolcross Girls’ and Tiffin Girls’ and the Missing Maps project).

To grapple with the value of big data collections and what we should or should not be storing and analysing, I turned to CERN (the European Organisation for Nuclear Research). They generate millions of collisions every second in the Large Hadron Collider and therefore will have thought carefully about big data collection. It was thanks to the forward thinking of the British scientist Tim Berners-Lee at CERN that the world wide web exists as a public entity today, and it seems scientists at CERN are also pioneering in their outlook on big data. Rather than store all the information from every one of the 600 million collisions per second (and create a data lake), they discard 99.99% of this data as it is produced and only store data for approximately 100 collisions per second. Their approach is born from the idea that although they might not know what they are looking for, they do know what they have already seen[4]. Although CERN is not yet using DNA molecules for the long-term storage of its data, it seems not so far-fetched that one of a number of new start-up companies may well make this a possibility soon.[5]
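
The sketch below (illustrative only, and nothing like CERN’s actual trigger software) shows the same ‘decide at source’ idea applied to a stream of simulated events: a simple trigger condition is checked as each event is produced, and only the tiny fraction that passes is ever stored.

```python
import random

# Minimal sketch (illustrative, not CERN's trigger system): filter a stream
# of simulated "collision events" as they are produced, keeping only the
# rare ones that pass a trigger condition instead of storing everything.

KEEP_FRACTION = 0.0001   # keep roughly 0.01% of events, discard the rest

def event_stream(n_events):
    """Simulate events, each with a random 'energy' reading."""
    for i in range(n_events):
        yield {"id": i, "energy": random.random()}

def trigger(event):
    """Toy trigger: keep only the highest-energy events."""
    return event["energy"] > 1 - KEEP_FRACTION

stored = [e for e in event_stream(1_000_000) if trigger(e)]
print(f"Stored {len(stored)} of 1,000,000 events "
      f"({100 * len(stored) / 1_000_000:.3f}%)")
```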

None of us know what challenges lie ahead for ourselves as teachers, nor for our students as we prepare them for careers we have not even heard of, but it does seem that big data will influence more of what we do and invariably how we do it. Smart data, i.e. filtered big data that is actionable, seems a more attractive prospect as we work out how to balance intuition and experience against newer technologies reliant on big data, where there is a potential for us to unwittingly drown in the “data lakes” we are now capable of generating. Big data is an exciting, rapidly evolving entity and it is our responsibility to decide how we engage with it.

[1] Oxford Dictionaries: www.oxforddictionaries.com/definition//big-data, 2015.

[2] https://www.gov.uk/government/speeches/eight-great-technologies

[3] The Digital Universe of Opportunities: Rich Data and the Increasing Value of the Internet of Things, 2014, https://www.emc.com/leadership/digital-universe/

[4] https://home.cern/about/computing

[5] https://synbiobeta.com/entering-the-next-frontier-with-dna-data-storage/

The importance of collaborative learning

How can we encourage collaborative learning? Alex Farrer, STEAM Co-ordinator at Wimbledon High, looks at strategies to encourage creative collaboration in the classroom.

Pupils’ ability to work collaboratively in the classroom cannot just be assumed. Pupils develop high levels of teamwork skills in many areas of school life such as being part of a rowing squad or playing in an ensemble. These strengths are also being harnessed in a variety of subject areas but need to be taught and developed within a coherent framework.  Last week we were very pleased to learn that Wimbledon High was shortlisted for the TES Independent Schools Creativity Award 2019. This recognises the development of STEAM skills such as teamwork, problem solving, creativity and curiosity across the curriculum. Wimbledon High pupils are enjoying tackling intriguing STEAM activities in a variety of subject areas. One important question to ask is what sort of progression should we expect as pupils develop these skills?

The Science National Curriculum for England (DfE, gov.uk, 2015) outlines the “working scientifically” skills expected of pupils from Year 1 upwards. Pupils are expected to answer scientific questions in a range of different ways, such as in an investigation where variables can be identified and controlled and a fair-test type of enquiry is possible.

However, this is not the only way of “working scientifically”. Pupils also need to use different approaches, such as identifying and classifying, pattern seeking, researching and observing over time, to answer scientific questions. In the excellent resource “It’s not Fair – or is it?” (Turner, Keogh, Naylor and Lawrence), useful progression grids are provided to help teachers identify the progression that might be expected as pupils develop these skills. For example, when using research skills, younger pupils use books and electronic media to find things out and talk about whether an information source is useful. Older pupils can use relevant information from a range of secondary sources and evaluate how well their research has answered their questions.

Our STEAM lessons at Wimbledon High, in both the Senior and Junior Schools, utilise many of these “working scientifically” skills, and skill progression grids can be very useful when planning and pitching lessons. However, our STEAM lessons happen in all subject areas and develop a range of other skills including:

  • problem solving
  • teamwork
  • creativity
  • curiosity

Carefully planned cross-curricular links allow subjects that might at first glance be considered very different from each other to complement each other. An example of this is a recent Year 10 Art lesson into which STEAM was injected in the form of chemistry knowledge and skills. Pupils greatly benefited from the opportunity to put some chemistry into art and some art into chemistry as they studied the colour blue. Curiosity was piqued and many links were made. Many questions were asked and answered as pupils worked together to learn about Egyptian Blue through the ages and recent developments in the use of the pigment for biomedical imaging.

There are many other examples of how subjects are being combined to enhance both. The physiological responses to listening to different types of music made for an interesting investigation with groups of Year 7. In this STEAM Music lesson, pupils with emerging teamwork skills simply shared tasks between members of the group. Pupils with more developed teamwork skills organised and negotiated different roles in the group depending on identified skills. They also checked progress and adjusted how the group was working in a supportive manner – a skill that often takes considerable practice for many of us!

Professor Roger Kneebone from Imperial College promotes the benefits of collaborating outside your own discipline. He recently made headlines when he discussed the dexterity skills of medical students. He talks about the ways in which students who take part in an artistic pursuit, play a musical instrument or do a sport develop these skills. He believes that surgeons are better at their job if they have learned the skills that being in an orchestra or a team demands. High levels of teamwork and communication are essential to success in all of those fields, including surgery!

Ensuring that we give pupils many opportunities to develop these collaborative skills, both inside and outside of lessons, is key. We must have high expectations of the way pupils progress in developing these skills, and regular opportunities to extend and consolidate them are also important. It is essential to make clear to pupils at the start of an activity what the skill objective is and what the skill success criteria are. It is hard to develop a skill if it is not taught explicitly, so modelling key steps is helpful, as is highlighting the following to pupils:

  • Why are we doing this activity?
  • Why is it important?
  • How does it link to the subject area?
  • How does it link to real-life applications?
  • What skills are we building?
  • Why are these skills important?
  • What sort of problems might be encountered?
  • How might we deal with these problems?

Teacher support during the lesson is formative and needs to turn a spotlight on successes, hitches, failures, resilience, problems and solutions. For example, the teacher might interrupt learning briefly to point out that some groups have had a problem but after some frustrations, one pupil’s bright idea changed their fortunes. The other groups are then encouraged to refocus and to try to also find a good way to solve a specific problem. There might be a reason why problems are happening. Some groups may need some scaffolding or targeted questioning to help them think their way through hitches.

STEAM lessons at Wimbledon High are providing extra opportunities for pupils to build their confidence, and to be flexible, creative and collaborative when faced with novel contexts. These skills need to be modelled and developed, and progression needs to be planned carefully. STEAM is great fun, but serious fun, as the concentration seen on faces in the STEAM space shows!

Twitter: @STEAM_WHS
Blog: http://www.whs-blogs.co.uk/steam-blog/

Making a living as a composer in the 21st Century – 29/06/18

Miss Katie Butler, Performing Arts Assistant at WHS and professional composer, looks at the important role of being a composer in the 21st Century.

Introduction

The role of the composer in society has changed a great deal over the centuries. Before the invention of writing and printing, music would have been passed down through oral tradition since time immemorial, but the first musical notation systems can be traced back to Ancient Greece. From there, the ability to notate music made it easier to create longer-form, more complex works, and through the centuries the process developed, from plainsong and early polyphony to the more defined periods of Western art music that we learn about in GCSE and A Level music (Renaissance, Baroque, Classical, and Romantic, up to the present day).

From pen to screen: how has technology changed the composition process?

With the explosion of technology and readily accessible media that has happened in more recent decades, there are more ways to be a composer than ever before – meaning the competition is much greater, but at the same time, so are the opportunities available. Now that we have composing software like Cubase and Logic, and sample libraries (that is, plugins of pre-recorded instruments that allow you to recreate a realistic orchestral sound from your computer), composing is no longer exclusively for those with formal musical education and the ability to read music, or a big budget to record live musicians in studios, and the lines between composer, orchestrator, sound designer and producer are becoming increasingly blurred.

In an age where anyone with a laptop can be a composer, how does this affect the opportunities open to us, and how do we take the step from composing for ourselves to making a living from it?

A little history

Going back through the centuries, many of the great Classical composers were financially able to compose the volume of work they did because of aristocratic patronage. Rich families would appoint composers to write music for private performance in their homes, providing them with a regular income and guaranteed performance opportunities in return for entertainment and the improvement of their own social standing and influence. This Classic FM article will introduce you to some of the major patrons through history. The process was similar for performers and writers; actors and musicians would be affiliated to specific families, and without patronage we would not have the majority of Shakespeare’s work. Musicians have been able to make a more sustainable living from composing ever since copyright was introduced (in its earliest form in the late 18th century, and in its present form since the early 20th). With rights and royalties, the great composers of previous eras would be earning a great deal more today than they did when they were alive.

The power of the internet

Fast-forward five hundred years or so, and it’s a concept that’s still present today. Now that music is so widely accessible, the modern-day “patron” is just a customer who downloads an album, goes to a gig or concert, or buys sheet music. Websites like Patreon and Kickstarter allow freelancers to invite their followers and fans to fund their work, providing exclusive and personalised content for those that subscribe. The internet is also a brilliant platform for performers to advertise their talents, as we have seen with the explosion of the “YouTuber” and Vine artists – for example Justin Bieber, Carly Rae Jepsen and Charlie Puth, who were all catapulted to stardom having first been spotted on their YouTube channels.

The same goes for composers. We can now market our work online with a website, and behind all the YouTube videos, blogs and adverts there is music that gets used in them, with many composers gaining a sizeable portion of their income from writing “library music”: individual tracks that can be used for all sorts of media, from adverts, corporate and educational videos to television and film. Library music companies invite submissions from composers; tracks are then professionally recorded and labelled for production companies to browse online, and composers are normally paid a one-off fee for the unlimited use of their music. One of the leading library music sites is Audio Network – take a look around the website to see the multitude of different styles that are available. Does it take the soul out of the process? Perhaps, but what it lacks in soul, it makes up for in flexibility, freedom and creative control, without the tight deadlines and clashing egos of film and television. Learn more from some composers who are making a living from library music here.

Film and television

Another strand of composing is for film and television, which has had a huge increase in popularity in recent years. It’s a career that relies almost entirely on building relationships with directors, writers and producers, and slowly working your way up. Film music has to fit a picture exactly, mirroring the movements onscreen, conveying emotion, and is very collaborative. It also involves working with directors who don’t necessarily know what they want, and requires such a broad knowledge and understanding of so many different genres of music that many people come to film composing later in their careers. While potentially hugely lucrative and undoubtedly one of the most exciting, rewarding composing careers, it is perhaps the most difficult one to break into.

Musical theatre

From the days of classical patronage to today, in order to earn a living as a composer our output is largely controlled by whoever is paying us – be this a patron, an advertising executive or a film director – but an area that allows more creative control than usual is musical theatre. Andrew Lloyd Webber monopolised the West End for decades, but his more recent original productions have been relative commercial flops (for example, the Phantom sequel Love Never Dies (2010), and the bizarre Stephen Ward (2013), which closed after three months), and he is now channelling his focus into helping the next generation of musical theatre writers and composers.

Love Never Dies – a musical failure? Or the catalyst for promoting young composers?

In 2017, he purchased the St James Theatre and renamed it The Other Palace, with the main purpose of bolstering new musicals; it hosts regular open-mic nights as well as workshops and showcases of new work. Off the back of this, composers can then earn money from licensing shows for amateur performance, or from the transfer of a show to a bigger theatre. Because the process from page to stage takes a great deal of time, other forms of income are still vital. Commercial song-writing allows this freedom to an extent, and there is a faster turnover of projects, but there is still pressure from record labels to write hits that will sell, and the competition is greater than for any other medium.

What can I do now?

As for where to get started while at school or university: GCSE and A Level Music courses will introduce you to the techniques used for composing and give you a chance to try it out, before specialising in university and postgraduate study, where you have the creative freedom to explore your own personal style without worrying about the mark schemes and hoop-jumping that comes with passing exams. You can also come along to our various composition clubs that take place during the week, where you have the freedom to work on your music. Early composition assignments can feel like creativity by numbers, but as they say, you have to learn the rules like a pro so you can break them like an artist…

It’s harder to get started making an income composing than in a lot of careers, but once established, there is essentially no cap on how far it is possible to go. It’s about finding your niche and a way of making it work for you, and new music (particularly by female composers) is being championed more now than ever. Here are some links specific to young female musicians:

PRS Women Make Music

Women In Music

Glyndebourne: Balancing the Score

If you think composing might be your thing then immerse yourself in learning more about your craft – go to gigs and concerts, see films in the cinema with the high-quality speakers and surround sound, explore both the West End and Off-West End theatre scenes (many shows have cheaper ticket lotteries or day tickets, and seats at the back for as little as £20). Seeing how others do it is the best way to learn how to do it yourself, and as Wimbledon residents with central London practically on our doorsteps, there really is no excuse not to! Most importantly, be brave and put your music out there so that people can see what you can do.

Happy writing!

Nanotechnology and its future in medicine – 07/09/18

Maya (Year 11) discusses the uses of nanotechnology in medicine, thinking about how far it has come and how it has helped doctors. She also considers the dangers of using such small technology and the future benefits it may bring.

Technology in medicine has come far, and with it the introduction of nanotechnology. Nanotechnology is the manipulation of structures and properties at the atomic and molecular level; a nanometre is one-billionth of a metre. This technology has many uses, such as electronics, energy production and medicine, and is valuable for its diverse applications. Nanotechnology is useful in medicine because of its size and how it interacts with biological molecules of the same proportion or larger. It is a valuable new tool that is being used for research and for combatting various diseases.

In medicine, nanotechnology is already being used in a wide variety of areas, the principal one being cancer treatment. In 2006, a report issued by NanoBiotech Pharma stated that developments related to nanotechnology would mostly be focused on cancer treatments. Drugs such as Doxil, used to treat ovarian cancer, use nanotechnology to evade and surpass the possible effects of the immune system, enabling the drug to be delivered to disease-specific areas of the body. Nanotechnology is also helping in neuroscience, where European researchers are currently using the technology to carry electrical activity across the dead brain tissue left behind by strokes and illnesses. The initial research was carried out to get a more in-depth analysis of the brain and to create more bio-compatible grids (a piece of technology that surgeons place in the brain to find where a seizure has taken place). The result is more sophisticated than previous technologies and, when implanted, does not cause as much damage to existing brain tissue.

Beyond helping to combat cancer and aid research, nanotechnology is used in many areas of medicine, from appetite control to medical tools, bone replacement and even hormone therapy. Nanotechnology is advancing all areas of medicine, with nano-sized particles enhancing new bone growth; there are even wound dressings containing nanoparticles that provide powerful microbial resistance. It is with these new developments that we are revolutionising the field of medicine, and with more advancements we will be able to treat diseases as soon as they are detected.

Scientists are hoping that in the future nanotechnology can be taken even further, to stop chemotherapy altogether: fighting cancer by using gold and silica particles combined with nanotechnology to bind to the mutated cells in the body, and then using infra-red lasers to heat up the gold particles and kill the tumour cells. This application would be beneficial as it would reduce the risk of surrounding cells being damaged, since the laser would not affect them as much as chemotherapy would.

In other areas, nanotechnology is developing further in diagnostics and medical data collection. This means that, by using this technology, doctors would be able to look for the damaged genes that are associated with particular cancers and screen tumour tissue faster and earlier than before. This process involves nano-scale devices being distributed through the body to detect chemical changes. There is also an external scan using quantum dots on a patient’s DNA, which is then sequenced to check whether they carry a particular debilitating gene, providing a quicker and easier method for doctors to check in detail if a patient has contracted any illnesses or diseases. Furthermore, doctors will be able to gain a more in-depth analysis and understanding of the body by use of nanotechnology, surpassing the information found from x-rays and scans.

While this is a great start for nanotechnology, there is still little known about how some of the technology might affect the body. Insoluble nanoparticles, for example, could have a high risk of building up in organs as they cannot diffuse into the bloodstream. And because nanoparticles are so small, there is no controlling where they could go, which might lead to nanoparticles entering cells and even their nuclei, which could be very dangerous for the patient. The House of Lords Science and Technology Committee has reported concerns about the effects of nanotechnology on human health, stating that sufficient research has not been conducted on “understanding the behaviour and toxicology of nanomaterials” and that it has not been given enough priority, especially given the speed at which nanotechnology is being produced.

Nanotechnology is advancing medical treatment at a rapid rate, with new innovative technologies approved each year to help combat illnesses and diseases. Whilst more research needs to be conducted, the application of nano-medicine offers projected benefits that have the potential to be hugely valuable. Overall, given the great burden that conditions like cancer, Alzheimer’s, HIV and cardiovascular disease impose on current healthcare systems, nanotechnology and its advanced techniques will revolutionise healthcare as it progresses.

@Biology_WHS 

Critical Thinking: “the important thing is not to stop questioning.” – Albert Einstein

Richard Gale, teacher of Biology at WHS, looks at the value of critical thinking and how we can use this to help make logical and well-structured arguments.

At some point we all accept a fact or an opinion without challenging it, especially if we deem the person telling us the fact or opinion to be in a position of authority.

Challenging or questioning these people can seem daunting and rude, or at worst we could appear ignorant or stupid. However, if we never challenged or questioned ideas or perceived facts then the world would still be considered to be flat, and we would not have the theories of relativity or evolution.

This is what Einstein is getting at, that all ideas and preconceived facts should be questioned otherwise society will stagnate and no longer advance in any field of study. This process of constantly asking questions and challenging ideas is known as critical thinking.

It is said that someone who is a critical thinker will identify, analyse, evaluate and solve problems systematically rather than by intuition or instinct; almost a list of higher order thinking skills from Bloom’s taxonomy. The reason for placing critical thinking as a key higher order skill is because, as Paul and Elder (2007) noted “much of our thinking, left to itself, is biased, distorted, partial, uninformed or down-right prejudiced.  Yet the quality of our life and that of which we produce, make, or build depends precisely on the quality of our thought.”

In essence, critical thinking requires you to use your ability to reason. It is about being an active learner rather than a passive recipient of information, asking questions to understand the links that exist between different topics. It requires learners to weigh up the importance and relevance of evidence and arguments, identifying which arguments are weak and which are stronger; to build and appraise their own arguments; and to identify inconsistencies and errors in arguments and reasoning, doing all of this in a systematic and consistent way. Then they should reflect on the justification for their own assumptions, beliefs and values. As Aristotle put it, "it is the mark of an educated mind to be able to entertain a thought without accepting it."

Critical thinkers rigorously question ideas and assumptions rather than accepting them at face value. They will always seek to determine whether the ideas, arguments and findings represent the entire picture, and are open to finding that they do not. In principle, anyone stating a fact or an opinion, and I am definitely including myself here as a teacher, should be able to reason why they hold that fact or opinion when asked questions, and should be able to convince a class or an individual that those ideas have merit. Equally, as I know my pupils would attest to, pupils should be able to reason why they hold their opinions or ideas when questioned. Whilst this may seem daunting and at times a bit cruel, being able to think critically has become a very important skill with the onset of the new A levels.

In Biology, under the reformed linear A level, there has been an increase in the percentage of marks awarded for higher order thinking skills, termed AO2 and AO3. AO2 is to 'apply knowledge and understanding of scientific ideas, processes, techniques and procedures', whereas AO3 is to 'analyse, interpret and evaluate scientific information, ideas and evidence, including in relation to issues'. Across the three papers, AO2 is weighted at 40-45% of the overall marks and AO3 at 25-30%. The pupils taking the exams are expected to critically interpret data and theories, as well as to analyse and apply the information they have learnt in completely novel situations. The following quote from Carl Sagan is now more significant, as knowing facts is no longer enough for pupils to succeed: "knowing a great deal is not the same as being smart; intelligence is not information alone but also judgment, the manner in which information is collected and used."

Thankfully, we can develop and train ourselves – and others – to be critical thinkers. There is a plethora of guides and talks on how we can develop our skills as critical thinkers, and choosing which one is most useful is tricky and, to an extent, futile, as they all repeat the same basic principles with different language and animations. I have tried to summarise these as follows:

  1. Always ask questions of the fact or information provided and keep going until you are satisfied that the idea has been explained fully.
  2. Evaluate the evidence given to support the idea or fact; misconceptions are often based on poor data or interpretations. What is the motive of the source of the information? Is there bias present? Do your research and find the arguments for and against: which is stronger, and why?
  3. Finally, do not assume you are right; remember that we ourselves have biases and should challenge our own assumptions. What is my truth? What are the truths of others?

We can practise these skills in any lesson or lecture, as well as when we are reading, to help develop a deeper understanding of a text. Evaluating an argument requires us to consider whether it is supported by enough strong evidence.

Critical thinking skills can be practised at home and in everyday life by asking people to provide a reason for a statement. This can be done as they make the statement, or by playing games: for example, each person has to swap three items they currently have for three things they want, and then rationalise each choice. You can even engage in a bit of family Desert Island Discs, taking it in turns to practise your Socratic questioning (treat each answer with a follow-up question).

There are a few pitfalls to consider when engaging with critical thinking. The first of these is ignorant certainty: the belief that there are definite correct answers to all questions. Remember that all current ideas and facts are just our best interpretation of the information or data we currently have to hand, and all of them are subject to re-evaluation and questioning. The next pitfall is more specific to critical thinking and is naïve relativism – the belief that all arguments are equal. While we should consider all arguments, we cannot forget that some arguments are stronger than others and some are indeed wrong. Even Isaac Newton, genius that he was, believed that alchemy was a legitimate pursuit.

Critical thinking is not easy; you have to be prepared to let go of your own beliefs and accept new information. Doing so is uncomfortable, as we define ourselves by our beliefs, but ultimately it is interesting and rewarding. As you explore your own beliefs and those of others through questioning, evaluating, researching and reviewing, know that this is enriching your ability to form arguments and enhancing your opinions and thoughts. You do not know what you will discover or where your adventure will take you, but it will take you nearer to the truth, whatever that might be. Whilst on your journey of lifelong learning, remember to "think for yourselves and let others enjoy the privilege to do so, too" (Voltaire).

Follow @STEAM_WHS and @Biology_WHS on Twitter.