The Theory of Deconstruction – 21/09/18

Ava (Year 13, Head Girl) explores the Theory of Deconstruction as proposed by Derrida and discusses the confusing nature of both ideas and words.

Deconstruction is a theory put forward principally in the late 1960s and 1970s by the French philosopher Jacques Derrida, a man known for his leftist political views and, apparently, supremely fashionable coats. His theory essentially concerns the dismantling of our excessive loyalty to any particular idea, allowing us to see the aspects of truth that might be buried in its opposite. Derrida believed that our thinking is riddled with an unjustified tendency always to privilege one thing over another; critically, this privileging involves a failure to see the full merits and value of the supposedly lesser half of the equation. His thesis can be applied to many age-old questions: take men and women, for example. Men have systematically been privileged over women for centuries (for no sensible reason), meaning that society has often undervalued or undermined the full worth of women.

Now this might sound like an exceedingly simplistic world view, as though Derrida were suggesting a sort of anarchy of language. But Derrida was far subtler than this – he simply wanted to use deconstruction to point out that ideas are always confused and riddled with logical defects, and that we must keep their messiness constantly in mind. He wanted to cure humanity of its love of crude simplicity and make us more comfortable with the permanently oscillating nature of wisdom. This is where my new favourite word comes in: Aporia, a Greek word meaning puzzlement. Derrida thought we should all be more comfortable with a state of Aporia, and suggested that refusing to deal with the confusion at the heart of language and life was to avoid grappling with the fraught and kaleidoscopic nature of reality.

This leads neatly on to another of Derrida’s favourite words: différance, a critical outlook concerned with the relationship between text and meaning. The key idea is that you can never actually define a word; instead, you merely defer to other words, which in themselves have no concrete meanings. It all sounds rather airy-fairy and existentialist at this level, but break it down and it becomes utterly reasonable. Imagine you have no idea what a tree is. If I try to explain a tree to you by saying it has branches and roots, this only works if you understand these other words. Thus I am not truly defining ‘tree’, but merely deferring to other words.

Now if those words themselves cannot be truly defined either, and you again have to defer, this uproots (excuse the pun!) the entire belief system at the heart of language. It is in essence a direct attack on Logocentrism, which Derrida understood as an over-hasty, naïve devotion to reason, logic and clear definition, underpinned by a faith in language as the natural and best way to communicate.

Now, Derrida was clearly no fool. He did not believe that all hierarchies should be removed, or that we should get rid of language altogether, but simply that we should be more aware of the irrationality that lies between the lines of language, willingly submit to a more frequent state of “Aporia”, and spend a little more time deconstructing the language and ideas that have made up the world we live in today.

Where academic and pastoral meet: why we should value what we remember and will remember what we value.

Fionnuala Kennedy, Deputy Head (Pastoral), looks at research into memory and how it can be used to aid revision for examinations.

As with most of my thoughts about education, this one was provoked by a conversation over supper and a glass of wine with someone not involved in the educational field. Unlike most of my thoughts about education, it is based on the work of a Dutch psychologist and Chess Master born in 1914, whose initial thesis, “Het denken van den schaker”, was published in 1946 (the English translation, “Thought and Choice in Chess”, appeared in 1965).

During the 1940s, 50s and 60s, Adriaan de Groot conducted a series of cognitive chess experiments which ultimately formed the basis for ‘chunking’ theory and allowed for the development of chess computers. Testing all levels of chess player, from rank beginners through to Grand Masters, de Groot’s goal was to explain how the very best chess players could visually absorb a full chess board, assess the positions of pieces, process the different moves they could make next and rank them in order of preference, all within seconds. This process was divided into four key phases, occurring rapidly in sequence:

  1. The orientation phase – assessing the position and coming up with general ideas of what to do
  2. The exploration phase – analysing concrete variations
  3. The investigation phase – deciding on the best move
  4. The proof phase – confirming the validity of the choice reached in phase three.

This in itself is an incredibly useful model of thought and study, particularly for the examination student under pressure of time. It is, however, not this which really piqued my interest in de Groot’s study, but rather the next phases of his thinking which have since been built upon by psychologists in the US.

Having determined the role of visual perception and thought processes of Grand Masters that lead to their success, de Groot went on to consider how they would memorise and what it was about that method of memory which made them so particularly successful. And the findings were – and are – fascinating.

In de Groot’s most famous demonstration, he showed several players images of chess positions for a few seconds and asked the players to reconstruct the positions from memory.  The experts – as we might predict – made relatively few mistakes even though they had seen the position only briefly.  So far, so impressive. But, years later, Chase and Simon replicated de Groot’s finding with another expert (a master-level player) as well as an amateur and a novice.  They also added a critical control: the players viewed both real chess positions and scrambled chess positions (that included pieces not only in random positions, but also in implausible and even impossible locations). The expert excelled with the real positions – again, as might have been predicted – but performed no better than the amateur and even the novice for the scrambled positions. In essence, then, the expert advantage seems only to come from familiarity with actual chess positions, something that allows more efficient encoding or retrieval of the positions. The grand master’s memory, the test suggests, will only have absorbed the positions on the board which matter to them, which have meaning and purpose; it is not that their memories are simply ‘better’, or better-trained, but that they have become more efficient in storing meaningful patterns. Without that meaning, the expert and the novice will both struggle equally.

And this amazed me, and got me thinking. As educators, we know that theories about the ways in which we think and remember come and go, and that pupils may learn in different ways, at different ages, with varying degrees of success; we shouldn’t, therefore, jump on too many pedagogical bandwagons. I know, for example, that I am almost certainly more reliant on auditory and visual modes of learning than kinaesthetic ones, but I suspect that’s because the latter didn’t really exist when I was at school; and I also tend to believe that I remember letters and words better than numbers, but this I now recognise to be because I grew up with parents who listened to music and read literature. It is not that our brains can or cannot remember aspects of learning; it is not necessarily that we have different ways of thinking and remembering and learning, or indeed brains which ‘absorb’ certain information better or worse than others. Rather:

We will remember that to which we ascribe value; we will memorise where there is pattern and meaning.

Which only adds more grist to the mill of Mrs Lunnon’s message delivered in our opening assembly this term: ‘What I do is me: for that I came’ (Gerard Manley Hopkins). If we approach learning as a task which must be achieved simply to obtain an end-goal, we simply will not learn as well. Rather, if each task is ascribed a meaning and value in and of itself, it will become much easier to remember and store away. Thinking ‘I want to get 10/10 in my Spanish vocab test because I want to be top of the class’ will only make your task more difficult. Looking at each word you are learning and putting it into a context where you might use it one day, or including it in a joke in Spanish, or making a connection between the words, will save you time and maximise the chances of your brain storing that information away for longer.

What’s more – and this is where the pastoral side really kicks in – such an approach takes away the slog and grind of learning. Instead, meaning will surround us and be ascribed in all we do. And, of course, more excitingly than that: if we are on the look-out for meaning, it will help us to find the area which feels the most meaningful for us, in which we can readily spot and identify patterns of meaning and which fills us with joy and satisfaction. And it is this, and not simply a desire to do well or know more, which will lead to true mastery as we negotiate the chess board of our own learning and lives.

Follow @DHPastoralWHS and @Head_WHS on Twitter.