There is much to learn about literature. Great works yield data about human cultures, about history, society and politics, about language. The student or scholar can be a dispassionate anthropologist, scientist, or social historian of literature, seeking to acquire new knowledge.
We can also learn from literature, just as we learn from our parents, from past experiences, from those wiser than us. Literary study can encourage new encounters and exchanges that allow us to learn from others.
Learning about and from literature are not, and should not be, opposed stances. Educational settings encourage different emphases, but in such an intensely personal matter, it’s for each of us to choose our outlook on literature and its significance.
(inspired by a piece in memory of Kenneth W. Morgan)
A philosophical opinion piece called ‘Do Thrifty Minds Make Better Brains?’ by Andy Clark, a professor of logic and metaphysics at Edinburgh, relates how our thrifty minds conserve energy and activity by being ‘engines of prediction’. He draws on research in neuroscience to argue that we amass a bank of stored images that replace (and sometimes cancel out) new sensory data about phenomena we’ve seen before. The mind uses its existing knowledge to avoid having to process everything that confronts us as though new; instead, the brain registers anomalies from the expectations it has created: ‘What is marked and passed forward in the brain’s flow of processing are the divergences from predicted states: divergences that may be used to demand more information at those very specific points, or to guide remedial action.’
Even at this point, Clark’s argument rings many literary bells for me: George Eliot, for instance (the squirrel’s heart-beat line I’ve mentioned before), or T.S. Eliot’s ‘Humankind cannot bear very much reality’, given an explicitly Christian reading here; there are religious ideas lurking near these arguments. Clark also alludes to possible literary applications when he meditates on the idea that perception and imagination are linked activities:
[P]erception (at least of this stripe) now looks to be deeply linked to something not unlike imagination. For insofar as a creature can indeed predict its own sensory inputs from the “top down,” [i.e. is knowledge-driven] such a creature is well positioned to engage in familiar (though perhaps otherwise deeply puzzling) activities like dreaming and some kind of free-floating imagining. These would occur when the constraining sensory input is switched off, by closing down the sensors, leaving the system free to be driven purely from the top down. We should not suppose that all creatures deploying this strategy can engage in the kinds of self-conscious deliberate imagining that we do. Self-conscious deliberate imagining may well require substantial additional innovations, like the use of language as a means of self-cuing. But where we find perception working in this way, we may expect an interior mental life of a fairly rich stripe, replete with dreams and free-floating episodes of mental imagery.
Clark uses the example of the hollow-face illusion, which is much more easily appreciated by watching the video attached to his article than by explanation. In essence, Clark takes this famous illusion (it turned up in the Royal Institution Christmas Lectures this year, with Einstein’s head instead of Chaplin’s) as a good example of how the mind’s activity affects our perception: we always see the rounded face because the idea of a hollow face is so foreign to us that we reject the incoming sensory data, and plump for a fictional, illusory perception.
The article doesn’t actually specify any view about what literature or the arts involve, but I do find a set of productively disagreeable implications (which I stress are not made in the article) that could grow out of it. If you were to define an artist as someone with ‘an interior mental life of a fairly rich stripe, replete with dreams and free-floating episodes of mental imagery’, in short someone wildly imaginative, then you might be led to think that arty types love to play with fictions and deal with counter-factuals. In this view the artist toys with reality and lets the mind run free. Clark’s line about ‘use of language as a means of self-cuing’ could imply that poetry is a literary form that encourages the mind to engage in such creative, associative play. The creative mind makes its own false rounded faces and indulges its fantasies.
This sounds reasonable enough, and there may be something in it. But I also disagree that this provides anything like an approximation of what artistic contemplation and creation may accomplish. For a start, this state only occurs when ‘the constraining sensory input is switched off, by closing down the sensors’, so you can forget any naturalistic observation, and this also shuts down any social engagement in the relevant art: Clark refers elsewhere to how our imagination stores prejudices. In fact, I think this model of artistic creation as unplugged from reality and left to engage in play is positively immoral because it would define artistic creation as the reorganization of false certainties, severed from truth, and unwilling to participate in the world.
When I watch the hollow-face illusion, I try to convince my mind to discard its illusion and to see the hollow face that’s really there. I do this in vain, but I don’t want to settle for the illusion. Similarly, I would always pick the red pill in the world of the Matrix. Likewise, many great artists are intensely curious and inquisitive people who don’t want to accept fictions, but want to use their imagination to re-route the mind so that it can see what’s there more clearly. There are many ways to try to see the hollow face, such as to observe it as closely as possible (realism), or to realize that the mask you see has a deeper meaning, i.e. a hollow face (symbolism).
I’ve been reading Wallace Stevens and his early critics recently, and there is a great deal of pertinent material to draw on here. Stevens has been seen as the ‘poet of consciousness’ par excellence. Here is verse about how a singer who ‘sang beyond the genius of the sea’ fashions her world through her singing:
It was her voice that made
The sky acutest at its vanishing.
She measured to the hour its solitude.
She was the single artificer of the world
In which she sang. And when she sang, the sea,
Whatever self it had, became the self
That was her song, for she was the maker. Then we,
As we beheld her striding there alone,
Knew that there never was a world for her
Except the one she sang and, singing, made. (The Idea of Order at Key West)
The singer’s voice creates her world, parallel to the world as it is, but still a beautiful fiction. The actual sea is as inaccessible as the hollow face. Stevens’ poem may seem to go against my argument, because he explores the impossibility of seeing or singing the sea as it is. But his poetry can be read as an exploration of how we reach and deal with these limitations, as human beings. He writes elsewhere that ‘I thought we had reached a point at which we could no longer believe in anything unless we recognized that it was a fiction [….] there are fictions that are extensions of reality [….] Heaven is an extension of reality’ (letter to Henry Church, 8 December 1942) and that ‘[poetic] truth is an agreement with reality, brought about by the imagination of a man strongly disposed to be influenced by his imagination, which he believes, for a time, to be true’ (‘Figure of the Youth as Virile Poet’). These are the serious thoughts of a man engaged in the world, not someone withdrawing to a domain of happy fancy.
For Stevens here the imagination is an instrument used to reach towards truth once we know that it is unreachable: there are necessary fictions. There is mystery on the land and in the seas. In his poetry Stevens uses language to test out these limitations, and to discover the point where the mind stops us from seeing more clearly. I doubt Stevens could have seen the hollow face either, but his poetry doesn’t just mess around with versions of the illusory rounded face either—his poetry is not divorced from the understanding, despite Yvor Winters’ view that Stevens gives us ‘the most perfect laboratory of hedonism to be found in literature [….] his ideas have remained essentially unchanged for more than a quarter of a century’. His poetry, and other poetry, is a different type of laboratory: one conducting advanced experiments in how we build our sense of reality, aware that any such investigations cannot ignore the fallibility of the investigator. The imagination can do more than juggle prejudices and preconceptions into new forms: it can help us to dabble in reality and try to see hollow faces.
I visited the Wellcome Collection last week and browsed through the Code of Life. A wall was filled with shelves of the human genome, mapped and printed as sequences of letters in many volumes for each chromosome. Here’s a tiny extract from Chromosome Six:
Together the letters correspond to an individual human being, and no two people have the same sequences of letters (unless they’re identical twins? I hope my novice biological knowledge is holding up here). It’s important for the human species that the population retains the widest possible body of genes to encourage variation and adaptation so that the species stays strong. We need a large gene pool.
Gene pools are connected to our idea of liberty. John Stuart Mill’s classic treatise On Liberty was published in 1859, the same year as Charles Darwin’s Origin of Species. Mill ‘insists on freedom of thought as the only effective means for keeping the gene pool of ideas well-stocked and ready to generate valuable original notions that can improve the general sum of happiness’ (see the article by Scott Rosenberg that this quotation comes from for more). Liberty allows societies to keep healthy in future generations, and not succumb to the intellectual incest of authoritarian regimes, censorship or other forms of ideological control.
One aspect of liberty is to keep our gene pool of ideas splashy by allowing lots of different languages, words and phrases to co-exist. There isn’t a single Book of Knowledge (no, not even Wikipedia), just as every human genome is a little bit different. It’s certainly true that we each restrict our intake of words and language, and that this helps form our identity, but the point about liberty is that we keep control. So we can expect that illiberal forces in society will seek to control the language and ideas we’re exposed to, and for this control to be effective it helps if we don’t know about it.
In an article called ‘What if we Occupied Language?’ H. Samy Alim points out that the Occupy movement has successfully modified the associations of the word ‘occupy’ for Americans so that they no longer think first of Iraq, but think of protest movements instead. The piece goes on to discuss ways of reclaiming language: ‘in the face of such widespread language-based discrimination, Occupy Language can be a critical, progressive linguistic movement that exposes how language is used as a means of social, political and economic control.’
And here’s Eli Pariser talking about how Google and Facebook create invisible ‘filter bubbles’ with algorithms that screen what information we receive by anticipating our wants. He fears that our young internet doesn’t have the ethical checks to make sure we find out what we need as well as what we want.
Literature—you could see where I’m going with this—is a valuable way to keep the gene and word pools of society lively. It’s a seed-bank that guards our stocks of different ideas and perspectives, and stops us from becoming too much like verbal and ideological clones. There is tension here in how literature serves that duty, though. Over at the New York Review of Books, sharp words were exchanged between Rita Dove, who defended her anthology of twentieth-century American poetry, and Helen Vendler, who reviewed it. Dove’s inclusive anthology picks from a wide range of writers from different backgrounds who all use language in different ways. Vendler rounds on the book for not doing enough justice to the canonical poets, who we can understand are those who exert exquisite control and strength through their language, which later readers can imitate. As Toni Morrison and others point out (see this blog post for a discussion), Vendler’s sort of canon is like a powerful national empire, one which does not necessarily work towards liberty. This is a fundamental difference, so fundamental that Vendler was involved in a similar spat in the 1970s when reviewing another anthology. The fight is over how literature best serves liberty (if we accept the consensus in the U.S. and Europe that the two go together).
Universities and liberal education in general also help a society’s gene pool to stay strong. Here’s a piece by Keith Thomas on how the crisis facing British universities affects everyone from student to professor, as this public function goes unappreciated by a UK government wanting to make a market in higher education. And that exhausts my list of recommended holiday reading links. This is the type of post that I’d like to be able to read not just next year, but in fifty and five hundred years’ time, to see how times have or have not changed.
The only time that I’ve been encouraged to memorize poetry was for an individual verse speaking competition at secondary school. I learnt short poems like William Blake’s ‘A Poison Tree’ and Wilfred Owen’s ‘Anthem for Doomed Youth’ by heart, and found it more enjoyable than daunting. I’ve not thought much about memorizing poetry since, except that it feels a very traditional way to study poetry: I was learning poems as an extra-curricular activity, not in an English lesson.
Recently I’ve come to realize that it’s an extremely valuable activity if practiced properly. I don’t much care to become a walking minstrel, or to use it to improve mental agility, or to chuck away my mp3 player or the smartphone I don’t own because I have the music of poetry running round my head. I’m not interested in it for the same reason that the author of this essay called ‘Got Poetry?’ is. No, memorizing poetry fascinates me because it forces you to meet a poem on its own terms: to hear it well, read it several times, understand its structure, internalize its rhythms, and empathize with the speaker.
I’ve occasionally said, somewhat flippantly, that my Cambridge English degree taught me everything apart from how to read at the right speed. I can spend an hour poring over a single poem, and I can leaf through whole books to extract just the quotation I think I want. But it’s only by pausing to absorb a poem that it begins to radiate before me, and this sensitivity is trained by having the poem echo in my skull. Memorization helps me produce those conditions, even if I don’t spend long enough with a poem to be able to recite it a few days later.
This old technique is also teaching me how poetry is written. Great poets have usually read, studied and imitated previous great poets. By insisting on originality and resisting disciplined study of poetry’s music, I believe—and my conservatism surprises me—we miss out on how humans developed advanced language skills. We learn language through imitating our parents, and perhaps we learn poetry through imitating our literary forebears.
Memory is associated with poetry more than with any other form of writing. Poetry’s repetitions and subtle connections make it easier to memorize. Indeed, one theory goes that poetry originated as mnemonic writing: its patterns helped people to remember lists, stories, religious doctrine and other information. Research has shown that music and memory are associated.
You gain something by learning one poem that you don’t by casting eyes over six poems by the same poet. The memory arts activate something in the mind. Memorization may seem daunting or simply antiquated given modern-day technological advances, but even in small amounts I find it cultivates and strengthens my mental circuitry. Poets.org suggests some starting points if you feel inspired to memorize a poem.
The Radio 4 programme In Our Time discussed the philosophical continental-analytical split this week. The speakers all offered massive disclaimers at the start: there is no clear-cut distinction, continental philosophy isn’t a coherent body of thought, the geographical distinction is nonsense, and no philosopher’s work can be reduced to one of two camps. When I’m asked about different approaches to studying English or what the differences between English departments at Oxford and Cambridge are, my explanation sometimes uses the same terms ‘continental’ and ‘analytical’ in passing, and with similar caution.
I endorse the same disclaimers, since I believe that each person holds an interlocking set of beliefs and unspoken assumptions that can’t be guessed by assigning them to a particular creed or label. I believe that you don’t begin to know people until you’ve encountered them, in person or through what they’ve produced. Still, I find it instructive to think generally about larger differences in approach to difficult questions, so here’s my take on how the continental-analytical divide, as explained through the following garbled summary, could be applied to literary study. In particular, I’ll use it to answer another question I’m sometimes asked: hasn’t most of what’s been written about Shakespeare just been made up by critics?
Breezy summaries first. Analytical philosophy (Bertrand Russell and Ludwig Wittgenstein are among its notable proponents) seeks to establish what it is that we can know through reason and logic. It pursues objective analysis to cut through all that is unproven to locate irrefutable propositions about reality that must be true. It’s unified by a method that emphasizes precision and thoroughness, and has tended to dwell on the philosophy of language: how well does language describe the world? It prevails in academic philosophy today, the programme told us.
Beatrice Han-Pile’s excellent introduction to continental philosophy emphasized that its concerns are existential, arising from humans having a limited understanding of the world and being aware of this. This awareness provokes questions unique to humans, such as ‘how do I lead a good life?’, ‘what is beauty?’ or ‘do I have a soul?’ Answers to these questions cannot be established through reason alone, but require us to draw on subjective experiences and feelings. These questions demand that we confront how we come to an understanding of the world as humans, through perception (phenomenology worries about this) or interpretation (hermeneutics is the theory of interpretation). Continental philosophy seeks to understand how we interact with the world as whole beings, and often speaks readily to political and social concerns.
Now—and here I start to add even more planks to this rickety rope-bridge—English studies had two different origins that could be perceived as corresponding approximately (and anachronistically) to this divide. English studies, as I understand it, began in nineteenth-century London. University College (UCL) initially promoted a more utilitarian approach to English language and literature (think Jeremy Bentham), emphasising composition skills, and the factual and historical study of language. King’s College, meanwhile, was more evangelical and taught sound moral principles based on the classics: it saw religion and education as intertwined. The University of Oxford established English as a professional discipline, nudged, as I’ve been told, by a renewed need to train colonial civil servants (need to check this): it promoted philology (i.e. history of language) and historical studies of literature. In twentieth-century Cambridge a different type of degree course was set up, one more socially aware and which combined ‘life, literature and thought’. In particular, ‘practical criticism’ was a skill of close reading conceived by I. A. Richards to have psychological benefits for students, and is still associated with Cambridge, though most degree courses in Britain teach close reading. Just one more creaky paragraph to come….
As literature departments developed in the UK, then, one side tends to regard English language and literature as a discipline that can inform us about history, about language, and other objective matters. This is the more analytical approach, and is closer to what I’ve encountered during my time at Oxford (disclaimer with alarm-bells etc.). The other side takes literature as a social phenomenon that presents ideal ground from which to consider how humans describe, interpret and communicate about the world around them through language. This approach is more continental, and more attuned to what I saw and worked on in Cambridge. Both sides—and it’s wearying for me to keep on with these withdrawals, but we’re almost there—have great strengths, and I doubt either exists in a pure form: they can combine and correct each other. It happens that the more analytical approach is also more dominant, I would say, in continental European, British and American literature departments right now, and if so this is probably connected, among other things, to increased specialization as we accumulate more knowledge.
So in answer to the question whether critics are just making it up about Shakespeare, there are two answers. One: no, because academic Shakespeare studies establish what we can know about his dramas and poems using primary sources and analysis. Academics provide the context and historical basis to help others read and enjoy his works. Two: no, because from our contemporary perspective we can ask important questions that aren’t less important because Shakespeare wouldn’t have asked them, such as how he represented women or colonies, or what broad views he held about humanity. Academics open up these questions for others to pursue. Regardless of approach, very few critics would dare to explain exactly what Shakespeare was trying to tell us in his plays, which do more than just echo through history.
If we want to run away with this split further, we could start speculating about a basic neurological foundation for all this: the much-skewered left- and right-brain distinction (Jonathan Sacks criticized here for this; Iain McGilchrist here, for example). And just to tip the tension into ridicule, we could imagine a humanistic theory of everything which also includes thought/feeling and science/religion. The sheer adaptability of this same continental-analytical split to virtually every sphere of human endeavour makes me think that it is worthwhile as a tool for thinking (because it keeps on recurring), if not for describing reality.
More importantly, it reminds me once again that no individual mind or artifact can be slotted into one of two categories. In fact, such gross simplification goes against the basic nature of humanistic study as I and others pursue it. We learn this once we start analysing a text in all its historical complexity, and as we start posing theoretical questions. In fact, English studies ought to be superb at such self-scrutiny because language, the medium through which all these ideas are communicated, is its basic subject of study. It ought to be the first to resist the shackles of such dichotomies, even though individuals will naturally incline towards approaches similar to those previously adopted by others.
My first piece of advice for anyone thinking of studying English literature at university is to remember that the literary canon doesn’t begin with Jane Austen (1775-1817). I read some Andrew Marvell (1621-78), some John Donne (1572-1631), William Shakespeare’s The Merchant of Venice (c. 1596) and his King Lear (1605) at A-Level, but medieval literature was still largely unknown to me, aside from a not particularly fruitful read-through of Geoffrey Chaucer’s The Franklin’s Tale (c. 1390) from the local library. I’m still far from knowledgeable about the whole millennium brought under the single term ‘Middle Ages’ (c.400-1500), but the formative event that kindled my interest was the outstanding exhibition at the Fitzwilliam Museum held during my first term at Cambridge, The Cambridge Illuminations (virtual exhibition still available online).
Another major exhibition of illuminated manuscripts (i.e., handwritten books with decorative images) has just opened at the British Library. It’s called Royal Manuscripts: The Genius of Illumination, and is sumptuous. The display provides a vivid, broad introduction to English elite culture between the ninth and sixteenth centuries through some of the most remarkable books that survive. It’s divided into sections that show royal influence impinging on the creation of these manuscripts: this is a period in which books are a valuable commodity, commissioned and owned by the wealthy. The opening exhibits from Edward IV’s royal library, for example, show royal crests scattered around the page-borders, alongside the fruit, animals and fantastic creatures drawn by the illuminator. The exhibition surveys Christian texts, expressions of royal identity (e.g. genealogical trees), books of instruction and reference, and items with strong links to continental culture.
These manuscripts, all at least 500 years old, are remarkably well-preserved, and it’s valuable to appreciate each illumination in context, on the page and within a book that’s been bound and presented for a particular purpose. The lighting is sometimes too dim and the visitor numbers too great to be able to pause and scrutinize each page, but it’s still very possible to absorb the meticulous detail that went into each illumination. There are many highlights (try browsing those on this page, or purchase the app), but for one example, take the image of God the Creator that’s been used to publicize the exhibition: you can see it from a bus passing the museum, or find it in the shop adorning commemorative tote bags, pin badges, fridge magnets and coffee mugs. It’s only when I left nose-breath marks on the exhibition case, however, that I noticed how the image’s details resonated and expanded with the meaning in the text: angels fill the lapis lazuli sky and the vermilion mandorla (almond-shaped panel), and God’s feet don’t quite touch the earth. Theological and artistic precision are combined.
I didn’t need the audio guide to remind me how much more there is to absorb in each of the 150 manuscripts on display than I could take in during one visit. The exhibition shows a modern library fulfilling its duty to educate the general public by bringing its most valuable material out from the store and into a gallery. The manuscripts are a key treasure in our cultural heritage that open up a vista of intellectual endeavour and royal aspiration from the past. It also shows the indissoluble relation between literature, theology, history, scholarly learning and artistic achievement in the period. The exhibition is open until 13 March 2012, and the permanent Treasures exhibition and the small display of Michael Katakis’s photographs are also worth looking out for if you visit.
Gabriella Gruder-Poni writes about her experiences as a PGCE (i.e. trainee) teacher in English in an article called ‘The Reader Gets Angry’. She describes the repeated opposition she encountered from fellow teachers when attempting to teach students about areas of knowledge unknown or without immediate relevance to them. Gabriella becomes isolated as she tries to introduce new material to her students: for example, another teacher criticizes her for mentioning Leonardo da Vinci on a worksheet because a student ‘won’t have heard of the 1500s or of Leonardo’.
Gabriella concludes that her state school suffered from a ‘poisonous combination of classism and anti-intellectualism’. The teachers assumed students wouldn’t or couldn’t grasp anything not already familiar to them. Topicality rules. I’ve heard similar stories from state-school teachers in English and other subjects who speak warmly of teaching, but feel disillusioned at schools as learning environments. I’ve been told several times that students aren’t challenged or appreciated as individuals. Gabriella writes that:
I eventually came to suspect that the real reason for the banishment from the classroom of anything that smacked of culture was the lack of interest not among students but among teachers. For the students, especially the younger ones, regularly showed themselves to be curious about subjects other than gadgets and celebrities, giving the lie to the teachers’ assertions that times past and distant places were ‘inappropriate’ material for lessons.
I’ve written before that I don’t think Shakespeare is for everyone, but I’m still very sympathetic to Gabriella’s experiences. I admire Gabriella’s stand on seeking to develop students’ curiosity for new ideas. I’m struck by how the teachers she encountered took relevance and familiarity to be synonymous: a subject is relevant if the student is already familiar with it. So ‘gadgets and celebrities’ are in, and other cultures, other ways of thinking are out. This seems a logical fallacy. There are always topics that are unfamiliar but have hidden relevance. I believe that it’s fundamentally a good thing to reach out to something new and seek to understand it because the process fosters tolerance, open-mindedness and curiosity. ‘Relevance’ becomes an issue when deciding which new topics to teach: that’s why British students tend (if they learn any language) to learn French or German, not Malay or Sanskrit, though all of these languages would bring pedagogical benefits to all students. One reason that Shakespeare is still read so often and still placed on school curricula is that for centuries people have empathized with the basic sense of humanity that radiates from his works, and still do.
Her frustration with vocabulary teaching is exemplary. Having read her article in another source (a college alumni magazine), I know that the series that enraged Mr F— is called Wordly Wise, which looks like a reliable method for learning new words. The main arguments against learning complicated, polysyllabic vocabulary are that it’s unfamiliar and irrelevant: the students won’t know the words, and they don’t need to know them. You might well ask why students who’ll never use or hear words like ‘spurious’, ‘reiterate’ or ‘apocryphal’ should be made to learn them. These words are just used to sound smart, right? Here’s Gabriella again:
‘They’ll never need those words’, never need words like ‘assail’, ‘assimilate’, ‘mishap’ or ‘ostentatious’. Why not? Didn’t he expect them to read and write? I began to suspect that my students’ woeful ignorance might be a consequence of attitudes like those of Mr. F—.
She’s right that if you don’t expose students to such vocabulary at school, then they’re less likely to encounter texts that use it, and so they remain with basic literacy skills: enough to read, say, a tabloid newspaper but not much else.
There’s also an important argument to make about the larger purpose of language acquisition and usage. We don’t just learn difficult words so that we can write well: it’s usually best, as George Orwell tells us, to use the fewest and simplest words possible. But sometimes the simplest word will be ‘assimilate’ or ‘reiterate’. That’s because we use language to describe the world and make distinctions. ‘Red’, ‘yellow’, ‘green’ and ‘blue’ articulate primary colours that we see, but there’s a whole wide spectrum out there: vermilion, teal, cyan, magenta and many, many others (and even more if we count Dulux neologisms like ‘Indian Ivy’ and ‘Summer Surprise’). There are many other word spectrums. ‘Gesticulate’ has a different shade of meaning to ‘point’, ‘waggle’, ‘gesture’ or ‘wave’. There’s an important moral difference between choosing the words ‘catastrophe’, ‘disaster’, ‘screw up’, ‘accident’, ‘error’, and ‘mishap’. I wouldn’t use the same word to describe losing £5 that I would to describe famine, just as I wouldn’t think that starvation and a lost takeaway meal are misfortunes of a similar scale.
A thesaurus (from a Latin word meaning ‘treasury, store-house’) helps a writer to locate just the right word in a given spectrum. Choosing words requires sensitivity and attention: just the sort of skills you learn by studying English, listening for different connotations and learning advanced vocabulary. Gabriella reports that she was scolded for overemphasizing dictionaries: they should apparently ‘only be used as a last resort’. In fact, I think the impulse to look up any word you’re not sure about is one of the best habits you can learn from studying English. But if you suffer a ‘lack of faith in words’, as Gabriella puts it, then you’re not likely to appreciate the cognitive skills and mental agility to be learnt from studying new words and new literature.