
You could be forgiven for wondering what, precisely, digital literacies are. In the first edition of ‘Digital Literacies’, Dudeney et al. (2013:2) define the term as ‘the individual and social skills needed to effectively interpret, manage, share and create meaning in the growing range of digital communication channels’. This is pretty broad, and would seem to encompass more or less anything that people do with digital technology, including the advanced arts of trolling and scamming. Nine years later, in the new edition of this book (Pegrum et al., 2022:5), the authors modify their definition a little: ‘the individual and social skills needed to effectively manage meaning in an era of digitally networked, often blended, communication’. This is broader still. In the intervening years there has been a massive proliferation of ways of describing specific digital literacies, as well as more frameworks of digital literacies than anyone (bar people writing about the topic) could possibly want. Of course, there is much in common between all these descriptive and taxonomic efforts, but there is also much that differs. What, precisely, ‘digital literacies’ means changes over both time and space. It carries different meanings in Australia, Sweden and Argentina, and, perhaps, it only makes sense to have a local conceptualisation of the term (Pangrazio et al., 2020). By the time you have figured out what these differences are, things will have moved on. Being ‘digitally literate’ is an ongoing task.

What, precisely, ‘digital literacies’ are only really matters when we are told that it is vital to teach them. It’s easy to agree that digital skills are quite handy in this networked world, but, unless we have a very clear idea of what they are, it’s not going to be easy to know which ones to teach or how to teach them. Before we get caught up in the practical pedagogical details, it might be useful to address three big questions:

  • How useful is it to talk about digital literacies?
  • Can digital literacies be taught?
  • Should digital literacies be taught as part of the English language curriculum?

How useful is it to talk about digital literacies?

Let’s take one example of a framework: the Cambridge Life Competencies Framework (CLC). The CLC lists six key competencies (creative thinking, critical thinking, communication, collaboration, learning to learn, and social responsibilities). Underpinning and informing these six competencies are three ‘foundation layers’: ‘emotional development’, ‘discipline knowledge’ and ‘digital literacy’. Digital literacy is broken down as follows:

It’s a curious amalgam of relatively straightforward skills and much more complex combinations of skills with knowledge, attitudes and dispositions. In the former category (see the first box in the chart above), we would find things like the ability to use tags, hashtags, search engines, and filters. In the latter (see the second box in the chart above), we would find things like the ability to recognise fake news or to understand how and why personally targeted advertising is such a money-spinner.

Another example, this one from Pegrum et al. (2018), is more complex and significantly more detailed. On the more technical side, we see references to the ability to navigate within multimodal gaming, VR and AR environments, or the ability to write and modify computer code. And for more complex combinations of skills, knowledge, attitudes and dispositions, we have things like the ability to develop a reputation and exert influence within online networks, or ‘the ability to exert a positive influence (online) by adopting a stance of intellectual humility and ethical pluralism’.

This is all a far remove from only seven years ago when ‘digital literacies’ were described as ‘the practices of reading, writing and communication made possible by digital media’ (Hafner et al., 2015) and the kinds of skills required were almost all closely connected to reading and writing. The digital world has changed, and so has our understanding of what it means to operate effectively within that world. Perhaps it is time, too, to change our terminology: ‘literacies’ is still with us, but it seems almost wilfully misleading. ‘Abilities’ or ‘competencies’ would seem to be much more appropriate terms to refer to what we are discussing in these frameworks, but ‘abilities’ probably isn’t sciency enough, and ‘competencies’ has already been done to death.

The problem with lumping all these things together under a single superordinate is that it seems to imply that there is some sort of equivalence between all the subordinate labels, that there is some categorial similarity. Pegrum et al. (2022) acknowledge that there are differences of complexity between these ‘literacies’ – they use a star system to indicate degree of complexity. But I think that there is no sufficiently strong reason to put some of these things together in the first place. Dudeney et al. (2013: 14) note that some of their literacies are ‘macroliteracies’ – ‘in other words, a literacy which draws on numerous other literacies – and involves linguistic, multimedia, spatial, kinaesthetic and other skills’. Why, then, call them ‘literacies’ at all? The only connection between knowing how to generate a LOLcat post and knowing how to ‘remain non-judgemental towards new perspectives, multiple viewpoints, and shifting contexts’ is that both are related to our use of digital technology. But since there is very little in our lives that is not now related in some way to digital technology, is this good enough reason to bracket these two abilities together?

Pegrum et al. (2022) found that they needed to expand their list of digital literacies in the new edition of their book, and they will no doubt need to do so again nine years from now. But is the fact that something could be included in a taxonomy a good reason for actually including it? ‘Code literacy’, for example, seems rather less urgent now than it did nine years ago. I have never been convinced by gaming literacy or remix literacy. Are these really worth listing alongside the others in the table? Even if they are, nobody (including Pegrum et al.) would disagree that some prioritisation is necessary. However, when we refer to ‘digital literacies’ and how vital it is to teach them, we typically fail to specify which literacies we mean. We risk committing the logical error (the fallacy of division) of assuming that what holds true for a group or category also holds true for all the members of the group or subordinates of the category.

Can digital literacies be taught?

There is clearly no particular problem in teaching and learning some digital literacies, especially the more technical ones. Unfortunately, the more specific and technical we are (e.g. when we mention a particular digital tool), the more likely it is that its shelf-life will be limited. Hardware comes and goes (I haven’t had to format a floppy disc for a while), as do apps and software. To the risk of wasting time teaching a skill that may soon be worthless, we may add the risk of not including literacies that have not yet found their way into the taxonomies. Examples include knowing how to avoid plagiarism detectors (as opposed to avoiding plagiarism) or how to use GPT-3 (and soon GPT-4) text generators. Handy for students.

The choice of digital tools is crucial when one of the key pieces of advice for teaching digital literacy is to integrate the use of digital tools into lessons (e.g. in the Cambridge Life Competencies Digital Literacy booklet). This advice skates over the key questions of which tool, and which literacy is being targeted (and why). Watching TikTok videos, using Mentimeter in class, or having a go with a VR headset may serve various educational purposes, but it would be stretching a point to argue that these activities will do much for anyone’s digital literacy. Encouraging teachers to integrate technology into their lessons (government policy in some countries) makes absolutely no sense unless the desired outcome – digital literacy – is precisely defined in advance. It rarely is. See here for further discussion.

Encouragement to include technology, any old technology, in lessons is almost never grounded in claims that a particular technical skill (e.g. navigating TikTok) has any pressing value. Rather, the justification usually comes from reference to what might be called ‘higher-order’ skills, like critical thinking: what I referred to earlier as curious amalgams of relatively straightforward skills and much more complex combinations of skills with knowledge, attitudes and dispositions.

The problem here is that it remains very uncertain whether things like ethical literacy or critical digital literacy are likely to be learnt through instruction. They can certainly be practised, and Pegrum et al. (2022) have some very nice activities. The aim of these activities is typically described using a vague ‘raise awareness of’ formula, but whether they will lead, for example, to any improved ability ‘to exert a positive influence (online) by adopting a stance of intellectual humility and ethical pluralism’ is debatable. Much as the world might be a better place if classroom activities of this kind did actually work, research evidence is sadly lacking. For a more detailed look at the problems of trying to teach critical digital literacy / media information literacy, see here.

Should digital literacies be part of the English language curriculum?

So, is it ‘crucial for language teaching to […] encompass the digital literacies which are increasingly central to learners’ […] lives’ (Pegrum et al., 2022)? Well, it all depends on which digital literacies we are talking about. It also depends on what kind of learners in what kinds of learning contexts. And it depends on both institutional objectives and the personal objectives of the learners themselves. So, ‘crucial’, no, but we’ll put the choice of adjective down to rhetorical flourish.

Is it true that ‘digital literacies are as important to language learning as […] reading and writing skills […]’ (Pegrum et al., 2022: 1)? Clearly not. Since it’s hard to imagine any kind of digital literacy without some reading skills preceding it, the claim that they are comparable in importance is also best understood as rhetorical flourish.

A modicum of critical (digital) literacy is helpful when it comes to reading literature on digital literacies.

References

Dudeney, G., Hockly, N. & Pegrum, M. (2013) Digital Literacies. Harlow: Pearson Education

Hafner, C. A., Chik, A. & Jones, R. H. (2015) Digital Literacies and Language Learning. Language Learning & Technology, 19 (3): 1-7

Pangrazio, L., Godhe, A.-L., & Ledesma, A. G. L. (2020) What is digital literacy? A comparative review of publications across three language contexts. E-Learning and Digital Media, 17(6), 442–459. https://doi.org/10.1177/204275302094629

Pegrum, M., Hockly, N. & Dudeney, G. (2022) Digital Literacies (2nd edition). New York: Routledge

Pegrum, M., Dudeney, G. & Hockly, N. (2018) Digital Literacies Revisited. The European Journal of Applied Linguistics and TEFL, 7 (2): 3-24

When I last blogged about teacher wellbeing in August 2020, we were in the early throes of COVID, and Sarah Mercer and Tammy Gregersen had recently published their timely book about wellbeing (Mercer & Gregersen, 2020). Now, over two years later, it seems appropriate to take another look at the topic, to evaluate the status of the concept of ‘wellbeing’ in ELT.

Wellbeing as an object of study

The first thing to be said is that wellbeing is doing just fine. Since 1995, the frequency of use of ‘subjective well-being’ in books has increased by a factor of eight, and, across multiple languages, academic attention to wellbeing and related concepts like ‘happiness’ is growing (Barrington-Leigh, 2022). Interest in teacher wellbeing is no exception to this trend. There are, however, a few problems, according to a recent systematic review of the research literature (Hascher & Waber, 2021). There is, apparently, little consensus on how the term should be defined. There is little in the way of strong evidence that wellbeing correlates with good teaching, and, to my surprise, there is a lack of studies pointing to actual shortfalls in teacher wellbeing. Empirical evidence regarding the effectiveness of programmes aiming to foster teacher wellbeing is, less surprisingly, scarce.

Researchers in English language teacher wellbeing are well aware of all this and are doing their best to fill in the gaps. A ‘research group for wellbeing in language education’ has recently been formed at the University of Graz in Austria, where Sarah Mercer works. This is part of a push to promote positive psychology in language teaching publications, and the output of Sarah Mercer, Tammy Gregersen and their associates has been prodigious.

Next year will see the publication of a book-length treatment of the topic, ‘Teacher Well-Being in English Language Teaching: An Ecological Approach’ (Pentón Herrera et al., 2023). It will be interesting to see to what extent teacher wellbeing is dealt with as a social or political issue, as opposed to something amenable to the interventions of positive psychology.

In the wider world of education, wellbeing is not as frequently seen through the lens of positive psychology as it is in ELT circles. Other perspectives exist: a focus on working conditions or a focus on mental health, for example (Hascher & Waber, 2021). And then there is neuroscience and wellbeing, which I am eagerly awaiting an ELT perspective on. I have learnt that certain brain patterns are related to lower well-being (in the medial prefrontal cortex, posterior cingulate cortex/ praecuneus, and angular gyrus areas, to be gratuitously specific). Lower wellbeing correlates with patterns that are found when the brain is at wakeful rest, such as during daydreaming and mind-wandering (Bartels et al. 2022). All of which sounds, to me, like a strong argument for mindfulness practices. Keep your eye out for ELT publishers’ webinars (see below) and you’ll no doubt hear someone taking this line, along with some nice fMRI images.

Wellbeing and self-help

Academic study of wellbeing proceeds apace, but the ultimate justification for this research can only be found in its ability to help generate solutions to a real-world problem. In this sense, it is no different from the field of applied linguistics in general (from where most of the ELT wellbeing researchers come): it is its ability to solve problems which ‘alone justifies its existence in the first place’ (Widdowson, 2018: 142).

But here we run into something of a brick wall. Whilst it is generally acknowledged that improvements to teacher wellbeing require ‘structural and systemic levels of change’ and that ‘teachers should not have to compensate for fundamental flaws in the system as a whole’ (Mercer & Gregersen, 2020: 9), the ‘solutions’ that are proposed are never primarily about systems, but always about ‘me’. Take a look at any blog post on teacher wellbeing in ELT and you will see what could be called the psychologizing of the political. This process is at the heart of the positive psychology movement which so dominates the current world of wellbeing in ELT.

A look at the Teacher Wellbeing SIG of BRAZ-TESOL (on Facebook or Instagram) gives a good sample of the kind of advice that is on offer: write out a self-appreciation list, respect others, remember you are unique, be grateful, smile, develop emotional intelligence and a growth mindset, start with yourself, take care of yourself, look after your ‘authentic self’, set goals, believe that nothing is impossible, take small steps, pause and breathe, spend time with positive people, learn to say no, and so on. This advice is offered in all seriousness, but is not so very different from the kind of advice offered by @lifeadvicebot on Twitter (‘Are you struggling with the impact of sexism? Consider cultivating a sense of gratitude’ or ‘Worried about racism? Why not try stretching your back and shoulders?’).

I don’t mean to suggest that mindfulness and the other nostrums on offer will be of no benefit to anybody at all, but, however well-intentioned such advice may be, it may be ‘rather better for its promoters than for its putative beneficiaries’ (Widdowson, 2021: 47). The advice is never new or original. It is rutted with the ‘grooves of borrowed thought’, lifting directly from the long tradition of self-help literature, of which it is yet another exemplar. Like all self-improvement literature, it demands no deep commitment from the reader: it is written in an accessible style (in the case of the BRAZ-TESOL SIG, in the form of illustrated inspirational quotes), and if you do decide to dive into it repeatedly, you will quickly discover ‘that it is not such a long way from surface to bottom’ (Lichterman, 1992: 427). Like all self-help literature, it will probably have no effect whatsoever, as Csikszentmihalyi (1990) observed of the genre. Whether you agree with him or not, there is a delicious irony in the fact that his comment appeared on the back cover of his own best-selling self-help book. Like all positive psychologists, he thought he had something new and scientifically grounded to say.

There are also increasing numbers of wellbeing coaches – a thoroughly unsurprising development. Many of them are positive psychology adepts, some describe themselves as neuroscience-based, and some have a background in Neuro-Linguistic Programming. In the context of education, expect the phrase ‘life skills’ to be thrown in from time to time. See this article from Humanising Language Teaching as an example.

But self-help literature treads familiar ground. Work on the self may seem like ‘an antidote to the anxiety-provoking uncertainties of [our] economic and social order’ (McGee, 2005: 43), but it has nowhere to go and is doomed to follow its Sisyphean path. If research into teacher wellbeing in ELT cannot shake off its association with positive psychology and self-help, its justification (and interest in it) will soon slip away.

Wellbeing as a marketing tool

Wellbeing is ideally positioned as a marketing trope … as long as the connections between low wellbeing and pay / working conditions are not dwelled on. It’s a ‘new’ and ‘virtuous’ topic that sits comfortably beside inclusivity, sustainability and environmental awareness. Teaching is a caring profession: a marketing focus on wellbeing is intended to be taken as a sign that the marketers care too. They have your best interests at heart. And when the marketing comes in the form of wellbeing tips, the marketers are offering for free something which is known to be appreciated by many teachers. Some teacher wellbeing books, like the self-published ‘The Teacher’s Guide to Self-Care: Build Resilience, Avoid Burnout, and Bring a Happier and Healthier You to the Classroom’ (Forst, 2020), have sold in considerable quantities.

BETT, which organises a global series of education shows whose purpose is to market information technology in education, is a fascinating example of wellbeing marketing. The BETT shows and the website are packed with references to wellbeing, combining the use of wellbeing to market products unrelated to wellbeing, at the same time as marketing wellbeing products. Neat, eh? Most of these uses of ‘wellbeing’ are from the last couple of years. The website has a wellbeing ‘hub’. Click on an article entitled ‘Student Wellbeing Resources’ and you’ll be taken to a list of products you can buy. Other articles, like ‘Fostering well-being and engagement with Microsoft education solutions’, are clearer from the get-go.

All the major ELT publishers have jumped on the bandwagon. Some examples … Macmillan has a ‘wellness space’ (‘a curated playlist of on-demand webinars and practical resources to specifically support your well-being – and for you to return to as often as you like’). They were also ‘delighted to have championed mindfulness at the IATEFL conference this year!’ Pearson has a ‘wellbeing zone’ – ‘packed with free resources to support teachers, parents and young people with mental health and wellbeing – from advice on coping with anxiety and exam stress, to fun activities and mindfulness’. Last year, Express Publishing chose to market one of its readers with the following introductory line: ‘#Reading for pleasure improves #empathy, #socialrelationships and our general #wellbeing’. And on it goes.

Without going as far as to say that these are practices of ‘wellbeing washing’, it is only realistic, not cynical, to wonder just how seriously these organisations take questions of teacher wellbeing. There are certainly few ELT writers who feel that their publishers have the slightest concern about their wellbeing. Similarly, we might consider the British Council, which is ‘committed to supporting policymakers, school leaders and teachers in improving mental wellbeing in schools’. But less committed, it would seem, to their own teachers in Kabul or to their staff who went on strike earlier this year in protest at forced redundancies and outsourcing of jobs.

How long ‘wellbeing’ will continue to be seen as a useful marketing trope in ELT remains to be seen. It will be hard to sustain for very long, since there is so little to say about it without repetition, and since everyone is in on the game. My guess is that ‘wellbeing’ will soon be superseded by ‘sustainability’. ‘Sustainability’ is a better hooray word than ‘wellbeing’, because it combines environmental quality and wellbeing, throwing in ‘lifelong learning’ and ‘social justice’ for good measure (Kapranov, 2022). The wellbeing zones and hubs won’t need to be dismantled just yet, but there may well be a shift towards more sustainable self-care. Here are some top tips taken from How To Self-Care The Sustainable Way on the Wearth website: snooze your way to wellbeing, indulge and preen your body, grab a cuppa, slip into a warming bath, mindfully take care of your mind, retail therapy the wholesome way. All carbon-neutral, vegan and cruelty-free.

References

Barrington-Leigh, C. P. (2022) Trends in Conceptions of Progress and Well-being. In Helliwell, J. F., Layard, R., Sachs, J. D., De Neve, J.-E., Aknin, L. B. & Wang, S. (Eds.) World Happiness Report 2022. New York: Sustainable Development Solutions Network. https://happiness-report.s3.amazonaws.com/2022/WHR+22.pdf

Bartels, M., Nes, R. B., Armitage, J. M., van de Weijer, M. P., de Vries, L. P. & Haworth, C. M. A. (2022) Exploring the Biological Basis for Happiness. In Helliwell, J. F., Layard, R., Sachs, J. D., De Neve, J.-E., Aknin, L. B. & Wang, S. (Eds.) World Happiness Report 2022. New York: Sustainable Development Solutions Network. https://happiness-report.s3.amazonaws.com/2022/WHR+22.pdf

Csikszentmihalyi, M. (1990) Flow: The Psychology of Optimal Experience. New York: Harper & Row

Forst, S. (2020) The Teacher’s Guide to Self-Care: Build Resilience, Avoid Burnout, and Bring a Happier and Healthier You to the Classroom. The Designer Teacher, LLC

Hascher, T. & Waber, J. (2021) Teacher well-being: A systematic review of the research literature from the year 2000–2019. Educational Research Review, 34 https://www.sciencedirect.com/science/article/pii/S1747938X21000348

Kapranov, O. (2022) The Discourse of Sustainability in English Language Teaching (ELT) at the University of Oxford: Analyzing Discursive Representations. Journal of Teacher Education for Sustainability, 24 (1): 35-48 https://sciendo.com/article/10.2478/jtes-2022-0004

Pentón Herrera, L. J., Martínez-Alba, G. & Trinh, E. (Eds.) (2023) Teacher Well-Being in English Language Teaching: An Ecological Approach. Abingdon: Routledge

Lichterman, P. (1992) Self-help reading as a thin culture. Media, Culture and Society, 14: 421-447

McGee, M. (2005) Self-Help, Inc. Oxford: OUP

Mercer, S. & Gregersen, T. (2020) Teacher Wellbeing. Oxford: OUP

Widdowson, H. G. (2018) Applied linguistics as a transdisciplinary practice: What’s in a prefix? AILA Review, 31 (1): 135-142

Widdowson, H. G. (2021) On the Subject of English. Berlin: De Gruyter

You have probably heard of the marshmallow experiment, one of the most famous and widely cited studies in social psychology. In the experiments, led by Walter Mischel at Stanford University in 1972, pre-school children were offered a choice between an immediate small reward (such as a marshmallow) or a significantly larger reward if they could wait long enough (a few minutes) to receive it. A series of follow-up studies, beginning in 1988, found that those children who had been able to delay gratification in the original experiments had better educational achievements at school and in college than those who had less self-control.

The idea that character traits like self-control could have an important impact on educational outcomes clearly resonated with many people at the time. The studies inspired further research into what is now called socio-emotional learning, and helped to popularise many educational interventions across the world that sought to teach ‘character and resilience’ in schools. In Britain alone, £5 million was pledged for a programme in 2015 to promote what the government called ‘character work’, an initiative that saw rugby coaches being used to instil the values of respect, teamwork, enjoyment, and discipline in school children.

One person who was massively influenced by the marshmallow experiment (and who, in turn, massively influenced the character-building interventions in schools) was Angela Duckworth (Duckworth et al., 2013), who worked at Stanford between 2014 and 2015. Shortly after her studies into delay of gratification, Duckworth gave a TED talk called ‘Grit: the power of passion and perseverance’, which has now had almost 10 million views. A few years later, her book of the same title (Duckworth, 2016) was published. An instant best-seller, it made ‘grit’ a ‘hot topic’ in education, and, according to the editors of a special issue of the Journal for the Psychology of Language Learning (MacIntyre & Khajavy, 2021), ‘interest appears to be rapidly expanding’. Duckworth has argued that self-control and grit are different and unrelated, but a number of studies have contradicted this view (Oxford & Khajavy, 2021), and the relationship between the two is clear in Duckworth’s intellectual and publishing trajectory.

This continued and expanding interest in grit is a little surprising. In a previous (June 2020) blog post, I looked at the problems with the concept of ‘grit’, drawing on the work of Marcus Credé (Credé et al., 2017; Credé, 2018), who questioned whether it made sense to talk about ‘grit’ as a unitary construct, noted the difficulty of measuring ‘grit’, and pointed to the lack of evidence in support of educational interventions to promote ‘grit’ (despite the millions and millions that have been spent). In a more recent article, Credé and his collaborator, Michael Tynan (Credé & Tynan, 2021), double down on their criticisms, observing that ‘meta-analytic syntheses of the grit literature have shown that grit is a poor predictor of performance and success in its own right, and that it predicts success in academic and work settings far more poorly than other well-known predictors’. One of these other well-known predictors is the socio-economic status of students’ families. Credé and Tynan remain ‘deeply skeptical of the claim that grit, as a unitary construct formed by combining scores on perseverance and passion, holds much value for researchers focused on SLA—or any other domain’.

In the same journal issue as the Credé and Tynan article, Rebecca Oxford and Gholam Khajavy (2021) sound further notes of caution about work on ‘grit’. They suggest that researchers need to avoid confusing grit with other constructs like self-control – a suggestion that may be hard or impossible to follow if, in fact, these constructs are not clearly separable (as Oxford and Khajavy note). They argue, too, that much more attention needs to be paid to socio-economic contexts, and that structural barriers to achievement must be given fuller consideration if ‘grit’ is to contribute anything positive to social justice. Whether the other papers in this special issue of the Journal for the Psychology of Language Learning devoted to ‘grit’ heed the cautionary advice of Credé and Tynan, and of Oxford and Khajavy, is, I think, open to debate. Perhaps the idea of a whole issue of a journal devoted to ‘grit’ is a problematic starting point. Since there is no shortage of reasons to believe that ‘grit’ isn’t actually a ‘thing’, why take ‘grit’ as a starting point for scholarly enquiry?

It might be instructive to go back to how ‘grit’ became a ‘thing’ in the first place. It’s an approach that the contributors to the special issue of the Journal for the Psychology of Language Learning have not adopted. This brings me back to the marshmallow test. At the time that ‘grit’ was getting going, Alfie Kohn brought out a book called ‘The Myth of the Spoiled Child’ (Kohn, 2014) that included a chapter ‘Why Self-Discipline Is Overrated: A Closer Look at Grit, Marshmallows, and Control from Within’. Kohn argued that educational ideas about ‘grit’ had misrepresented the findings of the marshmallow test and its follow-up studies. He argued that setting was more important than individual self-control, and that deferral of gratification was likely an effect, not a cause of anything. His ideas were supported by some of the original researchers, including Mischel himself. Another, Yuichi Shoda, a co-author of a key paper that linked delay of gratification to SAT scores, has observed that ‘Our paper does not mention anything about interventions or policies’ – many other factors would need to be controlled to validate the causal relationship between self-control and academic achievement (Resnick, 2018).

Interest in recent years in replicating experiments in social psychology has led to confirmation that something was seriously wrong with the follow-ups to the marshmallow experiment. Studies (e.g. Watts et al., 2018) with more representative and larger groups of children have found that correlations between academic achievement and self-control almost vanished when controlled for factors like family background and intelligence. Even if you can teach a child to delay gratification, it won’t necessarily lead to any benefits later on.

Self-control and ‘grit’ may or may not be different things, but one thing they clearly have in common is their correlation with socio-economic differences. It is distinctly possible that attention to ‘grit’, in language learning and in other fields, is a distraction from more pressing concerns. Pity the poor researchers who have hitched themselves to the ‘grit’ bandwagon … As Angela Duckworth has said, research into grit is itself ‘a delay of gratification test’ (Duckworth, 2013). You have to be really passionate about grit and show sustained persistence if you want to keep on publishing on the subject, despite all that we now know. She hopes ‘that as a field we follow through on our intentions to forgo more immediately rewarding temptations to instead do what is best for science in the long-run’. How about forgoing the immediately rewarding temptation of publishing yet more stuff on this topic?

References

Credé, M. (2018) What shall we do about grit? A critical review of what we know and what we don’t know. Educational Researcher, 47 (9), 606-611.

Credé, M. & Tynan, M. C. (2021) Should Language Acquisition Researchers Study “Grit”? A Cautionary Note and Some Suggestions. Journal for the Psychology of Language Learning, 3 (2), 37-44

Credé, M., Tynan, M. C. & Harms, P. D. (2017) Much ado about grit: A meta-analytic synthesis of the grit literature. Journal of Personality and Social Psychology, 113 (3)

Duckworth, A. L. (2013) Is It Really Self-control? A Critical Analysis of the “Marshmallow Test”. Society of Personality and Social Psychology Connections, November 10, 2013. https://spsptalks.wordpress.com/2013/11/10/is-it-really-self-control-a-critical-analysis-of-the-marshmallow-test/

Duckworth, A. L., Tsukayama, E. & Kirby, T. A. (2013) Is it really self-control? Examining the predictive power of the delay of gratification response. Personality and Social Psychology Bulletin, 39, 843-855.

Duckworth, A. (2016) Grit: the power of passion and perseverance. New York: Scribner

Kohn, A. (2014) The Myth of the Spoiled Child. Boston: Da Capo Press

MacIntyre, P. & Khajavy, G. H. (2021) Grit in Second Language Learning and Teaching: Introduction to the Special Issue. Journal for the Psychology of Language Learning, 3 (2), 1-6. http://www.jpll.org/index.php/journal/article/view/86

Oxford, R. & Khajavy, G. H. (2021) Exploring Grit: “Grit Linguistics” and Research on Domain-General Grit and L2 Grit. Journal for the Psychology of Language Learning, 3 (2), 7-35

Resnick, B. (2018) The “marshmallow test” said patience was a key to success. A new replication tells us s’more. Vox, June 6, 2018. https://www.vox.com/science-and-health/2018/6/6/17413000/marshmallow-test-replication-mischel-psychology

Watts, T.W., Duncan, G.J. & Quan, H. (2018) Revisiting the Marshmallow Test: A Conceptual Replication Investigating Links Between Early Delay of Gratification and Later Outcomes. Psychological Science 29 (7): 1159-1177.

There’s a video on YouTube from Oxford University Press in which the presenter, the author of a coursebook for primary English language learners (‘Oxford Discover’), describes an activity where students have a short time to write some sentences about a picture they have been shown. Then, working in pairs, they read aloud their partner’s sentences and award themselves points, with more points being given for sentences that others have not come up with. For lower-level young learners, it’s not a bad activity. It provides opportunities for varied skills practice of a limited kind and, if it works, may be quite fun and motivating. However, what I found interesting about the video is that it is entitled ‘How to teach critical thinking skills: speaking’ and the book that is being promoted claims to develop ‘21st Century Skills in critical thinking, communication, collaboration and creativity’. The presenter says that the activity achieves its critical thinking goals by promoting ‘both noticing and giving opinions, […] two very important critical thinking skills.’

Noticing (or observation) and giving opinions are often included in lists of critical thinking skills, but, for this to be the case, they must presumably be exercised in a critical way – some sort of reasoning must be involved. This is not the case here, so only the most uncritical understanding of critical thinking could consider this activity to have any connection to critical thinking. Whatever other benefits might accrue from it, it seems highly unlikely that the students’ ability to notice or express opinions will be developed.

My scepticism is not shared by many users of the book. Oxford University Press carried out a scientific-sounding ‘impact study’: this consisted of a questionnaire (n = 198) in which ‘97% of teachers reported that using Oxford Discover helps their students to improve in the full range of 21st century skills, with critical thinking and communication scoring the highest’.

Enthusiasm for critical thinking activities is extremely widespread. In 2018, TALIS, the OECD Teaching and Learning International Survey (with more than 4000 respondents), found that ‘over 80% of teachers feel confident in their ability to vary instructional strategies in their classroom and help students think critically’ and almost 60% ‘frequently or always’ ‘give students tasks that require students to think critically.’ As with the Oxford ‘impact study’, it’s worth remembering that these are self-reported figures.

This enthusiasm is shared in the world of English language teaching, reflected in at least 17 presentations at the 2021 IATEFL conference that discussed practical ideas for promoting critical thinking. These ranged from the more familiar (e.g. textual analysis in EAP) to the more original – developing critical thinking through the use of reading reaction journals, multicultural literature, fables, creative arts performances, self-expression, escape rooms, and dice games.

In most cases, it would appear that the precise nature of the critical thinking that was ostensibly being developed was left fairly vague. This vagueness is not surprising. Practically the only thing that writers about critical thinking in education can agree on is that there is no general agreement about what, precisely, critical thinking is. Lai (2011) offers an accessible summary of a range of possible meanings, but points out that, in educational contexts, its meaning is often rather vague and encompasses other concepts (such as higher order thinking skills) which also lack clarity. Paul Dummett and John Hughes (2019: 4) plump for ‘a mindset that involves thinking reflectively, rationally and reasonably’ – a vague definition which leaves unanswered two key questions: to what extent is it a skill set or a disposition? Are these skills generic or domain specific?

When ‘critical thinking’ is left undefined, it is impossible to evaluate the claims that a particular classroom activity will contribute to the development of critical thinking. However, irrespective of the definition, there are good reasons to be sceptical about the ability of educational activities to have a positive impact on the generic critical thinking skills of learners in English language classes. There can only be critical-thinking value in the activity described at the beginning of this post if learners somehow transfer the skills they practise in the activity to other domains of their lives. This is, of course, possible, but, if we approach the question with a critical disposition, we have to conclude that it is unlikely. We may continue to believe the opposite, but this would be an uncritical act of faith.

The research evidence on the efficacy of teaching generic critical thinking is not terribly encouraging (Tricot & Sweller, 2014). There’s no shortage of anecdotal support for classroom critical thinking, but ‘education researchers have spent over a century searching for, and failing to find evidence of, transfer to unrelated domains by the use of generic-cognitive skills’ (Sweller, 2022). One recent meta-analysis (Huber & Kuncel, 2016) found insufficient evidence to justify the explicit teaching of generic critical thinking skills at college level. In an earlier blog post (https://adaptivelearninginelt.wordpress.com/2020/10/16/fake-news-and-critical-thinking-in-elt/) looking at the impact of critical thinking activities on our susceptibility to fake news, I noted that research was unable to find much evidence of the value of media literacy training. When considerable time is devoted to generic critical thinking training and little or no impact is found, how likely is it that the kind of occasional, brief one-off activity in the ELT classroom will have the desired impact? Without going as far as to say that critical thinking activities in the ELT classroom have no critical-thinking value, it is uncontentious to say that we still do not know how to define critical thinking, how to assess evidence of it, or how to effectively practise and execute it (Gay & Clark, 2021).

It is ironic that there is so little critical thinking about critical thinking in the world of English language teaching, but it should not be particularly surprising. Teachers are no more immune to fads than anyone else (Fuertes-Prieto et al., 2020). Despite a complete lack of robust evidence to support them, learning styles and multiple intelligences influenced language teaching for many years. Mindfulness, growth mindsets and grit are more contemporary influences and, like critical thinking, will go the way of learning styles when the commercial and institutional forces that currently promote them find the lack of empirical supporting evidence problematic.

Critical thinking is an educational aim shared by educational authorities around the world, promoted by intergovernmental bodies like the OECD, the World Bank, the EU, and the United Nations. In Japan, for example, the ‘Ministry of Education (MEXT) puts critical thinking (CT) at the forefront of its “global jinzai” (human capital for a global society) directive’ (Gay & Clark, 2021). It is taught as an academic discipline in some universities in Russia (Ivlev et al., 2021) and plans are underway to introduce it into schools in Saudi Arabia (https://www.arabnews.com/node/1764601/saudi-arabia). I suspect that it doesn’t mean quite the same thing in all these places.

Critical thinking is also an educational aim that most teachers can share. Few like to think of themselves as Gradgrinds, bashing facts into their pupils’ heads: turning children into critical thinkers is what education is supposed to be all about. It holds an intuitive appeal, and even if we (20% of teachers in the TALIS survey) lack confidence in our ability to promote critical thinking in the classroom, few of us doubt the importance of trying to do so. Like learning styles, multiple intelligences and growth mindsets, it seems possible that, with critical thinking, we are pushing the wrong thing, but for the right reasons. But just how much evidence, or lack of evidence, do we need before we start getting critical about critical thinking?

References

Dummett, P. & Hughes, J. (2019) Critical Thinking in ELT. Boston: National Geographic Learning

Fuertes-Prieto, M.Á., Andrés-Sánchez, S., Corrochano-Fernández, D. et al. (2020) Pre-service Teachers’ False Beliefs in Superstitions and Pseudosciences in Relation to Science and Technology. Science & Education 29, 1235–1254 (2020). https://doi.org/10.1007/s11191-020-00140-8

Gay, S. & Clark, G. (2021) Revisiting Critical Thinking Constructs and What This Means for ELT. Critical Thinking and Language Learning, 8 (1): 110-147

Huber, C. R. & Kuncel, N. R. (2016) Does College Teach Critical Thinking? A Meta-Analysis. Review of Educational Research, 86 (2): 431-468. doi:10.3102/0034654315605917

Ivlev, V. Y., Pozdnyakov, M. V., Inozemtsev, V. A. & Chernyak, A. Z. (2021) Critical Thinking in the Structure of Educational Programs in Russian Universities. Advances in Social Science, Education and Humanities Research, volume 555: 121-128

Lai, E. R. (2011) Critical Thinking: A Literature Review. Pearson. http://images.pearsonassessments.com/images/tmrs/CriticalThinkingReviewFINAL.pdf

Sweller, J. (2022) Some Critical Thoughts about Critical and Creative Thinking. Sydney: The Centre for Independent Studies Analysis Paper 32

Tricot, A., & Sweller, J. (2014) Domain-specific knowledge and why teaching generic skills does not work. Educational Psychology Review, 26, 265- 283.

In May of last year, EL Gazette had a story entitled ‘Your new English language teacher is a robot’ that was accompanied by a stock photo of a humanoid robot, Pepper (built by SoftBank Robotics). The story was pure clickbait and the picture had nothing to do with it. The article actually concerned a chatbot (EAP Talk) for practising EAP, currently under development at a Chinese university. There’s nothing especially new about chatbots: I last blogged about them in 2016 and interest in them, both research and practical, dates back to the 1970s (Lee et al., 2020). There’s nothing, as far as I can see, especially new about the Chinese EAP chatbot project either. The article concludes by saying that the academic behind the project ‘does not believe that AI can ever replace a human teacher’, but that chatbots might offer some useful benefits.

The benefits are, however, limited – a point that is acknowledged even by chatbot enthusiasts like Lee et al (2020). We are some way from having chatbots that we can actually have meaningful conversations with, but they do appear to have some potential as ‘intelligent tutoring systems’ to provide practice of and feedback on pre-designated bits of language (especially vocabulary and phrases). The main benefit that is usually given, as in the EL Gazette article, is that they are non-judgemental and may, therefore, be appropriate for shy or insecure learners.

Social robots, of the kind used in the illustration for the EL Gazette story, are, of course, not the same as chatbots. Chatbots, like EAP Talk, can be incorporated into all sorts of devices (notably phones, tablets and laptops) and all sorts of applications. If social robots are to be used for language learning, they will clearly need to incorporate chatbots, but in what ways could the other features of robots facilitate language acquisition? Pepper (the robot in the picture) has ‘touch sensors, LEDs and microphones for multimodal interactions’, along with ‘infrared sensors, bumpers, an inertial unit, 2D and 3D cameras, and sonars for omnidirectional and autonomous navigation’. How could these features help language acquisition?

Lee and Lee (2022) attempt to provide an answer to this question. Here’s what they have come up with:

By virtue of their physical embodiment, social robots have been suggested to provide language learners with direct and physical interactions, which is considered one of the basic ingredients for language learning. In addition, as social robots are generally humanoids or anthropomorphized animal shapes, they have been valued for their ability to serve as familiar conversational partners, having potential to lower the affective filter of language learners.

Is there any research evidence to back up these claims? The short answer is no. Motivation and engagement may sometimes be positively impacted, but we can’t say any more than that. As far as learning is concerned, Lee and Lee (2022: 121) write: ‘involving social robots led to statistically similar or even higher [English language learning] outcomes compared with traditional ELT contexts (i.e. no social robot)’. In other words, social robots did not, on the whole, have a negative impact on learning outcomes. Hardly grounds for wild enthusiasm … Still, Lee and Lee, in the next line, refer to the ‘positive effectiveness of social robots in English teaching’ before proceeding to enumerate the ways in which these robots could be used in English language learning. Doesn’t ELT Journal have editors to pick up on this kind of thing?

So, how could these robots be used? Lee and Lee suggest (for younger learners) one-on-one vocabulary tutoring, dialogue practice, more vocabulary teaching, and personalized feedback. That’s it. It’s worth noting that all of these functions could equally well be carried out by chatbots as by social robots.

Lee and Lee discuss and describe the social robot, NAO6, also built by SoftBank Robotics. It’s a smaller and cheaper cousin of the Pepper robot that illustrates the EL Gazette article. Among Lee and Lee’s reasons for using social robots is that they ‘have become more accessible due to ever-lower costs’: NAO6 costs around £350 a month to rent. Buying it outright is also an option. Eduporium (‘Empowering the future with technology’) has one on offer for $12,990.00. According to the blurb, it helps ‘teach coding, brings literature to life, enhances special education, and allows for training simulations. Plus, its educational solutions include an intuitive interface, remote learning, and various applications for accessibility!’

It’s easy enough to understand why EL Gazette uses clickbait from time to time. I’m less clear about why ELT Journal would print this kind of nonsense. According to Lee and Lee, further research into social robots ‘would initiate a new era of language learning’ in which the robots will become ‘an important addition to the ELT arsenal’. Yeah, right …

References

Lee, H. & Lee, J. H. (2022) Social robots for English language teaching. ELT Journal 76 (1): 119 – 124

Lee, J. H., Yang, H., Shin, D. & Kim, H. (2020) Chatbots. ELT Journal 74 (3): 338-344

We need to talk

Posted: December 13, 2021 in Discourse, research

In 1994, in a well-known TESOL Quarterly article entitled ‘The dysfunctions of theory/practice discourse’, Mark A. Clarke explored the imbalance in the relationship between TESOL researchers and English language teachers, and the way in which the former typically frame the latter as being less expert than themselves. In the last 25 years, the topic has regularly resurfaced, most recently with the latest issue of the Modern Language Journal, a special issue devoted entirely to ‘the Research-Practice Dialogue in Second Language Learning and Teaching’ (Sato & Loewen, 2022). At the heart of the matter is the fact that most teachers are just not terribly interested in research and rarely, if ever, read it (Borg, 2009). Much has been written on whether or not this matters, but that is not my concern here.

Sato and Loewen’s introductory article reviews the reasons behind the lack of dialogue between researchers and teachers, and, in an unintentionally comic meta move, argues that more research is needed into teachers’ lack of interest in research. This is funny because one of the reasons for a lack of dialogue between researchers and teachers is that ‘teachers have been researched too much ON and not enough WITH’ (Farrell, 2016): most research has not been carried out ‘for the teacher’s benefit and needs’, with the consequence being that ‘the result is purely academic’. Sato and Loewen’s primary focus in the article is on ‘classroom teachers’, with whom they would like to see more ‘dialogue’, but, as they acknowledge, they publish in a research journal whose ‘most likely readers are researchers’. They do not appear to have read Alan Maley’s ‘“More Research is Needed” – A Mantra Too Far?’ (Maley, 2016). Perhaps the article (and the Humanising Language Teaching magazine it is from) passed under their radar because it’s written for teachers (not researchers), it’s free and does not have an impact factor?

I wasn’t entirely convinced by the argument that more research about research is needed, not least because Sato and Loewen provide a fairly detailed analysis of the obstacles that exist to dialogue between researchers and teachers. They divide these into two categories:

Epistemological obstacles: the framing of researchers as generators of knowledge and teachers as consumers of knowledge; teachers’ scepticism about the relevance of some research findings to real-world teaching situations; the different discourse communities inhabited by researchers and teachers, as evidenced by the academic language choices of the former.

Practical obstacles: institutional expectations for researchers to publish in academic journals and a lack of time for researchers to engage in dialogue with teachers; teachers’ lack of time and lack of access to research.

Nothing new here, nothing contentious, either. Nothing new, either, in their argument that more dialogue between researchers and teachers would be of mutual benefit. They acknowledge that ‘In the current status of the research-practice relationship, it is researchers who are concerned about transferability of their findings to classrooms. Practitioners may not have burning motivation or urgent needs to reach out to researchers’. Consequently, it is researchers who should ‘shoulder the lion’s share of responsibility in this initiative’. This implies that, while the benefit could be mutual, it is not mutually proportionate, since researchers have both more to lose and more to gain.

They argue that it would be helpful to scrutinize closely the relationship between researchers and teachers (they prefer to use the word ‘practitioners’) and that researchers need to reflect on their own beliefs and practices, in particular the way that researchers are stakeholders in the research-practice relationship. I was disappointed that they didn’t go into more detail here and would like to suggest one angle of the ‘cui bono’ question worth exploring. The work of TESOL researchers is mostly funded by TESOL teaching. It is funded, in other words, by selling a commodity – TESOL – to a group of consumers … who are teachers. If we frame researchers as vendors and teachers as (potential) clients, [1] a rather different light is shone on pleas for more dialogue.

The first step, Sato and Loewen claim, towards achieving such a dialogue would be ‘nurturing a collaborative mindset in both researchers and teachers’. And the last of four steps to removing the obstacles to dialogue would be ‘institutional support’ for both teachers and researchers. But without institutional support, mindsets are unlikely to become more collaborative, and the suggestions for institutional support (e.g. time release and financial support for teachers) are just pie in the sky. Perhaps sensing this, Sato and Loewen conclude the article by asking whether their desire to see a more collaborative mindset (and, therefore, more dialogue) is just a dream. Back in 1994, Mark Clarke had this to say:

The only real solution to the problems I have identified would be to turn the hierarchy on its head, putting teachers on the top and arraying others – pundits, professors, administrators, researchers, and so forth – below them. This would require a major change in our thinking and in our behavior and, however reasonable it may appear to be, I do not see this happening. (Clarke, 1994: 18)

In 2017, ELT Journal published an entertaining piece of trolling by Péter Medgyes, ‘The (ir)relevance of academic research for the language teacher’, in which he refers to the expert status of researchers as related to the ‘orthodox and mistaken belief that by virtue of having churned out tons of academic papers and books, they must be performing an essential service for language education’. It is not hard to imagine the twinkle in his eye as he wrote it. In the same volume, Amos Paran (2017) picks up the bait, arguing for more dialogue between researchers and teachers. In response to Paran’s plea, Medgyes points out that there is an irony in preaching the importance of dialogue in a top-down manner. ‘As long as the playing field is uneven, it is absurd to talk about dialogue, if a dialogue is at all necessary’, he writes. The same holds true for Sato and Loewen. They acknowledge (Sato & Loewen, 2018) that ‘researchers’ top-down attitudes will not facilitate the dialogue’, but, try as they might, their own mindset is seemingly inescapable. In one article that was attempting to reach out to teachers (Sato, Loewen & Kim, 2021), they managed to make one teacher trainer, Sandy Millin, feel that teachers were being unfairly attacked.

The phrase ‘We need to talk’ has been described as, perhaps, the most dreaded four words in the English language. When you hear it, you know (1) that someone wants to talk to you (and not the other way round), (2) that, whether you want to talk or not, the other person will have their say, (3) that the discussion will almost certainly involve some criticism of you, and this may be merited, and (4) whatever happens next, it is unlikely that your relationship will improve.

References

Borg, S. (2009). English language teachers’ conceptions of research. Applied Linguistics, 30 (3): 358 – 88

Clarke, M. (1994). The dysfunctions of theory/practice discourse. TESOL Quarterly, 28: 9-26.

Farrell, T. (2016). Reflection, reflection, reflection. Responses to the Chapter: More Research is Needed – A Mantra Too Far? Humanising Language Teaching, 18 (3) http://old.hltmag.co.uk/jun16/mart.htm

Maley, A. (2016). ‘More Research is Needed’ – A Mantra Too Far? Humanising Language Teaching, 18 (3) http://old.hltmag.co.uk/jun16/mart.htm

Medgyes, P. (2017). The (ir)relevance of academic research for the language teacher. ELT Journal, 71 (4): 491–498

Paran, A. (2017). ‘Only connect’: researchers and teachers in dialogue. ELT Journal, 71 (4): 499 – 508

Sato, M., & Loewen, S. (2022). The research-practice dialogue in second language learning and teaching: Past, present, and future. The Modern Language Journal, 106 (3)

Sato, M. & Loewen, S. (Eds.) (2019) Evidence-Based Second Language Pedagogy. New York: Routledge

Sato, M. & Loewen, S. (2018). Do teachers care about research? The research–pedagogy dialogue. ELT Journal 73 (1): 1 – 10

Sato, M., Loewen, S. & Kim, Y. J. (2021) The role and value of researchers for teachers: five principles for mutual benefit. TESOL AL Forum September 2021. http://newsmanager.commpartners.com/tesolalis/issues/2021-08-30/email.html#4


[1] I am actually a direct customer of Sato and Loewen, having bought for £35 last year a copy of their edited volume ‘Evidence-Based Second Language Pedagogy’. According to the back cover, it is a ‘cutting-edge collection of empirical research [which closes] the gap between research and practice’. In reality, it’s a fairly random collection of articles of very mixed quality, many of which are co-authored by ‘top scholars’ and the PhD students they are supervising. It does nothing to close any gaps between research and practice and I struggle to see how it could be of any conceivable benefit to teachers.

Five years ago, in 2016, there was an interesting debate in the pages of the journal ‘Psychological Review’. It began with an article by Jeffrey Bowers (2016a), a psychologist at the University of Bristol, who argued that neuroscience (as opposed to psychology) has little, or nothing, to offer us, and is unlikely ever to be able to do so, in terms of improving classroom instruction. He wasn’t the first to question the relevance of neuroscience to education (see, for example, Willingham, 2009), but this was a full-frontal attack. Bowers argued that ‘neuroscience rarely offers insights into instruction above and beyond psychology’ and that neuroscientific evidence that the brain changes in response to instruction is irrelevant. His article was followed by two counter-arguments (Gabrieli, 2016; Howard-Jones et al., 2016), which took him to task for too narrowly limiting the scope of education to classroom instruction (neglecting, for example, educational policy), for ignoring the predictive power of neuroimaging on neurodevelopmental differences (and, therefore, its potential value in individualising curricula), and for failing to take account of the progress that neuroscience, in collaboration with educators, has already made. Bowers’ main argument, that educational neuroscience had little to tell us about teaching, was not really addressed in the counter-arguments, and Bowers (2016b) came back with a counter-counter-rebuttal.

The brain responding to seductive details

In some ways, the debate, like so many of the kind, suffered from the different priorities of the participants. For Gabrieli and Howard-Jones et al., Bowers had certainly overstated his case, but they weren’t entirely in disagreement with him. Paul Howard-Jones has been quoted by André Hedlund as saying that ‘all neuroscience can do is confirm what we’ve been doing all along and give us new insights into a couple of new things’. One of Howard-Jones’ co-authors, Usha Goswami, director of the Centre for Neuroscience in Education at the University of Cambridge, has said that ‘there is a gulf between current science and classroom applications’ (Goswami, 2006).

For teachers, though, it is the classroom applications that are of interest. Claims for the relevance of neuroscience to ELT have been made by many. ‘We [in ESL / EFL] need it’, writes Curtis Kelly (2017). Insights from neuroscience can, apparently, make textbooks more ‘brain friendly’ (Helgesen & Kelly, 2015). Herbert Puchta’s books are advertised by Cambridge University Press as ‘based on the latest insights into how the brain works fresh from the field of neuroscience’. You can watch a British Council talk by Rachael Roberts, entitled ‘Using your brain: what neuroscience can teach us about learning’. And, in the year following the Bowers debate, Carol Lethaby and Patricia Harries gave a presentation at IATEFL Glasgow (Lethaby & Harries, 2018) entitled ‘Research and teaching: What has neuroscience ever done for us?’ – a title that I have lifted for this blog post. Lethaby and Harries provide a useful short summary of the relevance of neuroscience to ELT, and I will begin my discussion with that. They expand on this in their recent book (Lethaby, Mayne & Harries, 2021), a book I highly recommend.

So what, precisely, does neuroscience have to tell English language teachers? Lethaby and Harries put forward three main arguments. Firstly, neuroscience can help us to bust neuromyths (the examples they give are right / left brain dominance and learning styles). Secondly, it can provide information that informs teaching (the examples given are the importance of prior knowledge and the value of translation). Finally, it can validate existing best practice (the example given is the importance of prior knowledge). Let’s take a closer look.

I have always enjoyed a bit of neuromyth busting and I wrote about ‘Left brains and right brains in English language teaching’ a long time ago. It is certainly true that neuroscience has helped to dispel this myth: it is ‘simplistic at best and utter hogwash at worst’ (Dörnyei, 2009: 49). However, we did not need neuroscience to rubbish the practical teaching applications of this myth, which found their most common expression in Neuro-Linguistic Programming (NLP) and Brain Gym. Neuroscience simply banged in the final nail in the coffin of these trends. The same is true for learning styles and the meshing hypothesis. It’s also worth noting that, despite the neuroscientific evidence, such myths are taking a long time to die … a point I will return to at the end of this post.

Lethaby and Harries’s second and third arguments are essentially the same, unless, in their second point they are arguing that neuroscience can provide new information. I struggle, however, to see anything that is new. Neuroimaging apparently shows that the medial prefrontal cortex is activated when prior knowledge is accessed, but we have long known (since Vygotsky, at least!) that effective learning builds on previous knowledge. Similarly, the amygdala (known to be associated with the processing of emotions) may play an important role in learning, but we don’t need to know about the amygdala to understand the role of affect in learning. Lastly, the neuroscientific finding that different languages are not ‘stored’ in separate parts of the brain (Spivey & Hirsch, 2003) is useful to substantiate arguments that translation can have a positive role to play in learning another language, but convincing arguments predate findings such as these by many, many years. This would all seem to back up Howard-Jones’s observation about confirming what we’ve been doing and giving us new insights into a couple of new things. It isn’t the most compelling case for the relevance of neuroscience to ELT.

Chapter 2 of Carol Lethaby’s new book, ‘An Introduction to Evidence-based Teaching in the English Language Classroom’ is devoted to ‘Science and neuroscience’. The next chapter is called ‘Psychology and cognitive science’ and practically all the evidence for language teaching approaches in the rest of the book is drawn from cognitive (rather than neuro-) science. I think the same is true for the work of Kelly, Helgesen, Roberts and Puchta that I mentioned earlier.

It is perhaps the case these days that educationalists prefer to refer to ‘Mind, Brain, and Education Science’ (MBE) – the ‘intersection of neuroscience, education, and psychology’ – rather than educational neuroscience, but, looking at the literature of MBE, there’s a lot more education and psychology than there is neuroscience (although the latter always gets a mention). Probably the most comprehensive and well-known volume of practical ideas deriving from MBE is ‘Making Classrooms Better’ (Tokuhama-Espinosa, 2014). Of the 50 practical applications listed, most are either inspired by the work of John Hattie (2009) or the work of cognitive psychologists. Neuroscience hardly gets a look in.

To wrap up, I’d like to return to the question of neuroscience’s role in busting neuromyths. References to neuroscience, especially when accompanied by fMRI images, have a seductive appeal to many: they confer a sense of ‘scientific’ authority. Many teachers, it seems, are keen to hear about neuroscience (Pickering & Howard-Jones, 2007). Even when the discourse contains irrelevant neuroscientific information (diagrams of myelination come to mind), many of us find it satisfying (Weisberg et al., 2015; Weisberg et al., 2008), the so-called ‘seductive allure’ of neuroscience explanations, and it feeds an illusion of explanatory depth (Rozenblit & Keil, 2002). You are far more likely to see conference presentations, blog posts and magazine articles extolling the virtues of neuroscientific findings than you are to come across the kind of thing I am writing here. But is it possible that the much-touted idea that neuroscience can bust neuromyths is itself a myth?

Sadly, we have learnt in recent times that scientific explanations have only a very limited impact on the beliefs of large swathes of the population (teachers included, of course). Think of climate change and COVID. Why should neuroscience be any different? It probably isn’t. Scurich & Shniderman (2014) found that ‘neuroscience is more likely to be accepted and credited when it confirms prior beliefs’. We are more likely to accept neuroscientific findings because we ‘find them intuitively satisfying, not because they are accurate’ (Weisberg et al., 2008). Teaching teachers about educational neuroscience may not make much, if any, difference (Tham et al., 2019). I think there is a danger in using educational neuroscience, seductive details and all, to validate what we already do, rather than to question it. And those who don’t already do these things will probably ignore such findings as there are anyway.

References

Bowers, J. (2016a) The practical and principled problems with educational neuroscience. Psychological Review, 123 (5): 600–612

Bowers, J.S. (2016b) Psychology, not educational neuroscience, is the way forward for improving educational outcomes for all children: Reply to Gabrieli (2016) and Howard-Jones et al. (2016). Psychological Review, 123 (5): 628–635

Dörnyei, Z. (2009) The Psychology of Second Language Acquisition. Oxford: Oxford University Press

Gabrieli, J.D. (2016) The promise of educational neuroscience: Comment on Bowers (2016). Psychological Review, 123 (5): 613–619

Goswami, U. (2006) Neuroscience and education: From research to practice? Nature Reviews Neuroscience, 7: 406–413

Hattie, J. (2009) Visible Learning: A synthesis of over 800 meta-analyses relating to achievement. London: Routledge

Helgesen, M. & Kelly, C. (2015) Do-it-yourself: Ways to make your textbook more brain-friendly. SPELT Quarterly, 30 (3): 32–37

Howard-Jones, P.A., Varma, S., Ansari, D., Butterworth, B., De Smedt, B., Goswami, U., Laurillard, D. & Thomas, M. S. (2016) The principles and practices of educational neuroscience: Comment on Bowers (2016). Psychological Review, 123 (5): 620–627

Kelly, C. (2017) The Brain Studies Boom: Using Neuroscience in ESL/EFL Teacher Training. In Gregersen, T. S. & MacIntyre, P. D. (Eds.) Innovative Practices in Language Teacher Education pp.79-99 Springer

Lethaby, C. & Harries, P. (2018) Research and teaching: What has neuroscience ever done for us? In Pattison, T. (Ed.) IATEFL Glasgow Conference Selections 2017. Faversham, Kent, UK: IATEFL. pp. 36–37

Lethaby, C., Mayne, R. & Harries, P. (2021) An Introduction to Evidence-Based Teaching in the English Language Classroom. Shoreham-by-Sea: Pavilion Publishing

McCabe, D.P. & Castel, A.D. (2008) Seeing is believing: The effect of brain images on judgments of scientific reasoning. Cognition 107: 343–352.

Pickering, S. J. & Howard-Jones, P. (2007) Educators’ views on the role of neuroscience in education: findings from a study of UK and international perspectives. Mind Brain Education 1: 109–113.

Rozenblit, L., & Keil, F. (2002). The misunderstood limits of folk science: an illusion of explanatory depth. Cognitive science, 26(5), 521–562.

Scurich, N., & Shniderman, A. (2014) The selective allure of neuroscientific explanations. PLOS One, 9 (9), e107529. http://dx.doi.org/10.1371/journal.pone.0107529

Spivey, M. V. & Hirsch, J. (2003) Shared and separate systems in bilingual language processing: Converging evidence from eyetracking and brain imaging. Brain and Language, 86: 70–82

Tham, R., Walker, Z., Tan, S.H.D., Low, L.T. & Annabel Chan, S.H. (2019) Translating educational neuroscience for teachers. Learning: Research and Practice, 5 (2): 149-173 Singapore: National Institute of Education

Tokuhama-Espinosa, T. (2014) Making Classrooms Better. New York: Norton

Weisberg, D. S., Taylor, J. C. V. & Hopkins, E.J. (2015) Deconstructing the seductive allure of neuroscience explanations. Judgment and Decision Making, 10 (5): 429–441

Weisberg, D. S., Keil, F. C., Goodstein, J., Rawson, E., & Gray, J. R. (2008) The seductive allure of neuroscience explanations. Journal of Cognitive Neuroscience, 20 (3): 470–477

Willingham, D. T. (2009). Three problems in the marriage of neuroscience and education. Cortex, 45: 54-55.

I’ve written about mindset before (here), but a recent publication caught my eye, and I thought it was worth sharing.

Earlier this year, the OECD produced a report on its 2018 PISA assessments. This was significant because it was the first time that the OECD had attempted to measure mindsets and correlate them with academic achievement. Surveying some 600,000 15-year-old students in 78 countries and economies, it is, to date, the biggest and most global attempt to study the question. Before going any further, a caveat is in order. The main focus of PISA 2018 was on reading, so any correlations found between mindsets and achievement can only be interpreted in the context of gains in reading skills. This is important to bear in mind, as previous research indicates that mindsets may have different impacts on different school subjects.

There has been much debate about how best to measure mindsets and, indeed, whether they can be measured at all. The OECD approached the question by asking students to respond to the statement ‘Your intelligence is something about you that you can’t change very much’ by choosing ‘strongly disagree’, ‘disagree’, ‘agree’ or ‘strongly agree’. Disagreeing with the statement was considered a precursor of a growth mindset, as someone who thinks intelligence can change is more likely to challenge themselves to improve it. Across the sample, almost two-thirds of students showed a growth mindset, but there were big differences between countries, with students in Estonia, Denmark and Germany being much more growth-oriented than those in Greece, Mexico or Poland (among OECD countries) and the Philippines, Panama, Indonesia or Kosovo (among the non-OECD countries). In line with previous research, students from socio-economically advantaged backgrounds presented a growth mindset more often than those from socio-economically disadvantaged backgrounds.

I have my problems with the research methodology. A 15-year-old from a wealthy country is much more likely than peers in other countries to have experienced mindset interventions in school: motivational we-can-do-it posters, workshops on neuroplasticity, biographical explorations of success stories and the like. In some places, some students have been so exposed to this kind of thing that school leaders have realised that growth mindset interventions should be much more subtle, avoiding the kind of crude, explicit proselytising that simply makes many students roll their eyes. In contexts such as these, most students now know what they are supposed to believe concerning the malleability of intelligence, irrespective of what they actually believe. Therefore, asking them, in a formal context, to respond to statements which are obviously digging at mindsets is an invitation to provide what they know is the ‘correct response’. Others, who have not been so fortunate in receiving mindset training, are less likely to know the correct answer. Therefore, the research results probably tell us as much about educational practices as they do about mindsets. There are other issues with the chosen measurement tool, discussed in the report, including acquiescent bias and the fact that the cognitive load required by the question increases the likelihood of a random response. Still, let’s move on.

The report found that growth mindsets correlated with academic achievement in some (typically wealthier) countries, but not in others. Wisely, the report cautions that the findings do not establish cause-and-effect relations. This is wise because a growth mindset may, to some extent, be the result of academic success rather than its cause. As the report observes, students performing well may attribute their success to internal characteristics such as effort and perseverance, while those performing poorly may attribute it to immutable characteristics in order to preserve their self-esteem.

However, the report does list ways in which a growth mindset can lead to better achievement: valuing school more, setting more ambitious learning goals, higher levels of self-efficacy, higher levels of motivation and lower levels of fear of failure. This is a very circular kind of logic: these are the defining attributes of a growth mindset, but are they its results or simply its constituent parts? Incidentally, they were measured in the same way as mindset itself, by asking students to respond to statements like ‘I find satisfaction in working as hard as I can’ or ‘My goal is to learn as much as possible’. The questions are so loaded that we need to be very sceptical about the meaning of the results. The concluding remarks to this section of the report clearly indicate the bias of the research. The question asked is not ‘Can a growth mindset lead to better results?’ but ‘How can a growth mindset lead to better results?’

Astonishingly, the research did not investigate the impact of growth mindset interventions in schools on growth mindset. Perhaps, this is too hard to do in any reliable way. After all, what counts as a growth mindset intervention? A little homily from the teacher about how we can all learn from our mistakes or some nice posters on the walls? Or a more full-blooded workshop about neural plasticity with follow-up tasks? Instead, the research investigated more general teaching practices. The results were interesting. The greatest impacts on growth mindset come when students perceive their teachers as being supportive in a safe learning environment, and when teachers adapt their teaching to the needs of the class, as opposed to simply following a fixed syllabus. The findings about teacher feedback were less clear: “Whether teacher feedback influences students’ growth mindset development or the other way around, further research is required to investigate this relationship, and why it could differ according to students’ proficiency in reading”.

The final chapter of this report does not include any references to data from the PISA 2018 exercise. Instead, it repeats, in a very selective way, previous research findings such as:

  • Growth mindset interventions yield modest average treatment effects, but larger effects for specific subgroups.
  • Growth-mindset interventions fare well in both scalability and cost-effectiveness dimensions.

It ignores any discussion of whether we should be bothering with growth mindsets at all. It tells us something we already know (about the importance of teacher support and of adapting teaching to the needs of the class), but somehow concludes that “growth mindset interventions […] can be cost-effective ways to raise students’ outcomes on a large scale”. It is, to my mind, a classic example of ‘research’ that is looking to prove a point rather than critically investigate a phenomenon. In that sense, it is the very opposite of science.

OECD (2021) Sky’s the Limit: Growth Mindset, Students, and Schools in PISA. https://www.oecd.org/pisa/growth-mindset.pdf

Since no single definition of critical thinking prevails (Dummett & Hughes, 2019: 2), discussions of the topic invariably begin with attempts to provide a definition. Lai (2011) offers an accessible summary of a range of possible meanings, but points out that, in educational contexts, its meaning is often rather vague and encompasses other concepts (such as higher order thinking skills) which also lack clarity. Paul Dummett and John Hughes (2019: 4) plump for ‘a mindset that involves thinking reflectively, rationally and reasonably’ – a definition which involves a vague noun (that could mean a fixed state of mind, a learned attitude, a disposition or a mood) and three highly subjective adverbs. I don’t think I could do any better. However, instead of looking for a definition, we can reach a sort of understanding by looking at examples of it. Dummett and Hughes’ book is extremely rich in practical examples, and the picture that emerges of critical thinking is complex and multifaceted.

As you might expect of a weasel word like ‘critical thinking’, there appears to be general agreement that it’s a ‘good thing’. Paul Dummett suggests that there are two common reasons for promoting the inclusion of critical thinking activities in the language classroom. The first of these is a desire to get students thinking for themselves. The second is the idea ‘that we live in an age of misinformation in which only the critically minded can avoid manipulation or slavish conformity’. Neither seems contentious at first glance, although he points out that ‘they tend to lead to a narrow application of critical thinking in ELT materials: that is to say, the analysis of texts and evaluation of the ideas expressed in them’. It’s the second of these rationales that I’d like to explore further.

Penny Ur (2020: 9) offers a more extended version of it:

The role of critical thinking in education has become more central in the 21st century, simply because there is far more information readily available to today’s students than there was in previous centuries (mainly, but not only, online), and it is vital for them to be able to deal with such input wisely. They need to be able to distinguish between what is important and what is trivial, between truth and lies, between fact and opinion, between logical argument and specious propaganda […] Without such skills and awareness of the need to exercise them, they are liable to find themselves victims of commercial or political interests, their thinking manipulated by persuasion disguised as information.

In the same edited collection Olja Milosevic (2020:18) echoes Ur’s argument:

Critical thinking becomes even more important as communication increasingly moves online. Students find an overwhelming amount of information and need to be taught how to evaluate its relevance, accuracy and quality. If teachers do not teach students how to go beyond surface meaning, students cannot be expected to practise it.

In the passages I’ve quoted, these writers are referring to one particular kind of critical thinking. The ability to critically evaluate the reliability, accuracy, etc. of a text is generally considered part of what is usually called ‘media information literacy’. In these times of fake news, so the argument goes, it is vital for students to develop (with their teachers’ help) the skills needed to spot fake news when they see it. The most prototypical critical thinking activity in ELT classrooms is probably one in which students analyse some fake news, such as the website about the Pacific Tree Octopus (the basis of a lesson in Dudeney et al., 2013: 198–203).

Before considering media information literacy in more detail, it’s worth noting in passing that a rationale for critical thinking activities is no rationale at all if it only concerns one aspect of critical thinking: it applies the attributes of a part (media information literacy) to the whole (critical thinking).

There is no shortage of good (free) material available for dealing with fake news in the ELT classroom. Examples include work by James Taylor, Chia Suan Chong and Tyson Seburn. Material of this kind may result in lively, interesting, cognitively challenging, communicative and, therefore, useful lessons. But how likely is it that material of this kind will develop learners’ media information literacy and, by extension therefore, their critical thinking skills? How likely is it that teaching material of this kind will help people identify (and reject) fake news? Is it possible that material of this kind is valuable despite its rationale, rather than because of it? In the spirit of rational, reflective and reasonable thinking, these are questions that seem to be worth exploring.

ELT classes and fake news

James Taylor has suggested that the English language classroom is ‘the perfect venue for [critical thinking] skills to be developed’. Although academic English courses necessarily involve elements of critical thinking, I’m not so sure that media information literacy (and, specifically, the identification of fake news) can be adequately addressed in general English classes. There are so many areas, besides those that are specifically language-focussed, competing for space in language classes (think of all those other 21st century skills), that it is hard to see how sufficient time can be found for real development of this skill. It requires modelling, practice of the skill, feedback on the practice, and more practice (Mulnix, 2010): it needs time. Fake news activities in the language classroom would, of course, be of greater value if they were part of an integrated approach across the curriculum. Unfortunately, this is rarely the case.

Information literacy skills

Training materials for media information literacy usually involve a number of stages: fact-checking and triangulation of different sources, consideration of the web address, analysis of images and of other items on the site, source citation and so on. The problem, however, is that news-fakers have become very good at what they do. The tree octopus site is very crude in comparison with what can now be produced by people who have learnt to profit from the online economy of misinformation. Facebook employs an army of algorithmic and human fact-checkers, but still struggles. The bottom line is that background knowledge is needed (this is as true for media information literacy as it is for critical thinking more generally) (Willingham, 2007). With news, the scope of domain knowledge is so vast that it is extremely hard to transfer the ability to critically evaluate one particular piece of news to another. We are all fooled from time to time.

Media information literacy interventions: research on effectiveness

With the onset of COVID-19, the ability to identify fake news has become, more than ever, a matter of life and death. There is little question that this ability correlates strongly with analytic thinking (see, for example, Stanley et al., 2020). What is much less clear is how we can go about promoting analytic thinking. Analytic thinking comes in different varieties, and another hot-off-the-press research study into susceptibility to COVID-19 fake news (Roozenbeek et al., 2020) has found that the ability to spot fake news may correlate more strongly with numerical literacy than with reasoning ability. In fact, the research team found that a lack of numerical literacy was the most consistent predictor of susceptibility to misinformation about COVID-19. Perhaps we are attempting to develop the wrong kind of analytic thinking?

In educational contexts, attempts to promote media information literacy typically seek to develop reasoning abilities, and the evidence for their effectiveness is mixed. First of all, it needs to be said that ‘little large-scale evidence exists on the effectiveness of promoting digital media literacy as a response to online misinformation’ (Guess et al., 2020). An early meta-analysis (Jeong et al., 2012) found that such interventions had a positive effect when they were long (not one-off), but that they impacted more on students’ knowledge than on their behaviour. More recently, Huguet et al. (2019) were unable to draw ‘definitive conclusions from past research, such as what kinds of media literacy practices work and under what conditions’. And this year, a study by Guess et al. (2020) did not generate sufficient evidence ‘to conclude that the [media information literacy] intervention changed real-world consumption of false news’. I am unaware of any robust research in this area in the context of ELT.

It’s all rather disappointing. Why are we not better at it? After all, teachers of media studies have been exploring pathways for many years now. One possible answer is this: Media information literacy, like critical thinking more generally, is a skill that is acquirable, but it can only be acquired if there is a disposition to do so. The ability to think critically and the disposition to do so are separate entities (Facione, 2000). Training learners to be more critical in their approach to media information may be so much pissing in the wind if the disposition to be sceptical is not there. Shaping dispositions is a much harder task than training skills.

Both of the research studies into susceptibility to COVID-19 misinformation that I referred to earlier in this section underscore the significance of dispositions to analytic thinking. Roozenbeek et al. (2020) found, in line with much previous research (for example, Jost et al., 2018), that political conservatism is associated with a slightly higher susceptibility to misinformation. Political views (on either side of the political spectrum) rarely change as a result of exposure to science or reasoned thinking. They also found that ‘self-identifying as a member of a minority predicts susceptibility to misinformation about the virus in all countries surveyed’ (except, interestingly, in the UK). Again, when issues of identity are at stake, emotional responses tend to trump rational ones.

Rational, reflective and reasonable thinking about media information literacy leads to an uncomfortable red-pill rabbit-hole. This is how Bulger and Davidson (2018) put it:

The extent to which media literacy can combat the problematic news environment is an open question. Is denying the existence of climate change a media literacy problem? Is believing that a presidential candidate was running a sex-trafficking ring out of a pizza shop a media literacy problem? Can media literacy combat the intentionally opaque systems of serving news on social media platforms? Or intentional campaigns of disinformation?

Teachers and fake news

The assumption that the critical thinking skills of young people can be developed through the intervention of their teachers is rarely problematized. It should be. A recent study of Spanish pre-service teachers (Fuertes-Prieto et al., 2020) showed that their ‘level of belief in pseudoscientific issues is comparable, or even higher in some cases to those of the general population’. There is no reason to believe that this changes after they qualify. Teachers are probably no more likely than people from any other profession to change their beliefs when presented with empirical evidence (Menz et al., 2020). Research has tended to focus on teachers’ lack of critical thinking in areas related to their work, but things may be no different in the wider world. It is estimated that over a quarter of teachers in the US voted for the world’s greatest peddler of fake news in the 2016 presidential election.

It is also interesting to note that the sharing of fake news on social media is much more widespread among older people (including US teachers who have an average age of 42.4) than those under 30 (Bouygues, 2019).

Institutional contexts and fake news

Cory Doctorow has suggested that the fake news problem is not a problem of identifying what is true and what is fake, but a problem ‘about how we know whether something is true. We’re not disagreeing about facts, we’re disagreeing about epistemology’. In a post-modernist world of ‘Truth Decay’ (Kavanagh & Rich, 2018), where there is ‘a blurring of the line between opinion and fact’, epistemological authority is a rare commodity. Medicine, the social sciences and applied linguistics are all currently experiencing a ‘replication crisis’ (Ioannidis, 2005), and a British education minister has told us that ‘people in this country have had enough of experts’.

News reporting has always relied to some extent on trust in the reliability of the news source. The BBC or CNN might attempt to present themselves as more objective than, say, Fox News or InfoWars, but trust in all news outlets has collapsed globally in recent years. As Michael Schudson has written in the Columbia Journalism Review, ‘all news outlets write from a set of values, not simply from a disinterested effort at truth’. If a particular news channel manifestly holds different values from your own, it is easy to reject the veracity of the news it reports. Believers in COVID conspiracy theories often hold their views precisely because they reject the epistemological authority of mainstream news, the WHO and the governments that support lockdown measures.

The training of media information literacy in schools is difficult because, for many people in the US (and elsewhere), education is not dissimilar to mainstream media. They ‘are seen as the enemy — two institutions who are trying to have power over how people think. Two institutions that are trying to assert authority over epistemology’ (boyd, 2018). Schools have always been characterized by imbalances in power (between students and teachers / administrators), and this power dynamic is not conducive to open-minded enquiry. Children are often more aware of the power of their teachers than they are accepting of their epistemological authority. They are enjoined to be critical thinkers, but only about certain things and only up to a certain point. One way for children to redress the power imbalance is to reject the epistemological authority of their teachers. I think this may explain why a group of young children I observed recently coming out of a lesson devoted to environmental issues found such pleasure in joking about Greta ‘Thunfisch’.

Power relationships in schools are reflected and enacted in the interaction patterns between teachers and students. The most common of these is ‘initiation-response-feedback (IRF)’ and it is unlikely that this is particularly conducive to rational, reflective and reasonable thinking. At the same time, as Richard Paul, one of the early advocates of critical thinking in schools, noted, much learning activity is characterised by lower order thinking skills, especially memorization (Paul, 1992: 22). With this kind of backdrop, training in media information literacy is more likely to be effective if it goes beyond the inclusion of a few ‘fake news’ exercises: a transformation in the way that the teaching is done will also be needed. Benesch (1999) describes this as a more ‘dialogic’ approach and there is some evidence that a more dialogic approach can have a positive impact on students’ dispositions (e.g. Hajhosseiny, 2012).

I think that David Buckingham (2019a) captures the educational problem very neatly:

There’s a danger here of assuming that we are dealing with a rational process – or at least one that can, by some pedagogical means, be made rational. But from an educational perspective, we surely have to begin with the question of why people might believe apparently ‘fake’ news in the first place. Where we decide to place our trust is as much to do with fantasy, emotion and desire, as with rational calculation. All of us are inclined to believe what we want to believe.

Fake news: a problem or a symptom of a problem?

There has always been fake news. The big problem now is ‘the speed and ease of its dissemination, and it exists primarily because today’s digital capitalism makes it extremely profitable – look at Google and Facebook – to produce and circulate false but click-worthy narratives’ (Morozov, 2017). Fake news taps into and amplifies broader tendencies and divides in society: the problem is not straightforward and is unlikely to be easy to eradicate (Buckingham, 2019a: 3).

There is increasing discussion of media regulation, and the recent banning by Facebook of Holocaust denial and QAnon content is a recognition that some regulation can no longer be avoided. But strict regulation would threaten the ‘basic business model, and the enormous profitability’ of social media companies (Buckingham, 2019b), and there are real practical and ethical problems in working out exactly how regulation would happen. Governments do not know what to do.

Lacking any obvious alternative, media information literacy is often seen as the solution: can’t we ‘fact check and moderate our way out of this conundrum’ (boyd, 2018)? danah boyd’s stark response is, no, this will fail. It’s an inadequate solution to an oversimplified problem (Buckingham, 2019a).

Along with boyd and Buckingham, I’m not trying to argue that we drop media information literacy activities from educational (including ELT) programmes. Quite the opposite. But if we want our students to think reflectively, rationally and reasonably, I think we will need to start by doing the same.

References

Benesch, S. (1999). Thinking critically, thinking dialogically. TESOL Quarterly, 33: pp. 573–580

Bouygues, H. L. (2019). Fighting Fake News: Lessons From The Information Wars. Reboot Foundation https://reboot-foundation.org/fighting-fake-news/

boyd, d. (2018). You Think You Want Media Literacy… Do You? Data and Society: Points https://points.datasociety.net/you-think-you-want-media-literacy-do-you-7cad6af18ec2

Buckingham, D. (2019a). Teaching Media in a ‘Post-Truth’ Age: Fake News, Media Bias and the Challenge for Media Literacy Education. Cultura y Educación 31(2): pp. 1-19

Buckingham, D. (2019b). Rethinking digital literacy: Media education in the age of digital capitalism. https://ddbuckingham.files.wordpress.com/2019/12/media-education-in-digital-capitalism.pdf

Bulger, M. & Davidson, P. (2018). The Promises, Challenges and Futures of Media Literacy. Data and Society. https://datasociety.net/pubs/oh/DataAndSociety_Media_Literacy_2018.pdf

Doctorow, C. (2017). Three kinds of propaganda, and what to do about them. boingboing 25th February 2017, https://boingboing.net/2017/02/25/counternarratives-not-fact-che.html

Dudeney, G., Hockly, N. & Pegrum, M. (2013). Digital Literacies. Harlow: Pearson Education

Dummett, P. & Hughes, J. (2019). Critical Thinking in ELT. Boston: National Geographic Learning

Facione, P. A. (2000). The disposition toward critical thinking: Its character, measurement, and relation to critical thinking skill. Informal Logic, 20(1), 61–84.

Fuertes-Prieto, M.Á., Andrés-Sánchez, S., Corrochano-Fernández, D. et al. (2020). Pre-service Teachers’ False Beliefs in Superstitions and Pseudosciences in Relation to Science and Technology. Science & Education 29, 1235–1254 (2020). https://doi.org/10.1007/s11191-020-00140-8

Guess, A. M., Lerner, M., Lyons, B., Montgomery, J. M., Nyhan, N., Reifler, J. & Sircar, N. (2020). A digital media literacy intervention increases discernment between mainstream and false news in the United States and India. Proceedings of the National Academy of Sciences Jul 2020, 117 (27) 15536-15545; DOI: 10.1073/pnas.1920498117

Hajhosseiny, M. (2012). The Effect of Dialogic Teaching on Students’ Critical Thinking Disposition. Procedia – Social and Behavioral Sciences, 69: pp. 1358 – 1368

Huguet, A., Kavanagh, J., Baker, G. & Blumenthal, M. S. (2019). Exploring Media Literacy Education as a Tool for Mitigating Truth Decay. RAND Corporation, https://www.rand.org/content/dam/rand/pubs/research_reports/RR3000/RR3050/RAND_RR3050.pdf

Ioannidis, J. P. A. (2005). Why Most Published Research Findings Are False. PLoS Medicine 2 (8): e124. https://doi.org/10.1371/journal.pmed.0020124

Jeong, S. H., Cho, H., & Hwang, Y. (2012). Media literacy interventions: A meta-analytic review. Journal of Communication, 62, pp. 454–472

Jones-Jang, S. M., Mortensen, T. & Liu, J. (2019). Does media literacy help identification of fake news? Information literacy helps, but other literacies don’t. American Behavioral Scientist, pp. 1 – 18, doi:10.1177/0002764219869406

Jost, J. T., van der Linden, S., Panagopoulos, C. & Hardin, C. D. (2018). Ideological asymmetries in conformity, desire for shared reality, and the spread of misinformation. Current Opinion in Psychology, 23: pp. 77-83. doi:10.1016/j.copsyc.2018.01.003

Kavanagh, J. & Rich, M. D. (2018). Truth Decay: An Initial Exploration of the Diminishing Role of Facts and Analysis in American Public Life. RAND Corporation, https://www.rand.org/pubs/research_reports/RR2314.html

Lai, E. R. (2011). Critical Thinking: A Literature Review. Pearson. http://images.pearsonassessments.com/images/tmrs/CriticalThinkingReviewFINAL.pdf

Menz, C., Spinath, B. & Seifried, E. (2020). Misconceptions die hard: prevalence and reduction of wrong beliefs in topics from educational psychology among preservice teachers. European Journal of Psychology of Education https://doi.org/10.1007/s10212-020-00474-5

Milosevic, O. (2020). Promoting critical thinking in the EFL classroom. In Mavridi, S. & Xerri, D. (Eds.) English for 21st Century Skills. Newbury, Berks.: Express Publishing. pp.17 – 22

Morozov, E. (2017). Moral panic over fake news hides the real enemy – the digital giants. The Guardian, 8 January 2017 https://www.theguardian.com/commentisfree/2017/jan/08/blaming-fake-news-not-the-answer-democracy-crisis

Mulnix, J. W. (2010). Thinking critically about critical thinking. Educational Philosophy and Theory

Paul, R. W. (1992). Critical thinking: What, why, and how? New Directions for Community Colleges, 77: pp. 3–24.

Roozenbeek, J., Schneider, C. R., Dryhurst, S., Kerr, J., Freeman, A. L. J., Recchia, G., van der Bles, A. M. & van der Linden, S. (2020). Susceptibility to misinformation about COVID-19 around the world. Royal Society Open Science, 7 (10) https://doi.org/10.1098/rsos.201199

Stanley, M., Barr, N., Peters, K. & Seli, P. (2020). Analytic-thinking predicts hoax beliefs and helping behaviors in response to the COVID-19 pandemic. PsyArXiv Preprints doi:10.31234/osf.io/m3vt

Ur, P. (2020). Critical Thinking. In Mavridi, S. & Xerri, D. (Eds.) English for 21st Century Skills. Newbury, Berks.: Express Publishing. pp.9 – 16

Willingham, D. T. (2007). Critical Thinking: Why Is It So Hard to Teach? American Educator Summer 2007: pp. 8 – 19

A week or so ago, someone in the Macmillan marketing department took it upon themselves to send out this tweet. What grabbed my attention was the claim that it is ‘a well-known fact’ that teaching students a growth mindset makes them perform better academically over time. The easily demonstrable reality (which I’ll come on to) is that this is not a fact. It’s fake news, being used for marketing purposes. The tweet links to a blog post from over a year ago, in which Chia Suan Chong offers five tips for developing a growth mindset in students: educating students about neuroplasticity, delving deeper into success stories, celebrating challenges and mistakes, encouraging students to go outside their comfort zones, and giving ‘growth-mindset feedback’. All of which, she suggests, might help our students. Indeed, they might, and, even if they don’t, they might be worth a try anyway. Chia doesn’t make any claims beyond the potential of the suggested strategies, so I wonder where the person behind the Macmillan Twitter account got the ‘well-known fact’ from.

If you google ‘mindset ELT’, you will find webpage after webpage offering tips about how to promote growth mindset in learners. It’s rare for the writers of these pages to claim that the positive effects of mindset interventions are a ‘fact’, but it’s even rarer to come across anyone who suggests that mindset interventions might be an à la mode waste of time and effort. Even in more serious literature (e.g. Mercer, S. & Ryan, S. (2010). A mindset for EFL: learners’ beliefs about the role of natural talent. ELT Journal, 64 (4): 436 – 444), the approach is fundamentally enthusiastic, with no indication that there might be a problem with mindset theory. Given that this enthusiasm is repeated so often, perhaps we should not blame the Macmillan tweeter for falling victim to the illusory truth effect. After all, it appears that 98% of teachers in the US feel that growth mindset approaches should be adopted in schools (Hendrick, 2019).

Chia suggests that we can all have fixed mindsets in certain domains (e.g. I know all about that, there’s nothing more I can learn). One domain where it seems that fixed mindsets are prevalent is mindset theory itself. This post is an attempt to nudge towards more ‘growth’ and, in trying to persuade you to be more sceptical, I will quote as much as possible from Carol Dweck, the founder of mindset theory, and her close associates.

Carol Dweck’s book ‘Mindset: The New Psychology of Success’ appeared in 2006. In it, she argued that people can be placed on a continuum between those who have ‘a fixed mindset–those who believe that abilities are fixed—[and who] are less likely to flourish [and] those with a growth mindset–those who believe that abilities can be developed’ (from the back cover of the updated (2007) version of the book). There was nothing especially new about the idea. It is very close to Bandura’s (1982) theory of self-efficacy, which will be familiar to anyone who has read Zoltán Dörnyei’s more recent work on motivation in language learning. It’s closely related to Carl Rogers’ (1969) ideas about self-concept, and it’s not a million miles removed, either, from Maslow’s (1943) theory of self-actualization. The work of Rogers and Maslow was at the heart of the ‘humanistic turn’ in ELT in the latter part of the 20th century (see, for example, Early, 1981), so mindset theory is likely to resonate with anyone who was inspired by the humanistic work of people like Moskowitz, Stevick or Rinvolucri. The appeal of mindset theory is easy to see. Besides its novelty value, it resonates emotionally with the values that many teachers share, writes Tom Bennett: it feels right that you don’t criticise the person, but instead invite them to believe that, through hard work and persistence, they can achieve.

We might even trace interest in the importance of self-belief back to the Stoics (who, incidentally but not coincidentally, are experiencing a revival of interest), but Carol Dweck introduced a more modern flavour to the old wine and packaged it skilfully and accessibly in shiny new bottles. Her book was a runaway bestseller, with sales in the millions, and her TED Talk has now had over 11 million views. It was in education that mindset theory became particularly popular. As a mini-industry it is now worth millions and millions. Just one research project into the efficacy of one mindset product has received 3.5 million dollars in US federal funding.

But, as with other ideas that have done a roaring trade in popular psychology by seeming to offer simple solutions to complex problems (Howard Gardner’s ‘multiple intelligences’ theory, for example), there was soon pushback. It wasn’t hard for critics to scoff at motivational ‘yes-you-can’ posters in classrooms or accounts of well-meaning but misguided teacher interventions, like this one reported by Carl Hendrick:

One teacher [took] her children out into the pristine snow covering the school playground, she instructed them to walk around, taking note of their footprints. “Look at these paths you’ve been creating,” the teacher said. “In the same way that you’re creating new pathways in the snow, learning creates new pathways in your brain.”

Carol Dweck was sympathetic to the critics. She has described the early reaction to her book as ‘uncontrollable’. She freely admits that she and her colleagues had underestimated the issues around mindset interventions in the classroom and that such interventions were ‘not yet evidence-based’. She identified two major areas where mindset interventions have gone awry. The first of these is when a teacher teaches the concept of mindsets to students, but does not change other policies and practices in the classroom. The second is that some teachers have focussed too much on praising their learners’ efforts. Teachers have taken mindset recipes and tips without due consideration. She says:

Teachers have to ask, what exactly is the evidence suggesting? They have to realise it takes deep thought and deep experimentation on their part in the classroom to see how best the concept can be implemented there. This should be a group enterprise, in which they share what worked, what did not work, for whom and when. People need to recognise we are researchers, we have produced a body of evidence that says under these conditions this is what happened. We have not explored all the conditions that are possible. Teacher feedback on what is working and not working is hugely valuable to us to tell us what we have not done and what we need to do.

Critics like Dylan Wiliam, Carl Hendrick and Timothy Bates found that it was impossible to replicate Dweck’s findings, and that there were at best weak correlations between growth mindset and academic achievement, and between mindset interventions and academic gains. They were happy to concede that typical mindset interventions would not do any harm, but asked whether the huge amounts of money being spent on mindset would not be better invested elsewhere.

Carol Dweck seems to like the phrase ‘not yet’. She argues, in her TED Talk, that simply using the words ‘not yet’ can build students’ confidence, and her tip is often repeated by others. She also talks about mindset interventions being ‘not yet evidence-based’, which is a way of declaring her confidence that they soon will be. But, with huge financial backing, Dweck and her colleagues have recently been carrying out a lot of research and the results are now coming in. There are a small number of recent investigations that advocates of mindset interventions like to point to. For reasons of space, I’ll refer to two of them.

The first of these (Outes-Leon et al., 2020) looked at an intervention with children in the first grades of a few hundred Peruvian secondary schools. The intervention consisted of students individually reading a text designed to introduce them to the concept of growth mindset. This was followed by a group debate about the text, before each student wrote a reflective letter to a friend or relative describing what they had learned. In total, this amounted to about 90 minutes of activity. Subsequently, teachers made a subjective assessment of the ‘best’ letters and attached these to the classroom wall, along with a growth mindset poster, for the rest of the school year. Teachers were also asked to take a picture of the students alongside the letters and the poster and to share this picture by email.

Academic progress was measured 2 and 14 months after the intervention and compared to a large control group. The short-term (2 months) impact of the intervention was positive for mathematics, but less so for reading comprehension. (Why?) These gains were only visible in regional schools, not at all in metropolitan schools. Similar results were found when looking at the medium-term (14 month) impact. The reasons for this are unclear. It is hypothesized that the lower-achieving students in regional schools might benefit more from the intervention. Smaller class sizes in regional schools might also be a factor. But, of course, many other explanations are possible.
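When researchers report this kind of intervention-versus-control comparison, the result is usually expressed as a standardised mean difference (Cohen’s d). Here is a minimal sketch of that calculation; the scores and group sizes are entirely hypothetical, not figures from the Peruvian study.

```python
import math

def cohens_d(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Standardised mean difference between a treatment and a control group."""
    # Pool the two groups' standard deviations, weighted by their sizes.
    pooled_sd = math.sqrt(
        ((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2)
    )
    return (mean_t - mean_c) / pooled_sd

# Hypothetical maths scores at a follow-up assessment:
d = cohens_d(mean_t=52.0, sd_t=10.0, n_t=400,
             mean_c=51.0, sd_c=10.0, n_c=400)
print(round(d, 2))  # 0.1 — a 'small' effect by conventional benchmarks
```

An effect of this size means the average student in the intervention group scored only a tenth of a standard deviation above the control average, which is why subgroup differences (regional vs. metropolitan schools) matter so much to the interpretation.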

The paper is entitled The Power of Believing You Can Get Smarter. The authors make it clear that they were looking for positive evidence of the intervention and they were supported by mindset advocates (e.g. David Yeager) from the start. It was funded by the World Bank, which is a long-standing advocate of growth mindset interventions. (Rather jumping the gun, the World Bank’s Mindset Team wrote in 2014 that teaching growth mindset ‘is not just another policy fad. It is backed by a burgeoning body of empirical research’.) The paper’s authors conclude that ‘the benefits of the intervention were relevant and long-lasting in the Peruvian context’, and they focus strongly on the low costs of the intervention. They acknowledge that ‘the way the tool is introduced (design of the intervention) and the context in which this occurs (i.e., school and teacher characteristics) both matter to understand potential gains’. But without understanding the role of the context, we haven’t really learned anything practical that we can take away from the research. Our understanding of the power of believing you can get smarter has not been meaningfully advanced.

The second of these studies (Yeager et al., 2019) took many thousands of lower-achieving American 9th graders from a representative sample of schools. It is a very well-designed and thoroughly reported piece of research. The intervention consisted of two 25-minute online sessions, 20 days apart, which sought to reduce the negative effort beliefs of students (the belief that having to try hard or ask for help means you lack ability), fixed-trait attributions (the attribution that failure stems from low ability) and performance avoidance goals (the goal of never looking stupid). An analysis of academic achievement at the end of the school year indicated clearly that the intervention led to improved performance. These results give clear grounds for optimism about the potential of growth mindset interventions, but the report is careful to avoid overstatement. We have learnt about one particular demographic with one particular intervention, but it would be wrong to generalise beyond that. The researchers had hoped that the intervention would help to compensate for unsupportive school norms, but found that this was not the case. Instead, they found that it was when the peer norm supported the adoption of intellectual challenges that the intervention promoted sustained benefits. Context, as in the Peruvian study, was crucial. The authors write:

We emphasize that not all forms of growth mindset interventions can be expected to increase grades or advanced course-taking, even in the targeted subgroups. New growth mindset interventions that go beyond the module and population tested here will need to be subjected to rigorous development and validation processes.

I think that a reasonable conclusion from reading this research is that it may well be worth experimenting with growth mindset interventions in English language classes, but without any firm expectation of any positive impact. If nothing else, the interventions might provide useful, meaningful practice of the four skills. First, though, it would make sense to read two other pieces of research (Sisk et al., 2018; Burgoyne et al., 2020). Unlike the projects I have just discussed, these were not carried out by researchers with an a priori enthusiasm for growth-mindset interventions. And the results were rather different.

The first of these (Sisk et al., 2018) was a meta-analysis of the literature. It found only a weak correlation between mindset and academic achievement, and only a weak correlation between mindset interventions and academic gains. It did, however, lend support to one of the conclusions of Yeager et al. (2019): that such interventions may benefit students who are academically at risk.
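A rough way to see why correlations of this size are called ‘weak’ is to square them: a correlation of r between mindset and achievement means mindset accounts for r² of the variance in achievement. The sketch below makes the arithmetic explicit; the r values are illustrative benchmarks, not figures taken from the meta-analysis itself.

```python
# Squaring a correlation gives the proportion of variance explained.
# 0.10 is often labelled 'small', 0.30 'medium', 0.50 'large'.
for r in (0.10, 0.30, 0.50):
    print(f"r = {r:.2f} -> explains {r**2:.0%} of the variance")
```

So even taking a weak correlation at face value, the great majority of the variation in academic achievement is left unexplained by mindset.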

The second (Burgoyne et al., 2020) found that ‘the foundations of mind-set theory are not firm’ and that ‘bold claims about mind-set appear to be overstated’. Other constructs, ‘such as self-efficacy and need for achievement, [were] found to correlate much more strongly with presumed associates of mind-set’.

So, where does this leave us? We are clearly a long way from ‘facts’; mindset interventions are ‘not yet evidence-based’. Carl Hendrick (2019) provides a useful summary:

The truth is we simply haven’t been able to translate the research on the benefits of a growth mindset into any sort of effective, consistent practice that makes an appreciable difference in student academic attainment. In many cases, growth mindset theory has been misrepresented and miscast as simply a means of motivating the unmotivated through pithy slogans and posters. […] Recent evidence would suggest that growth mindset interventions are not the elixir of student learning that many of its proponents claim it to be. The growth mindset appears to be a viable construct in the lab, which, when administered in the classroom via targeted interventions, doesn’t seem to work at scale. It is hard to dispute that having a self-belief in their own capacity for change is a positive attribute for students. Paradoxically, however, that aspiration is not well served by direct interventions that try to instil it.

References

Bandura, Albert (1982). Self-efficacy mechanism in human agency. American Psychologist, 37 (2): pp. 122–147. doi:10.1037/0003-066X.37.2.122.

Burgoyne, A. P., Hambrick, D. Z., & Macnamara, B. N. (2020). How Firm Are the Foundations of Mind-Set Theory? The Claims Appear Stronger Than the Evidence. Psychological Science, 31(3), 258–267. https://doi.org/10.1177/0956797619897588

Dweck, C. S. (2006). Mindset: The New Psychology of Success. New York: Ballantine Books

Early, P. (Ed.) (1981). ELT Documents 113 – Humanistic Approaches: An Empirical View. London: The British Council

Hendrick, C. (2019). The growth mindset problem. Aeon, 11 March 2019.

Maslow, A. (1943). A Theory of Human Motivation. Psychological Review, 50: pp. 370-396.

Outes-Leon, I., Sanchez, A. & Vakis, R. (2020). The Power of Believing You Can Get Smarter : The Impact of a Growth-Mindset Intervention on Academic Achievement in Peru (English). Policy Research working paper, no. WPS 9141 Washington, D.C. : World Bank Group. http://documents.worldbank.org/curated/en/212351580740956027/The-Power-of-Believing-You-Can-Get-Smarter-The-Impact-of-a-Growth-Mindset-Intervention-on-Academic-Achievement-in-Peru

Rogers, C. R. (1969). Freedom to Learn: A View of What Education Might Become. Columbus, Ohio: Charles Merrill

Sisk, V. F., Burgoyne, A. P., Sun, J., Butler, J. L., Macnamara, B. N. (2018). To what extent and under which circumstances are growth mind-sets important to academic achievement? Two meta-analyses. Psychological Science, 29, 549–571. doi:10.1177/0956797617739704

Yeager, D.S., Hanselman, P., Walton, G.M. et al. (2019). A national experiment reveals where a growth mindset improves achievement. Nature 573, 364–369. https://doi.org/10.1038/s41586-019-1466-y