
You have probably heard of the marshmallow experiment, one of the most famous and widely cited studies in social psychology. In the experiments, led by Walter Mischel at Stanford University in 1972, pre-school children were offered a choice between an immediate small reward (such as a marshmallow) or a significantly larger reward if they could wait long enough (a few minutes) to receive it. A series of follow-up studies, beginning in 1988, found that those children who had been able to delay gratification in the original experiments had better educational achievements at school and in college than those who had less self-control.

The idea that character traits like self-control could have an important impact on educational outcomes clearly resonated with many people at the time. The studies inspired further research into what is now called socio-emotional learning, and helped to popularise many educational interventions across the world that sought to teach ‘character and resilience’ in schools. In Britain alone, £5 million was pledged for a programme in 2015 to promote what the government called ‘character work’, an initiative that saw rugby coaches being used to instil the values of respect, teamwork, enjoyment, and discipline in school children.

One person who was massively influenced by the marshmallow experiment (and who, in turn, massively influenced the character-building interventions in schools) was Angela Duckworth (Duckworth et al., 2013), who worked at Stanford between 2014 and 2015. Shortly after her studies into delay of gratification, Duckworth gave a TED talk called ‘Grit: the power of passion and perseverance’, which has now had almost 10 million views. A few years later, her book of the same title (Duckworth, 2016) was published. The book was an instant best-seller, ‘grit’ became a ‘hot topic’ in education, and, according to the editors of a special issue of The Journal for the Psychology of Language Learning (MacIntyre & Khajavy, 2021), ‘interest appears to be rapidly expanding’. Duckworth has argued that self-control and grit are different and unrelated, but a number of studies have contradicted this view (Oxford & Khajavy, 2021), and the relationship between the two is clear in Duckworth’s intellectual and publishing trajectory.

This continued and expanding interest in grit is a little surprising. In a previous (June 2020) blog post, I looked at the problems with the concept of ‘grit’, drawing on the work of Marcus Credé (2017; 2018), which questioned whether it made sense to talk about ‘grit’ as a unitary construct, and noted the difficulty of measuring ‘grit’ and the lack of evidence in support of educational interventions to promote ‘grit’ (despite the millions and millions that have been spent on them). In a more recent article, Credé and his collaborator, Michael Tynan (Credé & Tynan, 2021), double down on their criticisms, observing that ‘meta-analytic syntheses of the grit literature have shown that grit is a poor predictor of performance and success in its own right, and that it predicts success in academic and work settings far more poorly than other well-known predictors’. One of these other well-known predictors is the socio-economic status of students’ families. Credé and Tynan remain ‘deeply skeptical of the claim that grit, as a unitary construct formed by combining scores on perseverance and passion, holds much value for researchers focused on SLA—or any other domain’.

In the same journal issue as the Credé and Tynan article, Rebecca Oxford and Gholam Khajavy (2021) sound further notes of caution about work on ‘grit’. They suggest that researchers need to avoid confusing grit with other constructs like self-control – a suggestion that may be hard or impossible to follow if, in fact, these constructs are not clearly separable (as Oxford and Khajavy note). They argue, too, that much more attention needs to be paid to socio-economic contexts, and that structural barriers to achievement must be given fuller consideration if ‘grit’ is to contribute anything positive to social justice. Whether the other papers in this special issue of the Journal for the Psychology of Language Learning devoted to ‘grit’ heed the cautionary advice of Credé and Tynan, and of Oxford and Khajavy, is, I think, open to debate. Perhaps the idea of a whole issue of a journal devoted to ‘grit’ is a problematic starting point. Since there is no shortage of reasons to believe that ‘grit’ isn’t actually a ‘thing’, why take ‘grit’ as a starting point for scholarly enquiry?

It might be instructive to go back to how ‘grit’ became a ‘thing’ in the first place. It’s an approach that the contributors to the special issue of the Journal for the Psychology of Language Learning have not adopted. This brings me back to the marshmallow test. At the time that ‘grit’ was getting going, Alfie Kohn brought out a book called ‘The Myth of the Spoiled Child’ (Kohn, 2014), which included a chapter ‘Why Self-Discipline Is Overrated: A Closer Look at Grit, Marshmallows, and Control from Within’. Kohn argued that educational ideas about ‘grit’ had misrepresented the findings of the marshmallow test and its follow-up studies. He argued that the setting mattered more than individual self-control, and that the ability to defer gratification was more likely an effect of other factors than a cause of anything. His ideas were supported by some of the original researchers, including Mischel himself. Another, Yuichi Shoda, a co-author of a key paper that linked delay of gratification to SAT scores, has observed that ‘Our paper does not mention anything about interventions or policies’ – many other factors would need to be controlled for to validate the causal relationship between self-control and academic achievement (Resnick, 2018).

Interest in recent years in replicating experiments in social psychology has led to confirmation that something was seriously wrong with the follow-ups to the marshmallow experiment. Studies with larger and more representative groups of children (e.g. Watts et al., 2018) have found that the correlations between academic achievement and self-control almost vanished when factors like family background and intelligence were controlled for. Even if you can teach a child to delay gratification, it won’t necessarily lead to any benefits later on.

Self-control and ‘grit’ may or may not be different things, but one thing they clearly have in common is their correlation with socio-economic differences. It is distinctly possible that attention to ‘grit’, in language learning and in other fields, is a distraction from more pressing concerns. Pity the poor researchers who have hitched themselves to the ‘grit’ bandwagon … As Angela Duckworth has said, research into grit is itself ‘a delay of gratification test’ (Duckworth, 2013). You have to be really passionate about grit and show sustained persistence if you want to keep on publishing on the subject, despite all that we now know. She hopes ‘that as a field we follow through on our intentions to forgo more immediately rewarding temptations to instead do what is best for science in the long-run’. How about forgoing the immediately rewarding temptation of publishing yet more stuff on this topic?

References

Credé, M. (2018) What shall we do about grit? A critical review of what we know and what we don’t know. Educational Researcher, 47 (9), 606-611.

Credé, M. & Tynan, M. C. (2021) Should Language Acquisition Researchers Study “Grit”? A Cautionary Note and Some Suggestions. Journal for the Psychology of Language Learning, 3 (2), 37 – 44

Credé, M., Tynan, M. C. & Harms, P. D. (2017) Much ado about grit: A meta-analytic synthesis of the grit literature. Journal of Personality and Social Psychology, 113 (3)

Duckworth, A. L. (2013) Is It Really Self-control? A Critical Analysis of the “Marshmallow Test”. Society of Personality and Social Psychology Connections, November 10, 2013. https://spsptalks.wordpress.com/2013/11/10/is-it-really-self-control-a-critical-analysis-of-the-marshmallow-test/

Duckworth, A. L., Tsukayama, E. & Kirby, T. A. (2013) Is it really self-control? Examining the predictive power of the delay of gratification response. Personality and Social Psychology Bulletin, 39, 843-855.

Duckworth, A. (2016) Grit: the power of passion and perseverance. New York: Scribner

Kohn, A. (2014) The Myth of the Spoiled Child. Boston: Da Capo Press

MacIntyre, P. & Khajavy, G. H. (2021) Grit in Second Language Learning and Teaching: Introduction to the Special Issue. Journal for the Psychology of Language Learning, 3 (2), 1-6. http://www.jpll.org/index.php/journal/article/view/86

Oxford, R. & Khajavy, G. H. (2021) Exploring Grit: “Grit Linguistics” and Research on Domain-General Grit and L2 Grit. Journal for the Psychology of Language Learning, 3 (2), 7-35

Resnick, B. (2018) The “marshmallow test” said patience was a key to success. A new replication tells us s’more. Vox, June 6, 2018. https://www.vox.com/science-and-health/2018/6/6/17413000/marshmallow-test-replication-mischel-psychology

Watts, T.W., Duncan, G.J. & Quan, H. (2018) Revisiting the Marshmallow Test: A Conceptual Replication Investigating Links Between Early Delay of Gratification and Later Outcomes. Psychological Science 29 (7): 1159-1177.

There’s a video on YouTube from Oxford University Press in which the presenter, the author of a coursebook for primary English language learners (‘Oxford Discover’), describes an activity where students have a short time to write some sentences about a picture they have been shown. Then, working in pairs, they read aloud their partner’s sentences and award themselves points, with more points being given for sentences that others have not come up with. For lower-level young learners, it’s not a bad activity. It provides opportunities for varied skills practice of a limited kind and, if it works, may be quite fun and motivating. However, what I found interesting about the video is that it is entitled ‘How to teach critical thinking skills: speaking’ and the book that is being promoted claims to develop ‘21st Century Skills in critical thinking, communication, collaboration and creativity’. The presenter says that the activity achieves its critical thinking goals by promoting ‘both noticing and giving opinions, […] two very important critical thinking skills.’

Noticing (or observation) and giving opinions are often included in lists of critical thinking skills, but, for this to be the case, they must presumably be exercised in a critical way – some sort of reasoning must be involved. This is not the case here, so only the most uncritical understanding of critical thinking could consider this activity to have any connection to critical thinking. Whatever other benefits might accrue from it, it seems highly unlikely that the students’ ability to notice or express opinions will be developed.

Many users of the book, however, do not share my scepticism. Oxford University Press carried out a scientific-sounding ‘impact study’: this consisted of a questionnaire (n = 198) in which ‘97% of teachers reported that using Oxford Discover helps their students to improve in the full range of 21st century skills, with critical thinking and communication scoring the highest’.

Enthusiasm for critical thinking activities is extremely widespread. In 2018, TALIS, the OECD Teaching and Learning International Survey (with more than 4000 respondents), found that ‘over 80% of teachers feel confident in their ability to vary instructional strategies in their classroom and help students think critically’ and almost 60% ‘frequently or always’ ‘give students tasks that require students to think critically.’ As with the Oxford ‘impact study’, it’s worth remembering that these are self-reported figures.

This enthusiasm is shared in the world of English language teaching, reflected in at least 17 presentations at the 2021 IATEFL conference that discussed practical ideas for promoting critical thinking. These ranged from the more familiar (e.g. textual analysis in EAP) to the more original – developing critical thinking through the use of reading reaction journals, multicultural literature, fables, creative arts performances, self-expression, escape rooms, and dice games.

In most cases, it would appear that the precise nature of the critical thinking that was ostensibly being developed was left fairly vague. This vagueness is not surprising. Practically the only thing that writers about critical thinking in education can agree on is that there is no general agreement about what, precisely, critical thinking is. Lai (2011) offers an accessible summary of a range of possible meanings, but points out that, in educational contexts, its meaning is often rather vague and encompasses other concepts (such as higher order thinking skills) which also lack clarity. Paul Dummett and John Hughes (2019: 4) plump for ‘a mindset that involves thinking reflectively, rationally and reasonably’ – a vague definition which leaves unanswered two key questions: to what extent is it a skill set or a disposition? Are these skills generic or domain specific?

When ‘critical thinking’ is left undefined, it is impossible to evaluate the claims that a particular classroom activity will contribute to the development of critical thinking. However, irrespective of the definition, there are good reasons to be sceptical about the ability of educational activities to have a positive impact on the generic critical thinking skills of learners in English language classes. There can only be critical-thinking value in the activity described at the beginning of this post if learners somehow transfer the skills they practise in the activity to other domains of their lives. This is, of course, possible, but, if we approach the question with a critical disposition, we have to conclude that it is unlikely. We may continue to believe the opposite, but this would be an uncritical act of faith.

The research evidence on the efficacy of teaching generic critical thinking is not terribly encouraging (Tricot & Sweller, 2014). There’s no shortage of anecdotal support for classroom critical thinking, but ‘education researchers have spent over a century searching for, and failing to find evidence of, transfer to unrelated domains by the use of generic-cognitive skills’ (Sweller, 2022). One recent meta-analysis (Huber & Kuncel, 2016) found insufficient evidence to justify the explicit teaching of generic critical thinking skills at college level. In an earlier blog post (https://adaptivelearninginelt.wordpress.com/2020/10/16/fake-news-and-critical-thinking-in-elt/) looking at the impact of critical thinking activities on our susceptibility to fake news, I noted that research was unable to find much evidence of the value of media literacy training. When considerable time is devoted to generic critical thinking training and little or no impact is found, how likely is it that the kind of occasional, brief one-off activity in the ELT classroom will have the desired impact? Without going as far as to say that critical thinking activities in the ELT classroom have no critical-thinking value, it is uncontentious to say that we still do not know how to define critical thinking, how to assess evidence of it, or how to effectively practise and execute it (Gay & Clark, 2021).

It is ironic that there is so little critical thinking about critical thinking in the world of English language teaching, but it should not be particularly surprising. Teachers are no more immune to fads than anyone else (Fuertes-Prieto et al., 2020). Despite a complete lack of robust evidence to support them, learning styles and multiple intelligences influenced language teaching for many years. Mindfulness, growth mindsets and grit are more contemporary influences and, like critical thinking, will go the way of learning styles when the commercial and institutional forces that currently promote them find the lack of empirical supporting evidence problematic.

Critical thinking is an educational aim shared by educational authorities around the world, promoted by intergovernmental bodies like the OECD, the World Bank, the EU, and the United Nations. In Japan, for example, the ‘Ministry of Education (MEXT) puts critical thinking (CT) at the forefront of its ‘global jinzai’ (human capital for a global society) directive’ (Gay & Clark, 2021). It is taught as an academic discipline in some universities in Russia (Ivlev et al., 2021) and plans are underway to introduce it into schools in Saudi Arabia (https://www.arabnews.com/node/1764601/saudi-arabia). I suspect that it doesn’t mean quite the same thing in all these places.

Critical thinking is also an educational aim that most teachers can share. Few like to think of themselves as Gradgrinds, bashing facts into their pupils’ heads: turning children into critical thinkers is what education is supposed to be all about. It holds an intuitive appeal, and even if we (20% of teachers in the TALIS survey) lack confidence in our ability to promote critical thinking in the classroom, few of us doubt the importance of trying to do so. Like learning styles, multiple intelligences and growth mindsets, it seems possible that, with critical thinking, we are pushing the wrong thing, but for the right reasons. But just how much evidence, or lack of evidence, do we need before we start getting critical about critical thinking?

References

Dummett, P. & Hughes, J. (2019) Critical Thinking in ELT. Boston: National Geographic Learning

Fuertes-Prieto, M.Á., Andrés-Sánchez, S., Corrochano-Fernández, D. et al. (2020) Pre-service Teachers’ False Beliefs in Superstitions and Pseudosciences in Relation to Science and Technology. Science & Education 29, 1235–1254 (2020). https://doi.org/10.1007/s11191-020-00140-8

Gay, S. & Clark, G. (2021) Revisiting Critical Thinking Constructs and What This Means for ELT. Critical Thinking and Language Learning, 8 (1): pp. 110 – 147

Huber, C.R. & Kuncel, N.R. (2016) Does College Teach Critical Thinking? A Meta-Analysis. Review of Educational Research, 86 (2): 431-468. doi:10.3102/0034654315605917

Ivlev, V. Y., Pozdnyakov, M. V., Inozemtsev, V. A. & Chernyak, A. Z. (2021) Critical Thinking in the Structure of Educational Programs in Russian Universities. Advances in Social Science, Education and Humanities Research, volume 555: 121-128

Lai, E.R. 2011. Critical Thinking: A Literature Review. Pearson. http://images.pearsonassessments.com/images/tmrs/CriticalThinkingReviewFINAL.pdf

Sweller, J. (2022) Some Critical Thoughts about Critical and Creative Thinking. Sydney: The Centre for Independent Studies Analysis Paper 32

Tricot, A., & Sweller, J. (2014) Domain-specific knowledge and why teaching generic skills does not work. Educational Psychology Review, 26, 265- 283.

In May of last year, EL Gazette had a story entitled ‘Your new English language teacher is a robot’ that was accompanied by a stock photo of a humanoid robot, Pepper (built by SoftBank Robotics). The story was pure clickbait and the picture had nothing to do with it. The article actually concerned a chatbot for practising EAP (EAP Talk), currently under development at a Chinese university. There’s nothing especially new about chatbots: I last blogged about them in 2016 and interest in them, both research and practical, dates back to the 1970s (Lee et al., 2020). There’s nothing, as far as I can see, especially new about the Chinese EAP chatbot project either. The article concludes by saying that the academic behind the project ‘does not believe that AI can ever replace a human teacher’, but that chatbots might offer some useful benefits.

The benefits are, however, limited – a point that is acknowledged even by chatbot enthusiasts like Lee et al (2020). We are some way from having chatbots that we can actually have meaningful conversations with, but they do appear to have some potential as ‘intelligent tutoring systems’ to provide practice of and feedback on pre-designated bits of language (especially vocabulary and phrases). The main benefit that is usually given, as in the EL Gazette article, is that they are non-judgemental and may, therefore, be appropriate for shy or insecure learners.

Social robots, of the kind used in the illustration for the EL Gazette story, are, of course, not the same as chatbots. Chatbots, like EAP Talk, can be incorporated into all sorts of devices (notably phones, tablets and laptops) and all sorts of applications. If social robots are to be used for language learning, they will clearly need to incorporate chatbots, but in what ways could the other features of robots facilitate language acquisition? Pepper (the robot in the picture) has ‘touch sensors, LEDs and microphones for multimodal interactions’, along with ‘infrared sensors, bumpers, an inertial unit, 2D and 3D cameras, and sonars for omnidirectional and autonomous navigation’. How could these features help language acquisition?

Lee and Lee (2022) attempt to provide an answer to this question. Here’s what they have come up with:

By virtue of their physical embodiment, social robots have been suggested to provide language learners with direct and physical interactions, which is considered one of the basic ingredients for language learning. In addition, as social robots are generally humanoids or anthropomorphized animal shapes, they have been valued for their ability to serve as familiar conversational partners, having potential to lower the affective filter of language learners.

Is there any research evidence to back up these claims? The short answer is no. Motivation and engagement may sometimes be positively impacted, but we can’t say any more than that. As far as learning is concerned, Lee and Lee (2022: 121) write: ‘involving social robots led to statistically similar or even higher [English language learning] outcomes compared with traditional ELT contexts (i.e. no social robot)’. In other words, social robots did not, on the whole, have a negative impact on learning outcomes. Hardly grounds for wild enthusiasm … Still, Lee and Lee, in the next line, refer to the ‘positive effectiveness of social robots in English teaching’ before proceeding to enumerate the ways in which these robots could be used in English language learning. Doesn’t ELT Journal have editors to pick up on this kind of thing?

So, how could these robots be used? Lee and Lee suggest (for younger learners) one-on-one vocabulary tutoring, dialogue practice, more vocabulary teaching, and personalized feedback. That’s it. It’s worth noting that all of these functions could equally well be carried out by chatbots as by social robots.

Lee and Lee discuss and describe the social robot, NAO6, also built by SoftBank Robotics. It’s a smaller and cheaper cousin of the Pepper robot that illustrates the EL Gazette article. Among Lee and Lee’s reasons for using social robots is that they ‘have become more accessible due to ever-lower costs’: NAO6 costs around £350 a month to rent. Buying it outright is also an option. Eduporium (‘Empowering the future with technology’) has one on offer for $12,990.00. According to the blurb, it helps ‘teach coding, brings literature to life, enhances special education, and allows for training simulations. Plus, its educational solutions include an intuitive interface, remote learning, and various applications for accessibility!’

It’s easy enough to understand why EL Gazette uses clickbait from time to time. I’m less clear about why ELT Journal would print this kind of nonsense. According to Lee and Lee, further research into social robots ‘would initiate a new era of language learning’ in which the robots will become ‘an important addition to the ELT arsenal’. Yeah, right …

References

Lee, H. & Lee, J. H. (2022) Social robots for English language teaching. ELT Journal 76 (1): 119 – 124

Lee, J. H., Yang, H., Shin, D. & Kim, H. (2020) Chatbots. ELT Journal 74 (3): 338-344

We need to talk

Posted: December 13, 2021 in Discourse, research

In 1994, in a well-known TESOL Quarterly article entitled ‘The dysfunctions of theory/practice discourse’, Mark A. Clarke explored the imbalance in the relationship between TESOL researchers and English language teachers, and the way in which the former typically frame the latter as being less expert than themselves. In the last 25 years, the topic has regularly resurfaced, most recently with the latest issue of the Modern Language Journal, a special issue devoted entirely to ‘the Research-Practice Dialogue in Second Language Learning and Teaching’ (Sato & Loewen, 2022). At the heart of the matter is the fact that most teachers are just not terribly interested in research and rarely, if ever, read it (Borg, 2009). Much has been written on whether or not this matters, but that is not my concern here.

Sato and Loewen’s introductory article reviews the reasons behind the lack of dialogue between researchers and teachers, and, in an unintentionally comic meta move, argues that more research is needed into teachers’ lack of interest in research. This is funny because one of the reasons for the lack of dialogue is that ‘teachers have been researched too much ON and not enough WITH’ (Farrell, 2016): most research has not been carried out ‘for the teacher’s benefit and needs’, with the consequence that ‘the result is purely academic’. Sato and Loewen’s primary focus in the article is on ‘classroom teachers’, with whom they would like to see more ‘dialogue’, but, as they acknowledge, they publish in a research journal whose ‘most likely readers are researchers’. They do not appear to have read Alan Maley’s ‘‘More Research is Needed’ – A Mantra Too Far?’ (Maley, 2016). Perhaps the article (and the Humanising Language Teaching magazine it is from) passed under their radar because it’s written for teachers (not researchers), it’s free and it does not have an impact factor?

I wasn’t entirely convinced by the argument that more research about research is needed, not least because Sato and Loewen provide a fairly detailed analysis of the obstacles that exist to dialogue between researchers and teachers. They divide these into two categories:

Epistemological obstacles: the framing of researchers as generators of knowledge and teachers as consumers of knowledge; teachers’ scepticism about the relevance of some research findings to real-world teaching situations; the different discourse communities inhabited by researchers and teachers, as evidenced by the academic language choices of the former.

Practical obstacles: institutional expectations for researchers to publish in academic journals and a lack of time for researchers to engage in dialogue with teachers; teachers’ lack of time and lack of access to research.

Nothing new here, and nothing contentious either. Nor is there anything new in their argument that more dialogue between researchers and teachers would be of mutual benefit. They acknowledge that ‘In the current status of the research-practice relationship, it is researchers who are concerned about transferability of their findings to classrooms. Practitioners may not have burning motivation or urgent needs to reach out to researchers’. Consequently, it is researchers who should ‘shoulder the lion’s share of responsibility in this initiative’. This implies that, while the benefit could be mutual, it is not mutually proportionate, since researchers have both more to lose and more to gain.

They argue that it would be helpful to scrutinize closely the relationship between researchers and teachers (they prefer to use the word ‘practitioners’) and that researchers need to reflect on their own beliefs and practices, in particular the way that researchers are stake-holders in the research-practice relationship. I was disappointed that they didn’t go into more detail here and would like to suggest one angle of the ‘cui bono’ question worth exploring. The work of TESOL researchers is mostly funded by TESOL teaching. It is funded, in other words, by selling a commodity – TESOL – to a group of consumers … who are teachers. If we frame researchers as vendors and teachers as (potential) clients, [1] a rather different light is shone on pleas for more dialogue.

The first step, Sato and Loewen claim, towards achieving such a dialogue would be ‘nurturing a collaborative mindset in both researchers and teachers’. And the last of four steps to removing the obstacles to dialogue would be ‘institutional support’ for both teachers and researchers. But without institutional support, mindsets are unlikely to become more collaborative, and the suggestions for institutional support (e.g. time release and financial support for teachers) are just pie in the sky. Perhaps sensing this, Sato and Loewen conclude the article by asking whether their desire to see a more collaborative mindset (and, therefore, more dialogue) is just a dream. Back in 1994, Mark Clarke had this to say:

The only real solution to the problems I have identified would be to turn the hierarchy on its head, putting teachers on the top and arraying others – pundits, professors, administrators, researchers, and so forth – below them. This would require a major change in our thinking and in our behavior and, however reasonable it may appear to be, I do not see this happening. (Clarke, 1994: 18)

In 2017, ELT Journal published an entertaining piece of trolling by Péter Medgyes, ‘The (ir)relevance of academic research for the language teacher’, in which he refers to the expert status of researchers as related to the ‘orthodox and mistaken belief that by virtue of having churned out tons of academic papers and books, they must be performing an essential service for language education’. It is not hard to imagine the twinkle in his eye as he wrote it. In the same volume, Amos Paran (2017) picks up the bait, arguing for more dialogue between researchers and teachers. In response to Paran’s plea, Medgyes points out that there is an irony in preaching the importance of dialogue in a top-down manner. ‘As long as the playing field is uneven, it is absurd to talk about dialogue, if a dialogue is at all necessary’, he writes. The same holds true for Sato and Loewen. They acknowledge (Sato & Loewen, 2018) that ‘researchers’ top-down attitudes will not facilitate the dialogue’, but, try as they might, their own mindset is seemingly inescapable. In one article that was attempting to reach out to teachers (Sato, Loewen & Kim, 2021), they managed to make one teacher trainer, Sandy Millin, feel that teachers were being unfairly attacked.

The phrase ‘We need to talk’ has been described as, perhaps, the most dreaded four words in the English language. When you hear it, you know (1) that someone wants to talk to you (and not the other way round), (2) that, whether you want to talk or not, the other person will have their say, (3) that the discussion will almost certainly involve some criticism of you, and this may be merited, and (4) whatever happens next, it is unlikely that your relationship will improve.

References

Borg, S. (2009). English language teachers’ conceptions of research. Applied Linguistics, 30 (3): 358 – 88

Clarke, M. (1994). The dysfunctions of theory/practice discourse. TESOL Quarterly, 28: 9-26.

Farrell, T. (2016). Reflection, reflection, reflection. Responses to the Chapter:  More Research is Needed – A Mantra Too Far? Humanising Language Teaching, 18 (3) http://old.hltmag.co.uk/jun16/mart.htm

Maley, A. (2016). ‘More Research is Needed’ – A Mantra Too Far? Humanising Language Teaching, 18 (3) http://old.hltmag.co.uk/jun16/mart.htm

Medgyes, P. (2017). The (ir)relevance of academic research for the language teacher. ELT Journal, 71 (4): 491–498

Paran, A. (2017). ‘Only connect’: researchers and teachers in dialogue. ELT Journal, 71 (4): 499 – 508

Sato, M., & Loewen, S. (2022). The research-practice dialogue in second language learning and teaching: Past, present, and future. The Modern Language Journal, 106 (3)

Sato, M. & Loewen, S. (Eds.) (2019) Evidence-Based Second Language Pedagogy. New York: Routledge

Sato, M. & Loewen, S. (2018). Do teachers care about research? The research–pedagogy dialogue. ELT Journal 73 (1): 1 – 10

Sato, M., Loewen, S. & Kim, Y. J. (2021) The role and value of researchers for teachers: five principles for mutual benefit. TESOL AL Forum September 2021. http://newsmanager.commpartners.com/tesolalis/issues/2021-08-30/email.html#4


[1] I am actually a direct customer of Sato and Loewen, having bought a copy of their edited volume ‘Evidence-Based Second Language Pedagogy’ (Sato & Loewen, 2019) for £35 last year. According to the back cover, it is a ‘cutting-edge collection of empirical research [which closes] the gap between research and practice’. In reality, it’s a fairly random collection of articles of very mixed quality, many of which are co-authored by ‘top scholars’ and the PhD students they are supervising. It does nothing to close any gaps between research and practice and I struggle to see how it could be of any conceivable benefit to teachers.

Five years ago, in 2016, there was an interesting debate in the pages of the journal ‘Psychological Review’. It began with an article by Jeffrey Bowers (2016a), a psychologist at the University of Bristol, who argued that neuroscience (as opposed to psychology) has little or nothing to offer us in terms of improving classroom instruction, and is unlikely ever to be able to do so. He wasn’t the first to question the relevance of neuroscience to education (see, for example, Willingham, 2009), but this was a full-frontal attack. Bowers argued that ‘neuroscience rarely offers insights into instruction above and beyond psychology’ and that neuroscientific evidence that the brain changes in response to instruction is irrelevant. His article was followed by two counter-arguments (Gabrieli, 2016; Howard-Jones et al., 2016), which took him to task for too narrowly limiting the scope of education to classroom instruction (neglecting, for example, educational policy), for ignoring the predictive power of neuroimaging on neurodevelopmental differences (and, therefore, its potential value in individualising curricula), and for failing to take account of the progress that neuroscience, in collaboration with educators, has already made. Bowers’ main argument, that educational neuroscience had little to tell us about teaching, was not really addressed in the counter-arguments, and Bowers (2016b) came back with a counter-counter-rebuttal.

[Image caption: The brain responding to seductive details]

In some ways, the debate, like so many of its kind, suffered from the different priorities of the participants. For Gabrieli and Howard-Jones et al., Bowers had certainly overstated his case, but they weren’t entirely in disagreement with him. Paul Howard-Jones has been quoted by André Hedlund as saying that ‘all neuroscience can do is confirm what we’ve been doing all along and give us new insights into a couple of new things’. One of Howard-Jones’ co-authors, Usha Goswami, director of the Centre for Neuroscience in Education at the University of Cambridge, has said that ‘there is a gulf between current science and classroom applications’ (Goswami, 2006).

For teachers, though, it is the classroom applications that are of interest. Claims for the relevance of neuroscience to ELT have been made by many. ‘We [in ESL / EFL] need it’, writes Curtis Kelly (2017). Insights from neuroscience can, apparently, make textbooks more ‘brain friendly’ (Helgesen & Kelly, 2015). Herbert Puchta’s books are advertised by Cambridge University Press as ‘based on the latest insights into how the brain works fresh from the field of neuroscience’. You can watch a British Council talk by Rachael Roberts entitled ‘Using your brain: what neuroscience can teach us about learning’. And, in the year following the Bowers debate, Carol Lethaby and Patricia Harries gave a presentation at IATEFL Glasgow (Lethaby & Harries, 2018) entitled ‘Research and teaching: What has neuroscience ever done for us?’ – a title that I have lifted for this blog post. Lethaby and Harries provide a useful short summary of the relevance of neuroscience to ELT, and I will begin my discussion with that. They expand on this in their recent book (Lethaby, Mayne & Harries, 2021), a book I highly recommend.

So what, precisely, does neuroscience have to tell English language teachers? Lethaby and Harries put forward three main arguments. Firstly, neuroscience can help us to bust neuromyths (the examples they give are right / left brain dominance and learning styles). Secondly, it can provide information that informs teaching (the examples given are the importance of prior knowledge and the value of translation). Finally, it can validate existing best practice (the example given is the importance of prior knowledge). Let’s take a closer look.

I have always enjoyed a bit of neuromyth busting and I wrote about ‘Left brains and right brains in English language teaching’ a long time ago. It is certainly true that neuroscience has helped to dispel this myth: it is ‘simplistic at best and utter hogwash at worst’ (Dörnyei, 2009: 49). However, we did not need neuroscience to rubbish the practical teaching applications of this myth, which found their most common expression in Neuro-Linguistic Programming (NLP) and Brain Gym. Neuroscience simply banged the final nail into the coffin of these trends. The same is true for learning styles and the meshing hypothesis. It’s also worth noting that, despite the neuroscientific evidence, such myths are taking a long time to die … a point I will return to at the end of this post.

Lethaby and Harries’s second and third arguments are essentially the same, unless, in their second point, they are arguing that neuroscience can provide new information. I struggle, however, to see anything that is new. Neuroimaging apparently shows that the medial prefrontal cortex is activated when prior knowledge is accessed, but we have long known (since Vygotsky, at least!) that effective learning builds on previous knowledge. Similarly, the amygdala (known to be associated with the processing of emotions) may play an important role in learning, but we don’t need to know about the amygdala to understand the role of affect in learning. Lastly, the neuroscientific finding that different languages are not ‘stored’ in separate parts of the brain (Marian, Spivey & Hirsch, 2003) is useful to substantiate arguments that translation can have a positive role to play in learning another language, but convincing arguments predate findings such as these by many, many years. This would all seem to back up Howard-Jones’s observation about confirming what we’ve been doing and giving us new insights into a couple of new things. It isn’t the most compelling case for the relevance of neuroscience to ELT.

Chapter 2 of Carol Lethaby’s new book, ‘An Introduction to Evidence-based Teaching in the English Language Classroom’ is devoted to ‘Science and neuroscience’. The next chapter is called ‘Psychology and cognitive science’ and practically all the evidence for language teaching approaches in the rest of the book is drawn from cognitive (rather than neuro-) science. I think the same is true for the work of Kelly, Helgesen, Roberts and Puchta that I mentioned earlier.

It is perhaps the case these days that educationalists prefer to refer to ‘Mind, Brain, and Education Science’ (MBE) – the ‘intersection of neuroscience, education, and psychology’ – rather than educational neuroscience, but, looking at the literature of MBE, there’s a lot more education and psychology than there is neuroscience (although the latter always gets a mention). Probably the most comprehensive and well-known volume of practical ideas deriving from MBE is ‘Making Classrooms Better’ (Tokuhama-Espinosa, 2014). Of the 50 practical applications listed, most are either inspired by the work of John Hattie (2009) or the work of cognitive psychologists. Neuroscience hardly gets a look in.

To wrap up, I’d like to return to the question of neuroscience’s role in busting neuromyths. References to neuroscience, especially when accompanied by fMRI images, have a seductive appeal to many: they confer a sense of ‘scientific’ authority (McCabe & Castel, 2008). Many teachers, it seems, are keen to hear about neuroscience (Pickering & Howard-Jones, 2007). Even when the discourse contains irrelevant neuroscientific information (diagrams of myelination come to mind), it seems that many of us find this satisfying – the so-called ‘seductive details’ effect (Weisberg et al., 2015; Weisberg et al., 2008) – and it gives us an illusion of explanatory depth (Rozenblit & Keil, 2002). You are far more likely to see conference presentations, blog posts and magazine articles extolling the virtues of neuroscientific findings than you are to come across things like what I am writing here. But is it possible that the much-touted idea that neuroscience can bust neuromyths is itself a myth?

Sadly, we have learnt in recent times that scientific explanations have only very limited impact on the beliefs of large swathes of the population (including teachers, of course). Think of climate change and COVID. Why should neuroscience be any different? It probably isn’t. Scurich & Shniderman (2014) found that ‘neuroscience is more likely to be accepted and credited when it confirms prior beliefs’. We are more likely to accept neuroscientific findings because we ‘find them intuitively satisfying, not because they are accurate’ (Weisberg, et al. 2008). Teaching teachers about educational neuroscience may not make much, if any, difference (Tham et al., 2019). I think there is a danger in using educational neuroscience, seductive details and all, to validate what we already do (as opposed to questioning what we do). And for those who don’t already do these things, they’ll probably ignore such findings as there are, anyway.

References

Bowers, J. (2016a) The practical and principled problems with educational Neuroscience. Psychological Review 123 (5) 600 – 612

Bowers, J.S. (2016b) Psychology, not educational neuroscience, is the way forward for improving educational outcomes for all children: Reply to Gabrieli (2016) and Howard-Jones et al. (2016). Psychological Review. 123 (5):628-35.

Dörnyei, Z. (2009) The Psychology of Second Language Acquisition. Oxford: Oxford University Press

Gabrieli, J.D. (2016) The promise of educational neuroscience: Comment on Bowers (2016). Psychological Review. 123 (5):613-9

Goswami, U. (2006). Neuroscience and education: From research to practice? Nature Reviews Neuroscience, 7: 406-413

Hattie, J. (2009) Visible Learning: A synthesis of over 800 meta-analyses relating to achievement. London: Routledge

Helgesen, M. & Kelly, C. (2015) Do-it-yourself: Ways to make your textbook more brain-friendly’ SPELT Quarterly, 30 (3): 32 – 37

Howard-Jones, P.A., Varma. S., Ansari, D., Butterworth, B., De Smedt, B., Goswami, U., Laurillard, D. & Thomas, M. S. (2016) The principles and practices of educational neuroscience: Comment on Bowers (2016). Psychological Review. 123 (5):620-7

Kelly, C. (2017) The Brain Studies Boom: Using Neuroscience in ESL/EFL Teacher Training. In Gregersen, T. S. & MacIntyre, P. D. (Eds.) Innovative Practices in Language Teacher Education pp.79-99 Springer

Lethaby, C. & Harries, P. (2018) Research and teaching: What has neuroscience ever done for us?’ in Pattison, T. (Ed.) IATEFL Glasgow Conference Selections 2017. Faversham, Kent, UK: IATEFL  p. 36- 37

Lethaby, C., Mayne, R. & Harries, P. (2021) An Introduction to Evidence-Based Teaching in the English Language Classroom. Shoreham-by-Sea: Pavilion Publishing

McCabe, D.P. & Castel, A.D. (2008) Seeing is believing: The effect of brain images on judgments of scientific reasoning. Cognition 107: 343–352.

Pickering, S. J. & Howard-Jones, P. (2007) Educators’ views on the role of neuroscience in education: findings from a study of UK and international perspectives. Mind Brain Education 1: 109–113.

Rozenblit, L., & Keil, F. (2002). The misunderstood limits of folk science: an illusion of explanatory depth. Cognitive science, 26(5), 521–562.

Scurich, N., & Shniderman, A. (2014) The selective allure of neuroscientific explanations. PLOS One, 9 (9), e107529. http://dx.doi.org/10.1371/journal.pone.0107529

Marian, V., Spivey, M. & Hirsch, J. (2003) Shared and separate systems in bilingual language processing: Converging evidence from eyetracking and brain imaging. Brain and Language, 86: 70-82

Tham, R., Walker, Z., Tan, S.H.D., Low, L.T. & Annabel Chan, S.H. (2019) Translating educational neuroscience for teachers. Learning: Research and Practice, 5 (2): 149-173 Singapore: National Institute of Education

Tokuhama-Espinosa, T. (2014) Making Classrooms Better. New York: Norton

Weisberg, D. S., Taylor, J. C. V. & Hopkins, E.J. (2015) Deconstructing the seductive allure of neuroscience explanations. Judgment and Decision Making, Vol. 10, No. 5, September 2015, pp. 429–441

Weisberg, D. S., Keil, F. C., Goodstein, J., Rawson, E., & Gray, J. R. (2008). The seductive allure of neuroscience explanations. Journal of cognitive neuroscience, 20 (3): 470–477.

Willingham, D. T. (2009). Three problems in the marriage of neuroscience and education. Cortex, 45: 54-55.

I’ve written about mindset before (here), but a recent publication caught my eye, and I thought it was worth sharing.

Earlier this year, the OECD produced a report on its 2018 PISA assessments. This was significant because it was the first time that the OECD had attempted to measure mindsets and correlate them to academic achievements. Surveying some 600,000 15-year-old students in 78 countries and economies, it is, to date, the biggest, most global attempt to study the question. Before going any further, a caveat is in order. The main focus of PISA 2018 was on reading, so any correlations that are found between mindsets and achievement can only be interpreted in the context of gains in reading skills. This is important to bear in mind, as previous research into mindsets indicates that mindsets may have different impacts on different school subjects.

There has been much debate about how best to measure mindsets and, indeed, whether they can be measured at all. The OECD approached the question by asking students to respond to the statement ‘Your intelligence is something about you that you can’t change very much’ by choosing “strongly disagree”, “disagree”, “agree”, or “strongly agree”. Disagreeing with the statement was considered a precursor of a growth mindset, as it is more likely that someone who thinks intelligence can change will challenge him/herself to improve it. Across the sample, almost two-thirds of students showed a growth mindset, but there were big differences between countries, with students in Estonia, Denmark, and Germany being much more growth-oriented than those in Greece, Mexico or Poland (among OECD countries) and the Philippines, Panama, Indonesia or Kosovo (among the non-OECD countries). In line with previous research, students from socio-economically advantaged backgrounds presented a growth mindset more often than those from socio-economically disadvantaged backgrounds.

I have my problems with the research methodology. A 15-year-old from a wealthy country is much more likely than peers in other countries to have experienced mindset interventions in school: motivational we-can-do-it posters, workshops on neuroplasticity, biographical explorations of success stories and the like. In some places, some students have been so exposed to this kind of thing that school leaders have realised that growth mindset interventions should be much more subtle, avoiding the kind of crude, explicit proselytising that simply makes many students roll their eyes. In contexts such as these, most students now know what they are supposed to believe concerning the malleability of intelligence, irrespective of what they actually believe. Asking them, in a formal context, to respond to statements which are obviously digging at mindsets is therefore an invitation to provide what they know is the ‘correct response’. Others, who have not been so fortunate in receiving mindset training, are less likely to know the correct answer. The research results, then, probably tell us as much about educational practices as they do about mindsets. There are other issues with the chosen measurement tool, discussed in the report, including acquiescence bias and the fact that the cognitive load required by the question increases the likelihood of a random response. Still, let’s move on.

The report found that growth mindsets correlated with academic achievement in some (typically wealthier) countries, but not in others. The report wisely cautions that the findings do not establish cause-and-effect relations, since a growth mindset may, to some extent, be the result of academic success rather than its cause. As the report observes, students performing well may attribute their success to internal characteristics of effort and perseverance, while those performing poorly may attribute their results to immutable characteristics in order to preserve their self-esteem.

However, the report does list the ways in which a growth mindset can lead to better achievement: valuing school more, setting more ambitious learning goals, higher levels of self-efficacy, higher levels of motivation and lower levels of fear of failure. The logic here is circular: these are the attributes typically used to characterise a growth mindset, but are they the results of a growth mindset or simply its constituent parts? Incidentally, they were measured in the same way as mindset itself, by asking students to respond to statements like “I find satisfaction in working as hard as I can” or “My goal is to learn as much as possible”. The questions are so loaded that we need to be very sceptical about the meaning of the results. The concluding remarks to this section of the report clearly indicate the bias of the research: the question that is asked is not “Can a growth mindset lead to better results?” but “How can a growth mindset lead to better results?”

Astonishingly, the research did not investigate the impact of growth mindset interventions in schools on growth mindset. Perhaps, this is too hard to do in any reliable way. After all, what counts as a growth mindset intervention? A little homily from the teacher about how we can all learn from our mistakes or some nice posters on the walls? Or a more full-blooded workshop about neural plasticity with follow-up tasks? Instead, the research investigated more general teaching practices. The results were interesting. The greatest impacts on growth mindset come when students perceive their teachers as being supportive in a safe learning environment, and when teachers adapt their teaching to the needs of the class, as opposed to simply following a fixed syllabus. The findings about teacher feedback were less clear: “Whether teacher feedback influences students’ growth mindset development or the other way around, further research is required to investigate this relationship, and why it could differ according to students’ proficiency in reading”.

The final chapter of this report does not include any references to data from the PISA 2018 exercise. Instead, it repeats, in a very selective way, previous research findings such as:

  • Growth mindset interventions yield modest average treatment effects, but larger effects for specific subgroups.
  • Growth-mindset interventions fare well in both scalability and cost-effectiveness dimensions.

It ignores any discussion about whether we should be bothering with growth mindsets at all. It tells us something we already know (about the importance of teacher support and adapting teaching to the needs of the class), but somehow concludes that “growth mindset interventions […] can be cost-effective ways to raise students’ outcomes on a large scale”. It is, to my mind, a classic example of ‘research’ that is looking to prove a point rather than critically investigate a phenomenon. In that sense, it is the very opposite of science.

OECD (2021) Sky’s the Limit: Growth Mindset, Students, and Schools in PISA. https://www.oecd.org/pisa/growth-mindset.pdf

Since no single definition of critical thinking prevails (Dummett & Hughes, 2019: 2), discussions of the topic invariably begin with attempts to provide a definition. Lai (2011) offers an accessible summary of a range of possible meanings, but points out that, in educational contexts, its meaning is often rather vague and encompasses other concepts (such as higher order thinking skills) which also lack clarity. Paul Dummett and John Hughes (2019: 4) plump for ‘a mindset that involves thinking reflectively, rationally and reasonably’ – a definition which involves a vague noun (that could mean a fixed state of mind, a learned attitude, a disposition or a mood) and three highly subjective adverbs. I don’t think I could do any better. However, instead of looking for a definition, we can reach a sort of understanding by looking at examples of it. Dummett and Hughes’ book is extremely rich in practical examples, and the picture that emerges of critical thinking is complex and multifaceted.

As you might expect of a weasel word like ‘critical thinking’, there appears to be general agreement that it’s a ‘good thing’. Paul Dummett suggests that there are two common reasons for promoting the inclusion of critical thinking activities in the language classroom. The first of these is a desire to get students thinking for themselves. The second is the idea ‘that we live in an age of misinformation in which only the critically minded can avoid manipulation or slavish conformity’. Neither seems contentious at first glance, although he points out that ‘they tend to lead to a narrow application of critical thinking in ELT materials: that is to say, the analysis of texts and evaluation of the ideas expressed in them’. It’s the second of these rationales that I’d like to explore further.

Penny Ur (2020: 9) offers a more extended version of it:

The role of critical thinking in education has become more central in the 21st century, simply because there is far more information readily available to today’s students than there was in previous centuries (mainly, but not only, online), and it is vital for them to be able to deal with such input wisely. They need to be able to distinguish between what is important and what is trivial, between truth and lies, between fact and opinion, between logical argument and specious propaganda […] Without such skills and awareness of the need to exercise them, they are liable to find themselves victims of commercial or political interests, their thinking manipulated by persuasion disguised as information.

In the same edited collection Olja Milosevic (2020:18) echoes Ur’s argument:

Critical thinking becomes even more important as communication increasingly moves online. Students find an overwhelming amount of information and need to be taught how to evaluate its relevance, accuracy and quality. If teachers do not teach students how to go beyond surface meaning, students cannot be expected to practise it.

In the passages I’ve quoted, these writers are referring to one particular kind of critical thinking. The ability to critically evaluate the reliability, accuracy, etc of a text is generally considered to be a part of what is usually called ‘media information literacy’. In these times of fake news, so the argument goes, it is vital for students to develop (with their teachers’ help) the necessary skills to spot fake news when they see it. The most prototypical critical thinking activity in ELT classrooms is probably one in which students analyse some fake news, such as the website about the Pacific Tree Octopus (which is the basis of a lesson in Dudeney et al., 2013: 198 – 203).

Before considering media information literacy in more detail, it’s worth noting in passing that a rationale for critical thinking activities is no rationale at all if it only concerns one aspect of critical thinking: it applies the attributes of a part (media information literacy) to the whole (critical thinking).

There is no shortage of good (free) material available for dealing with fake news in the ELT classroom. Examples include work by James Taylor, Chia Suan Chong and Tyson Seburn. Material of this kind may result in lively, interesting, cognitively challenging, communicative and, therefore, useful lessons. But how likely is it that material of this kind will develop learners’ media information literacy and, by extension therefore, their critical thinking skills? How likely is it that teaching material of this kind will help people identify (and reject) fake news? Is it possible that material of this kind is valuable despite its rationale, rather than because of it? In the spirit of rational, reflective and reasonable thinking, these are questions that seem to be worth exploring.

ELT classes and fake news

James Taylor has suggested that the English language classroom is ‘the perfect venue for [critical thinking] skills to be developed’. Although academic English courses necessarily involve elements of critical thinking, I’m not so sure that media information literacy (and, specifically, the identification of fake news) can be adequately addressed in general English classes. There are so many areas, besides those that are specifically language-focussed, competing for space in language classes (think of all those other 21st century skills), that it is hard to see how sufficient time can be found for real development of this skill. It requires modelling, practice of the skill, feedback on the practice, and more practice (Mulnix, 2010): it needs time. Fake news activities in the language classroom would, of course, be of greater value if they were part of an integrated approach across the curriculum. Unfortunately, this is rarely the case.

Information literacy skills

Training materials for media information literacy usually involve a number of stages. These include things like fact-checking and triangulation of different sources, consideration of web address, analysis of images, other items on the site, source citation and so on. The problem, however, is that news-fakers have become so good at what they do. The tree octopus site is very crude in comparison to what can be produced nowadays by people who have learnt to profit from the online economy of misinformation. Facebook employs an army of algorithmic and human fact-checkers, but still struggles. The bottom line is that background knowledge is needed (this is as true for media information literacy as it is for critical thinking more generally) (Willingham, 2007). With news, the scope of domain knowledge is so vast that it is extremely hard to transfer one’s ability to critically evaluate one particular piece of news to another. We are all fooled from time to time.

Media information literacy interventions: research on effectiveness

With the onset of COVID-19, the ability to identify fake news has become, more than ever, a matter of life and death. There is little question that this ability correlates strongly with analytic thinking (see, for example, Stanley et al., 2020). What is much less clear is how we can go about promoting analytic thinking. Analytic thinking comes in different varieties, and another hot-off-the-press research study into susceptibility to COVID-19 fake news (Roozenbeek et al., 2020) has found that the ability to spot fake news may correlate more strongly with numerical literacy than with reasoning ability. In fact, the research team found that a lack of numerical literacy was the most consistent predictor of susceptibility to misinformation about COVID-19. Perhaps we are attempting to develop the wrong kind of analytic thinking?

In educational contexts, attempts to promote media information literacy typically seek to develop reasoning abilities, and the evidence for their effectiveness is mixed. First of all, it needs to be said that ‘little large-scale evidence exists on the effectiveness of promoting digital media literacy as a response to online misinformation’ (Guess et al., 2020). An early meta-analysis (Jeong et al., 2012) found that such interventions had a positive effect when they were sustained (not one-off), but that they impacted more on students’ knowledge than on their behaviour. More recently, Huguet et al. (2019) were unable to draw ‘definitive conclusions from past research, such as what kinds of media literacy practices work and under what conditions’. And this year, a study by Guess et al. (2020) did not generate sufficient evidence ‘to conclude that the [media information literacy] intervention changed real-world consumption of false news’. I am unaware of any robust research in this area in the context of ELT.

It’s all rather disappointing. Why are we not better at it? After all, teachers of media studies have been exploring pathways for many years now. One possible answer is this: Media information literacy, like critical thinking more generally, is a skill that is acquirable, but it can only be acquired if there is a disposition to do so. The ability to think critically and the disposition to do so are separate entities (Facione, 2000). Training learners to be more critical in their approach to media information may be so much pissing in the wind if the disposition to be sceptical is not there. Shaping dispositions is a much harder task than training skills.

Both of the research studies into susceptibility to COVID-19 misinformation that I referred to earlier in this section underscore the significance of dispositions to analytic thinking. Roozenbeek et al (2020) found, in line with much previous research (for example, Jost et al. 2018), that political conservatism is associated with a slightly higher susceptibility to misinformation. Political views (on either side of the political spectrum) rarely change as a result of exposure to science or reasoned thinking. They also found that ‘self-identifying as a member of a minority predicts susceptibility to misinformation about the virus in all countries surveyed’ (except, interestingly, in the UK). Again, when issues of identity are at stake, emotional responses tend to trump rational ones.

Rational, reflective and reasonable thinking about media information literacy leads to an uncomfortable red-pill rabbit-hole. This is how Bulger and Davidson (2018) put it:

The extent to which media literacy can combat the problematic news environment is an open question. Is denying the existence of climate change a media literacy problem? Is believing that a presidential candidate was running a sex-trafficking ring out of a pizza shop a media literacy problem? Can media literacy combat the intentionally opaque systems of serving news on social media platforms? Or intentional campaigns of disinformation?

Teachers and fake news

The assumption that the critical thinking skills of young people can be developed through the intervention of their teachers is rarely problematized. It should be. A recent study of Spanish pre-service teachers (Fuertes-Prieto et al., 2020) showed that their ‘level of belief in pseudoscientific issues is comparable, or even higher in some cases to those of the general population’. There is no reason to believe that this changes after they have qualified. Teachers are probably no more likely to change their beliefs when presented with empirical evidence (Menz et al., 2020) than people from any other profession. Research has tended to focus on teachers’ lack of critical thinking in areas related to their work, but things may be no different in the wider world. It is estimated that over a quarter of teachers in the US voted for the world’s greatest peddler of fake news in the 2016 presidential election.

It is also interesting to note that the sharing of fake news on social media is much more widespread among older people (including US teachers, whose average age is 42.4) than among those under 30 (Bouygues, 2019).

Institutional contexts and fake news

Cory Doctorow has suggested that the fake news problem is not a problem of identifying what is true and what is fake, but a problem ‘about how we know whether something is true. We’re not disagreeing about facts, we’re disagreeing about epistemology’. In a post-modernist world of ‘Truth Decay’ (Kavanagh & Rich, 2018), where there is ‘a blurring of the line between opinion and fact’, epistemological authority is a rare commodity. Medicine, the social sciences and applied linguistics are all currently experiencing a ‘replication crisis’ (Ioannidis, 2005), and a British education minister has told us that people in this country have ‘had enough of experts’.

News reporting has always relied to some extent on trust in the reliability of the news source. The BBC or CNN might attempt to present themselves as more objective than, say, Fox News or InfoWars, but trust in all news outlets has collapsed globally in recent years. As Michael Schudson has written in the Columbia Journalism Review, ‘all news outlets write from a set of values, not simply from a disinterested effort at truth’. If a particular news channel manifestly holds different values from your own, it is easy to reject the veracity of the news it reports. Believers in COVID conspiracy theories often hold their views precisely because they reject the epistemological authority of mainstream news, the WHO, and the governments that support lockdown measures.

The training of media information literacy in schools is difficult because, for many people in the US (and elsewhere), education is not dissimilar to mainstream media. They ‘are seen as the enemy — two institutions who are trying to have power over how people think. Two institutions that are trying to assert authority over epistemology’ (boyd, 2018). Schools have always been characterized by imbalances in power (between students and teachers / administrators), and this power dynamic is not conducive to open-minded enquiry. Children are often more aware of the power of their teachers than they are accepting of their epistemological authority. They are enjoined to be critical thinkers, but only about certain things and only up to a certain point. One way for children to redress the power imbalance is to reject the epistemological authority of their teachers. I think this may explain why a group of young children I observed recently coming out of a lesson devoted to environmental issues found such pleasure in joking about Greta ‘Thunfisch’.

Power relationships in schools are reflected and enacted in the interaction patterns between teachers and students. The most common of these is ‘initiation-response-feedback (IRF)’ and it is unlikely that this is particularly conducive to rational, reflective and reasonable thinking. At the same time, as Richard Paul, one of the early advocates of critical thinking in schools, noted, much learning activity is characterised by lower order thinking skills, especially memorization (Paul, 1992: 22). With this kind of backdrop, training in media information literacy is more likely to be effective if it goes beyond the inclusion of a few ‘fake news’ exercises: a transformation in the way that the teaching is done will also be needed. Benesch (1999) describes this as a more ‘dialogic’ approach and there is some evidence that a more dialogic approach can have a positive impact on students’ dispositions (e.g. Hajhosseiny, 2012).

I think that David Buckingham (2019a) captures the educational problem very neatly:

There’s a danger here of assuming that we are dealing with a rational process – or at least one that can, by some pedagogical means, be made rational. But from an educational perspective, we surely have to begin with the question of why people might believe apparently ‘fake’ news in the first place. Where we decide to place our trust is as much to do with fantasy, emotion and desire, as with rational calculation. All of us are inclined to believe what we want to believe.

Fake news: a problem or a symptom of a problem?

There has always been fake news. The big problem now is ‘the speed and ease of its dissemination, and it exists primarily because today’s digital capitalism makes it extremely profitable – look at Google and Facebook – to produce and circulate false but click-worthy narratives’ (Morozov, 2017). Fake news taps into and amplifies broader tendencies and divides in society: the problem is not straightforward and is unlikely to be easy to eradicate (Buckingham, 2019a: 3).

There is increasing discussion of media regulation, and the recent banning by Facebook of Holocaust denial and QAnon is a recognition that some regulation can no longer be avoided. But strict regulations would threaten the ‘basic business model, and the enormous profitability’ of social media companies (Buckingham, 2019b), and there are real practical and ethical problems in working out exactly how regulation would happen. Governments do not know what to do.

Lacking any obvious alternative, media information literacy is often seen as the solution: can’t we ‘fact check and moderate our way out of this conundrum’ (boyd, 2018)? danah boyd’s stark response is: no, this will fail. It’s an inadequate solution to an oversimplified problem (Buckingham, 2019a).

Like boyd and Buckingham, I am not arguing that we should drop media information literacy activities from educational (including ELT) programmes. Quite the opposite. But if we want our students to think reflectively, rationally and reasonably, I think we will need to start by doing the same.

References

Benesch, S. (1999). Thinking critically, thinking dialogically. TESOL Quarterly, 33: pp. 573 – 580

Bouygues, H. L. (2019). Fighting Fake News: Lessons From The Information Wars. Reboot Foundation https://reboot-foundation.org/fighting-fake-news/

boyd, d. (2018). You Think You Want Media Literacy… Do You? Data and Society: Points https://points.datasociety.net/you-think-you-want-media-literacy-do-you-7cad6af18ec2

Buckingham, D. (2019a). Teaching Media in a ‘Post-Truth’ Age: Fake News, Media Bias and the Challenge for Media Literacy Education. Cultura y Educación 31(2): pp. 1-19

Buckingham, D. (2019b). Rethinking digital literacy: Media education in the age of digital capitalism. https://ddbuckingham.files.wordpress.com/2019/12/media-education-in-digital-capitalism.pdf

Bulger, M. & Davidson, P. (2018). The Promises, Challenges and Futures of Media Literacy. Data and Society. https://datasociety.net/pubs/oh/DataAndSociety_Media_Literacy_2018.pdf

Doctorow, C. (2017). Three kinds of propaganda, and what to do about them. boingboing 25th February 2017, https://boingboing.net/2017/02/25/counternarratives-not-fact-che.html

Dudeney, G., Hockly, N. & Pegrum, M. (2013). Digital Literacies. Harlow: Pearson Education

Dummett, P. & Hughes, J. (2019). Critical Thinking in ELT. Boston: National Geographic Learning

Facione, P. A. (2000). The disposition toward critical thinking: Its character, measurement, and relation to critical thinking skill. Informal Logic, 20(1), 61–84.

Fuertes-Prieto, M.Á., Andrés-Sánchez, S., Corrochano-Fernández, D. et al. (2020). Pre-service Teachers’ False Beliefs in Superstitions and Pseudosciences in Relation to Science and Technology. Science & Education, 29: pp. 1235 – 1254. https://doi.org/10.1007/s11191-020-00140-8

Guess, A. M., Lerner, M., Lyons, B., Montgomery, J. M., Nyhan, B., Reifler, J. & Sircar, N. (2020). A digital media literacy intervention increases discernment between mainstream and false news in the United States and India. Proceedings of the National Academy of Sciences, 117 (27): pp. 15536 – 15545. DOI: 10.1073/pnas.1920498117

Hajhosseiny, M. (2012). The Effect of Dialogic Teaching on Students’ Critical Thinking Disposition. Procedia – Social and Behavioral Sciences, 69: pp. 1358 – 1368

Huguet, A., Kavanagh, J., Baker, G. & Blumenthal, M. S. (2019). Exploring Media Literacy Education as a Tool for Mitigating Truth Decay. RAND Corporation, https://www.rand.org/content/dam/rand/pubs/research_reports/RR3000/RR3050/RAND_RR3050.pdf

Ioannidis, J. P. A. (2005). Why Most Published Research Findings Are False. PLoS Medicine 2 (8): e124. https://doi.org/10.1371/journal.pmed.0020124

Jeong, S. H., Cho, H., & Hwang, Y. (2012). Media literacy interventions: A meta-analytic review. Journal of Communication, 62, pp. 454–472

Jones-Jang, S. M., Mortensen, T. & Liu, J. (2019). Does media literacy help identification of fake news? Information literacy helps, but other literacies don’t. American Behavioral Scientist, pp. 1 – 18, doi:10.1177/0002764219869406

Jost, J. T., van der Linden, S., Panagopoulos, C. & Hardin, C. D. (2018). Ideological asymmetries in conformity, desire for shared reality, and the spread of misinformation. Current Opinion in Psychology, 23: pp. 77 – 83. doi:10.1016/j.copsyc.2018.01.003

Kavanagh, J. & Rich, M. D. (2018). Truth Decay: An Initial Exploration of the Diminishing Role of Facts and Analysis in American Public Life. RAND Corporation, https://www.rand.org/pubs/research_reports/RR2314.html

Lai, E. R. (2011). Critical Thinking: A Literature Review. Pearson. http://images.pearsonassessments.com/images/tmrs/CriticalThinkingReviewFINAL.pdf

Menz, C., Spinath, B. & Seifried, E. (2020). Misconceptions die hard: prevalence and reduction of wrong beliefs in topics from educational psychology among preservice teachers. European Journal of Psychology of Education https://doi.org/10.1007/s10212-020-00474-5

Milosevic, O. (2020). Promoting critical thinking in the EFL classroom. In Mavridi, S. & Xerri, D. (Eds.) English for 21st Century Skills. Newbury, Berks.: Express Publishing. pp.17 – 22

Morozov, E. (2017). Moral panic over fake news hides the real enemy – the digital giants. The Guardian, 8 January 2017 https://www.theguardian.com/commentisfree/2017/jan/08/blaming-fake-news-not-the-answer-democracy-crisis

Mulnix, J. W. (2010). Thinking critically about critical thinking. Educational Philosophy and Theory.

Paul, R. W. (1992). Critical thinking: What, why, and how? New Directions for Community Colleges, 77: pp. 3–24.

Roozenbeek, J., Schneider, C. R., Dryhurst, S., Kerr, J., Freeman, A. L. J., Recchia, G., van der Bles, A. M. & van der Linden, S. (2020). Susceptibility to misinformation about COVID-19 around the world. Royal Society Open Science, 7 (10) https://doi.org/10.1098/rsos.201199

Stanley, M., Barr, N., Peters, K. & Seli, P. (2020). Analytic-thinking predicts hoax beliefs and helping behaviors in response to the COVID-19 pandemic. PsyArxiv Preprints doi:10.31234/osf.io/m3vt

Ur, P. (2020). Critical Thinking. In Mavridi, S. & Xerri, D. (Eds.) English for 21st Century Skills. Newbury, Berks.: Express Publishing. pp.9 – 16

Willingham, D. T. (2007). Critical Thinking: Why Is It So Hard to Teach? American Educator Summer 2007: pp. 8 – 19

A week or so ago, someone in the Macmillan marketing department took it upon themselves to send out this tweet. What grabbed my attention was the claim that it is ‘a well-known fact’ that teaching students a growth mindset makes them perform better academically over time. The easily demonstrable reality (which I’ll come on to) is that this is not a fact. It’s fake news, being used for marketing purposes. The tweet links to a blog post of over a year ago. In it, Chia Suan Chong offers five tips for developing a growth mindset in students: educating students about neuroplasticity, delving deeper into success stories, celebrating challenges and mistakes, encouraging students to go outside their comfort zones, and giving ‘growth-mindset-feedback’. All of which, she suggests, might help our students. Indeed, it might, and, even if it doesn’t, it might be worth a try anyway. Chia doesn’t make any claims beyond the potential of the suggested strategies, so I wonder where the Macmillan Twitter account person got the ‘well-known fact’.

If you google ‘mindset ELT’, you will find webpage after webpage offering tips about how to promote growth mindset in learners. It’s rare for the writers of these pages to claim that the positive effects of mindset interventions are a ‘fact’, but it’s even rarer to come across anyone who suggests that mindset interventions might be an à la mode waste of time and effort. Even in more serious literature (e.g. Mercer, S. & Ryan, S. (2010). A mindset for EFL: learners’ beliefs about the role of natural talent. ELT Journal, 64 (4): 436 – 444), the approach is fundamentally enthusiastic, with no indication that there might be a problem with mindset theory. Given that this enthusiasm is repeated so often, perhaps we should not blame the Macmillan tweeter for falling victim to the illusory truth effect. After all, it appears that 98% of teachers in the US feel that growth mindset approaches should be adopted in schools (Hendrick, 2019).

Chia suggests that we can all have fixed mindsets in certain domains (e.g. I know all about that, there’s nothing more I can learn). One domain where it seems that fixed mindsets are prevalent is mindset theory itself. This post is an attempt to nudge towards more ‘growth’ and, in trying to persuade you to be more sceptical, I will quote as much as possible from Carol Dweck, the founder of mindset theory, and her close associates.

Carol Dweck’s book ‘Mindset: The New Psychology of Success’ appeared in 2006. In it, she argued that people can be placed on a continuum between those who have ‘a fixed mindset–those who believe that abilities are fixed—[and who] are less likely to flourish [and] those with a growth mindset–those who believe that abilities can be developed’ (from the back cover of the updated (2007) version of the book). There was nothing especially new about the idea. It is very close to Bandura’s (1982) theory of self-efficacy, which will be familiar to anyone who has read Zoltán Dörnyei’s more recent work on motivation in language learning. It’s closely related to Carl Rogers’ (1969) ideas about self-concept and it’s not a million miles removed, either, from Maslow’s (1943) theory of self-actualization. The work of Rogers and Maslow was at the heart of the ‘humanistic turn’ in ELT in the latter part of the 20th century (see, for example, Early, 1981), so mindset theory is likely to resonate with anyone who was inspired by the humanistic work of people like Moskowitz, Stevick or Rinvolucri. The appeal of mindset theory is easy to see. Besides its novelty value, it resonates emotionally with the values that many teachers share, writes Tom Bennett: it feels right that you don’t criticise the person, but invite them to believe that, through hard work and persistence, they can achieve.

We might even trace interest in the importance of self-belief back to the Stoics (who, incidentally but not coincidentally, are experiencing a revival of interest), but Carol Dweck introduced a more modern flavour to the old wine and packaged it skilfully and accessibly in shiny new bottles. Her book was a runaway bestseller, with sales in the millions, and her TED Talk has now had over 11 million views. It was in education that mindset theory became particularly popular. As a mini-industry it is now worth millions and millions. Just one research project into the efficacy of one mindset product has received 3.5 million dollars in US federal funding.

But, as with other ideas that have done a roaring trade in popular psychology (Howard Gardner’s ‘multiple intelligences’ theory, for example) and that seem to offer simple solutions to complex problems, there was soon pushback. It wasn’t hard for critics to scoff at motivational ‘yes-you-can’ posters in classrooms or accounts of well-meaning but misguided teacher interventions, like this one reported by Carl Hendrick:

One teacher [took] her children out into the pristine snow covering the school playground, she instructed them to walk around, taking note of their footprints. “Look at these paths you’ve been creating,” the teacher said. “In the same way that you’re creating new pathways in the snow, learning creates new pathways in your brain.”

Carol Dweck was sympathetic to the critics. She has described the early reaction to her book as ‘uncontrollable’. She freely admits that she and her colleagues had underestimated the issues around mindset interventions in the classrooms and that such interventions were ‘not yet evidence-based’. She identified two major areas where mindset interventions have gone awry. The first of these is when a teacher teaches the concept of mindsets to students, but does not change other policies and practices in the classroom. The second is that some teachers have focussed too much on praising their learners’ efforts. Teachers have taken mindset recipes and tips, without due consideration. She says:

Teachers have to ask, what exactly is the evidence suggesting? They have to realise it takes deep thought and deep experimentation on their part in the classroom to see how best the concept can be implemented there. This should be a group enterprise, in which they share what worked, what did not work, for whom and when. People need to recognise we are researchers, we have produced a body of evidence that says under these conditions this is what happened. We have not explored all the conditions that are possible. Teacher feedback on what is working and not working is hugely valuable to us to tell us what we have not done and what we need to do.

Critics like Dylan Wiliam, Carl Hendrick and Timothy Bates found that it was impossible to replicate Dweck’s findings, and that there were at best weak correlations between growth mindset and academic achievement, and between mindset interventions and academic gains. They were happy to concede that typical mindset interventions would not do any harm, but asked whether the huge amounts of money being spent on mindset would not be better invested elsewhere.

Carol Dweck seems to like the phrase ‘not yet’. She argues, in her TED Talk, that simply using the words ‘not yet’ can build students’ confidence, and her tip is often repeated by others. She also talks about mindset interventions being ‘not yet evidence-based’, which is a way of declaring her confidence that they soon will be. But, with huge financial backing, Dweck and her colleagues have recently been carrying out a lot of research and the results are now coming in. There are a small number of recent investigations that advocates of mindset interventions like to point to. For reasons of space, I’ll refer to two of them.

The first of these (Outes-Leon et al., 2020) looked at an intervention with children in the first grades of a few hundred Peruvian secondary schools. The intervention consisted of students individually reading a text designed to introduce them to the concept of growth mindset. This was followed by a group debate about the text, before each student individually wrote a reflective letter to a friend or relative describing what they had learned. In total, this amounted to about 90 minutes of activity. Subsequently, teachers made a subjective assessment of the ‘best’ letters and attached these to the classroom wall, along with a growth mindset poster, for the rest of the school year. Teachers were also asked to take a picture of the students alongside the letters and the poster and to share this picture by email.

Academic progress was measured 2 and 14 months after the intervention and compared to a large control group. The short-term (2 months) impact of the intervention was positive for mathematics, but less so for reading comprehension. (Why?) These gains were only visible in regional schools, not at all in metropolitan schools. Similar results were found when looking at the medium-term (14 month) impact. The reasons for this are unclear. It is hypothesized that the lower-achieving students in regional schools might benefit more from the intervention. Smaller class sizes in regional schools might also be a factor. But, of course, many other explanations are possible.

The paper is entitled The Power of Believing You Can Get Smarter. The authors make it clear that they were looking for positive evidence of the intervention and they were supported by mindset advocates (e.g. David Yeager) from the start. It was funded by the World Bank, which is a long-standing advocate of growth mindset interventions. (Rather jumping the gun, the World Bank’s Mindset Team wrote in 2014 that teaching growth mindset is not just another policy fad. It is backed by a burgeoning body of empirical research.) The paper’s authors conclude that ‘the benefits of the intervention were relevant and long-lasting in the Peruvian context’, and they focus strongly on the low costs of the intervention. They acknowledge that the way the tool is introduced (design of the intervention) and the context in which this occurs (i.e., school and teacher characteristics) both matter to understand potential gains. But without understanding the role of the context, we haven’t really learned anything practical that we can take away from the research. Our understanding of the power of believing you can get smarter has not been meaningfully advanced.

The second of these studies (Yeager et al., 2019) took many thousands of lower-achieving American 9th graders from a representative sample of schools. It is a very well-designed and thoroughly reported piece of research. The intervention consisted of two 25-minute online sessions, 20 days apart, which sought to reduce the negative effort beliefs of students (the belief that having to try hard or ask for help means you lack ability), fixed-trait attributions (the attribution that failure stems from low ability) and performance avoidance goals (the goal of never looking stupid). An analysis of academic achievement at the end of the school year indicated clearly that the intervention led to improved performance. These results lead to very clear grounds for optimism about the potential of growth mindset interventions, but the report is careful to avoid overstatement. We have learnt about one particular demographic with one particular intervention, but it would be wrong to generalise beyond that. The researchers had hoped that the intervention would help to compensate for unsupportive school norms, but found that this was not the case. Instead, they found that it was when the peer norm supported the adoption of intellectual challenges that the intervention promoted sustained benefits. Context, as in the Peruvian study, was crucial. The authors write:

We emphasize that not all forms of growth mindset interventions can be expected to increase grades or advanced course-taking, even in the targeted subgroups. New growth mindset interventions that go beyond the module and population tested here will need to be subjected to rigorous development and validation processes.

I think that a reasonable conclusion from reading this research is that it may well be worth experimenting with growth mindset interventions in English language classes, but without any firm expectation of any positive impact. If nothing else, the interventions might provide useful, meaningful practice of the four skills. First, though, it would make sense to read two other pieces of research (Sisk et al., 2018; Burgoyne et al., 2020). Unlike the projects I have just discussed, these were not carried out by researchers with an a priori enthusiasm for growth-mindset interventions. And the results were rather different.

The first of these (Sisk et al., 2018) was a meta-analysis of the literature. It found that there was only a weak correlation between mindset and academic achievement, and only a weak correlation between mindset interventions and academic gains. It did, however, lend support to one of the conclusions of Yeager et al (2019), that such interventions may benefit students who are academically at risk.

The second (Burgoyne et al., 2020) found that the foundations of mind-set theory are not firm and that bold claims about mind-set appear to be overstated. Other constructs, such as self-efficacy and need for achievement, [were] found to correlate much more strongly with presumed associates of mind-set.

So, where does this leave us? We are clearly a long way from ‘facts’; mindset interventions are ‘not yet evidence-based’. Carl Hendrick (2019) provides a useful summary:

The truth is we simply haven’t been able to translate the research on the benefits of a growth mindset into any sort of effective, consistent practice that makes an appreciable difference in student academic attainment. In many cases, growth mindset theory has been misrepresented and miscast as simply a means of motivating the unmotivated through pithy slogans and posters. […] Recent evidence would suggest that growth mindset interventions are not the elixir of student learning that many of its proponents claim it to be. The growth mindset appears to be a viable construct in the lab, which, when administered in the classroom via targeted interventions, doesn’t seem to work at scale. It is hard to dispute that having a self-belief in their own capacity for change is a positive attribute for students. Paradoxically, however, that aspiration is not well served by direct interventions that try to instil it.

References

Bandura, Albert (1982). Self-efficacy mechanism in human agency. American Psychologist, 37 (2): pp. 122–147. doi:10.1037/0003-066X.37.2.122.

Burgoyne, A. P., Hambrick, D. Z., & Macnamara, B. N. (2020). How Firm Are the Foundations of Mind-Set Theory? The Claims Appear Stronger Than the Evidence. Psychological Science, 31(3), 258–267. https://doi.org/10.1177/0956797619897588

Dweck, C. S. (2006). Mindset: The New Psychology of Success. New York: Ballantine Books

Early, P. (Ed.) (1981). ELT Documents 113 – Humanistic Approaches: An Empirical View. London: The British Council

Hendrick, C. (2019). The growth mindset problem. Aeon, 11 March 2019.

Maslow, A. (1943). A Theory of Human Motivation. Psychological Review, 50: pp. 370-396.

Outes-Leon, I., Sanchez, A. & Vakis, R. (2020). The Power of Believing You Can Get Smarter: The Impact of a Growth-Mindset Intervention on Academic Achievement in Peru. Policy Research Working Paper no. WPS 9141. Washington, D.C.: World Bank Group. http://documents.worldbank.org/curated/en/212351580740956027/The-Power-of-Believing-You-Can-Get-Smarter-The-Impact-of-a-Growth-Mindset-Intervention-on-Academic-Achievement-in-Peru

Rogers, C. R. (1969). Freedom to Learn: A View of What Education Might Become. Columbus, Ohio: Charles Merrill

Sisk, V. F., Burgoyne, A. P., Sun, J., Butler, J. L., Macnamara, B. N. (2018). To what extent and under which circumstances are growth mind-sets important to academic achievement? Two meta-analyses. Psychological Science, 29, 549–571. doi:10.1177/0956797617739704

Yeager, D.S., Hanselman, P., Walton, G.M. et al. (2019). A national experiment reveals where a growth mindset improves achievement. Nature 573, 364–369. https://doi.org/10.1038/s41586-019-1466-y

As both a language learner and a teacher, I have a number of questions about the value of watching subtitled videos for language learning. My interest is in watching extended videos, rather than short clips for classroom use, so I am concerned with incidental, rather than intentional, learning, mostly of vocabulary. My questions include:

  • Is it better to watch a video that is subtitled or unsubtitled?
  • Is it better to watch a video with L1 or L2 subtitles?
  • If a video is watched more than once, what is the best way to start and proceed? In which order (no subtitles, L1 subtitles and L2 subtitles) is it best to watch?

For help, I turned to three recent books about video and language learning: Ben Goldstein and Paul Driver’s Language Learning with Digital Video (CUP, 2015), Kieran Donaghy’s Film in Action (Delta, 2015) and Jamie Keddie’s Bringing Online Video into the Classroom (OUP, 2014). I was surprised to find no advice, but, as I explored more, I discovered that there may be a good reason for these authors’ silence.

There is now a huge literature out there on subtitles and language learning, and I cannot claim to have read it all. But I think I have read enough to understand that I am not going to find clear-cut answers to my questions.

The learning value of subtitles

It has been known for some time that the use of subtitles during extensive viewing of video in another language can help in the acquisition of that language. The main gains are in vocabulary acquisition and the development of listening skills (Montero Perez et al., 2013). This is true both of L1 subtitles (with an L2 audio track), sometimes called interlingual subtitles (Incalcaterra McLoughlin et al., 2011), and of L2 subtitles (with an L2 audio track), sometimes called intralingual subtitles or captions (Vanderplank, 1988). Somewhat more surprisingly, vocabulary gains may also come from what are called reversed subtitles (L2 subtitles and an L1 audio track) (Burczyńska, 2015). Of course, certain conditions apply for subtitled video to be beneficial, and I’ll come on to these. But there is general research agreement (an exception is Karakaş & Sariçoban, 2012) that more learning is likely to take place from watching a subtitled video in a target language than an unsubtitled one.

Opposition to the use of subtitles as a tool for language learning has mostly come from three angles. The first of these, which concerns L1 subtitles, is an antipathy to any use at all of L1. Although such an attitude remains entrenched in some quarters, there is no evidence to support it (Hall & Cook, 2012; Kerr, 2016). Researchers and, increasingly, teachers have moved on.

The second reservation that is sometimes expressed is that learners may not attend to either the audio track or the subtitles if they do not need to. They may, for example, ignore the subtitles in the case of reversed subtitles or ignore the L2 audio track when there are L1 subtitles. This can, of course, happen, but it seems that, on the whole, this is not the case. In an eye-tracking study by Bisson et al (2012), for example, it was found that most people followed the subtitles, irrespective of what kind they were. Unsurprisingly, they followed the subtitles more closely when the audio track was in a language that was less familiar. When conditions are right (see below), reading subtitles becomes a very efficient and partly automatized cognitive activity, which does not prevent people from processing the audio track at the same time (d’Ydewalle & Pavakanun, 1997).

Related to the second reservation is the concern that the two sources of information (audio and subtitles), combined with other information (images and music or sound effects), may be in competition and lead to cognitive overload, impacting negatively on both comprehension and learning. Recent research suggests that this concern is unfounded (Kruger et al., 2014). L1 subtitles generate less cognitive load than L2 subtitles, but overload is not normally reached and mental resources are still available for learning (Baranowska, 2020). The absence of subtitles generates more cognitive load.

Conditions for learning

Before looking at the differences between L1 and L2 subtitles, it’s a good idea to look at the conditions under which learning is more likely to take place with subtitles. Some of these are obvious, others less so.

First of all, the video material must be of sufficient intrinsic interest to the learner. Secondly, the subtitles must be of a sufficiently high quality. This is not always the case with automatically generated captions, especially if the speech-to-text software struggles with the audio accent. It is also not always the case with professionally produced L1 subtitles, especially when the ‘translations are non-literal and made at the phrase level, making it hard to find connections between the subtitle text and the words in the video’ (Kovacs, 2013, cited by Zabalbeascoa et al., 2015: 112). As a minimum, standard subtitling guidelines, such as those produced for the British Channel 4, should be followed. These limit, for example, the number of characters per line to about 40 and the number of lines to a maximum of two.
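
To make these limits concrete, here is a minimal sketch in Python of the kind of automatic check that might be run on a single subtitle block. It is an illustration only, not taken from any published guideline: the function and constant names are my own, and the numbers simply restate the approximate limits mentioned above.

MAX_CHARS_PER_LINE = 40   # approximate per-line limit mentioned above; published guidelines vary
MAX_LINES = 2             # maximum number of lines per subtitle block

def check_subtitle(block: str) -> list[str]:
    """Return a list of guideline problems found in one subtitle block (empty list = OK)."""
    problems = []
    lines = [line for line in block.splitlines() if line.strip()]
    if len(lines) > MAX_LINES:
        problems.append(f"{len(lines)} lines (maximum {MAX_LINES})")
    for line in lines:
        if len(line) > MAX_CHARS_PER_LINE:
            problems.append(f"a line of {len(line)} characters (maximum {MAX_CHARS_PER_LINE})")
    return problems

# Example: a two-line subtitle that respects both limits returns an empty list.
print(check_subtitle("I am not going to find clear-cut answers\nto my questions."))

A teacher preparing their own captions could run a check of this kind over each block of a subtitle file before class; anything flagged can simply be shortened or re-broken across lines.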

For reasons that I’ll come on to, learners should be able to switch easily between L1 and L2 subtitles. They are also likely to benefit if reliably accurate glosses or hyperlinks are ‘embedded in the subtitles, making it possible for a learner to simply click for additional verbal, auditory or even pictorial glosses’ (Danan, 2015: 49).

At least as important as considerations of the materials or tools is a consideration of what the learner brings to the activity (Frumuselu, 2019: 104). Vanderplank (2015) describes these different kinds of considerations as the ‘effects of’ subtitles on a learner and the ‘effects with’ subtitles on learner behaviour.

In order to learn from subtitles, you need to be able to read fast enough to process them. Anyone with a slow reading speed in their own language (e.g. some dyslexic readers) is going to struggle. Even with L1 subtitles, Vanderplank (2015: 24) estimates that it is only around the age of 10 that children can do this with confidence. Familiarity with both the subject matter and with subtitle use will impact on this ability to read subtitles fast enough.

With L2 subtitles, the language proficiency of the learner relative to the level of difficulty (especially lexical difficulty) of the subtitles will clearly be of some significance. It is unlikely that L2 subtitles will be of much benefit to beginners (Taylor, 2005). This also suggests that, at lower levels, materials need to be chosen carefully. On the whole, researchers have found that higher proficiency levels correlate with greater learning gains (Pujadas & Muñoz, 2019; Suárez & Gesa, 2019), but one earlier meta-analysis (Montero Perez et al., 2013) did not find that proficiency levels were significant.

Measures of general language proficiency may be too blunt an instrument to help us all of the time. I can learn more from Portuguese than from Arabic subtitles, even though I am a beginner in both languages. The degree of proximity between two languages, especially the script (Winke et al., 2010), is also likely to be significant.

But a wide range of other individual learner differences will also impact on the learning from subtitles. It is known that learners approach subtitles in varied and idiosyncratic ways (Pujolá, 2002), with some using L2 subtitles only as a ‘back-up’ and others relying on them more. Vanderplank (2019) grouped learners into three broad categories: minimal users who were focused throughout on enjoying films as they would in their L1, evolving users who showed marked changes in their viewing behaviour over time, and maximal users who tended to be experienced at using films to enhance their language learning.

Categories like these are only the tip of the iceberg. Sensory preferences, personality types, types of motivation, the impact of subtitles on anxiety levels and metacognitive strategy awareness are all likely to be important. For the last of these, Danan (2015: 47) asks whether learners should be taught ‘techniques to make better use of subtitles and compensate for weaknesses: techniques such as a quick reading of subtitles before listening, confirmation of word recognition or meaning after listening, as well as focus on form for spelling or grammatical accuracy?’

In short, it is, in practice, virtually impossible to determine optimal conditions for learning from subtitles, because we cannot ‘take into account all the psycho-social, cultural and pedagogic parameters’ (Gambier, 2015). With that said, it’s time to take a closer look at the different potential of L1 and L2 subtitles.

L1 vs L2 subtitles

Since all other things are almost never equal, it is not possible to say that one kind of subtitles offers greater potential for learning than another. As regards gains in vocabulary acquisition and listening comprehension, there is no research consensus (Baranowska, 2020: 107). Research does, however, offer us a number of pointers.

Extensive viewing of subtitled video (both L1 and L2) can offer ‘massive quantities of authentic and comprehensible input’ (Vanderplank, 1988: 273). With lower level learners, the input is likely to be more comprehensible with L1 subtitles, and, therefore, more enjoyable and motivating. This makes them often more suitable for what Caimi (2015: 11) calls ‘leisure viewing’. Vocabulary acquisition may be better served with L2 subtitles, because they can help viewers to recognize the words that are being spoken, increase their interaction with the target language, provide further language context, and increase the redundancy of information, thereby enhancing the possibility of this input being stored in long-term memory (Frumuselu et al., 2015). These effects are much more likely with Vanderplank’s (2019) motivated, ‘maximal’ users than with ‘minimal’ users.

There is one further area where L2 subtitles may have the edge over L1. One of the values of extended listening in a target language is the improvement in phonetic retuning (see, for example, Reinisch & Holt, 2013), the ability to adjust the phonetic boundaries in your own language to the boundaries that exist in the target language. Learning how to interpret unusual speech-sounds, learning how to deal with unusual mappings between sounds and words and learning how to deal with the acoustic variations of different speakers of the target language are all important parts of acquiring another language. Research by Mitterer and McQueen (2009) suggests that L2 subtitles help in this process, but L1 subtitles hinder it.

Classroom implications?

The literature on subtitles and language learning echoes with the refrain of ‘more research needed’, but I’m not sure that further research will lead to less ambiguous, practical conclusions. One of my initial questions concerned the optimal order of use of different kinds of subtitles. In most extensive viewing contexts, learners are unlikely to watch something more than twice. If they do (watching a recorded academic lecture, for example), they are likely to be more motivated by a desire to learn from the content than to learn language from the content. L1 subtitles will probably be preferred, and will have the added bonus of facilitating note-taking in the L1. For learners who are more motivated to learn the target language (Vanderplank’s ‘maximal’ users), a sequence of subtitle use, starting with the least cognitively challenging and moving to greater challenge, probably makes sense. Danan (2015: 46) suggests starting with an L1 soundtrack and reversed (L2) subtitles, then moving on to an L2 soundtrack and L2 subtitles, and ending with an L2 soundtrack and no subtitles. I would replace her first stage with an L2 soundtrack and L1 subtitles, but this is based on hunch rather than research.

This sequencing of subtitle use is common practice in language classrooms, but, here, (1) the video clips are usually short, and (2) the aim is often not incidental learning of vocabulary. Typically, the video clip has been selected as a tool for deliberate teaching of language items, so different conditions apply. At least one study has confirmed the value of the common teaching practice of pre-teaching target vocabulary items before viewing (Pujadas & Muñoz, 2019). The drawback is that, by getting learners to focus on particular items, less incidental learning of other language features is likely to take place. Perhaps this doesn’t matter too much. In a short clip of a few minutes, the opportunities for incidental learning are limited, anyway. With short clips and a deliberate learning aim, it seems reasonable to use L2 subtitles for a first viewing, and no subtitles thereafter.

An alternative frequent use of short video clips in classrooms is to use them as a springboard for speaking. In these cases, Baranowska (2020: 113) suggests that teachers may opt for L1 subtitles first, and follow up with L2 subtitles. Of course, with personal viewing devices or in online classes, teachers may want to exploit the possibilities of differentiating the subtitle condition for different learners.

References

Baranowska, K. (2020). Learning most with least effort: subtitles and cognitive load. ELT Journal 74 (2): pp.105 – 115

Bisson, M.-J., Van Heuven, W. J. B., Conklin, K. & Tunney, R. J. (2012). Processing of native and foreign language subtitles in films: An eye tracking study. Applied Psycholinguistics, 35 (2): pp. 399 – 418

Burczyńska, P. (2015). Reversed Subtitles as a Powerful Didactic Tool in SLA. In Gambier, Y., Caimi, A. & Mariotti, C. (Eds.), Subtitles and Language Learning. Principles, strategies and practical experiences. Bern: Peter Lang (pp. 221 – 244)

Caimi, A. (2015). Introduction. In Gambier, Y., Caimi, A. & Mariotti, C. (Eds.), Subtitles and Language Learning. Principles, strategies and practical experiences. Bern: Peter Lang (pp. 9 – 18)

Danan, M. (2015). Subtitling as a Language Learning Tool: Past Findings, Current Applications, and Future Paths. In Gambier, Y., Caimi, A. & Mariotti, C. (Eds.), Subtitles and Language Learning. Principles, strategies and practical experiences. Bern: Peter Lang (pp. 41 – 61)

d’Ydewalle, G. & Pavakanun, U. (1997). Could Enjoying a Movie Lead to Language Acquisition?. In: Winterhoff-Spurk, P., van der Voort, T.H.A. (Eds.) New Horizons in Media Psychology. VS Verlag für Sozialwissenschaften, Wiesbaden. https://doi.org/10.1007/978-3-663-10899-3_10

Frumuselu, A.D., de Maeyer, S., Donche, V. & Gutierrez Colon Plana, M. (2015). Television series inside the EFL classroom: bridging the gap between teaching and learning informal language through subtitles. Linguistics and Education, 32: pp. 107 – 17

Frumuselu, A. D. (2019). ‘A Friend in Need is a Film Indeed’: Teaching Colloquial Expressions with Subtitled Television Series. In Herrero, C. & Vanderschelden, I. (Eds.) Using Film and Media in the Language Classroom. Bristol: Multimedia Matters. pp.92 – 107

Gambier, Y. (2015). Subtitles and Language Learning (SLL): Theoretical background. In Gambier, Y., Caimi, A. & Mariotti, C. (Eds.), Subtitles and Language Learning. Principles, strategies and practical experiences. Bern: Peter Lang (pp. 63 – 82)

Hall, G. & Cook, G. (2012). Own-language Use in Language Teaching and Learning. Language Learning, 45 (3): pp. 271 – 308

Incalcaterra McLoughlin, L., Biscio, M. & Ní Mhainnín, M. A. (Eds.) (2011). Audiovisual Translation, Subtitles and Subtitling. Theory and Practice. Bern: Peter Lang

Karakaş, A. & Sariçoban, A. (2012). The impact of watching subtitled animated cartoons on incidental vocabulary learning of ELT students. Teaching English with Technology, 12 (4): pp. 3 – 15

Kerr, P. (2016). Questioning ‘English-only’ Classrooms: Own-language Use in ELT. In Hall, G. (Ed.) The Routledge Handbook of English Language Teaching (pp. 513 – 526)

Kruger, J. L., Hefer, E. & Matthew, G. (2014). Attention distribution and cognitive load in a subtitled academic lecture: L1 vs. L2. Journal of Eye Movement Research, 7: pp. 1 – 15

Mitterer, H. & McQueen, J. M. (2009). Foreign Subtitles Help but Native-Language Subtitles Harm Foreign Speech Perception. PLoS ONE 4 (11): e7785. doi:10.1371/journal.pone.0007785

Montero Perez, M., Van Den Noortgate, W., & Desmet, P. (2013). Captioned video for L2 listening and vocabulary learning: A meta-analysis. System, 41, pp. 720–739 doi:10.1016/j.system.2013.07.013

Pujadas, G. & Muñoz, C. (2019). Extensive viewing of captioned and subtitled TV series: a study of L2 vocabulary learning by adolescents, The Language Learning Journal, 47:4, 479-496, DOI: 10.1080/09571736.2019.1616806

Pujolá, J.- T. (2002). CALLing for help: Researching language learning strategies using help facilities in a web-based multimedia program. ReCALL, 14 (2): pp. 235 – 262

Reinisch, E. & Holt, L. L. (2013). Lexically Guided Phonetic Retuning of Foreign-Accented Speech and Its Generalization. Journal of Experimental Psychology: Human Perception and Performance. Advance online publication. doi: 10.1037/a0034409

Suárez, M. & Gesa, F. (2019) Learning vocabulary with the support of sustained exposure to captioned video: do proficiency and aptitude make a difference? The Language Learning Journal, 47:4, 497-517, DOI: 10.1080/09571736.2019.1617768

Taylor, G. (2005). Perceived processing strategies of students watching captioned video. Foreign Language Annals, 38(3), pp. 422-427

Vanderplank, R. (1988). The value of teletext subtitles in language learning. ELT Journal, 42 (4): pp. 272 – 281

Vanderplank, R. (2015). Thirty Years of Research into Captions / Same Language Subtitles and Second / Foreign Language Learning: Distinguishing between ‘Effects of’ Subtitles and ‘Effects with’ Subtitles for Future Research. In Gambier, Y., Caimi, A. & Mariotti, C. (Eds.), Subtitles and Language Learning. Principles, strategies and practical experiences. Bern: Peter Lang (pp. 19 – 40)

Vanderplank, R. (2019). ‘Gist watching can only take you so far’: attitudes, strategies and changes in behaviour in watching films with captions, The Language Learning Journal, 47:4, 407-423, DOI: 10.1080/09571736.2019.1610033

Winke, P., Gass, S. M., & Sydorenko, T. (2010). The Effects of Captioning Videos Used for Foreign Language Listening Activities. Language Learning & Technology, 1 (1): pp. 66 – 87

Zabalbeascoa, P., González-Casillas, S. & Pascual-Herce, R. (2015). In Gambier, Y., Caimi, A. & Mariotti, C. (Eds.), Subtitles and Language Learning. Principles, strategies and practical experiences Bern: Peter Lang (pp. 105–126)

Precarity

Barely liveable hourly wages, no job security because there is no permanent contract (so employment may be terminated at short or no notice), no social security, paid health care or pension, a struggle to meet everyday needs such as food and accommodation … this is the situation for at least one in five workers in the UK, and similar figures exist in many other countries (e.g. one in six in New Zealand). As Bourdieu (1998: 81ff.) noted, job insecurity is now everywhere.

Many English language teachers, especially those working for private schools or for universities operating like private schools, belong to what has been termed the global educational precariat. In addition to language school and university language teachers, there are hundreds of thousands of teachers, mostly American and British, working in English-medium ‘international schools’ around the world (Bunnell, 2016). Besides financial insecurity, many of these teachers also suffer from a lack of agency and a marginalisation of their professional identities (Poole, 2019). There’s a very useful article on ‘precarity’ in ELT Journal (Walsh, 2019) that I’d recommend.

Even teachers with reasonable pay and job security are facing attacks on their pay and working conditions. A few weeks ago in Jordan, security forces shut down the teachers’ union and arrested leading members. Teachers’ union leaders have also been imprisoned recently in Iran and Cambodia. The pages of the website of Education International, a global federation of teachers’ trade unions, catalogue the crises in education and in the lives of teachers around the world.

Teacher bashing, in particular attacks on teacher unions, has been relentless. Four years ago, it was reported that teacher bashing had ‘reached unprecedented levels’ in the US (Saltzman, 2017: 39), where there has been a concerted attempt, over many years, to blame teachers for shortcomings in the educational system (see, for example, Kumashiro, 2012). Although it may have been the US that led the way, closely followed by Australia and the UK, attacks on teachers have become a global phenomenon. Mary Compton and Lois Weiner’s book, ‘The Global Assault on Teaching, Teachers and their Unions’ (Compton & Weiner 2008), gives examples from China to South Africa, from Denmark to Mexico, of how teachers’ pay and conditions have been eroded. The reason? Quite simply, it is because teachers have stood in the way of so-called ‘reforms’ (e.g. pay cuts). It is because they have, as they are doing now in times of COVID-19, stood in the way of what governments have wanted to do. In an earlier post, I wrote in more detail about the ways in which the World Bank has spearheaded the drive towards privatized, lower cost education around the world.

COVID-19 has, of course, made matters worse, much worse. As often as not, the pandemic has been used as an excuse to accelerate attacks on teachers that were well under way long before.

Wellbeing

In the circumstances, it is not surprising that teacher wellbeing has recently become a more talked-about topic. Precisely because there is so little of it about.

The publication earlier this year of a book about teacher wellbeing for language teachers (Mercer & Gregersen, 2020) is very timely. The authors acknowledge that real change for wellbeing must address structural and systemic levels of change and is not just a matter for individual teachers to cope with alone. They acknowledge that teachers should not have to compensate for fundamental flaws in the system as a whole that undermine their wellbeing, and they express concern about the risks associated with discussing teacher wellbeing at the individual level and not acknowledging that the systems in which teachers work may be at fault (Mercer & Gregersen, 2020: 9). But, with these caveats out of the way, the matter is closed, and the whole book is about how individuals can improve their wellbeing. Indeed, the book begins: As you read the title of this chapter, you might have thought how self-seeking or egocentric it sounds: It’s all about me? Our response is, ‘Yes, you!’ Throughout this book, we want you to focus your attention on yourself for a change, without any guilty feelings (Mercer & Gregersen, 2020: 1). Mindfulness techniques, tips for time management, ways of thinking positively and so on – it’s a compendium of self-help advice that may be useful to language teachers. The real ravages of precarity, the real causes of so much lack of wellbeing, these do not get a mention.

Positive psychology

Mercer and Gregersen's approach is directly inspired by the work of Martin Seligman, often referred to as the founder of 'positive psychology' (see, for example, Seligman, 2011; 2018). Positive psychology and Seligman's ideas about wellbeing are not uncontested (see, for example, Bache & Reardon, 2016; Bache & Scott, 2018). The nub of the critiques is that positive psychology chooses to focus on happiness or wellbeing, rather than, say, justice, solidarity or loyalty. It articulates 'an underlying individualism and narrow sense of the social' (Cabanas & Illouz, 2019: 68), and it is, therefore, not entirely surprising that much of the funding that made the rapid growth of positive psychology possible came from an ultra-conservative religious institution, the John Templeton Foundation (Cabanas & Illouz, 2019: 20).

Mercer and Gregersen are not unaware of such critiques (see, for example, MacIntyre et al., 2016: 375). They mention the criticisms of Barbara Ehrenreich (Ehrenreich, 2009), but, to the best of my knowledge, they have never troubled to respond to them. They have a very clear agenda – the promotion of positive psychology ideas in language teaching and learning contexts – which is made explicit in MacIntyre and Mercer (2014). A slew of articles, books and conference presentations has followed since then, and 'Teacher Wellbeing' is one of them. Mission, it seems, accomplished.

Positive psychology has not been criticised only for its focus on the individual. Others have focused on its foundational assumptions, including its 'decontextualized and ethnocentric claims; theoretical oversimplifications, tautologies and contradictions; methodological shortcomings; severe replicability problems; exaggerated generalizations; and even its therapeutic efficacy and scientific status' (Cabanas & Illouz, 2019: 29). Probably the most important of these critics was Richard Lazarus (2003), whose work is certainly familiar to Mercer, Gregersen and their collaborators, since Lazarus's criticisms are listed in MacIntyre and Mercer (2014) and elsewhere. These include:

  • the over-use of cross-sectional research designs
  • a tendency to treat emotion too simplistically as either positive or negative
  • inadequate attention both to differences among individuals within a group and to the overlap between groups when discussing statistically significant group differences
  • poor quality measurement of emotions.

However, as with the critiques of Ehrenreich, I have yet to find any examples of these authors actually addressing the criticisms. Instead, they prefer to talk about how problems such as those listed above need to be avoided in the future. For example, 'there is no doubt that the future development of the [positive psychology] approach within SLA can learn from these and other criticisms', write MacIntyre and Mercer (2014: 161), and they see the future of positive psychology in language learning and teaching as being fundamentally grounded in science.

Empirical science

Acknowledging, but without actually addressing, past criticisms of the scientific shortcomings of positive psychology, MacIntyre and Mercer (2014: 15) insist that positive psychology is 'the empirical study of how people thrive and flourish […] it represents a form of "rebirth" for humanistic psychology, but with a stronger emphasis on empirical research'. The word 'empirical' appears four times on this page and another five times in the article. In their follow-up book, 'Positive Psychology in SLA' (MacIntyre et al., 2016), there is a whole section (over a third of the book) entitled 'Empirical'. In a historical survey of positive psychology in foreign language teaching, written by close collaborators of Mercer, Gregersen and MacIntyre (Dewaele et al., 2019), the same focus on empirical science is chosen, with a description of positive psychology as being underpinned by 'solid empirical research'. The frequency of this word choice is enough to set alarm bells ringing.

A year before the MacIntyre and Mercer (2014) article, an article by Brown et al. (2013) questioned one of the key empirical foundations of positive psychology, the so-called 'critical positivity ratio' (Fredrickson & Losada, 2005). Wikipedia explains this as the ratio of positive to negative emotions which supposedly distinguishes 'flourishing' people from 'languishing' people, and which was claimed to be precisely 2.9013. A slightly later article (Brown et al., 2014) further debunked Fredrickson's work, arguing that it was full of conceptual difficulties and statistical flaws. Wikipedia now describes the 'critical positivity ratio' as 'a largely discredited concept'. In contrast, Mercer and Gregersen (2020: 14) acknowledge that 'the exact ratio (3:1) of positivity has been called into question by some', but they reassert the value of Fredrickson's work. They neither cite the criticisms, nor rebut them. In this, they are following a well-established tradition in positive psychology (Ryff, 2003).

Given growing scepticism about the claims of positive psychology, MacIntyre et al. (2016) elected to double down. If empirical evidence for positive psychology was in short supply, it was incumbent on them to provide it; hence the section in their book entitled 'Empirical'. Personally, I would have advised against it. The whole point of positive psychology, as outlined by Seligman, is to promote 'wellbeing'. But what, exactly, is this? For some, like Mercer and Gregersen (2020: 3), it's about finding meaning and connection in the world. For others, it's not a 'thing' whose essential nature research can uncover, but 'a social and cultural construction which is interesting as such, not least for what it can tell us about other social and cultural phenomena' (Ereaut & Whiting, 2008). We may agree that it's 'a good thing', but it lacks solidity as a construct. Even Seligman (2011: 15) comes to the conclusion that wellbeing is not 'a real thing'. Rather, he says, it is a construct which has several measurable elements, 'each a real thing, each contributing to well-being, but none defining well-being'. This, however, simply raises the question of how much of a 'thing' each of these elements is (Dodge et al., 2012). Seligman's elements – Positive Emotion, Engagement, Relationships, Meaning, and Accomplishment (PERMA) – form the basis of Mercer and Gregersen's book, but none of them lends itself to a clear, workable definition. In the absence of construct validity, empirical research evidence will prove hard to find.

How well does the 'Empirical' section of Positive Psychology in SLA (MacIntyre et al., 2016) stand up? I don't have space here to discuss all seven chapters, so I've selected the first of them, 'Positive Psychology Exercises Build Social Capital for Language Learners: Preliminary Evidence' (Gregersen et al., 2016), because it includes 'evidence' in the title and because it was written by two of the book's editors. The research reported in this chapter involved five volunteer women, aged 20-23, in an English program at an American university, who took part in a number of positive psychology exercises (PPEs) which entailed laughter, exercise, interaction with animals, listening to music, expressing gratitude and engaging in altruism. The data collected consisted of self-rating questionnaires and some self-reflective discussion. The results indicated that the PPEs led to more positive emotions, with exercise and laughter producing the greatest gains (although, since the order of the PPEs was not randomized and the sample size was so small, this doesn't really tell us anything). Some of the participants doubted the value of some of the PPEs. The participants did, however, develop better relationships with their partners, and this may have led to gains in confidence. The authors conclude that 'although the present data-set is small, we see preliminary evidence of all three pillars of positive psychology supporting positive outcomes' (p. 164).

My own view is that this is wishful thinking. The only thing this study indicates is that, in this particular context with these particular learners, feeling good about what you are doing may help things along a bit. It also has absolutely nothing to do with 'social capital', a concept the authors seem to have misunderstood. Citing an article by Nawyn et al. (2012), they describe 'social capital' as 'emerging friendships that provide learners with positive emotional experiences and intangible resources for language acquisition' (Gregersen et al., 2016: 147). But this is a misreading of the Nawyn et al. article, which adheres fairly closely to Bourdieu's notion of social capital as fundamentally about power relations, while extending it beyond purely economic power relations. Given the connections between lack of teacher wellbeing and precarity, and given Bourdieu's own writings about precarity (Bourdieu, 1998), the authors' attempt to bring Bourdieu into their justification of positive psychological experiences, 'best undertaken at the individual level' (Gregersen et al., 2016: 149), is really quite extraordinary. And if this is empirical evidence for anything, I'm a positive psychologist!

Cui bono?

It may be that some of the exercises suggested in 'Teacher Wellbeing' will be of benefit to some, even many, teachers. Maybe. But the claims of empirical science behind the book are questionable, to say the least. More beneficial to teacher wellbeing would almost certainly be strong teacher unions, but these are only mentioned in passing. There is, incidentally, recent evidence from the US (Han, 2020) that highly unionized districts have higher average teacher quality and better educational outcomes. But positive psychologists seem unwilling to explore the role that unions might play in teacher wellbeing. It is not, perhaps, coincidental that the chapter in 'Teacher Wellbeing' that deals with teachers in their workplaces contains three recommendations for further reading, all of them written for managers. The first on the list is 'Build It: The Rebel Playbook for World-class Employee Engagement' (Elliott & Corey, 2018).

The problems that teachers are facing, exacerbated by COVID-19, are fundamentally systemic and political. Mercer and Gregersen may be aware that there is a risk associated with 'discussing teacher wellbeing at the individual level and not acknowledging that the systems in which teachers work may be at fault', but it's a risk they have chosen to take, believing that their self-help ideas are sufficiently valuable to make the risk worthwhile. I agree with a writer on the National Education Association blog, who thinks that self-care is important but argues that it is an insufficient and entirely too passive way to address the problems teachers are encountering today.

There are other ways of conceptualising teacher wellbeing (see, for example, the entries on the Education International website with this tag), and the Mercer / Gregersen book may be viewed as an attempt to 'claim the field'. To return to Paul Walsh, whose article about precarity I recommended earlier, it is useful to see the current interest in teacher wellbeing in context. He writes: 'Well-being has entered ELT at a time when teachers have been demanding greater visibility and acceptance of issues such as mental health, poor working conditions, non-native speaker and gender equality. Yet to subsume these issues under a catch-all category does them a disservice. Because as soon as we put these issues under the well-being umbrella, they effectively vanish in a cloud of conceptual mist—and lose their sharp edges.'

In this sense, a book like 'Teacher Wellbeing', although well-meaning, may well contribute to undermining the very thing it seeks to promote.

References

Bache, I. & Reardon, L. (2016) The Politics and Policy of Wellbeing: Understanding the Rise and Significance of a New Agenda. Cheltenham: Edward Elgar

Bache, I. & Scott, K. (Eds.) (2018). The Politics of Wellbeing: Theory, Policy and Practice. Palgrave Macmillan

Bourdieu, P. (1998). Acts of Resistance: against the new myths of our time. Cambridge: Polity Press

Brown, N. J. L., Sokal, A. D. & Friedman, H. L. (2013). The complex dynamics of wishful thinking: The critical positivity ratio. American Psychologist, 68, pp. 801–813. http://dx.doi.org/10.1037/a0032850

Brown, N. J. L., MacDonald, D. A., Samanta, M. P., Friedman, H. L. & Coyne, J. C. (2014). A critical reanalysis of the relationship between genomics and well-being. Proceedings of the National Academy of Sciences of the United States of America, 111, 12705–12709. http://dx.doi.org/10.1073/pnas.1407057111

Bunnell, T. (2016). Teachers in International schools: a global educational ‘precariat’? Globalisation, Societies and Education, 14(4), pp. 543-559

Cabanas, E. & Illouz, E. (2019). Manufacturing Happy Citizens. Cambridge: Polity Press

Compton, M. & Weiner, L. (Eds.) (2008). The Global Assault on Teaching, Teachers and their Unions. Palgrave Macmillan

Dewaele, J. M., Chen, X., Padilla, A. M. & Lake, J. (2019). The Flowering of Positive Psychology in Foreign Language Teaching and Acquisition Research. Frontiers in Psychology, 10, 2128. https://doi.org/10.3389/fpsyg.2019.02128

Dodge, R., Daly, A., Huyton, J. & Sanders, L. (2012). The challenge of defining wellbeing. International Journal of Wellbeing, 2(3), pp. 222-235. doi:10.5502/ijw.v2i3.4

Ehrenreich, B. (2009). Bright-Sided: How the relentless promotion of positive thinking has undermined America. New York: Metropolitan Books

Elliott, G. & Corey, D. (2018). Build It: The Rebel Playbook for World-class Employee Engagement. Chichester: Wiley

Ereaut, G. & Whiting, R. (2008). What do we mean by ‘wellbeing’? And why might it matter? Research Report No DCSF-RW073 Department for Children, Schools and Families https://dera.ioe.ac.uk/8572/1/dcsf-rw073%20v2.pdf

Fredrickson, B. L. & Losada, M. F. (2005). Positive affect and the complex dynamics of human flourishing. American Psychologist, 60 (7): pp. 678–686. doi:10.1037/0003-066X.60.7.678

Gregersen, T., MacIntyre, P. D. & Meza, M. (2016). Positive Psychology Exercises Build Social Capital for Language Learners: Preliminary Evidence. In MacIntyre, P. D., Gregersen, T. & Mercer, S. (Eds.) Positive Psychology in SLA. Bristol: Multilingual Matters. pp. 147–167

Han, E. S. (2020). The Myth of Unions' Overprotection of Bad Teachers: Evidence from the District–Teacher Matched Data on Teacher Turnover. Industrial Relations, 59 (2): pp. 316–352

Kumashiro, K. K. (2012). Bad Teacher! How Blaming Teachers Distorts the Bigger Picture. Teachers College Press

Lazarus, R. S. (2003). Target article: Does the positive psychology movement have legs? Psychological Inquiry, 14 (2): pp. 93–109

MacIntyre, P. D. & Mercer, S. (2014). Introducing positive psychology to SLA. Studies in Second Language Learning and Teaching, 4 (2): pp. 153–172

MacIntyre, P. D., Gregersen, T. & Mercer, S. (2016). Conclusion. In MacIntyre, P. D., Gregersen, T. & Mercer, S. (Eds.) Positive Psychology in SLA. Bristol: Multilingual Matters. pp. 374–379

MacIntyre, P.D., Gregersen, T. & Mercer, S. (Eds.) (2016). Positive Psychology in SLA. Bristol: Multilingual Matters.

Mercer, S. & Gregersen, T. (2020). Teacher Wellbeing. Oxford: OUP

Nawyn, S. J., Gjokai, L., Agbenyiga, D. L. & Grace, B. (2012). Linguistic isolation, social capital, and immigrant belonging. Journal of Contemporary Ethnography, 41 (3), pp. 255–282

Poole, A. (2019). International Education Teachers’ Experiences as an Educational Precariat in China. Journal of Research in International Education, 18 (1): pp. 60-76

Ryff, C. D. (2003). Corners of myopia in the positive psychology parade. Psychological Inquiry, 14: pp. 153–159

Saltman, K. J. (2017). Scripted Bodies. Routledge

Seligman, M. (2011). Flourish – A new understanding of happiness and well-being – and how to achieve them. London: Nicholas Brealey Publishing.

Seligman, M. (2018). PERMA and the building blocks of well-being. The Journal of Positive Psychology, DOI: 10.1080/17439760.2018.1437466

Walsh, P. (2019). Precarity. ELT Journal, 73 (4), pp. 459–462

Wong, P. T. P., & Roy, S. (2017). Critique of positive psychology and positive interventions. In N. J. L. Brown, T. Lomas, & F. J. Eiroa-Orosa (Eds.) The Routledge International Handbook of Critical Positive Psychology. London: Routledge.