Archive for the ‘research’ Category

There’s a video on YouTube from Oxford University Press in which the presenter, the author of a coursebook for primary English language learners (‘Oxford Discover’), describes an activity where students have a short time to write some sentences about a picture they have been shown. Then, working in pairs, they read aloud their partner’s sentences and award themselves points, with more points being given for sentences that others have not come up with. For lower level, young learners, it’s not a bad activity. It provides opportunities for varied skills practice of a limited kind and, if it works, may be quite fun and motivating. However, what I found interesting about the video is that it is entitled ‘How to teach critical thinking skills: speaking’ and the book that is being promoted claims to develop ‘21st Century Skills in critical thinking, communication, collaboration and creativity’. The presenter says that the activity achieves its critical thinking goals by promoting ‘both noticing and giving opinions, […] two very important critical thinking skills.’

Noticing (or observation) and giving opinions are often included in lists of critical thinking skills, but, for them to count as such, they must presumably be exercised in a critical way – some sort of reasoning must be involved. No reasoning is involved here, so only the most uncritical understanding of critical thinking could consider this activity to have any connection to it. Whatever other benefits might accrue from the activity, it seems highly unlikely that the students’ ability to notice or express opinions will be developed.

My scepticism is not shared by many users of the book. Oxford University Press carried out a scientific-sounding ‘impact study’: this consisted of a questionnaire (n = 198) in which ‘97% of teachers reported that using Oxford Discover helps their students to improve in the full range of 21st century skills, with critical thinking and communication scoring the highest’.

Enthusiasm for critical thinking activities is extremely widespread. In 2018, TALIS, the OECD Teaching and Learning International Survey (with more than 4000 respondents), found that ‘over 80% of teachers feel confident in their ability to vary instructional strategies in their classroom and help students think critically’ and almost 60% ‘frequently or always’ ‘give students tasks that require students to think critically.’ As with the Oxford ‘impact study’, it’s worth remembering that these are self-reported figures.

This enthusiasm is shared in the world of English language teaching, reflected in at least 17 presentations at the 2021 IATEFL conference that discussed practical ideas for promoting critical thinking. These ranged from the more familiar (e.g. textual analysis in EAP) to the more original – developing critical thinking through the use of reading reaction journals, multicultural literature, fables, creative arts performances, self-expression, escape rooms, and dice games.

In most cases, it would appear that the precise nature of the critical thinking that was ostensibly being developed was left fairly vague. This vagueness is not surprising. Practically the only thing that writers about critical thinking in education can agree on is that there is no general agreement about what, precisely, critical thinking is. Lai (2011) offers an accessible summary of a range of possible meanings, but points out that, in educational contexts, its meaning is often rather vague and encompasses other concepts (such as higher order thinking skills) which also lack clarity. Paul Dummett and John Hughes (2019: 4) plump for ‘a mindset that involves thinking reflectively, rationally and reasonably’ – a vague definition which leaves unanswered two key questions: to what extent is it a skill set or a disposition? Are these skills generic or domain specific?

When ‘critical thinking’ is left undefined, it is impossible to evaluate the claims that a particular classroom activity will contribute to the development of critical thinking. However, irrespective of the definition, there are good reasons to be sceptical about the ability of educational activities to have a positive impact on the generic critical thinking skills of learners in English language classes. There can only be critical-thinking value in the activity described at the beginning of this post if learners somehow transfer the skills they practise in the activity to other domains of their lives. This is, of course, possible, but, if we approach the question with a critical disposition, we have to conclude that it is unlikely. We may continue to believe the opposite, but this would be an uncritical act of faith.

The research evidence on the efficacy of teaching generic critical thinking is not terribly encouraging (Tricot & Sweller, 2014). There’s no shortage of anecdotal support for classroom critical thinking, but ‘education researchers have spent over a century searching for, and failing to find evidence of, transfer to unrelated domains by the use of generic-cognitive skills’ (Sweller, 2022). One recent meta-analysis (Huber & Kuncel, 2016) found insufficient evidence to justify the explicit teaching of generic critical thinking skills at college level. In an earlier blog post https://adaptivelearninginelt.wordpress.com/2020/10/16/fake-news-and-critical-thinking-in-elt/ looking at the impact of critical thinking activities on our susceptibility to fake news, I noted that research was unable to find much evidence of the value of media literacy training. When considerable time is devoted to generic critical thinking training and little or no impact is found, how likely is it that the kind of occasional, brief one-off activity in the ELT classroom will have the desired impact? Without going as far as to say that critical thinking activities in the ELT classroom have no critical-thinking value, it is uncontentious to say that we still do not know how to define critical thinking, how to assess evidence of it, or how to effectively practise and execute it (Gay & Clark, 2021).

It is ironic that there is so little critical thinking about critical thinking in the world of English language teaching, but it should not be particularly surprising. Teachers are no more immune to fads than anyone else (Fuertes-Prieto et al., 2020). Despite a complete lack of robust evidence to support them, learning styles and multiple intelligences influenced language teaching for many years. Mindfulness, growth mindsets and grit are more contemporary influences and, like critical thinking, will go the way of learning styles when the commercial and institutional forces that currently promote them find the lack of empirical supporting evidence problematic.

Critical thinking is an educational aim shared by educational authorities around the world, promoted by intergovernmental bodies like the OECD, the World Bank, the EU, and the United Nations. In Japan, for example, the ‘Ministry of Education (MEXT) puts critical thinking (CT) at the forefront of its ‘global jinzai’ (human capital for a global society) directive’ (Gay & Clark, 2021). It is taught as an academic discipline in some universities in Russia (Ivlev et al., 2021) and plans are underway to introduce it into schools in Saudi Arabia. https://www.arabnews.com/node/1764601/saudi-arabia I suspect that it doesn’t mean quite the same thing in all these places.

Critical thinking is also an educational aim that most teachers can share. Few like to think of themselves as Gradgrinds, bashing facts into their pupils’ heads: turning children into critical thinkers is what education is supposed to be all about. It holds an intuitive appeal, and even if we (20% of teachers in the TALIS survey) lack confidence in our ability to promote critical thinking in the classroom, few of us doubt the importance of trying to do so. Like learning styles, multiple intelligences and growth mindsets, it seems possible that, with critical thinking, we are pushing the wrong thing, but for the right reasons. But just how much evidence, or lack of evidence, do we need before we start getting critical about critical thinking?

References

Dummett, P. & Hughes, J. (2019) Critical Thinking in ELT. Boston: National Geographic Learning

Fuertes-Prieto, M.Á., Andrés-Sánchez, S., Corrochano-Fernández, D. et al. (2020) Pre-service Teachers’ False Beliefs in Superstitions and Pseudosciences in Relation to Science and Technology. Science & Education, 29: 1235 – 1254. https://doi.org/10.1007/s11191-020-00140-8

Gay, S. & Clark, G. (2021) Revisiting Critical Thinking Constructs and What This Means for ELT. Critical Thinking and Language Learning, 8 (1): 110 – 147

Huber, C.R. & Kuncel, N.R. (2016) Does College Teach Critical Thinking? A Meta-Analysis. Review of Educational Research, 86 (2): 431 – 468. doi:10.3102/0034654315605917

Ivlev, V. Y., Pozdnyakov, M. V., Inozemtsev, V. A. & Chernyak, A. Z. (2021) Critical Thinking in the Structure of Educational Programs in Russian Universities. Advances in Social Science, Education and Humanities Research, 555: 121 – 128

Lai, E.R. (2011) Critical Thinking: A Literature Review. Pearson. http://images.pearsonassessments.com/images/tmrs/CriticalThinkingReviewFINAL.pdf

Sweller, J. (2022) Some Critical Thoughts about Critical and Creative Thinking. Sydney: The Centre for Independent Studies Analysis Paper 32

Tricot, A. & Sweller, J. (2014) Domain-specific knowledge and why teaching generic skills does not work. Educational Psychology Review, 26: 265 – 283

In the latest issue of ‘Language Teaching’, there’s a ‘state-of-the-art’ article by Frank Boers entitled ‘Glossing and vocabulary learning’. The effect of glosses (‘a brief definition or synonym, either in L1 or L2, which is provided with [a] text’ (Nation, 2013: 238)) on reading comprehension and vocabulary acquisition has been well researched over the years. See Kim et al. (2020) for just one recent meta-analysis.

It’s a subject I have written about before on this blog (see here), when I focussed on Plonsky and Ziegler (2016), a critical evaluation of a number of CALL meta-analyses, including a few that investigated glosses. Plonsky and Ziegler found that glosses can have a positive effect on language learning, that digital glosses may be more valuable than paper-based ones, and that both L1 and L2 glosses can be beneficial (clearly, the quality / accuracy of the gloss is as important as the language it is written in). Different learners have different preferences. Boers’ article covers similar ground, without, I think, adding any new takeaways. It concludes with a predictable call for further research.

Boers has a short section on the ‘future of glossing’ in which he notes that (1) ‘onscreen reading [is] becoming the default mode’, and (2) that ‘materials developers no longer need to create glosses themselves, but can insert hyperlinks to online resources’. This is not the future, but the present. In my last blog post on glossing (August 2017), I discussed Lingro, a digital dictionary tool that you can have running in the background, allowing you to click on any word on any website and bring up L1 or L2 glosses. My reservation about Lingro was that the quality of the glosses left much to be desired, relying as they did on Wiktionary. Things would be rather different if it used decent content – sourced, for example, from Oxford dictionaries, Robert (for French) or Duden (for German).

And this is where the content for the Google Dictionary for Chrome extension comes from. It’s free, and takes only seconds to install. It allows you to double-click on a word to bring up translations or English definitions. One more click will take you to a more extensive dictionary page. It also allows you to select a phrase or longer passage and bring up translations generated by Google Translate. It allows you to keep track of the items you have looked up, and to download these on a spreadsheet, which can then be converted to flashcards (e.g. Quizlet) if you wish. If you use the Safari browser, a similar tool is already installed. It has similar features to the Google extension, but also offers you the possibility of linking to examples of the targeted word in web sources like Wikipedia.

Boers was thinking of the provision of hyperlinks, but with these browser extensions it is entirely up to the reader of a text to decide how many and which items to look up, what kind of items (single words, phrases or longer passages) they want to look up, how far they want to explore the information available to them, and what they want to do with the information (e.g. store / record it).

It’s extraordinary that a ‘state-of-the-art article’ in an extremely reputable journal should be so out of date. The value of glossing in language learning is in content-focussed reading, and these tools mean that any text on the web can be glossed. I think this means that further research of the kind that Boers calls for would be a waste of time and effort. The availability of free technology does not, of course, solve all our problems. Learners will continue to benefit from guidance, support and motivation in selecting appropriate texts to read. They will likely benefit from training in optimal ways of using these browser extensions. They may need help in finding a balance between content-focussed reading and content-focussed reading with a language learning payoff.

References

Boers, F. (2022). Glossing and vocabulary learning. Language Teaching, 55 (1), 1 – 23

Kim, H.S., Lee, J.H. & Lee, H. (2020). The relative effects of L1 and L2 glosses on L2 learning: A meta-analysis. Language Teaching Research. December 2020.

Nation, I.S.P. (2013). Learning Vocabulary in Another Language. Cambridge: Cambridge University Press

Plonsky, L. & Ziegler, N. (2016). The CALL–SLA interface: insights from a second-order synthesis. Language Learning & Technology 20 / 2: 17 – 37

The world of language learning and teaching is full of theoretical constructs and claims, most of which have their moment of glory in the sun before being eclipsed and disappearing from view. In a recent article looking at the theoretical claims of translanguaging enthusiasts, Jim Cummins (2021) suggests that three criteria might be used to evaluate them:

1 Empirical adequacy – to what extent is the claim consistent with all the relevant empirical evidence?

2 Logical coherence – to what extent is the claim internally consistent and non-contradictory?

3 Consequential validity – to what extent is the claim useful in promoting effective pedagogy and policies?

Take English as a Lingua Franca (ELF), for example. In its early days, there was much excitement about developing databases of ELF usage in order to identify those aspects of pronunciation and lexico-grammar that mattered for intercultural intelligibility. The Lingua Franca Core (a list of pronunciation features that are problematic in ELF settings when ELF users mix them up) proved to be the most lasting product of the early empirical research into ELF (Jenkins, 2000). It made intuitive good sense, was potentially empowering for learners and teachers, was clearly a useful tool in combating native-speakerism, and was relatively easy to implement in educational policy and practice.

But problems with the construct of ELF quickly appeared. ELF was a positive reframing of the earlier notion of interlanguage – an idea that had deficit firmly built in, since interlanguage was a point that a language learner had reached somewhere on the way to being like a native-speaker. Interlanguage contained elements of the L1, and this led to interest in how such elements might become fossilized, a metaphor with very negative connotations. With a strong desire to move away from framings of deficit, ELF recognised and celebrated code-switching as an integral element in ELF interactions (Seidlhofer, 2011: 105). Deviations from idealised native-speaker norms of English were no longer to be seen as errors in need of correction, but as legitimate forms of the language (of ELF) itself.

However, it soon became clear that it was not possible to describe ELF in terms of the particular language forms that its users employed. In response, ELF researchers reframed ELF. The focus shifted to how people of different language backgrounds used English to communicate in particular situations – how they languaged, in other words. ELF was no longer a thing, but an action. This helped in terms of internal consistency, but most teachers remained unclear about how the ELF.2 insight should impact on their classroom practices. If we can’t actually say what ELF looks like, what are teachers supposed to do with the idea? And much as we might like to wish away the idea of native speakers (and their norms), these ideas are very hard to expunge completely (MacKenzie, 2014: 170).

Twenty years after ELF became widely used as a term, ELF researchers lament the absence of any sizable changes in classroom practices (Bayyurt & Dewey, 2020). There are practices that meet the ELF seal of approval (see, for example, Kiczkowiak & Lowe, 2018), and these include an increase in exposure to the diversity of English use worldwide, engagement in critical classroom discussion about the globalisation of the English language, and non-penalisation of innovative, but intelligible forms (Galloway, 2018: 471). It is, however, striking that these practices long pre-date the construct of ELF. They are not direct products of ELF.

Part of the ‘problem’, as ELF researchers see it, has been that ELF has been so hard to define. Less generously, we might suggest that the construct of ELF was flawed from the start. Useful, no doubt, as a heuristic, but time to move on. Jennifer Jenkins, one of the most well-known names in ELF, has certainly not been afraid to move on. Her article (Jenkins, 2015) refines ELF.2 into ELF.3, which she now labels as ‘English as a Multilingual Franca’. In this reframed model, ELF is not so much concerned with the difference between native speakers and non-native speakers, as with the difference between monolinguals and multilinguals. Multilingual, rather than ‘English’, is now the superordinate attribute. Since ELF is now about interactions, rather than ELF as a collection of forms, it follows, in ELF.3, that ELF may not actually contain any English forms at all. There is a logic here, albeit somewhat convoluted, but there is also a problem for ELF as a construct. If ELF is fundamentally about multilingual communication, what need is there for the term ‘ELF’? ‘Translanguaging’ will do perfectly well instead. The graph from Google Trends reveals the rises and falls of these two terms in the academic discourse space. After peaking in 2008, the term ‘English as a Lingua Franca’ now appears to be in irreversible decline.

So, let’s now turn to ‘translanguaging’. What do Cummins, and others, have to say about the construct? The word has not been around for long. Most people trace it back to the end of the last century (Baker, 2001) and a set of bilingual pedagogical practices in the context of Welsh-English bilingual programmes intended to revitalise the Welsh language. In the early days, translanguaging was no more than a classroom practice that allowed or encouraged the use (by both learners and teachers) of more than one language for the purposes of study. The object of study might be another language, or it might be another part of the curriculum. When I wrote a book about the use of L1 in the learning and teaching of English (Kerr, 2014), I could have called it ‘Translanguaging Activities’, but the editors and I felt that the word ‘translanguaging’ might be seen as obscure jargon. I defined the word at the time as ‘similar to code-switching, the process of mixing elements from two languages’.

But obscure jargon no longer. There is, for example, a nice little collection of activities that involve L1 for the EFL / ESL classroom put together by Jason Anderson http://www.jasonanderson.org.uk/downloads/Jasons_ideas_for_translanguaging_in_the_EFL_ESL_classroom.pdf that he has chosen to call ‘Ideas for translanguaging’. In practical terms, there’s nothing here that you might not have found twenty or more years ago (e.g. in Duff, 1989; or Deller & Rinvolucri, 2002), long before anyone started using the word ‘translanguaging’. Anderson’s motivation for choosing the word ‘translanguaging’ is that he hopes it will promote a change of mindset in which a spirit of (language) inclusivity prevails (Anderson, 2018). Another example: the different ways that L1 may be used in a language classroom have recently been investigated by Rabbidge (2019) in a book entitled ‘Translanguaging in EFL Contexts’. Rabbidge offers a taxonomy of translanguaging moments. These are a little different from previous classifications (e.g. Ellis, 1994; Kim & Elder, 2005), but only a little. The most significant novelty is that these moments are now framed as ‘translanguaging’, rather than as ‘use of L1’. Example #3: the most well-known and widely-sold book that offers practical ideas that are related to translanguaging is ‘The Translanguaging Classroom’ by García and colleagues (2017). English language teachers working in EFL / ESL / ESOL contexts are unlikely to find much, if anything, new here by way of practical ideas. What they will find, however, is a theoretical reframing. It is the theoretical reframing that Anderson and Rabbidge draw their inspiration from.

The construct of translanguaging, then, like English as a Lingua Franca, has brought little that is new in practical terms. Its consequential validity does not really need to be investigated, since the pedagogical reasons for some use of other languages in the learning / teaching of English were already firmly established (but not, perhaps, widely accepted) a long time ago. How about the theory? Does it stand up to closer scrutiny any better than ELF?

Like ELF, ‘translanguaging’ is generally considered not to be a thing, but an action. And, like ELF, it has a definition problem, so precisely what kind of action this might be is open to debate. For some, it isn’t even an action: Tian et al. (2020: 4) refer to it as ‘more like an emerging perspective or lens that could provide new insights to understand and examine language and language (in) education’. Its usage bounces around from user to user, each of whom may appropriate it in different ways. It is in competition with other terms including translingual practice, multilanguaging, and plurilingualism (Li, 2018). It is what has been called a ‘strategically deployable shifter’ (Moore, 2015). It is also unquestionably a word that sets a tone, since ‘translanguaging’ is a key part of the discourse of multilingualism / plurilingualism, which is in clear opposition to the unfavourable images evoked by the term ‘monolingualism’, often presented as a methodological mistake or a kind of subjectivity gone wrong (Gramling, 2016: 4). ‘Translanguaging’ has become a hooray word: criticize it at your peril.

What started as a classroom practice has morphed into a theory (Li, 2018; García, 2009), one that is and is likely to remain unstable. The big questions centre around the difference between ‘strong translanguaging’ (a perspective that insists that ‘named languages’ are socially constructed and have no linguistic or cognitive reality) and ‘weak translanguaging’ (a perspective that acknowledges boundaries between named languages but seeks to soften them). There are discussions, too, about what to call these forms of translanguaging. The ‘strong’ version has been dubbed by Cummins (2021) ‘Unitary Translanguaging Theory’ and by Bonacina-Pugh et al. (2021) ‘Fluid Languaging Approach’. Corresponding terms for the ‘weak’ version are ‘Crosslinguistic Translanguaging Theory’ and ‘Fixed Language Approach’. Subsidiary, related debates centre around code-switching: is it a form of translanguaging or is it a construct better avoided altogether since it assumes separate linguistic systems (Cummins, 2021)?

It’s all very confusing. Cenoz and Gorter (2021) in their short guide to pedagogical translanguaging struggle for clarity, but fail to get there. They ‘completely agree’ with García about the fluid nature of languages as ‘social constructs’ with ‘no clear-cut boundaries’, but still consider named languages as ‘distinct’ and refer to them as such in their booklet. Cutting your way through this thicket of language is a challenge, to put it mildly. It’s also probably a waste of time. As Cummins (2021: 16) notes, the confusion is ‘completely unnecessary’ since ‘there is no difference in the instructional practices that are implied by so-called strong and weak versions of translanguaging’. There are also more important questions to investigate, not least the extent to which the approaches to multilingualism developed by people like García in the United States are appropriate or effective in other contexts with different values (Jaspers, 2018; 2019).

The monolingualism that both ELF and translanguaging stand in opposition to may be a myth, a paradigm or a pathology, but, whatever it is, it is deeply embedded in the ways that our societies are organised, and the ways that we think. It is, writes David Gramling (2016: 3), ‘clearly not yet inclined to be waved off the stage by a university professor, nor even by a ‘multilingual turn’.’ In the end, ELF failed to have much impact. It’s time for translanguaging to have a turn. So, out with the old, in with the new. Or perhaps not really all that new at all.

The king is dead. Long live the king and a happy new year!

References

Anderson, J. (2018) Reimagining English language learners from a translingual perspective. ELT Journal 72 (1): 26 – 37

Baker, C. (2001) Foundations of Bilingual Education and Bilingualism, 3rd edn. Bristol: Multilingual Matters

Bayyurt, Y. & Dewey, M. (2020) Locating ELF in ELT. ELT Journal, 74 (4): 369 – 376

Bonacina-Pugh, F., Da Costa Cabral, I. & Huang, J. (2021) Translanguaging in education. Language Teaching, 54 (4): 439 – 471

Cenoz, J. & Gorter, D. (2021) Pedagogical Translanguaging. Cambridge: Cambridge University Press

Cummins, J. (2021) Translanguaging: A critical analysis of theoretical claims. In Juvonen, P. & Källkvist, M. (Eds.) Pedagogical Translanguaging: Theoretical, Methodological and Empirical Perspectives. Bristol: Multilingual Matters pp. 7 – 36

Deller, S. & Rinvolucri, M. (2002) Using the Mother Tongue. Peaslake, Surrey: Delta

Duff, A. (1989) Translation. Oxford: OUP

Ellis, R. (1994) Instructed Second Language Acquisition. Oxford: OUP

Galloway, N. (2018) ELF and ELT Teaching Materials. In Jenkins, J., Baker, W. & Dewey, M. (Eds.) The Routledge Handbook of English as a Lingua Franca. Abingdon, Oxon.: Routledge, pp. 468 – 480.

García, O., Ibarra Johnson, S. & Seltzer, K. (2017) The Translanguaging Classroom. Philadelphia: Caslon

García, O. (2009) Bilingual Education in the 21st Century: A Global Perspective. Malden / Oxford: Wiley / Blackwell

Gramling, D. (2016) The Invention of Monolingualism. New York: Bloomsbury

Jaspers, J. (2019) Authority and morality in advocating heteroglossia. Language, Culture and Society, 1 (1): 83 – 105

Jaspers, J. (2018) The transformative limits of translanguaging. Language & Communication, 58: 1 – 10

Jenkins, J. (2000) The Phonology of English as an International Language. Oxford: Oxford University Press

Jenkins, J. (2015) Repositioning English and multilingualism in English as a lingua franca. Englishes in Practice, 2 (3): 49 – 85

Kerr, P. (2014) Translation and Own-language Activities. Cambridge: Cambridge University Press

Kiczkowiak, M. & Lowe, R. J. (2018) Teaching English as a Lingua Franca. Stuttgart: Delta

Kim, S.-H. & Elder, C. (2005) Language choices and pedagogical functions in the foreign language classroom: A cross-linguistic functional analysis of teacher talk. Language Teaching Research, 9 (4): 355 – 380

Li, W. (2018) Translanguaging as a Practical Theory of Language. Applied Linguistics, 39 (1): 9 – 30

MacKenzie, I. (2014) English as a Lingua Franca. Abingdon, Oxon.: Routledge

Moore, R. (2015) From Revolutionary Monolingualism to Reactionary Multilingualism: Top-Down Discourses of Linguistic Diversity in Europe, 1794 – present. Language and Communication, 44: 19 – 30

Rabbidge, M. (2019) Translanguaging in EFL Contexts. Abingdon, Oxon.: Routledge

Seidlhofer, B. (2011) Understanding English as a Lingua Franca. Oxford: OUP

Tian, Z., Aghai, L., Sayer, P. & Schissel, J. L. (Eds.) (2020) Envisioning TESOL through a translanguaging lens: Global perspectives. Cham, CH: Springer Nature.

We need to talk

Posted: December 13, 2021 in Discourse, research

In 1994, in a well-known TESOL Quarterly article entitled ‘The dysfunctions of theory/practice discourse’, Mark A. Clarke explored the imbalance in the relationship between TESOL researchers and English language teachers, and the way in which the former typically frame the latter as being less expert than themselves. In the years since, the topic has regularly resurfaced, most recently in the latest issue of the Modern Language Journal, a special issue devoted entirely to ‘the Research-Practice Dialogue in Second Language Learning and Teaching’ (Sato & Loewen, 2022). At the heart of the matter is the fact that most teachers are just not terribly interested in research and rarely, if ever, read it (Borg, 2009). Much has been written on whether or not this matters, but that is not my concern here.

Sato and Loewen’s introductory article reviews the reasons behind the lack of dialogue between researchers and teachers, and, in an unintentionally comic meta move, argues that more research is needed into teachers’ lack of interest in research. This is funny because one of the reasons for a lack of dialogue between researchers and teachers is that ‘teachers have been researched too much ON and not enough WITH’ (Farrell, 2016): most research has not been carried out ‘for the teacher’s benefit and needs’, with the consequence that ‘the result is purely academic’. Sato and Loewen’s primary focus in the article is on ‘classroom teachers’, with whom they would like to see more ‘dialogue’, but, as they acknowledge, they publish in a research journal whose ‘most likely readers are researchers’. They do not appear to have read Alan Maley’s ‘‘More Research is Needed’ – A Mantra Too Far?’ (Maley, 2016). Perhaps the article (and the Humanising Language Teaching magazine it is from) passed under their radar because it’s written for teachers (not researchers), it’s free and does not have an impact factor?

I wasn’t entirely convinced by the argument that more research about research is needed, not least because Sato and Loewen provide a fairly detailed analysis of the obstacles that exist to dialogue between researchers and teachers. They divide these into two categories:

Epistemological obstacles: the framing of researchers as generators of knowledge and teachers as consumers of knowledge; teachers’ scepticism about the relevance of some research findings to real-world teaching situations; the different discourse communities inhabited by researchers and teachers, as evidenced by the academic language choices of the former.

Practical obstacles: institutional expectations for researchers to publish in academic journals and a lack of time for researchers to engage in dialogue with teachers; teachers’ lack of time and lack of access to research.

Nothing new here, nothing contentious, either. Nothing new, either, in their argument that more dialogue between researchers and teachers would be of mutual benefit. They acknowledge that ‘In the current status of the research-practice relationship, it is researchers who are concerned about transferability of their findings to classrooms. Practitioners may not have burning motivation or urgent needs to reach out to researchers’. Consequently, it is researchers who should ‘shoulder the lion’s share of responsibility in this initiative’. This implies that, while the benefit could be mutual, it is not mutually proportionate, since researchers have both more to lose and more to gain.

They argue that it would be helpful to scrutinize closely the relationship between researchers and teachers (they prefer to use the word ‘practitioners’) and that researchers need to reflect on their own beliefs and practices, in particular the way that researchers are stakeholders in the research-practice relationship. I was disappointed that they didn’t go into more detail here and would like to suggest one angle of the ‘cui bono’ question worth exploring. The work of TESOL researchers is mostly funded by TESOL teaching. It is funded, in other words, by selling a commodity – TESOL – to a group of consumers … who are teachers. If we frame researchers as vendors and teachers as (potential) clients, [1] a rather different light is shone on pleas for more dialogue.

The first step, Sato and Loewen claim, towards achieving such a dialogue would be ‘nurturing a collaborative mindset in both researchers and teachers’. And the last of four steps to removing the obstacles to dialogue would be ‘institutional support’ for both teachers and researchers. But without institutional support, mindsets are unlikely to become more collaborative, and the suggestions for institutional support (e.g. time release and financial support for teachers) are just pie in the sky. Perhaps sensing this, Sato and Loewen conclude the article by asking whether their desire to see a more collaborative mindset (and, therefore, more dialogue) is just a dream. Back in 1994, Mark Clarke had this to say:

The only real solution to the problems I have identified would be to turn the hierarchy on its head, putting teachers on the top and arraying others-pundits, professors, administrators, researchers, and so forth-below them. This would require a major change in our thinking and in our behavior and, however reasonable it may appear to be, I do not see this happening. (Clarke, 1994: 18)

In 2017, ELT Journal published an entertaining piece of trolling by Péter Medgyes, ‘The (ir)relevance of academic research for the language teacher’, in which he refers to the expert status of researchers as related to the ‘orthodox and mistaken belief that by virtue of having churned out tons of academic papers and books, they must be performing an essential service for language education’. It is not hard to imagine the twinkle in his eye as he wrote it. In the same volume, Amos Paran (2017) rises to the bait, arguing for more dialogue between researchers and teachers. In response to Paran’s plea, Medgyes points out that there is an irony in preaching the importance of dialogue in a top-down manner. ‘As long as the playing field is uneven, it is absurd to talk about dialogue, if a dialogue is at all necessary’, he writes. The same holds true for Sato and Loewen. They acknowledge (Sato & Loewen, 2018) that ‘researchers’ top-down attitudes will not facilitate the dialogue’, but, try as they might, their own mindset is seemingly inescapable. In one article that was attempting to reach out to teachers (Sato, Loewen & Kim, 2021), they managed to make one teacher trainer, Sandy Millin, feel that teachers were being unfairly attacked.

The phrase ‘We need to talk’ has been described as, perhaps, the most dreaded four words in the English language. When you hear it, you know (1) that someone wants to talk to you (and not the other way round), (2) that, whether you want to talk or not, the other person will have their say, (3) that the discussion will almost certainly involve some criticism of you, and this may be merited, and (4) whatever happens next, it is unlikely that your relationship will improve.

References

Borg, S. (2009). English language teachers’ conceptions of research. Applied Linguistics, 30 (3): 358 – 88

Clarke, M. (1994). The dysfunctions of theory/practice discourse. TESOL Quarterly, 28: 9-26.

Farrell, T. (2016). Reflection, reflection, reflection. Responses to the Chapter:  More Research is Needed – A Mantra Too Far? Humanising Language Teaching, 18 (3) http://old.hltmag.co.uk/jun16/mart.htm

Maley, A. (2016). ‘More Research is Needed’ – A Mantra Too Far? Humanising Language Teaching, 18 (3) http://old.hltmag.co.uk/jun16/mart.htm

Medgyes, P. (2017). The (ir)relevance of academic research for the language teacher. ELT Journal, 71 (4): 491–498

Paran, A. (2017). ‘Only connect’: researchers and teachers in dialogue. ELT Journal, 71 (4): 499 – 508

Sato, M., & Loewen, S. (2022). The research-practice dialogue in second language learning and teaching: Past, present, and future. The Modern Language Journal, 106 (3)

Sato, M. & Loewen, S. (Eds.) (2019) Evidence-Based Second Language Pedagogy. New York: Routledge

Sato, M. & Loewen, S. (2018). Do teachers care about research? The research–pedagogy dialogue. ELT Journal 73 (1): 1 – 10

Sato, M., Loewen, S. & Kim, Y. J. (2021) The role and value of researchers for teachers: five principles for mutual benefit. TESOL AL Forum September 2021. http://newsmanager.commpartners.com/tesolalis/issues/2021-08-30/email.html#4


[1] I am actually a direct customer of Sato and Loewen, having bought for £35 last year a copy of their edited volume ‘Evidence-Based Second Language Pedagogy’. According to the back cover, it is a ‘cutting-edge collection of empirical research [which closes] the gap between research and practice’. In reality, it’s a fairly random collection of articles of very mixed quality, many of which are co-authored by ‘top scholars’ and the PhD students they are supervising. It does nothing to close any gaps between research and practice and I struggle to see how it could be of any conceivable benefit to teachers.

I’ve written about mindset before (here), but a recent publication caught my eye, and I thought it was worth sharing.

Earlier this year, the OECD produced a report on its 2018 PISA assessments. This was significant because it was the first time that the OECD had attempted to measure mindsets and correlate them with academic achievement. Surveying some 600,000 15-year-old students in 78 countries and economies, it is, to date, the biggest, most global attempt to study the question. Before going any further, a caveat is in order. The main focus of PISA 2018 was on reading, so any correlations that are found between mindsets and achievement can only be interpreted in the context of gains in reading skills. This is important to bear in mind, as previous research indicates that mindsets may have different impacts on different school subjects.

There has been much debate about how best to measure mindsets and, indeed, whether they can be measured at all. The OECD approached the question by asking students to respond to the statement ‘Your intelligence is something about you that you can’t change very much’ by choosing “strongly disagree”, “disagree”, “agree”, or “strongly agree”. Disagreeing with the statement was considered a precursor of a growth mindset, as it is more likely that someone who thinks intelligence can change will challenge him/herself to improve it. Across the sample, almost two-thirds of students showed a growth mindset, but there were big differences between countries, with students in Estonia, Denmark, and Germany being much more growth-oriented than those in Greece, Mexico or Poland (among OECD countries) and the Philippines, Panama, Indonesia or Kosovo (among the non-OECD countries). In line with previous research, students from socio-economically advantaged backgrounds presented a growth mindset more often than those from socio-economically disadvantaged backgrounds.

I have my problems with the research methodology. A 15-year-old from a wealthy country is much more likely than peers in other countries to have experienced mindset interventions in school: motivational we-can-do-it posters, workshops on neuroplasticity, biographical explorations of success stories and the like. In some places, some students have been so exposed to this kind of thing that school leaders have realised that growth mindset interventions should be much more subtle, avoiding the kind of crude, explicit proselytising that simply makes many students roll their eyes. In contexts such as these, most students now know what they are supposed to believe concerning the malleability of intelligence, irrespective of what they actually believe. Asking them, in a formal context, to respond to statements which are obviously digging at mindsets is therefore an invitation to provide what they know is the ‘correct response’. Others, who have not been so fortunate in receiving mindset training, are less likely to know the correct answer. The research results, then, probably tell us as much about educational practices as they do about mindsets. There are other issues with the chosen measurement tool, discussed in the report, including acquiescent bias and the fact that the cognitive load required by the question increases the likelihood of a random response. Still, let’s move on.

The report found that growth mindsets correlated with academic achievement in some (typically wealthier) countries, but not in others. Wisely, the report cautions that the findings do not establish cause-and-effect relations. This is wise because a growth mindset may, to some extent, be the result of academic success, rather than the cause. As the report observes, students performing well may attribute their success to internal characteristics of effort and perseverance, while those performing poorly may attribute it to immutable characteristics to preserve their self-esteem.

However, the report does list the ways in which a growth mindset can lead to better achievement. These include valuing school more, setting more ambitious learning goals, higher levels of self-efficacy, higher levels of motivation and lower levels of fear of failure. This is a very circular kind of logic. These attributes are the attributes of growth mindset, but are they the results of a growth mindset or simply the constituent parts of it? Incidentally, they were measured in the same way as the measurement of mindset, by asking students to respond to statements like “I find satisfaction in working as hard as I can” or “My goal is to learn as much as possible”. The questions are so loaded that we need to be very sceptical about the meaning of the results. The concluding remarks to this section of the report clearly indicate the bias of the research. The question that is asked is not “Can growth mindset lead to better results?” but “How can growth mindset lead to better results?”

Astonishingly, the research did not investigate the impact of growth mindset interventions in schools on growth mindset. Perhaps, this is too hard to do in any reliable way. After all, what counts as a growth mindset intervention? A little homily from the teacher about how we can all learn from our mistakes or some nice posters on the walls? Or a more full-blooded workshop about neural plasticity with follow-up tasks? Instead, the research investigated more general teaching practices. The results were interesting. The greatest impacts on growth mindset come when students perceive their teachers as being supportive in a safe learning environment, and when teachers adapt their teaching to the needs of the class, as opposed to simply following a fixed syllabus. The findings about teacher feedback were less clear: “Whether teacher feedback influences students’ growth mindset development or the other way around, further research is required to investigate this relationship, and why it could differ according to students’ proficiency in reading”.

The final chapter of this report does not include any references to data from the PISA 2018 exercise. Instead, it repeats, in a very selective way, previous research findings such as:

  • Growth mindset interventions yield modest average treatment effects, but larger effects for specific subgroups.
  • Growth-mindset interventions fare well in both scalability and cost-effectiveness dimensions.

It ignores any discussion about whether we should be bothering with growth mindsets at all. It tells us something we already know (about the importance of teacher support and adapting teaching to the needs of the class), but somehow concludes that “growth mindset interventions […] can be cost-effective ways to raise students’ outcomes on a large scale”. It is, to my mind, a classic example of ‘research’ that is looking to prove a point, rather than critically investigate a phenomenon. In that sense, it is the very opposite of science.

OECD (2021) Sky’s the Limit: Growth Mindset, Students, and Schools in PISA. https://www.oecd.org/pisa/growth-mindset.pdf

‘Pre-teaching’ (of vocabulary) is a widely-used piece of language teaching jargon, but it’s a strange expression. The ‘pre’ indicates that it’s something that comes before something else that is more important, what Chia Suan Chong calls ‘the main event’, which is usually some reading or listening work. The basic idea, it seems, is to lessen the vocabulary load of the subsequent activity. If the focus on vocabulary were the ‘main event’, we might refer to the next activity as ‘post-reading’ or ‘post-listening’ … but we never do.

The term is used in standard training manuals by both Jim Scrivener (2005: 230 – 233) and Jeremy Harmer (2012: 137) and, with a few caveats, the practice is recommended. Now read this from the ELT Nile Glossary:

For many years teachers were recommended to pre-teach vocabulary before working on texts. Nowadays though, some question this, suggesting that the contexts that teachers are able to set up for pre-teaching are rarely meaningful and that pre-teaching in fact prevents learners from developing the attack strategies they need for dealing with challenging texts.

Chia is one of those doing this questioning. She suggests that ‘we cut out pre-teaching altogether and go straight for the main event. After all, if it’s a receptive skills lesson, then shouldn’t the focus be on reading/listening skills and strategies? And most importantly, pre-teaching prevents learners from developing a tolerance of ambiguity – a skill that is vital in language learning.’ Scott Thornbury is another who has expressed doubts about the value of PTV, although he is more circumspect in his opinions. He has argued that working out the meaning of vocabulary from context is probably a better approach and that PTV inadequately prepares learners for the real world. If we have to pre-teach, he argues, get it out of the way ‘as quickly and efficiently as possible’ … or ‘try post-teaching instead’.

Both Chia and Scott touch on the alternatives, and guessing the meaning of unknown words from context is one of them. I’ve discussed this area in an earlier post. Not wanting to rehash the content of that post here, the simple summary is this: it’s complicated. We cannot, with any degree of certainty, say that guessing meaning from context leads to more gains in either reading / listening comprehension or vocabulary development than PTV or one of the other alternatives – encouraging / allowing monolingual or bilingual dictionary look up (see this post on the topic), providing a glossary (see this post) or doing post-text vocabulary work.

In attempting to move towards a better understanding, the first problem is that there is very little research into the relationship between PTV and improved reading / listening comprehension. What there is (e.g. Webb, 2009) suggests that pre-teaching can improve comprehension and speed up reading, but there are other things that a teacher can do (e.g. previous presentation of comprehension questions or the provision of pictorial support) that appear to lead to more gains in these areas (Pellicer-Sánchez et al., 2021). It’s not exactly a ringing endorsement. There is even less research looking at the relationship between PTV and vocabulary development. What there is (Pellicer-Sánchez et al., 2021) suggests that pre-teaching leads to more vocabulary gains than when learners read without any support. But the reading-only condition is unlikely in most real-world learning contexts, where there is a teacher, dictionary or classmate who can be turned to. A more interesting contrast is perhaps between PTV and during-reading vocabulary instruction, which is a common approach in many classrooms. One study (File & Adams, 2010) looked at precisely this area and found little difference between the approaches in terms of vocabulary gains. The limited research does not provide us with any compelling reasons either for or against PTV.

Another problem is, as usual, that the research findings often imply more than was actually demonstrated. The abstract for the study by Pellicer-Sánchez et al (2021) states that pre‐reading instruction led to more vocabulary learning. But this needs to be considered in the light of the experimental details.

The study involved 87 L2 undergraduates and postgraduates studying at a British university. Their level of English was therefore very high, and we can’t really generalise to other learners at other levels in other conditions. The text that they read contained a number of pseudo-words and was 2,290 words long. The text itself, a narrative, was of no intrinsic interest, so the students reading it would treat it as an object of study and they would notice the pseudo-words, because their level of English was already high, and because they knew that the focus of the research was on ‘new words’. In other words, the students’ behaviour was probably not at all typical of a student in a ‘normal’ classroom. In addition, the pseudo-words were all Anglo-Saxon looking, and not therefore representative of the kinds of unknown items that students would encounter in authentic (or even pedagogical) texts (which would have a high proportion of words with Latin roots). I’m afraid I don’t think that the study tells us anything of value.

Perhaps research into an area like this, with so many variables that need to be controlled, is unlikely ever to provide teachers with clear answers to what appears to be a simple question: is PTV a good idea or not? However, I think we can get closer to something resembling useful advice if we take another tack. For this, I think two additional questions need to be asked. First, what is the intended main learning opportunity (note that I avoid the term ‘learning outcome’!) of the ‘main event’ – the reading or listening? Second, following on from the first question, what is the point of PTV, i.e. in what ways might it contribute to enriching the learning opportunities of the ‘main event’?

To answer the first question, I think it is useful to go back to a distinction made almost forty years ago in a paper by Tim Johns and Florence Davies (1983). They contrasted the Text as a Linguistic Object (TALO) with the Text as a Vehicle for Information (TAVI). The former (TALO) is something that language students study to learn language from in a direct way. It has typically been written or chosen to illustrate and to contextualise bits of grammar, and to provide opportunities for lexical ‘quarrying’. The latter (TAVI) is a text with intrinsic interest, read for information or pleasure, and therefore more appropriately selected by the learner, rather than the teacher. For an interesting discussion on TALO and TAVI, see this 2015 post from Geoff Jordan.

Johns and Davies wrote their article in pre-Headway days when texts in almost all coursebooks were unashamedly TALOs, and when what were called top-down reading skills (reading for gist / detail, etc.) were only just beginning to find their way into language teaching materials. TAVIs were separate – graded readers, for example. In some parts of the world, TALOs and TAVIs are still separate, often with one teacher dealing with the teaching of discrete items of language through TALOs, and another responsible for ‘skills development’ through TAVIs. But, increasingly, under the influence of British publishers and methodologists, attempts have been made to combine TALOs and TAVIs in a single package. The syllabuses of most contemporary coursebooks, fundamentally driven by a discrete-item grammar plus vocabulary approach, also offer a ‘skills’ strand which requires texts to be intrinsically interesting, meaningful and relevant to today’s 21st century learners. The texts are required to carry out two functions.

Recent years have seen an increasingly widespread questioning of this approach. Does the exploitation of reading and listening texts in coursebooks (mostly through comprehension questions) actually lead to gains in reading and listening skills? Is there anything more than testing of comprehension going on? Or do they simply provide practice in strategic approaches to reading / listening, strategies which could probably be transferred from L1? As a result of the work of scholars like William Grabe (reading) and John Field and Richard Cauldwell (listening), there is now little, if any, debate in the world of research about these questions. If we want to develop the reading / listening skills of our students, the approach of most coursebooks is not the way to go about it. For a start, the reading texts are usually too short and the listening texts too long.

Most texts that are found in most contemporary coursebooks are TALOs dressed up to look like TAVIs. Their fundamental purpose is to illustrate and contextualise language that has either been pre-taught or will be explored later. They are first and foremost vehicles for language, and only secondarily vehicles for information. They are written and presented in as interesting a way as possible in order to motivate learners to engage with the TALO. Sometimes, they succeed.

However, there are occasions (even in coursebooks) when texts are TAVIs – used for purely ‘skills’ purposes, language use as opposed to language study. Typically, they (reading or listening texts) are used as springboards for speaking and / or writing practice that follows. It’s the information in the text that matters most.

So, where does all this take us with PTV? Here is my attempt at a break-down of advice.

1 TALOs where the text contains a set of new lexical items which are a core focus of the lesson

If the text is basically a contextualized illustration of a set of lexical items (and, usually, a particular grammatical structure), there is a strong case for PTV. This is, of course, assuming that these items are of sufficiently high frequency to be suitable candidates for direct vocabulary instruction. If this is so, there is also a strong case to be made for the PTV to be what has been called ‘rich instruction’, which ‘involves (1) spending time on the word; (2) explicitly exploring several aspects of what is involved in knowing a word; and (3) involving learners in thoughtfully and actively processing the word’ (Nation, 2013: 117). In instances like this, PTV is something of a misnomer. It’s just plain teaching, and is likely to need as much, or more, time than exploration of the text (which may be viewed as further practice of / exposure to the lexis).

If the text is primarily intended as lexical input, there is also a good case to be made for making the target items it contains more salient by, for example, highlighting them or putting them in bold (Choi, 2017). At the same time, if ‘PTV’ is to lead to lexical gains, these are likely to be augmented by post-reading tasks which also focus explicitly on the target items (Sonbul & Schmitt, 2010).

2 TALOs which contain a set of lexical items that are necessary for comprehension of the text, but not a core focus of the lesson (e.g. because they are low-frequency)

PTV is often time-consuming, and necessarily so if the instruction is rich. If it is largely restricted to matching items to meanings (e.g. through translation), it is likely to have little impact on vocabulary development, and its short-term impact on comprehension appears to be limited. Research suggests that the use of a glossary is more efficient, since learners will only refer to it when they need to (whereas PTV is likely to devote some time to some items that are known to some learners, and this takes place before the knowledge is required … and may therefore be forgotten in the interim). Glossaries lead to better comprehension (Alessi & Dwyer, 2008).

3 TAVIs

I don’t have any principled objection to the occasional use of texts as TALOs, but it seems fairly clear that a healthy textual diet for language learners will contain substantially more TAVIs than TALOs, substantially more extensive reading than intensive reading of the kind found in most coursebooks. If we focused less often on direct instruction of grammar (a change of emphasis which is long overdue), there would be less need for TALOs, anyway. With TAVIs, there seems to be no good reason for PTV: glossaries or digital dictionary look-up will do just fine.

However, one alternative justification and use of PTV is offered by Scott Thornbury. He suggests identifying a relatively small number of keywords from a text that will be needed for global understanding. Some of them may be unknown to the learners, and for these, learners use dictionaries to check meaning. Then, looking at the list of key words learners predict what the text will be about. The rationale here is that if learners engage with these words before encountering them in the text, it ‘may be an effective way of activating a learner’s schema for the text, and this may help to support comprehension’ (Ballance, 2018). However, as Ballance notes, describing this kind of activity as PTV would be something of a misnomer: it is a useful addition to a teacher’s repertoire of schema-activation activities (which might be used with both TAVIs and TALOs).

In short …

The big question about PTV, then, is not one of ‘yes’ or ‘no’. It’s about the point of the activity. Ballance (2018) offers a good summary:

‘In sum, for teachers to use PTV effectively, it is essential that they clearly identify a rationale for including PTV within a lesson, select the words to be taught in conjunction with this rationale and also design the vocabulary learning or development exercise in a manner that is commensurate with this rationale. The rationale should be the determining factor in the design of a PTV component within a lesson, and different rationales for using PTV naturally lead to markedly different selections of vocabulary items to be studied and different exercise designs.’

REFERENCES

Alessi, S. & Dwyer, A. (2008). Vocabulary assistance before and during reading. Reading in a Foreign Language, 20 (2): pp. 246 – 263

Ballance, O. J. (2018). Strategies for pre-teaching vocabulary in context. In The TESOL Encyclopedia of English Language Teaching (pp. 1-7). Wiley. https://doi.org/10.1002/9781118784235.eelt0732

Choi, S. (2017). Processing and learning of enhanced English collocations: An eye movement study. Language Teaching Research, 21, 403–426. https://doi.org/10.1177/1362168816653271

File, K. A. & Adams, R. (2010). Should vocabulary instruction be integrated or isolated? TESOL Quarterly, 44 (2): 222–249.

Harmer, J. (2012). Essential Teacher Knowledge. Harlow: Pearson

Johns, T. & Davies, F. (1983). Text as a vehicle for information: the classroom use of written texts in teaching reading in a foreign language. Reading in a Foreign Language, 1 (1): pp. 1 – 19

Nation, I. S. P. (2013). Learning Vocabulary in Another Language 2nd Edition. Cambridge: Cambridge University Press

Pellicer-Sánchez, A., Conklin, K. & Vilkaitė-Lozdienė, L. (2021). The effect of pre-reading instruction on vocabulary learning: An investigation of L1 and L2 readers’ eye movements. Language Learning, 0 (0), 0-0. https://onlinelibrary.wiley.com/doi/full/10.1111/lang.12430

Scrivener, J. (2005). Learning Teaching 2nd Edition. Oxford: Macmillan

Sonbul, S. & Schmitt, N. (2010). Direct teaching of vocabulary after reading: is it worth the effort? ELT Journal 64 (3): pp.253 – 260

Webb, S. (2009). The effects of pre‐learning vocabulary on reading comprehension and writing. The Canadian Modern Language Review, 65 (3): pp. 441–470.

A week or so ago, someone in the Macmillan marketing department took it upon themselves to send out this tweet. What grabbed my attention was the claim that it is ‘a well-known fact’ that teaching students a growth mindset makes them perform better academically over time. The easily demonstrable reality (which I’ll come on to) is that this is not a fact. It’s fake news, being used for marketing purposes. The tweet links to a blog post of over a year ago. In it, Chia Suan Chong offers five tips for developing a growth mindset in students: educating students about neuroplasticity, delving deeper into success stories, celebrating challenges and mistakes, encouraging students to go outside their comfort zones, and giving ‘growth-mindset-feedback’. All of which, she suggests, might help our students. Indeed, it might, and, even if it doesn’t, it might be worth a try anyway. Chia doesn’t make any claims beyond the potential of the suggested strategies, so I wonder where the Macmillan Twitter account person got the ‘well-known fact’.

If you google ‘mindset ELT’, you will find webpage after webpage offering tips about how to promote growth mindset in learners. It’s rare for the writers of these pages to claim that the positive effects of mindset interventions are a ‘fact’, but it’s even rarer to come across anyone who suggests that mindset interventions might be an à la mode waste of time and effort. Even in more serious literature (e.g. Mercer, S. & Ryan, S. (2010). A mindset for EFL: learners’ beliefs about the role of natural talent. ELT Journal, 64 (4): 436 – 444), the approach is fundamentally enthusiastic, with no indication that there might be a problem with mindset theory. Given that this enthusiasm is repeated so often, perhaps we should not blame the Macmillan tweeter for falling victim to the illusory truth effect. After all, it appears that 98% of teachers in the US feel that growth mindset approaches should be adopted in schools (Hendrick, 2019).

Chia suggests that we can all have fixed mindsets in certain domains (e.g. I know all about that, there’s nothing more I can learn). One domain where it seems that fixed mindsets are prevalent is mindset theory itself. This post is an attempt to nudge towards more ‘growth’ and, in trying to persuade you to be more sceptical, I will quote as much as possible from Carol Dweck, the founder of mindset theory, and her close associates.

Carol Dweck’s book ‘Mindset: The New Psychology of Success’ appeared in 2006. In it, she argued that people can be placed on a continuum between those who have ‘a fixed mindset–those who believe that abilities are fixed—[and who] are less likely to flourish [and] those with a growth mindset–those who believe that abilities can be developed’ (from the back cover of the updated (2007) version of the book). There was nothing especially new about the idea. It is very close to Bandura’s (1982) theory of self-efficacy, which will be familiar to anyone who has read Zoltán Dörnyei’s more recent work on motivation in language learning. It’s closely related to Carl Rogers’ (1969) ideas about self-concept and it’s not a million miles removed, either, from Maslow’s (1943) theory of self-actualization. The work of Rogers and Maslow was at the heart of the ‘humanistic turn’ in ELT in the latter part of the 20th century (see, for example, Early, 1981), so mindset theory is likely to resonate with anyone who was inspired by the humanistic work of people like Moskowitz, Stevick or Rinvolucri. The appeal of mindset theory is easy to see. Besides its novelty value, it resonates emotionally with the values that many teachers share, writes Tom Bennett: it feels right that you don’t criticise the person, but invite them to believe that, through hard work and persistence, they can achieve.

We might even trace interest in the importance of self-belief back to the Stoics (who, incidentally but not coincidentally, are experiencing a revival of interest), but Carol Dweck introduced a more modern flavour to the old wine and packaged it skilfully and accessibly in shiny new bottles. Her book was a runaway bestseller, with sales in the millions, and her TED Talk has now had over 11 million views. It was in education that mindset theory became particularly popular. As a mini-industry it is now worth millions and millions. Just one research project into the efficacy of one mindset product has received 3.5 million dollars in US federal funding.

But, as with other ideas that have done a roaring trade in popular psychology and seem to offer simple solutions to complex problems (Howard Gardner’s ‘multiple intelligences’ theory, for example), there was soon pushback. It wasn’t hard for critics to scoff at motivational ‘yes-you-can’ posters in classrooms or accounts of well-meaning but misguided teacher interventions, like this one reported by Carl Hendrick:

One teacher [took] her children out into the pristine snow covering the school playground, she instructed them to walk around, taking note of their footprints. “Look at these paths you’ve been creating,” the teacher said. “In the same way that you’re creating new pathways in the snow, learning creates new pathways in your brain.”

Carol Dweck was sympathetic to the critics. She has described the early reaction to her book as ‘uncontrollable’. She freely admits that she and her colleagues had underestimated the issues around mindset interventions in the classroom and that such interventions were ‘not yet evidence-based’. She identified two major areas where mindset interventions have gone awry. The first is when a teacher teaches the concept of mindsets to students, but does not change other policies and practices in the classroom. The second is that some teachers have focussed too much on praising their learners’ efforts. Teachers have taken on mindset recipes and tips without due consideration. She says:

Teachers have to ask, what exactly is the evidence suggesting? They have to realise it takes deep thought and deep experimentation on their part in the classroom to see how best the concept can be implemented there. This should be a group enterprise, in which they share what worked, what did not work, for whom and when. People need to recognise we are researchers, we have produced a body of evidence that says under these conditions this is what happened. We have not explored all the conditions that are possible. Teacher feedback on what is working and not working is hugely valuable to us to tell us what we have not done and what we need to do.

Critics like Dylan Wiliam, Carl Hendrick and Timothy Bates found that it was impossible to replicate Dweck’s findings, and that there were, at best, weak correlations between growth mindset and academic achievement, and between mindset interventions and academic gains. They were happy to concede that typical mindset interventions would not do any harm, but asked whether the huge amounts of money being spent on mindset would not be better invested elsewhere.

Carol Dweck seems to like the phrase ‘not yet’. She argues, in her TED Talk, that simply using the words ‘not yet’ can build students’ confidence, and her tip is often repeated by others. She also talks about mindset interventions being ‘not yet evidence-based’, which is a way of declaring her confidence that they soon will be. But, with huge financial backing, Dweck and her colleagues have recently been carrying out a lot of research and the results are now coming in. There are a small number of recent investigations that advocates of mindset interventions like to point to. For reasons of space, I’ll refer to two of them.

The first of these (Outes-Leon et al., 2020) looked at an intervention with children in the first grade of a few hundred Peruvian secondary schools. The intervention consisted of students individually reading a text designed to introduce them to the concept of growth mindset. This was followed by a group debate about the text, before each student wrote a reflective letter to a friend or relative describing what they had learned. In total, this amounted to about 90 minutes of activity. Subsequently, teachers made a subjective assessment of the ‘best’ letters and attached these to the classroom wall, along with a growth mindset poster, for the rest of the school year. Teachers were also asked to take a picture of the students alongside the letters and the poster and to share this picture by email.

Academic progress was measured 2 and 14 months after the intervention and compared to a large control group. The short-term (2 months) impact of the intervention was positive for mathematics, but less so for reading comprehension. (Why?) These gains were only visible in regional schools, not at all in metropolitan schools. Similar results were found when looking at the medium-term (14 month) impact. The reasons for this are unclear. It is hypothesized that the lower-achieving students in regional schools might benefit more from the intervention. Smaller class sizes in regional schools might also be a factor. But, of course, many other explanations are possible.

The paper is entitled ‘The Power of Believing You Can Get Smarter’. The authors make it clear that they were looking for positive evidence of the intervention and they were supported by mindset advocates (e.g. David Yeager) from the start. It was funded by the World Bank, which is a long-standing advocate of growth mindset interventions. (Rather jumping the gun, the World Bank’s Mindset Team wrote in 2014 that teaching growth mindset ‘is not just another policy fad. It is backed by a burgeoning body of empirical research’.) The paper’s authors conclude that ‘the benefits of the intervention were relevant and long-lasting in the Peruvian context’, and they focus strongly on the low costs of the intervention. They acknowledge that ‘the way the tool is introduced (design of the intervention) and the context in which this occurs (i.e., school and teacher characteristics) both matter to understand potential gains’. But without understanding the role of the context, we haven’t really learned anything practical that we can take away from the research. Our understanding of the power of believing you can get smarter has not been meaningfully advanced.

The second of these studies (Yeager et al., 2019) took many thousands of lower-achieving American 9th graders from a representative sample of schools. It is a very well-designed and thoroughly reported piece of research. The intervention consisted of two 25-minute online sessions, 20 days apart, which sought to reduce students’ negative effort beliefs (the belief that having to try hard or ask for help means you lack ability), fixed-trait attributions (the attribution that failure stems from low ability) and performance avoidance goals (the goal of never looking stupid). An analysis of academic achievement at the end of the school year indicated clearly that the intervention led to improved performance. These results provide clear grounds for optimism about the potential of growth mindset interventions, but the report is careful to avoid overstatement. We have learnt about one particular demographic with one particular intervention, but it would be wrong to generalise beyond that. The researchers had hoped that the intervention would help to compensate for unsupportive school norms, but found that this was not the case. Instead, they found that it was when the peer norm supported the adoption of intellectual challenges that the intervention promoted sustained benefits. Context, as in the Peruvian study, was crucial. The authors write:

We emphasize that not all forms of growth mindset interventions can be expected to increase grades or advanced course-taking, even in the targeted subgroups. New growth mindset interventions that go beyond the module and population tested here will need to be subjected to rigorous development and validation processes.

I think that a reasonable conclusion from reading this research is that it may well be worth experimenting with growth mindset interventions in English language classes, but without any firm expectation of any positive impact. If nothing else, the interventions might provide useful, meaningful practice of the four skills. First, though, it would make sense to read two other pieces of research (Sisk et al., 2018; Burgoyne et al., 2020). Unlike the projects I have just discussed, these were not carried out by researchers with an a priori enthusiasm for growth-mindset interventions. And the results were rather different.

The first of these (Sisk et al., 2018) was a meta-analysis of the literature. It found that there was only a weak correlation between mindset and academic achievement, and only a weak correlation between mindset interventions and academic gains. It did, however, lend support to one of the conclusions of Yeager et al (2019), that such interventions may benefit students who are academically at risk.

The second (Burgoyne et al., 2020) found that ‘the foundations of mind-set theory are not firm’ and that ‘bold claims about mind-set appear to be overstated’. Other constructs, ‘such as self-efficacy and need for achievement, [were] found to correlate much more strongly with presumed associates of mind-set’.

So, where does this leave us? We are clearly a long way from ‘facts’; mindset interventions are ‘not yet evidence-based’. Carl Hendrick (2019) provides a useful summary:

The truth is we simply haven’t been able to translate the research on the benefits of a growth mindset into any sort of effective, consistent practice that makes an appreciable difference in student academic attainment. In many cases, growth mindset theory has been misrepresented and miscast as simply a means of motivating the unmotivated through pithy slogans and posters. […] Recent evidence would suggest that growth mindset interventions are not the elixir of student learning that many of its proponents claim it to be. The growth mindset appears to be a viable construct in the lab, which, when administered in the classroom via targeted interventions, doesn’t seem to work at scale. It is hard to dispute that having a self-belief in their own capacity for change is a positive attribute for students. Paradoxically, however, that aspiration is not well served by direct interventions that try to instil it.

References

Bandura, Albert (1982). Self-efficacy mechanism in human agency. American Psychologist, 37 (2): pp. 122–147. doi:10.1037/0003-066X.37.2.122.

Burgoyne, A. P., Hambrick, D. Z., & Macnamara, B. N. (2020). How Firm Are the Foundations of Mind-Set Theory? The Claims Appear Stronger Than the Evidence. Psychological Science, 31(3), 258–267. https://doi.org/10.1177/0956797619897588

Dweck, C. S. (2006). Mindset: The New Psychology of Success. New York: Ballantine Books

Early, P. (Ed.) (1981). ELT Documents 113 – Humanistic Approaches: An Empirical View. London: The British Council

Hendrick, C. (2019). The growth mindset problem. Aeon,11 March 2019.

Maslow, A. (1943). A Theory of Human Motivation. Psychological Review, 50: pp. 370-396.

Outes-Leon, I., Sanchez, A. & Vakis, R. (2020). The Power of Believing You Can Get Smarter: The Impact of a Growth-Mindset Intervention on Academic Achievement in Peru. Policy Research Working Paper No. WPS 9141. Washington, D.C.: World Bank Group. http://documents.worldbank.org/curated/en/212351580740956027/The-Power-of-Believing-You-Can-Get-Smarter-The-Impact-of-a-Growth-Mindset-Intervention-on-Academic-Achievement-in-Peru

Rogers, C. R. (1969). Freedom to Learn: A View of What Education Might Become. Columbus, Ohio: Charles Merill

Sisk, V. F., Burgoyne, A. P., Sun, J., Butler, J. L., Macnamara, B. N. (2018). To what extent and under which circumstances are growth mind-sets important to academic achievement? Two meta-analyses. Psychological Science, 29, 549–571. doi:10.1177/0956797617739704

Yeager, D.S., Hanselman, P., Walton, G.M. et al. (2019). A national experiment reveals where a growth mindset improves achievement. Nature 573, 364–369. https://doi.org/10.1038/s41586-019-1466-y

As both a language learner and a teacher, I have a number of questions about the value of watching subtitled videos for language learning. My interest is in watching extended videos, rather than short clips for classroom use, so I am concerned with incidental, rather than intentional, learning, mostly of vocabulary. My questions include:

  • Is it better to watch a video that is subtitled or unsubtitled?
  • Is it better to watch a video with L1 or L2 subtitles?
  • If a video is watched more than once, what is the best way to start and proceed? In which order (no subtitles, L1 subtitles and L2 subtitles) is it best to watch?

For help, I turned to three recent books about video and language learning: Ben Goldstein and Paul Driver’s Language Learning with Digital Video (CUP, 2015), Kieran Donaghy’s Film in Action (Delta, 2015) and Jamie Keddie’s Bringing Online Video into the Classroom (OUP, 2014). I was surprised to find no advice, but, as I explored more, I discovered that there may be a good reason for these authors’ silence.

There is now a huge literature out there on subtitles and language learning, and I cannot claim to have read it all. But I think I have read enough to understand that I am not going to find clear-cut answers to my questions.

The learning value of subtitles

It has been known for some time that the use of subtitles during extensive viewing of video in another language can help in the acquisition of that language. The main gains are in vocabulary acquisition and the development of listening skills (Montero Perez et al., 2013). This is true of both L1 subtitles (with an L2 audio track), sometimes called interlingual subtitles (Incalcaterra McLoughlin et al., 2011), and L2 subtitles (with an L2 audio track), sometimes called intralingual subtitles or captions (Vanderplank, 1988). Somewhat more surprisingly, vocabulary gains may also come from what are called reversed subtitles (L2 subtitles and an L1 audio track) (Burczyńska, 2015). Of course, certain conditions apply for subtitled video to be beneficial, and I’ll come on to these. But there is general research agreement (an exception is Karakaş & Sariçoban, 2012) that more learning is likely to take place from watching a subtitled video in a target language than an unsubtitled one.

Opposition to the use of subtitles as a tool for language learning has mostly come from three angles. The first of these, which concerns L1 subtitles, is an antipathy to any use at all of L1. Although such an attitude remains entrenched in some quarters, there is no evidence to support it (Hall & Cook, 2012; Kerr, 2016). Researchers and, increasingly, teachers have moved on.

The second reservation that is sometimes expressed is that learners may not attend to either the audio track or the subtitles if they do not need to. They may, for example, ignore the subtitles in the case of reversed subtitles or ignore the L2 audio track when there are L1 subtitles. This can, of course, happen, but it seems that, on the whole, this is not the case. In an eye-tracking study by Bisson et al (2012), for example, it was found that most people followed the subtitles, irrespective of what kind they were. Unsurprisingly, they followed the subtitles more closely when the audio track was in a language that was less familiar. When conditions are right (see below), reading subtitles becomes a very efficient and partly automatized cognitive activity, which does not prevent people from processing the audio track at the same time (d’Ydewalle & Pavakanun, 1997).

Related to the second reservation is the concern that the two sources of information (audio and subtitles), combined with other information (images and music or sound effects), may be in competition and lead to cognitive overload, impacting negatively on both comprehension and learning. Recent research suggests that this concern is ungrounded (Kruger et al, 2014). L1 subtitles generate less cognitive load than L2 subtitles, but overload is not normally reached and mental resources are still available for learning (Baranowska, 2020). The absence of subtitles generates more cognitive load.

Conditions for learning

Before looking at the differences between L1 and L2 subtitles, it’s a good idea to look at the conditions under which learning is more likely to take place with subtitles. Some of these are obvious, others less so.

First of all, the video material must be of sufficient intrinsic interest to the learner. Secondly, the subtitles must be of a sufficiently high quality. This is not always the case with automatically generated captions, especially if the speech-to-text software struggles with the audio accent. It is also not always the case with professionally produced L1 subtitles, especially when the ‘translations are non-literal and made at the phrase level, making it hard to find connections between the subtitle text and the words in the video’ (Kovacs, 2013, cited by Zabalbeascoa et al., 2015: 112). As a minimum, standard subtitling guidelines, such as those produced for the British Channel 4, should be followed. These limit, for example, lines to a maximum of about 40 characters and subtitles to a maximum of two lines.

For reasons that I’ll come on to, learners should be able to switch easily between L1 and L2 subtitles. They are also likely to benefit if reliably accurate glosses or hyperlinks are ‘embedded in the subtitles, making it possible for a learner to simply click for additional verbal, auditory or even pictorial glosses’ (Danan, 2015: 49).

At least as important as considerations of the materials or tools, is a consideration of what the learner brings to the activity (Frumuselu, 2019: 104). Vanderplank (2015) describes these different kinds of considerations as the ‘effects of’ subtitles on a learner and the ‘effects with’ subtitles on learner behaviour.

In order to learn from subtitles, you need to be able to read fast enough to process them. Anyone with a slow reading speed (e.g. some dyslexics) in their own language is going to struggle. Even with L1 subtitles, Vanderplank (2015: 24) estimates that it is only around the age of 10 that children can do this with confidence. Familiarity with both the subject matter and with subtitle use will impact on this ability to read subtitles fast enough.

With L2 subtitles, the learner’s language proficiency relative to the level of difficulty (especially lexical difficulty) of the subtitles will clearly be of some significance. It is unlikely that L2 subtitles will be of much benefit to beginners (Taylor, 2005). This also suggests that, at lower levels, materials need to be chosen carefully. On the whole, researchers have found that higher proficiency levels correlate with greater learning gains (Pujadas & Muñoz, 2019; Suárez & Gesa, 2019), but one earlier meta-analysis (Montero Perez et al., 2013) did not find that proficiency levels were significant.

Measures of general language proficiency may be too blunt an instrument to help us all of the time. I can learn more from Portuguese than from Arabic subtitles, even though I am a beginner in both languages. The degree of proximity between two languages, especially the script (Winke et al., 2010), is also likely to be significant.

But a wide range of other individual learner differences will also impact on the learning from subtitles. It is known that learners approach subtitles in varied and idiosyncratic ways (Pujolá, 2002), with some using L2 subtitles only as a ‘back-up’ and others relying on them more. Vanderplank (2019) grouped learners into three broad categories: minimal users who were focused throughout on enjoying films as they would in their L1, evolving users who showed marked changes in their viewing behaviour over time, and maximal users who tended to be experienced at using films to enhance their language learning.

Categories like these are only the tip of the iceberg. Sensory preferences, personality types, types of motivation, the impact of subtitles on anxiety levels and metacognitive strategy awareness are all likely to be important. For the last of these, Danan (2015: 47) asks whether learners should be taught ‘techniques to make better use of subtitles and compensate for weaknesses: techniques such as a quick reading of subtitles before listening, confirmation of word recognition or meaning after listening, as well as focus on form for spelling or grammatical accuracy?’

In short, it is, in practice, virtually impossible to determine optimal conditions for learning from subtitles, because we cannot ‘take into account all the psycho-social, cultural and pedagogic parameters’ (Gambier, 2015). With that said, it’s time to take a closer look at the different potential of L1 and L2 subtitles.

L1 vs L2 subtitles

Since all other things are almost never equal, it is not possible to say that one kind of subtitles offers greater potential for learning than another. As regards gains in vocabulary acquisition and listening comprehension, there is no research consensus (Baranowska, 2020: 107). Research does, however, offer us a number of pointers.

Extensive viewing of subtitled video (both L1 and L2) can offer ‘massive quantities of authentic and comprehensible input’ (Vanderplank, 1988: 273). With lower level learners, the input is likely to be more comprehensible with L1 subtitles, and, therefore, more enjoyable and motivating. This makes them often more suitable for what Caimi (2015: 11) calls ‘leisure viewing’. Vocabulary acquisition may be better served with L2 subtitles, because they can help viewers to recognize the words that are being spoken, increase their interaction with the target language, provide further language context, and increase the redundancy of information, thereby enhancing the possibility of this input being stored in long-term memory (Frumuselu et al., 2015). These effects are much more likely with Vanderplank’s (2019) motivated, ‘maximal’ users than with ‘minimal’ users.

There is one further area where L2 subtitles may have the edge over L1. One of the values of extended listening in a target language is the improvement in phonetic retuning (see, for example, Reinisch & Holt, 2013), the ability to adjust the phonetic boundaries in your own language to the boundaries that exist in the target language. Learning how to interpret unusual speech-sounds, learning how to deal with unusual mappings between sounds and words and learning how to deal with the acoustic variations of different speakers of the target language are all important parts of acquiring another language. Research by Mitterer and McQueen (2009) suggests that L2 subtitles help in this process, but L1 subtitles hinder it.

Classroom implications?

The literature on subtitles and language learning echoes with the refrain of ‘more research needed’, but I’m not sure that further research will lead to less ambiguous, practical conclusions. One of my initial questions concerned the optimal order of use of different kinds of subtitles. In most extensive viewing contexts, learners are unlikely to watch something more than twice. If they do (watching a recorded academic lecture, for example), they are likely to be more motivated by a desire to learn from the content than to learn language from the content. L1 subtitles will probably be preferred, and will have the added bonus of facilitating note-taking in the L1. For learners who are more motivated to learn the target language (Vanderplank’s ‘maximal’ users), a sequence of subtitle use, starting with the least cognitively challenging and moving to greater challenge, probably makes sense. Danan (2015: 46) suggests starting with an L1 soundtrack and reversed (L2) subtitles, then moving on to an L2 soundtrack and L2 subtitles, and ending with an L2 soundtrack and no subtitles. I would replace her first stage with an L2 soundtrack and L1 subtitles, but this is based on hunch rather than research.

This sequencing of subtitle use is common practice in language classrooms, but, here, (1) the video clips are usually short, and (2) the aim is often not incidental learning of vocabulary. Typically, the video clip has been selected as a tool for deliberate teaching of language items, so different conditions apply. At least one study has confirmed the value of the common teaching practice of pre-teaching target vocabulary items before viewing (Pujadas & Muñoz, 2019). The drawback is that, by getting learners to focus on particular items, less incidental learning of other language features is likely to take place. Perhaps this doesn’t matter too much. In a short clip of a few minutes, the opportunities for incidental learning are limited, anyway. With short clips and a deliberate learning aim, it seems reasonable to use L2 subtitles for a first viewing, and no subtitles thereafter.

An alternative frequent use of short video clips in classrooms is to use them as a springboard for speaking. In these cases, Baranowska (2020: 113) suggests that teachers may opt for L1 subtitles first, and follow up with L2 subtitles. Of course, with personal viewing devices or in online classes, teachers may want to exploit the possibilities of differentiating the subtitle condition for different learners.

REFERENCES

Baranowska, K. (2020). Learning most with least effort: subtitles and cognitive load. ELT Journal 74 (2): pp.105 – 115

Bisson, M.-J., Van Heuven, W.J.B., Conklin, K. and Tunney, R.J. (2012). Processing of native and foreign language subtitles in films: An eye tracking study. Applied Psycholinguistics, 35 (2): pp. 399 – 418

Burczyńska, P. (2015). Reversed Subtitles as a Powerful Didactic Tool in SLA. In Gambier, Y., Caimi, A. & Mariotti, C. (Eds.), Subtitles and Language Learning. Principles, strategies and practical experiences. Bern: Peter Lang (pp. 221 – 244)

Caimi, A. (2015). Introduction. In Gambier, Y., Caimi, A. & Mariotti, C. (Eds.), Subtitles and Language Learning. Principles, strategies and practical experiences. Bern: Peter Lang (pp. 9 – 18)

Danan, M. (2015). Subtitling as a Language Learning Tool: Past Findings, Current Applications, and Future Paths. In Gambier, Y., Caimi, A. & Mariotti, C. (Eds.), Subtitles and Language Learning. Principles, strategies and practical experiences. Bern: Peter Lang (pp. 41 – 61)

d’Ydewalle, G. & Pavakanun, U. (1997). Could Enjoying a Movie Lead to Language Acquisition?. In: Winterhoff-Spurk, P., van der Voort, T.H.A. (Eds.) New Horizons in Media Psychology. VS Verlag für Sozialwissenschaften, Wiesbaden. https://doi.org/10.1007/978-3-663-10899-3_10

Frumuselu, A.D., de Maeyer, S., Donche, V. & Gutierrez Colon Plana, M. (2015). Television series inside the EFL classroom: bridging the gap between teaching and learning informal language through subtitles. Linguistics and Education, 32: pp. 107 – 17

Frumuselu, A. D. (2019). ‘A Friend in Need is a Film Indeed’: Teaching Colloquial Expressions with Subtitled Television Series. In Herrero, C. & Vanderschelden, I. (Eds.) Using Film and Media in the Language Classroom. Bristol: Multilingual Matters. pp. 92 – 107

Gambier, Y. (2015). Subtitles and Language Learning (SLL): Theoretical background. In Gambier, Y., Caimi, A. & Mariotti, C. (Eds.), Subtitles and Language Learning. Principles, strategies and practical experiences. Bern: Peter Lang (pp. 63 – 82)

Hall, G. & Cook, G. (2012). Own-language Use in Language Teaching and Learning. Language Learning, 45 (3): pp. 271 – 308

Incalcaterra McLoughlin, L., Biscio, M. & Ní Mhainnín, M. A. (Eds.) (2011). Audiovisual Translation, Subtitles and Subtitling. Theory and Practice. Bern: Peter Lang

Karakaş, A. & Sariçoban, A. (2012). The impact of watching subtitled animated cartoons on incidental vocabulary learning of ELT students. Teaching English with Technology, 12 (4): pp. 3 – 15

Kerr, P. (2016). Questioning ‘English-only’ Classrooms: Own-language Use in ELT. In Hall, G. (Ed.) The Routledge Handbook of English Language Teaching (pp. 513 – 526)

Kruger, J. L., Hefer, E. & Matthew, G. (2014). Attention distribution and cognitive load in a subtitled academic lecture: L1 vs. L2. Journal of Eye Movement Research, 7: pp. 1 – 15

Mitterer, H. & McQueen, J. M. (2009). Foreign Subtitles Help but Native-Language Subtitles Harm Foreign Speech Perception. PLoS ONE 4 (11): e7785. doi:10.1371/journal.pone.0007785

Montero Perez, M., Van Den Noortgate, W., & Desmet, P. (2013). Captioned video for L2 listening and vocabulary learning: A meta-analysis. System, 41, pp. 720–739 doi:10.1016/j.system.2013.07.013

Pujadas, G. & Muñoz, C. (2019). Extensive viewing of captioned and subtitled TV series: a study of L2 vocabulary learning by adolescents, The Language Learning Journal, 47:4, 479-496, DOI: 10.1080/09571736.2019.1616806

Pujolá, J.- T. (2002). CALLing for help: Researching language learning strategies using help facilities in a web-based multimedia program. ReCALL, 14 (2): pp. 235 – 262

Reinisch, E. & Holt, L. L. (2013). Lexically Guided Phonetic Retuning of Foreign-Accented Speech and Its Generalization. Journal of Experimental Psychology: Human Perception and Performance. Advance online publication. doi: 10.1037/a0034409

Suárez, M. & Gesa, F. (2019) Learning vocabulary with the support of sustained exposure to captioned video: do proficiency and aptitude make a difference? The Language Learning Journal, 47:4, 497-517, DOI: 10.1080/09571736.2019.1617768

Taylor, G. (2005). Perceived processing strategies of students watching captioned video. Foreign Language Annals, 38(3), pp. 422-427

Vanderplank, R. (1988). The value of teletext subtitles in language learning. ELT Journal, 42 (4): pp. 272 – 281

Vanderplank, R. (2015). Thirty Years of Research into Captions / Same Language Subtitles and Second / Foreign Language Learning: Distinguishing between ‘Effects of’ Subtitles and ‘Effects with’ Subtitles for Future Research. In Gambier, Y., Caimi, A. & Mariotti, C. (Eds.), Subtitles and Language Learning. Principles, strategies and practical experiences. Bern: Peter Lang (pp. 19 – 40)

Vanderplank, R. (2019). ‘Gist watching can only take you so far’: attitudes, strategies and changes in behaviour in watching films with captions, The Language Learning Journal, 47:4, 407-423, DOI: 10.1080/09571736.2019.1610033

Winke, P., Gass, S. M., & Sydorenko, T. (2010). The Effects of Captioning Videos Used for Foreign Language Listening Activities. Language Learning & Technology, 14 (1): pp. 65 – 86

Zabalbeascoa, P., González-Casillas, S. & Pascual-Herce, R. (2015). In Gambier, Y., Caimi, A. & Mariotti, C. (Eds.), Subtitles and Language Learning. Principles, strategies and practical experiences Bern: Peter Lang (pp. 105–126)

I noted in a recent post about current trends in ELT that mindfulness has been getting a fair amount of attention recently. Here are three recent examples:

  • Pearson recently produced the Pearson Experiences: A Pocket Guide to Mindfulness, written by Amy Malloy. Amy has also written a series of blog posts for Pearson on the topic and she is a Pearson-sponsored speaker (Why use mindfulness in the classroom?) at the English Australia Ed-Tech SIG Online Symposium this week.
  • Russell Stannard has written two posts for Express Publishing (here and here)
  • Sarah Mercer and Tammy Gregersen’s new book, ‘Teacher Wellbeing’ (OUP, 2020) includes a section in which they recommend mindfulness practices to teachers as a buffer against stress and as a way to promote wellbeing.

The claims

Definitions of mindfulness often vary slightly, but the following from Amy Malloy is typical: ‘mindfulness is about the awareness that comes from consciously focussing on the present moment’. Claims for the value of mindfulness practices also vary slightly. Besides the general improvements to wellbeing suggested by Sarah and Tammy, attention, concentration and resilience are also commonly mentioned.

Amy: [Mindfulness] develops [children’s] brains, which in turn helps them find it easier to calm down and stay calm. … It changes our brains for the better. … [Mindfulness] helps children concentrate more easily on classroom activities.

Russell: Students going through mindfulness training have increased levels of determination and willpower, they are less likely to give up if they find something difficult … Mindfulness has been shown to improve concentration. Students are able to study for longer periods of time and are more focused … Studies have shown that practicing mindfulness can lead to reduced levels of anxiety and stress.

In addition to the behavioural changes that mindfulness can supposedly bring about, both Amy and Russell refer to neurological changes:

Amy: Studies have shown that the people who regularly practise mindfulness develop the areas of the brain associated with patience, compassion, focus, concentration and emotional regulation.

Russell: At the root of our current understanding of mindfulness is brain plasticity. … in probably the most famous neuroimaging research project, scientists took a group of people and found that by doing a programme of 8 weeks of mindfulness training based around gratitude, they could actually increase the size of the areas of the brain generally associated with happiness.

Supporting evidence

In her pocket guide for Pearson, Amy provides no references to support her claims.

In Russell’s first post, he links to a piece of research which looked at the self-reported psychological impact of a happiness training programme developed by a German cabaret artist and talk show host. The programme wasn’t specifically mindfulness-oriented, so it tells us nothing about mindfulness, but it is also highly suspect as a piece of research, not least because one of the co-authors is the cabaret artist himself. His second link is to an article about human attention, a long-studied area of psychology, but this has nothing to do with mindfulness, although Russell implies that there is a connection. His third link is to a very selective review of research into mindfulness, written by two mindfulness enthusiasts. It’s not so much a review of research as a selection of articles which support mindfulness advocacy.

In his second post, Russell links to a review of mindfulness-based interventions (MBIs) in education. Appearing in the ‘Mindfulness’ journal, it is obviously in broad support of MBIs, but its conclusions are hedged: ‘Research on the neurobiology of mindfulness in adults suggests that sustained mindfulness practice can ….’ ‘mindfulness training holds promise for being one such intervention for teachers.’ His second link is to a masterpiece of pseudo-science delivered by Joe Dispenza, author of many titles including ‘Becoming Supernatural: How Common People are Doing the Uncommon’ and ‘Breaking the Habit of Being Yourself’. His third link is to an interview with Matthieu Ricard, one of the Dalai Lama’s translators. Interestingly, though not in this interview, Ricard is very dismissive of secular mindfulness (‘Buddhist meditation without the Buddhism’). His fourth link is to a video presentation about mindfulness from Diana Winston of UCLA. The presenter doesn’t give citations for the research she mentions (so I can’t follow them up): instead, she plugs her own book.

Sarah and Tammy’s three references are not much better. The first is to a self-help book, called ‘Every Teacher Matters: Inspiring Well-Being through Mindfulness’ by K. Lovewell (2012), whose other work includes ‘The Little Book of Self-Compassion: Learn to be your own Best Friend’. The second (Creswell, J. D. & Lindsay, E. K. (2014). How does mindfulness training affect health? A mindfulness stress buffering account. Current Directions in Psychological Science 23 (6): pp. 401-407) is more solid, but a little dated now. The third (Garland, E., Gaylord, S. A. & Fredrickson, B. L. (2011). Positive Reappraisal Mediates the Stress-Reductive Effects of Mindfulness: An Upward Spiral Process. Mindfulness 2 (1): pp. 59-67) is an interesting piece, but of limited value since there was no control group in the research and it tells us nothing about MBIs per se.

The supporting evidence provided by these writers for the claims they make is thin, to say the least. It is almost as if the truth of the claims is self-evident, and for these writers (all of whom use mindfulness practices themselves) there is clearly a personal authentication. But, not having had an epiphany myself and being somewhat reluctant to roll a raisin around my mouth, concentrating on its texture and flavours, fully focussing on the process of eating it (as recommended by Sarah and Tammy), I will, instead, consciously focus on the present moment of research.

Mindfulness and research

The first thing to know is that there has been a lot of research into mindfulness in recent years. The second thing to know is that much of it is poor quality. Here’s why:

  • There is no universally accepted technical definition of ‘mindfulness’ nor any broad agreement about detailed aspects of the underlying concept to which it refers (Van Dam, N. T. et al. (2018). Mind the Hype: A Critical Evaluation and Prescriptive Agenda for Research on Mindfulness and Meditation. Perspectives on Psychological Science 13: pp. 36 – 61)
  • To date, there are at least nine different psychometric questionnaires, all of which define and measure mindfulness differently (Purser, R.E. (2019). McMindfulness. Repeater Books. p.128)
  • Mindfulness research tends to rely on self-reporting, which is notoriously unreliable.
  • The majority of studies did not utilize randomized control groups (Goyal, M., Singh, S., Sibinga, E. S., et al. (2014). Meditation Programs for Psychological Stress and Well-being: A Systematic Review and Meta-analysis. JAMA Intern Med. doi:10.1001/jamainternmed.2013.13018).
  • Early meditation studies were mostly cross-sectional studies: that is, they compared data from a group of meditators with data from a control group at one point in time. A cross-sectional study design precludes causal attribution. (Tang, Y., Hölzel, B. & Posner, M. (2015). The neuroscience of mindfulness meditation. Nature Reviews Neuroscience 16, 213–225)
  • Sample sizes tend to be small and there is often no active control group. There are few randomized controlled trials (Dunning, D. L., Griffiths, K., Kuyken, W., Crane, C., Foulkes, L., Parker, J. & Dalgleish, T. (2019). Research Review: The effects of mindfulness‐based interventions on cognition and mental health in children and adolescents – a meta‐analysis of randomized controlled trials. Journal of Child Psychology and Psychiatry, 60: 244-258. doi:10.1111/jcpp.12980)
  • There is a relatively strong bias towards the publication of positive or significant results (Coronado-Montoya, S., Levis, A.W., Kwakkenbos, L., Steele, R.J., Turner, E.H. & Thombs, B.D. (2016). Reporting of Positive Results in Randomized Controlled Trials of Mindfulness-Based Mental Health Interventions. PLoS ONE 11(4): e0153220. https://doi.org/10.1371/journal.pone.0153220)
  • More recent years have not seen significant improvements in the rigour of the research (Goldberg, S. B., Tucker, R. P., Greene, P. A., Simpson, T. L., Kearney, D. J. & Davidson, R. J. (2017). Is mindfulness research methodology improving over time? A systematic review. PLoS ONE 12(10): e0187298).

The overall quality of the research into mindfulness is so poor that a group of fifteen researchers came together to write a paper entitled ‘Mind the Hype: A Critical Evaluation and Prescriptive Agenda for Research on Mindfulness and Meditation’ (Van Dam, N. T. et al., 2018, cited above).

So, the research is problematic and replication is needed, but it does broadly support the claim that mindfulness meditation exerts beneficial effects on physical and mental health, and cognitive performance (Tang, Y., Hölzel, B. & Posner, M. (2015). The neuroscience of mindfulness meditation. Nature Reviews Neuroscience 16, 213–225). The word broadly is important here. As one of the leaders of the British Mindfulness in Schools Project (which has trained thousands of teachers in the UK) puts it, ‘research on mindfulness in schools is still in its infancy, particularly in relation to impacts on behaviour, academic performance and physical health. It can best be described as “promising” and “worth trying”’ (Weare, K. (2018). Evidence for the Impact of Mindfulness on Children and Young People. The Mindfulness in Schools Project). We don’t know what kinds of MBIs are most effective, what kind of ‘dosage’ should be administered, what kinds of students they are (and are not) appropriate for, whether instructor training is significant or what cost-benefits they might bring. In short, there is more that we do not know than we know.

One systematic review, for example, found that MBIs had ‘small, positive effects on cognitive and socioemotional processes but these effects were not seen for behavioral or academic outcomes’. What happened to the promises of improved concentration, calmer behaviour and willpower? The review concludes that ‘the evidence from this review urges caution in the widespread adoption of MBIs and encourages rigorous evaluation of the practice should schools choose to implement it’ (Maynard, B. R., Solis, M., Miller, V. & Brendel, K. E. (2017). Mindfulness-based interventions for improving cognition, academic achievement, behavior and socio-emotional functioning of primary and secondary students. A Campbell Systematic Review 2017:5).

What about the claims for neurological change? As a general rule, references to neuroscience by educators should be treated with scepticism. Whilst it appears that ‘mindfulness meditation might cause neuroplastic changes in the structure and function of brain regions involved in regulation of attention, emotion and self-awareness’ (Tang, Y., Hölzel, B. & Posner, M. (2015). The neuroscience of mindfulness meditation. Nature Reviews Neuroscience 16, 213–225), this doesn’t really tell us very much. A complex mental state like mindfulness ‘is likely to be supported by the large-scale brain networks’ (ibid.), and insights derived from fMRI scans of particular parts of the brain provide us with, at best, only a trivial understanding of what is going on. Without a clear definition of what mindfulness actually is, it is going to be some time before we unravel the neural mechanisms underpinning it. If, in fact, we ever do. By way of comparison, you might be interested in reading about neuroscientific studies into prayer, which also appears to correlate with enhanced wellbeing.

Rather than ending with the research, I’d like to leave you with a few more short mindfulness raisins to chew on.

Mindfulness and money

As Russell says in his blog post, ‘research in science doesn’t come out of a vacuum’. Indeed, it tends to follow the money. It is estimated that mindfulness is now ‘a $4 billion industry’ (Purser, R.E. (2019). McMindfulness. Repeater Books. p.13): ‘More than 100,000 books for sale on Amazon have a variant of ‘mindfulness’ in their title, touting the benefits of Mindful Parenting, Mindful Eating, Mindful Teaching, Mindful Therapy, Mindful Leadership, Mindful Finance, a Mindful Nation, and Mindful Dog Owners, to name just a few. There is also The Mindfulness Coloring Book, a bestselling genre in itself. Besides books, there are workshops, online courses, glossy magazines, documentary films, smartphone apps, bells, cushions, bracelets, beauty products and other paraphernalia, as well as a lucrative and burgeoning conference circuit’.

It is precisely because so much money is at stake that so much research has been funded. More proof is desperately needed, and it is sadly unforthcoming. Meanwhile, in the immortal words of Kayleigh McEnany, ‘science should not stand in the way.’

Mindfulness and the individual

Mindfulness may be aptly described as a ‘technology of the self’. Ronald Purser, the author of ‘McMindfulness’, puts it like this: ‘Rather than discussing how attention is monetized and manipulated by corporations such as Google, Facebook, Twitter and Apple, [mindfulness advocates] locate crisis in our minds. It is not the nature of the capitalist system that is inherently problematic; rather, it is the failure of individuals to be mindful and resilient in a precarious and uncertain economy. Then they sell us solutions that make us contented mindful capitalists’.

It is this focus on the individual that makes it so appealing to the right-wing foundations (e.g. the Templeton Foundation) that fund research into mindfulness. For more on this topic, see my post about grit.

Mindfulness and religion

It is striking how often mindfulness advocates, like Amy, feel the need to insist that mindfulness is not a religious practice. Historically, of course, mindfulness comes directly from a Buddhist tradition, but in its present Western incarnation, it is a curious hybrid. Jon Kabat-Zinn, who, more than anyone else, has transformed mindfulness into a marketable commodity, is profoundly ambiguous on the topic. Buddhists, like Matthieu Ricard or David Forbes (author of ‘Mindfulness and its Discontents’, Fernwood Publishing, 2019), have little time for the cultural appropriation of the Pali term ‘Sati’, especially when mindfulness is employed by the American military to train snipers. Others, like Goldie Hawn, whose MindUP programme sells well in the US, are quite clear about their religious affiliation and their desire to bring Buddhism into schools through the back door.

I personally find it hard to see the banging of Tibetan bowls as anything other than a religious act, but I am less bothered by this than those American school districts that saw MBIs as ‘covert religious indoctrination’ and banned them. Having said that, why not promote more prayer in schools, if the ‘neuroscience’ supports it?

Clare is a busy teacher

Image from Andrew Percival, inspired by The Ladybird Book of Mindfulness and similar titles.

The ‘Routledge Handbook of Language Learning and Technology’ (eds. Farr and Murray, 2016) claims to be ‘the essential reference’ on the topic and its first two sections are devoted to ‘Historical and conceptual contexts’ and ‘Core issues’. One chapter (‘Limitations and boundaries in language learning and technology’ by Kern and Malinowski) mentions that ‘a growing body of research in intercultural communication and online language learning recognises how all technologies are embedded in cultural and linguistic practices, meaning that a given technological artefact can be used in radically different ways, and for different purposes by different groups of people’ (p.205). However, in terms of critical analyses of technology and language learning, that’s about as far as this book goes. In over 500 pages, there is one passing reference to privacy and a couple of brief mentions of the digital divide. There is no meaningful consideration of the costs, ownership or externalities of EdTech, of the ways in which EdTech is sold and marketed, of the vested interests that profit from EdTech, of the connections between EdTech and the privatisation of education, of the non-educational uses to which data is put, or of the implications of attention tracking, facial analysis and dataveillance in educational settings.

The Routledge Handbook is not alone in this respect. Li Li’s ‘New Technologies and Language Learning’ (Palgrave, 2017) is breathlessly enthusiastic about the potential of EdTech. The opening chapter catalogues a series of huge investments in global EdTech, as if the scale of investment were an indication of its wisdom. No mention of the lack of evidence that huge investments in IWBs and PCs in classrooms led to any significant improvement in learning. No mention of how these investments were funded (or which other parts of budgets were cut). Instead, we are told that ‘computers can promote visual, verbal and kinaesthetic learning’ (p.5).

I have never come across a book-length critical analysis of technology and language learning. As the world of language teaching jumps on board Zoom, Google Meet, Microsoft Teams, Skype (aka Microsoft) and the like, the need for a better critical awareness of EdTech and language learning has never been more urgent. Fortunately, there is a growing body of critical literature on technology and general education. Here are my twelve favourites:

1 Big Data in Education

Ben Williamson (Sage, 2017)

An investigation into the growing digitalization and datafication of education. Williamson looks at how education policy is enacted through digital tools, the use of learning analytics and educational data science. His interest is in how technology has reshaped the way we think about education, and the book may be read as a critical response to the techno-enthusiasm of Mayer-Schönberger and Cukier’s ‘Learning with Big Data: The Future of Education’ (Houghton Mifflin Harcourt, 2014). Williamson’s blog, Code Acts in Education, is excellent.

2 Distrusting Educational Technology

Neil Selwyn (Routledge, 2014)

Neil Selwyn is probably the most widely-quoted critical voice in this field, and this book is as good a place to start with his work as any. EdTech, for Selwyn, is a profoundly political affair, and this book explores the gulf between how it could be used, and how it is actually used. Unpacking the ideological agendas of what EdTech is and does, Selwyn covers the reduction of education along data-driven lines, the deskilling of educational labour, the commodification of learning, issues of inequality, and much more. An essential primer.

3 The Great American Education-Industrial Complex

Anthony G. Picciano & Joel Spring (Routledge, 2013)

Covering similar ground to both ‘Education Networks’ and ‘Edu.net’ (see below), this book’s subtitle, ‘Ideology, Technology, and Profit’, says it all. Chapter 4 (‘Technology in American Education’) is of particular interest, tracing the recent history of EdTech and the for-profit sector. Chapter 5 provides a wide range of examples of the growing privatization (through EdTech) of American schooling.

4 Disruptive Fixation

Christo Sims (Princeton University Press, 2017)

The story of a New York school, funded by philanthropists and put together by games designers and educational reformers, that promised to ‘reinvent the classroom for the digital age’. And how it all went wrong … reverting to conventional rote learning with an emphasis on discipline, along with gender and racialized class divisions. A cautionary tale about techno-philanthropism.

5 Education Networks

Joel Spring (Routledge, 2012)

Similar in many ways to ‘Edu.net’ (see below), this is an analysis of the relationships between the interest groups (international agencies, private companies and philanthropic foundations) that are pushing for greater use of EdTech. Spring considers the psychological, social and political implications of the growth of EdTech and concludes with a discussion of the dangers of consumerist approaches to education and dataveillance.

6 Edu.net

Stephen J. Ball, Carolina Junemann & Diego Santori (Routledge, 2017)

An account of the ways in which international agencies, private companies (e.g. Bridge International Academies, Pearson) and philanthropic foundations shape global education policies, with a particular focus on India and Ghana. These policies include the standardisation of education, the focus on core subjects, the use of corporate management models and test-based accountability, and are key planks in what has been referred to as the Global Education Reform Movement (GERM). Chapter 4 (‘Following things’) focusses on the role of EdTech in realising GERM goals.

7 Education and Technology

Neil Selwyn (Continuum, 2011)

Although covering some similar ground to his ‘Distrusting Educational Technology’, this handy volume summarises key issues, including ‘does technology inevitably change education?’, ‘what can history tell us about education and technology?’, ‘does technology improve learning?’, ‘does technology make education fairer?’, ‘will technology displace the teacher?’ and ‘will technology displace the school?’.

8 The Evolution of American Educational Technology

Paul Saettler (Information Age, 2004)

A goldmine of historical information, this is the first of three history books on my list. Early educational films from the start of the 20th century, educational radio, teaching machines and programmed instruction, early computer-assisted instruction like the PLATO project, educational broadcasting and television … moving on to interactive video, teleconferencing, and artificial intelligence. A fascinatingly detailed study of educational dreams and obsolescence.

9 Oversold and Underused

Larry Cuban (Harvard University Press, 2003)

Larry Cuban’s ground-breaking ‘Teachers and Machines: The Classroom Use of Technology since 1920’ (published in 1986, four years before Saettler’s history) was arguably the first critical evaluation of EdTech. In this title, Cuban pursues his interest in the troubled relationship between teachers and technology, arguing that more attention needs to be paid to the civic and social goals of schooling, goals that make the question of how many computers are in classrooms trivial. Larry Cuban’s blog is well worth following.

10 The Flickering Mind

Todd Oppenheimer (Random House, 2003)

A journalistic account of how approximately $70 billion was thrown at EdTech in American schools at the end of the 20th century in an attempt to improve them. It’s a tale of misplaced priorities, technological obsolescence and, ultimately, a colossal waste of money. Technology has changed since the writing of this book, but as the epigram of Alphonse Karr (cited by Oppenheimer in his afterword) puts it – ‘plus ça change, plus c’est la même chose’.

11 Teaching Machines

Bill Ferster (Johns Hopkins University Press, 2014)

This is the third history of EdTech on my list. A critical look at past attempts to automate instruction, and learning from successes and failures as a way of trying to avoid EdTech insanity (‘doing the same thing over and over again and expecting different results’). Not explicitly political, but the final chapter offers a useful framework for ‘making sense of teaching machines’.

12 The Technical Fix

Kevin Robins & Frank Webster (Macmillan, 1989)

Over thirty years old now, this remarkably prescient book situates the push for more EdTech in Britain in the 1980s as a part of broader social and political forces demanding a more market-oriented and entrepreneurial approach to education. The argument that EdTech cannot be extracted from relations of power and the social values that these entail is presented forcefully. Technology, write the authors, ‘is always shaped by, even constitutive of, prevailing values and power distribution’.

And here’s hoping that Audrey Watters’ new book sees the light of day soon, so it can be added to the list of history books!