Since no single definition of critical thinking prevails (Dummett & Hughes, 2019: 2), discussions of the topic invariably begin with attempts to provide a definition. Lai (2011) offers an accessible summary of a range of possible meanings, but points out that, in educational contexts, its meaning is often rather vague and encompasses other concepts (such as higher order thinking skills) which also lack clarity. Paul Dummett and John Hughes (2019: 4) plump for ‘a mindset that involves thinking reflectively, rationally and reasonably’ – a definition which involves a vague noun (that could mean a fixed state of mind, a learned attitude, a disposition or a mood) and three highly subjective adverbs. I don’t think I could do any better. However, instead of looking for a definition, we can reach a sort of understanding by looking at examples of it. Dummett and Hughes’ book is extremely rich in practical examples, and the picture that emerges of critical thinking is complex and multifaceted.

As you might expect of a weasel word like ‘critical thinking’, there appears to be general agreement that it’s a ‘good thing’. Paul Dummett suggests that there are two common reasons for promoting the inclusion of critical thinking activities in the language classroom. The first of these is a desire to get students thinking for themselves. The second is the idea ‘that we live in an age of misinformation in which only the critically minded can avoid manipulation or slavish conformity’. Neither seems contentious at first glance, although he points out that ‘they tend to lead to a narrow application of critical thinking in ELT materials: that is to say, the analysis of texts and evaluation of the ideas expressed in them’. It’s the second of these rationales that I’d like to explore further.

Penny Ur (2020: 9) offers a more extended version of it:

The role of critical thinking in education has become more central in the 21st century, simply because there is far more information readily available to today’s students than there was in previous centuries (mainly, but not only, online), and it is vital for them to be able to deal with such input wisely. They need to be able to distinguish between what is important and what is trivial, between truth and lies, between fact and opinion, between logical argument and specious propaganda […] Without such skills and awareness of the need to exercise them, they are liable to find themselves victims of commercial or political interests, their thinking manipulated by persuasion disguised as information.

In the same edited collection Olja Milosevic (2020: 18) echoes Ur’s argument:

Critical thinking becomes even more important as communication increasingly moves online. Students find an overwhelming amount of information and need to be taught how to evaluate its relevance, accuracy and quality. If teachers do not teach students how to go beyond surface meaning, students cannot be expected to practise it.

In the passages I’ve quoted, these writers are referring to one particular kind of critical thinking. The ability to critically evaluate the reliability, accuracy, etc. of a text is generally considered to be a part of what is usually called ‘media information literacy’. In these times of fake news, so the argument goes, it is vital for students to develop (with their teachers’ help) the necessary skills to spot fake news when they see it. The most prototypical critical thinking activity in ELT classrooms is probably one in which students analyse some fake news, such as the website about the Pacific Tree Octopus (which is the basis of a lesson in Dudeney et al., 2013: 198–203).

Before considering media information literacy in more detail, it’s worth noting in passing that this rationale for critical thinking activities is no rationale at all if it concerns only one aspect of critical thinking: it attributes to the whole (critical thinking) what applies only to a part (media information literacy).

There is no shortage of good (free) material available for dealing with fake news in the ELT classroom. Examples include work by James Taylor, Chia Suan Chong and Tyson Seburn. Material of this kind may result in lively, interesting, cognitively challenging, communicative and, therefore, useful lessons. But how likely is it that material of this kind will develop learners’ media information literacy and, by extension, their critical thinking skills? How likely is it that teaching material of this kind will help people identify (and reject) fake news? Is it possible that material of this kind is valuable despite its rationale, rather than because of it? In the spirit of rational, reflective and reasonable thinking, these are questions that seem to be worth exploring.

ELT classes and fake news

James Taylor has suggested that the English language classroom is ‘the perfect venue for [critical thinking] skills to be developed’. Although academic English courses necessarily involve elements of critical thinking, I’m not so sure that media information literacy (and, specifically, the identification of fake news) can be adequately addressed in general English classes. There are so many areas, besides those that are specifically language-focussed, competing for space in language classes (think of all those other 21st century skills), that it is hard to see how sufficient time can be found for real development of this skill. It requires modelling, practice of the skill, feedback on the practice, and more practice (Mulnix, 2010): it needs time. Fake news activities in the language classroom would, of course, be of greater value if they were part of an integrated approach across the curriculum. Unfortunately, this is rarely the case.

Information literacy skills

Training materials for media information literacy usually involve a number of stages. These include fact-checking and the triangulation of different sources, consideration of the web address, analysis of images and of other items on the site, source citation and so on. The problem, however, is that news-fakers have become so good at what they do. The tree octopus site is very crude in comparison to what can be produced nowadays by people who have learnt to profit from the online economy of misinformation. Facebook employs an army of algorithmic and human fact-checkers, but still struggles. The bottom line is that background knowledge is needed (this is as true for media information literacy as it is for critical thinking more generally) (Willingham, 2007). With news, the scope of domain knowledge is so vast that it is extremely hard to transfer one’s ability to critically evaluate one particular piece of news to another. We are all fooled from time to time.

Media information literacy interventions: research on effectiveness

With the onset of COVID-19, the ability to identify fake news has become, more than ever, a matter of life and death. There is little question that this ability correlates strongly with analytic thinking (see, for example, Stanley et al., 2020). What is much less clear is how we can go about promoting analytic thinking. Analytic thinking comes in different varieties, and another hot-off-the-press research study into susceptibility to COVID-19 fake news (Roozenbeek et al., 2020) has found that the ability to spot fake news may correlate more strongly with numerical literacy than with reasoning ability. In fact, the research team found that a lack of numerical literacy was the most consistent predictor of susceptibility to misinformation about COVID-19. Perhaps we are attempting to develop the wrong kind of analytic thinking?

In educational contexts, attempts to promote media information literacy typically seek to develop reasoning abilities, and the evidence for their effectiveness is mixed. First of all, it needs to be said that ‘little large-scale evidence exists on the effectiveness of promoting digital media literacy as a response to online misinformation’ (Guess et al., 2020). An early meta-analysis (Jeong et al., 2012) found that such interventions had a positive effect when the interventions were long (not one-off), but impacted more on students’ knowledge than on their behaviour. More recently, Huguet et al. (2019) were unable to draw ‘definitive conclusions from past research, such as what kinds of media literacy practices work and under what conditions’. And this year, a study by Guess et al. (2020) did not generate sufficient evidence ‘to conclude that the [media information literacy] intervention changed real-world consumption of false news’. I am unaware of any robust research in this area in the context of ELT.

It’s all rather disappointing. Why are we not better at it? After all, teachers of media studies have been exploring pathways for many years now. One possible answer is this: Media information literacy, like critical thinking more generally, is a skill that is acquirable, but it can only be acquired if there is a disposition to do so. The ability to think critically and the disposition to do so are separate entities (Facione, 2000). Training learners to be more critical in their approach to media information may be so much pissing in the wind if the disposition to be sceptical is not there. Shaping dispositions is a much harder task than training skills.

Both of the research studies into susceptibility to COVID-19 misinformation that I referred to earlier in this section underscore the significance of dispositions to analytic thinking. Roozenbeek et al. (2020) found, in line with much previous research (for example, Jost et al., 2018), that political conservatism is associated with a slightly higher susceptibility to misinformation. Political views (on either side of the political spectrum) rarely change as a result of exposure to science or reasoned thinking. They also found that ‘self-identifying as a member of a minority predicts susceptibility to misinformation about the virus in all countries surveyed’ (except, interestingly, in the UK). Again, when issues of identity are at stake, emotional responses tend to trump rational ones.

Rational, reflective and reasonable thinking about media information literacy leads to an uncomfortable red-pill rabbit-hole. This is how Bulger and Davidson (2018) put it:

The extent to which media literacy can combat the problematic news environment is an open question. Is denying the existence of climate change a media literacy problem? Is believing that a presidential candidate was running a sex-trafficking ring out of a pizza shop a media literacy problem? Can media literacy combat the intentionally opaque systems of serving news on social media platforms? Or intentional campaigns of disinformation?

Teachers and fake news

The assumption that the critical thinking skills of young people can be developed through the intervention of their teachers is rarely problematized. It should be. A recent study of Spanish pre-service teachers (Fuertes-Prieto et al., 2020) showed that their ‘level of belief in pseudoscientific issues is comparable, or even higher in some cases to those of the general population’. There is no reason to believe that this changes after they have qualified. Teachers are probably no more likely to change their beliefs when presented with empirical evidence (Menz et al., 2020) than people from any other profession. Research has tended to focus on teachers’ lack of critical thinking in areas related to their work, but things may be no different in the wider world. It is estimated that over a quarter of teachers in the US voted for the world’s greatest peddler of fake news in the 2016 presidential election.

It is also interesting to note that the sharing of fake news on social media is much more widespread among older people (including US teachers, whose average age is 42.4) than among those under 30 (Bouygues, 2019).

Institutional contexts and fake news

Cory Doctorow has suggested that the fake news problem is not a problem of identifying what is true and what is fake, but a problem ‘about how we know whether something is true. We’re not disagreeing about facts, we’re disagreeing about epistemology’. In a post-modernist world of ‘Truth Decay’ (Kavanagh & Rich, 2018), where there is ‘a blurring of the line between opinion and fact’, epistemological authority is a rare commodity. Medicine, the social sciences and applied linguistics are all currently experiencing a ‘replication crisis’ (Ioannidis, 2005), and a British education minister has told us that ‘people of this country have had enough of experts’.

News reporting has always relied to some extent on trust in the reliability of the news source. The BBC or CNN might attempt to present themselves as more objective than, say, Fox News or InfoWars, but trust in all news outlets has collapsed globally in recent years. As Michael Schudson has written in the Columbia Journalism Review, ‘all news outlets write from a set of values, not simply from a disinterested effort at truth’. If a particular news channel manifestly espouses different values from your own, it is easy to reject the veracity of the news it reports. Believers in COVID conspiracy theories often hold their views precisely because they reject the epistemological authority of mainstream news, the WHO, and the governments that support lockdown measures.

The training of media information literacy in schools is difficult because, for many people in the US (and elsewhere), education is not dissimilar to mainstream media. They ‘are seen as the enemy — two institutions who are trying to have power over how people think. Two institutions that are trying to assert authority over epistemology’ (boyd, 2018). Schools have always been characterized by imbalances in power (between students and teachers / administrators), and this power dynamic is not conducive to open-minded enquiry. Children are often more aware of the power of their teachers than they are accepting of their epistemological authority. They are enjoined to be critical thinkers, but only about certain things and only up to a certain point. One way for children to redress the power imbalance is to reject the epistemological authority of their teachers. I think this may explain why a group of young children I observed recently coming out of a lesson devoted to environmental issues found such pleasure in joking about Greta ‘Thunfisch’.

Power relationships in schools are reflected and enacted in the interaction patterns between teachers and students. The most common of these is ‘initiation-response-feedback (IRF)’ and it is unlikely that this is particularly conducive to rational, reflective and reasonable thinking. At the same time, as Richard Paul, one of the early advocates of critical thinking in schools, noted, much learning activity is characterised by lower order thinking skills, especially memorization (Paul, 1992: 22). With this kind of backdrop, training in media information literacy is more likely to be effective if it goes beyond the inclusion of a few ‘fake news’ exercises: a transformation in the way that the teaching is done will also be needed. Benesch (1999) describes this as a more ‘dialogic’ approach and there is some evidence that a more dialogic approach can have a positive impact on students’ dispositions (e.g. Hajhosseiny, 2012).

I think that David Buckingham (2019a) captures the educational problem very neatly:

There’s a danger here of assuming that we are dealing with a rational process – or at least one that can, by some pedagogical means, be made rational. But from an educational perspective, we surely have to begin with the question of why people might believe apparently ‘fake’ news in the first place. Where we decide to place our trust is as much to do with fantasy, emotion and desire, as with rational calculation. All of us are inclined to believe what we want to believe.

Fake news: a problem or a symptom of a problem?

There has always been fake news. The big problem now is ‘the speed and ease of its dissemination, and it exists primarily because today’s digital capitalism makes it extremely profitable – look at Google and Facebook – to produce and circulate false but click-worthy narratives’ (Morozov, 2017). Fake news taps into and amplifies broader tendencies and divides in society: the problem is not straightforward and is unlikely to be easy to eradicate (Buckingham, 2019a: 3).

There is increasing discussion of media regulation, and the recent banning by Facebook of Holocaust denial and QAnon is a recognition that some regulation can no longer be avoided. But strict regulations would threaten the ‘basic business model, and the enormous profitability’ of social media companies (Buckingham, 2019b), and there are real practical and ethical problems in working out exactly how regulation would happen. Governments do not know what to do.

Lacking any obvious alternative, media information literacy is often seen as the solution: can’t we ‘fact check and moderate our way out of this conundrum’ (boyd, 2018)? danah boyd’s stark response is: no, this will fail. It’s an inadequate solution to an oversimplified problem (Buckingham, 2019a).

Along with boyd and Buckingham, I’m not arguing that we should drop media information literacy activities from educational (including ELT) programmes. Quite the opposite. But if we want our students to think reflectively, rationally and reasonably, I think we will need to start by doing the same.

References

Benesch, S. (1999). Thinking critically, thinking dialogically. TESOL Quarterly, 33: pp. 573–580

Bouygues, H. L. (2019). Fighting Fake News: Lessons From The Information Wars. Reboot Foundation https://reboot-foundation.org/fighting-fake-news/

boyd, d. (2018). You Think You Want Media Literacy… Do You? Data and Society: Points https://points.datasociety.net/you-think-you-want-media-literacy-do-you-7cad6af18ec2

Buckingham, D. (2019a). Teaching Media in a ‘Post-Truth’ Age: Fake News, Media Bias and the Challenge for Media Literacy Education. Cultura y Educación, 31(2): pp. 1–19

Buckingham, D. (2019b). Rethinking digital literacy: Media education in the age of digital capitalism. https://ddbuckingham.files.wordpress.com/2019/12/media-education-in-digital-capitalism.pdf

Bulger, M. & Davidson, P. (2018). The Promises, Challenges and Futures of Media Literacy. Data and Society. https://datasociety.net/pubs/oh/DataAndSociety_Media_Literacy_2018.pdf

Doctorow, C. (2017). Three kinds of propaganda, and what to do about them. boingboing 25th February 2017, https://boingboing.net/2017/02/25/counternarratives-not-fact-che.html

Dudeney, G., Hockly, N. & Pegrum, M. (2013). Digital Literacies. Harlow: Pearson Education

Dummett, P. & Hughes, J. (2019). Critical Thinking in ELT. Boston: National Geographic Learning

Facione, P. A. (2000). The disposition toward critical thinking: Its character, measurement, and relation to critical thinking skill. Informal Logic, 20(1), 61–84.

Fuertes-Prieto, M. Á., Andrés-Sánchez, S., Corrochano-Fernández, D. et al. (2020). Pre-service Teachers’ False Beliefs in Superstitions and Pseudosciences in Relation to Science and Technology. Science & Education, 29: pp. 1235–1254. https://doi.org/10.1007/s11191-020-00140-8

Guess, A. M., Lerner, M., Lyons, B., Montgomery, J. M., Nyhan, B., Reifler, J. & Sircar, N. (2020). A digital media literacy intervention increases discernment between mainstream and false news in the United States and India. Proceedings of the National Academy of Sciences, 117(27): pp. 15536–15545. DOI: 10.1073/pnas.1920498117

Hajhosseiny, M. (2012). The Effect of Dialogic Teaching on Students’ Critical Thinking Disposition. Procedia – Social and Behavioral Sciences, 69: pp. 1358–1368

Huguet, A., Kavanagh, J., Baker, G. & Blumenthal, M. S. (2019). Exploring Media Literacy Education as a Tool for Mitigating Truth Decay. RAND Corporation, https://www.rand.org/content/dam/rand/pubs/research_reports/RR3000/RR3050/RAND_RR3050.pdf

Ioannidis, J. P. A. (2005). Why Most Published Research Findings Are False. PLoS Medicine 2 (8): e124. https://doi.org/10.1371/journal.pmed.0020124

Jeong, S. H., Cho, H., & Hwang, Y. (2012). Media literacy interventions: A meta-analytic review. Journal of Communication, 62, pp. 454–472

Jones-Jang, S. M., Mortensen, T. & Liu, J. (2019). Does media literacy help identification of fake news? Information literacy helps, but other literacies don’t. American Behavioral Scientist, pp. 1–18. doi:10.1177/0002764219869406

Jost, J. T., van der Linden, S., Panagopoulos, C. & Hardin, C. D. (2018). Ideological asymmetries in conformity, desire for shared reality, and the spread of misinformation. Current Opinion in Psychology, 23: pp. 77–83. doi:10.1016/j.copsyc.2018.01.003

Kavanagh, J. & Rich, M. D. (2018). Truth Decay: An Initial Exploration of the Diminishing Role of Facts and Analysis in American Public Life. RAND Corporation, https://www.rand.org/pubs/research_reports/RR2314.html

Lai, E. R. (2011). Critical Thinking: A Literature Review. Pearson. http://images.pearsonassessments.com/images/tmrs/CriticalThinkingReviewFINAL.pdf

Menz, C., Spinath, B. & Seifried, E. (2020). Misconceptions die hard: prevalence and reduction of wrong beliefs in topics from educational psychology among preservice teachers. European Journal of Psychology of Education https://doi.org/10.1007/s10212-020-00474-5

Milosevic, O. (2020). Promoting critical thinking in the EFL classroom. In Mavridi, S. & Xerri, D. (Eds.) English for 21st Century Skills. Newbury, Berks.: Express Publishing. pp. 17–22

Morozov, E. (2017). Moral panic over fake news hides the real enemy – the digital giants. The Guardian, 8 January 2017 https://www.theguardian.com/commentisfree/2017/jan/08/blaming-fake-news-not-the-answer-democracy-crisis

Mulnix, J. W. (2010). Thinking critically about critical thinking. Educational Philosophy and Theory.

Paul, R. W. (1992). Critical thinking: What, why, and how? New Directions for Community Colleges, 77: pp. 3–24.

Roozenbeek, J., Schneider, C. R., Dryhurst, S., Kerr, J., Freeman, A. L. J., Recchia, G., van der Bles, A. M. & van der Linden, S. (2020). Susceptibility to misinformation about COVID-19 around the world. Royal Society Open Science, 7(10). https://doi.org/10.1098/rsos.201199

Stanley, M., Barr, N., Peters, K. & Seli, P. (2020). Analytic-thinking predicts hoax beliefs and helping behaviors in response to the COVID-19 pandemic. PsyArXiv Preprints. doi:10.31234/osf.io/m3vt

Ur, P. (2020). Critical Thinking. In Mavridi, S. & Xerri, D. (Eds.) English for 21st Century Skills. Newbury, Berks.: Express Publishing. pp. 9–16

Willingham, D. T. (2007). Critical Thinking: Why Is It So Hard to Teach? American Educator, Summer 2007: pp. 8–19

‘Pre-teaching’ (of vocabulary) is a widely-used piece of language teaching jargon, but it’s a strange expression. The ‘pre’ indicates that it’s something that comes before something else that is more important, what Chia Suan Chong calls ‘the main event’, which is usually some reading or listening work. The basic idea, it seems, is to lessen the vocabulary load of the subsequent activity. If the focus on vocabulary were the ‘main event’, we might refer to the next activity as ‘post-reading’ or ‘post-listening’ … but we never do.

The term is used in standard training manuals by both Jim Scrivener (2005: 230 – 233) and Jeremy Harmer (2012: 137) and, with a few caveats, the practice is recommended. Now read this from the ELT Nile Glossary:

For many years teachers were recommended to pre-teach vocabulary before working on texts. Nowadays though, some question this, suggesting that the contexts that teachers are able to set up for pre-teaching are rarely meaningful and that pre-teaching in fact prevents learners from developing the attack strategies they need for dealing with challenging texts.

Chia is one of those doing this questioning. She suggests that ‘we cut out pre-teaching altogether and go straight for the main event. After all, if it’s a receptive skills lesson, then shouldn’t the focus be on reading/listening skills and strategies? And most importantly, pre-teaching prevents learners from developing a tolerance of ambiguity – a skill that is vital in language learning.’ Scott Thornbury is another who has expressed doubts about the value of PTV, although he is more circumspect in his opinions. He has argued that working out the meaning of vocabulary from context is probably a better approach and that PTV inadequately prepares learners for the real world. If we have to pre-teach, he argues, get it out of the way ‘as quickly and efficiently as possible’ … or ‘try post-teaching instead’.

Both Chia and Scott touch on the alternatives, and guessing the meaning of unknown words from context is one of them. I’ve discussed this area in an earlier post. Not wanting to rehash the content of that post here, the simple summary is this: it’s complicated. We cannot, with any degree of certainty, say that guessing meaning from context leads to more gains in either reading / listening comprehension or vocabulary development than PTV or one of the other alternatives – encouraging / allowing monolingual or bilingual dictionary look up (see this post on the topic), providing a glossary (see this post) or doing post-text vocabulary work.

In attempting to move towards a better understanding, the first problem is that there is very little research into the relationship between PTV and improved reading / listening comprehension. What there is (e.g. Webb, 2009) suggests that pre-teaching can improve comprehension and speed up reading, but there are other things that a teacher can do (e.g. previous presentation of comprehension questions or the provision of pictorial support) that appear to lead to more gains in these areas (Pellicer-Sánchez et al., 2021). It’s not exactly a ringing endorsement. There is even less research looking at the relationship between PTV and vocabulary development. What there is (Pellicer-Sánchez et al., 2021) suggests that pre-teaching leads to more vocabulary gains than when learners read without any support. But the reading-only condition is unlikely in most real-world learning contexts, where there is a teacher, dictionary or classmate who can be turned to. A more interesting contrast is perhaps between PTV and during-reading vocabulary instruction, which is a common approach in many classrooms. One study (File & Adams, 2010) looked at precisely this area and found little difference between the approaches in terms of vocabulary gains. The limited research does not provide us with any compelling reasons either for or against PTV.

Another problem is, as usual, that the research findings often imply more than was actually demonstrated. The abstract for the study by Pellicer-Sánchez et al (2021) states that pre‐reading instruction led to more vocabulary learning. But this needs to be considered in the light of the experimental details.

The study involved 87 L2 undergraduates and postgraduates studying at a British university. Their level of English was therefore very high, and we can’t really generalise to other learners at other levels in other conditions. The text that they read, a 2,290-word narrative containing a number of pseudo-words, was of no intrinsic interest, so the students reading it would treat it as an object of study, and they would notice the pseudo-words, both because their level of English was already high and because they knew that the focus of the research was on ‘new words’. In other words, the students’ behaviour was probably not at all typical of a student in a ‘normal’ classroom. In addition, the pseudo-words were all Anglo-Saxon-looking, and not therefore representative of the kinds of unknown items that students would encounter in authentic (or even pedagogical) texts (which would have a high proportion of words with Latin roots). I’m afraid I don’t think that the study tells us anything of value.

Perhaps research into an area like this, with so many variables that need to be controlled, is unlikely ever to provide teachers with clear answers to what appears to be a simple question: is PTV a good idea or not? However, I think we can get closer to something resembling useful advice if we take another tack. For this, I think two additional questions need to be asked. First, what is the intended main learning opportunity (note that I avoid the term ‘learning outcome’!) of the ‘main event’ – the reading or listening? Second, following on from the first question, what is the point of PTV, i.e. in what ways might it contribute to enriching the learning opportunities of the ‘main event’?

To answer the first question, I think it is useful to go back to a distinction made almost forty years ago in a paper by Tim Johns and Florence Davies (1983). They contrasted the Text as a Linguistic Object (TALO) with the Text as a Vehicle for Information (TAVI). The former (TALO) is something that language students study to learn language from in a direct way. It has typically been written or chosen to illustrate and to contextualise bits of grammar, and to provide opportunities for lexical ‘quarrying’. The latter (TAVI) is a text with intrinsic interest, read for information or pleasure, and therefore more appropriately selected by the learner, rather than the teacher. For an interesting discussion on TALO and TAVI, see this 2015 post from Geoff Jordan.

Johns and Davies wrote their article in pre-Headway days, when texts in almost all coursebooks were unashamedly TALOs, and when what were called top-down reading skills (reading for gist / detail, etc.) were only just beginning to find their way into language teaching materials. TAVIs were separate – graded readers, for example. In some parts of the world, TALOs and TAVIs are still separate, often with one teacher dealing with the teaching of discrete items of language through TALOs, and another responsible for ‘skills development’ through TAVIs. But, increasingly, under the influence of British publishers and methodologists, attempts have been made to combine TALOs and TAVIs in a single package. The syllabuses of most contemporary coursebooks, fundamentally driven by a discrete-item grammar-plus-vocabulary approach, also offer a ‘skills’ strand which requires texts to be intrinsically interesting, meaningful and relevant to today’s 21st century learners. The texts are required to carry out two functions.

Recent years have seen an increasingly widespread questioning of this approach. Does the exploitation of reading and listening texts in coursebooks (mostly through comprehension questions) actually lead to gains in reading and listening skills? Is there anything more than testing of comprehension going on? Or do they simply provide practice in strategic approaches to reading / listening, strategies which could probably be transferred from L1? As a result of the work of scholars like William Grabe (reading) and John Field and Richard Cauldwell (listening), there is now little, if any, debate in the world of research about these questions. If we want to develop the reading / listening skills of our students, the approach of most coursebooks is not the way to go about it. For a start, the reading texts are usually too short and the listening texts too long.

Most texts that are found in most contemporary coursebooks are TALOs dressed up to look like TAVIs. Their fundamental purpose is to illustrate and contextualise language that has either been pre-taught or will be explored later. They are first and foremost vehicles for language, and only secondarily vehicles for information. They are written and presented in as interesting a way as possible in order to motivate learners to engage with the TALO. Sometimes, they succeed.

However, there are occasions (even in coursebooks) when texts are TAVIs – used for purely ‘skills’ purposes, language use as opposed to language study. Typically, they (reading or listening texts) are used as springboards for speaking and / or writing practice that follows. It’s the information in the text that matters most.

So, where does all this take us with PTV? Here is my attempt at a breakdown of advice.

1 TALOs where the text contains a set of new lexical items which are a core focus of the lesson

If the text is basically a contextualized illustration of a set of lexical items (and, usually, a particular grammatical structure), there is a strong case for PTV. This is, of course, assuming that these items are of sufficiently high frequency to be suitable candidates for direct vocabulary instruction. If this is so, there is also a strong case to be made for the PTV to be what has been called ‘rich instruction’, which ‘involves (1) spending time on the word; (2) explicitly exploring several aspects of what is involved in knowing a word; and (3) involving learners in thoughtfully and actively processing the word’ (Nation, 2013: 117). In instances like this, PTV is something of a misnomer. It’s just plain teaching, and is likely to need as much, or more, time than exploration of the text (which may be viewed as further practice of / exposure to the lexis).

If the text is primarily intended as lexical input, there is also a good case to be made for making the target items it contains more salient by, for example, highlighting them or putting them in bold (Choi, 2017). At the same time, if ‘PTV’ is to lead to lexical gains, these are likely to be augmented by post-reading tasks which also focus explicitly on the target items (Sonbul & Schmitt, 2010).

2 TALOs which contain a set of lexical items that are necessary for comprehension of the text, but not a core focus of the lesson (e.g. because they are low-frequency)

PTV is often time-consuming, and necessarily so if the instruction is rich. If it is largely restricted to matching items to meanings (e.g. through translation), it is likely to have little impact on vocabulary development, and its short-term impact on comprehension appears to be limited. Research suggests that the use of a glossary is more efficient, since learners will only refer to it when they need to (whereas PTV is likely to devote some time to some items that are known to some learners, and this takes place before the knowledge is required … and may therefore be forgotten in the interim). Glossaries lead to better comprehension (Alessi & Dwyer, 2008).

3 TAVIs

I don’t have any principled objection to the occasional use of texts as TALOs, but it seems fairly clear that a healthy textual diet for language learners will contain substantially more TAVIs than TALOs, substantially more extensive reading than intensive reading of the kind found in most coursebooks. If we focused less often on direct instruction of grammar (a change of emphasis which is long overdue), there would be less need for TALOs, anyway. With TAVIs, there seems to be no good reason for PTV: glossaries or digital dictionary look-up will do just fine.

However, one alternative justification and use of PTV is offered by Scott Thornbury. He suggests identifying a relatively small number of key words from a text that will be needed for global understanding. Some of them may be unknown to the learners, and, for these, learners use dictionaries to check meaning. Then, looking at the list of key words, learners predict what the text will be about. The rationale here is that if learners engage with these words before encountering them in the text, it ‘may be an effective way of activating a learner’s schema for the text, and this may help to support comprehension’ (Ballance, 2018). However, as Ballance notes, describing this kind of activity as PTV would be something of a misnomer: it is a useful addition to a teacher’s repertoire of schema-activation activities (which might be used with both TAVIs and TALOs).

In short …

The big question about PTV, then, is not one of ‘yes’ or ‘no’. It’s a question of what the activity is for. Ballance (2018) offers a good summary:

‘In sum, for teachers to use PTV effectively, it is essential that they clearly identify a rationale for including PTV within a lesson, select the words to be taught in conjunction with this rationale and also design the vocabulary learning or development exercise in a manner that is commensurate with this rationale. The rationale should be the determining factor in the design of a PTV component within a lesson, and different rationales for using PTV naturally lead to markedly different selections of vocabulary items to be studied and different exercise designs.’

REFERENCES

Alessi, S. & Dwyer, A. (2008). Vocabulary assistance before and during reading. Reading in a Foreign Language, 20 (2): pp. 246 – 263

Ballance, O. J. (2018). Strategies for pre-teaching vocabulary in context. In The TESOL Encyclopedia of English Language Teaching (pp. 1-7). Wiley. https://doi.org/10.1002/9781118784235.eelt0732

Choi, S. (2017). Processing and learning of enhanced English collocations: An eye movement study. Language Teaching Research, 21, 403–426. https://doi.org/10.1177/1362168816653271

File, K. A. & Adams, R. (2010). Should vocabulary instruction be integrated or isolated? TESOL Quarterly, 44 (2): pp. 222 – 249

Harmer, J. (2012). Essential Teacher Knowledge. Harlow: Pearson

Johns, T. & Davies, F. (1983). Text as a vehicle for information: the classroom use of written texts in teaching reading in a foreign language. Reading in a Foreign Language, 1 (1): pp. 1 – 19

Nation, I. S. P. (2013). Learning Vocabulary in Another Language 2nd Edition. Cambridge: Cambridge University Press

Pellicer-Sánchez, A., Conklin, K. & Vilkaitė-Lozdienė, L. (2021). The effect of pre-reading instruction on vocabulary learning: An investigation of L1 and L2 readers’ eye movements. Language Learning (advance online publication). https://onlinelibrary.wiley.com/doi/full/10.1111/lang.12430

Scrivener, J. (2005). Learning Teaching 2nd Edition. Oxford: Macmillan

Sonbul, S. & Schmitt, N. (2010). Direct teaching of vocabulary after reading: is it worth the effort? ELT Journal 64 (3): pp.253 – 260

Webb, S. (2009). The effects of pre‐learning vocabulary on reading comprehension and writing. The Canadian Modern Language Review, 65 (3): pp. 441–470.

In the last post, I mentioned a lesson plan from an article by Pegrum, Dudeney and Hockly (2018, ‘Digital literacies revisited’, The European Journal of Applied Linguistics and TEFL, 7 (2), pp. 3 – 24) in which students discuss the data that is collected by fitness apps and the possibility of using this data to calculate health insurance premiums, before carrying out and sharing online research about companies that track personal data. It’s a nice plan, unfortunately pay-walled, but you could try requesting a copy through ResearchGate.

The only other off-the-shelf lesson plan I have been able to find is entitled ‘You and Your Data’, from the British Council. Suitable for level B2, this plan, along with a photocopiable pdf, contains a vocabulary (matching) task, a reading text (you and your data, who uses our data and why, can you protect your data) with true / false and sentence-completion tasks, and a discussion (what do you do to protect your data?). The material was written to coincide with Safer Internet Day (an EU project), which takes place in early February (next date: 9 February 2021). The related website, Better Internet for Kids, contains links to a wide range of educational resources for younger learners.

For other resources, a good first stop is Ina Sander’s ‘A Critically Commented Guide to Data Literacy Tools’, in which she describes and evaluates a wide range of educational online resources for developing critical data literacy. Some of the resources that I discuss below are also evaluated in this guide. Here are some suggestions for learning / teaching resources.

A glossary

This is simply a glossary of terms that are useful in discussing data issues. It could easily be converted into a matching exercise or flashcards.

A series of interactive videos

‘Do Not Track’ is an award-winning series of interactive videos, produced by a consortium of broadcasters. In seven parts, the videos consider such issues as who profits from the personal data that we generate online, the role of cookies in the internet economy, how online profiling is done, the data generated by mobile phones and how algorithms interpret the data.

Each episode is between 5 and 10 minutes long, and is therefore ideal for asynchronous viewing. In a survey of critical data literacy tools (Sander, 2020), ‘Do not Track’ proved popular with the students who used it. I highly recommend it, but students will probably need a B2 level or higher.

More informational videos

If you do not have time to watch the ‘Do Not Track’ video series, you may want to use something shorter. There are a huge number of freely available videos about online privacy. I have selected just two which I think would be useful. You may be able to find something better!

1 Students watch a video about how cookies work. This video, from Vox, is well-produced and is just under 7 minutes long. The speaker talks fairly rapidly, so captions may be helpful.

2 Students watch a video as an introduction to the topic of surveillance and privacy. This video, ‘Reclaim our Privacy’, was produced by ‘La Quadrature du Net’, a French advocacy group that promotes the digital rights and freedoms of citizens. It is short (3 mins) and can be watched with or without captions (English or 6 other languages). Its message is simple: political leaders should ensure that our online privacy is respected.

A simple matching task ‘ten principles for online privacy’

1 Share the image below with all the students and ask them to take a few minutes matching the illustrations to the principles on the right. There is no need for anyone to write or say anything, but it doesn’t matter if some students write the answers in the chat box.

(Note: This image and the other ideas for this activity are adapted from https://teachingprivacy.org/ , a project developed by the International Computer Science Institute and the University of California, Berkeley for secondary school students and undergraduates. Each of the images corresponds to a course module, which contains a wide range of materials (videos, readings, discussions, etc.) which you may wish to explore more fully.)

2 Share the image below (which shows the answers in abbreviated form). Ask if anyone needs anything clarified.

You’re Leaving Footprints Principle: Your information footprint is larger than you think.

There’s No Anonymity Principle: There is no anonymity on the Internet.

Information Is Valuable Principle: Information about you on the Internet will be used by somebody in their interest — including against you.

Someone Could Listen Principle: Communication over a network, unless strongly encrypted, is never just between two parties.

Sharing Releases Control Principle: Sharing information over a network means you give up control over that information — forever.

Search Is Improving Principle: Just because something can’t be found today, doesn’t mean it can’t be found tomorrow.

Online Is Real Principle: The online world is inseparable from the “real” world.

Identity Isn’t Guaranteed Principle: Identity is not guaranteed on the Internet.

You Can’t Escape Principle: You can’t avoid having an information footprint by not going online.

Privacy Requires Work Principle: Only you have an interest in maintaining your privacy.

3 Wrap up with a discussion of these principles.

Hands-on exploration of privacy tools

Click on the link below to download the procedure for the activity, as well as supporting material.

A graphic novel

Written by Michael Keller and Josh Neufeld, and produced by Al Jazeera, this graphic novel ‘Terms of Service. Understanding our role in the world of Big Data’ provides a good overview of critical data literacy issues, offering lots of interesting, concrete examples of real cases. The language is, however, challenging (C1+). It may be especially useful for trainee teachers.

A website

The Privacy International website is an extraordinary goldmine of information and resources. Rather than recommending anything specific, my suggestion is that you, or your students, use the ‘Search’ function on the homepage and see where you end up.

In the first post in this 3-part series, I focussed on data collection practices in a number of ELT websites, as a way of introducing ‘critical data literacy’. Here, I explore the term in more detail.

Although the term ‘big data’ has been around for a while (see this article and infographic), it is less than ten years since it began to enter everyday language and found its way into the OED (2013). In the same year, Viktor Mayer-Schönberger and Kenneth Cukier published their best-selling ‘Big Data: A Revolution That Will Transform How We Live, Work, and Think’ (2013), and it was hard to avoid enthusiastic references in the media to the transformative potential of big data in every sector of society.

Since then, the use of big data and analytics has become ubiquitous. Massive data collection (and data surveillance) has now become routine and companies like Palantir, which specialise in big data analytics, have become part of everyday life. Palantir’s customers include the LAPD, the CIA, the US Immigration and Customs Enforcement (ICE) and the British Government. Its recent history includes links with Cambridge Analytica, assistance in an operation to arrest the parents of illegal migrant children, and a racial discrimination lawsuit where the company was accused of having ‘routinely eliminated’ Asian job applicants (settled out of court for $1.7 million).

Unsurprisingly, the datafication of society has not gone entirely uncontested. Whilst the vast majority of people seem happy to trade their personal data for convenience and connectivity, a growing number are concerned about who benefits most from this trade-off. On an institutional level, the EU introduced the General Data Protection Regulation (GDPR), which led to Google being fined €50 million for insufficient transparency in their privacy policy and their practices of processing personal data for the purposes of behavioural advertising. In the intellectual sphere, there has been a recent spate of books that challenge the practices of ubiquitous data collection, coining new terms like ‘surveillance capitalism’, ‘digital capitalism’ and ‘data colonialism’. Here are four recent books that I have found particularly interesting.

Beer, D. (2019). The Data Gaze. London: Sage

Couldry, N. & Mejias, U. A. (2019). The Costs of Connection. Stanford: Stanford University Press

Sadowski, J. (2020). Too Smart. Cambridge, Mass.: MIT Press

Zuboff, S. (2019). The Age of Surveillance Capitalism. New York: Public Affairs

The use of big data and analytics in education is also now a thriving industry, with its supporters claiming that these technologies can lead to greater personalization, greater efficiency of instruction and greater accountability. Opponents (myself included) argue that none of these supposed gains have been empirically demonstrated, and that the costs to privacy, equity and democracy outweigh any potential gains. There is a growing critical literature and useful, recent books include:

Bradbury, A. & Roberts-Holmes, G. (2018). The Datafication of Primary and Early Years Education. Abingdon: Routledge

Jarke, J. & Breiter, A. (Eds.) (2020). The Datafication of Education. Abingdon: Routledge

Williamson, B. (2017). Big Data in Education: The digital future of learning, policy and practice. London: Sage

Concomitant with the rapid growth in the use of digital tools for language learning and teaching, and therefore the rapid growth in the amount of data that learners were (mostly unwittingly) giving away, came a growing interest in the need for learners to develop a set of digital competencies, or literacies, which would enable them to use these tools effectively. In the same year that Mayer-Schönberger and Cukier brought out their ‘Big Data’ book, the first book devoted to digital literacies in English language teaching came out (Dudeney et al., 2013). They defined digital literacies as the individual and social skills needed to effectively interpret, manage, share and create meaning in the growing range of digital communication channels (Dudeney et al., 2013: 2). The book contained a couple of activities designed to raise students’ awareness of online identity issues, along with others intended to promote critical thinking about digitally-mediated information (what the authors call ‘information literacy’), but ‘critical literacy’ was missing from the authors’ framework.

Critical thinking and critical literacy are not the same thing. Although there is no generally agreed definition of the former (with a small ‘c’), it is focussed primarily on logic and comprehension (Lee, 2011). Paul Dummett and John Hughes (2019: 4) describe it as ‘a mindset that involves thinking reflectively, rationally and reasonably’. The prototypical critical thinking activity involves the analysis of a piece of fake news (e.g. the task where students look at a website about tree octopuses in Dudeney et al. 2013: 198 – 203). Critical literacy, on the other hand, involves standing back from texts and technologies and viewing them as ‘circulating within a larger social and textual context’ (Warnick, 2002). Consideration of the larger social context necessarily entails consideration of unequal power relationships (Lee, 2011; Darvin, 2017), such as that between Google and the average user of Google. And it follows from this that critical literacy has a socio-political emancipatory function.

Critical digital literacy is now a growing field of enquiry (e.g. Pötzsch, 2019) and there is an awareness that digital competence frameworks, such as the Digital Competence Framework of the European Commission, are incomplete and out of date without the inclusion of critical digital literacy. Dudeney et al. (2013) clearly recognise the importance of including critical literacy in frameworks of digital literacies. In Pegrum et al. (2018, unfortunately paywalled), they update the framework from their 2013 book, and the biggest change is the inclusion of critical literacy. They divide this into the following:

  • critical digital literacy – closely related to information literacy
  • critical mobile literacy – focussing on issues brought to the fore by mobile devices, ranging from protecting privacy through to safeguarding mental and physical health
  • critical material literacy – concerned with the material conditions underpinning the use of digital technologies, ranging from the socioeconomic influences on technological access to the environmental impacts of technological manufacturing and disposal
  • critical philosophical literacy – concerned with the big questions posed to and about humanity as our lives become conjoined with the existence of our smart devices, robots and AI
  • critical academic literacy, which refers to the pressing need to conduct meaningful studies of digital technologies in place of what is at times ‘cookie-cutter’ research

I’m not entirely convinced by the sub-divisions, but labelling in this area is still in its infancy. My particular interest here, in critical data literacy, seems to span a number of their sub-divisions. And the term that I am using, ‘critical data literacy’, which I’ve taken from Tygel & Kirsch (2016), is sometimes referred to as ‘critical big data literacy’ (Sander, 2020a) or ‘personal data literacy’ (Pangrazio & Selwyn, 2019). Whatever it is called, it is the development of ‘informed and critical stances toward how and why [our] data are being used’ (Pangrazio & Selwyn, 2018). One of the two practical activities in the Pegrum et al. (2018) article looks at precisely this area (the task requires students to consider the data that is collected by fitness apps). It will be interesting to see, when the new edition of the ‘Digital Literacies’ book comes out (perhaps some time next year), how many other activities take a more overtly critical stance.

In the next post, I’ll be looking at a range of practical activities for developing critical data literacy in the classroom. This involves both bridging the gaps in knowledge (about data, algorithms and online privacy) and learning, practically, how to implement ‘this knowledge for a more empowered internet usage’ (Sander, 2020b).

Without wanting to invalidate the suggestions in the next post, a word of caution is needed. Just as critical thinking activities in the ELT classroom cannot be assumed to lead to any demonstrable increase in critical thinking (although there may be other benefits to the activities), activities to promote critical literacy cannot be assumed to lead to any actual increase in critical literacy. The reaction of many people may well be ‘It’s not like it’s life or death or whatever’ (Pangrazio & Selwyn, 2018). And, perhaps, education is rarely, if ever, a solution to political and social problems, anyway. And perhaps, too, we shouldn’t worry too much about educational interventions not leading to their intended outcomes. Isn’t that almost always the case? But, with those provisos in mind, I’ll come back next time with some practical ideas.

REFERENCES

Darvin R. (2017). Language, Ideology, and Critical Digital Literacy. In: Thorne S., May S. (eds) Language, Education and Technology. Encyclopedia of Language and Education (3rd ed.). Springer, Cham. pp. 17 – 30 https://doi.org/10.1007/978-3-319-02237-6_35

Dudeney, G., Hockly, N. & Pegrum, M. (2013). Digital Literacies. Harlow: Pearson Education

Dummett, P. & Hughes, J. (2019). Critical Thinking in ELT. Boston: National Geographic Learning

Lee, C. J. (2011). Myths about critical literacy: What teachers need to unlearn. Journal of Language and Literacy Education [Online], 7 (1), 95-102. Available at http://www.coa.uga.edu/jolle/2011_1/lee.pdf

Mayer-Schönberger, V. & Cukier, K. (2013). Big Data: A Revolution That Will Transform How We Live, Work, and Think. London: John Murray

Pangrazio, L. & Selwyn, N. (2018). ‘It’s not like it’s life or death or whatever’: young people’s understandings of social media data. Social Media + Society, 4 (3): pp. 1–9. https://journals.sagepub.com/doi/pdf/10.1177/2056305118787808

Pangrazio, L. & Selwyn, N. (2019). ‘Personal data literacies’: A critical literacies approach to enhancing understandings of personal digital data. New Media and Society, 21 (2): pp. 419 – 437

Pegrum, M., Dudeney, G. & Hockly, N. (2018). Digital literacies revisited. The European Journal of Applied Linguistics and TEFL, 7 (2), pp. 3-24

Pötzsch, H. (2019). Critical Digital Literacy: Technology in Education Beyond Issues of User Competence and Labour-Market Qualifications. tripleC: Communication, Capitalism & Critique, 17: pp. 221 – 240 Available at https://www.triple-c.at/index.php/tripleC/article/view/1093

Sander, I. (2020a). What is critical big data literacy and how can it be implemented? Internet Policy Review, 9 (2). DOI: 10.14763/2020.2.1479 https://www.econstor.eu/bitstream/10419/218936/1/2020-2-1479.pdf

Sander, I. (2020b). Critical big data literacy tools – Engaging citizens and promoting empowered internet usage. Data & Policy, 2: DOI: https://doi.org/10.1017/dap.2020.5

Tygel, A. & Kirsch, R. (2016). Contributions of Paulo Freire for a Critical Data Literacy: a Popular Education Approach. The Journal of Community Informatics, 12 (3). Available at http://www.ci-journal.net/index.php/ciej/article/view/1296

Warnick, B. (2002). Critical Literacy in a Digital Era. Mahwah, NJ, Lawrence Erlbaum Associates

Take the Cambridge Assessment English website, for example. When you connect to the site, you will see, at the bottom of the screen, a familiar (to people in Europe, at least) notification about the site’s use of cookies: the cookie consent notice.

You probably trust the site, so you ignore the notification and quickly move on to find the resource you are looking for. But if you did click on the hyperlinked ‘set cookies’, what would you find? The first link takes you to the ‘Cookie policy’, where you will be told that ‘We use cookies principally because we want to make our websites and mobile applications user-friendly, and we are interested in anonymous user behaviour. Generally our cookies don’t store sensitive or personally identifiable information such as your name and address or credit card details’. Scroll down, and you will find out more about the kind of cookies that are used. Besides the cookies that are necessary to the functioning of the site, you will see that there are also ‘third party cookies’. These are explained as follows: ‘Cambridge Assessment works with third parties who serve advertisements or present offers on our behalf and personalise the content that you see. Cookies may be used by those third parties to build a profile of your interests and show you relevant adverts on other sites. They do not store personal information directly but use a unique identifier in your browser or internet device. If you do not allow these cookies, you will experience less targeted content’.

This is not factually inaccurate: personal information is not stored directly. However, it is extremely easy for this information to be triangulated with other information to identify you personally. In addition to the data that you generate by having cookies on your device, Cambridge Assessment will also directly collect data about you. Depending on your interactions with Cambridge Assessment, this will include ‘your name, date of birth, gender, contact data including your home/work postal address, email address and phone number, transaction data including your credit card number when you make a payment to us, technical data including internet protocol (IP) address, login data, browser type and technology used to access this website’. They say they may share this data ‘with other people and/or businesses who provide services on our behalf or at our request’ and ‘with social media platforms, including but not limited to Facebook, Google, Google Analytics, LinkedIn, in pseudonymised or anonymised forms’.

In short, Cambridge Assessment may hold a huge amount of data about you and they can, basically, do what they like with it.

The cookie and privacy policies are fairly standard, as is the lack of transparency in their phrasing. Rather more transparency would include, for example, information about which particular ad trackers you are giving your consent to. This information can be found with a browser extension tool like Ghostery, and these trackers can be blocked. As you’ll see below, there are 5 ad trackers on this site. This is rather more than on other sites that English language teachers are likely to visit. ETS-TOEFL has 4, Macmillan English and Pearson have 3, CUP ELT and the British Council Teaching English have 1, and OUP ELT, IATEFL, BBC Learning English and Trinity College have none. The only site I could find with more was TESOL, which has 6. The blogs for all these organisations invariably have more trackers than their websites.

The use of numerous ad trackers is probably a reflection of the importance that Cambridge Assessment gives to social media marketing. There is a research paper, produced by Cambridge Assessment, which outlines the significance of big data and social media analytics. They have far more Facebook followers (and nearly 6 million likes) than any other ELT page, and they are proud of their #1 ranking in the education category on social media. The amount of data that can be collected here is enormous, and it can be analysed in myriad ways using tools like Ubervu, Yomego and Hootsuite.

A little more transparency, however, would not go amiss. According to a report in Vox, Apple has announced that some time next year ‘iPhone users will start seeing a new question when they use many of the apps on their devices: Do they want the app to follow them around the internet, tracking their behavior?’ Obviously, Google and Facebook are none too pleased about this and will be fighting back. The implications for ad trackers and online advertising, more generally, are potentially huge. I wrote to Cambridge Assessment about this and was pleased to hear that ‘Cambridge Assessment are currently reviewing the process by which we obtain users consent for the use of cookies with the intention of moving to a much more transparent model in the future’. Let’s hope that other ELT organisations are doing the same.

You may be less bothered than I am by the thought of dozens of ad trackers following you around the net so that you can be served with more personalized ads. But the digital profile about you, to which these cookies contribute, may include information about your ethnicity, disabilities and sexual orientation. This profile is auctioned to advertisers when you visit some sites, allowing them to show you ‘personalized’ adverts based on the categories in your digital profile. Contrary to EU regulations, these categories may include whether you have cancer, a substance-abuse problem, your politics and religion (as reported in Fortune https://fortune.com/2019/01/28/google-iab-sensitive-profiles/ ).

But it’s not these cookies that are the most worrying aspect of our lack of digital privacy. It’s the sheer quantity of personal data that is stored about us. Every time we ask our students to use an app or a platform, we are asking them to divulge huge amounts of data. With ClassDojo, for example, this includes names, usernames, passwords, age, addresses, photographs, videos, documents, drawings, or audio files, IP addresses and browser details, clicks, referring URLs, time spent on site, and page views (Manolev et al., 2019; see also Williamson, 2019).

It is now widely recognized that the ‘consent’ that is obtained through cookie policies and other end-user agreements is largely spurious. These consent agreements, as Sadowski (2019) observes, are non-negotiated, and non-negotiable; you either agree or you are denied access. What’s more, he adds, citing one study, it would take 76 days, working for 8 hours a day, to read the privacy policies a person typically encounters in a year. As a result, most of us choose not to choose when we accept online services (Cobo, 2019: 25). We have little, if any, control over how the data that is collected is used (Birch et al., 2020). More importantly, perhaps, when we ask our students to sign up to an educational app, we are asking / telling them to give away their personal data, not just ours. They are unlikely to fully understand the consequences of doing so.

The extent of this ignorance is also now widely recognized. In the UK, for example, two reports (cited by Sander, 2020) indicate that ‘only a third of people know that data they have not actively chosen to share has been collected’ (Doteveryone, 2018: 5), and that ‘less than half of British adult internet users are aware that apps collect their location and information on their personal preferences’ (Ofcom, 2019: 14).

The main problem with this has been expressed by the programmer and activist Richard Stallman, in an interview with New York magazine (Kulwin, 2018): ‘Companies are collecting data about people. The data that is collected will be abused. That’s not an absolute certainty, but it’s a practical, extreme likelihood, which is enough to make collection a problem.’

The abuse that Stallman is referring to can come in a variety of forms. At the relatively trivial end is personalized advertising. Much more serious is the way that data aggregation companies scrape data from a variety of sources, building up individual data profiles which can be used to make significant life-impacting decisions, such as final academic grades or whether one is offered a job, insurance or credit (Manolev et al., 2019). Cathy O’Neil’s (2016) best-selling ‘Weapons of Math Destruction’ spells out in detail how this abuse of data increases racial, gender and class inequalities. And, after the revelations of Edward Snowden, we all know about the routine collection by states of huge amounts of data about, well, everyone. Whether it’s used for predictive policing, straightforward repression or something else, it is simply not possible for younger people, our students, to know what personal data they may regret divulging at a later date.

Digital educational providers may try to reassure us that they will keep data private, and not use it for advertising purposes, but the reassurances are hollow. These companies may change their terms and conditions further down the line, and there are examples of this happening (Moore, 2018: 210). But even if this does not happen, the data can never be secure. Illegal data breaches and cyber attacks are relentless, and education ranked worst at cybersecurity out of 17 major industries in one recent analysis (Foresman, 2018). One report suggests that one in five US schools and colleges has fallen victim to cyber-crime. Two weeks ago, I learnt (by chance, as I happened to be looking at my security settings on Chrome) that my passwords for Quizlet, Future Learn, Elsevier and Science Direct had been compromised by a data breach. To get a better understanding of the scale of data breaches, you might like to look at the UK’s IT Governance site, which lists detected and publicly disclosed data breaches and cyber attacks each month (36.6 million records breached in August 2020). If you scroll through the list, you’ll see how many of them are educational sites. You’ll also see a comment about how leaky organisations have been throughout lockdown … because they weren’t prepared for the sudden shift online.

Recent years have seen a growing consensus that ‘it is crucial for language teaching to […] encompass the digital literacies which are increasingly central to learners’ […] lives’ (Dudeney et al., 2013). Most of the focus has been on the skills that are needed to use digital media. There also appears to be growing interest in developing critical thinking skills in the context of digital media (e.g. Peachey, 2016) – identifying fake news and so on. To a much lesser extent, there has been some focus on ‘issues of digital identity, responsibility, safety and ethics when students use these technologies’ (Mavridi, 2020a: 172). Mavridi (2020b: 91) also briefly discusses the personal risks of digital footprints, but she does not have the space to explore more fully the notion of critical data literacy. This literacy involves an understanding of not just the personal risks of using ‘free’ educational apps and platforms, but of why they are ‘free’ in the first place. Sander (2020b) suggests that this literacy entails ‘an understanding of datafication, recognizing the risks and benefits of the growing prevalence of data collection, analytics, automation, and predictive systems, as well as being able to critically reflect upon these developments. This includes, but goes beyond the skills of, for example, changing one’s social media settings, and rather constitutes an altered view on the pervasive, structural, and systemic levels of changing big data systems in our datafied societies’.

In my next two posts, I will, first of all, explore in more detail the idea of critical data literacy, before suggesting a range of classroom resources.

(I posted about privacy in March 2014, when I looked at the connections between big data and personalized / adaptive learning. In another post, September 2014, I looked at the claims of the CEO of Knewton, who bragged that his company had ‘five orders of magnitude more data about you than Google has. […] We literally have more data about our students than any company has about anybody else about anything, and it’s not even close’. You might find both of these posts interesting.)

References

Birch, K., Chiappetta, M. & Artyushina, A. (2020). ‘The problem of innovation in technoscientific capitalism: data rentiership and the policy implications of turning personal digital data into a private asset’ Policy Studies, 41:5, 468-487, DOI: 10.1080/01442872.2020.1748264

Cobo, C. (2019). I Accept the Terms and Conditions. https://adaptivelearninginelt.files.wordpress.com/2020/01/41acf-cd84b5_7a6e74f4592c460b8f34d1f69f2d5068.pdf

Doteveryone. (2018). People, Power and Technology: The 2018 Digital Attitudes Report. https://attitudes.doteveryone.org.uk

Dudeney, G., Hockly, N. & Pegrum, M. (2013). Digital Literacies. Harlow: Pearson Education

Foresman, B. (2018). Education ranked worst at cybersecurity out of 17 major industries. Edscoop, December 17, 2018. https://edscoop.com/education-ranked-worst-at-cybersecurity-out-of-17-major-industries/

Kulwin, K. (2018). ‘F*ck Them. We Need a Law’: A Legendary Programmer Takes on Silicon Valley. New York Intelligencer, 2018. https://nymag.com/intelligencer/2018/04/richard-stallman-rms-on-privacy-data-and-free-software.html

Manolev, J., Sullivan, A. & Slee, R. (2019). ‘Vast amounts of data about our children are being harvested and stored via apps used by schools’ EduResearch Matters, February 18, 2019. https://www.aare.edu.au/blog/?p=3712

Mavridi, S. (2020a). Fostering Students’ Digital Responsibility, Ethics and Safety Skills (Dress). In Mavridi, S. & Saumell, V. (Eds.) Digital Innovations and Research in Language Learning. Faversham, Kent: IATEFL. pp. 170 – 196

Mavridi, S. (2020b). Digital literacies and the new digital divide. In Mavridi, S. & Xerri, D. (Eds.) English for 21st Century Skills. Newbury, Berks.: Express Publishing. pp. 90 – 98

Moore, M. (2018). Democracy Hacked. London: Oneworld

Ofcom. (2019). Adults: Media use and attitudes report [Report]. https://www.ofcom.org.uk/__data/assets/pdf_file/0021/149124/adults-media-use-and-attitudes-report.pdf

O’Neil, C. (2016). Weapons of Math Destruction. London: Allen Lane

Peachey, N. (2016). Thinking Critically through Digital Media. http://peacheypublications.com/

Sadowski, J. (2019). ‘When data is capital: Datafication, accumulation, and extraction’ Big Data and Society 6 (1) https://doi.org/10.1177%2F2053951718820549

Sander, I. (2020a). What is critical big data literacy and how can it be implemented? Internet Policy Review, 9 (2). DOI: 10.14763/2020.2.1479 https://www.econstor.eu/bitstream/10419/218936/1/2020-2-1479.pdf

Sander, I. (2020b). Critical big data literacy tools—Engaging citizens and promoting empowered internet usage. Data & Policy, 2: e5 doi:10.1017/dap.2020.5

Williamson, B. (2019). ‘Killer Apps for the Classroom? Developing Critical Perspectives on ClassDojo and the ‘Ed-tech’ Industry’ Journal of Professional Learning, 2019 (Semester 2) https://cpl.asn.au/journal/semester-2-2019/killer-apps-for-the-classroom-developing-critical-perspectives-on-classdojo

A week or so ago, someone in the Macmillan marketing department took it upon themselves to send out this tweet. What grabbed my attention was the claim that it is ‘a well-known fact’ that teaching students a growth mindset makes them perform better academically over time. The easily demonstrable reality (which I’ll come on to) is that this is not a fact. It’s fake news, being used for marketing purposes. The tweet links to a blog post of over a year ago. In it, Chia Suan Chong offers five tips for developing a growth mindset in students: educating students about neuroplasticity, delving deeper into success stories, celebrating challenges and mistakes, encouraging students to go outside their comfort zones, and giving ‘growth-mindset-feedback’. All of which, she suggests, might help our students. Indeed, it might, and, even if it doesn’t, it might be worth a try anyway. Chia doesn’t make any claims beyond the potential of the suggested strategies, so I wonder where the Macmillan Twitter account person got the ‘well-known fact’.

If you google ‘mindset ELT’, you will find webpage after webpage offering tips about how to promote growth mindset in learners. It’s rare for the writers of these pages to claim that the positive effects of mindset interventions are a ‘fact’, but it’s even rarer to come across anyone who suggests that mindset interventions might be an à la mode waste of time and effort. Even in more serious literature (e.g. Mercer, S. & Ryan, S. (2010). A mindset for EFL: learners’ beliefs about the role of natural talent. ELT Journal, 64 (4): 436 – 444), the approach is fundamentally enthusiastic, with no indication that there might be a problem with mindset theory. Given that this enthusiasm is repeated so often, perhaps we should not blame the Macmillan tweeter for falling victim to the illusory truth effect. After all, it appears that 98% of teachers in the US feel that growth mindset approaches should be adopted in schools (Hendrick, 2019).

Chia suggests that we can all have fixed mindsets in certain domains (e.g. I know all about that, there’s nothing more I can learn). One domain where it seems that fixed mindsets are prevalent is mindset theory itself. This post is an attempt to nudge towards more ‘growth’ and, in trying to persuade you to be more sceptical, I will quote as much as possible from Carol Dweck, the founder of mindset theory, and her close associates.

Carol Dweck’s book ‘Mindset: The New Psychology of Success’ appeared in 2006. In it, she argued that people can be placed on a continuum between those who have ‘a fixed mindset–those who believe that abilities are fixed—[and who] are less likely to flourish [and] those with a growth mindset–those who believe that abilities can be developed’ (from the back cover of the updated (2007) version of the book). There was nothing especially new about the idea. It is very close to Bandura’s (1982) theory of self-efficacy, which will be familiar to anyone who has read Zoltán Dörnyei’s more recent work on motivation in language learning. It’s closely related to Carl Roger’s (1969) ideas about self-concept and it’s not a million miles removed, either, from Maslow’s (1943) theory of self-actualization. The work of Rogers and Maslow was at the heart of the ‘humanistic turn’ in ELT in the latter part of the 20th century (see, for example, Early, 1981), so mindset theory is likely to resonate with anyone who was inspired by the humanistic work of people like Moskowitz, Stevick or Rinvolucri. The appeal of mindset theory is easy to see. Besides its novelty value, it resonates emotionally with the values that many teachers share, writes Tom Bennett: it feels right that you don’t criticise the person, but invite them to believe that, through hard work and persistence, you can achieve.

We might even trace interest in the importance of self-belief back to the Stoics (who, incidentally but not coincidentally, are experiencing a revival of interest), but Carol Dweck introduced a more modern flavour to the old wine and packaged it skilfully and accessibly in shiny new bottles. Her book was a runaway bestseller, with sales in the millions, and her TED Talk has now had over 11 million views. It was in education that mindset theory became particularly popular. As a mini-industry it is now worth millions and millions. Just one research project into the efficacy of one mindset product has received 3.5 million dollars in US federal funding.

But, much like other popular-psychology ideas that seem to offer simple solutions to complex problems (Howard Gardner’s ‘multiple intelligences’ theory, for example), mindset theory soon faced pushback. It wasn’t hard for critics to scoff at motivational ‘yes-you-can’ posters in classrooms or accounts of well-meaning but misguided teacher interventions, like this one reported by Carl Hendrick:

One teacher [took] her children out into the pristine snow covering the school playground[, and] instructed them to walk around, taking note of their footprints. “Look at these paths you’ve been creating,” the teacher said. “In the same way that you’re creating new pathways in the snow, learning creates new pathways in your brain.”

Carol Dweck was sympathetic to the critics. She has described the early reaction to her book as ‘uncontrollable’. She freely admits that she and her colleagues had underestimated the issues around mindset interventions in the classrooms and that such interventions were ‘not yet evidence-based’. She identified two major areas where mindset interventions have gone awry. The first of these is when a teacher teaches the concept of mindsets to students, but does not change other policies and practices in the classroom. The second is that some teachers have focussed too much on praising their learners’ efforts. Teachers have taken mindset recipes and tips, without due consideration. She says:

Teachers have to ask, what exactly is the evidence suggesting? They have to realise it takes deep thought and deep experimentation on their part in the classroom to see how best the concept can be implemented there. This should be a group enterprise, in which they share what worked, what did not work, for whom and when. People need to recognise we are researchers, we have produced a body of evidence that says under these conditions this is what happened. We have not explored all the conditions that are possible. Teacher feedback on what is working and not working is hugely valuable to us to tell us what we have not done and what we need to do.

Critics like Dylan Wiliam, Carl Hendrick and Timothy Bates found that it was impossible to replicate Dweck’s findings, and that there were, at best, weak correlations between growth mindset and academic achievement, and between mindset interventions and academic gains. They were happy to concede that typical mindset interventions would not do any harm, but asked whether the huge amounts of money being spent on mindset would not be better invested elsewhere.

Carol Dweck seems to like the phrase ‘not yet’. She argues, in her TED Talk, that simply using the words ‘not yet’ can build students’ confidence, and her tip is often repeated by others. She also talks about mindset interventions being ‘not yet evidence-based’, which is a way of declaring her confidence that they soon will be. But, with huge financial backing, Dweck and her colleagues have recently been carrying out a lot of research and the results are now coming in. There are a small number of recent investigations that advocates of mindset interventions like to point to. For reasons of space, I’ll refer to two of them.

The first of these (Outes-Leon et al., 2020) looked at an intervention with students in the first grade of a few hundred Peruvian secondary schools. The intervention consisted of students individually reading a text designed to introduce them to the concept of growth mindset. This was followed by a group debate about the text, before each student wrote a reflective letter to a friend or relative describing what they had learned. In total, this amounted to about 90 minutes of activity. Subsequently, teachers made a subjective assessment of the ‘best’ letters and attached these to the classroom wall, along with a growth mindset poster, for the rest of the school year. Teachers were also asked to take a picture of the students alongside the letters and the poster and to share this picture by email.

Academic progress was measured 2 and 14 months after the intervention and compared to a large control group. The short-term (2-month) impact of the intervention was positive for mathematics, but less so for reading comprehension. (Why?) These gains were only visible in regional schools, not at all in metropolitan schools. Similar results were found when looking at the medium-term (14-month) impact. The reasons for this are unclear. It is hypothesized that the lower-achieving students in regional schools might benefit more from the intervention. Smaller class sizes in regional schools might also be a factor. But, of course, many other explanations are possible.

The paper is entitled ‘The Power of Believing You Can Get Smarter’. The authors make it clear that they were looking for positive evidence of the intervention, and they were supported by mindset advocates (e.g. David Yeager) from the start. It was funded by the World Bank, which is a long-standing advocate of growth mindset interventions. (Rather jumping the gun, the World Bank’s Mindset Team wrote in 2014 that teaching growth mindset ‘is not just another policy fad. It is backed by a burgeoning body of empirical research’.) The paper’s authors conclude that ‘the benefits of the intervention were relevant and long-lasting in the Peruvian context’, and they focus strongly on the low costs of the intervention. They acknowledge that the way the tool is introduced (design of the intervention) and the context in which this occurs (i.e., school and teacher characteristics) both matter to understand potential gains. But without understanding the role of the context, we haven’t really learned anything practical that we can take away from the research. Our understanding of the power of believing you can get smarter has not been meaningfully advanced.

The second of these studies (Yeager et al., 2019) took many thousands of lower-achieving American 9th graders from a representative sample of schools. It is a very well-designed and thoroughly reported piece of research. The intervention consisted of two 25-minute online sessions, 20 days apart, which sought to reduce the negative effort beliefs of students (the belief that having to try hard or ask for help means you lack ability), fixed-trait attributions (the attribution that failure stems from low ability) and performance avoidance goals (the goal of never looking stupid). An analysis of academic achievement at the end of the school year indicated clearly that the intervention led to improved performance. These results provide clear grounds for optimism about the potential of growth mindset interventions, but the report is careful to avoid overstatement. We have learnt about one particular demographic with one particular intervention, but it would be wrong to generalise beyond that. The researchers had hoped that the intervention would help to compensate for unsupportive school norms, but found that this was not the case. Instead, they found that it was when the peer norm supported the adoption of intellectual challenges that the intervention promoted sustained benefits. Context, as in the Peruvian study, was crucial. The authors write:

We emphasize that not all forms of growth mindset interventions can be expected to increase grades or advanced course-taking, even in the targeted subgroups. New growth mindset interventions that go beyond the module and population tested here will need to be subjected to rigorous development and validation processes.

I think that a reasonable conclusion from reading this research is that it may well be worth experimenting with growth mindset interventions in English language classes, but without any firm expectation of a positive impact. If nothing else, the interventions might provide useful, meaningful practice of the four skills. First, though, it would make sense to read two other pieces of research (Sisk et al., 2018; Burgoyne et al., 2020). Unlike the projects I have just discussed, these were not carried out by researchers with an a priori enthusiasm for growth-mindset interventions. And the results were rather different.

The first of these (Sisk et al., 2018) was a meta-analysis of the literature. It found that there was only a weak correlation between mindset and academic achievement, and only a weak correlation between mindset interventions and academic gains. It did, however, lend support to one of the conclusions of Yeager et al (2019), that such interventions may benefit students who are academically at risk.

The second (Burgoyne et al., 2020) found that ‘the foundations of mind-set theory are not firm’ and that ‘bold claims about mind-set appear to be overstated’: ‘other constructs such as self-efficacy and need for achievement [were] found to correlate much more strongly with presumed associates of mind-set’.

So, where does this leave us? We are clearly a long way from ‘facts’; mindset interventions are ‘not yet evidence-based’. Carl Hendrick (2019) provides a useful summary:

The truth is we simply haven’t been able to translate the research on the benefits of a growth mindset into any sort of effective, consistent practice that makes an appreciable difference in student academic attainment. In many cases, growth mindset theory has been misrepresented and miscast as simply a means of motivating the unmotivated through pithy slogans and posters. […] Recent evidence would suggest that growth mindset interventions are not the elixir of student learning that many of its proponents claim it to be. The growth mindset appears to be a viable construct in the lab, which, when administered in the classroom via targeted interventions, doesn’t seem to work at scale. It is hard to dispute that having a self-belief in their own capacity for change is a positive attribute for students. Paradoxically, however, that aspiration is not well served by direct interventions that try to instil it.

References

Bandura, Albert (1982). Self-efficacy mechanism in human agency. American Psychologist, 37 (2): pp. 122–147. doi:10.1037/0003-066X.37.2.122.

Burgoyne, A. P., Hambrick, D. Z., & Macnamara, B. N. (2020). How Firm Are the Foundations of Mind-Set Theory? The Claims Appear Stronger Than the Evidence. Psychological Science, 31(3), 258–267. https://doi.org/10.1177/0956797619897588

Dweck, C. S. (2006). Mindset: The New Psychology of Success. New York: Ballantine Books

Early, P. (Ed.) (1981). ELT Documents 113 – Humanistic Approaches: An Empirical View. London: The British Council

Hendrick, C. (2019). The growth mindset problem. Aeon,11 March 2019.

Maslow, A. (1943). A Theory of Human Motivation. Psychological Review, 50: pp. 370-396.

Outes-Leon, I., Sanchez, A. & Vakis, R. (2020). The Power of Believing You Can Get Smarter : The Impact of a Growth-Mindset Intervention on Academic Achievement in Peru (English). Policy Research working paper, no. WPS 9141 Washington, D.C. : World Bank Group. http://documents.worldbank.org/curated/en/212351580740956027/The-Power-of-Believing-You-Can-Get-Smarter-The-Impact-of-a-Growth-Mindset-Intervention-on-Academic-Achievement-in-Peru

Rogers, C. R. (1969). Freedom to Learn: A View of What Education Might Become. Columbus, Ohio: Charles Merill

Sisk, V. F., Burgoyne, A. P., Sun, J., Butler, J. L., Macnamara, B. N. (2018). To what extent and under which circumstances are growth mind-sets important to academic achievement? Two meta-analyses. Psychological Science, 29, 549–571. doi:10.1177/0956797617739704

Yeager, D.S., Hanselman, P., Walton, G.M. et al. (2019). A national experiment reveals where a growth mindset improves achievement. Nature 573, 364–369. https://doi.org/10.1038/s41586-019-1466-y

As both a language learner and a teacher, I have a number of questions about the value of watching subtitled videos for language learning. My interest is in watching extended videos, rather than short clips for classroom use, so I am concerned with incidental, rather than intentional, learning, mostly of vocabulary. My questions include:

  • Is it better to watch a video that is subtitled or unsubtitled?
  • Is it better to watch a video with L1 or L2 subtitles?
  • If a video is watched more than once, what is the best way to start and proceed? In which order (no subtitles, L1 subtitles and L2 subtitles) is it best to watch?

For help, I turned to three recent books about video and language learning: Ben Goldstein and Paul Driver’s Language Learning with Digital Video (CUP, 2015), Kieran Donaghy’s Film in Action (Delta, 2015) and Jamie Keddie’s Bringing Online Video into the Classroom (OUP, 2014). I was surprised to find no advice, but, as I explored more, I discovered that there may be a good reason for these authors’ silence.

There is now a huge literature out there on subtitles and language learning, and I cannot claim to have read it all. But I think I have read enough to understand that I am not going to find clear-cut answers to my questions.

The learning value of subtitles

It has been known for some time that the use of subtitles during extensive viewing of video in another language can help in the acquisition of that language. The main gains are in vocabulary acquisition and the development of listening skills (Montero Perez et al., 2013). This is true of both L1 subtitles with an L2 audio track, sometimes called interlingual subtitles (Incalcaterra McLoughlin et al., 2011), and L2 subtitles with an L2 audio track, sometimes called intralingual subtitles or captions (Vanderplank, 1988). Somewhat more surprisingly, vocabulary gains may also come from what are called reversed subtitles (L2 subtitles and an L1 audio track) (Burczyńska, 2015). Of course, certain conditions apply for subtitled video to be beneficial, and I’ll come on to these. But there is general research agreement (an exception is Karakaş & Sariçoban, 2012) that more learning is likely to take place from watching a subtitled video in a target language than an unsubtitled one.

Opposition to the use of subtitles as a tool for language learning has mostly come from three angles. The first of these, which concerns L1 subtitles, is an antipathy to any use at all of L1. Although such an attitude remains entrenched in some quarters, there is no evidence to support it (Hall & Cook, 2012; Kerr, 2016). Researchers and, increasingly, teachers have moved on.

The second reservation that is sometimes expressed is that learners may not attend to either the audio track or the subtitles if they do not need to. They may, for example, ignore the subtitles in the case of reversed subtitles or ignore the L2 audio track when there are L1 subtitles. This can, of course, happen, but it seems that, on the whole, this is not the case. In an eye-tracking study by Bisson et al (2012), for example, it was found that most people followed the subtitles, irrespective of what kind they were. Unsurprisingly, they followed the subtitles more closely when the audio track was in a language that was less familiar. When conditions are right (see below), reading subtitles becomes a very efficient and partly automatized cognitive activity, which does not prevent people from processing the audio track at the same time (d’Ydewalle & Pavakanun, 1997).

Related to the second reservation is the concern that the two sources of information (audio and subtitles), combined with other information (images and music or sound effects), may be in competition and lead to cognitive overload, impacting negatively on both comprehension and learning. Recent research suggests that this concern is ungrounded (Kruger et al, 2014). L1 subtitles generate less cognitive load than L2 subtitles, but overload is not normally reached and mental resources are still available for learning (Baranowska, 2020). The absence of subtitles generates more cognitive load.

Conditions for learning

Before looking at the differences between L1 and L2 subtitles, it’s a good idea to look at the conditions under which learning is more likely to take place with subtitles. Some of these are obvious, others less so.

First of all, the video material must be of sufficient intrinsic interest to the learner. Secondly, the subtitles must be of a sufficiently high quality. This is not always the case with automatically generated captions, especially if the speech-to-text software struggles with the audio accent. It is also not always the case with professionally produced L1 subtitles, especially when the ‘translations are non-literal and made at the phrase level, making it hard to find connections between the subtitle text and the words in the video’ (Kovacs, 2013, cited by Zabalbeascoa et al., 2015: 112). As a minimum, standard subtitling guidelines, such as those produced for the British Channel 4, should be followed. These limit, for example, lines to a maximum of about 40 characters, with no more than two lines per subtitle.

For reasons that I’ll come on to, learners should be able to switch easily between L1 and L2 subtitles. They are also likely to benefit if reliably accurate glosses or hyperlinks are ‘embedded in the subtitles, making it possible for a learner to simply click for additional verbal, auditory or even pictorial glosses’ (Danan, 2015: 49).

At least as important as considerations of the materials or tools, is a consideration of what the learner brings to the activity (Frumuselu, 2019: 104). Vanderplank (2015) describes these different kinds of considerations as the ‘effects of’ subtitles on a learner and the ‘effects with’ subtitles on learner behaviour.

In order to learn from subtitles, you need to be able to read fast enough to process them. Anyone with a slow reading speed in their own language (e.g. some dyslexic readers) is going to struggle. Even with L1 subtitles, Vanderplank (2015: 24) estimates that it is only around the age of 10 that children can do this with confidence. Familiarity with both the subject matter and with subtitle use will also affect the ability to read subtitles fast enough.

With L2 subtitles, the learner’s language proficiency relative to the level of difficulty (especially lexical difficulty) of the subtitles will clearly be significant. It is unlikely that L2 subtitles will be of much benefit to beginners (Taylor, 2005). This also suggests that, at lower levels, materials need to be chosen carefully. On the whole, researchers have found that higher proficiency levels correlate with greater learning gains (Pujadas & Muñoz, 2019; Suárez & Gesa, 2019), but one earlier meta-analysis (Montero Perez et al., 2013) did not find that proficiency levels were significant.

Measures of general language proficiency may be too blunt an instrument to help us all of the time. I can learn more from Portuguese than from Arabic subtitles, even though I am a beginner in both languages. The degree of proximity between two languages, especially the script (Winke et al., 2010), is also likely to be significant.

But a wide range of other individual learner differences will also impact on the learning from subtitles. It is known that learners approach subtitles in varied and idiosyncratic ways (Pujolá, 2002), with some using L2 subtitles only as a ‘back-up’ and others relying on them more. Vanderplank (2019) grouped learners into three broad categories: minimal users who were focused throughout on enjoying films as they would in their L1, evolving users who showed marked changes in their viewing behaviour over time, and maximal users who tended to be experienced at using films to enhance their language learning.

Categories like these are only the tip of the iceberg. Sensory preferences, personality types, types of motivation, the impact of subtitles on anxiety levels and metacognitive strategy awareness are all likely to be important. For the last of these, Danan (2015: 47) asks whether learners should be taught ‘techniques to make better use of subtitles and compensate for weaknesses: techniques such as a quick reading of subtitles before listening, confirmation of word recognition or meaning after listening, as well as focus on form for spelling or grammatical accuracy?’

In short, it is, in practice, virtually impossible to determine optimal conditions for learning from subtitles, because we cannot ‘take into account all the psycho-social, cultural and pedagogic parameters’ (Gambier, 2015). With that said, it’s time to take a closer look at the different potential of L1 and L2 subtitles.

L1 vs L2 subtitles

Since all other things are almost never equal, it is not possible to say that one kind of subtitles offers greater potential for learning than another. As regards gains in vocabulary acquisition and listening comprehension, there is no research consensus (Baranowska, 2020: 107). Research does, however, offer us a number of pointers.

Extensive viewing of subtitled video (both L1 and L2) can offer ‘massive quantities of authentic and comprehensible input’ (Vanderplank, 1988: 273). With lower level learners, the input is likely to be more comprehensible with L1 subtitles, and, therefore, more enjoyable and motivating. This makes them often more suitable for what Caimi (2015: 11) calls ‘leisure viewing’. Vocabulary acquisition may be better served with L2 subtitles, because they can help viewers to recognize the words that are being spoken, increase their interaction with the target language, provide further language context, and increase the redundancy of information, thereby enhancing the possibility of this input being stored in long-term memory (Frumuselu et al., 2015). These effects are much more likely with Vanderplank’s (2019) motivated, ‘maximal’ users than with ‘minimal’ users.

There is one further area where L2 subtitles may have the edge over L1. One of the values of extended listening in a target language is the improvement in phonetic retuning (see, for example, Reinisch & Holt, 2013), the ability to adjust the phonetic boundaries in your own language to the boundaries that exist in the target language. Learning how to interpret unusual speech-sounds, learning how to deal with unusual mappings between sounds and words and learning how to deal with the acoustic variations of different speakers of the target language are all important parts of acquiring another language. Research by Mitterer and McQueen (2009) suggests that L2 subtitles help in this process, but L1 subtitles hinder it.

Classroom implications?

The literature on subtitles and language learning echoes with the refrain of ‘more research needed’, but I’m not sure that further research will lead to less ambiguous, practical conclusions. One of my initial questions concerned the optimal order of use of different kinds of subtitles. In most extensive viewing contexts, learners are unlikely to watch something more than twice. If they do (watching a recorded academic lecture, for example), they are likely to be more motivated by a desire to learn from the content than to learn language from the content. L1 subtitles will probably be preferred, and will have the added bonus of facilitating note-taking in the L1. For learners who are more motivated to learn the target language (Vanderplank’s ‘maximal’ users), a sequence of subtitle use, starting with the least cognitively challenging and moving to greater challenge, probably makes sense. Danan (2015: 46) suggests starting with an L1 soundtrack and reversed (L2) subtitles, then moving on to an L2 soundtrack and L2 subtitles, and ending with an L2 soundtrack and no subtitles. I would replace her first stage with an L2 soundtrack and L1 subtitles, but this is based on hunch rather than research.

This sequencing of subtitle use is common practice in language classrooms, but, here, (1) the video clips are usually short, and (2) the aim is often not incidental learning of vocabulary. Typically, the video clip has been selected as a tool for deliberate teaching of language items, so different conditions apply. At least one study has confirmed the value of the common teaching practice of pre-teaching target vocabulary items before viewing (Pujadas & Muñoz, 2019). The drawback is that, by getting learners to focus on particular items, less incidental learning of other language features is likely to take place. Perhaps this doesn’t matter too much. In a short clip of a few minutes, the opportunities for incidental learning are limited, anyway. With short clips and a deliberate learning aim, it seems reasonable to use L2 subtitles for a first viewing, and no subtitles thereafter.

An alternative frequent use of short video clips in classrooms is to use them as a springboard for speaking. In these cases, Baranowska (2020: 113) suggests that teachers may opt for L1 subtitles first, and follow up with L2 subtitles. Of course, with personal viewing devices or in online classes, teachers may want to exploit the possibilities of differentiating the subtitle condition for different learners.

REFERENCES

Baranowska, K. (2020). Learning most with least effort: subtitles and cognitive load. ELT Journal 74 (2): pp.105 – 115

Bisson, M.-J., Van Heuven, W.J.B., Conklin, K. and Tunney, R.J. (2012). Processing of native and foreign language subtitles in films: An eye tracking study. Applied Psycholinguistics, 35 (2): pp. 399 – 418

Burczyńska, P. (2015). Reversed Subtitles as a Powerful Didactic Tool in SLA. In Gambier, Y., Caimi, A. & Mariotti, C. (Eds.), Subtitles and Language Learning. Principles, strategies and practical experiences. Bern: Peter Lang (pp. 221 – 244)

Caimi, A. (2015). Introduction. In Gambier, Y., Caimi, A. & Mariotti, C. (Eds.), Subtitles and Language Learning. Principles, strategies and practical experiences. Bern: Peter Lang (pp. 9 – 18)

Danan, M. (2015). Subtitling as a Language Learning Tool: Past Findings, Current Applications, and Future Paths. In Gambier, Y., Caimi, A. & Mariotti, C. (Eds.), Subtitles and Language Learning. Principles, strategies and practical experiences. Bern: Peter Lang (pp. 41 – 61)

d’Ydewalle, G. & Pavakanun, U. (1997). Could Enjoying a Movie Lead to Language Acquisition?. In: Winterhoff-Spurk, P., van der Voort, T.H.A. (Eds.) New Horizons in Media Psychology. VS Verlag für Sozialwissenschaften, Wiesbaden. https://doi.org/10.1007/978-3-663-10899-3_10

Frumuselu, A.D., de Maeyer, S., Donche, V. & Gutierrez Colon Plana, M. (2015). Television series inside the EFL classroom: bridging the gap between teaching and learning informal language through subtitles. Linguistics and Education, 32: pp. 107 – 17

Frumuselu, A. D. (2019). ‘A Friend in Need is a Film Indeed’: Teaching Colloquial Expressions with Subtitled Television Series. In Herrero, C. & Vanderschelden, I. (Eds.) Using Film and Media in the Language Classroom. Bristol: Multilingual Matters. pp. 92 – 107

Gambier, Y. (2015). Subtitles and Language Learning (SLL): Theoretical background. In Gambier, Y., Caimi, A. & Mariotti, C. (Eds.), Subtitles and Language Learning. Principles, strategies and practical experiences. Bern: Peter Lang (pp. 63 – 82)

Hall, G. & Cook, G. (2012). Own-language Use in Language Teaching and Learning. Language Teaching, 45 (3): pp. 271 – 308

Incalcaterra McLoughlin, L., Biscio, M. & Ní Mhainnín, M. A. (Eds.) (2011). Audiovisual Translation, Subtitles and Subtitling. Theory and Practice. Bern: Peter Lang

Karakaş, A. & Sariçoban, A. (2012). The impact of watching subtitled animated cartoons on incidental vocabulary learning of ELT students. Teaching English with Technology, 12 (4): pp. 3 – 15

Kerr, P. (2016). Questioning ‘English-only’ Classrooms: Own-language Use in ELT. In Hall, G. (Ed.) The Routledge Handbook of English Language Teaching (pp. 513 – 526)

Kruger, J. L., Hefer, E. & Matthew, G. (2014). Attention distribution and cognitive load in a subtitled academic lecture: L1 vs. L2. Journal of Eye Movement Research, 7: pp. 1 – 15

Mitterer, H. & McQueen, J. M. (2009). Foreign Subtitles Help but Native-Language Subtitles Harm Foreign Speech Perception. PLoS ONE 4 (11): e7785. doi:10.1371/journal.pone.0007785

Montero Perez, M., Van Den Noortgate, W., & Desmet, P. (2013). Captioned video for L2 listening and vocabulary learning: A meta-analysis. System, 41, pp. 720–739 doi:10.1016/j.system.2013.07.013

Pujadas, G. & Muñoz, C. (2019). Extensive viewing of captioned and subtitled TV series: a study of L2 vocabulary learning by adolescents, The Language Learning Journal, 47:4, 479-496, DOI: 10.1080/09571736.2019.1616806

Pujolá, J.- T. (2002). CALLing for help: Researching language learning strategies using help facilities in a web-based multimedia program. ReCALL, 14 (2): pp. 235 – 262

Reinisch, E. & Holt, L. L. (2013). Lexically Guided Phonetic Retuning of Foreign-Accented Speech and Its Generalization. Journal of Experimental Psychology: Human Perception and Performance. Advance online publication. doi: 10.1037/a0034409

Suárez, M. & Gesa, F. (2019) Learning vocabulary with the support of sustained exposure to captioned video: do proficiency and aptitude make a difference? The Language Learning Journal, 47:4, 497-517, DOI: 10.1080/09571736.2019.1617768

Taylor, G. (2005). Perceived processing strategies of students watching captioned video. Foreign Language Annals, 38(3), pp. 422-427

Vanderplank, R. (1988). The value of teletext subtitles in language learning. ELT Journal, 42 (4): pp. 272 – 281

Vanderplank, R. (2015). Thirty Years of Research into Captions / Same Language Subtitles and Second / Foreign Language Learning: Distinguishing between ‘Effects of’ Subtitles and ‘Effects with’ Subtitles for Future Research. In Gambier, Y., Caimi, A. & Mariotti, C. (Eds.), Subtitles and Language Learning. Principles, strategies and practical experiences. Bern: Peter Lang (pp. 19 – 40)

Vanderplank, R. (2019). ‘Gist watching can only take you so far’: attitudes, strategies and changes in behaviour in watching films with captions, The Language Learning Journal, 47:4, 407-423, DOI: 10.1080/09571736.2019.1610033

Winke, P., Gass, S. M., & Sydorenko, T. (2010). The Effects of Captioning Videos Used for Foreign Language Listening Activities. Language Learning & Technology, 1 (1): pp. 66 – 87

Zabalbeascoa, P., González-Casillas, S. & Pascual-Herce, R. (2015). In Gambier, Y., Caimi, A. & Mariotti, C. (Eds.), Subtitles and Language Learning. Principles, strategies and practical experiences Bern: Peter Lang (pp. 105–126)

Precarity

Barely liveable hourly wages, no job security because there is no permanent contract (so employment may be terminated at short or no notice), no social security, paid health care or pension, struggling to meet everyday needs, such as food and accommodation … this is the situation for at least one in five workers in the UK, and similar figures are reported in many other countries (e.g. one in six in New Zealand). As Bourdieu (1998: 81ff.) noted, job insecurity is now everywhere.

Many English language teachers, especially those working for private schools or universities operating like private schools, belong to what has been termed the global educational precariat. In addition to language school and university language teachers, there are hundreds of thousands of teachers, mostly American and British, working in English-medium ‘international schools’ around the world (Bunnell, 2016). Besides financial insecurity, many of these teachers also suffer from a lack of agency and a marginalisation of their professional identities (Poole, 2019). There’s a very useful article on ‘precarity’ in ELT Journal (Walsh, 2019) that I’d recommend.

Even teachers with reasonable pay and job security are facing attacks on their pay and working conditions. A few weeks ago in Jordan, security forces shut down the teachers’ union and arrested leading members. Teachers’ union leaders have also been imprisoned recently in Iran and Cambodia. The pages of the website of Education International, a global federation of teachers’ trade unions, catalogue the crises in education and the lives of teachers around the world.

Teacher bashing, in particular attacks on teacher unions, has been relentless. Four years ago, it was reported that teacher bashing had ‘reached unprecedented levels’ in the US (Saltman, 2017: 39), where there has been a concerted attempt, over many years, to blame teachers for shortcomings in the educational system (see, for example, Kumashiro, 2012). Although it may have been the US that led the way, closely followed by Australia and the UK, attacks on teachers have become a global phenomenon. Mary Compton and Lois Weiner’s book, ‘The Global Assault on Teaching, Teachers and their Unions’ (Compton & Weiner, 2008), gives examples from China to South Africa, from Denmark to Mexico, of how teachers’ pay and conditions have been eroded. The reason? Quite simply, teachers have stood in the way of so-called ‘reforms’ (e.g. pay cuts); they have, as they are doing now in times of COVID-19, stood in the way of what governments have wanted to do. In an earlier post, I wrote in more detail about the ways in which the World Bank has spearheaded the drive towards privatized, lower-cost education around the world.

COVID-19 has, of course, made matters worse, much worse. As often as not, the pandemic has been used as an excuse to accelerate attacks on teachers that were well under way long before.

Wellbeing

In the circumstances, it is not surprising that teacher wellbeing has recently become a more talked-about topic. Precisely because there is so little of it about.

The publication earlier this year of a book about teacher wellbeing for language teachers (Mercer & Gregersen, 2020) is very timely. The authors acknowledge that real change for wellbeing ‘[must] address structural and systemic levels of change and is not just a matter for individual teachers to cope with alone’. They acknowledge that teachers should not have to compensate for fundamental flaws in the system as a whole that undermine their wellbeing, and they express concern about ‘the risks associated with discussing teacher wellbeing at the individual level and not acknowledging that the systems in which teachers work may be at fault’ (Mercer & Gregersen, 2020: 9). But, with these caveats out of the way, the matter is closed, and the whole book is about how individuals can improve their wellbeing. Indeed, the book begins: ‘As you read the title of this chapter, you might have thought how self-seeking or egocentric it sounds: It’s all about me? Our response is, “Yes, you!” Throughout this book, we want you to focus your attention on yourself for a change, without any guilty feelings’ (Mercer & Gregersen, 2020: 1). Mindfulness techniques, tips for time management, ways of thinking positively and so on – it’s a compendium of self-help advice that may be helpful for language teachers. The real ravages of precarity, the real causes of so much lack of wellbeing – these do not get a mention.

Positive psychology

Mercer and Gregersen’s approach is directly inspired by the work of Martin Seligman, often referred to as the founder of ‘positive psychology’ (see, for example, Seligman, 2011; 2018). Positive psychology and Seligman’s ideas about wellbeing are not uncontested (see, for example, Bache & Reardon, 2016; Bache & Scott, 2018). The nub of the critiques is that positive psychology chooses to focus on happiness or wellbeing, rather than, say, justice, solidarity or loyalty. It articulates ‘an underlying individualism and narrow sense of the social’ (Cabanas & Illouz, 2019: 68), and it is, therefore, not entirely surprising that much of the funding that made the rapid growth of positive psychology possible came from an ultra-conservative and religious institution, the John Templeton Foundation (Cabanas & Illouz, 2019: 20).

Mercer and Gregersen are not unaware of such critiques (see, for example, MacIntyre et al., 2016: 375). They mention the critiques of Barbara Ehrenreich (Ehrenreich, 2009), but, to the best of my knowledge, they have never troubled to respond to them. They have a very clear agenda – the promotion of positive psychology ideas in language teaching / learning contexts – which is made explicit in MacIntyre and Mercer (2014). A slew of articles, books and conference presentations have followed since then, and ‘Teacher Wellbeing’ is one of them. Mission seems to have been achieved.

Positive psychology has not only been criticised for its focus on the individual. Others have focused on its foundational assumptions, including ‘decontextualized and ethnocentric claims; theoretical oversimplifications, tautologies and contradictions; methodological shortcomings; severe replicability problems; exaggerated generalizations; and even its therapeutic efficacy and scientific status’ (Cabanas & Illouz, 2019: 29). Probably the most important of these critics was Richard Lazarus, whose work is certainly familiar to Mercer, Gregersen and their collaborators, since Lazarus’s criticisms are listed in MacIntyre and Mercer (2014) and elsewhere. These include:

  • the over-use of cross-sectional research designs
  • a tendency to treat emotion too simplistically as either positive or negative
  • inadequate attention to both differences among individuals within a group as well as the overlap between groups when discussing statistically significant group differences
  • poor quality measurement of emotions.

However, as with the critiques of Ehrenreich, I have yet to find any examples of these authors actually addressing the criticisms. Instead, they prefer to talk about how problems such as those listed above need to be avoided in the future. For example, ‘there is no doubt that the future development of the [positive psychology] approach within SLA can learn from these and other criticisms’, write MacIntyre and Mercer (2014: 161), and they see the future of positive psychology in language learning / teaching as being fundamentally grounded in science.

Empirical science

Acknowledging, but without actually addressing, past criticisms of the scientific shortcomings of positive psychology, MacIntyre and Mercer (2014: 15) insist that positive psychology is ‘the empirical study of how people thrive and flourish […] it represents a form of “rebirth” for humanistic psychology, but with a stronger emphasis on empirical research’. The word ‘empirical’ appears four times on this page and another five times in the article. In their follow-up book, ‘Positive Psychology in SLA’ (MacIntyre et al., 2016), there is a whole section (over a third of the book) entitled ‘Empirical’. In a historical survey of positive psychology in foreign language teaching, written by close collaborators of Mercer, Gregersen and MacIntyre (Dewaele et al., 2019), the same focus on empirical science is chosen, with a description of positive psychology as being underpinned by ‘solid empirical research’. The frequency of this word choice is enough to set alarm bells ringing.

A year before the MacIntyre and Mercer article (2014), an article by Brown et al. (2013) questioned one of the key empirical foundations of positive psychology, the so-called ‘critical positivity ratio’ (Fredrickson & Losada, 2005). Wikipedia explains this as the ratio of positive to negative emotions which distinguishes ‘flourishing’ people from ‘languishing’ people; the ratio was put at 2.9013. A slightly later article (Brown et al., 2014) further debunked the work of Fredrickson, arguing that it was full of conceptual difficulties and statistical flaws. Wikipedia now describes the ‘critical positivity ratio’ as ‘a largely discredited concept’. In contrast, Mercer and Gregersen (2020: 14) acknowledge only that the exact ratio (3:1) of positivity ‘has been called into question by some’, and they reassert the value of Fredrickson’s work. They neither cite the criticisms, nor rebut them. In this, they are following a well-established tradition of positive psychology (Ryff, 2003).

Given growing scepticism about the claims of positive psychology, MacIntyre et al. (2016) elected to double down. If empirical evidence for positive psychology was in short supply, it was incumbent on them to provide it. Hence, the section in their book entitled ‘Empirical’. Personally, I would have advised against it. The whole point of positive psychology, as outlined by Seligman, is to promote ‘wellbeing’. But what, exactly, is this? For some, like Mercer and Gregersen (2020: 3), it’s about finding meaning and connection in the world. For others, it’s not a ‘thing’ whose essential nature research can uncover, but a social and cultural construction ‘which is interesting as such, not least for what it can tell us about other social and cultural phenomena’ (Ereaut & Whiting, 2008). We may agree that it’s ‘a good thing’, but it lacks solidity as a construct. Even Seligman (2011: 15) comes to the conclusion that ‘wellbeing’ is not a ‘real thing’. Rather, he says, it is a construct which has ‘several measurable elements, each a real thing, each contributing to well-being, but none defining well-being’. This, however, simply raises the question of how much of a ‘thing’ each of these elements is (Dodge et al., 2012). Seligman’s elements (Positive Emotion, Engagement, Relationships, Meaning, and Accomplishment (PERMA)) form the basis of Mercer and Gregersen’s book, but none lends itself to a clear, workable definition. In the absence of construct validity, empirical research evidence will prove hard to find.

How well does the ‘Empirical’ section of Positive Psychology in SLA (MacIntyre et al., 2016) stand up? I don’t have space here to discuss all seven chapters. However, I’ve selected the first of these, ‘Positive Psychology Exercises Build Social Capital for Language Learners: Preliminary Evidence’ (Gregersen et al., 2016), because it includes ‘evidence’ in the title and because it was written by two of the book’s editors. The research reported in this chapter involved five volunteer women, aged 20–23, in an English program at an American university, who took part in a number of positive psychology exercises (PPEs) entailing laughter, exercise, interaction with animals, listening to music, expressing gratitude and engaging in altruism. The data collected consisted of self-rating questionnaires and some self-reflection discussion. The results indicated that the PPEs led to more positive emotions, with exercise and laughter leading to the greatest gains (but since the order of the PPEs was not randomized, and since the sample size was so small, this doesn’t really tell us anything). Some of the participants doubted the value of some of the PPEs. However, the participants developed better relationships with their partners, and this may have led to gains in confidence. The authors conclude that ‘although the present data-set is small, we see preliminary evidence of all three pillars of positive psychology supporting positive outcomes’ (p. 164).

My own view is that this is wishful thinking. All this study indicates is that, in this particular context with these particular learners, feeling good about what you are doing may help things along a bit. In addition, it has absolutely nothing to do with ‘social capital’, which the authors seem to have misunderstood. Citing an article by Nawyn et al. (2012), they describe ‘social capital’ as ‘emerging friendships that provide learners with positive emotional experiences and intangible resources for language acquisition’ (Gregersen et al., 2016: 147). But this is a misreading of the Nawyn et al. article, which adheres fairly closely to Bourdieu’s notion of social capital as fundamentally about power relations, but extends it beyond purely economic power relations. Given the connections between the lack of teacher wellbeing and precarity, and given Bourdieu’s writings about precarity, the authors’ attempt to bring Bourdieu into their justification of positive psychological experiences, ‘best undertaken at the individual level’ (Gregersen et al., 2016: 149), is really quite extraordinary. And if this is empirical evidence for anything, I’m a positive psychologist!

Cui bono?

It may be that some of the exercises suggested in Teacher Wellbeing will be of benefit to some, even many, teachers. Maybe. But the claims of empirical science behind this book are questionable, to say the least. More beneficial to teacher wellbeing would almost certainly be strong teacher unions, but these are only mentioned in passing. There is, incidentally, some recent evidence from the US (Han, 2020) that highly unionized districts have higher average teacher quality and improved educational outcomes. But positive psychologists seem unwilling to explore the role that unions might play in teacher wellbeing. It is not, perhaps, coincidental that the chapter in Teacher Wellbeing that deals with teachers in their workplaces contains three recommendations for further reading, all of them written for managers. The first on the list is called Build It: The Rebel Playbook for World-class Employee Engagement (Elliott & Corey, 2018).

The problems that teachers are facing, exacerbated by COVID-19, are fundamentally systemic and political. Mercer and Gregersen may be aware that there is a risk associated with discussing teacher wellbeing at the individual level and not acknowledging that the systems in which teachers work may be at fault, but it’s a risk they have chosen to take, believing that their self-help ideas are sufficiently valuable to make the risk worthwhile. I agree with a writer on the National Education Association blog, who thinks that self-care is important but argues that it is an insufficient and entirely too passive way to address the problems teachers are encountering today.

There are other ways of conceptualising teacher wellbeing (see, for example, the entries on the Education International website with this tag), and the Mercer / Gregersen book may be viewed as an attempt to ‘claim the field’. To return to Paul Walsh, whose article about precarity I recommended earlier, it is useful to see the current interest in teacher wellbeing in context. He writes: ‘Well-being has entered ELT at a time when teachers have been demanding greater visibility and acceptance of issues such as mental health, poor working conditions, non-native speaker and gender equality. Yet to subsume these issues under a catch-all category does them a disservice. Because as soon as we put these issues under the well-being umbrella, they effectively vanish in a cloud of conceptual mist—and lose their sharp edges.’

In this sense, a book like Teacher Wellbeing, although well-meaning, may well contribute to the undermining of the very thing it seeks to promote.

References

Bache, I. & Reardon, L. (2016) The Politics and Policy of Wellbeing: Understanding the Rise and Significance of a New Agenda. Cheltenham: Edward Elgar

Bache, I. and Scott, K. (eds.) (2018). The Politics of Wellbeing: Theory, Policy and Practice. Palgrave Macmillan

Bourdieu, P. (1998). Acts of Resistance: against the new myths of our time. Cambridge: Polity Press

Brown, N. J. L., Sokal, A. D., & Friedman, H. L. (2013). The complex dynamics of wishful thinking: The critical positivity ratio. American Psychologist, 68, pp. 801–813. http://dx.doi.org/10.1037/a0032850

Brown, N. J. L., MacDonald, D. A., Samanta, M. P., Friedman, H. L. & Coyne, J. C. (2014). A critical reanalysis of the relationship between genomics and well-being. Proceedings of the National Academy of Sciences of the United States of America, 111, 12705–12709. http://dx.doi.org/10.1073/pnas.1407057111

Bunnell, T. (2016). Teachers in International schools: a global educational ‘precariat’? Globalisation, Societies and Education, 14(4), pp. 543-559

Cabanas, E. & Illouz, E. (2019). Manufacturing Happy Citizens. Cambridge: Polity Press

Compton, M. & Weiner, L. (Eds.) (2008). The Global Assault on Teaching, Teachers and their Unions. Palgrave Macmillan

Dewaele, J. M., Chen, X., Padilla, A. M. & Lake, J. (2019). The Flowering of Positive Psychology in Foreign Language Teaching and Acquisition Research. Frontiers in psychology, 10, 2128. https://doi.org/10.3389/fpsyg.2019.02128

Dodge, R., Daly, A., Huyton, J. & Sanders, L. (2012). The challenge of defining wellbeing. International Journal of Wellbeing, 2(3), pp. 222-235. doi:10.5502/ijw.v2i3.4

Ehrenreich, B. (2009). Bright-Sided: How the relentless promotion of positive thinking has undermined America. New York: Metropolitan Books

Elliott, G. & Corey, D. (2018). Build It: The Rebel Playbook for World-class Employee Engagement. Chichester: Wiley

Ereaut, G. & Whiting, R. (2008). What do we mean by ‘wellbeing’? And why might it matter? Research Report No DCSF-RW073 Department for Children, Schools and Families https://dera.ioe.ac.uk/8572/1/dcsf-rw073%20v2.pdf

Fredrickson, B. L. & Losada, M.F. (2005). Positive affect and the complex dynamics of human flourishing. American Psychology, 60 (7): pp. 678–86. doi:10.1037/0003-066X.60.7.678

Gregersen, T., MacIntyre, P.D. & Meza, M. (2016). Positive Psychology Exercises Build Social Capital for Language Learners: Preliminary Evidence. In MacIntyre, P.D., Gregersen, T. & Mercer, S. (Eds.) Positive Psychology in SLA. Bristol: Multilingual Matters. pp.147 – 167

Han, E. S. (2020). The Myth of Unions’ Overprotection of Bad Teachers: Evidence from the District–Teacher Matched Data on Teacher Turnover. Industrial Relations, 59 (2): pp. 316 – 352

Kumashiro, K. K. (2012). Bad Teacher! How Blaming Teachers Distorts the Bigger Picture. Teachers College Press

Lazarus, R. S. (2003). Target article: Does the positive psychology movement have legs? Psychological Inquiry, 14 (2): pp. 93 – 109

MacIntyre, P.D. & Mercer, S. (2014). Introducing positive psychology to SLA. Studies in Second Language Learning and Teaching, 4 (2): pp. 153 -172

MacIntyre, P.D., Gregersen, T. & Mercer, S. (2016). Conclusion. In MacIntyre, P.D., Gregersen, T. & Mercer, S. (Eds.) Positive Psychology in SLA. Bristol: Multilingual Matters. pp.374 – 379

MacIntyre, P.D., Gregersen, T. & Mercer, S. (Eds.) (2016). Positive Psychology in SLA. Bristol: Multilingual Matters.

Mercer, S. & Gregersen, T. (2020). Teacher Wellbeing. Oxford: OUP

Nawyn, S.J., Gjokai, L., Agbenyiga, D.L. & Grace, B. (2012). Linguistic isolation, social capital, and immigrant belonging. Journal of Contemporary Ethnography 41 (3), pp.255 -282

Poole, A. (2019). International Education Teachers’ Experiences as an Educational Precariat in China. Journal of Research in International Education, 18 (1): pp. 60-76

Ryff, C. D. (2003). Corners of myopia in the positive psychology parade. Psychological Inquiry, 14: pp. 153–159

Saltman, K. J. (2017). Scripted Bodies. Routledge

Seligman, M. (2011). Flourish – A new understanding of happiness and well-being – and how to achieve them. London: Nicholas Brealey Publishing.

Seligman, M. (2018). PERMA and the building blocks of well-being. The Journal of Positive Psychology, DOI: 10.1080/17439760.2018.1437466

Walsh, P. (2019). Precarity. ELT Journal, 73 (4), pp.459 – 462

Wong, P. T. P., & Roy, S. (2017). Critique of positive psychology and positive interventions. In N. J. L. Brown, T. Lomas, & F. J. Eiroa-Orosa (Eds.) The Routledge International Handbook of Critical Positive Psychology. London: Routledge.

I noted in a recent post about current trends in ELT that mindfulness has been getting a fair amount of attention recently. Here are three recent examples:

  • Pearson recently produced the Pearson Experiences: A Pocket Guide to Mindfulness, written by Amy Malloy. Amy has also written a series of blog posts for Pearson on the topic, and she is a Pearson-sponsored speaker (‘Why use mindfulness in the classroom?’) at the English Australia Ed-Tech SIG Online Symposium this week.
  • Russell Stannard has written two posts for Express Publishing (here and here)
  • Sarah Mercer and Tammy Gregersen’s new book, ‘Teacher Wellbeing’ (OUP, 2020) includes a section in which they recommend mindfulness practices to teachers as a buffer against stress and as a way to promote wellbeing.

The claims

Definitions of mindfulness often vary slightly, but the following from Amy Malloy is typical: ‘mindfulness is about the awareness that comes from consciously focussing on the present moment’. Claims for the value of mindfulness practices also vary slightly. Besides the general improvements to wellbeing suggested by Sarah and Tammy, attention, concentration and resilience are also commonly mentioned.

Amy: [Mindfulness] develops [children’s] brains, which in turn helps them find it easier to calm down and stay calm. … It changes our brains for the better. …. [Mindfulness] helps children concentrate more easily on classroom activities.

Russell: Students going through mindfulness training have increased levels of determination and willpower, they are less likely to give up if they find something difficult … Mindfulness has been shown to improve concentration. Students are able to study for longer periods of time and are more focused … Studies have shown that practicing mindfulness can lead to reduced levels of anxiety and stress.

In addition to the behavioural changes that mindfulness can supposedly bring about, both Amy and Russell refer to neurological changes:

Amy: Studies have shown that the people who regularly practise mindfulness develop the areas of the brain associated with patience, compassion, focus, concentration and emotional regulation.

Russell: At the route of our current understanding of mindfulness is brain plasticity. … in probably the most famous neuroimaging research project, scientists took a group of people and found that by doing a programme of 8 weeks of mindfulness training based around gratitude, they could actually increase the size of the areas of the brain generally associated with happiness.

Supporting evidence

In her pocket guide for Pearson, Amy provides no references to support her claims.

In Russell’s first post, he links to a piece of research which looked at the self-reported psychological impact of a happiness training programme developed by a German cabaret artist and talk show host. The programme wasn’t specifically mindfulness-oriented, so tells us nothing about mindfulness, but it is also highly suspect as a piece of research, not least because one of the co-authors is the cabaret artist himself. His second link is to an article about human attention, a long-studied area of psychology, but this has nothing to do with mindfulness, although Russell implies that there is a connection. His third link is to a very selective review of research into mindfulness, written by two mindfulness enthusiasts. It’s not so much a review of research as a selection of articles which support mindfulness advocacy.

In his second post, Russell links to a review of mindfulness-based interventions (MBIs) in education. Appearing in the ‘Mindfulness’ journal, it is obviously in broad support of MBIs, but its conclusions are hedged: ‘Research on the neurobiology of mindfulness in adults suggests that sustained mindfulness practice can …’ and ‘mindfulness training holds promise for being one such intervention for teachers.’ His second link is to a masterpiece of pseudo-science delivered by Joe Dispenza, author of many titles including ‘Becoming Supernatural: How Common People are Doing the Uncommon’ and ‘Breaking the Habit of Being Yourself’. His third link is to an interview with Matthieu Ricard, one of the Dalai Lama’s translators. Interestingly, though not in this interview, Ricard is very dismissive of secular mindfulness (‘Buddhist meditation without the Buddhism’). His fourth link is to a video presentation about mindfulness from Diana Winston of UCLA. The presenter doesn’t give citations for the research she mentions (so I can’t follow them up): instead, she plugs her own book.

Sarah and Tammy’s three references are not much better. The first is to a self-help book, called ‘Every Teacher Matters: Inspiring Well-Being through Mindfulness’ by K. Lovewell (2012), whose other work includes ‘The Little Book of Self-Compassion: Learn to be your own Best Friend’. The second (Creswell, J. D. & Lindsay, E. K. (2014). How does mindfulness training affect health? A mindfulness stress buffering account. Current Directions in Psychological Science 23 (6): pp. 401–407) is more solid, but a little dated now. The third (Garland, E., Gaylord, S. A. & Fredrickson, B. L. (2011). Positive Reappraisal Mediates the Stress-Reductive Effects of Mindfulness: An Upward Spiral Process. Mindfulness 2 (1): pp. 59–67) is an interesting piece, but of limited value since there was no control group in the research and it tells us nothing about MBIs per se.

The supporting evidence provided by these writers for the claims they make is thin, to say the least. It is almost as if the truth of the claims is self-evident, and for these writers (all of whom use mindfulness practices themselves) there is clearly a personal authentication. But, not having had an epiphany myself and being somewhat reluctant to roll a raisin around my mouth, concentrating on its texture and flavours, fully focussing on the process of eating it (as recommended by Sarah and Tammy), I will, instead, consciously focus on the present moment of research.

Mindfulness and research

The first thing to know is that there has been a lot of research into mindfulness in recent years. The second thing to know is that much of it is poor quality. Here’s why:

  • There is no universally accepted technical definition of ‘mindfulness’ nor any broad agreement about detailed aspects of the underlying concept to which it refers (Van Dam, N. T. et al. (2018). Mind the Hype: A Critical Evaluation and Prescriptive Agenda for Research on Mindfulness and Meditation. Perspectives on Psychological Science 13: pp. 36–61)
  • To date, there are at least nine different psychometric questionnaires, all of which define and measure mindfulness differently (Purser, R.E. (2019). McMindfulness. Repeater Books. p.128)
  • Mindfulness research tends to rely on self-reporting, which is notoriously unreliable.
  • The majority of studies did not utilize randomized control groups (Goyal, M., Singh, S., Sibinga, E. S., et al. (2014). Meditation Programs for Psychological Stress and Well-being: A Systematic Review and Meta-analysis. JAMA Internal Medicine 174 (3): pp. 357–368. doi:10.1001/jamainternmed.2013.13018).
  • Early meditation studies were mostly cross-sectional studies: that is, they compared data from a group of meditators with data from a control group at one point in time. A cross-sectional study design precludes causal attribution. (Tang, Y., Hölzel, B. & Posner, M. (2015). The neuroscience of mindfulness meditation. Nature Reviews Neuroscience 16, 213–225)
  • Sample sizes tend to be small and there is often no active control group. There are few randomized controlled trials (Dunning, D. L., Griffiths, K., Kuyken, W., Crane, C., Foulkes, L., Parker, J. & Dalgleish, T. (2019). Research Review: The effects of mindfulness‐based interventions on cognition and mental health in children and adolescents – a meta‐analysis of randomized controlled trials. Journal of Child Psychology and Psychiatry, 60: 244–258. doi:10.1111/jcpp.12980)
  • There is a relatively strong bias towards the publication of positive or significant results (Coronado-Montoya, S., Levis, A.W., Kwakkenbos, L., Steele, R.J., Turner, E.H. & Thombs, B.D. (2016). Reporting of Positive Results in Randomized Controlled Trials of Mindfulness-Based Mental Health Interventions. PLoS ONE 11(4): e0153220. https://doi.org/10.1371/journal.pone.0153220)
  • More recent years have not seen significant improvements in the rigour of the research (Goldberg, S. B., Tucker, R. P., Greene, P. A., Simpson, T. L., Kearney, D. J. & Davidson, R. J. (2017). Is mindfulness research methodology improving over time? A systematic review. PLoS ONE 12(10): e0187298).


The overall quality of the research into mindfulness is so poor that a group of fifteen researchers came together to write a paper entitled ‘Mind the Hype: A Critical Evaluation and Prescriptive Agenda for Research on Mindfulness and Meditation’ (Van Dam, N. T. et al. (2018). Mind the Hype: A Critical Evaluation and Prescriptive Agenda for Research on Mindfulness and Meditation. Perspectives on Psychological Science 13: pp. 36–61).

So, the research is problematic and replication is needed, but it does broadly support the claim that mindfulness meditation exerts beneficial effects on physical and mental health, and cognitive performance (Tang, Y., Hölzel, B. & Posner, M. (2015). The neuroscience of mindfulness meditation. Nature Reviews Neuroscience 16, 213–225). The word ‘broadly’ is important here. As one of the leaders of the British Mindfulness in Schools Project (which has trained thousands of teachers in the UK) puts it, ‘research on mindfulness in schools is still in its infancy, particularly in relation to impacts on behaviour, academic performance and physical health. It can best be described as ‘promising’ and ‘worth trying’ (Weare, K. (2018). Evidence for the Impact of Mindfulness on Children and Young People. The Mindfulness in Schools Project). We don’t know what kind of MBIs are most effective, what kind of ‘dosage’ should be administered, what kinds of students it is (and is not) appropriate for, whether instructor training is significant or what cost-benefits it might bring. In short, there is more that we do not know than we know.

One systematic review, for example, found that MBIs had ‘small, positive effects on cognitive and socioemotional processes but these effects were not seen for behavioral or academic outcomes’. What happened to the promises of improved concentration, calmer behaviour and willpower? The review concludes that ‘the evidence from this review urges caution in the widespread adoption of MBIs and encourages rigorous evaluation of the practice should schools choose to implement it’ (Maynard, B. R., Solis, M., Miller, V. & Brendel, K. E. (2017). Mindfulness-based interventions for improving cognition, academic achievement, behavior and socio-emotional functioning of primary and secondary students. A Campbell Systematic Review 2017:5).

What about the claims for neurological change? As a general rule, references to neuroscience by educators should be treated with skepticism. Whilst it appears that ‘mindfulness meditation might cause neuroplastic changes in the structure and function of brain regions involved in regulation of attention, emotion and self-awareness’ (Tang, Y., Hölzel, B. & Posner, M. (2015). The neuroscience of mindfulness meditation. Nature Reviews Neuroscience 16, 213–225), this doesn’t really tell us very much. A complex mental state like mindfulness ‘is likely to be supported by the large-scale brain networks’ (ibid), and insights derived from fMRI scans of particular parts of the brain provide us with, at best, only a trivial understanding of what is going on. Without a clear definition of what mindfulness actually is, it is going to be some time before we unravel the neural mechanisms underpinning it. If, in fact, we ever do. By way of comparison, you might be interested in reading about neuroscientific studies into prayer, which also appears to correlate with enhanced wellbeing.

Rather than leaving things with the research, I’d like to leave you with a few more short mindfulness raisins to chew on.

Mindfulness and money

As Russell says in his blog post, ‘research in science doesn’t come out of a vacuum’. Indeed, it tends to follow the money. It is estimated that mindfulness is now ‘a $4 billion industry’ (Purser, R.E. (2019). McMindfulness. Repeater Books. p.13): ‘More than 100,000 books for sale on Amazon have a variant of ‘mindfulness’ in their title, touting the benefits of Mindful Parenting, Mindful Eating, Mindful Teaching, Mindful Therapy, Mindful Leadership, Mindful Finance, a Mindful Nation, and Mindful Dog Owners, to name just a few. There is also The Mindfulness Coloring Book, a bestselling genre in itself. Besides books, there are workshops, online courses, glossy magazines, documentary films, smartphone apps, bells, cushions, bracelets, beauty products and other paraphernalia, as well as a lucrative and burgeoning conference circuit’.

It is precisely because so much money is at stake that so much research has been funded. More proof is desperately needed, and it is sadly unforthcoming. Meanwhile, in the immortal words of Kayleigh McEnany, ‘science should not stand in the way.’

Mindfulness and the individual

Mindfulness may be aptly described as a ‘technology of the self’. Ronald Purser, the author of ‘McMindfulness’, puts it like this: ‘Rather than discussing how attention is monetized and manipulated by corporations such as Google, Facebook, Twitter and Apple, [mindfulness advocates] locate crisis in our minds. It is not the nature of the capitalist system that is inherently problematic; rather, it is the failure of individuals to be mindful and resilient in a precarious and uncertain economy. Then they sell us solutions that make us contented mindful capitalists’.

It is this focus on the individual that makes it so appealing to right-wing foundations (e.g. the Templeton Foundation) that fund the research into mindfulness. For more on this topic, see my post about grit.

Mindfulness and religion

It is striking how often mindfulness advocates, like Amy, feel the need to insist that mindfulness is not a religious practice. Historically, of course, mindfulness comes direct from a Buddhist tradition, but in its present Western incarnation, it is a curious hybrid. Jon Kabat-Zinn, who, more than anyone else, has transformed mindfulness into a marketable commodity, is profoundly ambiguous on the topic. Buddhists, like Matthieu Ricard or David Forbes (author of ‘Mindfulness and its Discontents’, Fernwood Publishing, 2019), have little time for the cultural appropriation of the Pali term ‘Sati’, especially when mindfulness is employed by the American military to train snipers. Others, like Goldie Hawn, whose MindUP programme sells well in the US, are quite clear about their religious affiliation and their desire to bring Buddhism into schools through the back door.

I personally find it hard to see the banging of Tibetan bowls as anything other than a religious act, but I am less bothered by this than those American school districts who saw MBIs as ‘covert religious indoctrination’ and banned them. Having said that, why not promote more prayer in schools if the ‘neuroscience’ supports it?

Clare is a busy teacher

Image from Andrew Percival, inspired by The Ladybird Book of Mindfulness and similar titles.

Flipped learning undoubtedly has much potential and now, when F2F teaching is not always possible, the case for exploring what it might offer seems greater still. For a variety of reasons (not the least of which are motivational issues), it may not always be possible to flip the classroom, but, if and when it is, how and what should be flipped?

In the most well-known flipped approaches, such as the Khan Academy, students watch instructional videos in their own time, before coming to class where they can work together on practical problems, applying the knowledge they have gained from the instructional video. The flipped part of the learning does not need to be a video (Bergmann et al., 2013), but, in practice, it usually is. But whether video or something else, one of the big questions for me is what, precisely, does it make sense to flip?

In a recently published Cambridge Paper in ELT that I wrote on Flipped Learning, I noted that it is not uncommon for grammar instruction to be flipped. Al-Harbi & Alshumaimeri (2016), for example, describe a Saudi secondary school where the teacher selected a number of grammar areas from the coursebook and then identified instructional videos from YouTube that addressed these areas. Buitrago & Díaz (2018) describe a Colombian university where students were required to watch instructional videos about grammar, some of which were selected from YouTube and others created by members of staff.

To understand better what learners might be doing in their flipped time, I decided to take a look at a selection of YouTube grammar videos. I focussed on one area of grammar only (‘bored’ vs ‘boring’) and from the huge selection available, I prioritised those that were the most popular. Here’s what I found. After a brief commentary on each of the 10 videos, I wrap up with a few observations.

mmmEnglish 1245K views 8.33 minutes


Early on, Emma says ‘These endings are called suffixes and when we add them to the end of a verb, they transform our verb into an adjective, but you need to know how to use each of these types of adjectives and we’re gonna do that right now’. This gives a good taste of what follows. We learn that –ing adjectives refer to ‘the characteristics of a person, a thing, or a situation’ while –ed adjectives refer to an ‘emotion or a feeling’. Bearing in mind that this area of grammar is listed as A2+ (in Pearson’s GSE), explanations of this kind in English may be tricky for many learners. The language grading in explanations like ‘If you say that someone or something is boring, they or it makes you feel bored. Do the thing or the person that is boring is what makes you feel bored. It bores you. OK, there’s our verb’ needs a little attention! On and on goes Emma, until after almost five minutes she reads out a few sentences and students have to decide if the correct adjective has been used. Over a million people have watched this.

Learn English with Let’s Talk 452K views 8.52 minutes


Rashna explains: ‘First, let’s begin by understanding what are adjectives’. My heart sinks. ‘So ‘pretty’ is doing the job of describing or bringing about a quality of the noun ‘girl’, so ‘pretty’ becomes my adjective. So when you’re confused and don’t know how to spot the adjectives, ask the question ‘what kind’. All right. So, if I say I live in a big city, and if I ask what kind of a city, it’s big, so ‘big’ is an adjective that is describing the noun ‘city’. All right. So remember, adjectives are nothing but just words that describe a noun that tell you more about it or bring about some quality.’ Over a quarter of the way through and we haven’t yet got on to –ed and –ing. I recommend watching all the way through to the end just to admire the whiteboard work. You might enjoy the comments, too (e.g. ‘Thanks very much. This lesson was confused me so much.’) Coming up for half a million views.

Alejo Lopera Inglés 428K views 4.07 minutes


The only English here is in the example sentences, with Spanish being used for the rest. The explanation hinges on ‘pienso’ (think) for –ing and ‘sentimiento’ (feeling) for –ed, which only kind of works. Alejo takes us through a few examples using a combination of talking-head video and background slides. His delivery is engaging, and using Spanish makes things clearer than an English-only explanation would.

English Lessons with Adam 357K views 5.27 minutes


Standing in front of the whiteboard, Adam says that his video is especially useful for beginners. He rambles on for over 5 minutes in language which is far more complicated than the language he is explaining. Here’s a flavour: Now, what does it mean to be bored and what does it mean to be boring? When we talk about “bored”, we’re describing a feeling. Okay? When we talk about “interested”, we’re describing a feeling. So all of the “ed” adjectives are actually feelings, and you can only use them to talk about people and sometimes animals. Why? Because things, like chairs, or tables, or whatever, they don’t have feelings. […]”I am worried”, now people don’t realize that “worried” can have “worrying” as another adjective. “The situation is worrying” means the situation is making me feel worried. Okay? Maybe the whole global political situation, whatever. Now, hopefully none of you are confused by this lesson because I’m trying to make it not confusing. Okay? Everybody okay with that? […] Now, I just want to point out one other thing: Don’t confuse feeling adjectives with “ed” with actual feelings. Okay? If somebody is loved, does he feel loved? Maybe yes, maybe no. We’re not talking about that person’s feelings.

Crown Academy of English 270K views 26.57 minutes


Using screen capture and voiceover software, the presenter mostly reads the script aloud from the screen. There is no attempt to make either the script or the delivery interesting. The approach is as traditional as can be: it focuses first on form, with no shying away from grammatical jargon, and eventually moves on to meaning. And then, if you’re still awake, there’s a discrimination exercise. After over 25 minutes of death by PowerPoint, the lesson comes, mercifully, to an end.


Learn English with Rebecca 274K views 3.30 minutes


From the same stable as Adam’s video, this is more controlled than his ramble, and with slightly better language grading, but is still hard to follow, in part because no examples are given in written form. As with Adam, Rebecca bangs on about how important it is to get this grammar right, because ‘if you make a mistake you could be saying something very unpleasant about yourself’. It’s hard to tell what level it’s intended for.

Francisco Ochoa Inglés Fácil 64K views 11.02 minutes


Switching between Spanish and English, Pacho rattles non-stop through 6 discrimination sentences, taking the difference between feelings (which take the Spanish ‘estar’) and states (which take the Spanish ‘ser’) as his key explanatory tool. This doesn’t quite work, but following his breakneck delivery is more of a problem. The only things he doesn’t translate are the commas in his examples. I challenge you not to feel confused / confusing by the time he gets to the third sentence. Even Pacho seems to be struggling. Words like ‘hence’ and tenses like past perfect continuous don’t help his 11-minute monologue. I loved the way that he says at the end that the only way to learn this stuff is by applying the language in the way he has just done.

BBC Learning English 48K views 0.56 minutes


In under a minute, Sam from BBC Learning English achieves much greater clarity than anyone else I watched, helped by a carefully planned script, very controlled language and a split screen showing the key points as she makes them. Towards the end, she rattles through 5 more –ed / –ing pairs rather too quickly. It’s a shame, I thought, that she (or the producers) felt the need to reference the old trope about how boring grammar lessons are.

Shaw English Online 46K views 8.49 minutes


The explanation is mercifully brief and the language of Fanny, the presenter, is well controlled. We could do without the exhortations to listen carefully, etc., ‘because this is very important’, but you can’t have everything. A lot of examples are given, before the explanations are repeated. The repetitions don’t help as Fanny resorts to more complicated language than the language she is explaining (e.g. ‘But when you say the teacher was boring, you are describing the teacher, OK, the teacher made the students feel bored, because he or she was boring’). After nearly 4 minutes of presentation, there are some practice discrimination tasks, but Fanny’s relentless commentary gets seriously in the way. The lesson is rounded off with a few minutes of repeat-after-me pronunciation practice.

Mad English TV 24K views 6.59 minutes


In a surreal opening, the presenter talks about the three different states of H2O, before explaining that people, too, can have different states. Eventually, we get to the idea that ‘boring’ is an accusation, ‘bored’ is a state: ‘If you go up to your teacher and say ‘you’re boring’, that’s an insult’. The language grading is all over the place, as is the explanation itself. As a general rule, the longer the explanation, the less clear it is. At 7 minutes, this video is no exception to the rule. When we get to a mini-test (a useful feature that not all other videos have), the choice is ‘My cat is _______’. To know the answer, you need to know if you’re making an accusation about the cat. Got it?

Flipped learning and grammar

Although grammar instruction might seem a strong candidate for a flipped treatment, videoed explanations are clearly not the way to do it. Many coursebooks have perfectly adequate guided discoveries of this and other standard grammar points. Newer courses on platforms have interactive guided discoveries (and often also offer a more traditional deductive route) that will also do the trick much better than videoed explanations. Would learners not be better off doing something else altogether with their time? Initial vocabulary study, listening, reading, writing, almost anything in fact, is a more appropriate target for flipping than grammar, when approached in this way. Video is not the solution to a problem: on the evidence here, it makes the problem worse.

The popularity of grammar videos

It’s very hard to watch this stuff and not scoff, but there’s no denying the immense popularity of videos like these. Much as I find it difficult to believe, people must be learning something (or think they are learning something) from watching them. Otherwise, they presumably wouldn’t consume them to such an extent. Perhaps, these videos conform to expectations about what English lessons should be like? Perhaps viewers subscribe to a belief in ‘no pain, no gain’? Perhaps they simply don’t know where to find something that would help them more? Or perhaps they have been told to watch by their flipping teachers?

Emma has had 1.25 million views. Advertising earnings from 1 million YouTube views are generally reckoned to be between $600 and $7,000, but are likely to be at the higher end of this scale if (1) people watch the video through to the end (which is probably the case here), and (2) viewers interact with the video through likes and comments (for this video Emma has received 2,353 comments). Earnings are also higher when you have more subscribers to your channel. Emma can count on 3.25 million subscribers and Rashna of Let’s Talk has 4.77 million subscribers. By way of contrast, Russell Stannard’s Teacher Training Videos has 40,000 subscribers. There’s gold in them thar hills.

Grammar videos and the world of ELT

Free grammar videos, along with self-study apps like Duolingo, are a huge and thriving sector of ELT. They rarely, if ever, feature in research, conference presentations or the broader discourse of ELT, a world, it seems, much more oriented to products you have to pay for.

References

Al-Harbi, S.S., & Alshumaimeri, Y.A. (2016). The flipped classroom impact in grammar class on EFL Saudi secondary school students’ performances and attitudes. English Language Teaching, 9(10): 60–80. Available at: https://files.eric.ed.gov/fulltext/EJ1113506.pdf

Bergmann, J., Overmeyer, J., & Wilie, B. (2013). The flipped class: myth vs. reality. The Daily Riff, July 9, 2013. Available at: http://www.thedailyriff.com/articles/the-flipped-class-conversation-689.php

Buitrago, C. R., & Díaz, J. (2018). Flipping your writing lessons: Optimizing your time in your EFL writing classroom. In Mehring, J., & Leis, A. (Eds.), Innovations in Flipping the Language Classroom. Singapore: Springer, 69–91.