
My attention was recently drawn (thanks to Grzegorz Śpiewak) to a recent free publication from OUP. It’s called ‘Multimodality in ELT: Communication skills for today’s generation’ (Donaghy et al., 2023) and it’s what OUP likes to call a ‘position paper’: it offers ‘evidence-based recommendations to support educators and learners in their future success’. Its topic is multimodal (or multimedia) literacy, a term used to describe the importance for learners of being able ‘not just to understand but to create multimedia messages, integrating text with images, sounds and video to suit a variety of communicative purposes and reach a range of target audiences’ (Dudeney et al., 2013: 13).

Grzegorz noted the paper's 'positively charged, unhedged language to describe what is arguably a most complex problem area'. As an example, he takes the summary of the first section and circles questionable and / or unsubstantiated claims. It's just one example from a text that reads more like a 'manifesto' than a balanced piece of evidence-reporting. The verb 'need' (in the sense of 'must', as in 'teachers / learners / students need to …') appears no fewer than 57 times. The modal 'should' (as in 'teachers / learners / students should …') clocks up 27 appearances.

What is it then that we all need to do? Essentially, the argument is that English language teachers need to develop their students' multimodal literacy by incorporating more multimodal texts and tasks (videos and images) in all their lessons. The main reason for this appears to be that, in today's digital age, communication is more often multimodal than monomodal (i.e. written or spoken text alone). As an addendum, we are told that multimodal classroom practices are a 'fundamental part of inclusive teaching' in classes with 'learners with learning difficulties and disabilities'. In case you thought it was ironic that such an argument would be put forward in a flat monomodal pdf, OUP also offers the same content through a multimodal 'course' with text, video and interactive tasks.

It might all be pretty persuasive, if it weren’t so overstated. Here are a few of the complex problem areas.

What exactly is multimodal literacy?

We are told in the paper that there are five modes of communication: linguistic, visual, aural, gestural and spatial. Multimodal literacy consists, apparently, of the ability

  • to ‘view’ multimodal texts (noticing the different modes, and, for basic literacy, responding to the text on an emotional level, and, for more advanced literacy, responding to it critically)
  • to ‘represent’ ideas and information in a multimodal way (posters, storyboards, memes, etc.)

I find this frustratingly imprecise. First: ‘viewing’. Noticing modes and reacting emotionally to a multimedia artefact do not take anyone very far on the path towards multimodal literacy, even if they are necessary first steps. It is only when we move towards a critical response (understanding the relative significance of different modes and problematizing our initial emotional response) that we can really talk about literacy (see the ‘critical literacy’ of Pegrum et al., 2018). We’re basically talking about critical thinking, a concept as vague and contested as any out there. Responding to a multimedia artefact ‘critically’ can mean more or less anything and everything.

Next: ‘representing’. What is the relative importance of ‘viewing’ and ‘representing’? What kinds of representations (artefacts) are important, and which are not? Presumably, they are not all of equal importance. And, whichever artefact is chosen as the focus, a whole range of technical skills will be needed to produce the artefact in question. So, precisely what kind of representing are we talking about?

Priorities in the ELT classroom

The Oxford authors write that ‘the main focus as English language teachers should obviously be on language’. I take this to mean that the ‘linguistic mode’ of communication should be our priority. This seems reasonable, since it’s hard to imagine any kind of digital literacy without some reading skills preceding it. But, again, the question of relative importance rears its ugly head. The time available for language learning and teaching is always limited. Time that is devoted to the visual, aural, gestural or spatial modes of communication is time that is not devoted to the linguistic mode.

There are, too, presumably, some language teaching contexts (I’m thinking in particular about some adult, professional contexts) where the teaching of multimodal literacy would be completely inappropriate.

Multimodal literacy is a form of digital literacy. Writers about digital literacies like to say things like ‘digital literacies are as important to language learning as […] reading and writing skills’ or it is ‘crucial for language teaching to […] encompass the digital literacies which are increasingly central to learners’ […] lives’ (Pegrum et al., 2022). The question then arises: how important, in relative terms, are the various digital literacies? Where does multimodal literacy stand?

The Oxford authors summarise their view as follows:

There is a need for a greater presence of images, videos, and other multimodal texts in ELT coursebooks and a greater focus on using them as a starting point for analysis, evaluation, debate, and discussion.

My question to them is: greater than what? Typical contemporary courseware is already a whizzbang multimodal jamboree. There seem to me to be more pressing concerns with most courseware than supplementing it with visuals or clickables.

Evidence

The Oxford authors’ main interest is unquestionably in the use of video. They recommend extensive video viewing outside the classroom and digital story-telling activities inside. I’m fine with that, so long as classroom time isn’t wasted on getting to grips with a particular digital tool (e.g. a video editor, which, a year from now, will have been replaced by another video editor).

I’m fine with this because it involves learners doing meaningful things with language, and there is ample evidence to indicate that a good way to acquire language is to do meaningful things with it. However, I am less than convinced by the authors’ claim that such activities will strengthen ‘active and critical viewing, and effective and creative representing’. My scepticism derives firstly from my unease about the vagueness of the terms ‘viewing’ and ‘representing’, but I have bigger reservations.

There is much debate about the extent to which general critical thinking can be taught. General critical viewing has the same problems. I can apply critical viewing skills to some topics, because I have reasonable domain knowledge. In my case, it’s domain knowledge that activates my critical awareness of rhetorical devices, layout, choice of images and pull-out quotes, multimodal add-ons and so on. But without the domain knowledge, my critical viewing skills are likely to remain uncritical.

Perhaps most importantly of all, there is a lack of reliable research about ‘the extent to which language instructors should prioritize multimodality in the classroom’ (Kessler, 2022: 552). There are those, like the authors of this paper, who advocate for a ‘strong version’ of multimodality. Others go for a ‘weak version’ ‘in which non-linguistic modes should only minimally support or supplement linguistic instruction’ (Kessler, 2022: 552). And there are others who argue that multimodal activities may actually detract from or stifle L2 development (e.g. Manchón, 2017). In the circumstances, all the talk of ‘needs to’ and ‘should’ is more than a little premature.

Assessment

The authors of this Oxford paper rightly note that, if we are to adopt a multimodal approach, ‘it is important that assessment requirements take into account the multimodal nature of contemporary communication’. The trouble is that there are no widely used assessments (to my knowledge) that do this (including Oxford’s own tests). English language reading tests (like the Oxford Test of English) measure the comprehension of flat printed texts, as a proxy for reading skills. This is not the place to question the validity of such reading tests. Suffice to say that ‘little consensus exists as to what [the ability to read another language] entails, how it develops, and how progress in development can be monitored and fostered’ (Koda, 2021).

No doubt there are many people beavering away at trying to figure out how to assess multimodal literacy, but the challenges they face are not negligible. Twenty-first century digital (multimodal) literacy includes such things as knowing how to change the language of an online text to your own (and vice versa), how to bring up subtitles, how to convert written text to speech, how to generate audio scripts. All such skills may well be very valuable in this digital age, and all of them limit the need to learn another language.

Final thoughts

I can’t help but wonder why Oxford University Press should bring out a ‘position paper’ that is so at odds with their own publishing and assessing practices, and so at odds with the paper recently published in their flagship journal, ELT Journal. There must be some serious disconnect between the Marketing Department, which commissions papers such as these, and other departments within the company. Why did they allow such overstatement, when it is well known that many ELT practitioners (i.e. their customers) have the view that ‘linguistically based forms are (and should be) the only legitimate form of literacy’ (Choi & Yi, 2016)? Was it, perhaps, the second part of the title of this paper that appealed to the marketing people (‘Communication Skills for Today’s Generation’) and they just thought that ‘multimodality’ had a cool, contemporary ring to it? Or does the use of ‘multimodality’ help the marketing of courses like Headway and English File with additional multimedia bells and whistles? As I say, I can’t help but wonder.

If you want to find out more, I’d recommend the ELT Journal article, which you can access freely without giving your details to the marketing people.

Finally, it is perhaps time to question the logical connection between the fact that much reading these days is multimodal and the idea that multimodal literacy should be taught in a language classroom. Much reading that takes place online, especially with multimodal texts, could be called ‘hyper reading’, characterised as ‘sort of a brew of skimming and scanning on steroids’ (Baron, 2021: 12). Is this the kind of reading that should be promoted with language learners? Baron (2021) argues that the answer to this question depends on the level of reading skills of the learner. The lower the level, the less beneficial it is likely to be. But for ‘accomplished readers with high levels of prior knowledge about the topic’, hyper-reading may be a valuable approach. For many language learners, monomodal deep reading, which demands ‘slower, time-demanding cognitive and reflective functions’ (Baron, 2021: x – xi) may well be much more conducive to learning.

References

Baron, N. S. (2021) How We Read Now. Oxford: Oxford University Press

Choi, J. & Yi, Y. (2016) Teachers’ Integration of Multimodality into Classroom Practices for English Language Learners. TESOL Journal, 7 (2): 304 – 327

Donaghy, K. (author), Karastathi, S. (consultant), Peachey, N. (consultant), (2023). Multimodality in ELT: Communication skills for today’s generation [PDF]. Oxford University Press. https://elt.oup.com/feature/global/expert/multimodality (registration needed)

Dudeney, G., Hockly, N. & Pegrum, M. (2013) Digital Literacies. Harlow: Pearson Education

Kessler, M. (2022) Multimodality. ELT Journal, 76 (4): 551 – 554

Koda, K. (2021) Assessment of Reading. https://doi.org/10.1002/9781405198431.wbeal0051.pub2

Manchón, R. M. (2017) The Potential Impact of Multimodal Composition on Language Learning. Journal of Second Language Writing, 38: 94 – 95

Pegrum, M., Dudeney, G. & Hockly, N. (2018) Digital Literacies Revisited. The European Journal of Applied Linguistics and TEFL, 7 (2): 3 – 24

Pegrum, M., Hockly, N. & Dudeney, G. (2022) Digital Literacies 2nd Edition. New York: Routledge

You could be forgiven for wondering what, precisely, digital literacies are. In the first edition of ‘Digital Literacies’, Dudeney et al. (2013: 2) define the term as ‘the individual and social skills needed to effectively interpret, manage, share and create meaning in the growing range of digital communication channels’. This is pretty broad, and would seem to encompass more or less anything that people do with digital technology, including the advanced arts of trolling and scamming. Nine years later, in the new edition of this book (Pegrum et al., 2022: 5), the authors modify their definition a little: ‘the individual and social skills needed to effectively manage meaning in an era of digitally networked, often blended, communication’. This is broader still. In the intervening years there has been a massive proliferation of ways of describing specific digital literacies, as well as more frameworks of digital literacies than anyone (bar people writing about the topic) could possibly want. Of course, there is much in common between all these descriptive and taxonomic efforts, but there is also much that differs. What, precisely, ‘digital literacies’ means changes over both time and space. It carries different meanings in Australia, Sweden and Argentina, and, perhaps, it only makes sense to have a local conceptualisation of the term (Pangrazio et al., 2020). By the time you have figured out what these differences are, things will have moved on. Being ‘digitally literate’ is an ongoing task.

What, precisely, ‘digital literacies’ are only really matters when we are told that it is vital to teach them. It’s easy to agree that digital skills are quite handy in this networked world, but, unless we have a very clear idea of what they are, it’s not going to be easy to know which ones to teach or how to teach them. Before we get caught up in the practical pedagogical details, it might be useful to address three big questions:

  • How useful is it to talk about digital literacies?
  • Can digital literacies be taught?
  • Should digital literacies be taught as part of the English language curriculum?

How useful is it to talk about digital literacies?

Let’s take one example of a framework: the Cambridge Life Competencies Framework (CLC). The CLC lists six key competencies (creative thinking, critical thinking, communication, collaboration, learning to learn, and social responsibilities). Underpinning and informing these six competencies are three ‘foundation layers’: ‘emotional development’, ‘discipline knowledge’ and ‘digital literacy’. Digital literacy is broken down as follows:

It’s a curious amalgam of relatively straightforward skills and much more complex combinations of skills with knowledge, attitudes and dispositions. In the former category (see the first box in the chart above), we would find things like the ability to use tags, hashtags, search engines, and filters. In the latter (see the second box in the chart above), we would find things like the ability to recognise fake news or to understand how and why personally targeted advertising is such a money-spinner.

Another example, this one, from Pegrum et al (2018), is more complex and significantly more detailed. On the more technical side, we see references to the ability to navigate within multimodal gaming, VR and AR environments, or the ability to write and modify computer code. And for more complex combinations of skills, knowledge, attitudes and dispositions, we have things like the ability to develop a reputation and exert influence within online networks, or ‘the ability to exert a positive influence (online) by adopting a stance of intellectual humility and ethical pluralism’.

This is all at a far remove from only seven years ago, when ‘digital literacies’ were described as ‘the practices of reading, writing and communication made possible by digital media’ (Hafner et al., 2015) and the kinds of skills required were almost all closely connected to reading and writing. The digital world has changed, and so has our understanding of what it means to operate effectively within that world. Perhaps it is time, too, to change our terminology: ‘literacies’ is still with us, but it seems almost wilfully misleading. ‘Abilities’ or ‘competencies’ would seem to be much more appropriate terms to refer to what we are discussing in these frameworks, but ‘abilities’ probably isn’t sciency enough, and ‘competencies’ has already been done to death.

The problem with lumping all these things together under a single superordinate is that it seems to imply that there is some sort of equivalence between all the subordinate labels, that there is some categorial similarity. Pegrum et al (2022) acknowledge that there are differences of complexity between these ‘literacies’ – they use a star system to indicate degree of complexity. But I think that there is no sufficiently strong reason to put some of these things together in the first place. Dudeney et al (2013: 14) note that some of their literacies are ‘macroliteracies’ – ‘in other words, a literacy which draws on numerous other literacies – and involves linguistic, multimedia, spatial, kinaesthetic and other skills’. Why, then, call them ‘literacies’ at all? The only connection between knowing how to generate a LOLcat post and knowing how to ‘remain non-judgemental towards new perspectives, multiple viewpoints, and shifting contexts’ is that both are related to our use of digital technology. But since there is very little in our lives that is not now related in some way to digital technology, is this good enough reason to bracket these two abilities together?

Pegrum et al (2022) found that they needed to expand their list of digital literacies in the new edition of their book, and they will no doubt need to do so again nine years from now. But is the fact that something could be included in a taxonomy a good reason for actually including it? ‘Code literacy’, for example, seems rather less urgent now than it did nine years ago. I have never been convinced by gaming literacy or remix literacy. Are these really worth listing alongside the others in the table? Even if they are, nobody (including Pegrum et al.) would disagree that some prioritisation is necessary. However, when we refer to ‘digital literacies’ and how vital it is to teach them, we typically don’t specify a particular literacy and not another. We risk committing the logical error of assuming that something that holds true for a group or category, also holds true for all the members of the group or subordinates of the category.

Can digital literacies be taught?

There is clearly no particular problem in teaching and learning some digital literacies, especially the more technical ones. Unfortunately, the more specific and technical we are (e.g. when we mention a particular digital tool), the more likely it is that its shelf-life will be limited. Hardware comes and goes (I haven’t had to format a floppy disc for a while), as do apps and software. To the risk of wasting time teaching a skill that may soon be worthless, we may add the risk of not including literacies that have not yet found their way into the taxonomies. Examples include knowing how to avoid plagiarism detectors (as opposed to avoiding plagiarism) or how to use GPT-3 (and soon GPT-4) text generators. Handy for students.

The choice of digital tools is crucial when one of the key pieces of advice for teaching digital literacy is to integrate the use of digital tools into lessons (e.g. in the Cambridge Life Competencies Digital Literacy booklet). This advice skates over the key questions of which tool, and which literacy is being targeted (and why). Watching TikTok videos, using Mentimeter in class, or having a go with a VR headset may serve various educational purposes, but it would be stretching a point to argue that these activities will do much for anyone’s digital literacy. Encouraging teachers to integrate technology into their lessons (government policy in some countries) makes absolutely no sense unless the desired outcome – digital literacy – is precisely defined in advance. It rarely is. See here for further discussion.

Encouragement to include technology, any old technology, in lessons is almost never grounded in claims that a particular technical skill (e.g. navigating TikTok) has any pressing value. Rather, the justification usually comes from reference to what might be called ‘higher-order’ skills, like critical thinking: what I referred to earlier as curious amalgams of relatively straightforward skills and much more complex combinations of skills with knowledge, attitudes and dispositions.

The problem here is that it remains very uncertain whether things like ethical literacy or critical digital literacy are likely to be learnt through instruction. They can certainly be practised, and Pegrum et al (2022) have some very nice activities. The aim of these activities is typically described using a vague ‘raise awareness of’ formula, but whether they will lead, for example, to any improved ability ‘to exert a positive influence (online) by adopting a stance of intellectual humility and ethical pluralism’ is debatable. Much as the world might be a better place if classroom activities of this kind did actually work, research evidence is sadly lacking. For a more detailed look at the problems of trying to teach critical digital literacy / media information literacy, see here.

Should digital literacies be part of the English language curriculum?

So, is it ‘crucial for language teaching to […] encompass the digital literacies which are increasingly central to learners’ […] lives’ (Pegrum et al, 2022)? Well, it all depends on which digital literacies we are talking about. It also depends on what kind of learners in what kinds of learning contexts. And it depends on both institutional objectives and the personal objectives of the learners themselves. So, ‘crucial’, no, but we’ll put the choice of adjective down to rhetorical flourish.

Is it true that ‘digital literacies are as important to language learning as […] reading and writing skills […]’ (Pegrum et al., 2022: 1)? Clearly not. Since it’s hard to imagine any kind of digital literacy without some reading skills preceding it, the claim that they are comparable in importance is also best understood as rhetorical flourish.

A modicum of critical (digital) literacy is helpful when it comes to reading literature on digital literacies.

References

Dudeney, G., Hockly, N. & Pegrum, M. (2013) Digital Literacies. Harlow: Pearson Education

Hafner, C. A., Chik, A. & Jones, R. H. (2015) Digital Literacies and Language Learning. Language Learning & Technology, 19 (3): 1 – 7

Pangrazio, L., Godhe, A.-L., & Ledesma, A. G. L. (2020) What is digital literacy? A comparative review of publications across three language contexts. E-Learning and Digital Media, 17(6), 442–459. https://doi.org/10.1177/204275302094629

Pegrum, M., Hockly, N. & Dudeney, G. (2022) Digital Literacies 2nd Edition. New York: Routledge

Pegrum, M., Dudeney, G. & Hockly, N. (2018) Digital Literacies Revisited. The European Journal of Applied Linguistics and TEFL, 7 (2): 3 – 24

In the campaign for leadership of the British Conservative party, prime ministerial wannabe, Rishi Sunak, announced that he wanted to phase out all university degrees with low ‘earning potential’. This would mean the end of undergraduate courses in fashion, film, philosophy, English language and media studies. And linguistics. More of an attention-grabbing soundbite than anything else, it reflects a view of education that is shared by his competitor, Liz Truss, who ‘is passionate about giving every child basic maths and science skills’ as a way of driving the contribution of education to the economy.

It’s a view that is shared these days by practically everyone with any power and influence, from national governments to organisations like the EU and the OECD (Schuller, 2000). It is rooted in the belief that what matters most in education are the teachable knowledges, skills and competences that are relevant to economic activity (as the OECD puts it). These competences are seen to be essential to economic growth and competitivity, and essential to individuals to enhance their employment potential. Learning equals earning. The way for societies to push this orientation to education is to allow market forces to respond to the presumed demands of the consumers of education (students and their sponsors), as they seek to obtain the best possible return on their investment in education. Market forces are given more power when education is privatized and uncoupled from the state. For this to happen, the market may need a little help in the form of policies from the likes of Sunak and Truss.

This set of beliefs has a name: human capital theory (Becker, 1993). Human capital refers both to the skills that individuals ‘bring to bear in the economy and the need for capital investment in these’ (Holborow, 2012). It is impossible to overstate just how pervasive this theory is in contemporary approaches to education. See, for example, this selection of articles from Science Direct. It is also very easy to forget how recently the lens of human capital has become practically the only lens through which education is viewed.

Contemporary language teaching is perhaps best understood as a series of initiatives that have been driven by human capital theory. First and foremost, there is the global ‘frenzied rush towards acquiring English’ (Holborow, 2018), driven both by governments and by individuals who see that foreign language competence (especially English) ‘might […] open up new opportunities for students [and] assist them in breaking social barriers’ (Kormos & Kiddle, 2013). Children, at ever younger ages (even pre-school), are pushed towards getting a headstart in the race to acquire human capital, whilst there has been an explosive growth in EMI courses (Lasagabaster, 2022). At the same time, there has been mushrooming interest in so-called 21st century skills (or ‘life skills’ / ‘global skills’) in the English language curriculum. These skills have been identified by asking employers what skills matter most to them when recruiting staff. Critical and creative thinking skills may be seen as having intrinsic educational worth that predates human capital theory, but it is their potential contribution to economic productivity that explains their general current acceptance.

Investments in human capital need to be measured and measurable. Language teaching needs to be made accountable. Our preoccupation with learning outcomes is seen in the endless number of competency frameworks, and in new tools for quantifying language proficiency. Technology facilitates this evaluation, promises to deliver language teaching more efficiently, and technological skills are, after English language skills themselves, seen to be the most bankable of 21st century skills. Current interest in social-emotional learning – growth mindsets, grit, resilience and so on – is also driven by a concern to make learning more efficient.

In all of these aspects of language teaching / learning, the private sector (often in private-public partnerships) is very visible. This is by design. Supported by the state, the market economy of education grows in tandem with the rising influence of the private sector on national educational policy. When education ministers lose their job, they can easily find well-paid consultancies in the private sector (as in the case of Sunak and Truss’s colleague, Gavin Williamson).

One of the powers of market-economy ideologies is that it often seems that ‘there is no alternative’ (TINA). There are, however, good reasons to try to think in alternative terms. To begin with, and limiting ourselves for the moment to language teaching, there is a desperate lack of evidence that starting English language learning at very young ages (in the way that is most typically done) will lead to any appreciable gains in the human capital race. It is generally recognised that EMI is highly problematic in a variety of ways (Lasagabaster, 2022). The focus on 21st century skills has not led to any significant growth in learning outcomes when these skills are measured. There is a worrying lack of evidence that interventions in schools to promote improvements in critical or creative thinking have had much, if any, impact at all. Similarly, there is a worrying lack of evidence that attention to growth mindsets or grit has led to very much at all. Personalized learning, facilitated by technology, likewise has a dismal track record. At the same time, there is no evidence that the interest in measuring learning outcomes has led to any improvement in those outcomes. For all the millions and millions that have been invested in all these trends, the returns have been very slim. Perhaps we would have done better to look for solutions to those aspects of language teaching which we know to be problematic. The obsession with synthetic syllabuses delivered by coursebooks (or their online equivalents) comes to mind.

But beyond the failure of all these things to deliver on their promises, there are broader issues. Although language skills (usually English) have the potential to enhance employment prospects, Holborow (2018) has noted that they do not necessarily do so (see, for example, Yeung & Gray, 2022). Precisely how important language skills are is very hard to determine. A 2016 survey by Cambridge English found that ‘approximately half of all employers offer a better starting package to applicants with good English language skills’ and a similar number indicate that these skills result in faster career progression. But these numbers need to be treated with caution, not least because Cambridge English is in the business of selling English. More importantly, it seems highly unlikely that the figures that are reported reflect the reality of job markets around the world. The survey observes that banking, finance and law are the sectors with the greatest need for such skills, but these are all usually graduate posts. An average of 39% of the population in OECD countries has tertiary education; the percentage is much lower elsewhere. How many students of a given age cohort will actually work in these sectors? Even in rich countries, like Germany and the Netherlands, between 40 and 60% of workers are employed in what is termed ‘nonstandard forms of work’ (OECD, 2015) where language skills will count for little or nothing. These numbers are growing. Language skills are of most value to those students who are already relatively advantaged. That is not to say that there are no potential benefits to everyone in learning English, but these benefits will not be found in better jobs and wages for the majority. One interesting case study describes how a Swiss airport company exploits the language skills of migrant workers, without any benefits (salary or mobility) accruing to the workers themselves (Duchêne, 2011).

The relationship between learning English and earning more is a lot more complex than is usually presented. The same holds true for learning more generally. In the US, ‘nearly two-thirds of job openings in 2020 required no more than a high school diploma’ (Brown et al., 2022: 222). Earnings for graduates in real terms are in decline, except for those at the very top. For the rest, over $1.3 trillion in student loan debt remains unpaid. Elsewhere in the world, the picture is more mixed, but it is clear that learning does not equal earning in the global gig economy.

This evident uncoupling of learning from earning has led some to conclude that education is ‘a waste of time and money’ (Caplan, 2018), a view that has been gaining traction in the US. It’s not an entirely unreasonable view, if the only reason for education is seen to be its contribution to the economy. More commonly, the reaction has been to double-down on human capital theory. In Spain, for example, with its high levels of youth unemployment, there are calls for closer links between educational institutions, and graduates themselves are blamed for failing to take ‘advantage of the upgrading in the demand for skills’ (Bentolilla et al., 2022). This seems almost wilfully cruel, especially since the authors note that there is global trend in falling economic returns in tertiary education (ILO, 2020).

But, rather than doubling down on human capital theory (e.g. more vocational training, more efficient delivery of the training), it might be a good idea to question human capital theory itself. Both early and more recent critics have tended to accept without hesitation that education can enhance worker productivity, but argue that, as a theory, it is too simplistic to have much explanatory power, and that the supporting evidence is weak, vague or untestable (Bowles & Gintis, 1975; Fix, 2018). Language skills, like education more generally, do not always lead to better employment prospects and salaries, because ‘wider, systemic social inequalities come into play’ (Holborow, 2018). It is not because black women need to brush up on their 21st century skills that they earn less than white men.

Until recently, critics of human capital theory have been a minority, and largely unheard, voice. But this appears to be changing. The World Bank, more guilty than anyone of pushing human capital theory on the global stage (see here), has recognised that hoped-for job outcomes do not always materialize after massive investments in training systems (World Bank, 2012). Mainstream critics include the Nobel prize winners Joseph Stiglitz and Amartya Sen, and the recent OUP title, ‘The Death of Human Capital?’ (Brown et al., 2020) is likely to spur debate further. The assumption that human capital theory holds water no longer holds water.

When we turn back to English language teaching, we might benefit from some new thinking. For sure, there will be large numbers of English language learners whose only purpose in studying is utilitarian, whose primary desire is to enhance their human capital. But there are also millions, especially children studying in public schools, for whom things are rather different. A major change in thinking involves a reconceptualization of the point of all this English study. If learning English is not, for the majority, seen primarily as a preparation for the workplace, but as compensation for the realities of (un)employment (Brown et al., 2020: 13), most of the recent innovations in ELT would become highly suspect. We would have a much less impoverished view of ‘the complex and multifaceted nature of language’ (Holborow, 2018) and we would find more space for plurilingual practices. A brake on relentless Englishization might be no bad thing (Wilkinson & Gabriëls, 2021). We might be able to explore more fully artistic and creative uses of language. Who knows? We might finally get round to wider implementation of language teaching approaches that we know have a decent chance of success.

References

Becker, G. S. (1993). Human Capital: A Theoretical and Empirical Analysis, with Special Reference to Education (3rd ed.). University of Chicago Press.

Bentolila, S., Felgueroso, F., Jansen, M. et al. (2022). Lost in recessions: youth employment and earnings in Spain. SERIEs 13: 11–49. https://doi.org/10.1007/s13209-021-00244-6

Bowles, S. & Gintis, H. (1975). The Problem with Human Capital Theory – a Marxian critique. The American Economic Review, 65 (2): 74–83

Brown, P., Lauder, H. & Cheung, S. Y. (2020). The Death of Human Capital? New York: Oxford University Press

Caplan, B. (2018). The Case against Education: Why the Education System is a Waste of Time and Money. Princeton, NJ: Princeton University Press

Duchêne, A. (2011). Neoliberalism, Social Inequalities, and Multilingualism: The Exploitation of Linguistic Resources and Speakers. Langage et Société, 136 (2): 81–108

Fix, B. (2018). The Trouble with Human Capital Theory. Working Papers on Capital as Power, No. 2018/7

Holborow, M. (2012). Neoliberal keywords and the contradictions of an ideology. In Block, D., Gray, J. & Holborow, M. Neoliberalism and Applied Linguistics. Abingdon: Routledge: 33–55

Holborow, M. (2018). Language skills as human capital? Challenging the neoliberal frame. Language and Intercultural Communication, 18 (5): 520–532

ILO (2020). Global employment trends for youth, 2020. Geneva: International Labour Organization

Kormos, J., & Kiddle, T. (2013). The role of socio-economic factors in motivation to learn English as a foreign language: the case of Chile. System, 41(2): 399-412

Lasagabaster, D. (2022). English-Medium Instruction in Higher Education. Cambridge: Cambridge University Press

OECD (2015). In It Together, Why Less Inequality Benefits All. Paris: OECD

Schuller, T. (2000). Social and Human Capital: The Search for Appropriate Technomethodology. Policy Studies, 21 (1): 25–35

Wilkinson, R., & Gabriëls, R. (Eds.) (2021). The Englishization of Higher Education in Europe. Amsterdam: Amsterdam University Press.

World Bank (2012). World Development Report 2013: Jobs. Washington, DC: World Bank

Yeung, S. & Gray, J. (2022). Neoliberalism, English, and spoiled identity: The case of a high-achieving university graduate in Hong Kong. Language in Society, First View, pp. 1–22

There’s a video on YouTube from Oxford University Press in which the presenter, the author of a coursebook for primary English language learners (‘Oxford Discover’), describes an activity where students have a short time to write some sentences about a picture they have been shown. Then, working in pairs, they read aloud their partner’s sentences and award themselves points, with more points being given for sentences that others have not come up with. For lower level, young learners, it’s not a bad activity. It provides opportunities for varied skills practice of a limited kind and, if it works, may be quite fun and motivating. However, what I found interesting about the video is that it is entitled ‘How to teach critical thinking skills: speaking’ and the book that is being promoted claims to develop ‘21st Century Skills in critical thinking, communication, collaboration and creativity’. The presenter says that the activity achieves its critical thinking goals by promoting ‘both noticing and giving opinions, […] two very important critical thinking skills.’

Noticing (or observation) and giving opinions are often included in lists of critical thinking skills, but, for this to be the case, they must presumably be exercised in a critical way – some sort of reasoning must be involved. This is not the case here, so only the most uncritical understanding of critical thinking could consider this activity to have any connection to critical thinking. Whatever other benefits might accrue from it, it seems highly unlikely that the students’ ability to notice or express opinions will be developed.

My scepticism is not shared by many users of the book. Oxford University Press carried out a scientific-sounding ‘impact study’: this consisted of a questionnaire (n = 198) in which ‘97% of teachers reported that using Oxford Discover helps their students to improve in the full range of 21st century skills, with critical thinking and communication scoring the highest’.

Enthusiasm for critical thinking activities is extremely widespread. In 2018, TALIS, the OECD Teaching and Learning International Survey (with more than 4000 respondents) found that ‘over 80% of teachers feel confident in their ability to vary instructional strategies in their classroom and help students think critically’ and almost 60% ‘frequently or always’ ‘give students tasks that require students to think critically.’ Like the Oxford ‘impact study’, it’s worth remembering that these are self-reporting figures.

This enthusiasm is shared in the world of English language teaching, reflected in at least 17 presentations at the 2021 IATEFL conference that discussed practical ideas for promoting critical thinking. These ranged from the more familiar (e.g. textual analysis in EAP) to the more original – developing critical thinking through the use of reading reaction journals, multicultural literature, fables, creative arts performances, self-expression, escape rooms, and dice games.

In most cases, it would appear that the precise nature of the critical thinking that was ostensibly being developed was left fairly vague. This vagueness is not surprising. Practically the only thing that writers about critical thinking in education can agree on is that there is no general agreement about what, precisely, critical thinking is. Lai (2011) offers an accessible summary of a range of possible meanings, but points out that, in educational contexts, its meaning is often rather vague and encompasses other concepts (such as higher order thinking skills) which also lack clarity. Paul Dummett and John Hughes (2019: 4) plump for ‘a mindset that involves thinking reflectively, rationally and reasonably’ – a vague definition which leaves unanswered two key questions: to what extent is it a skill set or a disposition? Are these skills generic or domain specific?

When ‘critical thinking’ is left undefined, it is impossible to evaluate the claims that a particular classroom activity will contribute to the development of critical thinking. However, irrespective of the definition, there are good reasons to be sceptical about the ability of educational activities to have a positive impact on the generic critical thinking skills of learners in English language classes. There can only be critical-thinking value in the activity described at the beginning of this post if learners somehow transfer the skills they practise in the activity to other domains of their lives. This is, of course, possible, but, if we approach the question with a critical disposition, we have to conclude that it is unlikely. We may continue to believe the opposite, but this would be an uncritical act of faith.

The research evidence on the efficacy of teaching generic critical thinking is not terribly encouraging (Tricot & Sweller, 2014). There’s no shortage of anecdotal support for classroom critical thinking, but ‘education researchers have spent over a century searching for, and failing to find evidence of, transfer to unrelated domains by the use of generic-cognitive skills’ (Sweller, 2022). One recent meta-analysis (Huber & Kuncel, 2016) found insufficient evidence to justify the explicit teaching of generic critical thinking skills at college level. In an earlier blog post (https://adaptivelearninginelt.wordpress.com/2020/10/16/fake-news-and-critical-thinking-in-elt/) looking at the impact of critical thinking activities on our susceptibility to fake news, I noted that research was unable to find much evidence of the value of media literacy training. When considerable time is devoted to generic critical thinking training and little or no impact is found, how likely is it that the kind of occasional, brief one-off activity in the ELT classroom will have the desired impact? Without going as far as to say that critical thinking activities in the ELT classroom have no critical-thinking value, it is uncontentious to say that we still do not know how to define critical thinking, how to assess evidence of it, or how to effectively practise and execute it (Gay & Clark, 2021).

It is ironic that there is so little critical thinking about critical thinking in the world of English language teaching, but it should not be particularly surprising. Teachers are no more immune to fads than anyone else (Fuertes-Prieto et al., 2020). Despite a complete lack of robust evidence to support them, learning styles and multiple intelligences influenced language teaching for many years. Mindfulness, growth mindsets and grit are more contemporary influences and, like critical thinking, will go the way of learning styles when the commercial and institutional forces that currently promote them find the lack of empirical supporting evidence problematic.

Critical thinking is an educational aim shared by educational authorities around the world, promoted by intergovernmental bodies like the OECD, the World Bank, the EU, and the United Nations. In Japan, for example, the ‘Ministry of Education (MEXT) puts critical thinking (CT) at the forefront of its ‘global jinzai’ (human capital for a global society) directive’ (Gay & Clark, 2021). It is taught as an academic discipline in some universities in Russia (Ivlev et al., 2021) and plans are underway to introduce it into schools in Saudi Arabia (https://www.arabnews.com/node/1764601/saudi-arabia). I suspect that it doesn’t mean quite the same thing in all these places.

Critical thinking is also an educational aim that most teachers can share. Few like to think of themselves as Gradgrinds, bashing facts into their pupils’ heads: turning children into critical thinkers is what education is supposed to be all about. It holds an intuitive appeal, and even if we (20% of teachers in the TALIS survey) lack confidence in our ability to promote critical thinking in the classroom, few of us doubt the importance of trying to do so. Like learning styles, multiple intelligences and growth mindsets, it seems possible that, with critical thinking, we are pushing the wrong thing, but for the right reasons. But just how much evidence, or lack of evidence, do we need before we start getting critical about critical thinking?

References

Dummett, P. & Hughes, J. (2019) Critical Thinking in ELT. Boston: National Geographic Learning

Fuertes-Prieto, M.Á., Andrés-Sánchez, S., Corrochano-Fernández, D. et al. (2020) Pre-service Teachers’ False Beliefs in Superstitions and Pseudosciences in Relation to Science and Technology. Science & Education 29, 1235–1254 (2020). https://doi.org/10.1007/s11191-020-00140-8

Gay, S. & Clark, G. (2021) Revisiting Critical Thinking Constructs and What This Means for ELT. Critical Thinking and Language Learning, 8 (1): pp. 110–147

Huber, C. R. & Kuncel, N. R. (2016) Does College Teach Critical Thinking? A Meta-Analysis. Review of Educational Research, 86 (2): pp. 431–468. doi:10.3102/0034654315605917

Ivlev, V. Y., Pozdnyakov, M. V., Inozemtsev, V. A. & Chernyak, A. Z. (2021) Critical Thinking in the Structure of Educational Programs in Russian Universities. Advances in Social Science, Education and Humanities Research, volume 555: pp. 121–128

Lai, E.R. 2011. Critical Thinking: A Literature Review. Pearson. http://images.pearsonassessments.com/images/tmrs/CriticalThinkingReviewFINAL.pdf

Sweller, J. (2022) Some Critical Thoughts about Critical and Creative Thinking. Sydney: The Centre for Independent Studies Analysis Paper 32

Tricot, A., & Sweller, J. (2014) Domain-specific knowledge and why teaching generic skills does not work. Educational Psychology Review, 26, 265–283.

Since no single definition of critical thinking prevails (Dummett & Hughes, 2019: 2), discussions of the topic invariably begin with attempts to provide a definition. Lai (2011) offers an accessible summary of a range of possible meanings, but points out that, in educational contexts, its meaning is often rather vague and encompasses other concepts (such as higher order thinking skills) which also lack clarity. Paul Dummett and John Hughes (2019: 4) plump for ‘a mindset that involves thinking reflectively, rationally and reasonably’ – a definition which involves a vague noun (that could mean a fixed state of mind, a learned attitude, a disposition or a mood) and three highly subjective adverbs. I don’t think I could do any better. However, instead of looking for a definition, we can reach a sort of understanding by looking at examples of it. Dummett and Hughes’ book is extremely rich in practical examples, and the picture that emerges of critical thinking is complex and multifaceted.

As you might expect of a weasel word like ‘critical thinking’, there appears to be general agreement that it’s a ‘good thing’. Paul Dummett suggests that there are two common reasons for promoting the inclusion of critical thinking activities in the language classroom. The first of these is a desire to get students thinking for themselves. The second is the idea ‘that we live in an age of misinformation in which only the critically minded can avoid manipulation or slavish conformity’. Neither seems contentious at first glance, although he points out that ‘they tend to lead to a narrow application of critical thinking in ELT materials: that is to say, the analysis of texts and evaluation of the ideas expressed in them’. It’s the second of these rationales that I’d like to explore further.

Penny Ur (2020: 9) offers a more extended version of it:

The role of critical thinking in education has become more central in the 21st century, simply because there is far more information readily available to today’s students than there was in previous centuries (mainly, but not only, online), and it is vital for them to be able to deal with such input wisely. They need to be able to distinguish between what is important and what is trivial, between truth and lies, between fact and opinion, between logical argument and specious propaganda […] Without such skills and awareness of the need to exercise them, they are liable to find themselves victims of commercial or political interests, their thinking manipulated by persuasion disguised as information.

In the same edited collection Olja Milosevic (2020:18) echoes Ur’s argument:

Critical thinking becomes even more important as communication increasingly moves online. Students find an overwhelming amount of information and need to be taught how to evaluate its relevance, accuracy and quality. If teachers do not teach students how to go beyond surface meaning, students cannot be expected to practise it.

In the passages I’ve quoted, these writers are referring to one particular kind of critical thinking. The ability to critically evaluate the reliability, accuracy, etc of a text is generally considered to be a part of what is usually called ‘media information literacy’. In these times of fake news, so the argument goes, it is vital for students to develop (with their teachers’ help) the necessary skills to spot fake news when they see it. The most prototypical critical thinking activity in ELT classrooms is probably one in which students analyse some fake news, such as the website about the Pacific Tree Octopus (which is the basis of a lesson in Dudeney et al., 2013: 198 – 203).

Before considering media information literacy in more detail, it’s worth noting in passing that a rationale for critical thinking activities is no rationale at all if it only concerns one aspect of critical thinking: it applies the attributes of a part (media information literacy) to the whole (critical thinking).

There is no shortage of good (free) material available for dealing with fake news in the ELT classroom. Examples include work by James Taylor, Chia Suan Chong and Tyson Seburn. Material of this kind may result in lively, interesting, cognitively challenging, communicative and, therefore, useful lessons. But how likely is it that material of this kind will develop learners’ media information literacy and, by extension therefore, their critical thinking skills? How likely is it that teaching material of this kind will help people identify (and reject) fake news? Is it possible that material of this kind is valuable despite its rationale, rather than because of it? In the spirit of rational, reflective and reasonable thinking, these are questions that seem to be worth exploring.

ELT classes and fake news

James Taylor has suggested that the English language classroom is ‘the perfect venue for [critical thinking] skills to be developed’. Although academic English courses necessarily involve elements of critical thinking, I’m not so sure that media information literacy (and, specifically, the identification of fake news) can be adequately addressed in general English classes. There are so many areas, besides those that are specifically language-focussed, competing for space in language classes (think of all those other 21st century skills), that it is hard to see how sufficient time can be found for real development of this skill. It requires modelling, practice of the skill, feedback on the practice, and more practice (Mulnix, 2010): it needs time. Fake news activities in the language classroom would, of course, be of greater value if they were part of an integrated approach across the curriculum. Unfortunately, this is rarely the case.

Information literacy skills

Training materials for media information literacy usually involve a number of stages. These include things like fact-checking and triangulation of different sources, consideration of web address, analysis of images, other items on the site, source citation and so on. The problem, however, is that news-fakers have become so good at what they do. The tree octopus site is very crude in comparison to what can be produced nowadays by people who have learnt to profit from the online economy of misinformation. Facebook employs an army of algorithmic and human fact-checkers, but still struggles. The bottom line is that background knowledge is needed (this is as true for media information literacy as it is for critical thinking more generally) (Willingham, 2007). With news, the scope of domain knowledge is so vast that it is extremely hard to transfer one’s ability to critically evaluate one particular piece of news to another. We are all fooled from time to time.

Media information literacy interventions: research on effectiveness

With the onset of COVID-19, the ability to identify fake news has become, more than ever, a matter of life and death. There is little question that this ability correlates strongly with analytic thinking (see, for example, Stanley et al., 2020). What is much less clear is how we can go about promoting analytic thinking. Analytic thinking comes in different varieties, and another hot-off-the-press research study into susceptibility to COVID-19 fake news (Roozenbeek et al., 2020) has found that the ability to spot fake news may correlate more strongly with numerical literacy than with reasoning ability. In fact, the research team found that a lack of numerical literacy was the most consistent predictor of susceptibility to misinformation about COVID-19. Perhaps we are attempting to develop the wrong kind of analytic thinking?

In educational contexts, attempts to promote media information literacy typically seek to develop reasoning abilities, and the evidence for their effectiveness is mixed. First of all, it needs to be said that ‘little large-scale evidence exists on the effectiveness of promoting digital media literacy as a response to online misinformation’ (Guess et al., 2020). An early meta-analysis (Jeong et al., 2012) found that such interventions had a positive effect when the interventions were long (not one-off), but impacted more on students’ knowledge than they did on their behaviour. More recently, Huguet et al. (2019) were unable to draw ‘definitive conclusions from past research, such as what kinds of media literacy practices work and under what conditions’. And this year, a study by Guess et al. (2020) did not generate sufficient evidence ‘to conclude that the [media information literacy] intervention changed real-world consumption of false news’. I am unaware of any robust research in this area in the context of ELT.

It’s all rather disappointing. Why are we not better at it? After all, teachers of media studies have been exploring pathways for many years now. One possible answer is this: Media information literacy, like critical thinking more generally, is a skill that is acquirable, but it can only be acquired if there is a disposition to do so. The ability to think critically and the disposition to do so are separate entities (Facione, 2000). Training learners to be more critical in their approach to media information may be so much pissing in the wind if the disposition to be sceptical is not there. Shaping dispositions is a much harder task than training skills.

Both of the research studies into susceptibility to COVID-19 misinformation that I referred to earlier in this section underscore the significance of dispositions to analytic thinking. Roozenbeek et al. (2020) found, in line with much previous research (for example, Jost et al., 2018), that political conservatism is associated with a slightly higher susceptibility to misinformation. Political views (on either side of the political spectrum) rarely change as a result of exposure to science or reasoned thinking. They also found that ‘self-identifying as a member of a minority predicts susceptibility to misinformation about the virus in all countries surveyed’ (except, interestingly, in the UK). Again, when issues of identity are at stake, emotional responses tend to trump rational ones.

Rational, reflective and reasonable thinking about media information literacy leads to an uncomfortable red-pill rabbit-hole. This is how Bulger and Davidson (2018) put it:

The extent to which media literacy can combat the problematic news environment is an open question. Is denying the existence of climate change a media literacy problem? Is believing that a presidential candidate was running a sex-trafficking ring out of a pizza shop a media literacy problem? Can media literacy combat the intentionally opaque systems of serving news on social media platforms? Or intentional campaigns of disinformation?

Teachers and fake news

The assumption that the critical thinking skills of young people can be developed through the intervention of their teachers is rarely problematized. It should be. A recent study of Spanish pre-service teachers (Fuertes-Prieto et al., 2020) showed that their ‘level of belief in pseudoscientific issues is comparable, or even higher in some cases to those of the general population’. There is no reason to believe that this changes after they have qualified. Teachers are probably no more likely to change their beliefs when presented with empirical evidence (Menz et al., 2020) than people from any other profession. Research has tended to focus on teachers’ lack of critical thinking in areas related to their work, but things may be no different in the wider world. It is estimated that over a quarter of teachers in the US voted for the world’s greatest peddler of fake news in the 2016 presidential election.

It is also interesting to note that the sharing of fake news on social media is much more widespread among older people (including US teachers who have an average age of 42.4) than those under 30 (Bouygues, 2019).

Institutional contexts and fake news

Cory Doctorow has suggested that the fake news problem is not a problem of identifying what is true and what is fake, but a problem ‘about how we know whether something is true. We’re not disagreeing about facts, we’re disagreeing about epistemology’. In a post-modernist world of ‘Truth Decay’ (Kavanagh & Rich, 2018), where there is ‘a blurring of the line between opinion and fact’, epistemological authority is a rare commodity. Medicine, social sciences and applied linguistics are all currently experiencing a ‘replication crisis’ (Ioannidis, 2005) and we have had a British education minister saying that ‘people in this country have had enough of experts’.

News reporting has always relied to some extent on trust in the reliability of the news source. The BBC or CNN might attempt to present themselves as more objective than, say, Fox News or InfoWars, but trust in all news outlets has collapsed globally in recent years. As Michael Schudson has written in the Columbia Journalism Review, ‘all news outlets write from a set of values, not simply from a disinterested effort at truth’. If a particular news channel manifestly shares different values from your own, it is easy to reject the veracity of the news it reports. Believers in COVID conspiracy theories often hold their views precisely because of their rejection of the epistemological authority of mainstream news and the WHO or governments who support lockdown measures.

The training of media information literacy in schools is difficult because, for many people in the US (and elsewhere), education is not dissimilar to mainstream media. They ‘are seen as the enemy — two institutions who are trying to have power over how people think. Two institutions that are trying to assert authority over epistemology’ (boyd, 2018). Schools have always been characterized by imbalances in power (between students and teachers / administrators), and this power dynamic is not conducive to open-minded enquiry. Children are often more aware of the power of their teachers than they are accepting of their epistemological authority. They are enjoined to be critical thinkers, but only about certain things and only up to a certain point. One way for children to redress the power imbalance is to reject the epistemological authority of their teachers. I think this may explain why a group of young children I observed recently coming out of a lesson devoted to environmental issues found such pleasure in joking about Greta ‘Thunfisch’.

Power relationships in schools are reflected and enacted in the interaction patterns between teachers and students. The most common of these is ‘initiation-response-feedback (IRF)’ and it is unlikely that this is particularly conducive to rational, reflective and reasonable thinking. At the same time, as Richard Paul, one of the early advocates of critical thinking in schools, noted, much learning activity is characterised by lower order thinking skills, especially memorization (Paul, 1992: 22). With this kind of backdrop, training in media information literacy is more likely to be effective if it goes beyond the inclusion of a few ‘fake news’ exercises: a transformation in the way that the teaching is done will also be needed. Benesch (1999) describes this as a more ‘dialogic’ approach and there is some evidence that a more dialogic approach can have a positive impact on students’ dispositions (e.g. Hajhosseiny, 2012).

I think that David Buckingham (2019a) captures the educational problem very neatly:

There’s a danger here of assuming that we are dealing with a rational process – or at least one that can, by some pedagogical means, be made rational. But from an educational perspective, we surely have to begin with the question of why people might believe apparently ‘fake’ news in the first place. Where we decide to place our trust is as much to do with fantasy, emotion and desire, as with rational calculation. All of us are inclined to believe what we want to believe.

Fake news: a problem or a symptom of a problem?

There has always been fake news. The big problem now is ‘the speed and ease of its dissemination, and it exists primarily because today’s digital capitalism makes it extremely profitable – look at Google and Facebook – to produce and circulate false but click-worthy narratives’ (Morozov, 2017). Fake news taps into and amplifies broader tendencies and divides in society: the problem is not straightforward and is unlikely to be easy to eradicate (Buckingham, 2019a: 3).

There is increasing discussion of media regulation, and the recent banning by Facebook of Holocaust denial and QAnon is a recognition that some regulation cannot now be avoided. But strict regulations would threaten the ‘basic business model, and the enormous profitability’ of social media companies (Buckingham, 2019b) and there are real practical and ethical problems in working out exactly how regulation would happen. Governments do not know what to do.

Lacking any obvious alternative, media information literacy is often seen as the solution: can’t we ‘fact check and moderate our way out of this conundrum’ (boyd, 2018)? danah boyd’s stark response is, no, this will fail. It’s an inadequate solution to an oversimplified problem (Buckingham, 2019a).

Along with boyd and Buckingham, I’m not trying to argue that we drop media information literacy activities from educational (including ELT) programmes. Quite the opposite. But if we want our students to think reflectively, rationally and reasonably, I think we will need to start by doing the same.

References

Benesch, S. (1999). Thinking critically, thinking dialogically. TESOL Quarterly, 33: pp. 573 – 580

Bouygues, H. L. (2019). Fighting Fake News: Lessons From The Information Wars. Reboot Foundation https://reboot-foundation.org/fighting-fake-news/

boyd, d. (2018). You Think You Want Media Literacy… Do You? Data and Society: Points https://points.datasociety.net/you-think-you-want-media-literacy-do-you-7cad6af18ec2

Buckingham, D. (2019a). Teaching Media in a ‘Post-Truth’ Age: Fake News, Media Bias and the Challenge for Media Literacy Education. Cultura y Educación 31(2): pp. 1-19

Buckingham, D. (2019b). Rethinking digital literacy: Media education in the age of digital capitalism. https://ddbuckingham.files.wordpress.com/2019/12/media-education-in-digital-capitalism.pdf

Bulger, M. & Davidson, P. (2018). The Promises, Challenges and Futures of Media Literacy. Data and Society. https://datasociety.net/pubs/oh/DataAndSociety_Media_Literacy_2018.pdf

Doctorow, C. (2017). Three kinds of propaganda, and what to do about them. boingboing 25th February 2017, https://boingboing.net/2017/02/25/counternarratives-not-fact-che.html

Dudeney, G., Hockly, N. & Pegrum, M. (2013). Digital Literacies. Harlow: Pearson Education

Dummett, P. & Hughes, J. (2019). Critical Thinking in ELT. Boston: National Geographic Learning

Facione, P. A. (2000). The disposition toward critical thinking: Its character, measurement, and relation to critical thinking skill. Informal Logic, 20(1), 61–84.

Fuertes-Prieto, M. Á., Andrés-Sánchez, S., Corrochano-Fernández, D. et al. (2020). Pre-service Teachers’ False Beliefs in Superstitions and Pseudosciences in Relation to Science and Technology. Science & Education, 29: pp. 1235 – 1254. https://doi.org/10.1007/s11191-020-00140-8

Guess, A. M., Lerner, M., Lyons, B., Montgomery, J. M., Nyhan, B., Reifler, J. & Sircar, N. (2020). A digital media literacy intervention increases discernment between mainstream and false news in the United States and India. Proceedings of the National Academy of Sciences, 117 (27): pp. 15536 – 15545. DOI: 10.1073/pnas.1920498117

Hajhosseiny, M. (2012). The Effect of Dialogic Teaching on Students’ Critical Thinking Disposition. Procedia – Social and Behavioral Sciences, 69: pp. 1358 – 1368

Huguet, A., Kavanagh, J., Baker, G. & Blumenthal, M. S. (2019). Exploring Media Literacy Education as a Tool for Mitigating Truth Decay. RAND Corporation, https://www.rand.org/content/dam/rand/pubs/research_reports/RR3000/RR3050/RAND_RR3050.pdf

Ioannidis, J. P. A. (2005). Why Most Published Research Findings Are False. PLoS Medicine 2 (8): e124. https://doi.org/10.1371/journal.pmed.0020124

Jeong, S. H., Cho, H., & Hwang, Y. (2012). Media literacy interventions: A meta-analytic review. Journal of Communication, 62, pp. 454–472

Jones-Jang, S. M., Mortensen, T. & Liu, J. (2019). Does media literacy help identification of fake news? Information literacy helps, but other literacies don’t. American Behavioral Scientist, pp. 1 – 18, doi:10.1177/0002764219869406

Jost, J. T., van der Linden, S., Panagopoulos, C. & Hardin, C. D. (2018). Ideological asymmetries in conformity, desire for shared reality, and the spread of misinformation. Current Opinion in Psychology, 23: pp. 77 – 83. doi:10.1016/j.copsyc.2018.01.003

Kavanagh, J. & Rich, M. D. (2018). Truth Decay: An Initial Exploration of the Diminishing Role of Facts and Analysis in American Public Life. RAND Corporation, https://www.rand.org/pubs/research_reports/RR2314.html

Lai, E. R. (2011). Critical Thinking: A Literature Review. Pearson. http://images.pearsonassessments.com/images/tmrs/CriticalThinkingReviewFINAL.pdf

Menz, C., Spinath, B. & Seifried, E. (2020). Misconceptions die hard: prevalence and reduction of wrong beliefs in topics from educational psychology among preservice teachers. European Journal of Psychology of Education https://doi.org/10.1007/s10212-020-00474-5

Milosevic, O. (2020). Promoting critical thinking in the EFL classroom. In Mavridi, S. & Xerri, D. (Eds.) English for 21st Century Skills. Newbury, Berks.: Express Publishing. pp.17 – 22

Morozov, E. (2017). Moral panic over fake news hides the real enemy – the digital giants. The Guardian, 8 January 2017 https://www.theguardian.com/commentisfree/2017/jan/08/blaming-fake-news-not-the-answer-democracy-crisis

Mulnix, J. W. (2010). Thinking critically about critical thinking. Educational Philosophy and Theory

Paul, R. W. (1992). Critical thinking: What, why, and how? New Directions for Community Colleges, 77: pp. 3–24.

Roozenbeek, J., Schneider, C. R., Dryhurst, S., Kerr, J., Freeman, A. L. J., Recchia, G., van der Bles, A. M. & van der Linden, S. (2020). Susceptibility to misinformation about COVID-19 around the world. Royal Society Open Science, 7 (10) https://doi.org/10.1098/rsos.201199

Stanley, M., Barr, N., Peters, K. & Seli, P. (2020). Analytic-thinking predicts hoax beliefs and helping behaviors in response to the COVID-19 pandemic. PsyArxiv Preprints doi:10.31234/osf.io/m3vt

Ur, P. (2020). Critical Thinking. In Mavridi, S. & Xerri, D. (Eds.) English for 21st Century Skills. Newbury, Berks.: Express Publishing. pp.9 – 16

Willingham, D. T. (2007). Critical Thinking: Why Is It So Hard to Teach? American Educator Summer 2007: pp. 8 – 19

In the first post in this 3-part series, I focussed on data collection practices in a number of ELT websites, as a way of introducing ‘critical data literacy’. Here, I explore the term in more detail.

Although the term ‘big data’ has been around for a while (see this article and infographic), it was less than ten years ago that it began to enter everyday language, finding its way into the OED in 2013. In the same year, Viktor Mayer-Schönberger and Kenneth Cukier published their best-selling ‘Big Data: A Revolution That Will Transform How We Live, Work, and Think’ (2013), and it was hard to avoid enthusiastic references in the media to the transformative potential of big data in every sector of society.

Since then, the use of big data and analytics has become ubiquitous. Massive data collection (and data surveillance) has now become routine and companies like Palantir, which specialise in big data analytics, have become part of everyday life. Palantir’s customers include the LAPD, the CIA, the US Immigration and Customs Enforcement (ICE) and the British Government. Its recent history includes links with Cambridge Analytica, assistance in an operation to arrest the parents of illegal migrant children, and a racial discrimination lawsuit where the company was accused of having ‘routinely eliminated’ Asian job applicants (settled out of court for $1.7 million).

Unsurprisingly, the datafication of society has not gone entirely uncontested. Whilst the vast majority of people seem happy to trade their personal data for convenience and connectivity, a growing number are concerned about who benefits most from this trade-off. On an institutional level, the EU introduced the General Data Protection Regulation (GDPR), which led to Google being fined €50 million for insufficient transparency in their privacy policy and their practices of processing personal data for the purposes of behavioural advertising. In the intellectual sphere, there has been a recent spate of books that challenge the practices of ubiquitous data collection, coining new terms like ‘surveillance capitalism’, ‘digital capitalism’ and ‘data colonialism’. Here are four recent books that I have found particularly interesting.

Beer, D. (2019). The Data Gaze. London: Sage

Couldry, N. & Mejias, U. A. (2019). The Costs of Connection. Stanford: Stanford University Press

Sadowski, J. (2020). Too Smart. Cambridge, Mass.: MIT Press

Zuboff, S. (2019). The Age of Surveillance Capitalism. New York: Public Affairs

The use of big data and analytics in education is also now a thriving industry, with its supporters claiming that these technologies can lead to greater personalization, greater efficiency of instruction and greater accountability. Opponents (myself included) argue that none of these supposed gains have been empirically demonstrated, and that the costs to privacy, equity and democracy outweigh any potential gains. There is a growing critical literature and useful, recent books include:

Bradbury, A. & Roberts-Holmes, G. (2018). The Datafication of Primary and Early Years Education. Abingdon: Routledge

Jarke, J. & Breiter, A. (Eds.) (2020). The Datafication of Education. Abingdon: Routledge

Williamson, B. (2017). Big Data in Education: The digital future of learning, policy and practice. London: Sage

Concomitant with the rapid growth in the use of digital tools for language learning and teaching, and therefore the rapid growth in the amount of data that learners were (mostly unwittingly) giving away, came a growing interest in the need for learners to develop a set of digital competencies, or literacies, which would enable them to use these tools effectively. In the same year that Mayer-Schönberger and Cukier brought out their ‘Big Data’ book, the first book devoted to digital literacies in English language teaching came out (Dudeney et al., 2013). They defined digital literacies as the individual and social skills needed to effectively interpret, manage, share and create meaning in the growing range of digital communication channels (Dudeney et al., 2013: 2). The book contained a couple of activities designed to raise students’ awareness of online identity issues, along with others intended to promote critical thinking about digitally-mediated information (what the authors call ‘information literacy’), but ‘critical literacy’ was missing from the authors’ framework.

Critical thinking and critical literacy are not the same thing. Although there is no generally agreed definition of the former (with a small ‘c’), it is focussed primarily on logic and comprehension (Lee, 2011). Paul Dummett and John Hughes (2019: 4) describe it as ‘a mindset that involves thinking reflectively, rationally and reasonably’. The prototypical critical thinking activity involves the analysis of a piece of fake news (e.g. the task where students look at a website about tree octopuses in Dudeney et al., 2013: 198 – 203). Critical literacy, on the other hand, involves standing back from texts and technologies and viewing them as ‘circulating within a larger social and textual context’ (Warnick, 2002). Consideration of the larger social context necessarily entails consideration of unequal power relationships (Lee, 2011; Darvin, 2017), such as that between Google and the average user of Google. And it follows from this that critical literacy has a socio-political emancipatory function.

Critical digital literacy is now a growing field of enquiry (e.g. Pötzsch, 2019) and there is an awareness that digital competence frameworks, such as the Digital Competence Framework of the European Commission, are incomplete and out of date without the inclusion of critical digital literacy. Dudeney et al. (2013) clearly recognise the importance of including critical literacy in frameworks of digital literacies. In Pegrum et al. (2018, unfortunately paywalled), they update the framework from their 2013 book, and the biggest change is the inclusion of critical literacy. They divide this into the following:

  • critical digital literacy – closely related to information literacy
  • critical mobile literacy – focussing on issues brought to the fore by mobile devices, ranging from protecting privacy through to safeguarding mental and physical health
  • critical material literacy – concerned with the material conditions underpinning the use of digital technologies, ranging from the socioeconomic influences on technological access to the environmental impacts of technological manufacturing and disposal
  • critical philosophical literacy – concerned with the big questions posed to and about humanity as our lives become conjoined with the existence of our smart devices, robots and AI
  • critical academic literacy – referring to the pressing need to conduct meaningful studies of digital technologies in place of what is at times ‘cookie-cutter’ research

I’m not entirely convinced by the subdivisions, but labelling in this area is still in its infancy. My particular interest here, critical data literacy, seems to span a number of their sub-divisions. And the term that I am using, ‘critical data literacy’, which I’ve taken from Tygel & Kirsch (2016), is sometimes referred to as ‘critical big data literacy’ (Sander, 2020a) or ‘personal data literacy’ (Pangrazio & Selwyn, 2019). Whatever it is called, it is the development of ‘informed and critical stances toward how and why [our] data are being used’ (Pangrazio & Selwyn, 2018). One of the two practical activities in the Pegrum et al. (2018) article looks at precisely this area (the task requires students to consider the data that is collected by fitness apps). It will be interesting to see, when the new edition of the ‘Digital Literacies’ book comes out (perhaps some time next year), how many other activities take a more overtly critical stance.

In the next post, I’ll be looking at a range of practical activities for developing critical data literacy in the classroom. This involves both bridging the gaps in knowledge (about data, algorithms and online privacy) and learning, practically, how to implement ‘this knowledge for a more empowered internet usage’ (Sander, 2020b).

Without wanting to invalidate the suggestions in the next post, a word of caution is needed. Just as critical thinking activities in the ELT classroom cannot be assumed to lead to any demonstrable increase in critical thinking (although there may be other benefits to the activities), activities to promote critical literacy cannot be assumed to lead to any actual increase in critical literacy. The reaction of many people may well be ‘It’s not like it’s life or death or whatever’ (Pangrazio & Selwyn, 2018). And, perhaps, education is rarely, if ever, a solution to political and social problems, anyway. And perhaps, too, we shouldn’t worry too much about educational interventions not leading to their intended outcomes. Isn’t that almost always the case? But, with those provisos in mind, I’ll come back next time with some practical ideas.

References

Darvin R. (2017). Language, Ideology, and Critical Digital Literacy. In: Thorne S., May S. (eds) Language, Education and Technology. Encyclopedia of Language and Education (3rd ed.). Springer, Cham. pp. 17 – 30 https://doi.org/10.1007/978-3-319-02237-6_35

Dudeney, G., Hockly, N. & Pegrum, M. (2013). Digital Literacies. Harlow: Pearson Education

Dummett, P. & Hughes, J. (2019). Critical Thinking in ELT. Boston: National Geographic Learning

Lee, C. J. (2011). Myths about critical literacy: What teachers need to unlearn. Journal of Language and Literacy Education [Online], 7 (1), 95-102. Available at http://www.coa.uga.edu/jolle/2011_1/lee.pdf

Mayer-Schönberger, V. & Cukier, K. (2013). Big Data: A Revolution That Will Transform How We Live, Work, and Think. London: John Murray

Pangrazio, L. & Selwyn, N. (2018). ‘It’s not like it’s life or death or whatever’: young people’s understandings of social media data. Social Media + Society, 4 (3): pp. 1–9. https://journals.sagepub.com/doi/pdf/10.1177/2056305118787808

Pangrazio, L. & Selwyn, N. (2019). ‘Personal data literacies’: A critical literacies approach to enhancing understandings of personal digital data. New Media and Society, 21 (2): pp. 419 – 437

Pegrum, M., Dudeney, G. & Hockly, N. (2018). Digital literacies revisited. The European Journal of Applied Linguistics and TEFL, 7 (2), pp. 3-24

Pötzsch, H. (2019). Critical Digital Literacy: Technology in Education Beyond Issues of User Competence and Labour-Market Qualifications. tripleC: Communication, Capitalism & Critique, 17: pp. 221 – 240 Available at https://www.triple-c.at/index.php/tripleC/article/view/1093

Sander, I. (2020a). What is critical big data literacy and how can it be implemented? Internet Policy Review, 9 (2). DOI: 10.14763/2020.2.1479 https://www.econstor.eu/bitstream/10419/218936/1/2020-2-1479.pdf

Sander, I. (2020b). Critical big data literacy tools – Engaging citizens and promoting empowered internet usage. Data & Policy, 2: DOI: https://doi.org/10.1017/dap.2020.5

Tygel, A. & Kirsch, R. (2016). Contributions of Paulo Freire for a Critical Data Literacy: a Popular Education Approach. The Journal of Community Informatics, 12 (3). Available at http://www.ci-journal.net/index.php/ciej/article/view/1296

Warnick, B. (2002). Critical Literacy in a Digital Era. Mahwah, NJ, Lawrence Erlbaum Associates

If you cast your eye over the English language teaching landscape, you can’t help noticing a number of prominent features that weren’t there, or at least were much less visible, twenty years ago. I’d like to highlight three. First, there is the interest in life skills (aka 21st century skills). Second, there is the use of digital technology to deliver content. And third, there is a concern with measuring educational outputs through frameworks such as the Pearson GSE. In this post, I will focus primarily on the last of these, with a closer look at measuring teacher performance.

Recent years have seen the development of a number of frameworks for evaluating teacher competence in ELT. These include the British Council’s CPD Framework for Teachers and the Cambridge English Teaching Framework.

TESOL has also produced a set of guidelines for developing professional teaching standards for EFL.

Frameworks such as these were not always intended as tools to evaluate teachers. The British Council’s framework, for example, was apparently designed for teachers to understand and plan their own professional development. Similarly, the Cambridge framework says that it is for teachers to see where they are in their development – and think about where they want to go next. But much like the CEFR for language competence, frameworks can be used for purposes rather different from their designers’ intentions. I think it is likely that frameworks such as these are more often used to evaluate teachers than for teachers to evaluate themselves.

But where did the idea for such frameworks come from? Was there a suddenly perceived need for things like this to aid in self-directed professional development? Were teachers’ associations calling out for frameworks to help their members? Even if that were the case, it would still be useful to know why, and why now.

One possibility is that the interests in life skills, digital technology and the measurement of educational outputs have all come about as a result of what has been called the Global Educational Reform Movement, or GERM (Sahlberg, 2016). GERM dates back to the 1980s and the shifts (especially in the United States under Reagan and the United Kingdom under Thatcher) in education policy towards more market-led approaches which emphasize (1) greater competition between educational providers, (2) greater autonomy from the state for educational providers (and therefore a greater role for private suppliers), (3) greater choice of educational provider for students and their parents, and (4) standardized tests and measurements which allow consumers of education to make more informed choices. One of the most significant GERM vectors is the World Bank.

The interest in incorporating the so-called 21st century skills as part of the curriculum can be traced back to the early 1980s when the US National Commission on Excellence in Education recommended the inclusion of a range of skills, which eventually crystallized into the four Cs of communication, collaboration, critical thinking and creativity. The labelling of this skill set as ‘life skills’ or ‘21st century skills’ was always something of a misnomer: the reality was that these were the soft skills required by the world of work. The key argument for their inclusion in the curriculum was that they were necessary for the ‘competitiveness and wealth of corporations and countries’ (Trilling & Fadel, 2009: 7). Unsurprisingly, the World Bank, whose interest in education extends only so far as its economic value, embraced the notion of ‘life skills’ with enthusiasm. Its document ‘Life skills: what are they, why do they matter, and how are they taught?’ (World Bank, 2013) makes the case very clearly. It took a while for the world of English language teaching to get on board, but by 2012, Pearson was already sponsoring a ‘signature event’ at IATEFL Glasgow entitled ‘21st Century Skills for ELT’. Since then, the currency of ‘life skills’ as an ELT buzz phrase has not abated.

Just as the World Bank’s interest in ‘life skills’ is motivated by the perceived need to prepare students for the world of work (for participation in the ‘knowledge economy’), the Bank emphasizes the classroom use of computers and resources from the internet: ‘Information and communication technology (ICT) allows the adaptation of globally available information to local learning situations. […] A large percentage of the World Bank’s education funds are used for the purchase of educational technology. […] According to the Bank’s figures, 40 per cent of their education budget in 2000 and 27 per cent in 2001 was used to purchase technology’ (Spring, 2015: 50).

Digital technology is also central to capturing data, which will allow for the measurement of educational outputs. As befits an organisation of economists that is interested in the cost-effectiveness of investment in education, the Bank accords enormous importance to what are thought to be empirical measures of accountability. So intrinsic to the Bank’s approach is this concern with measurement that ‘the Bank’s implicit message to national governments seems to be: ‘improve your data collection capacity so that we can run more reliable cross-country analysis and regressions’’ (Verger & Bonal, 2012: 131).

Measuring the performance of teachers is, of course, a part of assessing educational outputs. The World Bank, which sees global education as fundamentally ‘broken’, has, quite recently, turned more of its attention to the role of teachers. A World Bank blog from 2019 explains the reasons:

A growing body of evidence suggests the learning crisis is, at its core, a teaching crisis. For students to learn, they need good teachers—but many education systems pay little attention to what teachers know, what they do in the classroom, and in some cases whether they even show up. Rapid technological change is raising the stakes. Technology is already playing a crucial role in providing support to teachers, students, and the learning process more broadly. It can help teachers better manage the classroom and offer different challenges to different students. And technology can allow principals, parents, and students to interact seamlessly.

A key plank in the World Bank’s attempts to implement its educational vision is its System Assessment and Benchmarking for Education Results (SABER), which I will return to in due course. As part of its SABER efforts, last year the World Bank launched its ‘Teach’ tool. This tool is basically an evaluation framework. Videos of lessons are recorded and coded for indicators of teacher efficiency by coders who can be ‘90% reliable’ after only four days of training. The coding system focuses on the time that students spend on-task, but also ‘life skills’ like collaboration and critical thinking (see below).

Teach framework

Like the ELT frameworks, it can be used as a professional development tool, but, like them, it may also be used for summative evaluation.

The connections between those landmarks on the ELT landscape and the concerns of the World Bank are not, I would suggest, coincidental. The World Bank is, of course, not the only player in GERM, but it is a very special case. It is the largest single source of external financing in ‘developing countries’ (Beech, 2009: 345), managing a portfolio of $8.9 billion, with operations in 70 countries as of August 2013 (Spring, 2015: 32). Its loans come with conditions attached which tie the borrowing countries to GERM objectives. Arguably of even greater importance than its influence through funding is the Bank’s direct entry into the world of ideas:

The Bank yearns for a deeper and more comprehensive impact through avenues of influence transcending both project and program loans. Not least in education, the World Bank is investing much in its quest to shape global opinion about economic, developmental, and social policy. Rather than imposing views through specific loan negotiations, Bank style is broadening in attempts to lead borrower country officials to its preferred way of thinking. (Jones, 2007: 259).

The World Bank sees itself as a Knowledge Bank and acts accordingly. Rizvi and Lingard (2010: 48) observe that ‘in many nations of the Global South, the only extant education policy analysis is research commissioned by donor agencies such as the World Bank […] with all the implications that result in relation to problem setting, theoretical frameworks and methodologies’. Hundreds of academics are engaged to do research related to the Bank’s areas of educational interest, and ‘the close links with the academic world give a strong credibility to the ideas disseminated by the Bank […] In fact, many ideas that acquired currency and legitimacy were originally proposed by them. This is the case of testing students and using the results to evaluate progress in education’ (Castro, 2009: 472).

Through a combination of substantial financial clout and relentless marketing (Selwyn, 2013: 50), the Bank has succeeded in shaping global academic discourse. In partnership with similar institutions, it has introduced a way of classifying and thinking about education (Beech, 2009: 352). It has become, in short, a major site ‘for the organization of knowledge about education’ (Rizvi & Lingard, 2010: 79), wielding ‘a degree of power that has arguably enabled it to shape the educational agendas of nations throughout the Global South’ and beyond (Menashy, 2012).

So, is there any problem in the world of ELT taking up the inclusion of ‘life skills’? I think there is. The first problem is one of definition. Creativity and critical thinking are very poorly defined, meaning very different things to different people, so it is not always clear what is being taught. Following on from this, there is substantial debate about whether such skills can actually be taught at all, and, if they can, how they should be taught. It seems highly unlikely that the tokenistic way in which they are ‘taught’ in most published ELT courses can have any positive impact. But this is not my main reservation, which is that, by and large, we have come to uncritically accept the idea that English language learning is mostly concerned with preparation for the workplace (see my earlier post ‘The EdTech Imaginary in ELT’).

Is there any problem with the promotion of digital technologies in ELT? Again, I think there is, and a good proportion of the posts on this blog have argued for the need for circumspection in rolling out more technology in language learning and teaching. My main reason is that while it is clear that this trend is beneficial to technology vendors, it is much less clear that advantages will necessarily accrue to learners. Beyond this, there must be serious concerns about data ownership, privacy, and the way in which the datafication of education, led by businesses and governments in the Global North, is changing what counts as good education, a good student or an effective teacher, especially in the Global South. ‘Data and metrics,’ observe Williamson et al. (2020: 353), ‘do not just reflect what they are designed to measure, but actively loop back into action that can change the very thing that was measured in the first place’.

And what about tools for evaluating teacher competences? Here I would like to provide a little more background. There is, first of all, a huge question mark about how accurately such tools measure what they are supposed to measure. This may not matter too much if the tool is only used for self-evaluation or self-development, but ‘once smart systems of data collection and social control are available, they are likely to be widely applied for other purposes’ (Sadowski, 2020: 138). Jaime Saavedra, head of education at the World Bank, insists that the World Bank’s ‘Teach’ tool is not for evaluation and is not useful for firing teachers who perform badly.

Saavedra needs teachers to buy into the tool, so he obviously doesn’t want to scare them off. However, ‘Teach’ clearly is an evaluation tool (if not, what is it?) and, as with other tools (I’m thinking of the CEFR and teacher competency frameworks in ELT), its purposes will evolve. Eric Hanushek, an education economist at Stanford University, has commented that ‘this is a clear evaluation tool at the probationary stage … It provides a basis for counseling new teachers on how they should behave … but then again if they don’t change over the first few years you also have information you should use’.

At this point, it is useful to take a look at the World Bank’s attitudes towards teachers. Teachers are seen to be at the heart of the ‘learning crisis’. However, the greatest focus in World Bank documents is on (1) teacher absenteeism in some countries, (2) unskilled and demotivated teachers, and (3) the reluctance of teachers and their unions to back World Bank-sponsored reforms. As real as these problems are, it is important to understand that the Bank has been complicit in them:

For decades, the Bank has criticised pre-service and in-service teacher training as not cost-effective. For decades, the Bank has been pushing the hiring of untrained contract teachers as a cheap fix and a way to get around teacher unions – and contract teachers are again praised in the World Bank Development Report (WDR). This contradicts the occasional places in the WDR in which the Bank argues that developing countries need to follow the lead of the few countries that attract the best students to teaching, improve training, and improve working conditions. There is no explicit evidence offered at all for the repeated claim that teachers are unmotivated and need to be controlled and monitored to do their job. The Bank has a long history of blaming teachers and teacher unions for educational failures. The Bank implicitly argues that the problem of teacher absenteeism, referred to throughout the report, means teachers are unmotivated, but that simply is not true. Teacher absenteeism is not a sign of low motivation. Teacher salaries are abysmally low, as is the status of teaching. Because of this, teaching in many countries has become an occupation of last resort, yet it still attracts dedicated teachers. Once again, the Bank has been very complicit in this state of affairs as it, and the IMF, for decades have enforced neoliberal, Washington Consensus policies which resulted in government cutbacks and declining real salaries for teachers around the world. It is incredible that economists at the Bank do not recognise that the deterioration of salaries is the major cause of teacher absenteeism and that all the Bank is willing to peddle are ineffective and insulting pay-for-performance schemes. (Klees, 2017).

The SABER framework (referred to above) focuses very clearly on policies for hiring, rewarding and firing teachers.

[The World Bank] places the private sector’s methods of dealing with teachers as better than those of the public sector, because it is more ‘flexible’. In other words, it is possible to say that teachers can be hired and fired more easily; that is, hired without the need of organizing a public competition and fired if they do not achieve the expected outcomes as, for example, students’ improvements in international test scores. Further, the SABER document states that ‘Flexibility in teacher contracting is one of the primary motivations for engaging the private sector’ (World Bank, 2011: 4). This affirmation seeks to reduce expenditures on teachers while fostering other expenses such as the creation of testing schemes and spending more on ICTs, as well as making room to expand the hiring of private sector providers to design curriculum, evaluate students, train teachers, produce education software, and books. (De Siqueira, 2012).

The World Bank has argued consistently for a reduction of education costs by driving down teachers’ salaries. One of the authors of the World Bank Development Report 2018 notes that ‘in most countries, teacher salaries consume the lion’s share of the education budget, so there are already fewer resources to implement other education programs’. Another World Bank report (2007) makes the importance of ‘flexible’ hiring and lower salaries very clear:

In particular, recent progress in primary education in Francophone countries resulted from reduced teacher costs, especially through the recruitment of contractual teachers, generally at about 50% the salary of civil service teachers. (cited in Compton & Weiner, 2008: 7).

Merit pay (or ‘pay for performance’) is another of the Bank’s preferred wheezes. Despite the enormous problems in evaluating teachers’ work fairly, and a distinct lack of convincing evidence that merit pay leads to anything positive (it may actually be counter-productive) (De Bruyckere et al., 2020: 143 – 147), the Bank is fully committed to the idea. Perhaps this is connected to the usefulness of merit pay in keeping teachers on their toes, compliant and fearful of losing their jobs, rather than to any desire to improve teacher effectiveness?

There is evidence that this may be the case. Yet another World Bank report (Bau & Das, 2017) argues, on the basis of research, that improved TVA (teacher value added) does not correlate with wages in the public sector (where it is hard to fire teachers), but it does in the private sector. The study found that ‘a policy change that shifted public hiring from permanent to temporary contracts, reducing wages by 35 percent, had no adverse impact on TVA’. All of which would seem to suggest that improving the quality of teaching is of less importance to the Bank than flexible hiring and firing. This is very much in line with a more general advocacy of making education fit for the world of work. Lois Weiner of New Jersey City University puts it like this:

The architects of [GERM] policies—imposed first in developing countries—openly state that the changes will make education better fit the new global economy by producing workers who are (minimally) educated for jobs that require no more than a 7th or 8th grade education; while a small fraction of the population receive a high quality education to become the elite who oversee finance, industry, and technology. Since most workers do not need to be highly educated, it follows that teachers with considerable formal education and experience are neither needed nor desired because they demand higher wages, which is considered a waste of government money. Most teachers need only be “good enough”—as one U.S. government official phrased it—to follow scripted materials that prepare students for standardized tests. (Weiner, 2012).

It seems impossible to separate the World Bank’s ‘Teach’ tool from the broader goals of GERM. Teacher evaluation tools, like the teaching of 21st century skills and the datafication of education, need to be understood properly, I think, as means to an end. It’s time to spell out what that end is.

The World Bank’s mission is ‘to end extreme poverty (by reducing the share of the global population that lives in extreme poverty to 3 percent by 2030)’ and ‘to promote shared prosperity (by increasing the incomes of the poorest 40 percent of people in every country)’. Its education activities are part of this broad aim and are driven by subscription to human capital theory (a view of the skills, knowledge and experience of individuals in terms of their ability to produce economic value). This may be described as the ‘economization of education’: a shift in educational concerns away from ‘such things as civic participation, protecting human rights, and environmentalism to economic growth and employment’ (Spring, 2015: xiii). Both students and teachers are seen as human capital. For students, human capital education places an emphasis on the cognitive skills needed to succeed in the workplace and the ‘soft skills’ needed to function in the corporate world (Spring, 2015: 2). Accordingly, World Bank investments require ‘justifications on the basis of manpower demands’ (Heyneman, 2003: 317). One of the Bank’s current strategic priorities is the education of girls: although human rights and equity may also play a part, the Bank’s primary concern is that ‘Not Educating Girls Costs Countries Trillions of Dollars’.

According to the Bank’s logic, its educational aims can best be achieved through a combination of support for the following:

  • cost accounting and quantification (since returns on investment must be carefully measured)
  • competition and market incentives (since it is believed that the ‘invisible hand’ of the market leads to the greatest benefits)
  • the private sector in education and a rolling back of the role of the state (since it is believed that private ownership improves efficiency)

The package of measures is a straightforward reflection of ‘what Western mainstream economists believe’ (Castro, 2009: 474).

Mainstream Western economics is, however, going through something of a rocky patch right now. Human capital theory is ‘useful when prevailing conditions are right’ (Jones, 2007: 248), but prevailing conditions are not right in much of the world (even in the United States), and the theory ‘for the most part ignores the intersections of poverty, equity and education’ (Menashy, 2012). In poorer countries evidence for the positive effects of markets in education is in very short supply, and even in richer countries it is still not conclusive (Verger & Bonal, 2012: 135). An OECD Education Paper (Waslander et al., 2010: 64) found that the effects of choice and competition between schools were at best small, if indeed any effects were found at all. Similarly, the claim that privatization improves efficiency is not sufficiently supported by evidence. Analyses of PISA data would seem to indicate that, ‘all else being equal (especially when controlling for the socio-economic status of the students), the type of ownership of the school, whether it is a private or a state school, has only modest effects on student achievement or none at all’ (Verger & Bonal, 2012: 133). Educational privatization as a one-size-fits-all panacea to educational problems has little to recommend it.

There are, then, serious limitations in the Bank’s theoretical approach. Its practical track record is also less than illustrious, even by the Bank’s own reckoning. Many of the Bank’s interventions have proved very ‘costly to developing countries. At the Bank’s insistence countries over-invested in vocational and technical education. Because of the narrow definition of recurrent costs, countries ignored investments in reading materials and in maintaining teacher salaries. Later at the Bank’s insistence, countries invested in thousands of workshops and laboratories that, for the most part, became useless “white elephants”’ (Heyneman, 2003: 333).

As a bank, the World Bank is naturally interested in the rate of return of investment in that capital, and is therefore concerned with efficiency and efficacy. This raises the question of ‘Effective for what?’ and given that what may be effective for one individual or group may not necessarily be effective for another individual or group, one may wish to add a second question: ‘Effective for whom?’ (Biesta, 2020: 31). Critics of the World Bank, of whom there are many, argue that its policies serve ‘the interests of corporations by keeping down wages for skilled workers, cause global brain migration to the detriment of developing countries, undermine local cultures, and ensure corporate domination by not preparing school graduates who think critically and are democratically oriented’ (Spring, 2015: 56). Lest this sound a bit harsh, we can turn to the Bank’s own commissioned history: ‘The way in which [the Bank’s] ideology has been shaped conforms in significant degree to the interests and conventional wisdom of its principal stockholders [i.e. bankers and economists from wealthy nations]. International competitive bidding, reluctance to accord preferences to local suppliers, emphasis on financing foreign exchange costs, insistence on a predominant use of foreign consultants, attitudes toward public sector industries, assertion of the right to approve project managers – all proclaim the Bank to be a Western capitalist institution’ (Mason & Asher, 1973: 478 – 479).

The teaching of ‘life skills’, the promotion of data-capturing digital technologies and the push to evaluate teachers’ performance are, then, all closely linked to the agenda of the World Bank, and owe their existence in the ELT landscape, in no small part, to the way that the World Bank has shaped educational discourse. There is, however, one other connection between ELT and the World Bank which must be mentioned.

The World Bank’s foreign language instructional goals are directly related to English as a global language. The Bank urges, ‘Policymakers in developing countries … to ensure that young people acquire a language with more than just local use, preferably one used internationally.’ What is this international language? First, the World Bank mentions that schools of higher education around the world are offering courses in English. In addition, the Bank states, ‘People seeking access to international stores of knowledge through the internet require, principally, English language skills.’ (Spring, 2015: 48).

Without the World Bank, then, there might be a lot less English language teaching than there is. I have written this piece to encourage people to think more about the World Bank, its policies and particular instantiations of those policies. You might or might not agree that the Bank is an undemocratic, technocratic, neoliberal institution unfit for the necessities of today’s world (Klees, 2017). But whatever you think about the World Bank, you might like to consider the answers to Tony Benn’s ‘five little democratic questions’ (quoted in Sadowski, 2020: 17):

  • What power has it got?
  • Where did it get this power from?
  • In whose interests does it exercise this power?
  • To whom is it accountable?
  • How can we get rid of it?

References

Bau, N. and Das, J. (2017). The Misallocation of Pay and Productivity in the Public Sector: Evidence from the Labor Market for Teachers. Policy Research Working Paper No. 8050. Washington, DC: World Bank. Retrieved [18 May 2020] from https://openknowledge.worldbank.org/handle/10986/26502

Beech, J. (2009). Who is Strolling Through The Global Garden? International Agencies and Educational Transfer. In Cowen, R. and Kazamias, A. M. (Eds.) Second International Handbook of Comparative Education. Dordrecht: Springer. pp. 341 – 358

Biesta, G. (2020). Educational Research. London: Bloomsbury.

Castro, C. De M., (2009). Can Multilateral Banks Educate The World? In Cowen, R. and Kazamias, A. M. (Eds.) Second International Handbook of Comparative Education. Dordrecht: Springer. pp. 455 – 478

Compton, M. and Weiner, L. (Eds.) (2008). The Global Assault on Teaching, Teachers, and their Unions. New York: Palgrave Macmillan

De Bruyckere, P., Kirschner, P.A. and Hulshof, C. (2020). More Urban Myths about Learning and Education. New York: Routledge.

De Siqueira, A. C. (2012). The 2020 World Bank Education Strategy: Nothing New, or the Same Old Gospel. In Klees, S. J., Samoff, J. and Stromquist, N. P. (Eds.) The World Bank and Education. Rotterdam: Sense Publishers. pp. 69 – 81

Heyneman, S.P. (2003). The history and problems in the making of education policy at the World Bank 1960–2000. International Journal of Educational Development 23 (2003) pp. 315–337. Retrieved [18 May 2020] from https://www.academia.edu/29593153/The_History_and_Problems_in_the_Making_of_Education_Policy_at_the_World_Bank_1960_2000

Jones, P. W. (2007). World Bank Financing of Education. 2nd edition. Abingdon, Oxon.: Routledge.

Klees, S. (2017). A critical analysis of the World Bank’s World Development Report on education. Retrieved [18 May 2020] from: https://www.brettonwoodsproject.org/2017/11/critical-analysis-world-banks-world-development-report-education/

Mason, E. S. & Asher, R. E. (1973). The World Bank since Bretton Woods. Washington, DC: Brookings Institution.

Menashy, F. (2012). Review of Klees, S. J., Samoff, J. & Stromquist, N. P. (Eds) (2012). The World Bank and Education: Critiques and Alternatives. Rotterdam: Sense Publishers. Education Review, 15. Retrieved [18 May 2020] from https://www.academia.edu/7672656/Review_of_The_World_Bank_and_Education_Critiques_and_Alternatives

Rizvi, F. & Lingard, B. (2010). Globalizing Education Policy. Abingdon, Oxon.: Routledge.

Sadowski, J. (2020). Too Smart. Cambridge, MA.: MIT Press.

Sahlberg, P. (2016). The global educational reform movement and its impact on schooling. In K. Mundy, A. Green, R. Lingard, & A. Verger (Eds.), The handbook of global policy and policymaking in education. New York, NY: Wiley-Blackwell. pp.128 – 144

Selwyn, N. (2013). Education in a Digital World. New York: Routledge.

Spring, J. (2015). Globalization of Education 2nd Edition. New York: Routledge.

Trilling, B. & C. Fadel (2009). 21st Century Skills. San Francisco: Wiley

Verger, A. & Bonal, X. (2012). ‘All Things Being Equal?’ In Klees, S. J., Samoff, J. and Stromquist, N. P. (Eds.) The World Bank and Education. Rotterdam: Sense Publishers. pp. 69 – 81

Waslander, S., Pater, C. & van der Weide, M. (2010). Markets in Education: An analytical review of empirical research on market mechanisms in education. OECD EDU Working Paper 52.

Weiner, L. (2012). Social Movement Unionism: Teachers Can Lead the Way. Reimagine, 19 (2) Retrieved [18 May 2020] from: https://www.reimaginerpe.org/19-2/weiner-fletcher

Williamson, B., Bayne, S. & Shay, S. (2020). The datafication of teaching in Higher Education: critical issues and perspectives, Teaching in Higher Education, 25:4, 351-365, DOI: 10.1080/13562517.2020.1748811

World Bank. (2013). Life skills: what are they, why do they matter, and how are they taught? (English). Adolescent Girls Initiative (AGI) learning from practice series. Washington, DC: World Bank. Retrieved [18 May 2020] from: http://documents.worldbank.org/curated/en/569931468331784110/Life-skills-what-are-they-why-do-they-matter-and-how-are-they-taught

In my last post, I looked at shortcomings in edtech research, mostly from outside the world of ELT. I made a series of recommendations of ways in which such research could become more useful. In this post, I look at two very recent collections of ELT edtech research. The first of these is Digital Innovations and Research in Language Learning, edited by Mavridi and Saumell, and published this February by the Learning Technologies SIG of IATEFL. I’ll refer to it here as DIRLL. It’s available free to IATEFL LT SIG members, and can be bought for $10.97 as an ebook on Amazon (US). The second is the most recent edition (February 2020) of the Language Learning & Technology journal, which is open access and available here. I’ll refer to it here as LLTJ.

In both of these collections, the focus is not on ‘technology per se, but rather issues related to language learning and language teaching, and how they are affected or enhanced by the use of digital technologies’. However, they are very different kinds of publication. Nobody involved in the production of DIRLL got paid in any way (to the best of my knowledge) and, in keeping with its provenance from a teachers’ association, the book has ‘a focus on the practitioner as teacher-researcher’. Almost all of the contributing authors are university-based, but they are typically involved more in language teaching than in research. With one exception (a grant from the EU), their work was unfunded.

The triannual LLTJ is funded by two American universities and published by the University of Hawaii Press. The editors and associate editors are well-known scholars in their fields. The journal’s impact factor is high, close to the impact factor of the paywalled reCALL (published by the University of Cambridge), which is the highest-ranking journal in the field of CALL. The contributing authors are all university-based, many with a string of published articles (in prestige journals), chapters or books behind them. At least six of the studies were funded by national grant-awarding bodies.

I should begin by making clear that there was much in both collections that I found interesting. However, it was not usually the research itself that I found informative, but the literature review that preceded it. Two of the chapters in DIRLL were not really research, anyway. One was the development of a template for evaluating ICT-mediated tasks in CLIL, another was an advocacy of comics as a resource for language teaching. Both of these were new, useful and interesting to me. LLTJ included a valuable literature review of research into VR in FL learning (but no actual new research). With some exceptions in both collections, though, I felt that I would have been better off curtailing my reading after the reviews. Admittedly, there wouldn’t be much in the way of literature reviews if there were no previous research to report …

It was no surprise to see that the learners who were the subjects of this research were overwhelmingly university students. In fact, only one article (about a high-school project in Israel, reported in DIRLL) was not about university students. The research areas focused on reflected this bias towards tertiary contexts: online academic reading skills, academic writing, online reflective practices in teacher training programmes, etc.

In a couple of cases, the selection of experimental subjects seemed plain bizarre. Why, if you want to find out about the extent to which Moodle use can help EAP students become better academic readers (in DIRLL), would you investigate this with a small volunteer cohort of postgraduate students of linguistics, with previous experience of using Moodle and experience of teaching? Is a less representative sample imaginable? Why, if you want to investigate the learning potential of the English File Pronunciation app (reported in LLTJ), which is clearly most appropriate for A1 – B1 levels, would you do this with a group of C1-level undergraduates following a course in phonetics as part of an English Studies programme?

More problematic, in my view, was the small sample size in many of the research projects. The Israeli virtual high school project (DIRLL), previously referred to, started out with only 11 students, but 7 dropped out, primarily, it seems, because of institutional incompetence: ‘the project was probably doomed […] to failure from the start’, according to the author. Interesting as this was as an account of how not to set up a project of this kind, it is simply impossible to draw any conclusions from 4 students about the potential of a VLE for ‘interaction, focus and self-paced learning’. The questionnaire investigating experience of and attitudes towards VR (in DIRLL) was completed by only 7 (out of 36 possible) students and 7 (out of 70+ possible) teachers. As the author acknowledges, ‘no great claims can be made’, but then goes on to note the generally ‘positive attitudes to VR’. Perhaps those who did not volunteer had different attitudes? We will never know. The study of motivational videos in tertiary education (DIRLL) started off with 15 subjects, but 5 did not complete the necessary tasks. The research into L1 use in videoconferencing (LLTJ) started off with 10 experimental subjects, all with the same L1 and similar cultural backgrounds, but there was no data available from 4 of them (because they never switched into L1). The author claims that the paper demonstrates ‘how L1 is used by language learners in videoconferencing as a social semiotic resource to support social presence’ – something which, after reading the literature review, we already knew. But the paper also demonstrates quite clearly how L1 is not used by language learners in videoconferencing as a social semiotic resource to support social presence. In all these cases, it is the participants who did not complete or the potential participants who did not want to take part that have the greatest interest for me.

Unsurprisingly, the LLTJ articles had larger sample sizes than those in DIRLL, but in both collections the length of the research was limited. The production of one motivational video (DIRLL) does not really allow us to draw any conclusions about the development of students’ critical thinking skills. Two four-week interventions do not really seem long enough to me to discover anything about learner autonomy and Moodle (DIRLL). An experiment looking at different feedback modes needs more than two written assignments to reach any conclusions about student preferences (LLTJ).

More research might well be needed to compensate for the short-term projects with small sample sizes, but I’m not convinced that this is always the case. Lacking sufficient information about the content of the technologically-mediated tools being used, I was often unable to reach any conclusions. A gamified Twitter environment was developed in one project (DIRLL), using principles derived from contemporary literature on gamification. The authors concluded that the game design ‘failed to generate interaction among students’, but without knowing a lot more about the specific details of the activity, it is impossible to say whether the problem was the principles or the particular instantiation of those principles. Another project, looking at the development of pronunciation materials for online learning (LLTJ), came to the conclusion that online pronunciation training was helpful – better than none at all. Claims are then made about the value of the method used (called ‘innovative Cued Pronunciation Readings’), but this is not compared to any other method / materials, and only a very small selection of these materials are illustrated. Basically, the reader of this research has no choice but to take things on trust. The study looking at the use of Alexa to help listening comprehension and speaking fluency (LLTJ) cannot really tell us anything about IPAs (intelligent personal assistants) unless we know more about the particular way that Alexa is being used. Here, it seems that the students were using Alexa in an interactive storytelling exercise, but so little information is given about the exercise itself that I didn’t actually learn anything at all. The author’s own conclusion is that the results, such as they are, need to be treated with caution. Nevertheless, he adds ‘the current study illustrates that IPAs may have some value to foreign language learners’.

This brings me onto my final gripe. To be told that IPAs like Alexa may have some value to foreign language learners is to be told something that I already know. This wasn’t the only time this happened during my reading of these collections. I appreciate that research cannot always tell us something new and interesting, but a little more often would be nice. I ‘learnt’ that goal-setting plays an important role in motivation and that gamification can boost short-term motivation. I ‘learnt’ that reflective journals can take a long time for teachers to look at, and that reflective video journals are also very time-consuming. I ‘learnt’ that peer feedback can be very useful. I ‘learnt’ from two papers that intercultural difficulties may be exacerbated by online communication. I ‘learnt’ that text-to-speech software is pretty good these days. I ‘learnt’ that multimodal literacy can, most frequently, be divided up into visual and auditory forms.

With the exception of a piece about online safety issues (DIRLL), I did not once encounter anything which hinted that there may be problems in using technology. No mention of the use to which student data might be put. No mention of the costs involved (except for the observation that many students would not be happy to spend money on the English File Pronunciation app) or the cost-effectiveness of digital ‘solutions’. No consideration of the institutional (or other) pressures (or the reasons behind them) that may be applied to encourage teachers to ‘leverage’ edtech. No suggestion that a zero-tech option might actually be preferable. In both collections, the language used is invariably positive, or, at least, technology is associated with positive things: uncovering the possibilities, promoting autonomy, etc. Even if the focus of these publications is not on technology per se (although I think this claim doesn’t really stand up to close examination), it’s a little disingenuous to claim (as LLTJ does) that the interest is in how language learning and language teaching is ‘affected or enhanced by the use of digital technologies’. The reality is that the overwhelming interest is in potential enhancements, not potential negative effects.

I have deliberately not mentioned any names in referring to the articles I have discussed. I would, though, like to take my hat off to the editors of DIRLL, Sophia Mavridi and Vicky Saumell, for attempting to do something a little different. I think that Alicia Artusi and Graham Stanley’s article (DIRLL) about CPD for ‘remote’ teachers was very good and should interest the huge number of teachers working online. Chryssa Themelis and Julie-Ann Sime have kindled my interest in the potential of comics as a learning resource (DIRLL). Yu-Ju Lan’s article about VR (LLTJ) is surely the most up-to-date, go-to article on this topic. There were other pieces, or parts of pieces, that I liked, too. But, to me, it’s clear that ‘more research’ is needed much less than (1) better and more critical research, and (2) more digestible summaries of research.

Colloquium

At the beginning of March, I’ll be going to Cambridge to take part in a Digital Learning Colloquium (for more information about the event, see here). One of the questions that will be explored is how research might contribute to the development of digital language learning. In this, the first of two posts on the subject, I’ll be taking a broad overview of the current state of play in edtech research.

I try my best to keep up to date with research. Of the main journals, there are Language Learning and Technology, which is open access; CALICO, which offers quite a lot of open access material; and reCALL, which is the most restricted in terms of access of the three. But there is something deeply frustrating about most of this research, and this is what I want to explore in these posts. More often than not, research articles end with a call for more research. And more often than not, I find myself saying ‘Please, no, not more research like this!’

First, though, I would like to turn to a more reader-friendly source of research findings. Systematic reviews are, basically, literature reviews which can save people like me from having to plough through endless papers on similar subjects, all of which contain the same (or similar) literature review in the opening sections. If only there were more of them. Others agree with me: the conclusion of one systematic review of learning and teaching with technology in higher education (Lillejord et al., 2018) was that more systematic reviews were needed.

Last year saw the publication of a systematic review of research on artificial intelligence applications in higher education (Zawacki-Richter, et al., 2019) which caught my eye. The first thing that struck me about this review was that ‘out of 2656 initially identified publications for the period between 2007 and 2018, 146 articles were included for final synthesis’. In other words, only just over 5% of the research was considered worthy of inclusion.

The review did not paint a very pretty picture of the current state of AIEd research. As the second part of the title of this review (‘Where are the educators?’) makes clear, the research, taken as a whole, showed a ‘weak connection to theoretical pedagogical perspectives’. This is not entirely surprising. As Bates (2019) has noted: ‘since AI tends to be developed by computer scientists, they tend to use models of learning based on how computers or computer networks work (since of course it will be a computer that has to operate the AI). As a result, such AI applications tend to adopt a very behaviourist model of learning: present / test / feedback.’ More generally, it is clear that technology adoption (and research) is being driven by technology enthusiasts, with insufficient expertise in education. The danger is that edtech developers ‘will simply ‘discover’ new ways to teach poorly and perpetuate erroneous ideas about teaching and learning’ (Lynch, 2017).

This, then, is the first of my checklist of things that, collectively, researchers need to do to improve the value of their work. The rest of this list is drawn from observations mostly, but not exclusively, from the authors of systematic reviews, and mostly come from reviews of general edtech research. In the next blog post, I’ll look more closely at a recent collection of ELT edtech research (Mavridi & Saumell, 2020) to see how it measures up.

1 Make sure your research is adequately informed by educational research outside the field of edtech

Unproblematised behaviourist assumptions about the nature of learning are all too frequent. References to learning styles are still fairly common. The skill most frequently investigated in the context of edtech is critical thinking (Sosa Neira, et al., 2017), but it is rarely defined and almost never problematised, despite a broad literature that questions the construct.

2 Adopt a sceptical attitude from the outset

Know your history. Decades of technological innovation in education have shown precious little in the way of educational gains and, more than anything else, have taught us that we need to be sceptical from the outset. ‘Enthusiasm and praise that are directed towards “virtual education”, “school 2.0”, “e-learning” and the like’ (Selwyn, 2014: vii) are indications that the lessons of the past have not been sufficiently absorbed (Levy, 2016: 102). The phrase ‘exciting potential’, for example, should be banned from all edtech research. See, for example, a ‘state-of-the-art analysis of chatbots in education’ (Winkler & Söllner, 2018), which has nothing to conclude but ‘exciting potential’. Potential is fine (indeed, it is perhaps the only thing that research can unambiguously demonstrate – see section 3 below), but can we try to be a little more grown-up about things?

3 Know what you are measuring

Measuring learning outcomes is tricky, to say the least, but it’s understandable that researchers should try to focus on them. Unfortunately, ‘the vast array of literature involving learning technology evaluation makes it challenging to acquire an accurate sense of the different aspects of learning that are evaluated, and the possible approaches that can be used to evaluate them’ (Lai & Bower, 2019). Metrics such as student grades are hard to interpret, not least because of the large number of variables and the danger of many things being conflated in one score. Equally, or possibly even more, problematic are self-reporting measures, which are rarely robust. It seems that surveys are the most widely used instrument in qualitative research (Sosa Neira, et al., 2017), but these will tell us little or nothing when used for short-term interventions (see point 5 below).

4 Ensure that the sample size is big enough to mean something

In most of the research into digital technology in education that was analysed in a literature review carried out for the Scottish government (ICF Consulting Services Ltd, 2015), there were only ‘small numbers of learners or teachers or schools’.
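To put some rough numbers on why ‘small numbers of learners’ is fatal, here is a back-of-the-envelope power calculation. It is a sketch using only the Python standard library; the normal approximation and the conventional thresholds (alpha = 0.05, power = 0.8, Cohen’s d for effect size) are my assumptions, not figures from any of the studies discussed.

```python
import math
from statistics import NormalDist

def n_per_group(effect_size: float, alpha: float = 0.05, power: float = 0.8) -> int:
    """Normal-approximation sample size per group needed to detect a
    standardised mean difference (Cohen's d) in a two-group comparison."""
    z = NormalDist().inv_cdf
    z_alpha = z(1 - alpha / 2)  # two-tailed test
    z_beta = z(power)
    return math.ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

def achieved_power(effect_size: float, n: int, alpha: float = 0.05) -> float:
    """Approximate power actually achieved with n subjects per group."""
    nd = NormalDist()
    z_alpha = nd.inv_cdf(1 - alpha / 2)
    return nd.cdf(effect_size * math.sqrt(n / 2) - z_alpha)

# A 'medium' effect (d = 0.5) needs roughly 63 learners per group;
# with only 5 per group, the chance of detecting it is around 12%.
print(n_per_group(0.5))
print(round(achieved_power(0.5, 5), 2))
```

In other words, a study with a handful of participants per condition is almost guaranteed to ‘find’ nothing, whatever the truth of the matter, which is one reason the null results in such projects tell us so little.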

5 Privilege longitudinal studies over short-term projects

The Scottish government literature review (ICF Consulting Services Ltd, 2015) also noted that ‘most studies that attempt to measure any outcomes focus on short and medium term outcomes’. The fact that the use of a particular technology has some sort of impact over the short or medium term tells us very little of value. Unless there is very good reason to suspect the contrary, we should assume that what has been captured is a novelty effect (Levy, 2016: 102).

6 Don’t forget the content

The starting point of much edtech research is the technology, but most edtech, whether it’s a flashcard app or a full-blown Moodle course, has content. Research reports rarely give details of this content, assuming perhaps that it’s just fine, and that all that’s needed is a little tech to ‘present learners with the “right” content at the “right” time’ (Lynch, 2017). It’s a foolish assumption. Take a random educational app from the Play Store, or a random MOOC, and the chances are you’ll find it’s crap.

7 Avoid anecdotal accounts of technology use in quasi-experiments as the basis of a ‘research article’

Control (i.e. technology-free) groups may not always be possible, but without them we’re unlikely to learn much from a single study. What would be extremely useful, however, is a large, collated collection of such action-research projects, using the same or similar technology, in a variety of settings. There is a marked absence of this kind of work.

8 Enough already of higher education contexts

Researchers typically work in universities, where they have captive students on whom they can carry out research. But we have a problem here. The systematic review of Lundin et al. (2018), for example, found that ‘studies on flipped classrooms are dominated by studies in the higher education sector’ (besides lacking anchors in learning theory or instructional design). Primary and secondary contexts urgently need to be investigated in more detail, and not just with regard to flipped learning.

9 Be critical

Very little edtech research considers the downsides of edtech adoption. Online safety, privacy and data security are hardly peripheral issues, especially with younger learners. Ignoring them won’t make them go away.

More research?

So do we need more research? For me, two things stand out. We might benefit more from, firstly, a different kind of research, and, secondly, more syntheses of the work that has already been done. Although I will probably continue to dip into the pot-pourri of articles published in the main CALL journals, I’m looking forward to a change at the CALICO journal. From September of this year, one issue a year will be thematic, with a lead article written by established researchers which will ‘first discuss in broad terms what has been accomplished in the relevant subfield of CALL. It should then outline which questions have been answered to our satisfaction and what evidence there is to support these conclusions. Finally, this article should pose a “soft” research agenda that can guide researchers interested in pursuing empirical work in this area’. This will be followed by two or three empirical pieces that ‘specifically reflect the research agenda, methodologies, and other suggestions laid out in the lead article’.

But I think I’ll still have a soft spot for some of the other journals that are coyer about their impact factor and that can be freely accessed. How else would I discover (it would be too mean to give the references here) that ‘the effective use of new technologies improves learners’ language learning skills’? Presumably, the ineffective use of new technologies has the opposite effect? Or that ‘the application of modern technology represents a significant advance in contemporary English language teaching methods’?

References

Bates, A. W. (2019). Teaching in a Digital Age Second Edition. Vancouver, B.C.: Tony Bates Associates Ltd. Retrieved from https://pressbooks.bccampus.ca/teachinginadigitalagev2/

ICF Consulting Services Ltd (2015). Literature Review on the Impact of Digital Technology on Learning and Teaching. Edinburgh: The Scottish Government. https://dera.ioe.ac.uk/24843/1/00489224.pdf

Lai, J.W.M. & Bower, M. (2019). How is the use of technology in education evaluated? A systematic review. Computers & Education, 133(1), 27-42. Elsevier Ltd. Retrieved January 14, 2020 from https://www.learntechlib.org/p/207137/

Levy, M. (2016). Researching in language learning and technology. In Farr, F. & Murray, L. (Eds.), The Routledge Handbook of Language Learning and Technology (pp. 101–114). Abingdon, Oxon.: Routledge

Lillejord S., Børte K., Nesje K. & Ruud E. (2018). Learning and teaching with technology in higher education – a systematic review. Oslo: Knowledge Centre for Education https://www.forskningsradet.no/siteassets/publikasjoner/1254035532334.pdf

Lundin, M., Bergviken Rensfeldt, A., Hillman, T. et al. (2018). Higher education dominance and siloed knowledge: a systematic review of flipped classroom research. International Journal of Educational Technology in Higher Education, 15, 20. doi:10.1186/s41239-018-0101-6

Lynch, J. (2017). How AI Will Destroy Education. Medium, November 13, 2017. https://buzzrobot.com/how-ai-will-destroy-education-20053b7b88a6

Mavridi, S. & Saumell, V. (Eds.) (2020). Digital Innovations and Research in Language Learning. Faversham, Kent: IATEFL

Selwyn, N. (2014). Distrusting Educational Technology. New York: Routledge

Sosa Neira, E. A., Salinas, J. & de Benito Crosetti, B. (2017). Emerging Technologies (ETs) in Education: A Systematic Review of the Literature Published between 2006 and 2016. International Journal of Emerging Technologies in Learning, 12(5). https://online-journals.org/index.php/i-jet/article/view/6939

Winkler, R. & Söllner, M. (2018). Unleashing the Potential of Chatbots in Education: A State-Of-The-Art Analysis. In: Academy of Management Annual Meeting (AOM). Chicago, USA. https://www.alexandria.unisg.ch/254848/1/JML_699.pdf

Zawacki-Richter, O., Bond, M., Marin, V. I. & Gouverneur, F. (2019). Systematic review of research on artificial intelligence applications in higher education – where are the educators? International Journal of Educational Technology in Higher Education, 16, 39

When the startup, AltSchool, was founded in 2013 by Max Ventilla, the former head of personalization at Google, it quickly drew the attention of venture capitalists and within a few years had raised $174 million from the likes of the Zuckerberg Foundation, Peter Thiel, Laurene Powell Jobs and Pierre Omidyar. It garnered gushing articles in a fawning edtech press which enthused about ‘how successful students can be when they learn in small, personalized communities that champion project-based learning, guided by educators who get a say in the technology they use’. It promised ‘a personalized learning approach that would far surpass the standardized education most kids receive’.

Ventilla was an impressive money-raiser who used, and appeared to believe, every cliché in the edtech sales manual. Dressed in regulation jeans, polo shirt and fleece, he claimed that schools in America were ‘stuck in an industrial-age model, [which] has been in steady decline for the last century’. What he offered, instead, was a learner-centred, project-based curriculum providing real-world lessons. There was a focus on social-emotional learning activities, and critical thinking was vital.

The key to the approach was technology. From the start, software developers, engineers and researchers worked alongside teachers every day, ‘constantly tweaking the Personalized Learning Plan, which shows students their assignments for each day and helps teachers keep track of and assess student’s learning’. There were tablets for pre-schoolers, laptops for older kids and wall-mounted cameras to record the lessons. There were, of course, Khan Academy videos. Ventilla explained that “we start with a representation of each child”, and even though “the vast majority of the learning should happen non-digitally”, the child’s habits and preferences get converted into data, “a digital representation of the important things that relate to that child’s learning, not just their academic learning but also their non-academic learning. Everything logistic that goes into setting up the experience for them, whether it’s who has permission to pick them up or their allergy information. You name it.” And just like Netflix matches us to TV shows, “If you have that accurate and actionable representation for each child, now you can start to personalize the whole experience for that child. You can create that kind of loop you described where because we can represent a child well, we can match them to the right experiences.”

AltSchool seemed to offer the possibility of doing something noble, of transforming education, ‘bringing it into the digital age’, and, at the same time, of generating a healthy return on investors’ money. Expanding rapidly, AltSchool opened nine microschools in New York and the Bay Area, and plans were afoot for further expansion in Chicago. But, by then, it was already clear that something was going wrong. Five of the schools were closed before they had really got started, and the attrition rate in some classrooms had reached about 30%. Revenue in 2018 was only $7 million and there were few buyers for the AltSchool platform. Quoting once more from the edtech bible, Ventilla explained the situation: ‘Our whole strategy is to spend more than we make.’ Since software is expensive to develop and cheap to distribute, the losses, he believed, would turn into steep profits once AltSchool refined its product and landed enough customers.

The problems were many and apparent. Some of the buildings were simply not appropriate for schools, with no playgrounds or gyms, and malfunctioning toilets, among other issues. Parents were becoming unhappy and accused AltSchool of putting ‘its ambitions as a tech company above its responsibility to teach their children. […] We kind of came to the conclusion that, really, AltSchool as a school was kind of a front for what Max really wants to do, which is develop software that he’s selling,’ a parent of a former AltSchool student told Business Insider. ‘We had really mediocre educators using technology as a crutch,’ said one father who transferred his child to a different private school after two years at AltSchool. ‘[…] We learned that it’s almost impossible to really customize the learning experience for each kid.’ Some parents began to wonder whether AltSchool had enticed families into its program merely to extract data from their children, then toss them aside.

With the benefit of hindsight, it would seem that the accusations were hardly unfair. In June of this year, AltSchool announced that its four remaining schools would be operated by a new partner, Higher Ground Education (a well-funded startup founded in 2016 which promotes and ‘modernises’ Montessori education). Meanwhile, AltSchool has been rebranded as Altitude Learning, focusing its ‘resources on the development and expansion of its personalized learning platform’ for licensing to other schools across the country.

Quoting once more from the edtech sales manual, Ventilla has said that education should drive the tech, not the other way round. Not so many years earlier, before starting AltSchool, Ventilla had said that he had read two dozen books on education and emerged a fan of Sir Ken Robinson. He had no experience as a teacher or as an educational administrator. Instead, he had ‘extensive knowledge of networks, and he understood the kinds of insights that can be gleaned from big data’.