
Since no single definition of critical thinking prevails (Dummett & Hughes, 2019: 2), discussions of the topic invariably begin with attempts to provide a definition. Lai (2011) offers an accessible summary of a range of possible meanings, but points out that, in educational contexts, its meaning is often rather vague and encompasses other concepts (such as higher order thinking skills) which also lack clarity. Paul Dummett and John Hughes (2019: 4) plump for ‘a mindset that involves thinking reflectively, rationally and reasonably’ – a definition which involves a vague noun (that could mean a fixed state of mind, a learned attitude, a disposition or a mood) and three highly subjective adverbs. I don’t think I could do any better. However, instead of looking for a definition, we can reach a sort of understanding by looking at examples of it. Dummett and Hughes’ book is extremely rich in practical examples, and the picture that emerges of critical thinking is complex and multifaceted.

As you might expect of a weasel word like ‘critical thinking’, there appears to be general agreement that it’s a ‘good thing’. Paul Dummett suggests that there are two common reasons for promoting the inclusion of critical thinking activities in the language classroom. The first of these is a desire to get students thinking for themselves. The second is the idea ‘that we live in an age of misinformation in which only the critically minded can avoid manipulation or slavish conformity’. Neither seems contentious at first glance, although he points out that ‘they tend to lead to a narrow application of critical thinking in ELT materials: that is to say, the analysis of texts and evaluation of the ideas expressed in them’. It’s the second of these rationales that I’d like to explore further.

Penny Ur (2020: 9) offers a more extended version of it:

The role of critical thinking in education has become more central in the 21st century, simply because there is far more information readily available to today’s students than there was in previous centuries (mainly, but not only, online), and it is vital for them to be able to deal with such input wisely. They need to be able to distinguish between what is important and what is trivial, between truth and lies, between fact and opinion, between logical argument and specious propaganda […] Without such skills and awareness of the need to exercise them, they are liable to find themselves victims of commercial or political interests, their thinking manipulated by persuasion disguised as information.

In the same edited collection Olja Milosevic (2020:18) echoes Ur’s argument:

Critical thinking becomes even more important as communication increasingly moves online. Students find an overwhelming amount of information and need to be taught how to evaluate its relevance, accuracy and quality. If teachers do not teach students how to go beyond surface meaning, students cannot be expected to practise it.

In the passages I’ve quoted, these writers are referring to one particular kind of critical thinking. The ability to critically evaluate the reliability, accuracy, etc. of a text is generally considered to be a part of what is usually called ‘media information literacy’. In these times of fake news, so the argument goes, it is vital for students to develop (with their teachers’ help) the necessary skills to spot fake news when they see it. The most prototypical critical thinking activity in ELT classrooms is probably one in which students analyse some fake news, such as the website about the Pacific Tree Octopus (the basis of a lesson in Dudeney et al., 2013: 198–203).

Before considering media information literacy in more detail, it’s worth noting in passing that a rationale for critical thinking activities is no rationale at all if it concerns only one aspect of critical thinking: it applies the attributes of a part (media information literacy) to a larger whole (critical thinking).

There is no shortage of good (free) material available for dealing with fake news in the ELT classroom. Examples include work by James Taylor, Chia Suan Chong and Tyson Seburn. Material of this kind may result in lively, interesting, cognitively challenging, communicative and, therefore, useful lessons. But how likely is it that such material will develop learners’ media information literacy and, by extension, their critical thinking skills? How likely is it that it will help people identify (and reject) fake news? Is it possible that material of this kind is valuable despite its rationale, rather than because of it? In the spirit of rational, reflective and reasonable thinking, these are questions worth exploring.

ELT classes and fake news

James Taylor has suggested that the English language classroom is ‘the perfect venue for [critical thinking] skills to be developed’. Although academic English courses necessarily involve elements of critical thinking, I’m not so sure that media information literacy (and, specifically, the identification of fake news) can be adequately addressed in general English classes. There are so many areas, besides those that are specifically language-focussed, competing for space in language classes (think of all those other 21st century skills), that it is hard to see how sufficient time can be found for real development of this skill. It requires modelling, practice of the skill, feedback on the practice, and more practice (Mulnix, 2010): it needs time. Fake news activities in the language classroom would, of course, be of greater value if they were part of an integrated approach across the curriculum. Unfortunately, this is rarely the case.

Information literacy skills

Training materials for media information literacy usually involve a number of stages: fact-checking and triangulation of different sources, consideration of the web address, analysis of images and of other items on the site, checking of source citations, and so on. The problem, however, is that news-fakers have become very good at what they do. The tree octopus site is very crude in comparison with what can now be produced by people who have learnt to profit from the online economy of misinformation. Facebook employs an army of algorithmic and human fact-checkers, but still struggles. The bottom line is that background knowledge is needed (this is as true for media information literacy as it is for critical thinking more generally) (Willingham, 2007). With news, the scope of domain knowledge is so vast that it is extremely hard to transfer one’s ability to critically evaluate one particular piece of news to another. We are all fooled from time to time.

Media information literacy interventions: research on effectiveness

With the onset of COVID-19, the ability to identify fake news has become, more than ever, a matter of life and death. There is little question that this ability correlates strongly with analytic thinking (see, for example, Stanley et al., 2020). What is much less clear is how we can go about promoting analytic thinking. Analytic thinking comes in different varieties, and another hot-off-the-press research study into susceptibility to COVID-19 fake news (Roozenbeek et al., 2020) has found that the ability to spot fake news may correlate more strongly with numerical literacy than with reasoning ability. In fact, the research team found that a lack of numerical literacy was the most consistent predictor of susceptibility to misinformation about COVID-19. Perhaps we are attempting to develop the wrong kind of analytic thinking?

In educational contexts, attempts to promote media information literacy typically seek to develop reasoning abilities, and the evidence for their effectiveness is mixed. First of all, it needs to be said that ‘little large-scale evidence exists on the effectiveness of promoting digital media literacy as a response to online misinformation’ (Guess et al., 2020). An early meta-analysis (Jeong et al., 2012) found that such interventions had a positive effect when they were sustained (not one-off), but that they impacted more on students’ knowledge than on their behaviour. More recently, Huguet et al. (2019) were unable to draw ‘definitive conclusions from past research, such as what kinds of media literacy practices work and under what conditions’. And this year, a study by Guess et al. (2020) did not generate sufficient evidence ‘to conclude that the [media information literacy] intervention changed real-world consumption of false news’. I am unaware of any robust research in this area in the context of ELT.

It’s all rather disappointing. Why are we not better at it? After all, teachers of media studies have been exploring pathways for many years now. One possible answer is this: Media information literacy, like critical thinking more generally, is a skill that is acquirable, but it can only be acquired if there is a disposition to do so. The ability to think critically and the disposition to do so are separate entities (Facione, 2000). Training learners to be more critical in their approach to media information may be so much pissing in the wind if the disposition to be sceptical is not there. Shaping dispositions is a much harder task than training skills.

Both of the research studies into susceptibility to COVID-19 misinformation that I referred to earlier in this section underscore the significance of dispositions to analytic thinking. Roozenbeek et al. (2020) found, in line with much previous research (for example, Jost et al., 2018), that political conservatism is associated with a slightly higher susceptibility to misinformation. Political views (on either side of the political spectrum) rarely change as a result of exposure to science or reasoned thinking. They also found that ‘self-identifying as a member of a minority predicts susceptibility to misinformation about the virus in all countries surveyed’ (except, interestingly, in the UK). Again, when issues of identity are at stake, emotional responses tend to trump rational ones.

Rational, reflective and reasonable thinking about media information literacy leads to an uncomfortable red-pill rabbit-hole. This is how Bulger and Davidson (2018) put it:

The extent to which media literacy can combat the problematic news environment is an open question. Is denying the existence of climate change a media literacy problem? Is believing that a presidential candidate was running a sex-trafficking ring out of a pizza shop a media literacy problem? Can media literacy combat the intentionally opaque systems of serving news on social media platforms? Or intentional campaigns of disinformation?

Teachers and fake news

The assumption that the critical thinking skills of young people can be developed through the intervention of their teachers is rarely problematized. It should be. A recent study of Spanish pre-service teachers (Fuertes-Prieto et al., 2020) showed that their ‘level of belief in pseudoscientific issues is comparable, or even higher in some cases to those of the general population’. There is no reason to believe that this changes after they have qualified. Teachers are probably no more likely to change their beliefs when presented with empirical evidence (Menz et al., 2020) than people from any other profession. Research has tended to focus on teachers’ lack of critical thinking in areas related to their work, but things may be no different in the wider world. It is estimated that over a quarter of teachers in the US voted for the world’s greatest peddler of fake news in the 2016 presidential election.

It is also interesting to note that the sharing of fake news on social media is much more widespread among older people (including US teachers who have an average age of 42.4) than those under 30 (Bouygues, 2019).

Institutional contexts and fake news

Cory Doctorow has suggested that the fake news problem is not a problem of identifying what is true and what is fake, but a problem ‘about how we know whether something is true. We’re not disagreeing about facts, we’re disagreeing about epistemology’. In a post-modernist world of ‘Truth Decay’ (Kavanagh & Rich, 2018), where there is ‘a blurring of the line between opinion and fact’, epistemological authority is a rare commodity. Medicine, the social sciences and applied linguistics are all currently experiencing a ‘replication crisis’ (Ioannidis, 2005), and a British education minister has told us that ‘people in this country have had enough of experts’.

News reporting has always relied to some extent on trust in the reliability of the news source. The BBC or CNN might attempt to present themselves as more objective than, say, Fox News or InfoWars, but trust in all news outlets has collapsed globally in recent years. As Michael Schudson has written in the Columbia Journalism Review, ‘all news outlets write from a set of values, not simply from a disinterested effort at truth’. If a particular news channel manifestly holds different values from your own, it is easy to reject the veracity of the news it reports. Believers in COVID conspiracy theories often hold their views precisely because of their rejection of the epistemological authority of mainstream news, the WHO, and the governments that support lockdown measures.

The training of media information literacy in schools is difficult because, for many people in the US (and elsewhere), education is not dissimilar to mainstream media. They ‘are seen as the enemy — two institutions who are trying to have power over how people think. Two institutions that are trying to assert authority over epistemology’ (boyd, 2018). Schools have always been characterized by imbalances in power (between students and teachers / administrators), and this power dynamic is not conducive to open-minded enquiry. Children are often more aware of the power of their teachers than they are accepting of their epistemological authority. They are enjoined to be critical thinkers, but only about certain things and only up to a certain point. One way for children to redress the power imbalance is to reject the epistemological authority of their teachers. I think this may explain why a group of young children I observed recently coming out of a lesson devoted to environmental issues found such pleasure in joking about Greta ‘Thunfisch’.

Power relationships in schools are reflected and enacted in the interaction patterns between teachers and students. The most common of these is ‘initiation-response-feedback (IRF)’ and it is unlikely that this is particularly conducive to rational, reflective and reasonable thinking. At the same time, as Richard Paul, one of the early advocates of critical thinking in schools, noted, much learning activity is characterised by lower order thinking skills, especially memorization (Paul, 1992: 22). With this kind of backdrop, training in media information literacy is more likely to be effective if it goes beyond the inclusion of a few ‘fake news’ exercises: a transformation in the way that the teaching is done will also be needed. Benesch (1999) describes this as a more ‘dialogic’ approach and there is some evidence that a more dialogic approach can have a positive impact on students’ dispositions (e.g. Hajhosseiny, 2012).

I think that David Buckingham (2019a) captures the educational problem very neatly:

There’s a danger here of assuming that we are dealing with a rational process – or at least one that can, by some pedagogical means, be made rational. But from an educational perspective, we surely have to begin with the question of why people might believe apparently ‘fake’ news in the first place. Where we decide to place our trust is as much to do with fantasy, emotion and desire, as with rational calculation. All of us are inclined to believe what we want to believe.

Fake news: a problem or a symptom of a problem?

There has always been fake news. The big problem now is ‘the speed and ease of its dissemination, and it exists primarily because today’s digital capitalism makes it extremely profitable – look at Google and Facebook – to produce and circulate false but click-worthy narratives’ (Morozov, 2017). Fake news taps into and amplifies broader tendencies and divides in society: the problem is not straightforward and is unlikely to be easy to eradicate (Buckingham, 2019a: 3).

There is increasing discussion of media regulation, and the recent banning by Facebook of Holocaust denial and QAnon content is a recognition that some regulation cannot now be avoided. But strict regulation would threaten the ‘basic business model, and the enormous profitability’ of social media companies (Buckingham, 2019b), and there are real practical and ethical problems in working out exactly how regulation would happen. Governments do not know what to do.

Lacking any obvious alternative, media information literacy is often seen as the solution: can’t we ‘fact check and moderate our way out of this conundrum’ (boyd, 2018)? danah boyd’s stark response is, no, this will fail. It’s an inadequate solution to an oversimplified problem (Buckingham, 2019a).

Along with boyd and Buckingham, I’m not arguing that we should drop media information literacy activities from educational (including ELT) programmes. Quite the opposite. But if we want our students to think reflectively, rationally and reasonably, I think we will need to start by doing the same.

References

Benesch, S. (1999). Thinking critically, thinking dialogically. TESOL Quarterly, 33: pp. 573 – 580

Bouygues, H. L. (2019). Fighting Fake News: Lessons From The Information Wars. Reboot Foundation https://reboot-foundation.org/fighting-fake-news/

boyd, d. (2018). You Think You Want Media Literacy… Do You? Data and Society: Points https://points.datasociety.net/you-think-you-want-media-literacy-do-you-7cad6af18ec2

Buckingham, D. (2019a). Teaching Media in a ‘Post-Truth’ Age: Fake News, Media Bias and the Challenge for Media Literacy Education. Cultura y Educación 31(2): pp. 1-19

Buckingham, D. (2019b). Rethinking digital literacy: Media education in the age of digital capitalism. https://ddbuckingham.files.wordpress.com/2019/12/media-education-in-digital-capitalism.pdf

Bulger, M. & Davidson, P. (2018). The Promises, Challenges and Futures of Media Literacy. Data and Society. https://datasociety.net/pubs/oh/DataAndSociety_Media_Literacy_2018.pdf

Doctorow, C. (2017). Three kinds of propaganda, and what to do about them. boingboing 25th February 2017, https://boingboing.net/2017/02/25/counternarratives-not-fact-che.html

Dudeney, G., Hockly, N. & Pegrum, M. (2013). Digital Literacies. Harlow: Pearson Education

Dummett, P. & Hughes, J. (2019). Critical Thinking in ELT. Boston: National Geographic Learning

Facione, P. A. (2000). The disposition toward critical thinking: Its character, measurement, and relation to critical thinking skill. Informal Logic, 20(1), 61–84.

Fuertes-Prieto, M. Á., Andrés-Sánchez, S., Corrochano-Fernández, D. et al. (2020). Pre-service Teachers’ False Beliefs in Superstitions and Pseudosciences in Relation to Science and Technology. Science & Education, 29: pp. 1235–1254. https://doi.org/10.1007/s11191-020-00140-8

Guess, A. M., Lerner, M., Lyons, B., Montgomery, J. M., Nyhan, B., Reifler, J. & Sircar, N. (2020). A digital media literacy intervention increases discernment between mainstream and false news in the United States and India. Proceedings of the National Academy of Sciences, 117 (27): pp. 15536–15545. DOI: 10.1073/pnas.1920498117

Hajhosseiny, M. (2012). The Effect of Dialogic Teaching on Students’ Critical Thinking Disposition. Procedia – Social and Behavioral Sciences, 69: pp. 1358 – 1368

Huguet, A., Kavanagh, J., Baker, G. & Blumenthal, M. S. (2019). Exploring Media Literacy Education as a Tool for Mitigating Truth Decay. RAND Corporation, https://www.rand.org/content/dam/rand/pubs/research_reports/RR3000/RR3050/RAND_RR3050.pdf

Ioannidis, J. P. A. (2005). Why Most Published Research Findings Are False. PLoS Medicine 2 (8): e124. https://doi.org/10.1371/journal.pmed.0020124

Jeong, S. H., Cho, H., & Hwang, Y. (2012). Media literacy interventions: A meta-analytic review. Journal of Communication, 62, pp. 454–472

Jones-Jang, S. M., Mortensen, T. & Liu, J. (2019). Does media literacy help identification of fake news? Information literacy helps, but other literacies don’t. American Behavioral Scientist, pp. 1 – 18, doi:10.1177/0002764219869406

Jost, J. T., van der Linden, S., Panagopoulos, C. & Hardin, C. D. (2018). Ideological asymmetries in conformity, desire for shared reality, and the spread of misinformation. Current Opinion in Psychology, 23: pp. 77–83. doi:10.1016/j.copsyc.2018.01.003

Kavanagh, J. & Rich, M. D. (2018). Truth Decay: An Initial Exploration of the Diminishing Role of Facts and Analysis in American Public Life. RAND Corporation, https://www.rand.org/pubs/research_reports/RR2314.html

Lai, E. R. (2011). Critical Thinking: A Literature Review. Pearson. http://images.pearsonassessments.com/images/tmrs/CriticalThinkingReviewFINAL.pdf

Menz, C., Spinath, B. & Seifried, E. (2020). Misconceptions die hard: prevalence and reduction of wrong beliefs in topics from educational psychology among preservice teachers. European Journal of Psychology of Education https://doi.org/10.1007/s10212-020-00474-5

Milosevic, O. (2020). Promoting critical thinking in the EFL classroom. In Mavridi, S. & Xerri, D. (Eds.) English for 21st Century Skills. Newbury, Berks.: Express Publishing. pp.17 – 22

Morozov, E. (2017). Moral panic over fake news hides the real enemy – the digital giants. The Guardian, 8 January 2017 https://www.theguardian.com/commentisfree/2017/jan/08/blaming-fake-news-not-the-answer-democracy-crisis

Mulnix, J. W. (2010). Thinking critically about critical thinking. Educational Philosophy and Theory

Paul, R. W. (1992). Critical thinking: What, why, and how? New Directions for Community Colleges, 77: pp. 3–24.

Roozenbeek, J., Schneider, C. R., Dryhurst, S., Kerr, J., Freeman, A. L. J., Recchia, G., van der Bles, A. M. & van der Linden, S. (2020). Susceptibility to misinformation about COVID-19 around the world. Royal Society Open Science, 7 (10) https://doi.org/10.1098/rsos.201199

Stanley, M., Barr, N., Peters, K. & Seli, P. (2020). Analytic-thinking predicts hoax beliefs and helping behaviors in response to the COVID-19 pandemic. PsyArXiv Preprints doi:10.31234/osf.io/m3vt

Ur, P. (2020). Critical Thinking. In Mavridi, S. & Xerri, D. (Eds.) English for 21st Century Skills. Newbury, Berks.: Express Publishing. pp.9 – 16

Willingham, D. T. (2007). Critical Thinking: Why Is It So Hard to Teach? American Educator Summer 2007: pp. 8 – 19

Take the Cambridge Assessment English website, for example. When you connect to the site, you will see, at the bottom of the screen, a familiar (to people in Europe, at least) notification about the site’s use of cookies: the cookie consent notice.

You probably trust the site, so ignore the notification and quickly move on to find the resource you are looking for. But if you did click on the hyperlinked ‘set cookies’, what would you find? The first link takes you to the ‘Cookie policy’ where you will be told that ‘We use cookies principally because we want to make our websites and mobile applications user-friendly, and we are interested in anonymous user behaviour. Generally our cookies don’t store sensitive or personally identifiable information such as your name and address or credit card details’. Scroll down, and you will find out more about the kind of cookies that are used. Besides the cookies that are necessary to the functioning of the site, you will see that there are also ‘third party cookies’. These are explained as follows: ‘Cambridge Assessment works with third parties who serve advertisements or present offers on our behalf and personalise the content that you see. Cookies may be used by those third parties to build a profile of your interests and show you relevant adverts on other sites. They do not store personal information directly but use a unique identifier in your browser or internet device. If you do not allow these cookies, you will experience less targeted content’.

This is not factually inaccurate: personal information is not stored directly. However, it is extremely easy for this information to be triangulated with other information to identify you personally. In addition to the data that you generate by having cookies on your device, Cambridge Assessment will also directly collect data about you. Depending on your interactions with Cambridge Assessment, this will include ‘your name, date of birth, gender, contact data including your home/work postal address, email address and phone number, transaction data including your credit card number when you make a payment to us, technical data including internet protocol (IP) address, login data, browser type and technology used to access this website’. They say they may share this data ‘with other people and/or businesses who provide services on our behalf or at our request’ and ‘with social media platforms, including but not limited to Facebook, Google, Google Analytics, LinkedIn, in pseudonymised or anonymised forms’.

In short, Cambridge Assessment may hold a huge amount of data about you and they can, basically, do what they like with it.

The cookie and privacy policies are fairly standard, as is the lack of transparency in their phrasing. Rather more transparency would include, for example, information about which particular ad trackers you are giving your consent to. This information can be found with a browser extension tool like Ghostery, and these trackers can be blocked. There are 5 ad trackers on this site, rather more than on other sites that English language teachers are likely to visit. ETS-TOEFL has 4, Macmillan English and Pearson have 3, CUP ELT and the British Council Teaching English have 1, and OUP ELT, IATEFL, BBC Learning English and Trinity College have none. Of the sites I checked, only TESOL, with 6 ad trackers, has more. The blogs for all these organisations invariably have more trackers than their websites.
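
What a tool like Ghostery does is conceptually simple: collect the hosts of externally loaded scripts on a page and match them against a blocklist of known tracker domains. Here is a minimal Python sketch of the idea; the `KNOWN_TRACKERS` set and the `count_trackers` helper are my own illustrative inventions, whereas real extensions ship curated databases of thousands of tracker domains:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

# A tiny illustrative blocklist -- real tools ship databases of
# thousands of known tracker domains, updated continuously.
KNOWN_TRACKERS = {"doubleclick.net", "facebook.net", "google-analytics.com"}

class ScriptHostCollector(HTMLParser):
    """Collects the hostnames of all externally loaded <script> tags."""
    def __init__(self):
        super().__init__()
        self.hosts = set()

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            src = dict(attrs).get("src", "")
            host = urlparse(src).hostname
            if host:
                self.hosts.add(host)

def count_trackers(html, first_party):
    """Counts third-party script hosts that appear on the blocklist."""
    parser = ScriptHostCollector()
    parser.feed(html)
    third_party = {h for h in parser.hosts if not h.endswith(first_party)}
    return sum(any(h.endswith(t) for t in KNOWN_TRACKERS) for h in third_party)

page = """<html><body>
<script src="https://www.example.org/app.js"></script>
<script src="https://www.google-analytics.com/analytics.js"></script>
<script src="https://connect.facebook.net/en_US/fbevents.js"></script>
</body></html>"""

print(count_trackers(page, "example.org"))  # -> 2
```

The point of the sketch is only that tracker detection is pattern matching against a list someone else maintains: it can only ever catch the trackers the list already knows about.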

The use of numerous ad trackers probably reflects the importance that Cambridge Assessment gives to social media marketing. A research paper produced by Cambridge Assessment outlines the significance of big data and social media analytics. They have far more Facebook followers (and nearly 6 million likes) than any other ELT page, and they are proud of their #1 ranking in the education category on social media. The amount of data that can be collected here is enormous, and it can be analysed in myriad ways using tools like Ubervu, Yomego and Hootsuite.

A little more transparency, however, would not go amiss. According to a report in Vox, Apple has announced that some time next year ‘iPhone users will start seeing a new question when they use many of the apps on their devices: Do they want the app to follow them around the internet, tracking their behavior?’ Obviously, Google and Facebook are none too pleased about this and will be fighting back. The implications for ad trackers and online advertising, more generally, are potentially huge. I wrote to Cambridge Assessment about this and was pleased to hear that ‘Cambridge Assessment are currently reviewing the process by which we obtain users consent for the use of cookies with the intention of moving to a much more transparent model in the future’. Let’s hope that other ELT organisations are doing the same.

You may be less bothered than I am by the thought of dozens of ad trackers following you around the net so that you can be served with more personalized ads. But the digital profile about you, to which these cookies contribute, may include information about your ethnicity, disabilities and sexual orientation. This profile is auctioned to advertisers when you visit some sites, allowing them to show you ‘personalized’ adverts based on the categories in your digital profile. Contrary to EU regulations, these categories may include whether you have cancer, a substance-abuse problem, your politics and religion (as reported in Fortune https://fortune.com/2019/01/28/google-iab-sensitive-profiles/ ).

But it’s not these cookies that are the most worrying aspect of our lack of digital privacy. It’s the sheer quantity of personal data that is stored about us. Every time we ask our students to use an app or a platform, we are asking them to divulge huge amounts of data. With ClassDojo, for example, this includes names, usernames, passwords, age, addresses, photographs, videos, documents, drawings and audio files, IP addresses and browser details, clicks, referring URLs, time spent on site, and page views (Manolev et al., 2019; see also Williamson, 2019).

It is now widely recognized that the ‘consent’ that is obtained through cookie policies and other end-user agreements is largely spurious. These consent agreements, as Sadowski (2019) observes, are non-negotiated, and non-negotiable; you either agree or you are denied access. What’s more, he adds, citing one study, it would take 76 days, working for 8 hours a day, to read the privacy policies a person typically encounters in a year. As a result, most of us choose not to choose when we accept online services (Cobo, 2019: 25). We have little, if any, control over how the data that is collected is used (Birch et al., 2020). More importantly, perhaps, when we ask our students to sign up to an educational app, we are asking / telling them to give away their personal data, not just ours. They are unlikely to fully understand the consequences of doing so.
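
The ‘76 days’ figure is easy to sanity-check with back-of-the-envelope arithmetic. In the sketch below, the per-policy word count and reading speed are my own illustrative assumptions, not figures from the study Sadowski cites:

```python
# Back-of-the-envelope check on the '76 working days' figure.
days, hours_per_day = 76, 8
total_hours = days * hours_per_day          # 608 hours of reading per year

# Illustrative assumptions, not figures from the cited study:
words_per_policy = 2500    # length of a typical privacy policy
reading_speed = 250        # words read per minute
minutes_per_policy = words_per_policy / reading_speed   # 10 minutes each

policies_per_year = total_hours * 60 / minutes_per_policy
print(total_hours, round(policies_per_year))
```

On these (conservative) assumptions, 608 hours corresponds to several thousand ten-minute policies a year, which gives a sense of just how many separate consent agreements a typical internet user clicks through.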

The extent of this ignorance is also now widely recognized. In the UK, for example, two reports (cited by Sander, 2020) indicate that ‘only a third of people know that data they have not actively chosen to share has been collected’ (Doteveryone, 2018: 5), and that ‘less than half of British adult internet users are aware that apps collect their location and information on their personal preferences’ (Ofcom, 2019: 14).

The main problem with this has been expressed by the programmer and activist Richard Stallman, in an interview with New York magazine (Kulwin, 2018): ‘Companies are collecting data about people. The data that is collected will be abused. That’s not an absolute certainty, but it’s a practical, extreme likelihood, which is enough to make collection a problem.’

The abuse that Stallman is referring to can come in a variety of forms. At the relatively trivial end is personalized advertising. Much more serious is the way that data aggregation companies scrape data from a variety of sources, building up individual data profiles which can be used to make significant, life-impacting decisions, such as final academic grades or whether one is offered a job, insurance or credit (Manolev et al., 2019). Cathy O’Neil’s (2016) best-selling ‘Weapons of Math Destruction’ spells out in detail how this abuse of data increases racial, gender and class inequalities. And after the revelations of Edward Snowden, we all know about the routine collection by states of huge amounts of data about, well, everyone. Whether it’s used for predictive policing, straightforward repression or something else, it is simply not possible for younger people, our students, to know what personal data they may regret divulging at a later date.

Digital educational providers may try to reassure us that they will keep data private, and not use it for advertising purposes, but the reassurances are hollow. These companies may change their terms and conditions further down the line, and examples exist of when this has happened (Moore, 2018: 210). But even if this does not happen, the data can never be secure. Illegal data breaches and cyber attacks are relentless, and education ranked worst at cybersecurity out of 17 major industries in one recent analysis (Foresman, 2018). One report suggests that one in five US schools and colleges have fallen victim to cyber-crime. Two weeks ago, I learnt (by chance, as I happened to be looking at my security settings on Chrome) that my passwords for Quizlet, Future Learn, Elsevier and Science Direct had been compromised by a data breach. To get a better understanding of the scale of data breaches, you might like to look at the UK’s IT Governance site, which lists detected and publicly disclosed data breaches and cyber attacks each month (36.6 million records breached in August 2020). If you scroll through the list, you’ll see how many of them are educational sites. You’ll also see a comment about how leaky organisations have been throughout lockdown … because they weren’t prepared for the sudden shift online.
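
Incidentally, Chrome can flag compromised passwords without revealing them because services like Have I Been Pwned support k-anonymity range queries: only the first five hex characters of the password’s SHA-1 hash ever leave your machine. Here is a minimal sketch of the client-side logic (the endpoint URL is real, but this sketch deliberately stops short of making the network call):

```python
import hashlib

def hibp_range_query(password):
    """Prepares a k-anonymity range query for the Have I Been Pwned API.

    Only the 5-character prefix of the SHA-1 hash is ever sent; the
    service returns all hash suffixes starting with that prefix, and
    the match is made locally, so the password itself is never revealed.
    """
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = digest[:5], digest[5:]
    # A real client would now GET
    #   https://api.pwnedpasswords.com/range/<prefix>
    # and check whether <suffix> appears in the response body.
    return prefix, suffix

prefix, suffix = hibp_range_query("password")
print(prefix)  # -> 5BAA6
```

The design choice is worth noting: because every query matches hundreds of real breached hashes, the service learns almost nothing about which password you were checking.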

Recent years have seen a growing consensus that ‘it is crucial for language teaching to […] encompass the digital literacies which are increasingly central to learners’ […] lives’ (Dudeney et al., 2013). Most of the focus has been on the skills that are needed to use digital media. There also appears to be growing interest in developing critical thinking skills in the context of digital media (e.g. Peachey, 2016) – identifying fake news and so on. To a much lesser extent, there has been some focus on ‘issues of digital identity, responsibility, safety and ethics when students use these technologies’ (Mavridi, 2020a: 172). Mavridi (2020b: 91) also briefly discusses the personal risks of digital footprints, but she does not have the space to explore more fully the notion of critical data literacy. This literacy involves an understanding of not just the personal risks of using ‘free’ educational apps and platforms, but of why they are ‘free’ in the first place. Sander (2020b) suggests that this literacy entails ‘an understanding of datafication, recognizing the risks and benefits of the growing prevalence of data collection, analytics, automation, and predictive systems, as well as being able to critically reflect upon these developments. This includes, but goes beyond the skills of, for example, changing one’s social media settings, and rather constitutes an altered view on the pervasive, structural, and systemic levels of changing big data systems in our datafied societies’.

In my next two posts, I will, first of all, explore in more detail the idea of critical data literacy, before suggesting a range of classroom resources.

(I posted about privacy in March 2014, when I looked at the connections between big data and personalized / adaptive learning. In another post, in September 2014, I looked at the claims of the CEO of Knewton, who bragged that his company had ‘five orders of magnitude more data about you than Google has. … We literally have more data about our students than any company has about anybody else about anything, and it’s not even close.’ You might find both of these posts interesting.)

References

Birch, K., Chiappetta, M. & Artyushina, A. (2020). ‘The problem of innovation in technoscientific capitalism: data rentiership and the policy implications of turning personal digital data into a private asset’ Policy Studies, 41:5, 468-487, DOI: 10.1080/01442872.2020.1748264

Cobo, C. (2019). I Accept the Terms and Conditions. https://adaptivelearninginelt.files.wordpress.com/2020/01/41acf-cd84b5_7a6e74f4592c460b8f34d1f69f2d5068.pdf

Doteveryone. (2018). People, Power and Technology: The 2018 Digital Attitudes Report. https://attitudes.doteveryone.org.uk

Dudeney, G., Hockly, N. & Pegrum, M. (2013). Digital Literacies. Harlow: Pearson Education

Foresman, B. (2018). Education ranked worst at cybersecurity out of 17 major industries. Edscoop, December 17, 2018. https://edscoop.com/education-ranked-worst-at-cybersecurity-out-of-17-major-industries/

Kulwin, K. (2018). ‘“F*ck Them. We Need a Law”: A Legendary Programmer Takes on Silicon Valley’ New York Intelligencer, 2018. https://nymag.com/intelligencer/2018/04/richard-stallman-rms-on-privacy-data-and-free-software.html

Manolev, J., Sullivan, A. & Slee, R. (2019). ‘Vast amounts of data about our children are being harvested and stored via apps used by schools’ EduResearch Matters, February 18, 2019. https://www.aare.edu.au/blog/?p=3712

Mavridi, S. (2020a). Fostering Students’ Digital Responsibility, Ethics and Safety Skills (Dress). In Mavridi, S. & Saumell, V. (Eds.) Digital Innovations and Research in Language Learning. Faversham, Kent: IATEFL. pp. 170 – 196

Mavridi, S. (2020b). Digital literacies and the new digital divide. In Mavridi, S. & Xerri, D. (Eds.) English for 21st Century Skills. Newbury, Berks.: Express Publishing. pp. 90 – 98

Moore, M. (2018). Democracy Hacked. London: Oneworld

Ofcom. (2019). Adults: Media use and attitudes report [Report]. https://www.ofcom.org.uk/__data/assets/pdf_file/0021/149124/adults-media-use-and-attitudes-report.pdf

O’Neil, C. (2016). Weapons of Math Destruction. London: Allen Lane

Peachey, N. (2016). Thinking Critically through Digital Media. http://peacheypublications.com/

Sadowski, J. (2019). ‘When data is capital: Datafication, accumulation, and extraction’ Big Data and Society 6 (1) https://doi.org/10.1177%2F2053951718820549

Sander, I. (2020a). What is critical big data literacy and how can it be implemented? Internet Policy Review, 9 (2). DOI: 10.14763/2020.2.1479 https://www.econstor.eu/bitstream/10419/218936/1/2020-2-1479.pdf

Sander, I. (2020b). Critical big data literacy tools—Engaging citizens and promoting empowered internet usage. Data & Policy, 2: e5 doi:10.1017/dap.2020.5

Williamson, B. (2019). ‘Killer Apps for the Classroom? Developing Critical Perspectives on ClassDojo and the ‘Ed-tech’ Industry’ Journal of Professional Learning, 2019 (Semester 2) https://cpl.asn.au/journal/semester-2-2019/killer-apps-for-the-classroom-developing-critical-perspectives-on-classdojo

Around 25 years ago, when I worked at International House London, I used to teach a course called ‘Current Trends in ELT’. I no longer have records of the time, so I can’t be 100% sure what was included in the course, but task-based learning, the ‘Lexical Approach’, the use of corpora, English as a Lingua Franca, learner autonomy / centredness, reflective practice and technology (CALL and CD-ROMs) were all probably part of it. I see that IH London still offers this course (next available course in January 2021) and I am struck by how similar the list of contents is. Only ‘emerging language’, CLIL, DOGME and motivation are clearly different from the menu of 25 years ago.

The term ‘current trends’ has always been a good hook to sell a product. Each year, any number of ELT conferences chooses it as their theme. Coursebooks, like ‘Cutting Edge’ or ‘Innovations’, suggest in their titles something fresh and appealing. And, since 2003, the British Council has used its English Language Teaching Innovation Awards to position itself as forward-thinking and innovative.

You could be forgiven for wondering what is especially innovative about many of the ELTon award-winners, or indeed, why neophilia actually matters at all. The problem, in a relatively limited world like language teaching, is that only so much innovation is either possible or desirable.

A year after the ELTons appeared, Adrian Underhill wrote an article about ‘Trends in English Language Teaching Today’. Almost ten years after I was teaching ‘current trends’, Adrian’s list included the use of corpora, English as a Lingua Franca, reflective practice and learner-centredness. His main guess was that practitioners would be working more with ‘the fuzzy, the unclear, the unfinished’. He hadn’t reckoned on the influence of the CEFR, Pearson’s Global Scale of English and our current obsession with measuring everything!

Jump just over ten years and Chia Suan Chong offered a listicle of ‘Ten innovations that have changed English language teaching’ for the British Council. Most of these were technological developments (platforms, online CPD, mobile learning) but a significant newcomer to the list was ‘soft skills’ (especially critical thinking).

Zooming forward nearer to the present, Chia then offered her list of ‘Ten trends and innovations in English language teaching for 2018’ in another post for the British Council. English as a Lingua Franca was still there, but gone were task-based learning and the ‘Lexical Approach’, corpora, learner-centredness and reflective practice. In their place came SpLNs, multi-literacies, inquiry-based learning and, above all, more about technology – platforms, mobile and blended learning, gamification.

I decided to explore current ‘current trends’ by taking a look at the last twelve months of blog posts from the four biggest UK ELT publishers. Posts such as these are interesting in two ways: (1) they are an attempt to capture what is perceived as ‘new’ and therefore more likely to attract clicks, and (2) they are also an attempt to set an agenda – they reflect what these commercial organisations would like us to be talking and thinking about. The posts reflect reasonably well the sorts of topics that are chosen for webinars, whether directly hosted or sponsored.

The most immediate and unsurprising observation is that technology is ubiquitous. No longer one among a number of topics, technology now informs (almost) all other topics. Before I draw a few conclusions, here are some more detailed notes.

Pearson English blog

Along with other publishers, Pearson were keen to show how supportive of teachers they were, and the months following the onset of the pandemic saw more blog posts than usual that did not focus on particular Pearson products. Over the last twelve months as a whole, Pearson made strenuous efforts to draw attention to their Global Scale of English and the Pearson Test of English. Assessment of one kind or another was never far away. But the other big themes of the last twelve months have been ‘soft / 21st century skills’ (creativity, critical thinking, collaboration, leadership, etc.) and aspects of social and emotional learning (especially engagement / motivation, anxiety and mindfulness). Three other topics also featured more than once: mediation, personalization and SpLN (dyslexia).

OUP ELT Global blog

The OUP blog has, on the whole, longer and rather more informative posts than Pearson’s. They also tend to be less obviously product-oriented, and fewer are written by in-house marketing people. The main message that comes across is the putative importance of ‘soft / 21st century skills’, which Oxford likes to call ‘global skills’ (along with the assessment of these skills). One post manages to pack three buzzwords into one title: ‘Global Skills – Create Empowered 21st Century Learners’. As with Pearson, ‘engagement / engaging’ is probably the most over-used word of the last twelve months. In the social and emotional area, OUP focuses on teacher well-being, rather than mindfulness (although, of course, mindfulness is a path to this well-being). There is also an interest in inquiry-based learning, literacies (digital and assessment), formative assessment and blended learning.

Macmillan English blog

The Macmillan English ‘Advancing Learning’ blog is a much less corporate beast than the Pearson and OUP blogs. There have been relatively few posts in the last twelve months, and no clear message emerges. The last year has seen posts on the Image Conference, preparing for IELTS, student retention, extensive reading, ELF pronunciation, drama, mindfulness, Zoom, EMI, and collaboration skills.

CUP World of Better Learning blog

The CUP blog, like Macmillan’s, is an eclectic affair, with no clearly discernible messages beyond supporting teachers with tips and tools to deal with the shift to online teaching. Motivation and engagement are fairly prominent (with Sarah Mercer contributing both here and at OUP). Well-being (and the inevitable nod to mindfulness) gets a look-in. Other topics include SpLNs, video and ELF pronunciation (with Laura Patsko contributing both here and at the Macmillan site).

Macro trends

My survey has certainly not been ‘scientific’, but I think it allows us to note a few macro-trends. Here are my thoughts:

  • Measurement of language and skills (both learning and teaching skills) has become central to many of our current concerns.
  • We are now much less interested in issues which are unique to language learning and teaching (e.g. task-based learning, the ‘Lexical Approach’, corpora) than we used to be.
  • Current concerns reflect much more closely the major concerns of general education (measurement, 21st century skills, social-emotional learning) than they used to. It is no coincidence that these reflect the priorities of those who shape global educational policy (OECD, World Bank, etc.).
  • 25 years ago, current trends were more like zones of interest. They were areas to explore, research and critique further. As such, we might think of them as areas of exploratory practice (‘Exploratory Practice’ itself was a ‘current trend’ in the mid 1990s). Current ‘current trends’ are much more enshrined. They are things to be implemented, and exploration of them concerns the ‘how’, not the ‘whether’.