Posts Tagged ‘publishers’

Who can tell where a blog post might lead? Over six years ago I wrote about adaptive professional development for teachers. I imagined the possibility of bite-sized, personalized CPD material. Now my vision is becoming real.

For the last two years, I have been working with a start-up that uses GPT-3 large language models to generate text. GPT-3 has recently been in the news because of the phenomenal success of the newly released ChatGPT. The technology certainly has a wow factor, but it has been around for a while now. ChatGPT can generate texts of various genres on any topic (with a few exceptions, like current affairs) and the results are impressive. Imagine, then, how much more impressive the results can be when the kind of text is limited by genre and topic, allowing the software to be trained much more reliably.

This is what we have been working on. We took as our training corpus a huge collection of English language teaching teacher development texts that we could access online: blogs from all the major publishers, personal blogs, transcriptions from recorded conference presentations and webinars, magazine articles directed at teachers, along with books from publishers such as DELTA and Pavilion ELT, etc. We identified topics that seemed to be of current interest and asked our AI to generate blog posts. Later, we were able to get suggestions of topics from the software itself.

We then contacted a number of teachers and trainers who contribute to the publishers’ blogs and contracted them, first, to act as human trainers for the software, and, second, to agree to their names being used as the ‘authors’ of the blog posts we generated. In one or two cases, the authors thought that they had actually written the posts themselves! Next we submitted these posts to the marketing departments of the publishers (who run the blogs). Over twenty were submitted in this way, including:

  • What do teachers need to know about teaching 21st century skills in the English classroom?
  • 5 top ways of improving the well-being of English teachers
  • Teaching leadership skills in the primary English classroom
  • How can we promote eco-literacy in the English classroom?
  • My 10 favourite apps for English language learners

We couldn’t, of course, tell the companies that AI had been used to write the copy, but once we were sure that nobody had ever spotted the true authorship of this material, we were ready to move to the next stage of the project. We approached the marketing executives of two publishers and showed how we could generate teacher development material at a fraction of the current cost and in a fraction of the time. Partnerships were quickly signed.

Blog posts were just the beginning. We knew that we could use the same technology to produce webinar scripts, using learning design insights to optimise the webinars. The challenge we faced was that webinars need a presenter. We experimented with using animations, but feedback indicated that participants like to see a face. This is eminently doable, using our contracted authors and deep fake technology, but costs are still prohibitive. It remains cheaper and easier to use our authors delivering the scripts we have generated. This will no doubt change before too long.

The next obvious step was to personalize the development material. Large publishers collect huge amounts of data about visitors to their sites using embedded pixels. It is also relatively cheap and easy to triangulate this data with information from the customer databases and from activity on social media (especially Facebook). We know what kinds of classes people teach, and we know which aspects of teacher development they are interested in.
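For what it’s worth, the ‘triangulation’ step is technically trivial. Here is a purely hypothetical sketch of joining anonymised web-tracking events to a customer database on a shared key; all field names and values are invented, and no real tracking library is assumed:

```python
# Toy illustration of merging pixel-tracking events with CRM records
# on a shared key (a hashed email address). Everything here is invented.

from collections import defaultdict

def triangulate(pixel_events, crm_records, key="email_hash"):
    """Return one profile per visitor, merging site activity with CRM data."""
    profiles = defaultdict(lambda: {"pages_viewed": [], "crm": None})
    for event in pixel_events:
        profiles[event[key]]["pages_viewed"].append(event["page"])
    for record in crm_records:
        profiles[record[key]]["crm"] = record
    return dict(profiles)

pixel_events = [
    {"email_hash": "a1b2", "page": "/blog/well-being-for-teachers"},
    {"email_hash": "a1b2", "page": "/webinars/eco-literacy"},
]
crm_records = [{"email_hash": "a1b2", "segment": "primary", "country": "MX"}]

merged = triangulate(pixel_events, crm_records)
print(merged["a1b2"]["pages_viewed"])  # both pages visited by this person
```

The point is not the code, which any intern could write, but the ease with which separately collected datasets become a single profile of a teacher’s interests.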

Publishers have long been interested in personalizing marketing material, and the possibility of extending this to the delivery of real development content is clearly exciting. (See below an email I received this week from the good folks at OUP marketing.)

Earlier this year one of our publishing partners began sending links to personalized materials of the kind we were able to produce with AI. The experiment was such a success that we have already taken it one stage further.

One of the most important clients of our main publishing partner employs hundreds of teachers to deliver online English classes using courseware that has been tailored to the needs of the institution. With so many freelance teachers working for them, along with high turnover of staff, there is inevitably a pressing need for teacher training to ensure optimal delivery. Since the classes are all online, it is possible to capture precisely what is going on. Using an AI-driven tool that was inspired by the Visible Classroom app (informed by the work of John Hattie), we can identify the developmental needs of the teachers. What kinds of activities are they using? How well do they exploit the functionalities of the platform? What can be said about the quality of their teacher talk? We combine this data with everything else and our proprietary algorithms determine what kinds of training materials each teacher receives. It doesn’t stop there. We can also now evaluate the effectiveness of these materials by analysing the learning outcomes of the students.
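Since the algorithms are described only as ‘proprietary’, here is a purely hypothetical, rule-based sketch of what the material-selection step might look like. Every metric name, threshold and module name below is invented for illustration:

```python
# A toy, rule-based version of "determine what kinds of training
# materials each teacher receives". All names and thresholds invented.

def recommend_training(metrics):
    """Map observed classroom metrics to a list of training modules."""
    modules = []
    if metrics.get("teacher_talk_ratio", 0) > 0.7:
        modules.append("reducing-teacher-talk")
    if metrics.get("platform_features_used", 0) < 3:
        modules.append("platform-basics")
    if metrics.get("pair_work_minutes", 0) < 5:
        modules.append("interaction-patterns")
    return modules or ["no-action-needed"]

print(recommend_training(
    {"teacher_talk_ratio": 0.85, "platform_features_used": 2,
     "pair_work_minutes": 12}
))
# A high talk ratio and low feature use trigger two modules here.
```

A real system would presumably be statistical rather than rule-based, but the logic, observed metrics in, prescribed materials out, is the same.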

Teaching efficacy can be massively increased, whilst the training budget of the institution can be slashed. If all goes well, there will be no further need for teacher trainers at all. We won’t be stopping there. If results such as these can be achieved in teacher training, there’s no reason why the same technology cannot be leveraged for the teaching itself. Most of our partner’s teaching and testing materials are now quickly and very cheaply generated using GPT-3.5. If you want to see how this is done, check out the work of edugo.AI (a free trial is available), which can generate gapfills and comprehension test questions in a flash. As for replacing the teachers, we’re getting there. For the time being, though, it’s more cost-effective to use freelancers and to train them up.

There’s a wonderful promotional video for Pearson English https://www.youtube.com/watch?v=b6o1s8U88N8 that packs as many clichéd slogans and images into one minute as you could ever wish for. Here’s the script:

Great things happen when you dare to dream / Learning is a journey filled with challenge, wonder and discovery / Educators not only inspire the future they also define it / We partner with the learning community to change futures / It’s our passion / Together we can inspire / Together we can empower / Together we can achieve / Change is happening all around us, faster than ever / Let’s empower change / There’s an exciting future ahead / Expect great things / Pearson English / Dare to learn, dare to change / Pearson always learning

How futures will be changed, what exactly can be inspired or empowered, what great things we can expect, what we might dare to learn or change – all these minor details are left unanswered. It is a wonderful example of advertising language, aka bullshit, defined by philosopher Harry Frankfurt (2005) as discourse that is only intended to persuade, without any concern for truth. It’s language where meanings are not shared, but where emotional responses are desired.

Pearson refers to its slogan ‘Always learning’ as their ‘brand tagline’. It is, they say, ‘Short, bold, and memorable, “Always Learning” encapsulates our learners and ourselves. It highlights Pearson’s commitment to constantly be discovering, learning and improving ourselves. And it describes what we enable our learners to do–to keep learning, whenever, wherever and however it suits them, throughout every stage of their lives’. The company provides detailed advice to its employees about how the slogan can be used: when, where, when not, colour combinations, good and bad examples of use, translations, etc. All of this makes for fascinating reading, which, strangely, is available online (at least, for the time being!).

Bullshit is a wise approach in advertising ELT products. If you get too specific / meaningful, you run the risk of coming out with bullshit of the non-philosophical kind. Macmillan English, for example, has the new slogan ‘Advancing learning’ and says: As technology opens new doors for teachers and students, we use our expertise to create products that suit different learning styles and design innovative new tools for teachers and students.

With ELT conference season getting underway in some parts of the world, slogans, clichés and buzzwords are vying for our attention in the marketing of these events. There are ELT conferences of a commercial, predatory kind (‘guaranteed publication of your work in the conference proceedings’) where the slogans are clearly bullshit (in the philosophical sense). The upcoming ‘4th International Conference on Modern Research in Education, Teaching and Learning’ (22 – 24 April in Barcelona) has the marvellous slogan ‘The only of all English language teaching conference’ and can be attended for only €320 (much cheaper if you just want to listen without presenting).

But for conferences organised by teachers’ associations, it would be inaccurate and inappropriate to describe their choice of slogans as bullshit. This doesn’t mean, however (as an entertaining blog post at ELT Rants, Reviews and Reflections in 2015 described them), that they are not ‘buzzword-heavy word salads [that] are rinsed, re-used, and repeated ad nauseum’. Here’s a small current selection for you to take your pick from. The resemblance, in many cases, to the language of the Pearson promo video is striking.

ELT in the digital era and beyond: innovation, engagement, and resilience (ThaiTESOL)

The hybrid transition: emotional, social and educational impacts on language learning (TESOL Kuwait)

Connecting teachers, empowering learners (BBELT)

Innovating changes: a world of diversity (TESOL Spain)

Translanguaging and multilingualism in language teaching (TESOL Arabia)

Inspiring collaboration (BELTA)

For me, the standout slogan is definitely TESOL Arabia, since it is the only one that seems to be about something specific. But perhaps I’m wrong. Both translanguaging and multilingualism can mean quite a few different things. When you put the terms together, my thoughts turn first to questions of social justice, and the idea of a conference in which social justice is discussed in the Hyatt Regency hotel in Dubai is fairly surreal. As in most of these examples, conferences for ELT teachers tend to opt for broad themes which aim to include almost everyone in the field (Raza, 2018) and will usually index innovation, excellence, empowerment, and / or wellbeing.

A good slogan will include words that are themselves sloganized (Schmenk et al., 2019). ‘Innovation’ and ‘empowering’ are good examples here. Neither can truly be understood without familiarity with extensive co-texts which confer connotational meaning and rhetorical force. ‘Change’ (for ‘innovation’) and ‘helping’ (for ‘empowering’) don’t quite have the same heft, even though they basically mean the same.

It’s important that buzzwords don’t mean too much, but the ‘key processing features of successful slogans are simplicity, memorability and emotionality’ (Pavlenko, 2019: 146). By ‘emotionality’, Pavlenko means words that carry an upbeat / positive message. In this sense, TESOL Kuwait’s ‘emotional, social and educational impacts’ all sound rather neutral and academic. I think that ‘engagement, diversity and outcomes’ might resonate better. Similarly, ‘hybrid’ still needs to shake off some negative associations: ‘digital’ sounds more positively modern. Hats off to ThaiTESOL, whose ‘the digital era and beyond’ sounds positively visionary.

Even though slogans shouldn’t mean too much, they only work as slogans ‘as if their meaning were obvious’ (Schmenk et al., 2019: 4). In their exploration of sloganization in language education discourse, Schmenk et al. (2019) look at ‘communicative language teaching’, ‘learner autonomy’, ‘innovation’, ‘multiple intelligences’, ‘intercultural / transcultural language learning’, ‘input’, ‘language commodification’ and ‘superdiversity’. In this blog, I’ve considered ‘innovation’, ‘resilience’, ‘translanguaging’ and ‘multilingualism’, among others. These buzzwords come and go – the field of language teaching is as keen on current trends as any other field – and they can usually be grouped into broader trends, which academics like to call ‘turns’. There’s the ‘social turn’ (Block, 2003), the ‘intercultural turn’ (Thorne, 2010), the ‘multilingual turn’ (May, 2013), the ‘critical turn’ (Dasli & Díaz, 2017), the ‘emotional turn’ (White, 2018), and these are just for starters. If you’re quick, you won’t be too late to register for the 2nd International Conference on Linguistic, Literary and Pedagogical Turns at the University of Wah. The conference doesn’t have a slogan, but my suggestion would be ‘The Turn Turn’.

Schmenk et al. (2019: 3) note that language education is an inherently interdisciplinary field, so it is not surprising to find so many of its current trends drawn from other disciplines. This has not always been the case. If we go back 30 / 40 years, the hot topics included corpora, task-based learning, and lexical approaches. Now, in the choice of slogans, ELT conferences are not dissimilar from other professional conferences in sales and marketing, management and leadership – see for example this website offering advice about organising such events.

Slogans and buzzwords are, of course, a marketing tool for ELT conferences and publishers, but they also play an important role in academic branding – the personal brand you construct for yourself as an academic. Aneta Pavlenko (2019: 148 – 151) offers a useful set of strategies for this kind of academic branding, but similar strategies can also be used by ELT freelancers:

  • Adopt a slogan / buzzword (simple, memorable and positive)
  • Link it to your work (easiest if it was either your idea in the first place or you were one of the first to import the idea into language education)
  • Institutionalize the slogan by organising conferences, courses, journals, supervising dissertations, and so on
  • Recycle the slogan endlessly (especially in the titles of publications)
  • Keep things pretty vague so you can’t be criticised too much
  • Frame the phenomenon in question with words like ‘radical’, ‘unprecedented’, ‘hugely complex’, ‘tremendously important’

Quoting the work of Michael Billig (2013), Pavlenko (2019: 160) suggests that we should not necessarily be asking ourselves what slogans and buzzwords mean. A better question is: what is the person who is using these words attempting to do with them?

My favourite ELT slogan is an anti-slogan slogan. It is Bo Cai Zhong Chang (‘assimilating merits of different teaching approaches for our own use’) which was used in China to advocate for a ‘methodological approach appropriate to the specific sociopolitical realities of the country’ (Feng & Feng, 2001). China has a long history of powerful slogans, of course, with ‘Dare to think, dare to act’ being the key phrase during the Great Leap Forward. Did the people at Pearson have this in mind when they came up with ‘Dare to learn, dare to change’?

References

Billig, M. (2013) Learn to Write Badly: How to Succeed in the Social Sciences. Cambridge: Cambridge University Press

Block, D. (2003) The Social Turn in Second Language Acquisition. Edinburgh: Edinburgh University Press

Dasli, M. & Díaz, A. R. (Eds.) (2017) The Critical Turn in Language and Intercultural Communication Pedagogy. New York: Routledge

Feng, A. & Feng, A. (2001) ‘Bo Cai Zhong Chang’ – A slogan for effective ELT methodology for College English education. English Language Teaching, 1: 1 – 22

Frankfurt, H. G. (2005) On Bullshit. Princeton: Princeton University Press

May, S. (Ed.) (2013) The Multilingual Turn: Implications for SLA, TESOL and Bilingual Education. New York: Routledge

Pavlenko, A. (2019) Superdiversity and Why It Isn’t: Reflections on Terminological Innovation and Academic Branding. In Schmenk, B., Breidbach, S. & Küster, L. (Eds.) Sloganization in Language Education Discourse. Bristol: Multilingual Matters. pp. 142 – 168

Raza, K. (2018) The Alignment of English Language Teacher Association Conference Themes to Research Agendas: An Investigation of TESOL International Association and IATEFL. In A. Elsheikh et al. (Eds.), The Role of Language Teacher Associations in Professional Development, Second Language Learning and Teaching. Cham: Springer pp. 117 – 129

Schmenk, B., Breidbach, S. & Küster, L. (Eds.) (2019) Sloganization in Language Education Discourse. Bristol: Multilingual Matters

Thorne, S. L. (2010) The ‘Intercultural Turn’ and Language Learning in the Crucible of the New Media. In Helm, F. & Guth, S. (Eds.) Telecollaboration 2.0 for Language and Intercultural Learning. Bern: Peter Lang. pp. 139 – 164

White, C. J. (2018) The Emotional Turn in Applied Linguistics and TESOL: Significance, Challenges and Prospects. In Martínez Agudo, J. (Ed.) Emotions in Second Language Teaching. Cham: Springer

Back in December 2013, in an interview with eltjam, David Liu, COO of the adaptive learning company Knewton, described how his company’s data analysis could help ELT publishers ‘create more effective learning materials’. He focused on what he calls ‘content efficacy[i]’ (he uses the word ‘efficacy’ five times in the interview), a term which he explains below:

A good example is when we look at the knowledge graph of our partners, which is a map of how concepts relate to other concepts and prerequisites within their product. There may be two or three prerequisites identified in a knowledge graph that a student needs to learn in order to understand a next concept. And when we have hundreds of thousands of students progressing through a course, we begin to understand the efficacy of those said prerequisites, which quite frankly were made by an author or set of authors. In most cases they’re quite good because these authors are actually good in what they do. But in a lot of cases we may find that one of those prerequisites actually is not necessary, and not proven to be useful in achieving true learning or understanding of the current concept that you’re trying to learn. This is interesting information that can be brought back to the publisher as they do revisions, as they actually begin to look at the content as a whole.

One commenter on the post, Tom Ewens, found the idea interesting. It could, potentially, he wrote, give us new insights into how languages are learned, much in the same way as corpora have given us new insights into how language is used. Did Knewton have any plans to disseminate the information publicly, he asked. His question remains unanswered.

At the time, Knewton had just raised $51 million (bringing their total venture capital funding to over $105 million). Now, 16 months later, Knewton have launched their new product, which they are calling Knewton Content Insights. They describe it as the world’s first and only web-based engine to automatically extract statistics comparing the relative quality of content items — enabling us to infer more information about student proficiency and content performance than ever before possible.

The software analyses particular exercises within the learning content (and particular items within them). It measures the relative difficulty of individual items by, for example, analysing how often a question is answered incorrectly and how many tries it takes each student to answer correctly. It also looks at what they call ‘exhaustion’ – how much content students are using in a particular area – and whether they run out of content. The software can correlate difficulty with exhaustion. Lastly, it analyses what they call ‘assessment quality’ – how well individual questions assess a student’s understanding of a topic.
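Nothing exotic is needed to produce statistics of this kind. A minimal sketch, assuming a simple response log (Knewton’s actual method is not public, and the field names below are invented):

```python
# Computing per-item difficulty statistics from raw response logs:
# error rate and mean number of attempts per item.

from collections import defaultdict

def item_stats(responses):
    """responses: list of (item_id, correct, attempts) tuples."""
    by_item = defaultdict(list)
    for item_id, correct, attempts in responses:
        by_item[item_id].append((correct, attempts))
    stats = {}
    for item_id, rows in by_item.items():
        n = len(rows)
        stats[item_id] = {
            "error_rate": sum(1 for c, _ in rows if not c) / n,
            "mean_attempts": sum(a for _, a in rows) / n,
        }
    return stats

log = [("q1", True, 1), ("q1", False, 3), ("q2", True, 1), ("q2", True, 2)]
print(item_stats(log))
# q1 comes out harder: half the students got it wrong, and it took
# more attempts on average to answer correctly.
```

With hundreds of thousands of students, the counting is straightforward; whether the resulting numbers tell us anything useful about language learning is the question the rest of this post addresses.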

Knewton’s approach is premised on the idea that learning (in this case language learning) can be broken down into knowledge graphs, in which the information that needs to be learned can be arranged and presented hierarchically. The ‘granular’ concepts are then ‘delivered’ to the learner, and Knewton’s software can optimise the delivery. The first problem, as I explored in a previous post, is that language is a messy, complex system: it doesn’t lend itself terribly well to granularisation. The second problem is that language learning does not proceed in a linear, hierarchical way: it is also messy and complex. The third is that ‘language learning content’ cannot simply be delivered: a process of mediation is unavoidable. Are the people at Knewton unaware of the extensive literature devoted to the differences between synthetic and analytic syllabuses, of the differences between product-oriented and process-oriented approaches? It would seem so.
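To make the object of the critique concrete: a knowledge graph is just a set of concepts with prerequisite links, which software can flatten into a linear ‘delivery’ order. A minimal illustration, with invented grammar concepts:

```python
# A toy knowledge graph: each concept maps to its prerequisites.
# A topological sort flattens it into a hierarchical delivery order.

from graphlib import TopologicalSorter

prerequisites = {
    "present perfect": {"past participles", "present simple"},
    "past participles": {"regular past forms"},
    "present simple": set(),
    "regular past forms": set(),
}

order = list(TopologicalSorter(prerequisites).static_order())
print(order)  # prerequisites always precede the concepts that need them
```

The tidiness of this structure is exactly the problem: language is not a directed acyclic graph, and learners do not acquire the present perfect the way this ordering implies.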

Knewton’s ‘Content Insights’ can only, at best, provide some sort of insight into the ‘language knowledge’ part of any learning content. It can say nothing about the work that learners do to practise language skills, since these are not susceptible to granularisation: you simply can’t take a piece of material that focuses on reading or listening and analyse its ‘content efficacy at the concept level’. Because of this, I predicted (in the post about Knowledge Graphs) that the likely focus of Knewton’s analytics would be discrete item, sentence-level grammar (typically tenses). It turns out that I was right.

Knewton illustrate their new product with screen shots such as those below.

[Screenshot: Content-Insight-Assessment-1]

[Screenshot: Content-Insight-Exhaustion-1]
They give a specific example of the sort of questions their software can answer. It is: do students generally find the present simple tense easier to understand than the present perfect tense? Doh!

Knewton Content Insights may well optimise the presentation of this kind of grammar, but optimising its presentation and practice is highly unlikely to have any impact on the rate of language acquisition. Students are typically required to study the present perfect at every level from ‘elementary’ upwards, and this is not because the presentation in, say, Headway, is unoptimised. What they need is to spend a significantly greater proportion of their time on ‘language use’ and less on ‘language knowledge’. This is not just my personal view: it has been extensively researched, and I am unaware of any dissenting voices.

The number-crunching in Knewton Content Insights is unlikely, therefore, to lead to any actionable insights. It is, however, very likely to lead (as writer colleagues at Pearson and other publishers are finding out) to an obsession with measuring the ‘efficacy’ of material which, quite simply, cannot meaningfully be measured in this way. It is likely to distract from much more pressing issues, notably the question of how we can move further and faster away from peddling sentence-level, discrete-item grammar.

In the long run, it is reasonable to predict that the attempt to optimise the delivery of language knowledge will come to be seen as an attempt to tackle the wrong question. It will make no significant difference to language learners and language learning. In the short term, how much time and money will be wasted?

[i] ‘Efficacy’ is the buzzword around which Pearson has built its materials creation strategy, a strategy which was launched around the same time as this interview. Pearson is a major investor in Knewton.

‘Sticky’ – as in ‘sticky learning’ or ‘sticky content’ (as opposed to ‘sticky fingers’ or a ‘sticky problem’) – is itself fast becoming a sticky word. If you check out ‘sticky learning’ on Google Trends, you’ll see that it suddenly spiked in September 2011, following the slightly earlier appearance of ‘sticky content’. The historical rise in this use of the word coincides with the exponential growth in the number of references to ‘big data’.

I am often asked if adaptive learning really will take off as a big thing in language learning. Will adaptivity itself be a sticky idea? When the question is asked, people mean the big data variety of adaptive learning, rather than the much more limited adaptivity of spaced repetition algorithms, which, I think, is firmly here and here to stay. I can’t answer the question with any confidence, but I recently came across a book which suggests a useful way of approaching the question.
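For comparison, the ‘much more limited adaptivity’ of spaced repetition can be sketched in a few lines. Here is a Leitner-style scheduler; the intervals are illustrative, not taken from any particular product:

```python
# A Leitner-style spaced-repetition scheduler: cards move up a "box"
# when remembered and drop back to box 0 when forgotten.

INTERVALS = [1, 3, 7, 16, 35]  # days until next review, per box

def review(card, remembered):
    """Update a card's box and next review interval after one review."""
    card["box"] = min(card["box"] + 1, len(INTERVALS) - 1) if remembered else 0
    card["next_review_in"] = INTERVALS[card["box"]]
    return card

card = {"word": "sloganization", "box": 0, "next_review_in": 1}
card = review(card, remembered=True)
card = review(card, remembered=True)
print(card)  # two successes: box 2, next review in 7 days
card = review(card, remembered=False)
print(card)  # one failure: back to box 0, review again tomorrow
```

The adaptivity here is entirely item-level and mechanical, which is precisely why it is uncontroversial in a way that big data adaptivity is not.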

‘From the Ivory Tower to the Schoolhouse’ by Jack Schneider (Harvard Education Press, 2014) investigates the reasons why promising ideas from education research fail to get taken up by practitioners, and why other, less-than-promising ideas, from a research or theoretical perspective, become sticky quite quickly. As an example of the former, Schneider considers Robert Sternberg’s ‘Triarchic Theory’. As an example of the latter, he devotes a chapter to Howard Gardner’s ‘Multiple Intelligences Theory’.

Schneider argues that educational ideas need to possess four key attributes in order for teachers to sit up, take notice and adopt them.

  1. perceived significance: the idea must answer a question central to the profession – offering a big-picture understanding rather than merely one small piece of a larger puzzle
  2. philosophical compatibility: the idea must clearly jibe with closely held [teacher] beliefs like the idea that teachers are professionals, or that all children can learn
  3. occupational realism: it must be possible for the idea to be put easily into immediate use
  4. transportability: the idea needs to find its practical expression in a form that teachers can access and use at the time that they need it – it needs to have a simple core that can travel through pre-service coursework, professional development seminars, independent study and peer networks

To what extent does big data adaptive learning possess these attributes? It certainly comes up trumps with respect to perceived significance. The big question that it attempts to answer is the question of how we can make language learning personalized / differentiated / individualised. As its advocates never cease to remind us, adaptive learning holds out the promise of moving away from a one-size-fits-all approach. The extent to which it can keep this promise is another matter, of course. For it to do so, it will never be enough just to offer different pathways through a digitalised coursebook (or its equivalent). Much, much more content will be needed: at least five or six times the content of a one-size-fits-all coursebook. At the moment, there is little evidence of the necessary investment into content being made (quite the opposite, in fact), but the idea remains powerful nevertheless.

When it comes to philosophical compatibility, adaptive learning begins to run into difficulties. Despite the decades of edging towards more communicative approaches in language teaching, research (e.g. the research into English teaching in Turkey described in a previous post) suggests that teachers still see explanation and explication as key functions of their jobs. They believe that they know their students best and they know what is best for them. Big data adaptive learning challenges these beliefs head on. It is no doubt for this reason that companies like Knewton make such a point of claiming that their technology is there to help teachers. But Jose Ferreira doth protest too much, methinks. Platform-delivered adaptive learning is a direct threat to teachers’ professionalism, their salaries and their jobs.

Occupational realism is more problematic still. Very, very few language teachers around the world have any experience of truly blended learning, and it’s very difficult to envisage precisely what it is that the teacher should be doing in a classroom. Publishers moving towards larger-scale blended adaptive materials know that this is a big problem, and are actively looking at ways of packaging teacher training / teacher development (with a specific focus on blended contexts) into the learner-facing materials that they sell. But the problem won’t go away. Education ministries have a long history of throwing money at technological ‘solutions’ without thinking about obtaining the necessary buy-in from their employees. It is safe to predict that this is something that is unlikely to change. Moreover, learning how to become a blended teacher is much harder than learning, say, how to make good use of an interactive whiteboard. Since there are as many different blended adaptive approaches as there are different educational contexts, there cannot be (irony of ironies) a one-size-fits-all approach to training teachers to make good use of this software.

Finally, how transportable is big data adaptive learning? Not very, is the short answer, and for the same reasons that ‘occupational realism’ is highly problematic.

Looking at things through Jack Schneider’s lens, we might be tempted to come to the conclusion that the future for adaptive learning is a rocky path, at best. But Schneider doesn’t take political or economic considerations into account. Sternberg’s ‘Triarchic Theory’ never had the OECD or the Gates Foundation backing it up. It never had millions and millions of dollars of investment behind it. As we know from political elections (and the big data adaptive learning issue is a profoundly political one), big bucks can buy opinions.

It may also prove to be the case that the opinions of teachers don’t actually matter much. If the big adaptive bucks can win the educational debate at the highest policy-making levels, teachers will be the first victims of the ‘creative disruption’ that adaptivity promises. If you don’t believe me, just look at what is going on in the U.S.

There are causes for concern, but I don’t want to sound too alarmist. Nobody really has a clue whether big data adaptivity will actually work in language learning terms. It remains more of a theory than a research-endorsed practice. And to end on a positive note, regardless of how sticky it proves to be, it might just provide the shot-in-the-arm realisation that language teachers, at their best, are a lot more than competent explainers of grammar or deliverers of gap-fills.

FluentU, busuu, Bliu Bliu … what is it with all the ‘u’s? Hong Kong-based FluentU used to be called FluentFlix, but they changed their name a while back. The service for English learners is relatively new. Before that, they focused on Chinese, where the competition is much less fierce.

At the core of FluentU is a collection of short YouTube videos, which are sorted into 6 levels and grouped into 7 topic categories. The videos are accompanied by transcriptions. As learners watch a video, they can click on any word in the transcript. This will temporarily freeze the video and show a pop-up which offers a definition of the word, information about part of speech, a couple of examples of this word in other sentences, and more example sentences of the word from other videos that are linked on FluentU. These can, in turn, be clicked on to bring up a video collage of these sentences. Learners can click on an ‘Add to Vocab’ button, which will add the word to personalised vocabulary lists. These are later studied through spaced repetition.
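The cross-video example feature amounts to an inverted index from words to transcript sentences. FluentU’s actual implementation is, of course, not public; here is a deliberately naive sketch of the idea:

```python
# Building an inverted index from words to the transcript sentences
# (and videos) they occur in. Tokenisation here is deliberately crude.

from collections import defaultdict

def build_index(transcripts):
    """transcripts: {video_id: [sentence, ...]} -> {word: [(video_id, sentence)]}"""
    index = defaultdict(list)
    for video_id, sentences in transcripts.items():
        for sentence in sentences:
            for word in set(sentence.lower().rstrip(".?!").split()):
                index[word].append((video_id, sentence))
    return index

transcripts = {
    "vid1": ["I am at your service.", "Let me figure out the answer."],
    "vid2": ["Can you figure out the time?"],
}
index = build_index(transcripts)
print(index["figure"])  # the same word, pulled from both videos
```

A production version would need lemmatisation and chunk handling (‘figure out’ as a unit), which is where, as we’ll see below, the product’s accuracy starts to wobble.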

FluentU describes its approach in the following terms: FluentU selects the best authentic video content from the web, and provides the scaffolding and support necessary to bring that authentic content within reach for your students. It seems appropriate, therefore, to look first at the nature of that content. At the moment, there appear to be just under 1,000 clips which are allocated to levels as follows:

  • Newbie: 123
  • Elementary: 138
  • Intermediate: 294
  • Upper Intermediate: 274
  • Advanced: 111
  • Native: 40

It has to be assumed that the amount of content will continue to grow but, for the time being, it’s not unreasonable to say that there isn’t a lot there. I looked at the Upper Intermediate level, where the shortest clip was 32 seconds long and the longest 4 minutes 34 seconds, but most were between 1 and 2 minutes. That means there is the equivalent of about 400 minutes (say, 7 hours) of content at this level.

The amount that anyone would actually want to watch or study shrinks further when the topics are considered. These break down as follows:

  • Arts & entertainment: 105
  • Business: 34
  • Culture: 29
  • Everyday life: 60
  • Health & lifestyle: 28
  • Politics & society: 6
  • Science & tech: 17

The screenshots below give an idea of the videos on offer:


I may be a little difficult to please, but there wasn’t much here that appealed. Forget the movie trailers for crap movies, for a start. Forget the low-level business stuff, too. ‘The History of New Year’s Resolutions’ looked promising, but turned out to be a Wikipedia-style piece. FluentU certainly doesn’t have the eye for interesting, original video content of someone like Jamie Keddie or Kieran Donaghy.

But, perhaps, the underwhelming content is of less importance than what you do with it. After all, if you’re really interested in content, you can just go to YouTube and struggle through the transcriptions on your own. The transcripts can be downloaded as pdfs, which, strangely, are marked with a FluentU copyright notice. FluentU doesn’t need to own the copyright of the videos, because they just provide links, but claiming copyright for someone else’s script seemed questionable to me. Anyway, the only real reason to be on this site is to learn some vocabulary. How well does it perform?


Level is self-selected. It wasn’t entirely clear how videos had been allocated to levels, but I didn’t find any major discrepancies between FluentU’s allocation and my own, intuitive grading of the content. When I clicked on words in the transcript, the look-up / dictionary function wasn’t too bad, compared to some competing products I have looked at. The system could deal with some chunks and phrases (e.g. at your service, figure out) and the definitions were appropriate to the way these had been used in context. The accuracy was far from consistent, though. Some definitions were harder than the word they were explaining (e.g. telephone = an instrument used to call someone) and some were plain silly (e.g. the definition of I is me).

Some chunks were not recognised, so their definitions were amusingly wonky. Come out, get through and have been were all wrong. For the phrase talk her into it, the program didn’t recognise the phrasal verb, and offered me communicate using speech for talk, and to the condition, state or form of for into.
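The pattern of errors is revealing: a lookup that only matches contiguous strings of words can find figure out but can never find the separated phrasal verb in talk her into it. The sketch below is my own illustration with a toy phrase list (it does not describe FluentU’s actual code); it shows the difference a gap-tolerant matcher would make:

```python
# Why word-by-word lookup gets "talk her into it" wrong: the phrasal verb is
# discontinuous, so the matcher must allow a gap between verb and particle.
# Illustrative sketch only; FluentU's real lookup is not public.

PHRASES = {("figure", "out"), ("at", "your", "service"), ("talk", "into")}

def contiguous_match(tokens):
    """Greedy longest contiguous match against the phrase list."""
    found = []
    i = 0
    while i < len(tokens):
        for length in (3, 2):
            if tuple(tokens[i:i + length]) in PHRASES:
                found.append(tuple(tokens[i:i + length]))
                i += length
                break
        else:
            i += 1
    return found

def gapped_match(tokens, max_gap=2):
    """Also match two-word phrases separated by up to max_gap tokens."""
    found = contiguous_match(tokens)
    for i, tok in enumerate(tokens):
        for j in range(i + 2, min(i + 2 + max_gap, len(tokens))):
            if (tok, tokens[j]) in PHRASES:
                found.append((tok, tokens[j]))
    return found

# Contiguous matching finds nothing in "talk her into it"...
print(contiguous_match(["talk", "her", "into", "it"]))   # []
# ...but a gap-tolerant matcher recovers the phrasal verb.
print(gapped_match(["talk", "her", "into", "it"]))       # [('talk', 'into')]
```

A dictionary built on the first function will always fall back on word-by-word definitions for separable phrasal verbs, which is exactly the behaviour described above.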

For many words, there are pictures to help you with the meaning, but you wonder about some of them, e.g. the picture of someone clutching a suitcase to illustrate the meaning of of, or a woman holding up a finger and thumb to illustrate the meaning of what (as a pronoun).

The example sentences don’t seem to be graded in any way and are not always useful. The example sentences for of, for example, are The pages of the book are ripped, the lemurs of Madagascar and what time of day are you free. Since the definition is given as belonging to, there seems to be a problem with, at least, the last of these examples!

As for the example sentences that link you to other video examples of a word being used, I found that they took a long time to load … and they really weren’t worth waiting for.

After a catalogue of problems like this, you might wonder how I can say that this function wasn’t too bad, but I’ve seen a lot worse. It was, at least, mostly accurate.

Moving away from the ‘Watch’ options, I explored the ‘Learn’ section. Bearing in mind that I had described myself as ‘Upper Intermediate’, I was surprised to be offered the following words for study: Good morning, may, help, think, so. This then took me to a congratulatory ‘Great job!’ screen.

I was getting increasingly confused. After watching another video, I could practise some of the words I had highlighted, but, again, I wasn’t sure quite what was going on. There was a task that asked me to ‘pick the correct translation’, but this was, in fact, a multiple-choice dictation task.

Next, I was asked to study the meaning of the word in, followed by an unhelpful gap-fill task.

Confused? I was. I decided to look for something a little more straightforward, and clicked on a menu of vocabulary flash cards that I could import. These included sets based on copyright material from both CUP and OUP, and I wondered what these publishers might think of their property being used in this way.

FluentU claims that it is based on the following principles:

  1. Individualized scaffolding: FluentU makes language learning easy by teaching new words with vocabulary students already know.
  2. Mastery Learning: FluentU sets students up for success by making sure they master the basics before moving on to more advanced topics.
  3. Gamification: FluentU incorporates the latest game design mechanics to make learning fun and engaging.
  4. Personalization: Each student’s FluentU experience is unlike anyone else’s. Video clips, examples, and quizzes are picked to match their vocabulary and interests.

The ‘individualized scaffolding’ is no more than common sense, dressed up in sciency-sounding language. The reference to ‘Mastery Learning’ is opaque, to say the least, with some confusion between language features and topic. The gamification is rudimentary, and the personalization is pretty limited. It doesn’t come cheap, either.


Let’s take a look at the business of adaptive learning from a publisher’s perspective. Not an ELT publisher, but someone a few rungs higher on the ladder with strategic responsibilities. You might not know a great deal about ELT. It is, after all, only one of a number of divisions you are responsible for, and not an especially profitable one at that. You will, however, know a lot about creative destruction, the process by which one industry is replaced by another. The decline and demise of printed magazines, newspapers and books, of book reviewers and traditional booksellers, and their replacement by digital products will be such a part of your professional knowledge that they hardly need to be mentioned. Graphs such as the one below from PricewaterhouseCoopers (PwC) will be terribly familiar. You will also be aware that the gales of creative destruction in publishing are blowing harder than ever before.


In fact, you probably owe your job to your ability to talk convincingly about creative destruction and how to profit from it. Whatever your particular strategy for the future might be, you will have noted the actions of others. You will have evaluated advice, such as the following, from Publishing Perspectives:

  • Do not delay in taking action when there are clear signals of a decline in market value.
  • Trade your low-profit print assets (even though some may have limited digital value) for core product that has a higher likelihood of success and can be exploited digitally.
  • Look for an orderly transition from print to digital product which enables a company to reinvent itself.

You will be looking to invest in technology, and prioritizing the acquisition of technological expertise (through partnerships or the purchase of start-ups) over the development of traditional ELT products. Your company will be restructured, and possibly renamed, to facilitate the necessary changes.

You will also know that big data and analytics have already transformed other industries. And you will know that educational publishing is moving towards a winner-take-all market, where ‘the best performers are able to capture a very large share of the rewards, and the remaining competitors are left with very little’ (Investopedia). Erik Brynjolfsson and Andrew McAfee’s new book, The Second Machine Age (New York: Norton, 2014), argues that ‘each time a market becomes more digital, winner-take-all economics become a little more compelling … Digitization creates winner-take-all markets because [there are] enormous economies of scale, giving the market leader a huge cost advantage and room to beat the price of any competitor while still making a good profit’ (pp.153-155).
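Brynjolfsson and McAfee’s point about economies of scale is easy to make concrete with a toy cost model (the figures below are invented for illustration, not taken from their book): average cost per user is fixed cost divided by the number of users, plus marginal cost, so the biggest player’s costs collapse as it grows.

```python
# Toy illustration of digital economies of scale (hypothetical numbers):
# high fixed costs plus near-zero marginal costs mean the market leader's
# average cost per user collapses, which is the winner-take-all mechanism.
def average_cost(fixed, marginal, users):
    return fixed / users + marginal

# Same $10m platform cost, same $0.50 marginal cost per extra user:
leader = average_cost(10_000_000, 0.50, 5_000_000)    # 2.50 per user
rival  = average_cost(10_000_000, 0.50, 200_000)      # 50.50 per user
print(leader, rival)
```

The leader can price below the rival’s break-even point and still make a healthy margin, which is precisely the ‘room to beat the price of any competitor’ in the quotation above.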


It is in this light that we need to understand the way that companies like Pearson and Macmillan are banking everything on a digital future. Laurie Harrison’s excellent blog post at eltjam summarises the Pearson position: ‘the world’s biggest education publisher is spending £150m on a total restructure which involves an immediate move to digital learning, a focus on emerging markets, and a transformation from publisher to education services provider. If the English language learning market is worth $4billion a year, then Pearson still only have a very small chunk of it. And if you’re a company as successful and ambitious as Pearson, that just isn’t good enough – so a change of direction is needed. In order to deliver this change, the company have recently announced their new senior management team.’

Adaptive learning fits the new business paradigm perfectly. If the hype is to be believed, adaptive learning will be a game-changer. ‘The shifting of education from analog to digital is a one-time event in the history of the human race. At scale, it will have as big an effect on the world as indoor plumbing or electricity,’ writes Jose Ferreira of Knewton. ‘True disruption,’ he says elsewhere, ‘happens when entrepreneurs aim big and go after a giant problem, a problem that, if solved, would usher in an era of large-scale transformation across industries and nations. … Education is the last of the information industries to move online,’ he goes on. ‘When it breaks, it breaks fast. And that’s going to happen in the next five years. All the education content will go online in the next 10 years. And textbooks will go away. … Ultimately, all learning materials will be digital and they will all be adaptive.’

Ferreira clearly knows all about creative disruption. He also knows about winner-take-all markets. ‘The question is who is going to power [the] platform,’ he writes. ‘It’s probably going to be one or two companies’. He states his ambition for Knewton very clearly: ‘Knewton’s goal is to be like Amazon Web Services for education’. ‘It’s pretty clear to us,’ he writes, ‘that there’s going to be one dominant data platform for education, the way there’s one dominant data platform for search, social media, etailing. But in education, it’s going to be even more winner-take-all; there will be a number of companies that make up the platform, like Wintel. People might make a perverse choice to use Bing for search because they don’t like Google. But no one’s going to make the choice to subject their kid to the second-best adaptive learning platform, if that means there’s a 23% structural disadvantage. The data platform industries tend to have a winner-take-all dynamic. You take that and multiply it by a very, very high-stakes product and you get an even more winner-take-all dynamic.’

What is at stake in this winner-take-all market? Over to Jose Ferreira one more time: ‘The industry is massive. It’s so massive that virtually nobody I’ve met truly grasps how big it is. It’s beyond their frame of reference. The total amount of money (both public and private) spent annually exceeds all spending, both online and offline, of every other information industry combined: that is, all media, entertainment, games, news, software, Internet and mobile media, e-tailing, etc.’

But, still, a few questions continue to nag away at me. If all of this is so certain, why does Jose Ferreira feel the need to talk about it so much? If all of this is so certain, why don’t all the ELT publishers jump on the bandwagon? What sort of track record does economic forecasting have, anyway?

busuu is an online language learning service. I did not refer to it in the ‘guide’ because it does not seem to use any adaptive learning software yet, but this is set to change. According to founder Bernhard Niesner, the company is already working on incorporating adaptive software.

A few statistics will show the significance of busuu. The site currently has over 40 million users (El Pais, 8 February 2014) and is growing by 40,000 a day. The basic service is free, but the premium service costs Euro 69.99 a year. The company will not give detailed user statistics, but say that ‘hundreds of thousands’ are paying for the premium service, that turnover was a 7-figure number last year and will rise to 8 figures this year.

It is easy to understand why traditional publishers might be worried about competition like busuu and why they are turning away from print-based courses.

Busuu offers 12 languages, but, as a translation-based service, any one of these languages can only be studied if you speak one of the other languages on offer. The levels of the different courses are tagged to the CEFR.


In some ways, busuu is not so different from competitors like duolingo. Students are presented with bilingual vocabulary sets, accompanied by pictures, which are tested in a variety of ways. As with duolingo, some of this is a little strange. For German at level A1, I did a vocabulary set on ‘pets’ which presented the German words for a ferret, a tortoise and a guinea-pig, among others. There are dialogues, which are both written and recorded, that are sometimes surreal.

Child: Mum, look over there, there’s a dog without a collar, can we take it?

Mother: No, darling, our house is too small to have a dog.

Child: Mum your bedroom is very big, it can sleep with dad and you.

Mother: Come on, I’ll buy you a toy dog.

The dialogues are followed up by multiple choice questions which test your memory of the dialogue. There are also writing exercises where you are given a picture from National Geographic and asked to write about it. It’s not always clear what one is supposed to write. What would you say about a photo that showed a large number of parachutes in the sky, beyond ‘I can see a lot of parachutes’?

There are also many gamification elements. There is a learning carrot where you can set your own learning targets and users can earn ‘busuuberries’ which can then be traded in for animations in a ‘language garden’.


But in one significant respect, busuu differs from its competitors. It combines the usual vocabulary, grammar and dialogue work with social networking. Users can interact with text or video, and feedback on written work comes from other users. My own experience with this was mixed, but the potential is clear. Feedback on other learners’ work is encouraged by the awarding of ‘busuuberries’.

We will have to wait and see what busuu does with adaptive software and what it will do with the big data it is generating. For the moment, its interest lies in illustrating what could be done with a learning platform and adaptive software. The big ELT publishers know they have a new kind of competition and, with a lot more money to invest than busuu, we have to assume that what they will launch a few years from now will do everything that busuu does, and more. Meanwhile, busuu are working on site redesign and adaptivity. They would do well, too, to sort out their syllabus!

Given what we know, it is possible to make some predictions about what the next generation of adult ELT materials will be like when they emerge a few years from now. Making predictions is always a hazardous game, but there are a number of reasonable certainties that can be identified, based on the statements and claims of the major publishers and software providers.

1 Major publishers will move gradually away from traditional coursebooks (whether in print or ebook format) towards the delivery of learning content on learning platforms. At its most limited, this will be in the form of workbook-style material with an adaptive element. At its most developed, this will be in the form of courses that can be delivered entirely without traditional coursebooks. These will allow teachers or institutions to decide the extent to which they wish to blend online and face-to-face instruction.

2 The adaptive elements of these courses will focus primarily or exclusively on discrete item grammar, vocabulary, functional language and phonology, since these lend themselves most readily to the software. These courses will be targeted mainly at lower level (B1 and below) learners.

3 The methodological approach of these courses will be significantly influenced by the expectations of the markets where they are predicted to be most popular and most profitable: South and Central America, the Arabian Gulf and Asia.

4 These courses will permit multiple modifications to suit local requirements. They will also allow additional content to be uploaded.

5 Assessment will play an important role in the design of all these courses. Things like discrete item grammar, vocabulary, functional language and phonology, which lend themselves most readily to assessment, will be prioritized over language skills, which are harder to assess.

6 The discrete items of language that are presented will be tagged to level descriptors, using scales like the Common European Framework or English Profile.

7 Language skills work will be included, but only in the more sophisticated (and better-funded) projects will these components be closely tied to the adaptive software.

8 Because of technological differences between different parts of the world, adaptive courses will co-exist with closely related, more traditional print (or ebook) courses.

9 Training for teachers (especially concerning blended learning) will become an increasingly important part of the package sold by the major publishers.

10 These courses will be more than ever driven by the publishers’ perceptions of what the market wants. There will be a concomitant decrease in the extent to which individual authors, or author teams, influence the material.


Adaptive learning is a product to be sold. How?

1 Individualised learning

In the vast majority of contexts, language teaching is tied to a ‘one-size-fits-all’ model. This is manifested in institutional and national syllabuses which provide lists of structures and / or competences that all students must master within a given period of time. It is usually actualized in the use of coursebooks, often designed for ‘global markets’. Reaction against this model has been common currency for some time, and has led to a range of suggestions for alternative approaches (such as DOGME), none of which have really caught on. The advocates of adaptive learning programs have tapped into this zeitgeist and promise ‘truly personalized learning’. Atomico, a venture capital company that focuses on consumer technologies, and a major investor in Knewton, describes the promise of adaptive learning in the following terms: ‘Imagine lessons that adapt on-the-fly to the way in which an individual learns, and powerful predictive analytics that help teachers differentiate instruction and understand what each student needs to work on and why[1].’

This is a seductive message and is often framed in such a way that disagreement seems impossible. A post on one well-respected blog, eltjam, which focuses on educational technology in language learning, argued the case for adaptive learning very strongly in July 2013: ‘Adaptive Learning is a methodology that is geared towards creating a learning experience that is unique to each individual learner through the intervention of computer software. Rather than viewing learners as a homogenous collective with more or less identical preferences, abilities, contexts and objectives who are shepherded through a glossy textbook with static activities/topics, AL attempts to tap into the rich meta-data that is constantly being generated by learners (and disregarded by educators) during the learning process. Rather than pushing a course book at a class full of learners and hoping that it will (somehow) miraculously appeal to them all in a compelling, salubrious way, AL demonstrates that the content of a particular course would be more beneficial if it were dynamic and interactive. When there are as many responses, ideas, personalities and abilities as there are learners in the room, why wouldn’t you ensure that the content was able to map itself to them, rather than the other way around?’[2]

Indeed. But it all depends on what, precisely, the content is – a point I will return to in a later post. For the time being, it is worth noting the prominence that this message is given in the promotional discourse. It is a message that is primarily directed at teachers. It is more than a little disingenuous, however, because teachers are not the primary targets of the promotional discourse, for the simple reason that they are not the ones with purchasing power. The slogan on the homepage of the Knewton website shows clearly who the real audience is: ‘Every education leader needs an adaptive learning infrastructure’[3].

2 Learning outcomes and testing

Education leaders, who are more likely these days to come from the world of business and finance than the world of education, are currently very focused on two closely interrelated topics: the need for greater productivity and accountability, and the role of technology. They generally share the assumption of other leaders in the World Economic Forum that ICT is the key to the former and ‘the key to a better tomorrow’ (Spring, Education Networks, 2012, p.52). ‘We’re at an important transition point,’ said Arne Duncan, the U.S. Secretary of Education in 2010, ‘we’re getting ready to move from a predominantly print-based classroom to a digital learning environment’ (quoted by Spring, 2012, p.58). Later in the speech, which was delivered at the time of the release of the new National Education Technology Plan, Duncan said ‘just as technology has increased productivity in the business world, it is an essential tool to help boost educational productivity’. The plan outlines how this increased productivity could be achieved: we must start ‘with being clear about the learning outcomes we expect from the investments we make’ (Office of Educational Technology, Transforming American Education: Learning Powered by Technology, U.S. Department of Education, 2010). The greater part of the plan is devoted to discussion of learning outcomes and assessment of them.

Learning outcomes (and their assessment) are also at the heart of ‘Asking More: the Path to Efficacy’ (Barber and Rizvi (eds), Asking More: the Path to Efficacy Pearson, 2013), Pearson’s blueprint for the future of education. According to John Fallon, the CEO of Pearson, ‘our focus should unfalteringly be on honing and improving the learning outcomes we deliver’ (Barber and Rizvi, 2013, p.3). ‘High quality learning’ is associated with ‘a relentless focus on outcomes’ (ibid, p.3) and words like ‘measuring / measurable’, ‘data’ and ‘investment’ are almost as salient as ‘outcomes’. A ‘sister’ publication, edited by the same team, is entitled ‘The Incomplete Guide to Delivering Learning Outcomes’ (Barber and Rizvi (eds), Pearson, 2013) and explores further Pearson’s ambition to ‘become the world’s leading education company’ and to ‘deliver learning outcomes’.

It is no surprise that words like ‘outcomes’, ‘data’ and ‘measure’ feature equally prominently in the language of adaptive software companies like Knewton (see, for example, the quotation from Jose Ferreira, CEO of Knewton, in an earlier post). Adaptive software is premised on the establishment and measurement of clearly defined learning outcomes. If measurable learning outcomes are what you’re after, it’s hard to imagine a better path to follow than adaptive software. If your priorities include standards and assessment, it is again hard to imagine an easier path to follow than adaptive software, which was used in testing long before its introduction into instruction. As David Kuntz, VP of research at Knewton and, before that, a pioneer of algorithms in the design of tests, points out, ‘when a student takes a course powered by Knewton, we are continuously evaluating their performance, what others have done with that material before, and what [they] know’[4]. Knewton’s claim that every education leader needs an adaptive learning infrastructure has a powerful internal logic.
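In practice, ‘continuously evaluating their performance’ against defined outcomes usually means updating, after every response, a probability that the learner has mastered each outcome. Knewton’s model is proprietary, but Bayesian Knowledge Tracing is the standard published version of the idea; the sketch below uses purely illustrative parameter values:

```python
# Bayesian Knowledge Tracing: the standard (public) model of what continuous
# evaluation against learning outcomes looks like. Knewton's own model is
# proprietary; the parameter values here are purely illustrative.
def bkt_update(p_mastery, correct, slip=0.1, guess=0.2, learn=0.15):
    """Update P(mastered) after one observed answer, then apply learning."""
    if correct:
        # A correct answer is evidence of mastery, discounted by lucky guesses.
        evidence = p_mastery * (1 - slip)
        posterior = evidence / (evidence + (1 - p_mastery) * guess)
    else:
        # A wrong answer is evidence of non-mastery, discounted by slips.
        evidence = p_mastery * slip
        posterior = evidence / (evidence + (1 - p_mastery) * (1 - guess))
    # The learner may also have learned the skill during this step.
    return posterior + (1 - posterior) * learn

p = 0.3                      # prior belief that the learner has mastered the item
for answer in (True, True, False, True):
    p = bkt_update(p, answer)
print(round(p, 2))           # belief after four observed answers
```

Each item in the syllabus gets its own running estimate of this kind, and the system selects the next task from the items whose mastery probability is still low; that is the ‘adaptive learning infrastructure’ in miniature.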

3 New business models

‘Adapt or die’ (a phrase originally coined by the last prime minister of apartheid South Africa) is a piece of advice that is often given these days to both educational institutions and publishers. British universities must adapt or die, according to Michael Barber, author of ‘An Avalanche is Coming[5]’ (a report commissioned by the British Institute for Public Policy Research), Chief Education Advisor to Pearson, and editor of the Pearson ‘Efficacy’ document (see above). ELT publishers ‘must change or die’, reported the eltjam blog[6], and it is a message that is frequently repeated elsewhere. The move towards adaptive learning is seen increasingly often as one of the necessary adaptations for both these sectors.

The problems facing universities in countries like the U.K. are acute. Basically, as the introduction to ‘An Avalanche is Coming’ puts it, ‘the traditional university is being unbundled’. There are a number of reasons for this, including the rising cost of higher education provision, greater global competition for the same students, funding squeezes from central governments, and competition from new educational providers (such as MOOCs). Unsurprisingly, universities (supported by national governments) have turned to technology, especially online course delivery, as an answer to their problems. There are two main reasons for this. Firstly, universities have attempted to reduce operating costs by looking for increases in scale (through mergers, transnational partnerships, international branch campuses and so on). Mega-universities are growing, and there are thirty-three in Asia alone (Selwyn, Education in a Digital World, New York: Routledge, 2013, p.6). Universities like the Turkish Anadolu University, with over one million students, are no longer exceptional in terms of scale. In this world, online educational provision is a key element. Secondly, and not to put too fine a point on it, online instruction is cheaper (Spring, Education Networks, 2012, p.2).

All other things being equal, why would any language department of an institute of higher education not choose an online environment with an adaptive element? Adaptive learning, for the time being at any rate, may be seen as ‘the much needed key to the “Iron Triangle” that poses a conundrum to HE providers: cost, access and quality. Any attempt to improve any one of those conditions impacts negatively on the others. If you want to increase access to a course you run the risk of escalating costs and jeopardising quality, and so on.’[7]

Meanwhile, ELT publishers have been hit by rampant pirating of their materials, spiraling development costs of their flagship products and the growth of open educational resources. An excellent blog post by David Wiley[8] explains why adaptive learning services are a heaven-sent opportunity for publishers to modify their business model. ‘While the broad availability of free content and open educational resources have trained internet users to expect content to be free, many people are still willing to pay for services. Adaptive learning systems exploit this willingness by deeply intermingling content and services so that you cannot access one without using the other. Naturally, because an adaptive learning service is comprised of content plus adaptive services, it will be more expensive than static content used to be. And because it is a service, you cannot simply purchase it like you used to buy a textbook. An adaptive learning service is something you subscribe to, like Netflix. […] In short, why is it in a content company’s interest to enable you to own anything? Put simply, it is not. When you own a copy, the publisher completely loses control over it. When you subscribe to content through a digital service (like an adaptive learning service), the publisher achieves complete and perfect control over you and your use of their content.’

Although the initial development costs of building a suitable learning platform with adaptive capabilities are high, publishers will subsequently be able to produce and modify content (i.e. learning materials) much more efficiently. Since content will be mashed up and delivered in many different ways, author royalties will be cut or eliminated. Production and distribution costs will be much lower, and sales and marketing efforts can be directed more efficiently towards the most significant customers. The days of ELT sales reps trying unsuccessfully to get an interview with the director of studies of a small language school or university department are becoming a thing of the past. As with the universities, scale will be everything.


[2]http://www.eltjam.com/adaptive-learning/ (last accessed 13 January 2014)

[3] http://www.knewton.com/ (last accessed 13 January 2014)

[4] MIT Technology Review, November 26, 2012 http://www.technologyreview.com/news/506366/questions-surround-software-that-adapts-to-students/ (last accessed 13 January 2014)

[7] Tim Gifford Taking it Personally: Adaptive Learning July 9, 2013 http://www.eltjam.com/adaptive-learning/ (last accessed January 13, 2014)

[8] David Wiley, Buying our Way into Bondage: the risks of adaptive learning services March 20,2013 http://opencontent.org/blog/archives/2754 (last accessed January 13, 2014)