Posts Tagged ‘ed tech’

About two and a half years ago, when I started writing this blog, there was a lot of hype around adaptive learning and the big data which might drive it. Two and a half years is a long time in technology. A look at Google Trends suggests that interest in adaptive learning has been pretty static for the last couple of years. It’s interesting to note that 3 of the 7 lettered points on this graph are Knewton-related media events (including the most recent, A, which is Knewton’s latest deal with Hachette) and 2 of them concern McGraw-Hill. It would be interesting to know whether these companies follow both parts of Simon Cowell’s dictum: ‘Create the hype, but don’t ever believe it’.

[Figure: Google Trends graph for ‘adaptive learning’]

A look at the Hype Cycle (see here for Wikipedia’s entry on the topic and for criticism of the hype of Hype Cycles) of the IT research and advisory firm, Gartner, indicates that both big data and adaptive learning have now slid into the ‘trough of disillusionment’, which means that the market has started to mature, becoming more realistic about how useful the technologies can be for organizations.

A few years ago, the Gates Foundation, one of the leading cheerleaders and financial promoters of adaptive learning, launched its Adaptive Learning Market Acceleration Program (ALMAP) to ‘advance evidence-based understanding of how adaptive learning technologies could improve opportunities for low-income adults to learn and to complete postsecondary credentials’. It’s striking that the program’s aims referred to how such technologies could lead to learning gains, not whether they would. Now, though, with the publication of a report commissioned by the Gates Foundation to analyze the data coming out of the ALMAP Program, things are looking less rosy. The report is inconclusive. There is no firm evidence that adaptive learning systems are leading to better course grades or course completion. ‘The ultimate goal – better student outcomes at lower cost – remains elusive’, the report concludes. Rahim Rajan, a senior program officer for Gates, is clear: ‘There is no magical silver bullet here.’

The same conclusion is being reached elsewhere. A report for the National Education Policy Center (in Boulder, Colorado) concludes: ‘Personalized Instruction, in all its many forms, does not seem to be the transformational technology that is needed, however. After more than 30 years, Personalized Instruction is still producing incremental change. The outcomes of large-scale studies and meta-analyses, to the extent they tell us anything useful at all, show mixed results ranging from modest impacts to no impact. Additionally, one must remember that the modest impacts we see in these meta-analyses are coming from blended instruction, which raises the cost of education rather than reducing it’ (Enyedy, 2014: 15; see reference at the foot of this post). In the same vein, a recent academic study by Meg Coffin Murray and Jorge Pérez (2015, ‘Informing and Performing: A Study Comparing Adaptive Learning to Traditional Learning’) found that ‘adaptive learning systems have negligible impact on learning outcomes’.

In the latest educational technology plan from the U.S. Department of Education (‘Future Ready Learning: Reimagining the Role of Technology in Education’, 2016), the only mentions of the word ‘adaptive’ are in the context of testing. And the latest OECD report on ‘Students, Computers and Learning: Making the Connection’ (2015), finds, more generally, that information and communication technologies, when they are used in the classroom, have, at best, a mixed impact on student performance.

There is, however, too much money at stake for the earlier hype to disappear completely. Sponsored cheerleading for adaptive systems continues to find its way into blogs and national magazines and newspapers. EdSurge, for example, recently published a report called ‘Decoding Adaptive’ (2016), sponsored by Pearson, that continues to wave the flag. Enthusiastic anecdotes take the place of evidence, but, for all that, it’s a useful read.

In the world of ELT, there are plenty of sales people who want new products which they can call ‘adaptive’ (and gamified, too, please). But it’s striking that three years after I started following the hype, such products are rather thin on the ground. Pearson was the first of the big names in ELT to do a deal with Knewton, and invested heavily in the company. Their relationship remains close. But, to the best of my knowledge, the only truly adaptive ELT product that Pearson offers is the PTE test.

Macmillan signed a contract with Knewton in May 2013 ‘to provide personalized grammar and vocabulary lessons, exam reviews, and supplementary materials for each student’. In December of that year, they talked up their new ‘big tree online learning platform’: ‘Look out for the Big Tree logo over the coming year for more information as to how we are using our partnership with Knewton to move forward in the Language Learning division and create content that is tailored to students’ needs and reactive to their progress.’ I’ve been looking out, but it’s all gone rather quiet on the adaptive / platform front.

In September 2013, it was the turn of Cambridge to sign a deal with Knewton ‘to create personalized learning experiences in its industry-leading ELT digital products for students worldwide’. This year saw the launch of a major new CUP series, ‘Empower’. It has an online workbook with personalized extra practice, but there’s nothing (yet) that anyone would call adaptive. More recently, Cambridge has launched the online version of the 2nd edition of Touchstone. Nothing adaptive there, either.

Earlier this year, Cambridge published The Cambridge Guide to Blended Learning for Language Teaching, edited by Mike McCarthy. It contains a chapter by M.O.Z. San Pedro and R. Baker on ‘Adaptive Learning’. It’s an enthusiastic account of the potential of adaptive learning, but it doesn’t contain a single reference to language learning or ELT!

So, what’s going on? Skepticism is becoming the order of the day. The early hype of people like Knewton’s Jose Ferreira is now understood for what it was. Companies like Macmillan got their fingers badly burnt when they barked up the wrong tree with their ‘Big Tree’ platform.

Noel Enyedy captures a more contemporary understanding when he writes: ‘Personalized Instruction is based on the metaphor of personal desktop computers—the technology of the 80s and 90s. Today’s technology is not just personal but mobile, social, and networked. The flexibility and social nature of how technology infuses other aspects of our lives is not captured by the model of Personalized Instruction, which focuses on the isolated individual’s personal path to a fixed end-point. To truly harness the power of modern technology, we need a new vision for educational technology’ (Enyedy, 2014: 16).

Adaptive solutions aren’t going away, but there is now a much better understanding of what sorts of problems might have adaptive solutions. Testing is certainly one. As the educational technology plan from the U.S. Department of Education (‘Future Ready Learning: Reimagining the Role of Technology in Education’, 2016) puts it: ‘Computer adaptive testing, which uses algorithms to adjust the difficulty of questions throughout an assessment on the basis of a student’s responses, has facilitated the ability of assessments to estimate accurately what students know and can do across the curriculum in a shorter testing session than would otherwise be necessary.’ In ELT, Pearson and EF have adaptive tests that have been well researched and designed.
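The mechanism the Department of Education describes can be sketched in a few lines. To be clear, this is an illustrative toy of my own, not Pearson’s or EF’s actual algorithm (real computer-adaptive tests typically estimate ability with Item Response Theory models): after each response the difficulty moves up or down, and the step size shrinks so that the estimate converges on the student’s level.

```python
def next_difficulty(current, correct, step=1.0):
    """Move difficulty up after a correct answer, down after an incorrect one."""
    return current + step if correct else current - step

def run_test(responses, start=0.0):
    """Walk through a sequence of right/wrong responses, halving the step
    each time so the difficulty estimate converges."""
    difficulty, step = start, 4.0
    for correct in responses:
        difficulty = next_difficulty(difficulty, correct, step)
        step /= 2
    return difficulty

# A student who answers the first two items correctly and the third
# incorrectly ends up with an estimate between those last two levels.
print(run_test([True, True, False]))  # → 5.0
```

The point of the shrinking step is the one the quotation makes: the test homes in on what the student can do in far fewer items than a fixed-form test would need.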

Vocabulary apps which deploy adaptive technology continue to become more sophisticated, although empirical research is lacking. Automated writing tutors with adaptive corrective feedback are also developing fast, and I’ll be writing a post about these soon. Similarly, as speech recognition software improves, we can expect to see better and better automated adaptive pronunciation tutors. But going beyond such applications, there are bigger questions to ask, and answers to these will impact on whatever direction adaptive technologies take. Large platforms (LMSs), with or without adaptive software, are already beginning to look rather dated. Will they be replaced by integrated apps, or are apps themselves going to be replaced by bots (currently riding high in the Hype Cycle)? In language learning and teaching, the future of bots is likely to be shaped by developments in natural language processing (another topic about which I’ll be blogging soon). Nobody really has a clue where the next two and a half years will take us (if anywhere), but it’s becoming increasingly likely that adaptive learning will be only one very small part of it.

 

Enyedy, N. 2014. Personalized Instruction: New Interest, Old Rhetoric, Limited Results, and the Need for a New Direction for Computer-Mediated Learning. Boulder, CO: National Education Policy Center. Retrieved 17.07.16 from http://nepc.colorado.edu/publication/personalized-instruction

It’s a good time to be in Turkey if you have digital ELT products to sell. Not so good if you happen to be an English language learner. This post takes a look at both sides of the Turkish lira.

OUP, probably the most significant of the big ELT publishers in Turkey, recorded ‘an outstanding performance’ in the country in the last financial year, making it their 5th largest ELT market. OUP’s annual report for 2013 – 2014 describes the particularly strong demand for digital products and services, a demand which is now influencing OUP’s global strategy for digital resources. When asked about the future of ELT, Peter Marshall, Managing Director of OUP’s ELT Division, suggested that Turkey was a country that could point us in the direction of an answer to the question. Marshall and OUP will be hoping that OUP’s recently launched Digital Learning Platform (DLP) ‘for the global distribution of adult and secondary ELT materials’ will be an important part of that future, in Turkey and elsewhere. I can’t think of any good reason for doubting their belief.

OUP aren’t the only ones eagerly checking the pound-lira exchange rates. CUP also reported ‘significant sales successes’ in Turkey in their latest annual report. For CUP, too, it was a year in which digital development was ‘a top priority’. CUP’s Turkish success story has been primarily driven by a deal with Anadolu University (more about this below) to provide ‘a print and online solution to train 1.7 million students’ using their Touchstone course. This was the biggest single sale in CUP’s history and has inspired publishers, both within CUP and outside, to attempt to emulate the deal. The new blended products will, of course, be adaptive.

Just how big is the Turkish digital ELT pie? According to a 2014 report from Ambient Insight, revenues from digital ELT products reached $32.0 million in 2013. They are forecast to more than double to $72.6 million in 2018. This is a compound annual growth rate of 17.8%, a rate which is practically unbeatable in any large economy, and Turkey is the 17th largest economy in the world, according to World Bank statistics.
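For anyone who wants to check the arithmetic, the 17.8% figure is simply the compound annual growth rate implied by the two endpoints of the Ambient Insight forecast:

```python
# Compound annual growth rate (CAGR) implied by the Ambient Insight
# forecast: $32.0m in 2013 growing to $72.6m in 2018, i.e. over 5 years.
start, end, years = 32.0, 72.6, 5
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # → 17.8%
```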

So, what makes Turkey special?

  • Turkey has a large and young population that is growing by about 1.4% each year, which is equivalent to approximately 1 million people. According to the Turkish Ministry of Education, there are currently about 5.5 million students enrolled in upper-secondary schools. Significant growth in numbers is certain.
  • Turkey is currently in the middle of a government-sponsored $990 million project to increase the level of English proficiency in schools. The government’s target is to position the country as one of the top ten global economies by 2023, the centenary of the Turkish Republic, and it believes that this position will be more reachable if it has a population with the requisite foreign language (i.e. English) skills. As part of this project, the government has begun to introduce English in the 1st grade (previously it was in the 4th grade).
  • The level of English in Turkey is famously low and has been described as a ‘national weakness’. In October/November 2011, the Turkish research institute SETA and the Turkish Ministry for Youth and Sports conducted a large survey across Turkey of 10,174 young citizens, aged 15 to 29. The result was sobering: 59 per cent of the young people said they ‘did not know any foreign language’. A recent British Council report (2013) found that the competence level in English of most (90+%) students across Turkey was rudimentary, even after an estimated 1000+ hours of English classes by the end of Grade 12. This is, of course, good news for vendors of English language learning / teaching materials.
  • Turkey has launched one of the world’s largest educational technology projects: the FATIH Project (The Movement to Enhance Opportunities and Improve Technology). One of its objectives is to provide tablets for every student between grades 5 and 12. At the same time, according to the Ambient report, the intention is to ‘replace all print-based textbooks with digital content (both eTextbooks and online courses).’
  • Purchasing power in Turkey is concentrated in a relatively small number of hands, with the government as the most important player. Institutions are often very large. Anadolu University, for example, is the second largest university in the world, with over 2 million students, most of whom are studying in virtual classrooms. There are two important consequences of this. Firstly, it makes scalable, big-data-driven LMS-delivered courses with adaptive software a more attractive proposition to purchasers. Secondly, it facilitates the B2B sales model that is now preferred by vendors (including the big ELT publishers).
  • Turkey also has a ‘burgeoning private education sector’, according to Peter Marshall, and a thriving English language school industry. According to Ambient ‘commercial English language learning in Turkey is a $400 million industry with over 600 private schools across the country’. Many of these are grouped into large chains (see the bullet point above).
  • Turkey is also ‘in the vanguard of the adoption of educational technology in ELT’, according to Peter Marshall. With 36 million internet users, the 5th largest internet population in Europe, and the 3rd highest online engagement in Europe, measured by time spent online (reported by Sina Afra), the country’s enthusiasm for educational technology is not surprising. Ambient reports that ‘the growth rate for mobile English educational apps is 27.3%’. This enthusiasm is reflected in Turkey’s thriving ELT conference scene. The most popular conference themes and presentations are concerned with edtech. A keynote speech by Esat Uğurlu at the ISTEK schools 3rd international ELT conference at Yeditepe in April 2013 gives a flavour of the current interests: the talk was entitled ‘E-Learning: There is nothing to be afraid of and plenty to discover’.

All of the above makes Turkey a good place to be if you’re selling digital ELT products, even though the competition is pretty fierce. If your product isn’t adaptive, personalized and gamified, you may as well not bother.

What impact will all this have on Turkey’s English language learners? A report co-produced by TEPAV (the Economic Policy Research Foundation of Turkey) and the British Council in November 2013 suggests some of the answers, at least in the school population. The report is entitled ‘Turkey National Needs Assessment of State School English Language Teaching’ and its Executive Summary is brutally frank in its analysis of the low achievements in English language learning in the country. It states:

The teaching of English as a subject and not a language of communication was observed in all schools visited. This grammar-based approach was identified as the first of five main factors that, in the opinion of this report, lead to the failure of Turkish students to speak/ understand English on graduation from High School, despite having received an estimated 1000+ hours of classroom instruction.

In all classes observed, students fail to learn how to communicate and function independently in English. Instead, the present teacher-centric, classroom practice focuses on students learning how to answer teachers’ questions (where there is only one, textbook-type ‘right’ answer), how to complete written exercises in a textbook, and how to pass a grammar-based test. Thus grammar-based exams/grammar tests (with right/wrong answers) drive the teaching and learning process from Grade 4 onwards. This type of classroom practice dominates all English lessons and is presented as the second causal factor with respect to the failure of Turkish students to speak/understand English.

The problem, in other words, is the curriculum and the teaching. In its recommendations, the report makes this crystal clear. Priority needs to be given to developing a revised curriculum and ‘a comprehensive and sustainable system of in-service teacher training for English teachers’. Curriculum renewal and programmes of teacher training / development are the necessary prerequisites for the successful implementation of a programme of educational digitalization. Unfortunately, research has shown again and again that these take a long time and outcomes are difficult to predict in advance.

By going for digitalization first, Turkey is taking a huge risk. What LMSs, adaptive software and most apps do best is the teaching of language knowledge (grammar and vocabulary), not the provision of opportunities for communicative practice (for which there is currently no shortage of opportunity … it is just that these opportunities are not being taken). There is a real danger, therefore, that the technology will push learning priorities in precisely the opposite direction to that which is needed. Without significant investments in curriculum reform and teacher training, how likely is it that the transmission-oriented culture of English language teaching and learning will change?

Even if the money for curriculum reform and teacher training were found, it is also highly unlikely that effective country-wide approaches to blended learning for English would develop before the current generation of tablets and their accompanying content become obsolete.

Sadly, the probability is, once more, that educational technology will be a problem-changer, even a problem-magnifier, rather than a problem-solver. I’d love to be wrong.

The cheerleading for big data in education continues unabated. Almost everything you read online on the subject is an advertisement, usually disguised as a piece of news or a blog post, but which can invariably be traced back to an organisation with a vested interest in digital disruption. A typical example is this advergraphic which comes under a banner that reads ‘Big Data Improves Education’. The site, Datafloq, is selling itself as ‘the one-stop-shop around Big Data.’ Their ‘vision’ is ‘Connecting Data and People and [they] aim to achieve that by spurring the understanding, acceptance and application of Big Data in order to drive innovation and economic growth.’

Critical voices are rare, but growing. There’s a very useful bibliography of recent critiques here. And in the world of English language teaching, I was pleased to see that there’s a version of Gavin Dudeney’s talk, ‘Of Big Data & Little Data’, now up on YouTube. The slides which accompany his talk can be accessed here.

His main interest is in reclaiming the discourse of edtech in ELT, in moving away from the current obsession with numbers, and in returning the focus to what he calls ‘old edtech’ – the everyday technological practices of the vast majority of ELT practitioners.

It’s a stimulating and deadpan-entertaining talk and well worth 40 minutes of your time. Just fast-forward through the bit where he talks about me.

If you’re interested in hearing more critical voices, you may also like to listen to a series of podcasts, put together by the IATEFL Learning Technologies and Global Issues Special Interest Groups. In the first of these, I interview Neil Selwyn and, in the second, Lindsay Clandfield interviews Audrey Watters of Hack Education.

 

(This post won’t make a lot of sense unless you read the previous two – Researching research: part 1 and part 2!)

The work of Jayaprakash et al was significantly informed and inspired by the work done at Purdue University. In the words of these authors, they even ‘relied on [the] work at Purdue with Course Signals’ for parts of the design of their research. They didn’t know when they were doing their research that the Purdue studies were fundamentally flawed. This was, however, common knowledge (since September 2013) before their article (‘Early Alert of Academically At-Risk Students’) was published. This raises the interesting question of why the authors (and the journal in which they published) didn’t pull the article when they could still have done so. I can’t answer that question, but I can suggest some possible reasons. First, though, a little background on the Purdue research.

The Purdue research is important, more than important, because it was the first significant piece of research to demonstrate the efficacy of academic analytics. Except that, in all probability, it doesn’t! Michael Caulfield, director of blended and networked learning at Washington State University at Vancouver, and Alfred Essa, McGraw-Hill Education’s vice-president of research and development and analytics, took a closer look at the data. What they found was that the results were probably the product of selection bias rather than a real finding. In other words, as summarized by Carl Straumsheim in Inside Higher Ed in November of last year, there was ‘no causal connection between students who use [Course Signals] and their tendency to stick with their studies’. The Times Higher Education and the e-Literate blog contacted Purdue, but, to date, there has been no serious response to the criticism. The research is still on Purdue’s website.

The Purdue research article, ‘Course Signals at Purdue: Using Learning Analytics to Increase Student Success’ by Kimberley Arnold and Matt Pistilli, was first published as part of the proceedings of the Learning Analytics and Knowledge (LAK) conference in May 2012. The LAK conference is organised by the Society for Learning Analytics Research (SoLAR), in partnership with Purdue. SoLAR, you may remember, is the organisation which published the new journal in which Jayaprakash et al’s article appeared. Pistilli happens to be an associate editor of the journal. Jayaprakash et al also presented at the LAK ’12 conference. Small world.

The Purdue research was further publicized by Pistilli and Arnold in the Educause review. Their research had been funded by the Gates Foundation (a grant of $1.2 million in November 2011). Educause, in its turn, is also funded by the Gates Foundation (a grant of $9 million in November 2011). The research of Jayaprakash et al was also funded by Educause, which stipulated that ‘effective techniques to improve student retention be investigated and demonstrated’ (my emphasis). Given the terms of their grant, we can perhaps understand why they felt the need to claim they had demonstrated something.

What exactly is Educause, which plays such an important role in all of this? According to their own website, it is a non-profit association whose mission is to advance higher education through the use of information technology. However, it is rather more than that. It is also a lobbying and marketing umbrella for edtech. The following screenshot from their website makes this abundantly clear.

If you’ll bear with me, I’d like to describe one more connection between the various players I’ve been talking about. Purdue’s Course Signals is marketed by a company called Ellucian. Ellucian’s client list includes both Educause and the Gates Foundation. A former Senior Vice President of Ellucian, Anne K Keehn, is currently ‘Senior Fellow – Technology and Innovation, Education, Post-Secondary Success’ at the Gates Foundation – presumably the sort of person to whom you’d have to turn if you wanted funding from the Gates Foundation. Small world.

Personal, academic and commercial networks are intricately intertwined in the high-stakes world of edtech. In such a world (not so very different from the pharmaceutical industry), independent research is practically impossible. The pressure to publish positive research results must be extreme. The temptation to draw conclusions of the kind that your paymasters are looking for must be high. The edtech juggernaut must keep rolling on.

While the big money will continue to go, for the time being, into further attempts to prove that big data is the future of education, there are still some people who are interested in alternatives. Coincidentally (?), a recent survey has been carried out at Purdue which looks into what students think about their college experience, about what is meaningful to them. Guess what? It doesn’t have much to do with technology.

Back in the Neanderthal days before Web 2.0, iPhones, tablets, the cloud, learning analytics and so on, Chris Bigum and Jane Kenway wrote a paper called ‘New Information Technologies and the Ambiguous Future of Schooling’. Although published in 1998, it remains relevant and can be accessed here.

They analysed the spectrum of discourse that was concerned with new technologies in education. At one end of this spectrum was a discourse community which they termed ‘boosters’. Then, as now, the boosters were far and away the dominant voices. Bigum and Kenway characterized the boosters as having an ‘unswerving faith in the technology’s capacity to improve education and most other things in society’. I discussed the boosterist discourse in my post on this blog, ‘Saving the World (adaptive marketing)’, focussing on the language of Knewton, as a representative example.

At the other end of Bigum and Kenway’s spectrum was what they termed ‘doomsters’ – ‘unqualified opponents of new technologies’ who see inevitable damage to society and education if we uncritically accept these new technologies.

Since starting this blog, I have been particularly struck by two things. The first of these is that I have had to try to restrain my aversion to the excesses of boosterist discourse – not always, it must be said, with complete success. The second is that I have found myself characterized by some people (perhaps those who have only superficially read a post or two) as an anti-technology doomsterist. At the same time, I have noticed that the debate about adaptive learning and educational technology, in general, tends to become polarized into booster and doomster camps.

To some extent, such polarization is inevitable. When a discourse is especially dominant, anyone who questions it risks finding themselves labelled as the extreme opposite. In some parts of the world, for example, any critique of neoliberal doxa is likely to be critiqued, in its turn, as ‘socialist, or worse’: ‘if you’re not with us, you’re against us’.

When it comes to adaptive learning, one can scoff at the adspeak of Knewton or the gapfills of Voxy, without having a problem with the technology per se. But, given the dominance of the booster discourse, one can’t really be neutral. Neil Selwyn (yes, him again!) suggests that the best way of making full sense of educational technology is to adopt a pessimistic perspective. ‘If nothing else,’ he writes, ‘a pessimistic view remains true to the realities of what has actually taken place with regards to higher education and digital technology over the past thirty years (to be blunt, things have clearly not been transformed or improved by digital technology so far, so why should we expect anything different in the near future?)’. This is not an ‘uncompromising pessimism’, but ‘a position akin to Gramsci’s notion of being “a pessimist because of intelligence, but an optimist because of will”’.

Note: The quotes from Neil Selwyn here are taken from his new book Digital Technology and the Contemporary University (2014, Abingdon: Routledge). In the autumn of this year, there will be an online conference, jointly organised by the Learning Technologies and Global Issues Special Interest Groups of IATEFL, during which I will be interviewing Neil Selwyn. I’ll keep you posted.

In Part 9 of the ‘guide’ on this blog (neo-liberalism and solutionism), I suggested that the major advocates of adaptive learning form a complex network of vested neo-liberal interests. Along with adaptive learning and the digital delivery of educational content, they promote a free-market, for-profit, ‘choice’-oriented (charter schools in the US and academies in the UK) ideology. The discourses of these advocates are explored in a fascinating article by Neil Selwyn, ‘Discourses of digital ‘disruption’ in education: a critical analysis’ which can be accessed here.

Stephen Ball includes a detailed chart of this kind of network in his ‘Global Education Inc.’ (Routledge 2012). I thought it would be interesting to attempt a similar, but less ambitious, chart of my own. Sugata Mitra’s plenary talk at the IATEFL conference yesterday has generated a lot of discussion, so I thought it would be interesting to focus on him. What such charts demonstrate very clearly is that there is a very close interlinking between EdTech advocacy and a wider raft of issues on the neo-liberal wish list. Adaptive learning developments (or, for example, schools in the cloud) need to be understood in a broader context … in the same way that Mitra, Tooley, Gates et al understand these technologies.

In order to understand the chart, you will need to look at the notes below. Many more nodes could be introduced, but I have tried my best to keep things simple. All of the information here is publicly available, but I found Stephen Ball’s work especially helpful.

[Chart: Sugata Mitra’s network of edtech connections]

People

Bill Gates is the former chief executive and chairman of Microsoft, and co-chair of the Bill and Melinda Gates Foundation.

James Tooley is the Director of the E.G. West Centre. He is a founder of the Educare Trust, founder and chairman of Omega Schools, president of Orient Global, chairman of Rumi School of Excellence, and a former consultant to the International Finance Corporation. He is also a member of the advisory council of the Institute of Economic Affairs and was responsible for creating the Education and Training Unit at the Institute.

Michael Barber is Pearson’s Chief Education Advisor and Chairman of Pearson’s $15 million Affordable Learning Fund. He is also an advisor on ‘deliverology’ to the International Finance Corporation.

Sugata Mitra is Professor of Educational Technology at the E.G. West Centre and he is Chief Scientist, Emeritus, at NIIT. He is best known for his “Hole in the Wall” experiment. In 2013, he won the $1 million TED Prize to develop his idea of a ‘school-in-the-cloud’.

Institutions

Hiwel (Hole-in-the-Wall Education Limited) is the company behind Mitra’s “Hole in the Wall” experiment. It is a subsidiary of NIIT.

NIIT Limited is an Indian company based in Gurgaon that operates several for-profit higher education institutions.

Omega Schools is a privately held chain of affordable, for-profit schools based in Ghana. There are currently 38 schools educating over 20,000 students.

Orient Global is a Singapore-based investment group, which bought a $48 million stake in NIIT.

Pearson is … Pearson. Pearson’s Affordable Learning Fund was set up to invest in private companies committed to innovative approaches. Its first investment was a stake in Omega Schools.

Rumi Schools of Excellence is Orient Global’s chain of low-cost private schools in India, which aims to extend access and improve educational quality through affordable private schooling.

School-in-the-cloud is described by Mitra as ‘a learning lab in India, where children can embark on intellectual adventures by engaging and connecting with information and mentoring online’. Microsoft is the key sponsor.

The E.G. West Centre of the University of Newcastle is dedicated to generating knowledge and understanding about how markets and self organising systems work in education.

The Educare Trust is a non-profit agency, formed in 2002 by Professor James Tooley of the University of Newcastle upon Tyne, England, and other members associated with private unaided schools in India. It is advised by an international team from the University of Newcastle. Its services include the running of a loan scheme for schools to improve their infrastructure and facilities.

The Institute of Economic Affairs is a right-wing free market think tank in London whose stated mission is to improve understanding of the fundamental institutions of a free society by analysing and expounding the role of markets in solving economic and social problems.

The International Finance Corporation is an international financial institution which offers investment, advisory, and asset management services to encourage private sector development in developing countries. The IFC is a member of the World Bank Group.

The Templeton Foundation is a philanthropic organization that funds inter-disciplinary research about human purpose and ultimate reality. Described by Barbara Ehrenreich as a ‘right wing venture’, it has a history of supporting the Cato Institute (publishers of Tooley’s best-known book), a libertarian think-tank, as well as projects at major research centers and universities that explore themes related to free market economics.

Additional connections

Barber is an old friend of Tooley’s from when both men were working in Zimbabwe in the 1990s.

Omega Schools are taking part in Sugata Mitra’s TED Prize Schools in the Cloud project.

Omega Schools use textbooks developed by Pearson.

Orient Global sponsored an Education Development fund at Newcastle University. The project leaders were Tooley and Mitra. They also sponsored the Hole-in-the-Wall experiment.

Pearson, the Pearson Foundation, Microsoft and the Gates Foundation work closely together on a wide variety of projects.

Some of Tooley’s work for the Educare Trust was funded by the Templeton Trust. Tooley was also winner of the 2006 Templeton Freedom Prize for Excellence.

The International Finance Corporation and the Gates Foundation are joint sponsors of a $60 million project to improve health in Nigeria.

The International Finance Corporation was another sponsor of the Hole-in-the-Wall experiment.

I mentioned the issue of privacy very briefly in Part 9 of the ‘Guide’, and it seems appropriate to take a more detailed look.

Adaptive learning needs big data. Without the big data, there is nothing for the algorithms to work on, and the bigger the data set, the better the software can work. Adaptive language learning will be delivered via a platform, and the data that is generated by the language learner’s interaction with the English language program on the platform is likely to be only one, very small, part of the data that the system will store and analyse. Full adaptivity requires a psychometric profile for each student.

It would make sense, then, to aggregate as much data as possible in one place. Besides the practical value of combining different data sources at scale (to enhance the usefulness of the personalized learning pathways), such a move could save educational authorities substantial amounts of money and allow educational technology companies to mine the goldmine of student data, along with the standardised platform specifications, to design their products.

And so it has come to pass. The Gates Foundation (yes, them again) provided most of the $100 million funding. A division of Murdoch’s News Corp built the infrastructure. Once everything was ready, a non-profit organization called inBloom was set up to run the thing. The inBloom platform is open source and the database was initially free, although this will change. Preliminary agreements were made with 7 US districts and involved millions of children. The data includes ‘students’ names, birthdates, addresses, social security numbers, grades, test scores, disability status, attendance, and other confidential information’ (Ravitch, D., Reign of Error, New York: Knopf, 2013, pp. 235-236). Under federal law, this information can be ‘shared’ with private companies selling educational technology and services.

The edtech world rejoiced. ‘This is going to be a huge win for us’, said one educational software provider; ‘it’s a godsend for us,’ said another. Others are not so happy. If the technology actually works, if it can radically transform education and ‘produce game-changing outcomes’ (as its proponents claim so often), the price to be paid might just conceivably be worth paying. But the price is high and the research is not there yet. The price is privacy.

The problem is simple. InBloom itself acknowledges that it ‘cannot guarantee the security of the information stored… or that the information will not be intercepted when it is being transmitted.’ Experience has already shown us that organisations as diverse as the CIA or the British health service cannot protect their data. Hackers like a good challenge. So do businesses.

The anti-privatization (and, by extension, the anti-adaptivity) lobby in the US has found an issue which is resonating with electors (and parents). These dissenting voices are led by Class Size Matters, and their voice is being heard. Of the original partners of inBloom, only one is now left. The others have all pulled out, mostly because of concerns about privacy, although the remaining partner, New York, holds personal data on 2.7 million students, which can be shared without any parental notification or consent.

inbloom-student-data-bill-gates

This might seem like a victory for the anti-privatization / anti-adaptivity lobby, but it is likely to be only temporary. There are plenty of other companies that have their eyes on the data-mining opportunities that will be coming their way, and Obama’s ‘Race to the Top’ program means that the inBloom controversy will be only a temporary setback. ‘The reality is that it’s going to be done. It’s not going to be a little part. It’s going to be a big part. And it’s going to be put in place partly because it’s going to be less expensive than doing professional development,’ says Eva Baker of the Center for the Study of Evaluation at UCLA.

It is in this light that the debate about adaptive learning becomes hugely significant. Class Size Matters, the odd academic like Neil Selwyn or the occasional blogger like myself will not be able to reverse a trend with seemingly unstoppable momentum. But we are, collectively, in a position to influence the way these changes will take place.

If you want to find out more, check out the inBloom and Class Size Matters links. And you might like to read more from the news reports which I have used for information in this post. Of these, the second was originally published by Scientific American (owned by Macmillan, one of the leading players in ELT adaptive learning). The third and fourth are from Education Week, which is funded in part by the Gates Foundation.

http://www.reuters.com/article/2013/03/03/us-education-database-idUSBRE92204W20130303

http://www.salon.com/2013/08/01/big_data_puts_teachers_out_of_work_partner/

http://www.edweek.org/ew/articles/2014/01/08/15inbloom_ep.h33.html

http://blogs.edweek.org/edweek/marketplacek12/2013/12/new_york_battle_over_inBloom_data_privacy_heading_to_court.html

Adaptive learning is likely to impact on the lives of language teachers very soon. In my work as a writer of education materials, it has already dramatically impacted on mine. This impact has affected the kinds of things I am asked to write, the way in which I write them and my relationship with the editors and publishers I am writing for. I am as dismissive as Steve Jobs[1] was of the idea that technology can radically transform education, but in the short term it can radically disrupt it. Change is not necessarily progress.

Teachers and teacher trainers need to be very alert to what is going on if they don’t want to wake up one morning and find themselves out of work, or in a very different kind of job. The claims for adaptive language learning need to be considered in the bright light of particular, local contexts. Teachers and teacher trainers can even take a lesson from the proponents of adaptive learning who rail against the educational approach of one-size-fits-all. One size, whether it’s face-to-face with a print coursebook or whether it’s a blended adaptive program, will never fit all. We need to be very skeptical of the publishers and software providers who claim in a TED-style, almost evangelical way that they are doing the right thing for students, our society, or our world. There is a real risk that adaptive learning may be leading simply to ‘a more standardised, minimalist product targeted for a mass market, [that] will further ‘box in’ and ‘dumb down’ education’ (Selwyn, Education and Technology 2011, p.101).

There is nothing wrong, per se, with adaptive learning. It could be put to some good uses, but how likely is this? In order to understand how it may impact on our working lives, we need to be better informed. A historical perspective is often a good place to start and Larry Cuban’s Teachers and Machines: The Classroom Use of Technology since 1920 (New York: Teachers College Press, 1986) is still well worth reading.

81WEOH4yyOL

To get a good picture of where big data and analytics are now and where they are heading, Mayer-Schönberger & Cukier’s Big Data (London: John Murray, 2013) is informative and entertaining reading. If you are ‘an executive looking to integrate analytics in your decision making or a manager seeking to generate better conversations with the quants in your organisation’, I’d recommend Keeping up with the Quants by Thomas H. Davenport and Jinho Kim (Harvard Business School, 2013). Or you could just read ‘The Economist’ for this kind of thing.

If you want to follow up the connections between educational technology and neo-liberalism, the books by Stephen Ball (Global Education Inc., Abingdon, Oxon: Routledge, 2012), Neil Selwyn (Education and Technology, London: Continuum, 2011; Education in a Digital World, New York: Routledge, 2013; Distrusting Educational Technology, New York: Routledge, 2013), Diane Ravitch (Reign of Error, New York: Knopf, 2013) and Joel Spring (Education Networks, New York: Routledge, 2012; The Great American Education-Industrial Complex with Anthony G. Picciano, Routledge, 2013) are all good reads. And keep a look out for anything new from these writers.

Finally, to keep up to date with recent developments, the eltjam blog http://www.eltjam.com/ is a good one to follow, as is Richard Whiteside’s Scoop.it! page http://www.scoop.it/t/elt-publishing-by-richard-whiteside

I’ll be continuing to post things here from time to time! Thanks for following me so far.


[1] Jobs, however, did set his sights ‘on the $8 billion a year textbook industry, which he saw as ‘ripe for digital destruction’. His first instinct seems to have been to relieve kids from having to carry around heavy backpacks crammed with textbooks: ‘The iPad would solve that,’ he said, ever practical’ (Fullan, Stratosphere 2013, p.61).

Adaptive learning is a product to be sold. How?

1 Individualised learning

In the vast majority of contexts, language teaching is tied to a ‘one-size-fits-all’ model. This is manifested in institutional and national syllabuses which provide lists of structures and / or competences that all students must master within a given period of time. It is usually actualized in the use of coursebooks, often designed for ‘global markets’. Reaction against this model has been common currency for some time, and has led to a range of suggestions for alternative approaches (such as DOGME), none of which have really caught on. The advocates of adaptive learning programs have tapped into this zeitgeist and promise ‘truly personalized learning’. Atomico, a venture capital company that focuses on consumer technologies, and a major investor in Knewton, describes the promise of adaptive learning in the following terms: ‘Imagine lessons that adapt on-the-fly to the way in which an individual learns, and powerful predictive analytics that help teachers differentiate instruction and understand what each student needs to work on and why[1].’

This is a seductive message and is often framed in such a way that disagreement seems impossible. A post on one well-respected blog, eltjam, which focuses on educational technology in language learning, argued the case for adaptive learning very strongly in July 2013: ‘Adaptive Learning is a methodology that is geared towards creating a learning experience that is unique to each individual learner through the intervention of computer software. Rather than viewing learners as a homogenous collective with more or less identical preferences, abilities, contexts and objectives who are shepherded through a glossy textbook with static activities/topics, AL attempts to tap into the rich meta-data that is constantly being generated by learners (and disregarded by educators) during the learning process. Rather than pushing a course book at a class full of learners and hoping that it will (somehow) miraculously appeal to them all in a compelling, salubrious way, AL demonstrates that the content of a particular course would be more beneficial if it were dynamic and interactive. When there are as many responses, ideas, personalities and abilities as there are learners in the room, why wouldn’t you ensure that the content was able to map itself to them, rather than the other way around?’[2]

Indeed. But it all depends on what, precisely, the content is – a point I will return to in a later post. For the time being, it is worth noting the prominence that this message is given in the promotional discourse. It is a message that is primarily directed at teachers. It is more than a little disingenuous, however, because teachers are not the primary targets of the promotional discourse, for the simple reason that they are not the ones with purchasing power. The slogan on the homepage of the Knewton website shows clearly who the real audience is: ‘Every education leader needs an adaptive learning infrastructure’[3].

2 Learning outcomes and testing

Education leaders, who are more likely these days to come from the world of business and finance than the world of education, are currently very focused on two closely interrelated topics: the need for greater productivity and accountability, and the role of technology. They generally share the assumption of other leaders in the World Economic Forum that ICT is the key to the former and ‘the key to a better tomorrow’ (Spring, Education Networks, 2012, p.52). ‘We’re at an important transition point,’ said Arne Duncan, the U.S. Secretary of Education in 2010, ‘we’re getting ready to move from a predominantly print-based classroom to a digital learning environment’ (quoted by Spring, 2012, p.58). Later in the speech, which was delivered at the time of the release of the new National Education Technology Plan, Duncan said ‘just as technology has increased productivity in the business world, it is an essential tool to help boost educational productivity’. The plan outlines how this increased productivity could be achieved: we must start ‘with being clear about the learning outcomes we expect from the investments we make’ (Office of Educational Technology, Transforming American Education: Learning Powered by Technology, U.S. Department of Education, 2010). The greater part of the plan is devoted to discussion of learning outcomes and assessment of them.

Learning outcomes (and their assessment) are also at the heart of ‘Asking More: the Path to Efficacy’ (Barber and Rizvi (eds), Asking More: the Path to Efficacy Pearson, 2013), Pearson’s blueprint for the future of education. According to John Fallon, the CEO of Pearson, ‘our focus should unfalteringly be on honing and improving the learning outcomes we deliver’ (Barber and Rizvi, 2013, p.3). ‘High quality learning’ is associated with ‘a relentless focus on outcomes’ (ibid, p.3) and words like ‘measuring / measurable’, ‘data’ and ‘investment’ are almost as salient as ‘outcomes’. A ‘sister’ publication, edited by the same team, is entitled ‘The Incomplete Guide to Delivering Learning Outcomes’ (Barber and Rizvi (eds), Pearson, 2013) and explores further Pearson’s ambition to ‘become the world’s leading education company’ and to ‘deliver learning outcomes’.

It is no surprise that words like ‘outcomes’, ‘data’ and ‘measure’ feature equally prominently in the language of adaptive software companies like Knewton (see, for example, the quotation from Jose Ferreira, CEO of Knewton, in an earlier post). Adaptive software is premised on the establishment and measurement of clearly defined learning outcomes. If measurable learning outcomes are what you’re after, it’s hard to imagine a better path to follow than adaptive software. If your priorities include standards and assessment, it is again hard to imagine an easier path to follow than adaptive software, which was used in testing long before its introduction into instruction. As David Kuntz, VP of research at Knewton and, before that, a pioneer of algorithms in the design of tests, points out, ‘when a student takes a course powered by Knewton, we are continuously evaluating their performance, what others have done with that material before, and what [they] know’[4]. Knewton’s claim that every education leader needs an adaptive learning infrastructure has a powerful internal logic.

3 New business models

‘Adapt or die’ (a phrase originally coined by the last prime minister of apartheid South Africa) is a piece of advice that is often given these days to both educational institutions and publishers. British universities must adapt or die, according to Michael Barber, author of ‘An Avalanche is Coming[5]’ (a report commissioned by the British Institute for Public Policy Research), Chief Education Advisor to Pearson, and editor of the Pearson ‘Efficacy’ document (see above). ELT publishers ‘must change or die’, reported the eltjam blog[6], and it is a message that is frequently repeated elsewhere. The move towards adaptive learning is seen increasingly often as one of the necessary adaptations for both these sectors.

The problems facing universities in countries like the U.K. are acute. Basically, as the introduction to ‘An Avalanche is Coming’ puts it, ‘the traditional university is being unbundled’. There are a number of reasons for this including the rising cost of higher education provision, greater global competition for the same students, funding squeezes from central governments, and competition from new educational providers (such as MOOCs). Unsurprisingly, universities (supported by national governments) have turned to technology, especially online course delivery, as an answer to their problems. There are two main reasons for this. Firstly, universities have attempted to reduce operating costs by looking for increases in scale (through mergers, transnational partnerships, international branch campuses and so on). Mega-universities are growing, and there are thirty-three in Asia alone (Selwyn Education in a Digital World New York: Routledge 2013, p.6). Universities like the Turkish Anadolu University, with over one million students, are no longer exceptional in terms of scale. In this world, online educational provision is a key element. Secondly, and not to put too fine a point on it, online instruction is cheaper (Spring, Education Networks 2012, p.2).

All other things being equal, why would any language department of an institute of higher education not choose an online environment with an adaptive element? Adaptive learning, for the time being at any rate, may be seen as ‘the much needed key to the “Iron Triangle” that poses a conundrum to HE providers; cost, access and quality. Any attempt to improve any one of those conditions impacts negatively on the others. If you want to increase access to a course you run the risk of escalating costs and jeopardising quality, and so on.’[7]

Meanwhile, ELT publishers have been hit by rampant pirating of their materials, spiralling development costs of their flagship products and the growth of open educational resources. An excellent blog post by David Wiley[8] explains why adaptive learning services are a heaven-sent opportunity for publishers to modify their business model. ‘While the broad availability of free content and open educational resources have trained internet users to expect content to be free, many people are still willing to pay for services. Adaptive learning systems exploit this willingness by deeply intermingling content and services so that you cannot access one without using the other. Naturally, because an adaptive learning service is comprised of content plus adaptive services, it will be more expensive than static content used to be. And because it is a service, you cannot simply purchase it like you used to buy a textbook. An adaptive learning service is something you subscribe to, like Netflix. […] In short, why is it in a content company’s interest to enable you to own anything? Put simply, it is not. When you own a copy, the publisher completely loses control over it. When you subscribe to content through a digital service (like an adaptive learning service), the publisher achieves complete and perfect control over you and your use of their content.’

Although the initial development costs of building a suitable learning platform with adaptive capabilities are high, publishers will subsequently be able to produce and modify content (i.e. learning materials) much more efficiently. Since content will be mashed up and delivered in many different ways, author royalties will be cut or eliminated. Production and distribution costs will be much lower, and sales and marketing efforts can be directed more efficiently towards the most significant customers. The days of ELT sales reps trying unsuccessfully to get an interview with the director of studies of a small language school or university department are becoming a thing of the past. As with the universities, scale will be everything.


[2]http://www.eltjam.com/adaptive-learning/ (last accessed 13 January 2014)

[3] http://www.knewton.com/ (last accessed 13 January 2014)

[4] MIT Technology Review, November 26, 2012 http://www.technologyreview.com/news/506366/questions-surround-software-that-adapts-to-students/ (last accessed 13 January 2014)

[7] Tim Gifford Taking it Personally: Adaptive Learning July 9, 2013 http://www.eltjam.com/adaptive-learning/ (last accessed January 13, 2014)

[8] David Wiley, Buying our Way into Bondage: the risks of adaptive learning services March 20, 2013 http://opencontent.org/blog/archives/2754 (last accessed January 13, 2014)

There is a good chance that many readers will have only the haziest idea of what adaptive learning is. There is a much better chance that most English language teachers, especially those working in post-secondary education, will feel the impact of adaptive learning on their professional lives in the next few years. According to Time magazine, it is a ‘hot concept, embraced by education reformers’, which is ‘poised to reshape education’[1]. According to the educational news website, Education Dive, there is ‘no hotter segment in ed tech right now’[2]. All the major ELT publishers are moving away from traditional printed coursebooks towards the digital delivery of courses that will contain adaptive learning elements. Their investments in the technology are colossal. Universities in many countries, especially the US, are moving in the same direction, again with huge investments. National and regional governments, intergovernmental organisations (such as UNESCO, the OECD, the EU and the World Bank), big business and hugely influential private foundations (such as the Bill and Melinda Gates Foundation) are all lined up in support of the moves towards the digital delivery of education, which (1) will inevitably involve elements of adaptive learning, and (2) will inevitably impact massively on the world of English language teaching.

The next 13 posts will, together, form a guide to adaptive learning in ELT.

1 Introduction

2 Simple models of adaptive learning

3 Gamification

4 Big data, analytics and adaptive learning

5 Platforms and more complex adaptive learning systems

6 The selling points of adaptive learning

7 Ten predictions for the future

8 Theory, research and practice

9 Neo-liberalism and solutionism

10 Learn more