Posts Tagged ‘Oxford University Press’

Around 25 years ago, when I worked at International House London, I used to teach a course called ‘Current Trends in ELT’. I no longer have records of the time so I can’t be 100% sure what was included in the course, but task-based learning, the ‘Lexical Approach’, the use of corpora, English as a Lingua Franca, learner autonomy / centredness, reflective practice and technology (CALL and CD-ROMs) were all probably part of it. I see that IH London still offers this course (next available course in January 2021) and I am struck by how similar the list of contents is. Only ‘emerging language’, CLIL, DOGME and motivation are clearly different from the menu of 25 years ago.

The term ‘current trends’ has always been a good hook to sell a product. Each year, any number of ELT conferences chooses it as their theme. Coursebooks, like ‘Cutting Edge’ or ‘Innovations’, suggest in their titles something fresh and appealing. And, since 2003, the British Council has used its English Language Teaching Innovation Awards to position itself as forward-thinking and innovative.

You could be forgiven for wondering what is especially innovative about many of the ELTon award-winners, or indeed, why neophilia actually matters at all. The problem, in a relatively limited world like language teaching, is that only so much innovation is either possible or desirable.

A year after the ELTons appeared, Adrian Underhill wrote an article about ‘Trends in English Language Teaching Today’. Almost ten years after I was teaching ‘current trends’, Adrian’s list included the use of corpora, English as a Lingua Franca, reflective practice and learner-centredness. His main guess was that practitioners would be working more with ‘the fuzzy, the unclear, the unfinished’. He hadn’t reckoned on the influence of the CEFR, Pearson’s Global Scale of English and our current obsession with measuring everything!

Jump just over ten years and Chia Suan Chong offered a listicle of ‘Ten innovations that have changed English language teaching’ for the British Council. Most of these were technological developments (platforms, online CPD, mobile learning) but a significant newcomer to the list was ‘soft skills’ (especially critical thinking).

Zooming forward nearer to the present, Chia then offered her list of ‘Ten trends and innovations in English language teaching for 2018’ in another post for the British Council. English as a Lingua Franca was still there, but gone were task-based learning and the ‘Lexical Approach’, corpora, learner-centredness and reflective practice. In their place came SpLNs, multi-literacies, inquiry-based learning and, above all, more about technology – platforms, mobile and blended learning, gamification.

I decided to explore current ‘current trends’ by taking a look at the last twelve months of blog posts from the four biggest UK ELT publishers. Posts such as these are interesting in two ways: (1) they are an attempt to capture what is perceived as ‘new’ and therefore more likely to attract clicks, and (2) they are also an attempt to set an agenda – they reflect what these commercial organisations would like us to be talking and thinking about. The posts reflect reasonably well the sorts of topics that are chosen for webinars, whether directly hosted or sponsored.

The most immediate and unsurprising observation is that technology is ubiquitous. No longer one among a number of topics, technology now informs (almost) all other topics. Before I draw a few conclusions, here are more detailed notes.

Pearson English blog

Along with other publishers, Pearson were keen to show how supportive of teachers they were, and the months following the appearance of the pandemic saw a greater number than normal of blog posts that did not focus on particular Pearson products. Over the last twelve months as a whole, Pearson made strenuous efforts to draw attention to their Global Scale of English and the Pearson Test of English. Assessment of one kind or another was never far away. But the other big themes of the last twelve months have been ‘soft / 21st century skills’ (creativity, critical thinking, collaboration, leadership, etc.), and aspects of social and emotional learning (especially engagement / motivation, anxiety and mindfulness). Three other topics also featured more than once: mediation, personalization and SpLN (dyslexia).

OUP ELT Global blog

The OUP blog has, on the whole, longer, rather more informative posts than Pearson’s. They also tend to be less obviously product-oriented, and fewer are written by in-house marketing people. The main message that comes across is the putative importance of ‘soft / 21st century skills’, which Oxford likes to call ‘global skills’ (along with the assessment of these skills). One post manages to pack three buzzwords into one title: ‘Global Skills – Create Empowered 21st Century Learners’. As with Pearson, ‘engagement / engaging’ is probably the most over-used word in the last twelve months. In the social and emotional area, OUP focuses on teacher well-being, rather than mindfulness (although, of course, mindfulness is a path to this well-being). There is also an interest in inquiry-based learning, literacies (digital and assessment), formative assessment and blended learning.

Macmillan English blog

The Macmillan English ‘Advancing Learning’ blog is a much less corporate beast than the Pearson and OUP blogs. There have been relatively few posts in the last twelve months, and no clear message emerges. The last year has seen posts on the Image Conference, preparing for IELTS, student retention, extensive reading, ELF pronunciation, drama, mindfulness, Zoom, EMI, and collaboration skills.

CUP World of Better Learning blog

The CUP blog, like Macmillan’s, is an eclectic affair, with no clearly discernible messages beyond supporting teachers with tips and tools to deal with the shift to online teaching. Motivation and engagement are fairly prominent (with Sarah Mercer contributing both here and at OUP). Well-being (and the inevitable nod to mindfulness) gets a look-in. Other topics include SpLNs, video and ELF pronunciation (with Laura Patsko contributing both here and at the Macmillan site).

Macro trends

My survey has certainly not been ‘scientific’, but I think it allows us to note a few macro-trends. Here are my thoughts:

  • Measurement of language and skills (both learning and teaching skills) has become central to many of our current concerns.
  • We are now much less interested in issues which are unique to language learning and teaching (e.g. task-based learning, the ‘Lexical Approach’, corpora) than we used to be.
  • Current concerns reflect much more closely the major concerns of general education (measurement, 21st century skills, social-emotional learning) than they used to. It is no coincidence that these reflect the priorities of those who shape global educational policy (OECD, World Bank, etc.).
  • 25 years ago, current trends were more like zones of interest. They were areas to explore, research and critique further. As such, we might think of them as areas of exploratory practice (‘Exploratory Practice’ itself was a ‘current trend’ in the mid 1990s). Current ‘current trends’ are much more enshrined. They are things to be implemented, and exploration of them concerns the ‘how’, not the ‘whether’.

The most widely-used and popular tool for language learners is the bilingual dictionary (Levy & Steel, 2015), and the first of its kind appeared about 4,000 years ago (2,000 years earlier than the first monolingual dictionaries), offering wordlists in Sumerian and Akkadian (Wheeler, 2013: 9-11). Technology has come a long way since the clay tablets of the Bronze Age. Good online dictionaries now contain substantially more information (in particular audio recordings) than their print equivalents of a few decades ago. In addition, they are usually quicker and easier to use, more popular, and lead to retention rates that are comparable to, or better than, those achieved with print (Töpel, 2014). The future of dictionaries is likely to be digital, and paper dictionaries may well disappear before very long (Granger, 2012: 2).

English language learners are better served than learners of other languages, and the number of free, online bilingual dictionaries is now enormous. Speakers of less widely-spoken languages may still struggle to find a good quality service, but speakers of, for example, Polish (with approximately 40 million speakers, and a ranking of #33 in the list of the world’s most widely spoken languages) will find over twenty free, online dictionaries to choose from (Lew & Szarowska, 2017). Speakers of languages that are more widely spoken (Chinese, Spanish or Portuguese, for example) will usually find an even greater range. The choice can be bewildering and neither search engine results nor rankings from app stores can be relied on to suggest the product of the highest quality.

Language teachers are not always as enthusiastic about bilingual dictionaries as their learners. Folse (2004: 114-120) reports on an informal survey of English teachers which indicated that 11% did not allow any dictionaries in class at all, 37% allowed monolingual dictionaries and only 5% allowed bilingual dictionaries. Other researchers (e.g. Boonmoh & Nesi, 2008) have found a similar situation, with teachers overwhelmingly recommending the use of a monolingual learner’s dictionary: almost all of their students bought one, but the great majority hardly ever used it, preferring instead a digital bilingual version.

Teachers’ preferences for monolingual dictionaries are usually motivated in part by a fear that their students will become too reliant on translation. Whilst this concern remains widespread, much recent research suggests that this fear is misguided (Nation, 2013: 424) and that monolingual dictionaries do not actually lead to greater learning gains than their bilingual counterparts. This is, in part, due to the fact that learners typically use these dictionaries in very limited ways – to see if a word exists, check spelling or look up meaning (Harvey & Yuill, 1997). If they made fuller use of the information (about frequency, collocations, syntactic patterns, etc.) on offer, it is likely that learning gains would be greater: ‘it is accessing multiplicity of information that is likely to enhance retention’ (Laufer & Hill, 2000: 77). Without training, however, this is rarely the case. With lower-level learners, a monolingual learner’s dictionary (even one designed for Elementary level students) can be a frustrating experience, because until they have reached a vocabulary size of around 2,000 – 3,000 words, they will struggle to understand the definitions (Webb & Nation, 2017: 119).

The second reason for teachers’ preference for monolingual dictionaries is that the quality of many bilingual dictionaries is undoubtedly very poor, compared to monolingual learner’s dictionaries such as those produced by Oxford University Press, Cambridge University Press, Longman Pearson, Collins Cobuild, Merriam-Webster and Macmillan, among others. The situation has changed, however, with the rapid growth of bilingualized dictionaries. These contain all the features of a monolingual learner’s dictionary, but also include translations into the learner’s own language. Because of the wealth of information provided by a good bilingualized dictionary, researchers (e.g. Laufer & Hadar, 1997; Chen, 2011) generally consider them preferable to monolingual or normal bilingual dictionaries. They are also popular with learners. Good bilingualized online dictionaries (such as the Oxford Advanced Learner’s English-Chinese Dictionary) are not always free, but many are, and with some language pairings free software can be of a higher quality than services that incur a subscription charge.

If a good bilingualized dictionary is available, there is no longer any compelling reason to use a monolingual learner’s dictionary, unless it contains features which cannot be found elsewhere. In order to compete in a crowded marketplace, many of the established monolingual learner’s dictionaries do precisely that. Examples of good, free online dictionaries include:

Students need help in selecting a dictionary that is right for them. Without this, many end up using as a dictionary a tool such as Google Translate, which, for all its value, is of very limited use as a dictionary. They need to understand that the most appropriate dictionary will depend on what they want to use it for (receptive, reading purposes or productive, writing purposes). Teachers can help in this decision-making process by addressing the issue in class (see the activity below).

In addition to the problem of selecting an appropriate dictionary, it appears that many learners have inadequate dictionary skills (Niitemaa & Pietilä, 2018). In one experiment (Tono, 2011), only one third of the vocabulary searches in a dictionary that were carried out by learners resulted in success. The reasons for failure include focussing on only the first meaning (or translation) of a word that is provided, difficulty in finding the relevant information in long word entries, an inability to find the lemma that is needed, and spelling errors (when they had to type in the word) (Töpel, 2014). As with monolingual dictionaries, learners often only check the meaning of a word in a bilingual dictionary and fail to explore the wider range of information (e.g. collocation, grammatical patterns, example sentences, synonyms) that is available (Laufer & Kimmel, 1997; Laufer & Hill, 2000; Chen, 2010). This information is both useful and may lead to improved retention.

Most learners receive no training in dictionary skills, but would clearly benefit from it. Nation (2013: 333) suggests that at least four or five hours, spread out over a few weeks, would be appropriate. He suggests (ibid: 419 – 421) that training should encourage learners, first, to look closely at the context in which an unknown word is encountered (in order to identify the part of speech, the lemma that needs to be looked up, its possible meaning and to decide whether it is worth looking up at all), then to help learners in finding the relevant entry or sub-entry (by providing information about common dictionary abbreviations (e.g. for parts of speech, style and register)), and, finally, to check this information against the original context.

Two good resource books full of practical activities for dictionary training are available: ‘Dictionary Activities’ by Cindy Leaney (Cambridge: Cambridge University Press, 2007) and ‘Dictionaries’ by Jon Wright (Oxford: Oxford University Press, 1998). Many of the good monolingual dictionaries offer activity guides to promote effective dictionary use and I have suggested a few activities here.

Activity: Understanding a dictionary

Outline: Students explore the use of different symbols in good online dictionaries.

Level: All levels, but not appropriate for very young learners. The activity ‘Choosing a dictionary’ is a good follow-up to this activity.

1 Distribute the worksheet and ask students to follow the instructions.


2 Check the answers.


Activity: Choosing a dictionary

Outline: Students explore and evaluate the features of different free, online bilingual dictionaries.

Level: All levels, but not appropriate for very young learners. The text in stage 3 is appropriate for use with levels A2 and B1. For some groups of learners, you may want to adapt (or even translate) the list of features. It may be useful to do the activity ‘Understanding a dictionary’ before this activity.

1 Ask the class which free, online bilingual dictionaries they like to use. Write some of their suggestions on the board.

2 Distribute the list of features. Ask students to work individually and tick the boxes that are important for them. Ask students to work with a partner to compare their answers.


3 Give students a list of free, online bilingual (English and the students’ own language) dictionaries. You can use suggestions from the list below, add the suggestions that your students made in stage 1, or add your own ideas. (For many language pairings, better resources are available than those in the list below.) Give the students the following short text and ask them to use two of these dictionaries to look up the underlined words. Ask them to decide which dictionary they found most useful and / or easiest to use.



4 Conduct feedback with the whole class.

Activity: Getting more out of a dictionary

Outline: Students use a dictionary to help them to correct a text.

Level: Levels B1 and B2, but not appropriate for very young learners. For higher levels, a more complex text (with less obvious errors) would be appropriate.

1 Distribute the worksheet below and ask students to follow the instructions.


2 Check answers with the whole class. Ask how easy it was to find the information in the dictionary that they were using.


When you are reading, you probably only need a dictionary when you don’t know the meaning of a word and you want to look it up. For this, a simple bilingual dictionary is good enough. But when you are writing or editing your writing, you will need something that gives you more information about a word: grammatical patterns, collocations (the words that usually go with other words), how formal the word is, and so on. For this, you will need a better dictionary. Many of the better dictionaries are monolingual (see the box), but there are also some good bilingual ones.

Use one (or more) of the online dictionaries in the box (or a good bilingual dictionary) and make corrections to this text. There are eleven mistakes (they have been underlined) in total.


Boonmoh, A. & Nesi, H. 2008. ‘A survey of dictionary use by Thai university staff and students with special reference to pocket electronic dictionaries’ Horizontes de Linguística Aplicada, 6(2): 79-90

Chen, Y. 2011. ‘Studies on Bilingualized Dictionaries: The User Perspective’. International Journal of Lexicography, 24 (2): 161–197

Folse, K. 2004. Vocabulary Myths. Ann Arbor: University of Michigan Press

Granger, S. 2012. Electronic Lexicography. Oxford: Oxford University Press

Harvey, K. & Yuill, D. 1997. ‘A study of the use of a monolingual pedagogical dictionary by learners of English engaged in writing’ Applied Linguistics, 51(1): 253-78

Laufer, B. & Hadar, L. 1997. ‘Assessing the effectiveness of monolingual, bilingual and ‘bilingualized’ dictionaries in the comprehension and production of new words’. Modern Language Journal, 81(2): 189-96

Laufer, B. & M. Hill 2000. ‘What lexical information do L2 learners select in a CALL dictionary and how does it affect word retention?’ Language Learning & Technology 3 (2): 58–76

Laufer, B. & Kimmel, M. 1997. ‘Bilingualised dictionaries: How learners really use them’, System, 25(3): 361-369

Leaney, C. 2007. Dictionary Activities. Cambridge: Cambridge University Press

Levy, M. and Steel, C. 2015. ‘Language learner perspectives on the functionality and use of electronic language dictionaries’. ReCALL, 27(2): 177–196

Lew, R. & Szarowska, A. 2017. ‘Evaluating online bilingual dictionaries: The case of popular free English-Polish dictionaries’ ReCALL 29(2): 138–159

Nation, I.S.P. 2013. Learning Vocabulary in Another Language 2nd edition. Cambridge: Cambridge University Press

Niitemaa, M.-L. & Pietilä, P. 2018. ‘Vocabulary Skills and Online Dictionaries: A Study on EFL Learners’ Receptive Vocabulary Knowledge and Success in Searching Electronic Sources for Information’, Journal of Language Teaching and Research, 9 (3): 453-462

Tono, Y. 2011. ‘Application of eye-tracking in EFL learners’ dictionary look-up process research’, International Journal of Lexicography 24 (1): 124–153

Töpel, A. 2014. ‘Review of research into the use of electronic dictionaries’ in Müller-Spitzer, C. (Ed.) 2014. Using Online Dictionaries. Berlin: De Gruyter, pp. 13-54

Webb, S. & Nation, P. 2017. How Vocabulary is Learned. Oxford: Oxford University Press

Wheeler, G. 2013. Language Teaching through the Ages. New York: Routledge

Wright, J. 1998. Dictionaries. Oxford: Oxford University Press

I was intrigued to learn earlier this year that Oxford University Press had launched a new online test of English language proficiency, called the Oxford Test of English (OTE). At the conference where I first heard about it, I was struck by the fact that the presentation of the OUP sponsored plenary speaker was entitled ‘The Power of Assessment’ and dealt with formative assessment / assessment for learning. Oxford clearly want to position themselves as serious competitors to Pearson and Cambridge English in the testing business.

The brochure for the exam kicks off with a gem of a marketing slogan, ‘Smart. Smarter. SmarTest’ (geddit?), and the next few pages give us all the key information.

Faster and more flexible

‘Traditional language proficiency tests’ is presumably intended to refer to the main competition (Pearson and Cambridge English). Cambridge First takes, in total, 3½ hours; the Pearson Test of English Academic takes 3 hours. The OTE takes, in total, 2 hours and 5 minutes. It can be taken, in theory, on any day of the year, although this depends on the individual Approved Test Centres, and, again in theory, it can be booked as little as 14 days in advance. Results should take only two weeks to arrive. Further flexibility is offered in the way that candidates can pick ’n’ choose which of the four skills they want to be tested in – just one or all four – although, as an incentive to go the whole hog, they will only get a ‘Certificate of Proficiency’ if they do all four.

A further incentive to do all four skills at the same time can be found in the price structure. One centre in Spain is currently offering the test for one single skill at €41.50, but do the whole lot, and it will only set you back €89. For a high-stakes test, this is cheap. In the UK right now, both Cambridge First and Pearson Academic cost in the region of £150, and IELTS a bit more than that. So, faster, more flexible and cheaper … Oxford means business.

Individual experience

The ‘individual experience’ on the next page of the brochure is pure marketing guff. This is, after all, a high-stakes, standardised test. It may be true that ‘the Speaking and Writing modules provide randomly generated tasks, making the overall test different each time’, but there can only be a certain number of permutations. What’s more, in ‘traditional tests’, like Cambridge First, where there is a live examiner or two, an individualised experience is unavoidable.

More interesting to me is the reference to adaptive technology. According to the brochure, ‘The Listening and Reading modules are adaptive, which means the test difficulty adjusts in response to your answers, quickly finding the right level for each test taker. This means that the questions are at just the right level of challenge, making the test shorter and less stressful than traditional proficiency tests’.

My curiosity piqued, I decided to look more closely at the Reading module. I found one practice test online which is the same as the demo that is available at the OTE website. Unfortunately, this example is not adaptive: it is at B1 level. The actual test records scores between 51 and 140, corresponding to levels A2, B1 and B2.

Test scores

The tasks in the Reading module are familiar from coursebooks and other exams: multiple choice, multiple matching and gapped texts.

Reading tasks

According to the exam specifications, these tasks are designed to measure the following skills:

  • Reading to identify main message, purpose, detail
  • Expeditious reading to identify specific information, opinion and attitude
  • Reading to identify text structure, organizational features of a text
  • Reading to identify attitude / opinion, purpose, reference, the meanings of words in context, global meaning

The ability to perform these skills depends, ultimately, on the candidate’s knowledge of vocabulary and grammar, as can be seen in the examples below.

Task 1 / Task 2

How exactly, I wonder, does the test difficulty adjust in response to the candidate’s answers? The algorithm that is used depends on measures of the difficulty of the test items. If these items are to be made harder or easier, the only significant way that I can see of doing this is by making the key vocabulary lower- or higher-frequency. This, in turn, is only possible if vocabulary and grammar have been tagged as being at a particular level. The best-known tools for doing this have been developed by Pearson (with the GSE Teacher Toolkit) and Cambridge English Profile. To the best of my knowledge, Oxford does not yet have a tool of this kind (at least, none that is publicly available). However, the data that OUP will accumulate from OTE scripts and recordings will be invaluable in building a database which their lexicographers can use in developing such a tool.
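To make the general idea concrete: adaptive tests of this kind typically work by estimating the candidate’s ability after each response and then serving the unasked item whose calibrated difficulty is closest to that estimate. Oxford has not published how the OTE does this, so the sketch below is purely illustrative – a minimal, hypothetical Rasch-style (one-parameter) loop in Python, with an invented item bank; none of the names or numbers come from the OTE.

```python
import math

def rasch_p(ability, difficulty):
    """Probability of a correct answer under the Rasch (1PL) model."""
    return 1.0 / (1.0 + math.exp(difficulty - ability))

def update_ability(ability, difficulty, correct, step=0.5):
    """Nudge the ability estimate towards the observed response."""
    return ability + step * ((1.0 if correct else 0.0) - rasch_p(ability, difficulty))

def next_item(item_bank, ability, asked):
    """Pick the unasked item whose difficulty is closest to the current estimate."""
    candidates = [i for i in item_bank if i not in asked]
    return min(candidates, key=lambda i: abs(item_bank[i] - ability))

# Hypothetical item bank: item id -> difficulty on a logit scale
bank = {"q1": -1.5, "q2": -0.5, "q3": 0.0, "q4": 0.8, "q5": 1.6}

ability, asked = 0.0, set()
for _ in range(3):                  # a three-item mini-test
    item = next_item(bank, ability, asked)
    asked.add(item)
    correct = True                  # stand-in for the candidate's actual answer
    ability = update_ability(ability, bank[item], correct)
```

The point of the sketch is simply that every correct answer pushes the estimate up and pulls in harder items (and vice versa), which is what allows an adaptive test to converge on a level with fewer items than a fixed-form test – and also why the calibration of item difficulty (the frequency-tagging discussed above) matters so much.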

Even when a data-driven (and numerically precise) tool is available for modifying the difficulty of test items, I still find it hard to understand how the adaptivity will impact on the length or the stress of the reading test. The Reading module is only 35 minutes long and contains only 22 items. Anything that is significantly shorter must surely impact on the reliability of the test.

My conclusion from this is that the adaptive element of the Reading and Listening modules in the OTE is less important to the test itself than it is to building a sophisticated database (not dissimilar to the GSE Teacher Toolkit or Cambridge English Profile). The value of this will be found, in due course, in calibrating all OUP materials. The OTE has already been aligned to the Oxford Online Placement Test (OOPT) and, presumably, coursebooks will soon follow. This, in turn, will facilitate a vertically integrated business model, like Pearson and CUP, where everything from placement test, to coursework, to formative assessment, to final proficiency testing can be on offer.