(This post won’t make a lot of sense unless you read the previous two – Researching research: part 1 and part 2!)
The work of Jayaprakash et al was significantly informed and inspired by the work done at Purdue University. In the words of these authors, they even ‘relied on [the] work at Purdue with Course Signals’ for parts of the design of their research. At the time they were doing their research, they didn’t know that the Purdue studies were fundamentally flawed. This was, however, common knowledge (since September 2013) before their article (‘Early Alert of Academically At-Risk Students’) was published. This raises the interesting question of why the authors (and the journal in which they published) didn’t pull the article when they could still have done so. I can’t answer that question, but I can suggest some possible reasons. First, though, a little background on the Purdue research.
The Purdue research is important, more than important, because it was the first significant piece of research to demonstrate the efficacy of academic analytics. Except that, in all probability, it doesn’t! Michael Caulfield, director of blended and networked learning at Washington State University at Vancouver, and Alfred Essa, McGraw-Hill Education’s vice-president of research and development and analytics, took a closer look at the data. What they found was that the results were probably due to selection bias rather than a real effect. In other words, as Carl Straumsheim summarized in Inside Higher Ed in November of last year, there was ‘no causal connection between students who use [Course Signals] and their tendency to stick with their studies’. The Times Higher Education and the e-Literate blog contacted Purdue, but, to date, there has been no serious response to the criticism. The research is still on Purdue’s website.
The Purdue research article, ‘Course Signals at Purdue: Using Learning Analytics to Increase Student Success’ by Kimberley Arnold and Matt Pistilli, was first published as part of the proceedings of the Learning Analytics and Knowledge (LAK) conference in May 2012. The LAK conference is organised by the Society for Learning Analytics Research (SoLAR), in partnership with Purdue. SoLAR, you may remember, is the organisation which published the new journal in which Jayaprakash et al’s article appeared. Pistilli happens to be an associate editor of the journal. Jayaprakash et al also presented at the LAK ’12 conference. Small world.
The Purdue research was further publicized by Pistilli and Arnold in the Educause Review. Their research had been funded by the Gates Foundation (a grant of $1.2 million in November 2011). Educause, in turn, is also funded by the Gates Foundation (a grant of $9 million in November 2011). The research of Jayaprakash et al was also funded by Educause, which stipulated that ‘effective techniques to improve student retention be investigated and demonstrated’ (my emphasis). Given the terms of their grant, we can perhaps understand why they felt the need to claim they had demonstrated something.
What exactly is Educause, which plays such an important role in all of this? According to their own website, it is a non-profit association whose mission is to advance higher education through the use of information technology. However, it is rather more than that. It is also a lobbying and marketing umbrella for edtech. The following screenshot from their website makes this abundantly clear.
If you’ll bear with me, I’d like to describe one more connection between the various players I’ve been talking about. Purdue’s Course Signals is marketed by a company called Ellucian. Ellucian’s client list includes both Educause and the Gates Foundation. A former Senior Vice President of Ellucian, Anne K Keehn, is currently ‘Senior Fellow – Technology and Innovation, Education, Post-Secondary Success’ at the Gates Foundation – presumably the sort of person to whom you’d have to turn if you wanted funding from the Gates Foundation. Small world.
Personal, academic and commercial networks are intricately intertwined in the high-stakes world of edtech. In such a world (not so very different from the pharmaceutical industry), independent research is practically impossible. The pressure to publish positive research results must be extreme. The temptation to draw conclusions of the kind that your paymasters are looking for must be high. The edtech juggernaut must keep rolling on.
While the big money will continue to go, for the time being, into further attempts to prove that big data is the future of education, there are still some people who are interested in alternatives. Coincidentally (?), a recent survey has been carried out at Purdue which looks into what students think about their college experience, about what is meaningful to them. Guess what? It doesn’t have much to do with technology.