Climbing out of the Trough of Disillusionment: Making Sense of the Educational Data Hype Cycle

In 2014, I wrote a blog post in which I claimed (along with others) that analytics had reached a ‘peak of inflated expectations.’ Is the use of analytics in higher education now entering what Gartner would call the ‘trough of disillusionment’?

In 2011, Long and Siemens famously argued that big data and analytics represented “the most dramatic factor shaping the future of higher education.”  Since then, successive editions of the annual NMC Horizon Report have pointed to 2016 as the year when we would see widespread adoption of learning analytics in higher education.  But as 2016 comes to a close, widespread adoption still lies on the distant horizon.  Colleges and universities remain very much in their infancy when it comes to the effective use of educational data.  In fact, poor implementations and uncertain ROI have led to what Kenneth C. Green has termed ‘angst about analytics.’

As a methodology, the Gartner Hype Cycle is not without its critics.  Audrey Watters, for example, takes issue with the fact that it is proprietary and so ‘hidden from scrutiny’; any proprietary methodology is difficult to take seriously as a methodology.  It is also improperly named, since a methodology that assumes a particular outcome (i.e. that all technology adoption trends follow the same pattern) is unworthy of the term.  But taken as a heuristic, it is a helpful way of visualizing analytics adoption in higher education to date, and it offers some useful language for describing the state of the field.

According to Gartner, a hype cycle consists of five stages: (1) technology trigger, (2) peak of inflated expectations, (3) trough of disillusionment, (4) slope of enlightenment, and (5) plateau of productivity.  What I would like to argue is that we are indeed in the trough of disillusionment, but that we are in its late stages.   We are learning a great deal from past mistakes.  We have rapidly increased our understanding of data itself, as well as of the cultural changes that any analytics project requires.  The shine has come off the apple, and that’s a good thing.


We know that proactive advising works.  Despite this, it is estimated that only 34% of public universities in the US require students to see an advisor, and only 2% of institutions advise proactively on the basis of alerts.  At institutions like Georgia State University, Integrated Planning and Advising Services (IPAS) and early alert systems are enabling advisors to make a significant difference in the lives of students.  But what is rarely discussed is the heavy burden that these technologies place on schools to radically change their processes and models. The shift from traditional to proactive advising is no small task, and it is complicated by the fact that the very IPAS systems that make proactive advising possible also strain otherwise immature advising programs. In other words, the state of the educational technology market today is such that institutions can’t simply change their advising approach; they must also adapt to complex and inflexible software.

Remember that only 2% of schools are using this technology today.  The schools seeing the biggest impact from these technologies have been working with them since their infancy, and have co-adapted.  They have matured together. In many cases, introducing these technologies as complete solutions to schools with traditional advising models is like pouring new wine into old wineskins.  The excitement and rhetoric around the power of predictive analytics, and the high-profile success stories about a handful of institutions that are told over and over again, obscure the fact that (1) success stories are not as common as one would think, and (2) the success of institutions using ‘predictive analytics’ is not reducible to their IT investments. There continues to be a widespread misconception that scaling high-impact practices around the use of data and analytics is reducible to buying a sophisticated (and costly) product.  This is a misconception that has been understandably promoted both by the educational media and by vendors.  But in recent years it has produced a set of unrealistic expectations that are beginning to crumble.  As more and more institutions invest in analytics technology only to be disappointed with the results, ‘horror’ stories have begun to circulate and a general cloud of suspicion has descended upon the field of analytics.


As Kenneth C. Green observes, a major cause of analytics angst is that the hype around analytics and its impact has meant that universities’ reach has exceeded their grasp.  The hype has put a tremendous amount of pressure on schools not only to invest, but also to ‘catch up’ with the rest of higher education.  Higher education news is saturated with stories about how institutions are effectively making use of data to improve graduation and retention rates.  The recent flood of analytics platforms and tools has also contributed to the perception that ‘everyone’s doing it but me.’  Universities feel as though they need to catch up with the rest of the industry and are scrambling to invest in complex systems despite not having the people and processes in place to see those investments pay off.  But the universities that are making the most effective use of data are still not in the majority.  More than this, their success has come not as a result of large-scale investments in analytics, but as a result of systematically and strategically adding initiatives and growing institutional capability over time.

Take Georgia State University as an example: news coverage of the institution’s success over the past year and a half gives the impression that the university is an overnight success, and that its success is reducible to recent large-scale investments.  In a recent article for Forbes, Gridiron, Moses, and Seavey asserted that “Georgia State’s success was rooted in the use of predictive analytics and the application of big data to identify and advise students at risk of going off track.”  But when the story is told by Dr. Tim Renick, Vice Provost and Vice President of Enrollment Management and Student Success at Georgia State University, he describes the university’s increased graduation rate as the result of compounding the impact of a large number of relatively small investments over a number of years. As Kurtzweil and Wu note,

“no single initiative is responsible for the dramatic gains at GSU; the university’s improvement represents the accumulated impact of a dozen or more relatively modest programs. As it turns out, the recipe for GSU’s success is not a particular solution, but rather a particular approach to problem-solving.”

The key to escaping the trough of disillusionment is to escape the hype altogether.  Tune out the noise from the media and resist the inclination to compare your institution to far more mature ones like Georgia State University.  Instead of looking to the Georgia State of today, look to the GSU of 2011, when it first moved into data-informed advising, and pay special attention not to technology investments, but to the painful process of cultural change.

My advice to institutions looking to make a move into predictive analytics?

  1. Start small — Focus on the specific problems facing your students. Focus intensively rather than extensively, responding to internal pressures generated by student need and barriers to institutional performance rather than to competitive pressures. There’s a lesson here that can be learned from Zeno of Citium, founder of the Stoic school of Greek philosophy.  Zeno famously described two archers: the first, obsessed with hitting the target, fails to focus on the art of archery and so is prone to miss the mark.  The other focuses on the art of archery itself, cultivating the excellences of a good archer as an end in itself, and hits the mark every time as a matter of course.
  2. Collaborate — There are a great many lessons to be learned from colleges and universities that have gone before, but the best source of information is rarely the media or vendors. The best place to go for information about how schools are actually scaling innovation is the schools themselves.  Thanks to efforts from Gates, Lumina, Kresge, and others, the conversation in higher education is shifting to foster the kind of collaboration that traditional approaches to university rankings have previously disincentivized.  I feel strongly that the reason for both the hype and our current disillusionment is a set of significant gaps between research, technology, and practice, and that community is the key to addressing these gaps.  It is this belief that led me to found the Southeast Educational Data Symposium, and that drives me now as I work to organize the Blackboard Analytics Symposium.

The above is an expansion upon some of the ideas that I expressed along with Mike Sharkey in a post for the Next Generation Learning Challenges blog: “Has Analytics Fallen Into the Trough of Disillusionment?”

Also published on Medium.