Traversing the Trough of Disillusionment: Where do Analytics Go from Here?

Co-Authored with Mike Sharkey

Last month we argued that analytics in higher education has entered the trough of disillusionment. We posited that this is actually a good thing for higher education, because it brings attention to the hype itself and means we are making progress toward true productivity and student success. We need to learn how to spot the hype before we can move beyond it and realize the true potential of educational data and learning analytics.

It is our hope that the ‘analytics angst’ that has accompanied increased data literacy will put pressure on vendors to reduce hyperbole in their marketing materials and encourage institutions to reset their expectations. A more realistic view of educational data will result in greater adoption, more successful implementations, and results that move the needle by positively impacting student success at scale.

READ FULL STORY HERE >> http://er.educause.edu/blogs/2016/12/traversing-the-trough-of-disillusionment-where-do-analytics-go-from-here

Has Analytics Fallen Into the Trough of Disillusionment?

Co-Authored with Mike Sharkey

In direct contradiction to Betteridge’s Law, we believe the answer is yes. Analytics in higher education is in the trough of disillusionment.

The trough of disillusionment refers to a specific stage of Gartner’s Hype Cycle. It is the moment when, after a rapid build-up leading to a peak of inflated expectations, a technology’s failure to achieve all that was hoped for results in disillusionment. Those who might benefit from a tool perceive a gap between the hype and actual results. Some have rightly pointed out that not all technologies follow the hype cycle, but we believe that analytics in higher education has followed this pattern fairly closely.

READ FULL STORY >> http://er.educause.edu/blogs/2016/11/has-analytics-fallen-into-the-trough-of-disillusionment

Climbing out of the Trough of Disillusionment: Making Sense of the Educational Data Hype Cycle

In 2014, I wrote a blog post in which I claimed (along with others) that analytics had reached a ‘peak of inflated expectations.’ Is the use of analytics in higher education now entering what Gartner would call the ‘trough of disillusionment’?

In 2011, Long and Siemens famously argued that big data and analytics represented “the most dramatic factor shaping the future of higher education.” Since that time, the annual NMC Horizon Report has repeatedly pointed to 2016 as the year when we would see widespread adoption of learning analytics in higher education. But as 2016 comes to a close, widespread adoption still lies on the distant horizon. Colleges and universities remain very much in their infancy when it comes to the effective use of educational data. In fact, poor implementations and uncertain ROI have led to what Kenneth C. Green has termed ‘angst about analytics.’

As a methodology, the Gartner Hype Cycle is not without its critics. Audrey Watters, for example, takes issue with the fact that it is proprietary and so ‘hidden from scrutiny,’ and any proprietary methodology is difficult to take seriously as a methodology. It is also improperly named: a methodology that assumes a particular outcome (i.e. that all technology adoption trends follow the same pattern) is unworthy of the term. But taken as a heuristic, the hype cycle is a useful way of visualizing analytics adoption in higher education to date, and it offers some helpful language for describing the state of the field. Read more

Using Learning Analytics to Promote Success among the Successful

High-performing institutions face a unique set of challenges and opportunities when it comes to investing in the use of educational data to support student success.

Emory University, for example, sees a six-year graduation rate of 90% and an average freshman retention rate of 96% (2014 US News and World Report Rankings). Looking specifically at undergraduate student performance in Emory College in Fall 2013 and Spring 2014, the rate of successful course completion (i.e. students receiving a grade of C or higher in a particular course) was 94%. If the goal of learning analytics is to increase student success, and success is defined strictly in terms of retention through to degree and/or achievement of a grade of C or higher, then Emory University has very little reason to invest in learning analytics.
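To make that definition concrete, here is a minimal sketch (in Python) of how a successful course completion rate of this kind might be computed. The grade values and records are hypothetical, illustrative assumptions rather than anything drawn from Emory’s actual data.

```python
# Minimal sketch: computing a successful course completion rate,
# where "success" means a final grade of C or higher.
# All grade data below is hypothetical and for illustration only.

PASSING_GRADES = {"A", "A-", "B+", "B", "B-", "C+", "C"}

def completion_rate(final_grades):
    """Share of course enrollments ending in a grade of C or higher."""
    if not final_grades:
        return 0.0
    successes = sum(1 for grade in final_grades if grade in PASSING_GRADES)
    return successes / len(final_grades)

# Toy set of final grades for one term (hypothetical).
grades = ["A", "B+", "C", "C-", "B", "D", "A-", "C+", "F", "B-"]
print(f"Successful course completion rate: {completion_rate(grades):.0%}")
```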

Why, then, should a top-tier university with already high levels of student success seek to invest in learning analytics?

1. Learning Analytics is Fashionable

The oft-quoted 2013 NMC Horizon Report cites learning analytics as positioned for widespread adoption within two to three years. Long and Siemens have argued that big data and analytics represent “the most dramatic factor shaping the future of higher education.” There is a lot of pressure on universities to demonstrate investment in emerging educational technologies in order to maintain their position with respect to peer institutions in the academic marketplace.

Fashion is not, in itself, a bad reason to invest in learning analytics. Fashion becomes a bad reason if it is the only reason motivating investment. The pressure to ‘keep up with the Joneses’ is one that private Ed-Tech companies have capitalized on, and it has resulted in a rush to cobble together products and services that trade on hype. In the absence of critical reflection on what analytics is and on the kinds of questions that stakeholders are interested in using data to address, it becomes easy to confuse the practice of learning analytics with its products. Faced with dashboards that promise the moon but are meaningless in light of stakeholders’ concrete questions, it is unsurprising to hear administrators, faculty, and students describe learning analytics as creepy and useless.

Along with ‘big data,’ learning analytics sits at the peak of the hype cycle. There is significant concern on the part of educational researchers, however, that weak motivations and poor products will stifle innovation and see learning analytics fizzle in the trough of disillusionment before achieving the maturity necessary to propel it through to the plateau of productivity.

2. Rethinking Student Success

On the one hand, a student population that is successful by conventional standards makes investment in learning analytics difficult to justify. In the absence of a large at-risk population, learning analytics sounds an awful lot like a solution in search of a problem. On the other hand, a conventionally successful student population affords a university like Emory the ability to rethink success in a way that looks beyond mere credentialization, and toward an active and critical interest in supporting a variety of practices associated with teaching and learning.

Of course, a university must always concern itself with identifying at-risk students and student populations, and with developing interventions that would increase their chances of success by conventional standards. No matter how small the gaps, it is always incumbent upon educational institutions to ensure that as few students as possible fall through them. As the number of at-risk students decreases, however, individuals become increasingly difficult to identify. Here, machine learning and other predictive modeling techniques can go a long way in providing intelligence where the value of anecdotal evidence rapidly breaks down.
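This post does not commit to any particular technique, but as a sketch of what such predictive modeling might look like, the example below fits a simple logistic regression (using scikit-learn) to synthetic, hypothetical features such as weekly LMS logins and an early-assignment average, then ranks students by predicted risk of non-completion. Every feature, coefficient, and cut-off here is an illustrative assumption, not a description of an actual institutional model.

```python
# Hedged sketch: flagging potentially at-risk students with a simple
# logistic regression. All data, features, and thresholds are synthetic
# and illustrative; a real model would use institutional data, careful
# validation, and attention to fairness and privacy.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 500

# Hypothetical features: weekly LMS logins and early-assignment average.
logins = rng.poisson(lam=5, size=n)
early_avg = rng.normal(loc=80, scale=10, size=n)

# Synthetic outcome: probability of completing the course (grade >= C)
# rises with engagement and early performance.
logit = -6.0 + 0.4 * logins + 0.06 * early_avg
completed = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = np.column_stack([logins, early_avg])
X_train, X_test, y_train, y_test = train_test_split(
    X, completed, test_size=0.25, random_state=0
)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Rank held-out students by predicted risk of non-completion and surface
# the ten highest-risk individuals for possible outreach.
risk = 1 - model.predict_proba(X_test)[:, 1]
top_risk = np.argsort(risk)[::-1][:10]
print("Indices of the ten highest-risk students:", top_risk)
```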

To the extent that a university is freed from worries about degree completion, investments in the area of learning analytics can be made in support of specific learning outcomes and more particularized conceptions of student success. In this, I have been influenced by the data wrangler approach advocated by Doug Clow and implemented with great success at The Open University. A data wrangler works closely with educational stakeholders to analyze relevant data in light of their particular values, goals, and practices. Rather than reduce student success to a one-size-fits-all kind of outcome, the data wrangler recognizes that definitions of success are context dependent, and works to mediate data use in order to address specific questions, here and now.

From a teaching and learning perspective, and with an interest in the ways in which learning analytics can be used in diverse disciplines and educational settings, I have also been influenced by Alyssa Wise, who advocates embedded analytics: the use of analytics by students themselves as a central piece of instructional design. Analytics can only have a positive effect on student behavior to the extent that students engage with their data in reflective and meaningful ways. By creating occasions for analytical engagement on the part of students (through blogging and journaling, for example), Wise has sought to foster integration, diversity, agency, reflection, parity, and dialogue, and has demonstrated that learning analytics may be employed in ways that are consistent even with humanistic approaches to pedagogy.

 

At Emory University, I am actively working in support of a flexible, robust, and reflective approach to learning analytics. In addition to providing analytical support for instructors and instructional designers (we are already seeing some really interesting results with instructional design implications that extend well beyond Emory) and leading several workshops, I have worked with Emory’s Institute for Quantitative Theory and Methods (QuanTM) to organize a learning analytics speaker series featuring the likes of Ryan Baker, Alyssa Wise, Chuck Dziuban, Carolyn Rosé, and Dragan Gašević. I am also in the process of organizing a full-day SoLAR Flare (Spring 2015), which will bring together thought leaders from around Georgia to discuss their work and opportunities for future collaboration. Lastly, I have facilitated the formation of a new learning analytics community of practice: a community-driven opportunity to support, motivate, and educate Emory faculty and staff with an interest in using data to improve learning and optimize learning environments. My overarching aim in all of these initiatives is to promote a reflective approach to learning analytics in support of student success, not just through to degree, but beyond.