Last month we argued that analytics in higher education has entered a trough of disillusionment. We posited that this is actually a good thing for higher education, because it means bringing attention to the hype itself. It means that we are making progress towards true productivity and student success. We need to learn how to spot the hype before we can move beyond it and realize the true potential of educational data and learning analytics.
It is our hope that the ‘analytics angst’ that has accompanied increased data literacy will put pressure on vendors to reduce hyperbole in their marketing materials and encourage institutions to reset their expectations. A more realistic view of educational data will result in greater adoption, more successful implementations, and results that move the needle by positively impacting student success at scale.
In direct contradiction to Betteridge’s Law, we believe the answer is yes. Analytics in higher education is in the trough of disillusionment.
The trough of disillusionment refers to a specific stage of Gartner's Hype Cycle. It is that moment when, after a rapid buildup to a peak of inflated expectations, a technology's failure to achieve all that was hoped for results in disillusionment. Those who might benefit from a tool perceive a gap between the hype and actual results. Some have rightly pointed out that not all technologies follow the hype cycle, but we believe that analytics in higher education has followed this pattern fairly closely.
EDUCAUSE is big. Really big. With so much to take in, conference-goers (myself included) are easily faced with the paradox of choice: a sense of paralysis in the face of too many options. To help myself and others, I have scanned this year’s conference agenda and selected five presentations that I think will be individually strong, and that as a group offer a good overview of the themes, issues, and state of analytics in higher education today.
Moderated by Michael Feldstein (e-Literate) and featuring John Whitmer (Blackboard), Russ Little (PAR), Jeff Gold (California State University), and Avi Yashchin (IBM), this session promises to provide an engaging and insightful overview of why analytics are important for higher education, the biggest challenges currently facing the field, and opportunities for the future. Although most of the speakers are strongly affiliated with vendors in the analytics space, they are strong data scientists in their own right and have demonstrated time and time again that they do not shy away from critical honesty. Attend this session for a raw glimpse into what analytics mean for higher education today.
Thursday, October 27 | 8:00am – 8:50am | Ballroom A, Level Three
Jisc is a non-profit company that aims to create and maintain a set of shared services in support of higher education in the UK. The Effective Learning Analytics project that Michael Webb will discuss in this session has aimed to provide a centralized learning analytics solution in addition to a library of shared resources. The outputs of this project to date have provided valuable resources to the international educational analytics community, including the Code of Practice for Learning Analytics and Learning Analytics in Higher Education. Jisc's work is being watched carefully by governments and non-governmental organizations worldwide and represents an approach that we may wish to consider emulating in the US (current laws notwithstanding). Attend this session to learn about the costs and opportunities involved in the development of a centralized approach to collecting and distributing educational data.
Thursday, October 27 | 1:30pm – 2:20pm | Meeting Room 202A/B, Level Two
The higher education community is abuzz with talk of how data and analytics can improve student success. But data and analytics are worthless unless they are put in the hands of the right people and in the right ways. I am really interested to see how Ivy Tech has worked to successfully democratize access to information, and to hear about the ways that access to data has driven the kind of institutional and cultural change necessary to see the most significant results from data-driven initiatives.
Thursday, October 27 | 8:00am – 8:50am | Meeting Room 304A/B, Level Three
Everyone's talking about analytics, and every institution seemingly has the will to invest. Attention paid to analytics in media and by vendors can lead to the impression that everybody's doing it, and that everyone who's doing it is seeing great results. But that is far from the truth.
I'm not the greatest fan of benchmarking in general. Too often, benchmarking is productized by vendors and sold to universities despite providing very little actionable value. Worse yet, it can exacerbate feelings of institutional insecurity and drive imprudent investments. But when it comes to analytics, benchmarking done right can provide important evidence to counteract misperceptions about the general state of analytics in the US, and provide institutions with valuable information to inform prudent investment, planning, and policy decisions. In this presentation, I look forward to hearing Christopher Brooks and Jeffery Pomerantz from EDUCAUSE discuss their work on the analytics and student success benchmarking tools.
Friday, October 28 | 8:00am – 8:50am | Meeting Room 304C/D, Level Three
I am a huge advocate of open standards in learning analytics. Open standards mean greater amounts of higher quality data. They mean that vendors and data scientists can spend more time innovating and less time just trying to get plumbing to work. In this interactive presentation, Malcolm Brown (EDUCAUSE), Jenn Stringer (University of California, Berkeley), Sean DeMonner (University of Michigan-Ann Arbor), and Virginia Lacefield (University of Kentucky) talk about how open learning standards like IMS Caliper and xAPI are creating the foundation for the emergence of next generation learning environments.
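To make concrete what an open standard like xAPI buys you: every learning event is recorded as an actor-verb-object "statement" in a common format, so data from different tools can flow into one shared record store. Here is a minimal sketch in Python (the learner details and activity URL are hypothetical placeholders, not real identifiers):

```python
# A minimal xAPI statement: who (actor) did what (verb) to what (object).
# The learner email and activity URL below are invented for illustration.
statement = {
    "actor": {
        "name": "Example Learner",
        "mbox": "mailto:learner@example.edu",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "http://example.edu/activities/module-3-quiz",
        "definition": {"name": {"en-US": "Module 3 Quiz"}},
    },
}

# Any xAPI-conformant tool can emit statements shaped like this to a
# shared Learning Record Store, regardless of which vendor built it.
print(sorted(statement))
```

Because the statement shape is fixed by the specification rather than by any one vendor, analysts can combine events from an LMS, a library system, and a tutoring tool without bespoke plumbing for each.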
‘Learning Analytics,’ as so many know it, is already passé.
There is almost always a disconnect between research innovation and the popular imagination. By the time a new concept or approach achieves widespread acceptance, its popular identity and applications too often lag behind the state of the field.
The biggest problem with learning analytics in its most popular incarnations is that — particularly as it is applied at scale by colleges, universities, and in vendor-driven solutions — it sits on top of existing learning management architectures which, in turn, rely on irrelevant assumptions about what higher education looks like. At most universities, investing in learning analytics means buying or building a 'nudging' engine, a predictive model based on data from an institution's student information system (SIS) and learning management system (LMS) that is used to drive alerts about at-risk students. Such investments are costly, and so institutions have a vested interest in maximizing their return on investment (ROI). Where models make use of LMS data, their accuracy is a function of an institution's rate of LMS utilization. The more data the better. If a university is serious about its investment in learning analytics, then, it also needs to be serious about its investment in a single learning management system to the exclusion of other alternatives.
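For readers unfamiliar with what such a 'nudging' engine actually computes, here is a rough sketch of the core idea. Everything here is invented for illustration — the feature names, weights, and alert threshold are hypothetical; a production system would fit its model to historical SIS/LMS outcomes rather than hand-set weights:

```python
import math

# Hypothetical feature weights, standing in for coefficients a real
# model would learn (e.g. via logistic regression) from historical data.
WEIGHTS = {"logins_per_week": -0.4, "missed_assignments": 0.8, "avg_grade": -0.05}
BIAS = 2.0


def at_risk_probability(student):
    """Map a student's SIS/LMS features to a risk score between 0 and 1."""
    z = BIAS + sum(WEIGHTS[k] * student[k] for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))  # logistic function


def nudge(students, threshold=0.5):
    """Return IDs of students whose predicted risk warrants an alert."""
    return [s["id"] for s in students if at_risk_probability(s) >= threshold]


students = [
    {"id": "s1", "logins_per_week": 5, "missed_assignments": 0, "avg_grade": 85},
    {"id": "s2", "logins_per_week": 1, "missed_assignments": 4, "avg_grade": 55},
]
print(nudge(students))  # only the disengaged, low-performing student is flagged
```

Note how directly the sketch depends on LMS-derived features like login counts: if a student's activity happens outside the institution's single LMS, the model simply cannot see it, which is exactly the dependency described above.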
But learning management systems are famously at odds with 21st century pedagogies. Popular solutions like Blackboard, D2L (now Brightspace), and Canvas continue to operate according to the assumption that university teaching involves the packaging of information for distribution by a single expert. Even with the addition of more social elements like blogs, wikis, and discussion boards, the fact that all of these elements are contained within a single 'course shell' betrays the fact that LMS-based teaching is centralized and, even if not tightly controlled by the instructor, at least curated (what is a syllabus, after all, but a set of rules set out by an instructor that determine what does and does not belong in their course?). The 20th century course is the intellectual property of the instructor (the question of course ownership has been raised with particular vigor recently, as schools push to deliver MOOCs). It is the instructor's creation. It is the teacher's course. It may be for students, but it is not theirs.
Learning analytics is very easy to do in the context of highly centralized teaching environments: where the institution offers instructors a limited and requisite range of educational technologies, and where students agree to limit their learning activity to the teacher's course. But learning analytics is first and foremost about learning, and learning in the 21st century is not centralized.
In a new report for the EDUCAUSE Learning Initiative (ELI), Malcolm Brown, Joanne Dehoney, and Nancy Millichap observe that the changing needs of higher education demand a major shift in thinking away from the Learning Management System and towards a Digital Learning Environment.
What is clear is that the LMS has been highly successful in enabling the administration of learning but less so in enabling learning itself. Tools such as the grade book and mechanisms for distributing materials such as the syllabus are invaluable for the management of a course, but these resources contribute, at best, only indirectly to learning success. Initial LMS designs have been both course- and instructor-centric, which is consonant with the way higher education viewed teaching and learning through the 1990s.
In contrast to centralized learning management systems, the report's authors recommend a Next Generation Digital Learning Environment (NGDLE) that acknowledges the distributed nature of learning and that empowers pedagogical innovation through a "Lego" approach that allows for decentralization, interoperability, and personalization. Analytics, in this context, would be challenging, as the NGDLE would not lend itself to a single learning analytics layer. Open data, however, would facilitate the creation of modules that would address the needs of instructors, students, and researchers in particular learning environments. In this, we see a shift in the aim of analytics away from management and administration, and toward learning as an aim in itself.
Where the lag between learning analytics research and popular imagination is being overcome is in efforts like GlassLab (Games, Learning and Assessment Lab), an experimental nonprofit that aims at the creation of analytics-enabled educational versions of commercial video games. A joint initiative of the Educational Testing Service (ETS) and Electronic Arts, GlassLab aims to rethink standardized testing in an age of gamified learning.
The future of learning analytics, a future in which it is not passé, is one in which learning comes before management, and analytics are intensive rather than extensive. In this, the field of learning analytics can actually function as an important agent of change. Critics have expressed concern over the datification of education, observing that the needs of big data require standardized systems that tend to conserve outdated (and counterproductive) pedagogical models. But the most innovative approaches to learning analytics take learning seriously. They are not interested in reducing learning to something that can be grasped, but rather in understanding it in all its complexity, and in all of its particular contexts. A common theme among the talks hosted by Emory University this past year as part of its QuanTM Learning Analytics Speaker Series (which featured Chuck Dziuban, Alyssa Wise, Ryan Baker, Tim McKay, and Dragan Gašević) was that learning is complicated, and learning analytics is hard to do. Gluttons for punishment, driven by a strong sense of vocation, and exceptionally humble, researchers and innovators in the learning analytics space are anything but reductive in their views of learning and are among the greatest advocates of distributed approaches to education.
I worry that learning analytics is indeed passé, a buzz word picked up by well-meaning administrators to give the impression of innovation while at the same time serving to support otherwise tired and irrelevant approaches to educational management. When I look at the work of learning analytics, however, what I see is something that is not only relevant and responsive to the needs of learners here and now, but also reflectively oriented toward shaping a rich and robust (possibly post-administrative) educational future.
15 January 2015 Study of Corporate LMSs Shows that Learning Analytics is Important, but Unsatisfactory
Among those from organizations surveyed by Ventana Research, 60% of respondents cite learning analytics as an important next-generation learning management technology. However, only 37% are satisfied with their current analytics solution.
14 January 2015 LAMP Integrated into Jisc Learning Analytics Efforts
The Library Analytics and Metrics Project (LAMP) announced this week that it will "move forward as an integral part of the overarching learning analytics R&D efforts of Jisc." Until now, the LAMP project has been running in parallel with other related activity. Jisc is currently in the process of procuring vendors to contribute to various elements of an open learning analytics solution. With LAMP's integration, not only is its technology (a dashboard for visualizing library use patterns) assured of being a part of the larger Jisc solution, but its members look forward to supporting the trajectory of the Jisc analytics project more generally.
14 January 2015 EDUCAUSE Drops Two Learning Analytics Technologies from its Annual Top Ten List
As EDUCAUSE's Susan Grajek explains, the decrease in representation by analytics technologies should in no way indicate a decline in interest: "What we're finding is that institutions are picking up the pace and paying more attention to more technologies. Attention isn't shifting in the amount of effort, it's shifting in terms of expanding."
16 January 2015 Learning Analytics Speaker Series at Emory University
The 2014 – 2015 Learning Analytics Speaker Series at Emory University hosted by the Institute for Quantitative Theory and Methods aims to stimulate conversations about the use of educational data to promote student success, however that may be conceived.
The Southeast Educational Data Symposium (SEEDS) will bring together administrators, researchers, and instructors to share how they are making use of educational data to foster student success, and to generate opportunities for ongoing collaboration in the Southeast region of the United States. The symposium will take place on Friday, February 20, 2015 at the Emory Conference Center in Atlanta, GA. The all-day event will run from 9:00 AM to 4:30 PM EST and include a morning keynote, delivered by Carolyn Rosé (Carnegie Mellon University), followed by panel discussions and lunch-time roundtables.