‘Learning Analytics,’ as so many know it, is already passé.
There is almost always a disconnect between research innovation and the popular imagination. By the time a new concept or approach achieves widespread acceptance, its popular identity and applications too often lag behind the state of the field.
The biggest problem with learning analytics in its most popular incarnations is that, particularly as it is applied at scale by colleges and universities and in vendor-driven solutions, it sits on top of existing learning management architectures which, in turn, rely on assumptions about higher education that no longer hold. At most universities, investing in learning analytics means buying or building a ‘nudging’ engine: a predictive model, based on data from an institution’s student information system (SIS) and learning management system (LMS), that is used to drive alerts about at-risk students. Such investments are costly, and so institutions have a vested interest in maximizing their return on investment (ROI). Where models make use of LMS data, their accuracy is a function of the institution’s rate of LMS utilization: the more data, the better. If a university is serious about its investment in learning analytics, then, it must also be serious about its investment in a single learning management system, to the exclusion of all alternatives.
George Tooker, “Landscape with Figures,” 1965-66.
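To make the critique concrete, the kind of ‘nudging’ engine described above can be sketched in a few lines. Everything here (the feature names, the weights, the alert threshold) is invented for illustration; real vendor models are proprietary and far more elaborate, but they share this basic shape: SIS and LMS features go in, a risk score comes out, and an alert fires when the score crosses a threshold.

```python
import math

def risk_score(gpa, lms_logins_per_week, assignments_submitted_ratio):
    """Return an illustrative 'at-risk' probability between 0 and 1.

    A logistic-regression-style score: the weights below are invented
    for this sketch, not taken from any real institutional model.
    """
    z = (3.0
         - 0.8 * gpa                          # SIS feature: grade point average
         - 0.15 * lms_logins_per_week         # LMS feature: engagement proxy
         - 1.5 * assignments_submitted_ratio) # LMS feature: submission rate
    return 1 / (1 + math.exp(-z))

def flag_at_risk(students, threshold=0.5):
    """Return the IDs of students whose risk score crosses the threshold."""
    return [s["id"] for s in students
            if risk_score(s["gpa"], s["logins"], s["submitted"]) >= threshold]

students = [
    {"id": "A", "gpa": 3.8, "logins": 10, "submitted": 0.95},
    {"id": "B", "gpa": 1.9, "logins": 1,  "submitted": 0.40},
]
print(flag_at_risk(students))  # → ['B']
```

Note how directly the score depends on LMS activity data (logins, submission rates): a model like this is only as good as the share of teaching that actually happens inside the LMS, which is exactly the dependency at issue in what follows.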
But learning management systems are famously at odds with 21st century pedagogies. Popular solutions like Blackboard, D2L (now Brightspace), and Canvas continue to operate according to the assumption that university teaching involves the packaging of information for distribution by a single expert. Even with the addition of more social elements like blogs, wikis, and discussion boards, the fact that all of these elements are contained within a single ‘course shell’ betrays the fact that LMS-based teaching is centralized and, even if not tightly controlled by the instructor, at least curated (what is a syllabus, after all, but a set of rules set out by an instructor that determine what does and does not belong in their course?). The 20th century course is the intellectual property of the instructor (the question of course ownership has been raised with particular vigor recently, as schools push to deliver MOOCs). It is the instructor’s creation. It is the teacher’s course. It may be for students, but it is not theirs.
Learning analytics is very easy to do in the context of highly centralized teaching environments: where the institution offers instructors a limited, prescribed range of educational technologies, and where students agree to limit their learning activity to the teacher’s course. But learning analytics is first and foremost about learning, and learning in the 21st century is not centralized.
In a new report for the EDUCAUSE Learning Initiative (ELI), Malcolm Brown, Joanne Dehoney, and Nancy Millichap observe that the changing needs of higher education demand a major shift in thinking away from the Learning Management System and towards a Digital Learning Environment.
What is clear is that the LMS has been highly successful in enabling the administration of learning but less so in enabling learning itself. Tools such as the grade book and mechanisms for distributing materials such as the syllabus are invaluable for the management of a course, but these resources contribute, at best, only indirectly to learning success. Initial LMS designs have been both course- and instructor-centric, which is consonant with the way higher education viewed teaching and learning through the 1990s.
In contrast to centralized learning management systems, the report’s authors recommend a Next Generation Digital Learning Environment (NGDLE) that acknowledges the distributed nature of learning and that empowers pedagogical innovation through a “Lego” approach allowing for decentralization, interoperability, and personalization. Analytics in this context would be challenging, as the NGDLE would not lend itself to a single learning analytics layer. Open data, however, would facilitate the creation of modules addressing the needs of instructors, students, and researchers in particular learning environments. In this, we see a shift in the aim of analytics away from management and administration, and toward learning as an end in itself.
Where the lag between learning analytics research and the popular imagination is being overcome is in efforts like GlassLab (Games, Learning and Assessment Lab), an experimental nonprofit that creates analytics-enabled educational versions of commercial video games. A joint initiative of the Educational Testing Service (ETS) and Electronic Arts, GlassLab aims to rethink standardized testing in an age of gamified learning.
The future of learning analytics, a future in which it is not passé, is one in which learning comes before management, and analytics are intensive rather than extensive. In this, the field of learning analytics can actually function as an important agent of change. Critics have expressed concern over the datafication of education, observing that the needs of big data require standardized systems that tend to conserve outdated (and counterproductive) pedagogical models. But the most innovative approaches to learning analytics take learning seriously. They are not interested in reducing learning to something that can be grasped, but rather in understanding it in all its complexity, and in all of its particular contexts. A common theme among the talks hosted by Emory University this past year as part of its QuanTM Learning Analytics Speaker Series (which featured Chuck Dziuban, Alyssa Wise, Ryan Baker, Tim McKay, and Dragan Gašević) was that learning is complicated, and learning analytics is hard to do. Gluttons for punishment, driven by a strong sense of vocation, and exceptionally humble, researchers and innovators in the learning analytics space are anything but reductive in their views of learning and are among the greatest advocates of distributed approaches to education.
I worry that learning analytics is indeed passé, a buzzword picked up by well-meaning administrators to give the impression of innovation while serving to support otherwise tired and irrelevant approaches to educational management. When I look at the work of learning analytics, however, what I see is something that is not only relevant and responsive to the needs of learners here and now, but also reflectively oriented toward shaping a rich and robust (possibly post-administrative) educational future.