Educational Technology is not a Rotisserie Oven


An important and fruitful area of discussion in learning analytics involves the use of embedded student dashboards, which are most commonly sold and promoted as tools for leveraging peer pressure to increase student success (like UMBC’s Check My Activity tool). In my experience with a similar tool over the last year, however, it has become abundantly clear that not all students respond to analytics in the same way. In fact, in two separate classes, instructors who piloted the tool found that otherwise high-performing students saw decreases in academic performance as a consequence of a kind of ‘gaming’ behavior (not intentional, but the result of confusing proxies, i.e. course accesses, minutes in course, interactions, etc., with learning outcomes). Others have observed similarly negative results among poor performers, who lose motivation following an ‘objective’ display of their performance relative to peers. This has nothing to do with learning styles, but it does point to the fact that students differ, and in ways that prevent us from expecting them all to react identically in common learning environments. The task of the teacher, then, would seem to involve communicative strategies that mitigate damaging effects while enhancing positive ones. The worst thing an instructor can do with any educational technology is to “set it and forget it,” expecting it to achieve some glorious effect without the support of good pedagogy and good teaching.

In other words, educational technology is not a rotisserie oven.

Using Learning Analytics to Promote Success among the Successful

High-performing institutions face a unique set of challenges and opportunities when it comes to investing in the use of educational technology to support student success.

Emory University, for example, sees a six-year graduation rate of 90% and an average freshman retention rate of 96% (2014 US News and World Report rankings). Looking specifically at undergraduate student performance in Emory College from Fall 2013 through Spring 2014, the rate of successful course completion (i.e. students receiving a grade of C or higher in a particular course) is 94%. If the goal of learning analytics is to increase student success, and success is defined strictly in terms of retention through to degree and/or achievement of a grade of C or higher, then Emory University has very little reason to invest in learning analytics.
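The completion figure above is easy to operationalize. A minimal sketch in Python, using entirely hypothetical grade records (in practice these would come from the student information system):

```python
# Successful course completion, defined as a grade of C or higher.
PASSING = {"A", "A-", "B+", "B", "B-", "C+", "C"}

def completion_rate(grades):
    """Fraction of enrollments receiving a grade of C or higher."""
    successes = sum(1 for g in grades if g in PASSING)
    return successes / len(grades)

# Hypothetical enrollment records for illustration only.
grades = ["A", "B+", "C", "D", "A-", "F", "B", "C+", "A", "B-"]
print(f"{completion_rate(grades):.0%}")  # 8 of 10 enrollments succeed: 80%
```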

Why, then, should a top-tier university with already high levels of student success seek to invest in learning analytics?

1. Learning Analytics is Fashionable

The oft-quoted 2013 NMC Horizon Report cites learning analytics as positioned for widespread adoption within two to three years. Long and Siemens have argued that big data and analytics represent “the most dramatic factor shaping the future of higher education.” There is a lot of pressure on universities to demonstrate investment in emerging educational technologies in order to maintain their position with respect to peer institutions in the academic marketplace.

Fashion is not, in itself, a bad reason to invest in learning analytics. Fashion becomes a bad reason when it is the only reason motivating investment. The pressure to ‘keep up with the Joneses’ is one that private Ed-Tech companies have capitalized on, resulting in a rush to cobble together products and services that trade on hype. In the absence of critical reflection on what analytics is and on the kinds of questions that stakeholders are interested in using data to address, it becomes easy to confuse the practice of learning analytics with products. Faced with dashboards that promise the moon, but that are meaningless in light of stakeholders’ concrete questions, it is unsurprising to hear administrators, faculty, and students describe learning analytics as creepy and useless.

Together with ‘big data,’ learning analytics currently sits at the peak of the hype cycle. There is significant concern on the part of educational researchers, however, that weak motivations and poor products will stifle innovation and see learning analytics fizzle in the trough of disillusionment before achieving the maturity necessary to propel it through to a plateau of productivity.

2. Rethinking Student Success

On the one hand, a student population that is successful by conventional standards makes investment in learning analytics difficult to justify. In the absence of a large at-risk population, learning analytics sounds an awful lot like a solution in search of a problem. On the other hand, a conventionally successful student population affords a university like Emory the ability to rethink success in a way that looks beyond mere credentialization, and toward an active and critical interest in supporting a variety of practices associated with teaching and learning.

Of course, a university must always concern itself with identifying at-risk students and student populations, and with developing interventions that would increase their chances of success by conventional standards. No matter how small the gaps, it is always incumbent upon educational institutions to ensure that as few students as possible fall through. As the number of at-risk students decreases, however, the remaining individuals become increasingly difficult to identify. Here, machine learning and other predictive modeling techniques can go a long way toward providing intelligence where the value of anecdotal evidence rapidly breaks down.
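To make the flavor of such predictive modeling concrete, here is a minimal sketch using scikit-learn and entirely hypothetical LMS features (logins per week, assignments submitted, average quiz score). A real model would be trained on institutional data, validated carefully, and audited for bias; this is an illustration of the technique, not a recipe:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical training data: one row per past student.
# Features: [logins per week, assignments submitted, avg quiz score]
X = np.array([
    [12, 8, 0.91], [10, 7, 0.85], [14, 8, 0.88], [11, 6, 0.80],
    [2, 1, 0.40],  [3, 2, 0.55],  [1, 0, 0.30],  [4, 3, 0.60],
])
# Labels: 1 = completed the course with a C or better, 0 = did not.
y = np.array([1, 1, 1, 1, 0, 0, 0, 0])

model = LogisticRegression().fit(X, y)

# Score a current student; a low predicted probability of success
# flags them as at-risk and a candidate for outreach.
new_student = np.array([[3, 1, 0.45]])
p_success = model.predict_proba(new_student)[0, 1]
print(f"P(success) = {p_success:.2f}")
if p_success < 0.5:
    print("flag for outreach")
```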

To the extent that a university is freed from worries about degree completion, investments in the area of learning analytics can be made in support of specific learning outcomes and more particularized conceptions of student success. In this, I have been influenced by the data wrangler approach advocated by Doug Clow and implemented with great success at The Open University. A data wrangler works closely with educational stakeholders to analyze relevant data in light of their particular values, goals, and practices. Rather than reduce student success to a one-size-fits-all kind of outcome, the data wrangler recognizes that definitions of success are context dependent, and works to mediate data use in order to address specific questions, here and now.

From a teaching and learning perspective, and with an interest in the ways in which learning analytics can be used in diverse disciplines and educational settings, I have also been influenced by Alyssa Wise, who advocates embedded analytics: the use of analytics by students themselves as a central piece of instructional design. Analytics can only have a positive effect on student behavior to the extent that students engage with their data in reflective and meaningful ways. By creating occasions for analytical engagement on the part of students (through blogging and journaling, for example), Wise has sought to foster integration, diversity, agency, reflection, parity, and dialogue, and has demonstrated that learning analytics may be employed in ways that are consistent even with humanistic approaches to pedagogy.


At Emory University, I am actively working in support of a flexible, robust, and reflective approach to learning analytics. In addition to providing analytical support for instructors and instructional designers (we are already seeing some really interesting results with instructional design implications that extend well beyond Emory), and leading several workshops, I have worked with Emory’s Institute for Quantitative Theory and Methods (QuanTM) to organize a learning analytics speaker series featuring the likes of Ryan Baker, Alyssa Wise, Chuck Dziuban, Carolyn Rosé, and Dragan Gašević. I am also in the process of organizing a full-day SoLAR Flare (Spring 2015), which will bring together thought leaders from around Georgia to discuss their work and opportunities for future collaboration. Lastly, I have facilitated the formation of a new learning analytics community of practice, a community-driven opportunity to support, motivate, and educate Emory faculty and staff with an interest in using data to improve learning and optimize learning environments. My overarching aim in all these initiatives is to promote a reflective approach to learning analytics in support of student success, not just through to degree, but also beyond.

2014 Learning Analytics Summer Institutes Begin Tomorrow

FIGURE 1: Geolocation map of user tweets using the hashtag #lasi14 (as of 2014-06-29 19:00:00 EST)

The second annual Learning Analytics Summer Institutes (LASI) begin tomorrow. I am delighted to have been selected as one of the participants for this year’s event, and look forward to coming away at the end of the three days with new skills and insights that I can put immediately into practice and share with others in my local community in Atlanta, GA. The Society for Learning Analytics Research and the International Educational Data Mining Society together form a vibrant, diverse, and welcoming community of scholars and practitioners. Professional conferences and related events are, generally speaking, terrible. The Learning Analytics and Knowledge Conference, which I had the great pleasure of attending this year, was a rare exception, and I expect LASI to be just as exceptional.

FIGURE 2: Social network diagram of user tweets using the Twitter hashtag #lasi14 (as of 2014-06-29 19:00:00 EST)
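A network like the one in Figure 2 can be assembled from harvested tweets. A minimal sketch using networkx, with hypothetical (author, mentioned users) records standing in for real data pulled from the hashtag:

```python
import networkx as nx

# Hypothetical tweet records: (author, users mentioned in the tweet).
tweets = [
    ("alice", ["bob", "carol"]),
    ("bob", ["alice"]),
    ("carol", ["dan"]),
    ("dan", ["alice", "bob"]),
]

# Build a directed mention network: author -> mentioned user.
G = nx.DiGraph()
for author, mentions in tweets:
    for mentioned in mentions:
        # Weight repeated mentions rather than duplicating edges.
        if G.has_edge(author, mentioned):
            G[author][mentioned]["weight"] += 1
        else:
            G.add_edge(author, mentioned, weight=1)

# Degree centrality suggests who sits at the center of the conversation.
centrality = nx.degree_centrality(G)
central = max(centrality, key=centrality.get)
print(central)  # → alice
```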

Unlike an academic conference, the summer institutes are meant to function as an intensive ‘summer camp’ for educational data scientists. This year, in addition to keynote lectures by Pierre Dillenbourg (“What Does Eye Tracking Tell Us On MOOCs”), Phil Winne (“Learning Analytics for Learning Science When N = me”), and Tiffany Barnes (“Making a meaningful difference: Leveraging data to improve learning for most of people most of the time”), the event also offers participants several hands-on workshops. Of course, the most valuable aspect of LASI is the chance to connect with experts in the field of learning analytics, to share ideas, mutually inspire, and generate opportunities for future collaboration.

In addition to the live event at the Harvard Graduate School of Education in Cambridge, MA, there are also several international satellite events taking place at the same time. Activity from all these events will be tagged #lasi14, and I will do my best to summarize this activity at the end of each day.

Upping my ‘creepiness factor,’ I have also borrowed my wife’s Narrative Clip, which I will wear periodically over the next few days. My wife, Elisa Wallace, is an elite equestrian who has recently started working with Narrative to look at device applications in the sport of three-day eventing. From the perspective of analytics, wearing the clip will give me an ongoing photo record of this year’s LASI, but also GPS and accelerometer data, which I look forward to reviewing as well.

Using SNAPP in OSX

In recent months, many folks (myself included) have been frustrated by a sudden incompatibility between SNAPP and the latest versions of popular browsers and Java. SNAPP (Social Networks Adapting Pedagogical Practice) is a highly useful tool for visualizing discussion board activity in Blackboard, Moodle, and Sakai. I have wanted to use it, and to recommend it to others. With this workaround, I can do both.

I should note up front that I have only successfully implemented this workaround in my own desktop environment, and have not tested it on other platforms or using other versions. If this works for you under other conditions, or if you find other more versatile workarounds, please post to the comments.

The environment in which I have been able to successfully launch SNAPP is as follows:

Under these conditions, getting SNAPP to work is actually…well, a snap!

1. In the Java Control Panel (System Preferences –> Java), go to the Security tab and edit the Site List to include both the code source and the base URL from which you expect to launch the SNAPP applet. (If you point to a plain http URL, Java will yell at you for not using https. You’ll have to accept, or else host the code in some other more secure location. So it goes.)

SNAPP Java Fix

2. Click OK to commit the changes.

3. Hurray!!!
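For anyone who prefers scripting the steps above: the Site List maintained by the Java Control Panel is just a plain-text file of URLs, one per line. A sketch in Python, assuming Oracle Java’s default location on OS X; the two URLs are placeholders for your actual code source and launch URL:

```python
from pathlib import Path

# Default location of the Java Deployment site list on OS X (Oracle Java).
sites = (Path.home() / "Library" / "Application Support" / "Oracle"
         / "Java" / "Deployment" / "security" / "exception.sites")
sites.parent.mkdir(parents=True, exist_ok=True)

# Placeholder URLs: substitute the applet's code source and the base
# URL from which you launch SNAPP.
new_entries = ["https://snapp.example.org", "https://lms.example.edu"]

# Append each URL on its own line, skipping any that are already listed.
existing = sites.read_text().splitlines() if sites.exists() else []
with sites.open("a") as f:
    for url in new_entries:
        if url not in existing:
            f.write(url + "\n")
```

Re-running the script is safe: entries already present are not duplicated.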

In addition to the primary website for the SNAPP tool, Sandeep Jayaprakash from Marist College has provided some excellent documentation on installing SNAPP on a local machine. It is well worth checking out.

Teaching the Unteachable: On the Compatibility of Learning Analytics and Humane Education

This paper is an exploratory effort to find a place for learning analytics in humane education. After distinguishing humane education from training on the basis of the Aristotelian model of intellectual capabilities, and arguing that humane education is distinct by virtue of its interest in cultivating prudence, which is unteachable, an account of three key characteristics of humane education is provided. Appealing to thinkers of the Italian Renaissance, it is argued that ingenium, eloquence, and self-knowledge constitute the what, how, and why of humane education. Lastly, looking to several examples from recent learning analytics literature, it is demonstrated that learning analytics is helpful not only as a set of aids for ensuring success in scientific and technical disciplines, but in the humanities as well. In order to function effectively as an aid to humane education, however, learning analytics must be embedded within a context that encourages continuous reflection, responsiveness, and personal responsibility for learning.

Access Full Text Here

Learning Analytics for a World of Constant Change

Slides from a presentation delivered during a Digital Pedagogy Meetup in Atlanta (20 February 2014), discussing ways in which traditional analytics may stifle innovation, and identifying several ways in which embedded approaches to learning analytics may actually contribute to the development of personal responsibility, critical thinking, digital citizenship, and imagination: characteristics so vital to surviving and thriving in the 21st century.

“The Society for Learning Analytics Research defines learning analytics as the measurement, collection, analysis, and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs. Universities are increasingly using analytics to increase student retention and performance. Yet, the assumptions that frequently underlie such initiatives are also inconsistent with pedagogies that would seek to cultivate creativity and innovation, capacities that are necessary in order to survive and thrive in a world that is constantly and increasingly changing. It will be argued that innovation and analytics are not incompatible, however, but rather that they are reconcilable through a shift in emphasis and priority. The presentation will sketch a provisional model of learning analytics that puts analytics in the service of (a humanist conception of) learning, rather than the reverse, and provide concrete examples of how this might be applied in practice.”