On Magic Beans: Or, Is Learning Analytics simply Educational Data Science, Poorly Done?

UPDATE 31 January 2017: This blog post was written during 2014. Since that time, Blackboard has made several very important and strategic hires, including Mike Sharkey, John Whitmer, and others who are not only well-regarded data scientists but also passionate educators. Since 2014, Blackboard has become a leader in educational data science, conducting generalizable research to arrive at insights with the potential to make a significant impact on how we understand teaching and learning in the 21st century. Blackboard has changed. Blackboard is now committed to high-quality research in support of rigorously defensible claims to efficacy. Blackboard is not in the business of selling magic beans. Blackboard is also not the only company doing excellent work in this way. As this article continues to be read and shared, I still believe it has value. But it should be noted that the concerns I express here reflect the state of a field and industry still in its infancy. The irony it describes is still present, to be sure, and we should all work to increase our data literacy so that we can spot the magic beans where they exist, but it should also be noted that educational technology companies are not enemies. Teachers, researchers, and edtech companies alike are struggling together to understand the impact of their work on student success. Appreciating that fact, and working together in a spirit of honesty and openness, is crucial to the success of students and institutions of higher education in the 21st century.


The learning analytics space is currently dominated, not by scholars, but by tool developers and educational technology vendors with a vested interest in getting their products to market as quickly as they possibly can. The tremendous irony of these products is that, on the one hand, they claim to enable stakeholders (students, faculty, administration) to overcome the limitations of anecdotal decision-making and achieve a more evidence-based approach to teaching and learning. On the other hand, however, the effectiveness of the vast majority of learning analytics tools is untested. In other words, vendors insist upon the importance of evidence-based (i.e. data-driven) decision-making, but rely upon anecdotal evidence to support claims about the value of their analytics products.

In a recorded presentation, Kent Chen (former Director of Market Development for Blackboard Analytics) offers a startlingly honest account of the key factors motivating the decision to invest in learning analytics:

Analytics, I believe, revolves around two key fundamental concepts. The first of these fundamental concepts is a simple question: is student activity a valid indicator of student success? And this question is really just asking, is the amount of work that a student puts in a good indicator of whether or not that student is learning? Now this is really going to be the leap of faith, the jumping off point for a lot of our clients

Is learning analytics based on a leap of faith? If this is actually the case, then the whole field of learning analytics is premised on a fallacy. Specifically, it begs the question by assuming its conclusion in its premises: “we can use student activity data to predict student success, because student activity data is predictive of student success.” Indeed, we can see this belief in ‘faith as first principle’ in the Blackboard Analytics product itself, which famously fails to report on its own use.

Fortunately for Chen (and for Blackboard Analytics), he’s wrong. During the course of Emory’s year-long pilot of Blackboard Analytics for Learn, we were indeed able to find small but statistically significant correlations between several measures of student activity and success (defined as a grade of C or higher). Our own findings provisionally support the (cautious) use of student course accesses and interactions as heuristics on the basis of which an instructor can identify at-risk students. When it comes to delivering workshops to faculty at Emory, our findings are crucial, not only for making a case in defense of the value of learning analytics for teaching and course design, but also for discussing how those analytics might most effectively be employed. In fact, analytics is valuable as a way of identifying contexts in which embedded analytic strategies (i.e. student-facing dashboards) might have no, or even negative, effects, and it is incumbent upon institutional researchers and course designers to use the data they have in order to evaluate how to use that data most responsibly. Paradoxically, one of the greatest potential strengths of learning analytics is that it provides us with insight into the contexts and situations where analytics should not be employed.
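To make that kind of analysis concrete, here is a minimal sketch of how one might test for such correlations in Python. The file activity.csv and its column names are hypothetical, not Blackboard Analytics’ actual schema; a point-biserial correlation is used because success is a binary outcome.

```python
import pandas as pd
from scipy import stats

# Hypothetical per-student export of LMS activity measures and final grades.
df = pd.read_csv("activity.csv")

# Define success as in the pilot: a final grade of C (2.0 on a 4.0 scale) or higher.
df["success"] = (df["final_grade"] >= 2.0).astype(int)

# Point-biserial correlation between each activity measure and the binary outcome.
for measure in ["course_accesses", "interactions", "minutes_in_course"]:
    r, p = stats.pointbiserialr(df["success"], df[measure])
    print(f"{measure}: r = {r:.3f}, p = {p:.4f}")
```

Small r values with low p-values are exactly the ‘small but statistically significant’ pattern described above; they justify treating activity as a heuristic, not a guarantee.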

I should be clear that I use Blackboard Analytics as an example here solely for the sake of convenience. In Blackboard’s case, the problem is not so much a function of the product itself (which is a data model that is often mistaken for a reporting platform), but rather of the fact that the company doesn’t understand the product’s full potential, which leads to investment in the wrong areas of product development, clichéd marketing, and unsophisticated consulting practices. The same use of anecdotal evidence to justify data-driven approaches to decision-making is endemic to the learning analytics space, dominated as it is by educational technology vendors clamoring to make hay from learning analytics while the sun is shining.

I should also say that these criticisms do not necessarily apply to learning analytics researchers (like those involved with the Society for Learning Analytics Research and scholars involved in educational data mining). This is certainly not to say that researchers do not have their own sets of faith commitments (we all do, as a necessary condition of knowledge in general). Rather, freed from the pressure to sell a product, this group is far more reflective about how they understand concepts. As a community, the fields of learning analytics and educational data mining are constantly grappling with questions about the nature of learning, the definition(s) of student success, how concepts are best operationalized, and how specific interventions might be developed and evaluated. To the extent that vendors are not engaged in these kinds of reflective activities — that immediate sales trump understanding — it might be argued that vendors are giving ‘learning analytics’ a bad name, since they and the learning analytics research community are engaged in fundamentally different activities. Or perhaps the educational data science community has made the unfortunate decision to adopt a name for its activity that is already terribly tainted by the tradition of ‘decision-support’ in business, which is itself nothing if not dominated by a similar glut of vendors using a faith in data to sell their magic beans.

Challenges and Opportunities for Promoting Success among the Successful using Blackboard Analytics™

Presentation deck and abstract from my session at Blackboard World 2014. I also posted a few remarks on challenges associated with learning analytics at an institution with already high levels of student success here.

Abstract
How can a university that already has very high levels of student performance and retention use data from its Blackboard® learning management system to identify effective teaching practices and at-risk students? Based on experience gained from a year-long pilot of Blackboard Analytics™ for Learn at Emory University, this presentation will discuss (1) several unique challenges associated with the use of Blackboard Analytics™ to monitor high-performing students, (2) the value of Blackboard Analytics™ as a data warehouse against which to run custom queries and apply more sophisticated data mining techniques, and (3) several preliminary insights obtained through the application of those techniques at Emory University.

Dancing in the Classroom (or, What Teachers can Learn from Jack White)


In a recent interview with Rolling Stone, musician Jack White showed his cranky side while commenting on the current state of live music:

“People can’t clap anymore, because they’ve got a fucking texting thing in their fucking hand, and probably a drink, too!” he says. “Some musicians don’t care about this stuff, but I let the crowd tell me what to do. There’s no set list. I’m not just saying the same things I said in Cleveland last night. If they can’t give me that energy back? Maybe I’m wasting my time.”

If concert-goers who voluntarily part with $300 for prime tickets to see one of the most engaging musicians and showmen touring today, an artist who makes an active effort to engage his audience at every show, still can’t put down their phones, is it any wonder to find students voluntarily parting with tens of thousands of dollars a year only to text and Facebook their way through classes taken with even the most elite and engaging of university professors?

When it comes to university teaching, I am most often inclined to say that student engagement is the teacher’s responsibility. Students don’t know any better. They are the product of socialization processes driven by media experience and smartphone notifications. Viewed as an orator, the instructor has a duty to meet students where they are, to understand their knowledge-state, values, and interests, and to entice them into an experience of knowledge that is otherwise foreign, and even ‘boring.’ I hesitate to blame students for not taking responsibility for their learning, since it is exactly this kind of responsibility that is a key outcome of higher education. There is a sense, however, in which students ARE to blame for their lack of engagement. A lack of attention in the classroom is not necessarily a function of an unengaging teacher, or even of an unengaged student, but rather of the fact that students are choosing to be engaged by media, content, and interests that are familiar and elsewhere rather than unfamiliar and present.

What’s the solution? According to Ryan Bort, the key to becoming truly engaged in and by the concert experience is to dance:

But say we are willing. What is the best way to stave off this inevitable boredom and really engage with what we’ve dedicated our night to come see? How do we reclaim the live experience for what it’s worth? It’s really simple, actually: Dance. Give in to that impulse. Don’t be scared. Go ahead and channel a little Bowie. I’m looking at you, stoic guy with the blank expression and girl who can’t see over the person in front of you. If you dance — and I’m not talking about timid, mindless knee bobbing — all of the encumbrances of the structured venue show will rescind into the periphery and you will enjoy yourself and the music in the realest way possible. A new world will reveal itself and you will be free. So next time dance and dance like you mean it, or keep dancing if that may be the case. In fact, it even works particularly well when the music is live and you’re surrounded by other people in a confined space.

What would dancing in the classroom look like? How can we, as educators, encourage learners to “go ahead and channel a little Bowie”? Are there inhibitions that we need to work with our students to overcome, inhibitions that ‘smart’ technologies serve to foster? How do we move learners to become fully embodied within a learning environment, to be fully present and, in so doing, to abandon themselves within the confined space of the classroom?

Educational Technology is not a Rotisserie Oven


An important and fruitful area of discussion in learning analytics involves the use of embedded student dashboards, which are most commonly sold and promoted as tools for leveraging peer pressure to increase student success (like UMBC’s Check My Activity tool). In my experience with a similar tool over the last year, however, it has become abundantly clear that not all students respond to analytics in the same way. In fact, in two separate classes, instructors who piloted the tool found that otherwise high-performing students saw decreases in academic performance as a consequence of a kind of ‘gaming’ behavior (not intentional, but a consequence of confusing proxies — i.e. course accesses, minutes in course, interactions, etc. — with learning outcomes). Others have observed similar negative results on the part of poor performers, who see a decrease in motivation following an ‘objective’ display of their performance relative to peers. This isn’t a matter of learning styles, but it does point to the fact that students differ, and differ in such a way that we can’t expect them all to react the same way in common learning environments. The task of the teacher, then, would seem to involve communicative strategies that mitigate damaging effects while enhancing positive ones. The worst thing an instructor can do with any educational technology is to “set it and forget it,” expecting that it will achieve some glorious effect without the need for support from good pedagogy and good teaching.

In other words, Educational technology is not a rotisserie oven.

Using Learning Analytics to Promote Success among the Successful

High-performing institutions face a unique set of challenges and opportunities when it comes to investing in the use of educational data to support student success.

Emory University, for example, sees a six-year graduation rate of 90% and an average freshman retention rate of 96% (2014 US News and World Report Rankings). Looking specifically at undergraduate student performance in Emory College from Fall 2013 and Spring 2014, the rate of successful course completion (i.e. students receiving a grade of C or higher in a particular course) is 94%. If the goal of learning analytics is to increase student success, and success is defined strictly in terms of retention through to degree and/or achievement of a grade of C or higher, then Emory University has very little reason to invest in learning analytics.

Why, then, should a top-tier university with already high levels of student success seek to invest in learning analytics?

1. Learning Analytics is Fashionable

The oft-quoted 2013 NMC Horizon Report cites learning analytics as positioned for widespread adoption within two to three years. Long and Siemens have argued that big data and analytics represent “the most dramatic factor shaping the future of higher education.” There is a lot of pressure on universities to demonstrate investment in emerging educational technologies in order to maintain their position with respect to peer institutions in the academic marketplace.

Fashion is not, in itself, a bad reason to invest in learning analytics. Fashion becomes a bad reason if it is the only reason motivating investment. The pressure to ‘keep up with the Joneses’ is one that private Ed-Tech companies have capitalized on, and it has resulted in a rush to cobble together products and services that capitalize on hype. In the absence of critical reflection on what analytics is and on the kinds of questions that stakeholders are interested in using data to address, it becomes easy to confuse the practice of learning analytics with products. Faced with dashboards that promise the moon but are meaningless in light of stakeholders’ concrete questions, it is unsurprising to hear administrators, faculty, and students describe learning analytics as creepy and useless.

Together with ‘big data,’ learning analytics is at the peak of the hype cycle. There is significant concern on the part of educational researchers, however, that weak motivations and poor products will stifle innovation and see learning analytics fizzle in the trough of disillusionment before achieving the maturity necessary to propel it through to the plateau of productivity.

2. Rethinking Student Success

On the one hand, a student population that is successful by conventional standards makes investment in learning analytics difficult to justify. In the absence of a large at-risk population, learning analytics sounds an awful lot like a solution in search of a problem. On the other hand, a conventionally successful student population affords a university like Emory the ability to rethink success in a way that looks beyond mere credentialization, and toward an active and critical interest in supporting a variety of practices associated with teaching and learning.

Of course, a university must always concern itself with identifying at-risk students and student populations, and with developing interventions that would increase their chances of success by conventional standards. No matter how small the gaps, it is always incumbent upon educational institutions to work to ensure that as few students as possible fall through. As the number of at-risk students decreases, however, individuals become increasingly difficult to identify. Here, machine learning and other predictive modeling techniques can go a long way in providing intelligence where the value of anecdotal evidence rapidly breaks down.
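As a rough illustration of what such predictive modeling might look like, the sketch below fits a simple logistic regression to hypothetical LMS activity features. The file and column names are assumptions rather than any specific institutional pipeline, and the class_weight setting reflects the fact that, at a high-performing institution, at-risk students are a small minority of cases.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Hypothetical per-student LMS activity features; file and column names are assumptions.
df = pd.read_csv("activity.csv")
features = ["course_accesses", "interactions", "minutes_in_course"]
X = df[features]
y = (df["final_grade"] < 2.0).astype(int)  # 1 = at risk (below a C)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=42
)

# class_weight="balanced" compensates for the fact that at-risk cases are rare.
model = LogisticRegression(max_iter=1000, class_weight="balanced")
model.fit(X_train, y_train)

# Rank students by predicted risk so that advising effort can be prioritized.
risk = model.predict_proba(X_test)[:, 1]
print("Held-out AUC:", roc_auc_score(y_test, risk))
```

A ranked list of predicted risk is more useful in this setting than a hard yes/no classification, since the goal is to direct scarce advising attention toward the handful of students anecdote alone would miss.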

To the extent that a university is freed from worries about degree completion, investments in the area of learning analytics can be made in support of specific learning outcomes and more particularized conceptions of student success. In this, I have been influenced by the data wrangler approach advocated by Doug Clow and implemented with great success at The Open University. A data wrangler works closely with educational stakeholders to analyze relevant data in light of their particular values, goals, and practices. Rather than reduce student success to a one-size-fits-all kind of outcome, the data wrangler recognizes that definitions of success are context dependent, and works to mediate data use in order to address specific questions, here and now.

From a teaching and learning perspective, and with an interest in the ways in which learning analytics can be used in diverse disciplines and educational settings, I have also been influenced by Alyssa Wise, who advocates the use of embedded analytics: the use of analytics by students themselves as a central piece of instructional design. Analytics can only have a positive effect on student behavior to the extent that students engage their data in reflective and meaningful ways. By creating occasions for analytical engagement on the part of students (through blogging and journaling, for example), Wise has sought to foster integration, diversity, agency, reflection, parity, and dialogue, and has demonstrated that learning analytics may be employed in ways that are consistent even with humanistic approaches to pedagogy.


At Emory University, I am actively working in support of a flexible, robust, and reflective approach to learning analytics. In addition to providing analytical support for instructors and instructional designers (we are already seeing some really interesting results with instructional design implications that extend well beyond Emory), and leading several workshops, I have worked with Emory’s Institute for Quantitative Theory and Methods (QuanTM) to organize a learning analytics speaker series featuring the likes of Ryan Baker, Alyssa Wise, Chuck Dziuban, Carolyn Rosé, and Dragan Gašević. I am also in the process of organizing a full-day SoLAR Flare (Spring 2015), which will bring together thought leaders from around Georgia to discuss their work and opportunities for future collaboration. Lastly, I have facilitated the formation of a new learning analytics community of practice, a community-driven opportunity to support, motivate, and educate Emory faculty and staff with an interest in using data to improve learning and optimize learning environments. My overarching aim in all these initiatives is to promote a reflective approach to learning analytics in support of student success, not just through to degree, but also beyond.

2014 Learning Analytics Summer Institutes Begin Tomorrow

FIGURE 1: Geolocation map of user tweets using the hashtag #lasi14 (as of 2014-06-29 19:00:00 EST)

The second annual Learning Analytics Summer Institutes (LASI) begin tomorrow. I am delighted to have been selected as one of the participants for this year’s event, and look forward to coming away at the end of the three days with new skills and insights that I can put immediately into practice and share with others in my local community in Atlanta, GA. The Society for Learning Analytics Research and the International Educational Data Mining Society together form a vibrant, diverse, and welcoming community of scholars and practitioners. Professional conferences and related events are, generally speaking, terrible. The Learning Analytics and Knowledge Conference, which I had the great pleasure of attending this year, was a rare exception, and I expect LASI to be just as exceptional.

FIGURE 2: Social network diagram of user tweets using the Twitter hashtag #lasi14 (as of 2014-06-29 19:00:00 EST)
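For the curious, a diagram like the one above can be produced in a few lines of Python. This is a minimal sketch that assumes the #lasi14 tweets have already been harvested into a CSV with author and text columns (the file and column names are hypothetical); it builds a directed mention network and draws it with a force-directed layout using networkx.

```python
import re

import matplotlib.pyplot as plt
import networkx as nx
import pandas as pd

# Hypothetical archive of #lasi14 tweets; the "author" and "text" columns are assumptions.
tweets = pd.read_csv("lasi14_tweets.csv")

# Directed mention network: an edge from A to B whenever A mentions @B in a tweet.
G = nx.DiGraph()
for _, row in tweets.iterrows():
    for mention in re.findall(r"@(\w+)", row["text"]):
        G.add_edge(row["author"], mention)

# Force-directed layout, comparable in spirit to the diagram above.
pos = nx.spring_layout(G, seed=42)
nx.draw(G, pos, node_size=40, arrows=False, with_labels=False)
plt.savefig("lasi14_mention_network.png", dpi=150)
```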

Unlike an academic conference, the summer institutes are meant to function as an intensive ‘summer camp’ for educational data scientists. This year, in addition to keynote lectures by Pierre Dillenbourg (“What Does Eye Tracking Tell Us On MOOCs”), Phil Winne (“Learning Analytics for Learning Science When N = me”), and Tiffany Barnes (“Making a meaningful difference: Leveraging data to improve learning for most of people most of the time”), the event also gives participants the opportunity to take part in several hands-on workshops. Of course, the most valuable aspect of LASI is the chance to connect with experts in the field of learning analytics, to share ideas, mutually inspire, and generate opportunities for future collaboration.

In addition to the live event at the Harvard Graduate School of Education in Boston, MA, there are also several international satellite events taking place at the same time. Activity from all these events will be tagged #lasi2014, and I will do my best to summarize this activity at the end of each day.

Upping my ‘creepiness factor,’ I have also borrowed my wife’s Narrative Clip, which I will wear periodically over the next few days. My wife, Elisa Wallace, is an elite equestrian who has recently started working with Narrative to explore device applications in the sport of three-day eventing. From the perspective of analytics, wearing the clip will give me an ongoing photo record of this year’s LASI, but also GPS and accelerometer data, which I look forward to reviewing as well.