Number Games: Data Literacy When You Need It

My wife’s coach once told her that “experience is what you get the moment after you needed it.” Too often the same can be said for data literacy. Colleges and universities looking to invest wisely in analytics to support the success of their students and to optimize operational efficiency are confronted with the daunting task of evaluating a growing number of options before selecting the products and approaches that are right for them. Which products and services are most likely to yield the greatest return on investment? Which approaches taken by other institutions have already seen high rates of success? On the one hand, institutions just now getting started with analytics have the great advantage of being able to look to the many who have gone before and are beginning to see promising results. On the other hand, the analytics space is still immature, and there is little long-term, high-quality evidence to support the effectiveness of many products and interventions.

Institutions and vendors who have invested heavily in analytics have a vested interest in presenting promising results (and they ARE promising!) in the best light possible. This makes sense. This is a good thing. The marketing tactics that both institutions of higher education and educational technology vendors employ are typically honest and in good faith, offered in earnest support of student success. But the representation of information is always a rhetorical act. Consequently, the ways in which results are presented too often obscure the actual impact of technologies and interventions. The way results are promoted can make it difficult for less mature institutions to adjudicate the quality of claims and make well-informed decisions about the products, services, and practices that will be best for them.

Perhaps the most common tactic used to make results appear more impressive than they are is changing the scale of the y-axis on bar and line charts. A relatively small difference can famously be made to look dramatic if the range is narrow enough. But there are other common tactics that are not as easily spotted and are just as important when evaluating the impact of interventions. Here are three:

There is a difference between a percentage increase and an increase in percentage points. For example, an increase in retention from 50% to 55% may be represented either as an increase of 5 points or as an increase of 10%. It is also important to note that the same number of points translates into a different percentage increase depending on the starting rate. A 5-point increase from a retention rate of 25% represents a 20% increase, while a 5-point increase from a starting rate of 75% is only a 7% increase. Marketing literature tends to choose whichever metric sounds most impressive, even if it obscures the real impact.
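A few lines of Python make the arithmetic concrete (the rates below are the ones from the examples above):

```python
# Sketch of the arithmetic above: the same 5-point gain reads very
# differently as a relative (%) increase depending on the starting rate.
def describe_change(before, after):
    points = (after - before) * 100                  # percentage-point change
    pct_increase = (after - before) / before * 100   # relative ("%") increase
    return points, pct_increase

for before, after in [(0.50, 0.55), (0.25, 0.30), (0.75, 0.80)]:
    points, pct = describe_change(before, after)
    print(f"{before:.0%} -> {after:.0%}: +{points:.0f} points, +{pct:.0f}%")
# 50% -> 55%: +5 points, +10%
# 25% -> 30%: +5 points, +20%
# 75% -> 80%: +5 points, +7%
```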

A single data point does not equal a trend. Context and history are important. When a vendor or institution claims that an intervention produced a significant increase in retention or graduation in only a year, it is possible that the increase was due to chance, to an existing trend, or to other initiatives or shifts in student demographics. For example, one college recently reported a 10% increase in its retention rate after only one year of using a student retention product. Looking back at historical retention rates, however, one finds that the year prior to tool adoption marked a significant and uncharacteristic drop in retention, which means that any increase could just as easily have been due to chance or other factors. In the same case, close inspection finds that the retention rate following tool adoption was still low from a historical perspective, and part of an emerging downward trend rather than the reverse.
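A small sketch shows how this plays out. The numbers below are hypothetical, invented for illustration rather than taken from the college in question, but they reproduce the pattern: a rebound from an uncharacteristic dip can be advertised as a 10% gain while still sitting below the historical baseline.

```python
import statistics

# Hypothetical retention rates, in points -- illustrative, not real data.
history = [68, 67, 66, 65, 58]   # final year: an uncharacteristic drop
post_adoption = 64               # a "10% increase" over 58, yet still low

baseline = statistics.mean(history[:-1])   # average of the stable years
spread = statistics.stdev(history[:-1])

print(f"Stable-years baseline: {baseline:.1f} +/- {spread:.1f} points")
print(f"Post-adoption rate: {post_adoption} points "
      f"({'still below' if post_adoption < baseline else 'above'} baseline)")
# Stable-years baseline: 66.5 +/- 1.3 points
# Post-adoption rate: 64 points (still below baseline)
```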

It’s not the tool. It’s the intervention. One will often hear vendors take credit for significant increases in retention or graduation rates when there are other, far more significant causal factors. One school, for example, is praised for using a particular analytics system to double its graduation rate. What tends not to be mentioned, however, is that the same school also radically reduced its student-to-advisor ratio, centralized its administration, and made other significant programmatic changes that contributed to its success over and above any impact the analytics system might have made by itself. The effective use of an analytics solution can certainly play a major role in facilitating efforts to increase retention and graduation rates. In fact, all things being equal, it is reasonable to expect a 1- to 3-point increase in student retention from using early alerts powered by predictive analytics. Gains significantly above this, however, are only possible as a result of significant cultural change, strategic policy decisions, and well-designed interventions.

It can be tempting for a vendor especially to take at least implicit credit for more than is due, but doing so is misleading and obscures the tireless efforts of the institutions and people working to support their students. More than this, overemphasizing products over institutional change can impede progress. It can lead institutions to believe, falsely, that a product will do all the work, and encourage them to embark naively on analytics projects and initiatives without fully understanding the changes in culture, policy, and practice required to make them fully successful.

Vlogging my way through BbWorld16

EPISODE I: Going to Vegas

Headed to Las Vegas for DevCon and BbWorld 2016. Having attended twice before as a customer, I am very excited to have played a part in organizing this year’s event.

In this vlog episode, I check in with Scott Hurrey (Code Poet at Blackboard) and ask him about what excites him the most about DevCon. Dan Rinzel (Product Manager, Blackboard Analytics) and John Whitmer (Director of Analytics and Research at Blackboard) tackle some extreme food portions.

EPISODE II: Teamwork makes the Dream Work

A day of rehearsal for the BbWorld16 opening general session leads to an air of playful excitement in anticipation of the main event. ‘Dr John’ talks about why data science isn’t scary, and why everyone should be interested and involved.

EPISODE III: Making Magic Happen

Want to go behind the scenes and get a sense of all of the work that goes into the opening main stage keynote presentation each year? Michelle Williams takes us on a tour!

EPISODE IV: Yoga and Analytics

Meet the Predictive Analytics ‘booth babes,’ and learn from Michael Berman that yoga and analytics DO mix. Bridget Burns, Executive Director of the University Innovation Alliance, explains why there is a need for more empathy between institutions of higher education and educational technology companies, and in higher education in general.

EPISODE V: We Are Family

Rachel Seranno from Appalachian State University talks about power poses and memes. Eric Silva praises the power of Twitter. Casey Nugent and Shelley White from the University of Nebraska – Lincoln describe how they are working with Blackboard consultants to understand and optimize instruction.