This Week in Learning Analytics (February 7 – 13, 2015)

On February 9, 2015, Ryan Baker delivered a compelling lecture on predicting student outcomes using automated predictors of engagement and affect. The lecture was delivered as part of the QuanTM Learning Analytics Speaker Series, and will be made available via the Emory University YouTube channel in the coming weeks.

Headlines

Higher Ed


Game Design Helps Engage Students in Classroom
Press and Guide | 13 February 2015

The University of Michigan, Ann Arbor is in the midst of a pilot program in which classes are taught using a new gamified LMS called GradeCraft. “GradeCraft is a game-inspired LMS designed to facilitate autonomy while building student competence in the subject area. It not only motivates students in nontraditional ways, but also leverages learning analytics to better inform the instructor on a student’s strengths, weaknesses and interests so that the instructor can best understand how to help students succeed.” Students are provided with analytics that chart their progress and identify ways to earn additional ‘points.’ GradeCraft was designed by Caitlin Holman, Stephen Aguilar, and Barry Fishman, who presented their initial findings at LAK’13.

SOURCE: http://www.pressandguide.com/articles/2015/02/13/news/doc54de183d6f687558600080.txt

Analytics Creating Whole Host of Privacy Issues, Claims University of Derby’s IT Director
Computing | 10 February 2015

Neil Williams, IT director of the University of Derby, comments on the opportunities and challenges associated with the use of big data analytics in university environments. “The challenge for the university, which isn’t the same as for other organisations, centres on the ethics of such ‘Big Brother’ big data efforts. We are building capabilities but we have to look through these internally and ask what it means and go through the appropriate governance methods and understand what our stakeholders feel. We have to make sure what we do is right. In contrast, if you look at a website and somebody is tracking what you’re looking at, you expect to then be targeted [with advertisements], but these [students] are members of our institution – not customers, so it’s a different situation.”

SOURCE: http://www.computing.co.uk/ctg/news/2394650/analytics-creating-whole-host-of-privacy-issues-claims-university-of-derby-s-it-director

FULL INTERVIEW: http://www.computing.co.uk/ctg/interview/2394906/balancing-big-data-with-big-brother-an-interview-with-the-university-of-derbys-neil-williams

University of Nottingham Announces Learning Analytics Program
University of Nottingham | 8 February 2015

The University of Nottingham has initiated a learning analytics project that will roll out in multiple phases. The first phase will involve identifying correlations between students’ use of Moodle, module scores, and satisfaction levels in order to optimize their LMS environment. Future goals for the project involve creating a recommender system and embedding analytics to create interactive learning environments.

SOURCE: http://www.nottingham.ac.uk/teaching/strategypolicy/ttp/analytics/index.aspx

K-12


Uncovering Security Flaws in Digital Education Products for Schoolchildren
New York Times | 8 February 2015

Tony Porterfield has alerted the makers of nearly 20 digital education products to serious security flaws which are symptomatic of “widespread lapses in student data protection across the education technology sector.” Although some companies, including Pearson and Class Dojo, took immediate action in the face of Porterfield’s concerns, many have not.

SOURCE: http://www.nytimes.com/2015/02/09/technology/uncovering-security-flaws-in-digital-education-products-for-schoolchildren.html?_r=0

Events


DIY Learning Analytics Workshop at Emory University
Timothy D. Harfield | Analytics for Learning at Emory | 13 February 2015

Microsoft Education evangelist Patrick Leblanc visited Emory University on February 11 to facilitate a workshop on using MS Excel, Power Query, Power Pivot, and Power BI to work with educational data sets. He effectively demonstrated the power and versatility of Microsoft products that many institutions already license.

SOURCE: https://scholarblogs.emory.edu/ale/2015/02/13/learning-analytics-using-microsoft-excel/

LACE initiative Resulted in New Asian SIG on Learning Analytics
Weiqin Chen | Learning Analytics Community Exchange (LACE) | 8 February 2015

A successful pre-conference learning analytics workshop held in conjunction with the International Conference on Computers in Education (ICCE) has led to the formation of a learning analytics special interest group within the Asia-Pacific Society for Computers in Education (APSCE).

SOURCE: http://www.laceproject.eu/blog/lace-initiative-resulted-new-asian-sig-learning-analytics

Research

Journal Papers


Data Mining in Higher Education: University Student Dropout Case Study
Ghadeer S. Abu-Oda & Alaa M. El-Halees |
International Journal of Data Mining & Knowledge Management Process (IJDKP) | 5 (1)

ABSTRACT: In this paper, we apply different data mining approaches for the purpose of examining and predicting students’ dropouts through their university programs. For the subject of the study we select a total of 1290 records of computer science students graduated from ALAQSA University between 2005 and 2011. The collected data included student study history and transcripts for courses taught in the first two years of the computer science major, in addition to student GPA, high school average, and a class label (yes/no) indicating whether the student graduated from the chosen major or not. In order to classify and predict dropout students, different classifiers have been trained on our data sets, including Decision Tree (DT) and Naive Bayes (NB). These methods were tested using 10-fold cross validation. The accuracy of the DT and NB classifiers was 98.14% and 96.86%, respectively. The study also includes discovering hidden relationships between student dropout status and enrolment persistence by mining frequent cases using the FP-growth algorithm.

SOURCE: http://www.academia.edu/10468618/DATA_MINING_IN_HIGHER_EDUCATION_UNIVERSITY_STUDENT_DROPOUT_CASE_STUDY
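
For readers who want a feel for the method, here is a minimal sketch of the evaluation setup the abstract describes (Decision Tree and Naive Bayes classifiers scored with 10-fold cross-validation), written in Python with scikit-learn. The features and labels below are synthetic stand-ins, not the paper’s ALAQSA dataset, so the accuracies will be near chance rather than the 98.14% and 96.86% reported.

```python
# Sketch of the evaluation setup in the abstract: DT and NB with 10-fold CV.
# The data here is synthetic; the paper used real transcripts for 1290 students.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 1290  # number of student records in the study

# Hypothetical features: GPA, high-school average, mean of early course marks
X = np.column_stack([
    rng.uniform(1.0, 4.0, n),                    # GPA
    rng.uniform(50, 100, n),                     # high-school average
    rng.uniform(0, 100, (n, 5)).mean(axis=1),    # mean mark, first-year courses
])
y = rng.integers(0, 2, n)  # class label: 1 = graduated, 0 = dropped out

for name, clf in [("DT", DecisionTreeClassifier(random_state=0)),
                  ("NB", GaussianNB())]:
    scores = cross_val_score(clf, X, y, cv=10)   # 10-fold cross-validation
    print(f"{name}: mean accuracy = {scores.mean():.3f}")
```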

Conference Proceedings


A Novel Similarity Measure Between Two Probability Distributions For Course Establishment
Aijiao Liu, Yiping Zhang, Min Chen | International Conference on Education, Management, Commerce and Society (EMCS 2015)

ABSTRACT: In this paper, in order to obtain an optimized clustering analysis of probability distributions, the increment of the description length is proposed in place of the relative entropy as the similarity measure between two probability distributions. Its corresponding features are also discussed in detail in this paper. As an improvement, the increment of description length satisfies the symmetry property. On the basis of this similarity measure, the K-means algorithm is employed to analyse police training data and to inform the corresponding course establishment. The experimental results indicate that the proposed similarity measure can lead to better clustering results than some other previous similarity measures.

SOURCE: http://www.atlantis-press.com/php/pub.php?publication=emcs-15&frame=http%3A//www.atlantis-press.com/php/paper-details.php%3Ffrom%3Dauthor+index%26id%3D16453%26querystr%3Dauthorstr%253DL
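
The abstract does not give the exact formula, but one natural reading of a “description-length increment” is the extra code length incurred when two samples are encoded with their pooled mixture rather than with their own distributions; unlike the relative entropy, that quantity is symmetric. The sketch below implements this interpretation (my assumption, not the authors’ definition).

```python
# A hedged interpretation of a "description-length increment" between two
# discrete distributions: the extra bits needed when both samples are coded
# with their pooled mixture instead of their own distributions. Unlike
# relative entropy (KL divergence), this quantity is symmetric in p and q.
import numpy as np

def kl(p, q):
    """Relative entropy KL(p || q) in bits, assuming matching supports."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

def dl_increment(p, q, n_p=1.0, n_q=1.0):
    """Increase in total description length when two samples of sizes
    n_p and n_q are coded with their pooled mixture m."""
    w_p = n_p / (n_p + n_q)
    w_q = n_q / (n_p + n_q)
    m = w_p * p + w_q * q                      # pooled mixture distribution
    return n_p * kl(p, m) + n_q * kl(q, m)     # symmetric by construction

p = np.array([0.7, 0.2, 0.1])
q = np.array([0.1, 0.3, 0.6])
print(dl_increment(p, q), dl_increment(q, p))  # identical values
```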

Opinions

Best of Blogs


Not In the Clear: Libraries and Privacy
Barbara Fister | Inside Higher Ed | 12 February 2015

Barbara Fister (librarian and self-identified privacy nut) discusses the tension between (1) an interest in promoting student success through learning analytics and (2) a commitment to protect student privacy. She observes that, in spite of the latter, “libraries are terrible at privacy! … in an era when everybody’s doing data dragnets, it’s alarming to see how leaky our library websites are, how revealing our catalogs and databases are, and how cavalier we have been with patron data that we swear we will protect.”

SOURCE: https://www.insidehighered.com/blogs/library-babel-fish/not-clear-libraries-and-privacy

Learning Analytics: On Silver Bullets and White Rabbits
Simon Buckingham Shum | Medium | 8 February 2015

With a poignancy so often lacking in discussions of learning analytics, Simon Buckingham Shum asks a question that should be top of mind for anyone before, during, and after working with big data in education: “In the very process of trying to value certain learning qualities by tracking them, will we in fact distort or even destroy a living, organic system, through clumsy efforts to categorise and quantify?”

SOURCE: https://medium.com/@sbskmi/learning-analytics-on-silver-bullets-and-white-rabbits-a92d202dc7e3

Learning Analytics in Practice

Tips and Tricks


7 Ways to Get Started with Analytics & Reports in Moodle
Sean Marx | iLite | 7 February 2015

Reviews seven plugins and reports that give users access to trends, analytics, and data from Moodle sites.

SOURCE: https://ilite.wordpress.com/2015/02/07/7-ways-to-get-started-with-analytics-reports-in-moodle/

Co-curricular Module Trends in Moodle, Using Learning Analytics
Charles Kasule | King’s College London, Centre for Technology Enhanced Learning | 6 February 2015

The Centre for Technology Enhanced Learning at King’s College London recently developed a learning analytics tool, built in Microsoft Excel, that visualizes raw Moodle data and produces reports that are viewable offline. “In comparison to what is currently offered by Moodle reporting option, the Learning Analytics tool offers a better way of navigating, comparing, analysing and tracking how users use the VLE materials from the data that the Moodle report provides.”

SOURCE: https://blogs.kcl.ac.uk/ctel/2015/02/06/co-curricular-analytics/
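
As a rough illustration of this kind of offline reporting (not the KCL tool itself), the sketch below loads a raw Moodle log export with pandas and summarizes weekly activity per course context. The column names follow a typical Moodle log export and may differ between installations; the file names are placeholders.

```python
# Minimal sketch: summarise a raw Moodle log export into a weekly report
# that can be viewed offline. Column names and date format are assumptions
# based on a typical Moodle log export.
import pandas as pd

log = pd.read_csv("moodle_log_export.csv")                 # hypothetical export
log["Time"] = pd.to_datetime(log["Time"], dayfirst=True)   # adjust to your format

summary = (
    log.groupby(["Event context", pd.Grouper(key="Time", freq="W")])
       .agg(events=("Event name", "size"),
            active_users=("User full name", "nunique"))
       .reset_index()
)

summary.to_excel("moodle_activity_report.xlsx", index=False)  # viewable offline
```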

Presentations & Webinars


What is the Tin Can API and how does it enable the flow of data?
Megan Bowe | SoLAR Storm | 7 February 2015

REPLAY WEBINAR: http://solaresearch.org/initiatives/storm/open-webinars/megan-bowe-tincan/
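
For those unfamiliar with it, the Tin Can API (xAPI) moves learning activity data around as small actor-verb-object statements collected by a Learning Record Store. A minimal illustrative statement follows; the mailbox and activity id are placeholders.

```python
# A minimal xAPI (Tin Can) statement: who did what to which activity.
# The actor mailbox and object id below are illustrative placeholders.
statement = {
    "actor": {"name": "Example Student", "mbox": "mailto:student@example.edu"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "http://example.edu/activities/module-1-quiz",
        "definition": {"name": {"en-US": "Module 1 Quiz"}},
    },
}
```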

Personal Activity Monitors and the Future of Learning Analytics

In Spring 2013, while discussing the details of his final project, a gifted student of mine revealed that he was prone to insomnia. In an effort to understand and take control of his sleeping habits, he had begun wearing a device called a ‘Jawbone UP.’ I recently started wearing the device myself, and have found it an exciting (and fun) technology for increasing behavioral awareness, identifying activity patterns (both positive and negative), and motivating self-improvement. Part of the movement toward a quantified self, this wearable technology not only exemplifies best practice in mobile dashboard design, but also opens up exciting possibilities for the future of learning analytics.

Essentially, the UP is a bracelet that houses a precision motion sensor capable of recording physical activity during waking hours and tracking sleep habits during the night. The wearable device syncs to a stunning app that presents the user with a longitudinal display of their activity and makes use of an ‘insight engine’ that identifies patterns and makes suggestions for positive behavioral improvements. The UP is made even more powerful by encouraging the user to record their mood, the specifics of deliberate exercise, and their diet. The motto of the UP is “Know Yourself, Live Better.” In the age of ‘big data,’ an age in which it has become possible to record and analyze actual behavioral patterns in their entirety rather than simply relying upon samples or anecdotal accounts, and in which our mobile devices are powerful enough to effortlessly identify patterns of which we ourselves would otherwise be quite ignorant, the UP and its main competitor, the Fitbit Flex, are exemplary personal monitoring tools that represent exciting possibilities for the future of learning analytics.
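
To make the ‘insight engine’ idea concrete, here is a hedged sketch of the sort of pattern such a feature might surface, using an invented week of data; the fields are stand-ins for a device export, not Jawbone’s actual format.

```python
# A hedged sketch of an 'insight engine' style pattern check: do nights with
# more sleep go together with more active days? The data frame and its
# columns are hypothetical stand-ins for a device export.
import pandas as pd

days = pd.DataFrame({
    "date": pd.date_range("2015-02-01", periods=7),
    "sleep_hours": [7.5, 6.0, 8.1, 5.5, 7.0, 6.5, 8.0],  # previous night's sleep
    "steps":       [9800, 7200, 11200, 6400, 9100, 8000, 11800],
})

r = days["sleep_hours"].corr(days["steps"])
if r > 0.3:
    print(f"Insight: nights with more sleep tend to precede more active days (r = {r:.2f})")
```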

Personal activity monitors like the UP effectively combine three of the six “technologies to watch,” as identified in the 2013 Higher Education Edition of the NMC Horizon Report: Wearable Technology, Learning Analytics, and Games and Gamification.

Wearable Technology. As a bracelet, the UP is obviously a wearable technology. This kind of device, however, is strikingly absent from the technologies listed in the report, which tend to have a prosthetic quality, extending the user’s ability to access and process information from their surroundings. The most interesting of these, of course, is Google’s augmented-reality-enabled glasses, Project Glass. In contrast to wearable technologies that aim at augmenting reality, motivated by a post-human ambition to generate a cyborg culture, the UP has an interestingly humanistic quality. Rather than aiming at extending consciousness, it aims at facilitating self-consciousness and promoting physical and mental well-being by revealing lived patterns of experience that we might otherwise fail to recognize. The technology is still in its infancy and is currently only capable of motion sensing, but it is conceivable that, in the future, such devices might be able to automatically record various other kinds of data as well (heart rate and geo-location, for example).

Learning Analytics. Learning analytics is variously defined, but it essentially refers to the reporting of insights from learner behavior data in order to generate interventions that increase the chances of student success. Learning analytics takes many forms, but one of the most exciting is the development of student dashboards that identify student behaviors (typically in relation to a learning management system like Blackboard) and make relevant recommendations to increase academic performance. Acknowledging the powerful effect of social facilitation (the social-psychological insight that people often perform better in the presence of others than they do alone), such dashboards often also present students with anonymized information about class performance as a baseline for comparison. To the extent that the UP and the Fitbit monitor activity for the purpose of generating actionable insights that facilitate the achievement of personal goals, they function in the same way as student dashboards that monitor student performance. Each of these systems is also designed as an application platform, and the manufacturers strongly encourage the development of third-party apps that make use of and integrate with their respective devices. Unsurprisingly, most of the third-party apps built to date are concerned with fitness, but there is no reason why an app could not be developed that integrated personal activity data with information about academic behaviors and outcomes as well.
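
As a sketch of what such an integration might look like (the file names and columns are hypothetical, not a real vendor API), one could join a daily export from an activity monitor with a daily summary of LMS activity and let students explore the two side by side.

```python
# A hedged sketch of the integration suggested above: join a (hypothetical)
# daily export from an activity monitor with a (hypothetical) daily summary
# of LMS activity, so a student can see wellbeing and study behaviour
# side by side. Neither file format corresponds to a real vendor API.
import pandas as pd

device = pd.read_csv("up_daily_summary.csv", parse_dates=["date"])  # sleep_hours, steps, mood
lms = pd.read_csv("lms_daily_summary.csv", parse_dates=["date"])    # minutes_online, quiz_score

combined = device.merge(lms, on="date", how="inner")

# Simple personal report: how do study time and quiz scores vary with sleep?
print(combined[["sleep_hours", "minutes_online", "quiz_score"]].corr())
```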

Games and Gamification. The ability to see one’s performance at a glance, to have access to relevant recommendations for improvement according to personal goals, and to have an objective sense of one’s performance relative to a group of like individuals can be a powerful motivator, and it is exactly this kind of dashboarding that the UP does exceptionally well. Although aimed not at academic success but at physical and mental well-being, the UP (bracelet and app) functions in the same way as learning analytics dashboards, but better. To my mind, the main difference between the UP and learning analytics dashboards–and the main area in which learning analytics can learn from consumer products such as this–is that it is fun. The interface is user-friendly, appealing, and engaging. It is intentionally whimsical, like a video game, and so encourages frequent interaction and a strong desire to keep the graphs within desired thresholds. The desire to check in frequently is further increased by the social networking function, which allows friends to compare progress and encourage each other to be successful. Lastly, the fact that the primary UP interface takes the form of a mobile app (available for both iOS and Android) is reflective of the increasing push toward mobile devices in higher education. Learning analytics and student dashboarding can only promote student success if students use them. More attention must be placed, then, on developing applications and interfaces that students WANT to use.
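
The baseline comparison is simple to picture: show a student where they sit within an anonymized class distribution and whether they are inside a self-chosen threshold. A toy example (all numbers invented):

```python
# A small illustration of the baseline comparison described above: report a
# student's percentile within an anonymized class distribution and flag
# whether they are inside a self-chosen threshold. All numbers are invented.
class_scores = [55, 61, 64, 70, 72, 75, 78, 80, 83, 88, 91, 95]  # anonymized
my_score = 78
threshold = 50  # student wants to stay above the class median

percentile = 100 * sum(s <= my_score for s in class_scores) / len(class_scores)
status = "on track" if percentile >= threshold else "below your target"
print(f"You are at the {percentile:.0f}th percentile: {status}")
```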

Screen shot of the iOS dashboard from the Jawbone UP app

Personal activity monitors like the UP should be exciting to, and closely examined by, educators. As a wearable technology that entices users to self-improvement by making performance analytics into a game, the UP does exactly what we are trying to do in the form of student activity dashboards, but does it better. In this, the UP app should serve as an exemplar as we look forward to the development of reporting tools that are user-focused, promoting ease, access, and fun.

Looking ahead, however, what is even more exciting (to me at least) is the prospect that wearable devices like the UP might provide students with the ability to extend the kinds of data that we typically (and most easily) correlate with student success. We have LMS information, and more elaborate analytics programs are making effective use of more dispositional factors. Using the UP as a platform, I would like to see someone develop an app that draws upon the motion, mood, and nutrition tracking power of the UP and that allows students to relate this information to academic performance and study habits. Not only would such an application give students (I would hesitate to give personal data like this to instructors and/or administrators) a more holistic vision of the factors contributing to or detracting from academic success, but it would also help to cultivate healthy habits that would contribute to student success in a way that extends beyond the walls of the university and into long-term relationships at work, with family, and with friends as well.