This Week in Learning Analytics (October 25 – 31, 2014)

Inspired by Halloween and comments made at the Blackboard Institutional Performance Conference last week | Image by Timothy Harfield

Latest News

30 October 2014
Survey Takes Pulse of Civitas Learning Partners’ Work in Analytics and Student Success
Civitas Learning today announced the results of its first “Pulse” survey recently conducted at its Pioneer Summit. More than 70 individuals representing more than 40 higher education institutions and systems participated in the survey. This is the first step in the ongoing effort to benchmark the burgeoning community’s work in predictive analytics focused on student success.

30 October 2014
Learning about Learning Analytics @ #Mozfest
Summary by Adam Lofting of a session he hosted, alongside Andrew Sliwinski, Doug Belshaw, and Simon Knight, on “Learning Analytics for Good in the Age of Big Data.” Lofting reflects upon the learning that took place as a result of this session, through a ‘silly’ meta-exercise.

29 October 2014
Notes from Utrecht Workshop on Ethics and Privacy Issues in the Application of Learning Analytics
Summary of the excellent discussion that took place during the Workshop on Ethics & Privacy Issues in the Application of Learning Analytics, an event co-organized by LACE and SURF, and held in Utrecht on 28 October 2014.

29 October 2014
Statistician explores how faculty can excel in blended learning environments
In a recent lecture sponsored by Emory’s QuanTM, learning analytics expert Chuck Dziuban discussed trends in the new learning environment that blends face-to-face and virtual instruction.

Latest Blogs

The Quest for Data that Really Impacts Student Success by Dian Schaffhauser
A really nice review of the field, including three learning analytics tips worth knowing (from Josh Baron):

  1. Collaborate with other institutions
  2. Don’t jump into an analytics product willy-nilly
  3. Take care with ethics and data privacy considerations

OPINION: Personalization, Possibilities and Challenges with Learning Analytics by Arthur VanderVeen & Nick Sheltrown
The authors identify two key challenges facing personalized learning through learning analytics:

  1. the need to expand educators’ understanding of what is possible through analytics-driven personalized learning, and
  2. the need to actively engage with practicing educators on how to design and integrate analytics-driven learning experiences

Learning Analytics
Review of the Greller and Drachsler article, “Translating Learning into Numbers: A Generic Framework for Learning Analytics.”

Tribal Student Insight: an interview with Chris Ballard by Niall Sclater
Interview with Chris Ballard, Data Scientist for Student Insight, about a tool that allows customers to build models to predict student risk. The product is currently being developed by Tribal with the University of Wolverhampton.

Recent Publications

Supporting competency-assessment through a learning analytics approach using enriched rubrics
Alex Rayón, Mariluz Guenaga, & Asier Núñez

Universities have increasingly emphasized competencies as central elements of students’ development. However, the assessment of these competencies is not an easy task. The data that learners generate in computer-mediated learning offer great potential to study how learning takes place, and thus to gather evidence for competency assessment using enriched rubrics. The lack of data interoperability and the decentralization of these educational applications pose a challenge to exploiting trace data. To address these problems we have designed and developed SCALA (Scalable Competence Assessment through a Learning Analytics approach), an analytics system that integrates usage trace data (how users interact with resources) and social trace data (how students and teachers interact with one another) to support competency assessment. The SCALA case study presents teachers with a dashboard of enriched rubrics built from blended datasets obtained from six assessment learning activities, performed with a group of 28 students working on the teamwork competency. In terms of knowledge discovery, we obtain results by applying clustering and association rule mining algorithms. Thus, we provide a visual analytics tool ready to support competency assessment.

Foundations of Big Data and Analytics in Higher Education
Ben Daniel & Russell Butson

This paper contributes to our theoretical understanding of the role Big Data plays in addressing the contemporary challenges that institutions of higher education face. The paper draws upon emergent literature on Big Data and discusses ways to better utilise the growing data available from various sources within institutions of higher education, to help understand the complexity of influences on student-related outcomes, teaching, and the ‘what if’ questions for research experimentation. The paper further presents opportunities and challenges associated with the implementation of Big Data analytics in higher education.

Dealing with complexity: educational data and tools for learning analytics
Ángel Hernández-García & Miguel Ángel Conde

The evolution of information technologies and their widespread use have increased the complexity of the educational landscape, as institutions and instructors try to absorb and incorporate these innovations into learning processes. This in turn poses countless new challenges to educational research in general, and to new disciplines based on educational data analysis, such as learning analytics, in particular. In this paper, we introduce the Track on Learning Analytics within the Technological Ecosystems for Enhancing Multiculturality 2014 Conference, a track that aims to present new approaches for dealing with this complexity and solving some of these challenges.

The paper provides an overview of the motivations behind the proposal of this track, with a general introduction to learning analytics in this complex context and a presentation of the main challenges in current learning analytics research, from both a data analysis perspective and a tool analysis approach; this introduction is followed by an overview of the submission management and participant selection process. Then, a detailed summary of the manuscripts accepted for presentation at the conference is given.

Learning Analytics and Educational Data Mining in Practice: A Systematic Literature Review of Empirical Evidence
Zacharoula Papamitsiou & Anastasios A. Economides

This paper aims to provide the reader with a comprehensive background for understanding current knowledge on Learning Analytics (LA) and Educational Data Mining (EDM) and their impact on adaptive learning. It constitutes an overview of the empirical evidence behind key objectives of the potential adoption of LA/EDM in generic educational strategic planning. We examined the literature on experimental case studies conducted in the domain during the past six years (2008–2013). Search terms identified 209 mature pieces of research work, but inclusion criteria limited the key studies to 40. We analyzed the research questions, methodology, and findings of these published papers and categorized them accordingly. We used non-statistical methods to evaluate and interpret the findings of the collected studies. The results highlight four distinct major directions of empirical LA/EDM research. We discuss the added value that has emerged from LA/EDM research and highlight the significance of further implications. Finally, we offer our thoughts on possible uncharted key questions to investigate, from both pedagogical and technical perspectives.

Assessment of Robust Learning with Educational Data Mining
Ryan S. Baker & Albert T. Corbett

Many university leaders and faculty have the goal of promoting learning that connects across domains and prepares students with skills for their whole lives. However, as assessment emerges in higher education, many assessments focus on knowledge and skills that are specific to a single domain. Reworking assessment in higher education to focus on more robust learning is an important step towards making assessment match the goals of the context where it is applied. In particular, assessment should focus on whether learning is robust (Koedinger, Corbett, & Perfetti, 2012): whether learning occurs in a way that transfers, prepares students for future learning, and is retained over time; and also on skills and meta-competencies that generalize across domains. By doing so, we can measure the outcomes that we as educators want to create, and increase the chance that our assessments help us improve them. In this article, we discuss and compare traditional test-based methods for assessing robust learning and new ways of inferring the robustness of learning while the learning itself is occurring, comparing the methods within the domain of college genetics.

Calls for Papers / Participation


Open Learning Analytics Network – Summit Europe Amsterdam | 1 January 2015 (APPLICATION DEADLINE: None, but spaces are limited)

Third International Conference on Data Mining & Knowledge Management Process Dubai, UAE | 23-24 January, 2015 (APPLICATION DEADLINE: 31 October 2014)

Learning at Scale 2015 Vancouver, BC (Canada) | 14 – 15 March 2015 (SUBMISSION DEADLINE: 22 October 2014)

2015 Southeast Educational Data Symposium (SEEDS) Emory University (Atlanta, GA) | 20 Feb 2015 (APPLICATION DEADLINE: 14 November 2014)

11th International Conference on Computer Supported Collaborative Learning: “Exploring the material conditions of learning: Opportunities and challenges for CSCL” University of Gothenburg, Sweden | 7 – 11 June 2015 (SUBMISSION DEADLINE: 17 November 2014)

28th annual Florida AI Research Symposium (FLAIRS-28) on Intelligent Learning Technologies Hollywood, Florida, USA (SUBMISSION DEADLINE: 17 November 2014)

Journals / Book Chapters

Universities and Knowledge Society Journal (RUSC): Special Section on Learning Analytics (SUBMISSION DEADLINE: 20 January 2015)

Employment Opportunities

Simon Fraser University (Burnaby, BC, Canada)
Tenure Track Position in Educational Technology and Learning Design – The Faculty of Education, Simon Fraser University, seeks applications for a tenure-track position in Educational Technology and Learning Design at the Assistant Professor rank, beginning September 1, 2015, or earlier. The successful candidate will join an existing complement of faculty engaged in Educational Technology and Learning Design, and will contribute to teaching and graduate student supervision in our vibrant Masters program at our Surrey campus and PhD program at our Burnaby campus. DEADLINE FOR APPLICATION: December 1, 2014

University of Technology, Sydney (Sydney, AUS)
Postdoctoral Research Fellow: Academic Writing Analytics – Postdoctoral research position specialising in the use of language technologies to provide learning analytics on the quality of student writing, across diverse levels, genres, and domains. DEADLINE FOR APPLICATION: Posted Until Filled

University of Michigan (Ann Arbor, MI)
Senior Digital Media Specialist – The University of Michigan is seeking a qualified Senior Digital Media Specialist to create digital content in support of online and residential educational experiences for the Office of Digital Education & Innovation (DEI). DEADLINE FOR APPLICATION: Posted Until Filled

NYU Steinhardt School of Culture, Education, and Human Development’s Center for Research on Higher Education Outcomes (USA)
12-month postdoctoral position – available for a qualified and creative individual with interests in postsecondary assessment, learning analytics, data management, and institutional research. The Postdoctoral Fellow will be responsible for promoting the use of institutional data sources and data systems to develop institutional assessment tools that can inform decision making and contribute to institutional improvement across New York University (NYU). DEADLINE FOR APPLICATION: Open Until Filled