18 September 2014
IBM Introduces Cloud-Based Natural Language Analytics Tools
Best of Blogs
- Post Script to #AMEE2014 #PCW16 Workshop on Personalized Learning by Natalie Lafferty
- An EdTech Thought Experiment by Joshua Kim
- #ectel2014 Workshop – Learning Analytics for and in Serious Games; what I learned… by Vanessa Camilleri
Analytics to Literacies: The Development of a Learning Analytics Framework for Multiliteracies Assessment
Shane Dawson & George Siemens
The rapid advances in information and communication technologies, coupled with increased access to information and the formation of global communities, have generated interest among researchers and academics in revising educational practice to move beyond traditional ‘literacy’ skills towards an enhanced set of “multiliteracies” or “new media literacies”. Measuring the literacy of a population, in the light of its linkage to individual and community wealth and wellbeing, is essential to determining the impact of compulsory education. The opportunity now is to develop tools to assess individual and societal attainment of these new literacies. Drawing on the work of Jenkins and colleagues (2006) and notions of a participatory culture, this paper proposes a conceptual framework for how learning analytics can assist in measuring individual achievement of multiliteracies, and how this evaluative process can be scaled to provide an institutional perspective on the educational progress in fostering these fundamental skills.
Educational Triage in Open Distance Learning: Walking a Moral Tightrope
Paul Prinsloo & Sharon Slade
Higher education, and more specifically distance education, is in the midst of a rapidly changing environment. Higher education institutions increasingly rely on the harvesting and analysis of student data to inform key strategic decisions across a wide range of issues, including marketing, enrolment, curriculum development, the appointment of staff, and student assessment. In the light of persistent concerns regarding student success and retention in distance education contexts, the harvesting and analysis of student data, in particular in the emerging field of learning analytics, holds much promise. As such, the notion of educational triage needs to be interrogated. Educational triage is defined as balancing the futility or impact of an intervention against the number of students requiring care, the scope of care required, and the resources available for care/interventions.
The central question posed by this article is “how do we make moral decisions when resources are (increasingly) limited?” An attempt is made to address this by discussing the use of data to support decisions regarding student support and by examining the concept of educational triage. Despite the increase in examples of institutions implementing a triage-based approach to student support, there is a serious lack of supporting conceptual and theoretical development and, more importantly, of consideration of the moral cost of triage in educational settings.
This article provides a conceptual framework to realise the potential of educational triage to responsibly and ethically respond to legitimate concerns about the “revolving door” in distance and online learning and the sustainability of higher education, without compromising ‘openness.’ The conceptual framework does not attempt to provide a detailed map, but rather a compass consisting of principles to consider in using learning analytics to classify students according to their perceived risk of failing and the potential of additional support to alleviate this risk.
Educational Dashboards for Smart Learning: Review of Case Studies
Yesom Yoo, Hyeyun Lee, Il-Hyun Jo, Yeonjeong Park
An educational dashboard is a display that visualizes the results of educational data mining in a useful way. Educational data mining and visualization techniques allow teachers and students to monitor and reflect on their online teaching and learning behavior patterns. Previous literature has included such information in dashboards to support students’ self-knowledge, self-evaluation, self-motivation, and social awareness. Further, educational dashboards are expected to support smart learning environments by providing students with personalized, automatically generated information in real time, drawing on log files from the Learning Management System (LMS). In this study, we reviewed ten case studies dealing with the development and evaluation of such tools, which support students and teachers through educational data mining techniques and visualization technologies. A conceptual framework based on Few’s principles of dashboard design and Kirkpatrick’s four-level evaluation model was developed to review the educational dashboards. Ultimately, this study is expected to assess the current state of educational dashboard development and to suggest an evaluative tool for judging whether a dashboard functions properly, both pedagogically and visually.
Learning Analytics Interoperability: Looking for Low-Hanging Fruits
Tore Hoel, Weiqin Chen
As Learning Analytics seeks a wider community, the challenge of moving data between systems efficiently and reliably becomes important. This paper summarizes the current status of Learning Analytics interoperability and proposes a framework to help structure the interoperability work. The model is based on a three-dimensional Enterprise Interoperability Framework mapping concerns, interoperability barriers, and potential solutions. The paper also introduces the concept of low-hanging fruits for prioritising among solutions. Data gathered from a small group of Norwegian stakeholders are analysed, and a list of potential interoperability issues is presented.
Causal Models and Big Data Learning Analytics
Vivekanandan Suresh Kumar, Kinshuk, Clayton Clemens, Steven Harris
New statistical methods allow discovery of causal models purely from observational data in some circumstances. Educational research that does not easily lend itself to experimental investigation can benefit from such discovery, particularly when the process of inquiry potentially affects measurement. Whether controlled or authentic, educational inquiry is sprinkled with hidden variables that only change over the long term, making them challenging and expensive to investigate experimentally. Big data learning analytics offers methods and techniques to observe such changes over longer terms at various levels of granularity. Learning analytics also allows construction of candidate models that expound hidden variables as well as their relationships with other variables of interest in the research. This article discusses the core ideas of causality and modeling of causality in the context of educational research with big data analytics as the underlying data supply mechanism. It provides results from studies that illustrate the need for causal modeling and how learning analytics could enhance the accuracy of causal models.
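The core idea behind discovering causal models from observational data is that the pattern of conditional independencies in the data constrains which causal structures are possible. A minimal sketch of that idea, using a hypothetical chain X → Y → Z with made-up noise levels and thresholds (none of this is drawn from the article): X and Z are correlated overall, but become independent once Y is controlled for, which is exactly the signal constraint-based discovery algorithms exploit.

```python
import random

def corr(a, b):
    # Pearson correlation of two equal-length samples.
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / (va * vb) ** 0.5

def partial_corr(a, b, c):
    # Correlation of a and b after controlling for c
    # (first-order partial correlation formula).
    rab, rac, rbc = corr(a, b), corr(a, c), corr(b, c)
    return (rab - rac * rbc) / (((1 - rac ** 2) * (1 - rbc ** 2)) ** 0.5)

# Simulate observational data from the causal chain X -> Y -> Z.
random.seed(0)
n = 5000
x = [random.gauss(0, 1) for _ in range(n)]
y = [xi + random.gauss(0, 0.5) for xi in x]   # Y caused by X
z = [yi + random.gauss(0, 0.5) for yi in y]   # Z caused by Y

marginal = corr(x, z)            # strong: X influences Z through Y
conditional = partial_corr(x, z, y)  # near zero: Y screens X off from Z
```

Observing that `conditional` vanishes while `marginal` does not rules out a direct X → Z edge, which is how candidate causal models are pruned from purely observational (here, log-style) data.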
Calls for Papers / Participation
Call for Papers: Ethical & Privacy Issues in the Application of Learning Analytics, Project LACE / SURF SIG LA (DEADLINE: 24 September 2014)
5th International Learning Analytics and Knowledge (LAK) Conference Marist College (Poughkeepsie, NY) | 16-20 March 2015 (SUBMISSION DEADLINE: 14 October 2014)
NEW! ALASI 2014: Australian Learning Analytics Summer Institute University of Technology Sydney (Sydney, AU) | 20-21 Nov 2014 (APPLICATION DEADLINE: 17 October 2014)
6 – 9 October 2014
Learning Analytics Week
École polytechnique fédérale de Lausanne
10 – 16 October 2014
Seminar on Learning Analytics for Schools in Stockholm
15 October 2014
ALE Speaker Series: Charles Dziuban on Engaging Students in an Engaging Educational Environment
Emory University (Atlanta, GA)
20 October 2014
Data, Analytics and Learning: An introduction to the logic and methods of analysis of data to improve teaching and learning
University of Texas Arlington | EdX
24 October 2014
LACE SoLAR Flare
The Open University (Milton Keynes, UK)
NEW! University of Wisconsin – Madison (USA)
The person in this position will be part of a dynamic evaluation team, situated in the Division of Information Technology (DoIT), Academic Technology (AT), as part of the Evaluation Design & Analysis service. The evaluation consultants in this service provide educational program and product evaluation services for DoIT AT and campus clients, as well as evaluation project management for campus evaluation projects. The position reports to the Assistant Director of the Learning Technology and Distance Education (LTDE) group within DoIT AT. DEADLINE EXTENDED UNTIL: 23 September 2014
University of Queensland (Australia)
The new Institute for Teaching and Learning Innovation (ITALI) will provide leadership, engagement and advocacy in educational innovation, teaching excellence and learning analytics and aims to transform & innovate teaching, learning and creativity. DEADLINE FOR APPLICATION: 22 September 2014
Taylor’s University (Malaysia)
The Learning Analytics Specialist is responsible for strategy development, data analysis, education evaluation and learning analytics. He/She will support the Taylor’s University learning analytics strategic priorities through the design, development and implementation of an innovative and sustainable learning analytics strategy and will engage with academic staff to encourage the use of analytics tools to support teaching and learning. DEADLINE FOR APPLICATION: 3 October 2014
NEW! Boise State University (USA)
Drawing on both technical and analytical competencies, the Report Developer for Academic Analytics in the Office of Institutional Research will collaborate with the IR staff, data consumers, and colleagues in OIT/Business Intelligence and Reporting Services to determine requirements, design solutions, and model data. The Report Developer will perform user and data-driven research, review and verify data, and develop end-user reports. She/he will develop a broad body of data reports and conduct analytic work, with a particular focus on academic analytics, ensuring integrity, timeliness, and appropriate presentation. The Report Developer will be expected to develop collegial working relationships within the unit and across campus units; continually hone her/his technical, analytical, and problem solving skills; and participate in committees and other workgroups as assigned. DEADLINE FOR APPLICATION: 6 October 2014
Delft Extension School (The Netherlands)
We are looking for a researcher (PhD or Postdoc level) whose research focuses on the modeling and analysis of learners and their learning processes in the context of MOOCs (Massive Open Online Courses). DEADLINE FOR APPLICATION: 15 October 2014