This Week in Learning Analytics (November 22 – 28, 2014)

“We’re Watching You”
Pictured left to right: Ty Ty, Poohie, and Pocket

News

Student Privacy and Ethics

November 25, 2014
Seattle Public Schools waited days to tell parents of huge student information leak.
While pursuing an administrative claim against the Seattle School Board, a man accidentally received two large PDF files containing personal information about nearly all of the district’s special education students. LESSON: Do not send sensitive student information over email.

November 20, 2014
ClassDojo to Offer Fix for Student Data Privacy Issues.
ClassDojo (a student conduct tracking app) has announced that it will keep children’s behavioral statistics for only one school year. The announcement comes amidst criticism sparked by a recent New York Times article.

Industry Updates

November 28, 2014
Find Your Match: Data Companies Transform College Applications.
Parchment, LinkedIn, and Admittedly offer students college matching tools that predict student interest on the basis of GPA, SAT scores, state, race, and other information. Some worry that such tools will oversimplify college decisions, while others point out that these kinds of recommender systems have the ability to connect students with institutions that would not otherwise be on their radar.

Awards and Accolades

November 24, 2014
Tom Enders Nationally Recognized for Visionary Leadership in Student Success.
Thomas Enders, associate vice president of Enrollment Services at Cal State Long Beach (CSULB), has been presented with the Visionary Leadership Award from the Education Advisory Board (EAB) for the work he has done to increase his institution’s first-time freshman six-year graduation rate. Enders’s results came from a combination of predictive analytics and the implementation of an ambitious eAdvising initiative.

Blogs

How to prepare a new kind of classroom teacher by Jill Harvieux Pitner
Increased emphasis is being placed upon incorporating data literacy into American teacher training. Existing approaches to fostering data literacy involve training in data literacy as a decontextualized skill. The Urban Teacher Residency United (UTRU) Assessment and Data Literacy Scope and Sequence seeks to embed training in the use of educational data into all pre-service coursework modules, and in a way that is closely aligned with training in content areas and pedagogy.

Principal uncovers flawed data in her state’s official education reports by Carol Burris
Award-winning Principal Carol Burris of South Side High School in New York comments on problems associated with making significant ‘data-driven’ policy decisions on the basis of poor-quality and incomplete data. ‘Bigger data’ can lull decision-makers into a false sense of certainty that obscures significant gaps. Where decisions have a real impact on the lives and behaviors of people, it is incumbent upon ‘data-driven’ decision-makers to get their priorities straight and focus on data quality ahead of quantity. It’s time that ‘Big Data’ became ‘Better Data.’

Five Reasons You Shouldn’t Use Technology In The Classroom by Andrew Campbell
The author cites privacy and security concerns as the number one reason why teachers should think twice about incorporating edtech into their classrooms: “My intent is not to prevent or dissuade educators from using EdTech, but rather to ensure more do so. “Non-techy” teachers are smarter than EdTech advocates give them credit for. They know that if something sounds too good to be true, it probably is.”

SOPIPA: A first step towards national standards for student data protection
The author makes a case for federal student privacy standards along the lines of California’s Student Online Personal Information Protection Act (SOPIPA). He identifies several gaps in SOPIPA, but nevertheless upholds the act as an admirable first step on the road to establishing more universal legislation.

Publications

Articles

Participation-Based Student Final Performance Prediction Model through Interpretable Genetic Programming: Integrating Learning Analytics, Educational Data Mining and Theory
Wanli Xing, Rui Guo, Eva Petakovic, & Sean Goggins

Building a student performance prediction model that is both practical and understandable for users is a challenging task fraught with confounding factors to collect and measure. Most current prediction models are difficult for teachers to interpret. This poses significant problems for model use (e.g. personalizing education and intervention) as well as model evaluation. In this paper, we synthesize learning analytics approaches, educational data mining (EDM) and HCI theory to explore the development of more usable prediction models and prediction model representations using data from a collaborative geometry problem solving environment: Virtual Math Teams with Geogebra (VMTwG). First, based on theory proposed by Hrastinski (2009) establishing online learning as online participation, we operationalized activity theory to holistically quantify students’ participation in the CSCL (Computer-supported Collaborative Learning) course. As a result, six variables (Subject, Rules, Tools, Division of Labor, Community, and Object) are constructed. This analysis of variables prior to the application of a model distinguishes our approach from prior approaches (feature selection, ad hoc guesswork, etc.). The approach described diminishes data dimensionality and systematically contextualizes data in a semantic background. Secondly, an advanced modeling technique, Genetic Programming (GP), underlies the developed prediction model. We demonstrate how connecting the structure of VMTwG trace data to a theoretical framework and processing that data using the GP algorithmic approach outperforms traditional models in prediction rate and interpretability. Theoretical and practical implications are then discussed.
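
The authors’ GP implementation is not included here, but for readers who want a feel for the approach, below is a minimal sketch that fits an interpretable symbolic model using the gplearn library as a stand-in. The six feature columns and the toy target are illustrative assumptions, not the VMTwG data.

```python
# Sketch only: gplearn stands in for the authors' GP system, and the
# data below is synthetic. The six columns mirror the activity-theory
# variables named in the abstract.
import numpy as np
from gplearn.genetic import SymbolicRegressor

rng = np.random.default_rng(0)
# Hypothetical participation features per student:
# Subject, Rules, Tools, Division of Labor, Community, Object
X = rng.random((120, 6))
y = 0.5 * X[:, 0] + 0.3 * X[:, 4] + rng.normal(0, 0.05, 120)  # toy outcome

model = SymbolicRegressor(population_size=500, generations=20,
                          function_set=('add', 'sub', 'mul'),
                          parsimony_coefficient=0.01, random_state=0)
model.fit(X, y)
# The evolved expression is itself the model, which is what makes the
# approach interpretable for teachers.
print(model._program)
```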

Reports

Learning Analytics: Theoretical Background, Methodology and Expected Results
European Multiple MOOC Aggregator

“Learning analytics in the EMMA project will focus on: a) real-time analytics through learning analytics dashboards for instructors and students; b) retrospective analysis of the digital traces in the EMMA platform. The first approach aims to support participants’ learning activities, whereas the second is intended for more in-depth analysis of the MOOCs and overall EMMA evaluation. As EMMA is a MOOC platform, calculating dropout and clustering the participants will be among the research aims. Additionally, uptake of knowledge, students’ progress, and social structures emerging from MOOCs will be analyzed in the pilot phase.”
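
As a rough companion to the report, here is a minimal sketch of the two analyses it names: a dropout rate and clustering of participants by their digital traces. The data layout is an assumption for illustration, not EMMA’s actual schema.

```python
# Sketch only: synthetic trace data standing in for EMMA platform logs.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical per-participant features:
# [logins, videos watched, posts written, assignments submitted]
traces = np.array([[12, 30, 5, 4], [2, 1, 0, 0], [20, 45, 12, 6],
                   [1, 0, 0, 0], [15, 28, 3, 5], [3, 2, 1, 0]])
completed = np.array([1, 0, 1, 0, 1, 0])  # toy completion flags

print(f"dropout rate: {1 - completed.mean():.0%}")

# Group participants into engagement profiles (k=2 chosen arbitrarily).
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(
    StandardScaler().fit_transform(traces))
print("cluster assignments:", labels)
```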

Videos, Presentations, and Webinars

Three Paths for Learning Analytics and Beyond: Moving from Rhetoric to Reality
Colin Beer, Rolley Tickner, & David Jones

Applying Learning Analytics in Serious Games
Baltasar Fernandez-Manjon

Learning Analytics for Holistic Improvement
Ruth Deakin Crick

Calls for Papers / Participation

Conferences

NEW! Workshop: It’s About Time: 4th International Workshop on Temporal Analyses of Learning Data @LAK15 Poughkeepsie, NY | 16 – 20 March, 2015 (SUBMISSION DEADLINE: 11 January 2015)

EDM 2015: 8th International Conference on Educational Data Mining Madrid, Spain | 26 – 29 June, 2015 (SUBMISSION DEADLINE: 12 January 2015)

NEW! Workshop: Ethics and Privacy in Learning Analytics (#EP4LA) @LAK15 Poughkeepsie, NY | 16 – 20 March, 2015 (SUBMISSION DEADLINE: 15 January 2015)

NEW! Workshop: LAK Data Challenge 2015 Poughkeepsie, NY | 16 – 20 March, 2015 (SUBMISSION DEADLINE: 31 January 2015)

EDEN Annual Conference Barcelona, Spain | 9 – 12 June, 2015 (SUBMISSION DEADLINE: 31 January 2015)

NEW! The Fourth International Conference on Data Analytics Poughkeepsie, NY | 19 – 24 July, 2015 (SUBMISSION DEADLINE: 27 February 2015)

Journals / Book Chapters

Universities and Knowledge Society Journal (RUSC): Special Section on Learning Analytics (SUBMISSION DEADLINE: 20 January 2015)

Journal of Learning Analytics: Special Section on Multimodal Learning Analytics (SUBMISSION DEADLINE: 1 March 2015)

Employment Opportunities

Data & Society Research Institute (New York, NY)
Researcher, Enabling Connected Learning – seeking either a full-time or part-time researcher to help drive the research components of this project. Start date is negotiable and the appointment is for two years (with renewal possibilities). Applicants should have a PhD in a social science or related field or significant experience doing similar types of research. Applicants may be postdocs or more advanced researchers. This is a fully funded position with benefits and vacation; salary is dependent on experience. The appointment requires residency in New York. Travel may be necessary, both for conducting the research and for disseminating findings. DEADLINE FOR APPLICATION: Posted Until Filled

Simon Fraser University (Burnaby, BC, Canada)
Tenure Track Position In Educational Technology And Learning Design – The Faculty of Education, Simon Fraser University (http://www.sfu.ca/education.html) seeks applications for a tenure-track position in Educational Technology and Learning Design at the Assistant Professor rank beginning September 1, 2015, or earlier. The successful candidate will join an existing complement of faculty engaged in Educational Technology and Learning Design, and will contribute to teaching and graduate student supervision in our vibrant Masters program at our Surrey campus and PhD program at our Burnaby campus. DEADLINE FOR APPLICATION: December 1, 2014

University at Buffalo (Buffalo, NY, USA)
Associate for Institutional Research/Research Scientist: Online Learning Analytics – The University at Buffalo (UB), State University of New York seeks a scholar in online learning analytics to join its newly formed Center for Educational Innovation. Reporting to the Senior Vice-Provost for Academic Affairs, the Center for Educational Innovation has a mission to support and guide the campus on issues related to teaching, learning and assessment, and at the same time serves as a nexus for campus-wide efforts to further elevate the scholarship of and research support for pedagogical advancement and improved learning. The Research Scientist in online learning analytics will work in the area of Online Learning within the department and join a campus-wide network of faculty and researchers working on “big data”. DEADLINE FOR APPLICATION: December 6, 2014

University of Colorado Boulder (Boulder, Colorado, USA)
Multiple Tenure Track Positions in Computer Science – The openings are targeted at the level of Assistant Professor, although exceptional candidates at higher ranks may be considered. Research areas of particular interest include secure and reliable software systems, numerical optimization and high-performance scientific computing, and network science and machine learning. DEADLINE FOR APPLICATION: Posted Until Filled

University of Technology, Sydney (Sydney, AUS)
Postdoctoral Research Fellow: Academic Writing Analytics – Postdoctoral research position specialising in the use of language technologies to provide learning analytics on the quality of student writing, across diverse levels, genres, and domains. DEADLINE FOR APPLICATION: Posted Until Filled

University of Michigan (Ann Arbor, MI)
Senior Digital Media Specialist – The University of Michigan is seeking a qualified Senior Digital Media Specialist to create digital content in support of online and residential educational experiences for the Office of Digital Education & Innovation (DEI). DEADLINE FOR APPLICATION: Posted Until Filled

NYU Steinhardt School of Culture, Education, and Human Development’s Center for Research on Higher Education Outcomes (USA)
12-month postdoctoral position – available for a qualified and creative individual with interests in postsecondary assessment, learning analytics, data management, and institutional research. The Postdoctoral Fellow will be responsible for promoting the use of institutional data sources and data systems for the purpose of developing institutional assessment tools that can inform decision making and contribute to institutional improvement across New York University (NYU). DEADLINE FOR APPLICATION: Open Until Filled

This Week in Learning Analytics (November 15 – 21, 2014)

Latest News

17 November 2014
University of Texas at Arlington to Lead $1.6M Digital Learning Research Effort
The Learning Innovation and Networked Knowledge (LINK) Lab at UT Arlington has been chosen to lead a $1.6 million initiative to connect and support researchers across the country as they examine digital learning’s effect on higher education today and in the future. The new Digital Learning Research Network (dLRN) is funded by a grant from the Bill & Melinda Gates Foundation. LINK Lab Executive Director George Siemens will coordinate work between UT Arlington and nine additional institutions, including Carnegie Mellon University, Stanford University, Teachers College Columbia University, the Smithsonian Institution, University of Michigan, Ann Arbor, and others.

Latest Blogs

Standards and Specifications Quick Reference Guide by Adam Cooper
A list of standards and specifications, including research work, which may be relevant to people building Learning Analytics systems. The post includes a link to a public draft, open for comment, of a “Specification and Standards Quick Reference Guide” written by Adam Cooper for the Learning Analytics Community Exchange (LACE). As noted in a comment by Rebecca Ferguson, the document appears to be written for a technical audience with experience in software development and interoperability standards. Its technical language aside, the document is incredibly valuable in its current form, and I expect it to become even more so through public comment. I highly recommend reviewing this document, sharing it among more technical personnel, and contributing comments in service to the learning analytics community in general.

Teaching: Can we Blend Tradition with Technology? by Christopher Scanlon
Scanlon looks at ‘traditionalist’ and ‘technologist’ perspectives on teaching and learning and argues that both are right in certain ways. He notes that present concerns about online education came about as a matter of course (new technologies always make educators uneasy), but also in response to earlier iterations of online teaching, which were ‘cheap and nasty.’ Learning analytics provides new ways to track and target students, customizing content in much the same way as the teacher qua orator, but in a way that both scales and accepts the technologized nature of contemporary student life.

Next Generation Online Learning by Steven Mintz
Mintz identifies five contrasting ways to achieve four next-generation aspirations: (1) learner focus, (2) interactivity, (3) scalability, and (4) a decreased cost-to-quality ratio. In this, Mintz presents analytics as key to scaling education, through the ability to offer real-time feedback and facilitate credentialization. I worry, however, about the emphasis that Mintz places upon modularity, an emphasis that is becoming increasingly common as a result of Clay Christensen’s ‘disruptive’ influence. In start-up culture, profit motives and agility are too often associated with innovation, but at the expense of reflection upon more fundamental questions about the nature of education and the meaning of student success. What Mintz identifies as ‘aspirations’ are not goals (an aspiration is prescriptive by definition), but rather pre-existing cultural movements. I worry about embracing cultural trends as if they were goals, and would far rather see education as a space in which such trends might be called into question through a willingness to aspire to goals that are truly aspirational and future-focussed…even to the point of being counter-cultural.

Ten Educational Innovations to Watch For in the Next Ten Years by Keith Sawyer
A summary of a recent report written by a team of researchers led by Mike Sharples at the Open University (UK). Sawyer smartly acknowledges the commonplace and largely useless nature of the vast majority of top-ten lists on the internet, but highly recommends the report as representing the work of researchers at the cutting edge of learning science. The list includes massive open social learning, learning design informed by analytics, flipped classrooms, bring your own devices, learning to learn, dynamic assessment, event-based learning, learning through storytelling, threshold concepts, and bricolage.

Technology that puts the classroom in students’ laptops by Adam Stanley
Written for the Globe and Mail, the piece is all over the place. In essence, technology is helping institutions to offer education in much the same way as Starbucks serves coffee (bugs and all?). The aim on the part of many educators in Canada is to use technology to support learner-centered models of education, in a way that fosters reflection and serves the bottom line. Nine out of ten students are at least moderately interested in learning analytics…what students want, students get.

Featured Videos, Presentations & Webinars

Open Learning Analytics Panel at the Open Education Conference
Josh Baron, Stian Håklev, and Norman Bier

Analytics for Decision Making in Learning Environments
Abelardo Pardo

Recent Publications

Learning Analytics to Quantize and Improve the Skills Development and Attainment in Large Classes
Ishwari Aghav & Jagannath Aghav

The intervention of technology in teaching and learning processes is bringing change, and methods and techniques are required to analyze the data generated in these processes. Traditional modes of delivery and assessment are becoming more focused through the application of technology. In this paper, we propose and illustrate an algorithmic methodology that allows stakeholders in education to focus more effectively on skills attainment.

In a large class, a huge amount of data is generated by the registration system, the administration system, and the learning management system together. We propose a methodology that maps and quantizes the learning outcomes and relates them to the successful and weaker ranges of attainment. The algorithmic methodology reads the data of a large class (course details, outcomes, and assessment scores) and analyzes skills attainment with respect to each planned course outcome. A detailed report is generated with the class average attainment, the highest attainment, and the lowest attainment in each course outcome for each student.
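
The report format described above is straightforward to prototype. The sketch below uses hypothetical scores, not the authors’ code, to map assessment scores to course outcomes and print the class average, highest, and lowest attainment for each outcome.

```python
# Sketch only: toy scores in place of real registration/LMS data.
import numpy as np

# Rows = students, columns = course outcomes, values = % attainment
# on the assessments mapped to that outcome.
scores = np.array([[85, 60, 72],
                   [40, 55, 90],
                   [95, 80, 65],
                   [30, 45, 50]], dtype=float)
outcomes = ["CO1", "CO2", "CO3"]

for j, co in enumerate(outcomes):
    col = scores[:, j]
    print(f"{co}: mean={col.mean():.1f} max={col.max():.0f} min={col.min():.0f}")
    weak = np.where(col < 50)[0]  # the 'weaker range of attainment'
    if weak.size:
        print(f"  students below 50%: {weak.tolist()}")
```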

Learning Analytics: The Current State of Play in UK Higher and Further Education
Niall Sclater

This report examines the current situation in a range of universities and colleges across the UK. A number of institutions which were known to be carrying out work in learning analytics were approached to see if they would be prepared to be interviewed. The resulting list includes ten universities, two colleges and the University of London Computing Centre, which hosts the virtual learning environments (VLEs) of more than a hundred organisations and is planning to further develop its analytics capabilities. While the list of institutions cannot be considered representative of UK tertiary education, it does provide a snapshot of activity in a wide variety of institutions, and includes some of the most high-profile developments.

Calls for Papers / Participation

Conferences

Open Learning Analytics Network – Summit Europe 2014 Amsterdam | 1 December 2014 (APPLICATION DEADLINE: None, but spaces are limited)

EDEN Annual Conference Barcelona, Spain | 9 – 12 June, 2015 (SUBMISSION DEADLINE: 31 January 2015)

EDM 2015: 8th International Conference on Educational Data Mining Madrid, Spain | 26 – 29 June, 2015 (SUBMISSION DEADLINE: 12 January 2015)

Journals / Book Chapters

Universities and Knowledge Society Journal (RUSC): Special Section on Learning Analytics (SUBMISSION DEADLINE: 20 January 2015)

NEW! Journal of Learning Analytics: Special Section on Multimodal Learning Analytics (SUBMISSION DEADLINE: 1 March 2015)

Employment Opportunities

NEW! Data & Society Research Institute (New York, NY)
Researcher, Enabling Connected Learning – seeking either a full-time or part-time researcher to help drive the research components of this project. Start date is negotiable and the appointment is for two years (with renewal possibilities). Applicants should have a PhD in a social science or related field or significant experience doing similar types of research. Applicants may be postdocs or more advanced researchers. This is a fully funded position with benefits and vacation; salary is dependent on experience. The appointment requires residency in New York. Travel may be necessary, both for conducting the research and for disseminating findings. DEADLINE FOR APPLICATION: Posted Until Filled

Leiden University (Leiden, Netherlands)
Postdoc Data & Learning Analytics for Online Learning – The position involves doing research and conducting experiments within the Online Learning Lab ( http://leidenuniv.onlinelearninglab.org/ ). Its purpose: 1) to improve online learning and 2) to publish in professional and scientific journals. DEADLINE FOR APPLICATION: November 28, 2014

Simon Fraser University (Burnaby, BC, Canada)
Tenure Track Position In Educational Technology And Learning Design – The Faculty of Education, Simon Fraser University (http://www.sfu.ca/education.html) seeks applications for a tenure-track position in Educational Technology and Learning Design at the Assistant Professor rank beginning September 1, 2015, or earlier. The successful candidate will join an existing complement of faculty engaged in Educational Technology and Learning Design, and will contribute to teaching and graduate student supervision in our vibrant Masters program at our Surrey campus and PhD program at our Burnaby campus. DEADLINE FOR APPLICATION: December 1, 2014

University at Buffalo (Buffalo, NY, USA)
Associate for Institutional Research/Research Scientist: Online Learning Analytics – The University at Buffalo (UB), State University of New York seeks a scholar in online learning analytics to join its newly formed Center for Educational Innovation. Reporting to the Senior Vice-Provost for Academic Affairs, the Center for Educational Innovation has a mission to support and guide the campus on issues related to teaching, learning and assessment, and at the same time serves as a nexus for campus-wide efforts to further elevate the scholarship of and research support for pedagogical advancement and improved learning. The Research Scientist in online learning analytics will work in the area of Online Learning within the department and join a campus-wide network of faculty and researchers working on “big data”. DEADLINE FOR APPLICATION: December 6, 2014

University of Colorado Boulder (Boulder, Colorado, USA)
Multiple Tenure Track Positions in Computer Science – The openings are targeted at the level of Assistant Professor, although exceptional candidates at higher ranks may be considered. Research areas of particular interest include secure and reliable software systems, numerical optimization and high-performance scientific computing, and network science and machine learning. DEADLINE FOR APPLICATION: Posted Until Filled

University of Technology, Sydney (Sydney, AUS)
Postdoctoral Research Fellow: Academic Writing Analytics – Postdoctoral research position specialising in the use of language technologies to provide learning analytics on the quality of student writing, across diverse levels, genres, and domains. DEADLINE FOR APPLICATION: Posted Until Filled

University of Michigan (Ann Arbor, MI)
Senior Digital Media Specialist – The University of Michigan is seeking a qualified Senior Digital Media Specialist to create digital content in support of online and residential educational experiences for the Office of Digital Education & Innovation (DEI). DEADLINE FOR APPLICATION: Posted Until Filled

NYU Steinhardt School of Culture, Education, and Human Development’s Center for Research on Higher Education Outcomes (USA)
12-month postdoctoral position – available for a qualified and creative individual with interests in postsecondary assessment, learning analytics, data management, and institutional research. The Postdoctoral Fellow will be responsible for promoting the use of institutional data sources and data systems for the purpose of developing institutional assessment tools that can inform decision making and contribute to institutional improvement across New York University (NYU). DEADLINE FOR APPLICATION: Open Until Filled

This Week in Learning Analytics (November 8 – 14, 2014)

Beautiful photo taken by my wife, Elisa Wallace, at our home, Rock Creek Farms. Pictured is Hwin, whom Elisa is currently training to compete at the 2015 Mustang Magic competition in Fort Worth, TX.

Latest News

14 November 2014
DeltaDNA makes analytics services free to academia
Scotland-based analytics firm DeltaDNA has created a new free licence for universities and other academic institutions. The unlimited, non-commercial academic licence gives users access to the firm’s data platform and complete analytics and player relationship management toolset in order to better understand the nature of player behaviour in free-to-play games […]

12 November 2014
STAR Assessments Approved by California Department of Education for Diagnostic Assessment in Grade Two
Renaissance Learning, a K-12 assessment and learning analytics company, announced today that schools in California will be able to use its STAR Early Learning (consisting of STAR Reading and STAR Early Literacy) and STAR Math to meet requirements for grade two in the state’s newly adopted assessment system. […]

11 November 2014
MRUN and IRU to work together on 3-year research projects starting 2015
Malaysia’s Ministry of Higher Education and the Innovative Research Universities (IRU) of Australia have agreed to each allocate RM2 million for a three-year research programme collaboration beginning next year. The programme, which involves the Malaysia Research University Network (MRUN) and IRU, aims to improve learning and teaching skills and methods through technology in universities and to undertake in-depth research activities in fields of combined interest. […]

Latest Blogs

From MOOCs to Learning Analytics: Scratching the surface of the ‘visual’ by Jeremy Knox
A thoughtful and well-written piece exploring the common visual element in both MOOCs and learning analytics, and expressing concern about the emphasis on tools over pedagogy and the extent to which an emphasis on visual elements might co-opt deep engagement and critical reflection on the part of learners and practitioners alike.

Featured Videos, Presentations & Webinars



Recent Publications

Am I doing well? A4Learning as a self-awareness tool to integrate in Learning Management Systems
Luis De La Fuente Valentín & Daniel Burgos

Most current online education scenarios use a Learning Management System (LMS) as the basecamp for course activities. The LMS offers some centralized services and also integrates functionality from third-party (cloud) services. This integration enriches the platform and increases the educational opportunities of the scenario. In such a distance scenario, with students working in different physical locations, they find it difficult to determine whether their activity level matches the expectations of the course. A4Learning performs a daily-updated analysis of learners’ activities by establishing the similarity between two given students; that is, it finds students who are doing similar things in the Learning Management System. The system then finds and represents how similar students achieve similar results in the course. A4Learning can be integrated within the LMS to provide students with a visual representation of their similarity to others as an awareness mechanism, so that students can see the achievements of similar students in previous courses and estimate their own performance.
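
A4Learning itself is not open for inspection here, but its core idea, scoring the similarity between students’ activity profiles and looking up how similar students fared, can be sketched in a few lines. The feature names and grades below are assumptions.

```python
# Sketch only: cosine similarity over toy LMS activity profiles.
import numpy as np
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical activity profile: [logins, forum posts, quiz attempts]
current = np.array([[5, 2, 1]])
past = np.array([[6, 3, 1], [1, 0, 0], [5, 1, 2]])
past_grades = np.array([82, 35, 78])  # toy final grades

sims = cosine_similarity(current, past)[0]
nearest = sims.argsort()[::-1][:2]  # two most similar past students
print(f"similarities: {np.round(sims, 2)}")
print(f"estimated grade from similar peers: {past_grades[nearest].mean():.0f}")
```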

A Survey on the Classification Techniques In Educational Data Mining
Nitya Upadhyay & Vinodini Katiyar

Due to increasing interest in data mining and educational systems, educational data mining is an emerging topic for the research community. Educational data mining means extracting hidden knowledge from large repositories of data with the use of techniques and tools. It develops new methods to discover knowledge from educational databases that are used for decision making in educational systems. The various techniques of data mining, such as classification and clustering, can be applied to bring out hidden knowledge from educational data. In this paper, we focus on educational data mining and classification techniques. We analyze attributes for the prediction of students’ behavior and academic performance using the WEKA open-source data mining tool and various classification methods, such as decision trees and the C4.5 and ID3 algorithms.
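
The survey’s experiments run in WEKA; as a rough Python analogue (scikit-learn offers CART rather than C4.5 or ID3, so this is a stand-in, and the student attributes are invented), a classification tree over toy data looks like this:

```python
# Sketch only: CART as a stand-in for WEKA's C4.5/ID3 classifiers.
from sklearn.tree import DecisionTreeClassifier, export_text

# Toy attributes: [attendance %, assignment average, forum posts]
X = [[90, 85, 12], [45, 50, 1], [80, 70, 6], [30, 40, 0], [95, 90, 15]]
y = ["pass", "fail", "pass", "fail", "pass"]

clf = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
# Print the learned rules, the readable artifact such studies compare.
print(export_text(clf, feature_names=["attendance", "assign_avg", "posts"]))
```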

Proceedings of the 2014 ACM workshop on Multimodal Learning Analytics Workshop and Grand Challenge
Editors: Xavier Ochoa, Marcelo Worsley, Katherine Chiluiza & Saturnino Luz

Learning Analytics is the “middle space” where Educational Sciences, Computer Science, Learning Technologies and Data Science converge. The main goal of this new field of knowledge is to contribute new empirical findings, theories, methods, and metrics for understanding how students learn, and to use that knowledge to improve those students’ learning. Multimodal Learning Analytics, which emphasizes the analysis of natural, rich modalities of communication during situated learning activities, is one of the most challenging but, at times, most promising areas of Learning Analytics. The Third International Workshop on Multimodal Learning Analytics brings together researchers in multimodal interaction and systems, cognitive and learning sciences, educational technologies, and related areas to discuss recent developments and future opportunities in this sub-field.

Emotion analysis meets learning analytics: online learner profiling beyond numerical data
Calkin Suero Montero & Jarkko Suhonen

Learning analytics is an emerging field of research, which deals with collecting and analysing data about learners and their learning context, as well as developing solutions that utilise the analysed data. Traditionally, learning analytics methods focus on the analysis of learners’ digital trails or numerical big data, e.g., online material access, digital learners’ records, grades, and length of interaction with the learning environment. However, profiling a learner without taking into account the emotional aspects that may hinder the learner’s progress, can only offer an incomplete view of the learning experience. Hence, in this paper, we elaborate on the fusion of emotional aspects (i.e., emotion data) and learning analytics, specifically in online learning settings. We bring an open discussion to the educational technology community regarding the potential of analysing learner’s emotions from pedagogical texts (i.e., non-structured text data) generated during an online course. We also discuss the role of negative emotions during learning, the ethical issues with the use of emotion data and the technology acceptance and reliability.
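
To make the paper’s proposal concrete, here is a minimal lexicon-based sketch of surfacing emotion signals from learners’ forum text. The tiny lexicon is an invented placeholder; real work would use a validated affect lexicon or a trained classifier.

```python
# Sketch only: a toy emotion lexicon, not a validated resource.
from collections import Counter

EMOTION_LEXICON = {
    "confused": "frustration", "stuck": "frustration",
    "enjoyed": "joy", "excited": "joy",
    "anxious": "anxiety", "worried": "anxiety",
}

def emotion_profile(posts):
    """Count emotion-bearing words across a learner's forum posts."""
    counts = Counter()
    for post in posts:
        for word in post.lower().split():
            token = word.strip(".,!?'\"")
            if token in EMOTION_LEXICON:
                counts[EMOTION_LEXICON[token]] += 1
    return counts

print(emotion_profile(["I enjoyed the video but I'm stuck on task 2.",
                       "Feeling worried about the deadline."]))
```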

Calls for Papers / Participation

Conferences

Open Learning Analytics Network – Summit Europe 2014 Amsterdam | 1 December 2014 (APPLICATION DEADLINE: None, but spaces are limited)

NEW! EDEN Annual Conference Barcelona, Spain | 9 – 12 June, 2015 (SUBMISSION DEADLINE: 31 January 2015)

EDM 2015: 8th International Conference on Educational Data Mining Madrid, Spain | 26 – 29 June, 2015 (SUBMISSION DEADLINE: 12 January 2015)

Journals / Book Chapters

Universities and Knowledge Society Journal (RUSC): Special Section on Learning Analytics (SUBMISSION DEADLINE: 20 January 2015)

Employment Opportunities

NEW! Leiden University (Leiden, Netherlands)
Postdoc Data & Learning Analytics for Online Learning – The position involves doing research and conducting experiments within the Online Learning Lab ( http://leidenuniv.onlinelearninglab.org/ ). Its purpose: 1) to improve online learning and 2) to publish in professional and scientific journals. DEADLINE FOR APPLICATION: November 28, 2014

Simon Fraser University (Burnaby, BC, Canada)
Tenure Track Position In Educational Technology And Learning Design – The Faculty of Education, Simon Fraser University (http://www.sfu.ca/education.html) seeks applications for a tenure-track position in Educational Technology and Learning Design at the Assistant Professor rank beginning September 1, 2015, or earlier. The successful candidate will join an existing complement of faculty engaged in Educational Technology and Learning Design, and will contribute to teaching and graduate student supervision in our vibrant Masters program at our Surrey campus and PhD program at our Burnaby campus. DEADLINE FOR APPLICATION: December 1, 2014

University at Buffalo (Buffalo, NY, USA)
Associate for Institutional Research/Research Scientist: Online Learning Analytics – The University at Buffalo (UB), State University of New York seeks a scholar in online learning analytics to join its newly formed Center for Educational Innovation. Reporting to the Senior Vice-Provost for Academic Affairs, the Center for Educational Innovation has a mission to support and guide the campus on issues related to teaching, learning and assessment, and at the same time serves as a nexus for campus-wide efforts to further elevate the scholarship of and research support for pedagogical advancement and improved learning. The Research Scientist in online learning analytics will work in the area of Online Learning within the department and join a campus-wide network of faculty and researchers working on “big data”. DEADLINE FOR APPLICATION: December 6, 2014

University of Colorado Boulder (Boulder, Colorado, USA)
Multiple Tenure Track Positions in Computer Science – The openings are targeted at the level of Assistant Professor, although exceptional candidates at higher ranks may be considered. Research areas of particular interest include secure and reliable software systems, numerical optimization and high-performance scientific computing, and network science and machine learning. DEADLINE FOR APPLICATION: Posted Until Filled

University of Technology, Sydney (Sydney, AUS)
Postdoctoral Research Fellow: Academic Writing Analytics – Postdoctoral research position specialising in the use of language technologies to provide learning analytics on the quality of student writing, across diverse levels, genres, and domains. DEADLINE FOR APPLICATION: Posted Until Filled

University of Michigan (Ann Arbor, MI)
Senior Digital Media Specialist – The University of Michigan is seeking a qualified Senior Digital Media Specialist to create digital content in support of online and residential educational experiences for the Office of Digital Education & Innovation (DEI). DEADLINE FOR APPLICATION: Posted Until Filled

NYU Steinhardt School of Culture, Education, and Human Development’s Center for Research on Higher Education Outcomes (USA)
12-month postdoctoral position – available for a qualified and creative individual with interests in postsecondary assessment, learning analytics, data management, and institutional research. The Postdoctoral Fellow will be responsible for promoting the use of institutional data sources and data systems for the purpose of developing institutional assessment tools that can inform decision making and contribute to institutional improvement across New York University (NYU). DEADLINE FOR APPLICATION: Open Until Filled

This Week in Learning Analytics (November 1 – 7, 2014)

The Report to the European Commission on New Modes of Learning and Teaching in Higher Education recommends the full and informed consent of all students who lend their data for educational purposes. (Image Source: Report to the European Commission on New Modes of Learning in Higher Education)

Latest News

7 November 2014
Microsoft and Other Firms Pledge to Protect Student Data
Fourteen companies, including Microsoft, Houghton Mifflin Harcourt, Amplify, and Edmodo, have pledged to adopt nationwide policies that will restrict and protect data collected from K-12 students. The group is pledging not to (1) sell student information, (2) target students with advertisements, or (3) compile personal student profiles unless authorized by parents or schools. The pledge, which is not legally binding, was developed by the Future of Privacy Forum.

6 November 2014
Lecturer Calls for Clarity in Use of Learning Analytics
Sharon Slade (Open University) talks about her university’s effort to develop an ethical policy on the use of student data that attempts to carefully address conflicting student concerns: (1) concerns about institutional ‘snooping’ on the one hand, and (2) an interest in personalized modes of communication on the other. The Ethical Use of Student Data for Learning Analytics Policy produced at the Open University is the first of its kind, and the result of an exemplary effort that should be repeated widely.

6 November 2014
Echo360 Appoints Dr. Bradley S. Fordham as Global Chief Technology Officer
Echo360, an active learning and lecture capture platform, has appointed Dr. Bradley S. Fordham as Global Chief Technology Officer. With a wealth of industry and scholarly experience, Dr. Fordham will add significant expertise, legitimacy, and exposure to the platform. This is the latest in a series of recent investments in developing the platform’s real-time analytics capabilities, which, until recently, have been rather limited and unsophisticated.

6 November 2014
Harvard Researchers Used Secret Cameras to Study Attendance. Was That Unethical?
In the spring of 2013, cameras in 10 Harvard classrooms recorded one image per minute, and the photographs were scanned to determine which seats were filled. The study rankled computer-science professor Harry R. Lewis, who viewed the exercise as an obvious intrusion into student privacy. George Siemens notes that attendance data is the ‘lowest of the low,’ and that the level of surveillance taking place in online courses far exceeds what was collected as part of the attendance-tracking exercise. Since Lewis raised his concerns, Harvard has committed itself to reaching out to every faculty member and student whose image may have been captured to inform them of the research, a not-so-easy effort, as images were captured anonymously and have subsequently been destroyed as part of the research methodology.

5 November 2014
Disadvantaged Students in Georgia District Get Home Internet Service
Fayette County Schools in Georgia have partnered with Kajeet to give Title I students a Kajeet SmartSpot so that they can access online textbooks, apps, email, documents, sites, and their teachers while outside of school. The mobile hotspot works with the Kajeet cloud service and allows districts and schools to restrict access according to site- and time-based rules. The service also monitors student activity and provides teachers and administrators with learning analytics reports.
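
Kajeet’s actual filtering interface is not public in this piece, so the following is only an illustrative sketch of what site- and time-based rules might look like, with invented hosts and an assumed policy window.

```python
# Sketch only: hypothetical district policy, not Kajeet's implementation.
from datetime import time

BLOCKED_SITES = {"games.example.com", "video.example.com"}
STUDY_WINDOW = (time(7, 0), time(21, 0))  # assumed allowed hours

def allow_request(host, now):
    """Permit a request only inside the window and off the blocklist."""
    in_window = STUDY_WINDOW[0] <= now <= STUDY_WINDOW[1]
    return in_window and host not in BLOCKED_SITES

print(allow_request("textbooks.example.org", time(19, 30)))  # True
print(allow_request("games.example.com", time(19, 30)))      # False
```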

1 November 2014
Track Your Child’s Development Easily
In May 2011, Jayashankar Balaraman — a serial entrepreneur with a background in advertising and marketing — moved into the education space with the launch of KNEWCLEUS, which in just three years has grown to become India’s largest parent-school engagement platform. The platform’s success is a result of the ease with which it facilitates parent-teacher communication, and of the analytics engine that monitors student performance, identifies areas in need of remediation, and recommends relevant content.

Latest Blogs

Does Exercise (and Learning) Count If Not Counted? by Joshua Kim
Kim asks the age-old question, “If I exercise and my fitness app does not record my steps, did my exercise ever happen?” He wonders about how the ability to track certain forms of activity, including learning activity, ends up altering behavior and shifting values on the basis of ‘trackability.’ The danger here, cautions Kim, is that we may come to conflate good teaching with digital practices that are more amenable to datafication.

Report on Modernisation of Higher Education: Focus on Open Access and Learning Analytics by Brian Kelly
A brief summary and review of the Report to the European Commission on New Modes of Learning and Teaching in Higher Education, delivered in October 2014 by the High Level Group on the Modernisation of Higher Education. The report makes explicit mention of learning analytics, recommending collaboration over competition and an increase in personalized learning informed by better data. The report’s advocacy of ‘better data’ includes strong ethical considerations, including the full and informed consent of students and the ability to ‘opt out.’

10 Hottest Technologies in Higher Education by Vala Afshar
Afshar summarizes the hottest technologies discussed by CIOs at the 2014 annual EDUCAUSE conference last month. Included in the list are wifi, social media, badges, analytics, wearables, drones, 3D printing, digital courseware, Small Private Online Courses (SPOCs), and virtual reality. Although analytics is included as one of many trends, it is, of course, also a major driver for each of the other technologies.

Schools keep track of students’ online behavior, but do parents even know? by Taylor Armerding
A truly exceptional review of literature and debates surrounding the collection and use of data from K-12 students. What kinds of data are a school’s ‘business’ to collect? How does an institution ensure informed consent, when privacy policies are often so complex as to be inaccessible by many parents? What is a school’s responsibility if it discovers something with implications for student success? Are schools ‘grooming kids for a lifetime of surveillance?’

Should Google Be a Signatory to Student Privacy Pledge? by Tracy Mitrano
Mitrano asks why the K-12 School Service Provider Pledge to Safeguard Student Privacy is not being more strongly considered in higher education, and asks why Google and Amazon have not publicly committed themselves to the pledge alongside Microsoft, Houghton Mifflin Harcourt, Knewton, and others.

Why I’m Voting ‘Yes’ on the Smart Schools Bond Act, Proposition 3 by Leonie Haimson
New York Proposition 3 (also known as the Smart Schools Bond Act) would allow the sale of bonds to generate $2 billion statewide for capital funding. In spite of her resistance to using bond revenue to purchase electronic devices in schools (one of the key ways in which the bond revenues are meant to be spent), Haimson notes the urgent need that many schools have for an injection of funding, and notes that the funds may be spent in a wide variety of ways. She raises a concern about the proliferation of technologies driven by companies interested in educational data mining, but notes that, thanks to the Children’s Online Privacy Protection Act, all parents have the right to opt out of any online data-mining instructional or testing program that collects personal data, whether their children participate in the program at school or at home.

Featured Videos, Presentations & Webinars

Carolyn Rosé: Learning analytics and educational data mining in learning discourses
Talk delivered to the International Society of the Learning Sciences’ Network of Academic Programs in the Learning Sciences (NAPLES). Click HERE for the full webinar recording.

Recent Publications

Embracing Big Data in Complex Educational Systems: The Learning Analytics Imperative and the Policy Challenge
Leah MacFadyen, Shane Dawson, Abelardo Pardo, Dragan Gašević

In the new era of big educational data, learning analytics (LA) offer the possibility of implementing real–time assessment and feedback systems and processes at scale that are focused on improvement of learning, development of self–regulated learning skills, and student success. However, to realize this promise, the necessary shifts in the culture, technological infrastructure, and teaching practices of higher education, from assessment–for–accountability to assessment–for–learning, cannot be achieved through piecemeal implementation of new tools. We propose here that the challenge of successful institutional change for learning analytics implementation is a wicked problem that calls for new adaptive forms of leadership, collaboration, policy development and strategic planning. Higher education institutions are best viewed as complex systems underpinned by policy, and we introduce two policy and planning frameworks developed for complex systems that may offer institutional teams practical guidance in their project of optimizing their educational systems with learning analytics.

Learning Analytics: Challenges and Future Research Directions
Mohamed Amine Chatti, Vlatko Lukarov, Hendrik Thüs, Arham Muslim, Ahmed Mohamed Fahmy Yousef, Usman Wahid, Christoph Greven, Arnab Chakrabarti, Ulrik Schroeder

In recent years, learning analytics (LA) has attracted a great deal of attention in technology-enhanced learning (TEL) research as practitioners, institutions, and researchers are increasingly seeing the potential that LA has to shape the future TEL landscape. Generally, LA deals with the development of methods that harness educational data sets to support the learning process. This paper provides a foundation for future research in LA. It provides a systematic overview on this emerging field and its key concepts through a reference model for LA based on four dimensions, namely data, environments, context (what?), stakeholders (who?), objectives (why?), and methods (how?). It further identifies various challenges and research opportunities in the area of LA in relation to each dimension.

Calls for Papers / Participation

Conferences

2015 Southeast Educational Data Symposium (SEEDS) Emory University (Atlanta, GA) | 20 Feb 2015 (APPLICATION DEADLINE: 14 November 2014)

11th International Conference on Computer Supported Collaborative Learning: “Exploring the material conditions of learning: Opportunities and challenges for CSCL” University of Gothenburg, Sweden | 7 – 11 June 2015 (SUBMISSION DEADLINE: 17 November 2014)

28th annual Florida AI Research Symposium (FLAIRS-28) on Intelligent Learning Technologies Hollywood, Florida, USA (SUBMISSION DEADLINE: 17 November 2014)

EDM 2015: 8th International Conference on Educational Data Mining Madrid, Spain | 26 – 29 June, 2015 (SUBMISSION DEADLINE: 12 January 2015)

Journals / Book Chapters

Universities and Knowledge Society Journal (RUSC): Special Section on Learning Analytics (SUBMISSION DEADLINE: 20 January 2015)

Employment Opportunities

Simon Fraser University (Burnaby, BC, Canada)
Tenure Track Position In Educational Technology And Learning Design – The Faculty of Education, Simon Fraser University (http://www.sfu.ca/education.html) seeks applications for a tenure-track position in Educational Technology and Learning Design at the Assistant Professor rank beginning September 1, 2015, or earlier. The successful candidate will join an existing complement of faculty engaged in Educational Technology and Learning Design, and will contribute to teaching and graduate student supervision in our vibrant Masters program at our Surrey campus and PhD program at our Burnaby campus. DEADLINE FOR APPLICATION: December 1, 2014

NEW! University at Buffalo (Buffalo, NY, USA)
Associate for Institutional Research/Research Scientist: Online Learning Analytics – The University at Buffalo (UB), State University of New York seeks a scholar in online learning analytics to join its newly formed Center for Educational Innovation. Reporting to the Senior Vice-Provost for Academic Affairs, the Center for Educational Innovation has a mission to support and guide the campus on issues related to teaching, learning and assessment, and at the same time serves as a nexus for campus-wide efforts to further elevate the scholarship of and research support for pedagogical advancement and improved learning. The Research Scientist in online learning analytics will work in the area of Online Learning within the department and join a campus-wide network of faculty and researchers working on “big data”. DEADLINE FOR APPLICATION: December 6, 2014

NEW! University of Colorado Boulder (Boulder, Colorado, USA)
Multiple Tenure Track Positions in Computer Science – The openings are targeted at the level of Assistant Professor, although exceptional candidates at higher ranks may be considered. Research areas of particular interest include secure and reliable software systems, numerical optimization and high-performance scientific computing, and network science and machine learning. DEADLINE FOR APPLICATION: Posted Until Filled

University of Technology, Sydney (Sydney, AUS)
Postdoctoral Research Fellow: Academic Writing Analytics – Postdoctoral research position specialising in the use of language technologies to provide learning analytics on the quality of student writing, across diverse levels, genres, and domains. DEADLINE FOR APPLICATION: Posted Until Filled

University of Michigan (Ann Arbor, MI)
Senior Digital Media Specialist – The University of Michigan is seeking a qualified Senior Digital Media Specialist to create digital content in support of online and residential educational experiences for the Office of Digital Education & Innovation (DEI). DEADLINE FOR APPLICATION: Posted Until Filled

NYU Steinhardt School of Culture, Education, and Human Development’s Center for Research on Higher Education Outcomes (USA)
12-month postdoctoral position – available for a qualified and creative individual with interests in postsecondary assessment, learning analytics, data management, and institutional research. The Postdoctoral Fellow will be responsible for promoting the use of institutional data sources and data systems for the purpose of developing institutional assessment tools that can inform decision making and contribute to institutional improvement across New York University (NYU). DEADLINE FOR APPLICATION: Open Until Filled

This Week in Learning Analytics (October 25 – 31, 2014)

Inspired by Halloween and comments made at the Blackboard Institutional Performance Conference last week
Image by Timothy Harfield

Latest News

30 October 2014
Survey Takes Pulse of Civitas Learning Partners’ Work in Analytics and Student Success
Civitas Learning today announced the results of its first “Pulse” survey recently conducted at its Pioneer Summit. More than 70 individuals representing more than 40 higher education institutions and systems participated in the survey. This is the first step in the ongoing effort to benchmark the burgeoning community’s work in predictive analytics focused on student success.

30 October 2014
Learning about Learning Analytics @ #Mozfest
Summary by Adam Lofting of a session he hosted, alongside Andrew Sliwinski, Doug Belshaw, and Simon Knight, on “Learning Analytics for Good in the Age of Big Data.” Lofting reflects upon the learning that took place as a result of this session, through a ‘silly’ meta-exercise.

29 October 2014
Notes from Utrecht Workshop on Ethics and Privacy Issues in the Application of Learning Analytics
Summary of excellent discussion that took place during the Workshop on Ethics & Privacy Issues in the Application of Learning Analytics, an event co-organized by LACE and SURF, and held in Utrecht on 28 October, 2014.

29 October 2014
Statistician explores how faculty can excel in blended learning environments
In a recent lecture sponsored by Emory’s QuanTM, learning analytics expert Chuck Dziuban explained trends in the new learning environment that blends face-to-face and virtual instruction.

Latest Blogs

The Quest for Data that Really Impacts Student Success by Dian Schaffhauser
A really nice review of the field, including three learning analytics tips worth knowing (from Josh Baron):

  1. Collaborate with other institutions
  2. Don’t jump into an analytics product willy-nilly
  3. Take care with ethics and data privacy considerations

OPINION: Personalization, Possibilities and Challenges with Learning Analytics by Arthur VanderVeen & Nick Sheltrown
The authors identify two key challenges facing personalized learning through learning analytics:

  1. the need to expand educators’ understanding of what is possible through analytics-driven personalized learning, and
  2. the need to actively engage with practicing educators on how to design and integrate analytics-driven learning experiences

Learning Analytics
A review of Greller and Drachsler’s article, “Translating Learning into Numbers: A Generic Framework for Learning Analytics.”

Tribal Student Insight: an interview with Chris Ballard by Niall Sclater
Interview with Chris Ballard, Data Scientist for Student Insight, about a tool that allows customers to build models to predict student risk. The product is currently being developed by Tribal with the University of Wolverhampton.

Recent Publications

Supporting competency-assessment through a learning analytics approach using enriched rubrics
Alex Rayón, Mariluz Guenaga, & Asier Núñez

Universities have increasingly emphasized competencies as central elements of students’ development. However, the assessment of these competencies is not an easy task. The data that learners generate in computer-mediated learning offer great potential to study how learning takes place and, thus, to gather evidence for competency assessment using enriched rubrics. The lack of data interoperability and the decentralization of educational applications make it a challenge to exploit trace data. To face these problems we have designed and developed SCALA (Scalable Competence Assessment through a Learning Analytics approach), an analytics system that integrates usage trace data (how the user interacts with resources) and social trace data (how students and teachers interact among themselves) to support competency assessment. The case study of SCALA presents teachers with a dashboard of enriched rubrics over blended datasets obtained from six assessment learning activities, performed with a group of 28 students working on the teamwork competency. In terms of knowledge discovery, we obtain results by applying clustering and association rule mining algorithms. Thus, we provide a visual analytics tool ready to support competency assessment.
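
SCALA itself is not distributed with this summary, but the two knowledge-discovery steps it names can be sketched on toy data: clustering trace data into engagement profiles, then checking the support and confidence of a simple hand-rolled rule.

```python
# Sketch only: synthetic usage/social traces in place of SCALA's data.
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical traces: [resource views, peer replies, document edits]
traces = np.array([[40, 12, 9], [5, 1, 0], [38, 15, 7],
                   [6, 0, 1], [42, 10, 11], [4, 2, 0]])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(traces)
print("engagement clusters:", labels)

# One hand-rolled association rule: "replies >= 10 -> criterion met".
replies_high = traces[:, 1] >= 10
criterion_met = np.array([1, 0, 1, 0, 1, 0], dtype=bool)  # toy rubric flags
support = (replies_high & criterion_met).mean()
confidence = (replies_high & criterion_met).sum() / replies_high.sum()
print(f"support={support:.2f} confidence={confidence:.2f}")
```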

Foundations of Big Data and Analytics in Higher Education
Ben Daniel & Russell Butson

This paper contributes to our theoretical understanding of the role Big Data plays in addressing contemporary challenges faced by institutions of higher education. The paper draws upon emergent literature in Big Data and discusses ways to better utilise the growing data available from various sources within institutions of higher education, to help understand the complexity of influences on student-related outcomes, teaching, and ‘what if’ questions for research experimentation. The paper further presents opportunities and challenges associated with the implementation of Big Data analytics in higher education.

Dealing with complexity: educational data and tools for learning analytics
Ángel Hernández-García & Miguel Ángel Conde

The evolution of information technologies and their widespread use have increased the complexity of the educational landscape, as institutions and instructors try to absorb and incorporate these innovations into learning processes. This in turn poses countless new challenges to educational research in general, and to new disciplines based on educational data analysis, such as learning analytics, in particular. In this paper, we introduce the Track on Learning Analytics within the Technological Ecosystems for Enhancing Multiculturality 2014 Conference, a track that aims to present new approaches to dealing with this complexity and solving some of these challenges.

The paper provides an overview of the motivations behind the proposal of this track, with a general introduction to learning analytics in this complex context and a presentation of the main challenges in current learning analytics research, both from a data analysis perspective and from a tool analysis approach; this introduction is followed by an overview of the submission management and participant selection process. Then, a detailed summary of the manuscripts accepted for participation in the conference is presented.

Learning Analytics and Educational Data Mining in Practice: A Systematic Literature Review of Empirical Evidence

Zacharoula Papamitsiou & Anastasios A. Economides

This paper aims to provide the reader with a comprehensive background for understanding current knowledge on Learning Analytics (LA) and Educational Data Mining (EDM) and its impact on adaptive learning. It constitutes an overview of empirical evidence behind key objectives of the potential adoption of LA/EDM in generic educational strategic planning. We examined the literature on experimental case studies conducted in the domain during the past six years (2008-2013). Search terms identified 209 mature pieces of research work, but inclusion criteria limited the key studies to 40. We analyzed the research questions, methodology, and findings of these published papers and categorized them accordingly. We used non-statistical methods to evaluate and interpret the findings of the collected studies. The results highlight four distinct major directions of LA/EDM empirical research. We discuss the added value that has emerged from LA/EDM research and highlight the significance of further implications. Finally, we offer our thoughts on possibly uncharted key questions to investigate, from both pedagogical and technical perspectives.

Assessment of Robust Learning with Educational Data Mining
Ryan S. Baker & Albert T. Corbett

Many university leaders and faculty have the goal of promoting learning that connects across domains and prepares students with skills for their whole lives. However, as assessment emerges in higher education, many assessments focus on knowledge and skills that are specific to a single domain. Reworking assessment in higher education to focus on more robust learning is an important step towards making assessment match the goals of the context where it is being applied. In particular, assessment should focus on whether learning is robust (Koedinger, Corbett, & Perfetti, 2012): whether learning occurs in a way that transfers, prepares students for future learning, and is retained over time; and also on skills and meta-competencies that generalize across domains. By doing so, we can measure the outcomes that we as educators want to create, and increase the chance that our assessments help us to improve them. In this article, we discuss and compare traditional test-based methods for assessing robust learning and new ways of inferring the robustness of learning while the learning itself is occurring, comparing the methods within the domain of college genetics.

Calls for Papers / Participation

Conferences

Open Learning Analytics Network – Summit Europe Amsterdam | 1 January 2015 (APPLICATION DEADLINE: None, but spaces are limited)

Third International Conference on Data Mining & Knowledge Management Process Dubai, UAE | 23-24 January, 2015 (APPLICATION DEADLINE: 31 October 2014)

Learning at Scale 2015 Vancouver, BC (Canada) | 14 – 15 March 2015 (SUBMISSION DEADLINE: 22 October 2014)

2015 Southeast Educational Data Symposium (SEEDS) Emory University (Atlanta, GA) | 20 Feb 2015 (APPLICATION DEADLINE: 14 November 2014)

11th International Conference on Computer Supported Collaborative Learning: “Exploring the material conditions of learning: Opportunities and challenges for CSCL” University of Gothenburg, Sweden | 7 – 11 June 2015 (SUBMISSION DEADLINE: 17 November 2014)

28th annual Florida AI Research Symposium (FLAIRS-28) on Intelligent Learning Technologies Hollywood, Florida, USA (SUBMISSION DEADLINE: 17 November 2014)

Journals / Book Chapters

Universities and Knowledge Society Journal (RUSC): Special Section on Learning Analytics (SUBMISSION DEADLINE: 20 January 2015)

Employment Opportunities

Simon Fraser University (Burnaby, BC, Canada)
Tenure Track Position In Educational Technology And Learning Design – The Faculty of Education, Simon Fraser University (http://www.sfu.ca/education.html) seeks applications for a tenure-track position in Educational Technology and Learning Design at the Assistant Professor rank beginning September 1, 2015, or earlier. The successful candidate will join an existing complement of faculty engaged in Educational Technology and Learning Design, and will contribute to teaching and graduate student supervision in our vibrant Masters program at our Surrey campus and PhD program at our Burnaby campus. DEADLINE FOR APPLICATION: December 1, 2014

University of Technology, Sydney (Sydney, AUS)
Postdoctoral Research Fellow: Academic Writing Analytics – Postdoctoral research position specialising in the use of language technologies to provide learning analytics on the quality of student writing, across diverse levels, genres, and domains. DEADLINE FOR APPLICATION: Posted Until Filled

University of Michigan (Ann Arbor, MI)
Senior Digital Media Specialist – The University of Michigan is seeking a qualified Senior Digital Media Specialist to create digital content in support of online and residential educational experiences for the Office of Digital Education & Innovation (DEI). DEADLINE FOR APPLICATION: Posted Until Filled

NYU Steinhardt School of Culture, Education, and Human Development’s Center for Research on Higher Education Outcomes (USA)
12-month postdoctoral position – available for a qualified and creative individual with interests in postsecondary assessment, learning analytics, data management, and institutional research. The Postdoctoral Fellow will be responsible for promoting the use of institutional data sources and data systems for the purpose of developing institutional assessment tools that can inform decision making and contribute to institutional improvement across New York University (NYU). DEADLINE FOR APPLICATION: Open Until Filled

Against ‘Dumbing Down’: A Response to Adam Cooper’s On the Question of Validity in Learning Analytics

The following is a response to Adam Cooper’s blog post, On the Question of Validity in Learning Analytics, and to a brief critique that I issued HERE. This is an interesting conversation, and one that I think deserves to be highlighted rather than buried in comments. I am fond of Adam’s writing and reflection on issues in the learning analytics space, and am grateful for the opportunity to enter into a discussion on this matter. As a point of reference, I am quoting Adam’s comment to my original response here:

I think I am paying the price of having written a somewhat compressed piece about a complex topic…

I did not intend to convey the idea that conflation of concerns is desirable; indeed, the first listed point in the conclusion section could be unpacked to include the idea that we (a community with expertise in different disciplines, but also a community that would benefit from drawing more scholars into it) need to better articulate the distinctions within a wider interperation [sic] of “validity” than exists in particular disciplinary areas. This is not a “loosening of conceptual clarity” but a call for widening of conceptual view.

It is precisely the risk of under-appreciating complexity that prompted me to write the article. For all non-expert practitioners may need a set of expert advisors, I believe the reality is that they are unlikely to have access to them, or simply not see the need to consult them. At present, it seems likely that these non-experts will make decisions based on seeing a visually-attractive dashboard, a quoted prediction rate, or a statement like “based on the system used by XXX”. We need to move this narrow/near view forwards, to widen the view, and yes, to raise awareness of the need to consult experts. In the process, we should be aware that specialised vocabularies can be a source of difficulty. The same applies across disciplines and vocational areas; not all teams involved in implementing learning analytics will be as diverse as would be ideal. There is, I think, a need to develop awareness of the many sides of “validity”, even within the community.

So… yes, I’m all for conceptual sophistication, but also for “dumbing-down”. The way forward I see is to develop a more socialised conceptual map as a basis for working out how best to simplify the message.

In what follows, I don’t believe that I am truly disagreeing with Adam in any significant way. If anything, I am piggy-backing off of his original piece, riffing off of some important themes, and looking to clarify some concerns that his piece raised for me.


I take words very seriously. Words define the contours of things, shape our perceptions of reality, and have real effects on our behavior. What concerns me about Adam Cooper’s advocacy of a ‘dumbed-down vocabulary’ (or a kind of two-tiered approach to methodology: rigor for data scientists and a ‘socialised’ conceptual map for the rest) is that a simplified message may serve not simply to facilitate analytical practice, but also to change it.

The primary virtue of analytics is that it permits and encourages an evidence-based approach to decision-making. Whether it delivers on its promise or not, analytics claims to overcome the need for dependence upon anecdotal accounts of human behavior, by allowing us to get at the behavior itself. The only way that analytics can do this is through methods that employ shared and clearly defined conceptions of validity and reliability.

Reliability is ‘easy,’ as it merely refers to the extent to which one’s approach is capable of yielding the same result through repeated application. Validity, on the other hand, is hard. It is hard because it is a measure of the extent to which one’s claims about reality are true. It is not the case that there are many ‘sides’ of validity. To say this is to imply that all conceptions of validity ultimately get at the same thing, as in the parable of the three blind men and the elephant. Instead, conceptions of validity differ according to the nature of the specific objects and domains with respect to which the concept is applied.

In logic, an argument is valid if it is impossible for its premises to be true and its conclusions false. The domain of logical validity (the standard against which validity is judged) is not reality, but rather a set of axioms, known rules that define a system (e.g. the law of the excluded middle and the law of non-contradiction). In statistics, we do not simply talk about validity. The term ‘validity’ is meaningless unless accompanied by more specific information about a particular domain. A domain-specific concept of validity is useless unless it can be operationalized in terms of a concrete methodology.
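To make the closed-system character of logical validity concrete (a textbook illustration of my own, not drawn from Cooper’s post), compare a valid and an invalid argument form:

```latex
% Modus ponens: valid in virtue of form alone.
% It is impossible for both premises to be true and the conclusion false.
\[
P \rightarrow Q,\quad P \;\vdash\; Q \qquad \text{(valid)}
\]
% Affirming the consequent: invalid.
% Counterexample: P false, Q true makes both premises true, conclusion false.
\[
P \rightarrow Q,\quad Q \;\nvdash\; P \qquad \text{(invalid)}
\]
```

Judging either form requires consulting only the rules of the system, never the world; this is exactly what distinguishes logical validity from the empirical validities discussed below.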

In the social sciences, certain kinds of validity are more challenging than others. Certain general conceptions of validity look to reality as their domain, as in the case of construct validity (which evaluates the extent to which a test is capable of measuring a particular phenomenon) and tests of experimental validity (which evaluate causality). In the absence of direct access to reality, these general forms of validity are operationalized in terms of proxies and justified epistemologically through theory. For example, it is impossible for me to say whether there is such a thing as intelligence, and so it is impossible for me to know for sure whether a particular test is capable of grasping it. What I CAN say is the extent to which several measures that should correlate (according to my theory) actually do. The latter (convergent validity) is NOT a measure of construct validity, but rather a proxy for it. Just as logical validity is a measure of the extent to which a conclusion conforms to the known rules of a closed system, so too is something like convergent validity.

Understanding validity and its limits with respect to particular knowledge domains should give us pause at the moment of decision-making. No matter what our methodology, when working with data (a term with an ancient etymology, referring to a set of individual experiences of reality), we are never working with reality itself, even if our activity has real consequences. Between data science and action, there is always judgment, and it is in this space of judgment that we are wholly and infinitely responsible.
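Since convergent validity is doing important work in the point above, a toy sketch may help. The following is purely illustrative (fabricated data and invented variable names, not a real test battery): three noisy measures of a shared latent trait correlate highly, which counts as evidence for convergent validity while the construct itself stays unobserved:

```python
# Toy illustration of convergent validity as a proxy (fabricated data;
# not a real instrument). Tests built to tap the same construct should
# correlate; the correlation never proves the construct exists.
import numpy as np

rng = np.random.default_rng(0)
latent = rng.normal(size=200)                       # the unobservable construct
test_a = latent + rng.normal(scale=0.5, size=200)   # three noisy attempts
test_b = latent + rng.normal(scale=0.5, size=200)   # to measure it
test_c = latent + rng.normal(scale=0.5, size=200)

# High pairwise correlations support convergent validity; this checks
# the theory's expectations, not reality itself.
print(np.corrcoef([test_a, test_b, test_c]).round(2))
```

The theory predicts the correlations; the data can only pass or fail that internal check, which is precisely the sense in which convergent validity is a proxy rather than direct access to the construct.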

To return to more practical matters, my concern with a ‘dumbed down’ definition of validity is that it might lend itself to a failure on the part of practitioners to fully realize their own responsibility for decision-making. When clearly and expertly defined, validity emphasizes (1) the need to strive after accurate articulations of true states of affairs, (2) humility in the face of our epistemological limits, and (3) our responsibility for decision-making. If practitioners are allowed, or even encouraged, to adopt ‘dumbed-down’ conceptions of validity, I fear a loss of humility and a failure to take responsibility for decision-making. My fear comes, in part, as a result of my experience in market research. The function of the market research industry is not to accurately describe or to predict reality. The function of market research is to mitigate the perception of risk and responsibility involved in decision-making. In business (and in education), the vast majority of decisions are still made on the basis of personal opinion and anecdotal evidence. Huge investments in market research are meant to ‘launder’ decision-making processes, giving the appearance that they were based on a more robust kind of evidence, so that responsibility for negative consequences sticks to the data rather than to individuals. In my experience, organizations often make strategic decisions before contracting market research, and it is rare to see market research make a significant impact on an organization’s trajectory.

I do not mean to say that any practitioner is looking to ‘get away’ with anything. I do not mean to imply that data are only used strategically to manipulate an audience into thinking that an idea is better supported than it would be if justified by anecdote alone (although in many cases this is very much the case). Rather, I mean to say that a lack of knowledge about methodology and specialized methodological vocabulary can give practitioners a false sense of confidence in their analytical abilities and results. The ‘dumbing down’ of language to make analytics easier also means making it easier for practitioners to avoid responsibility for either their research (there are tools for this) or their ‘evidence-based’ decisions (“just do what the data say, and you’ll be fine; if things go wrong, you were just following the data”). In light of the importance of words in informing behavior, the fact that practitioners are in the business of making decisions that have real effects should make conceptual rigor more important, not less.

What I appreciate most about Adam’s piece is the effort to take something methodological, like ‘validity,’ and to demonstrate its ethical dimension. I whole-heartedly agree with Adam’s concern about practitioners leaning on pretty graphs and big numbers, and empathize with his point about specialized vocabularies representing an obstacle to meaningful participation by laymen. But language about validity is not jargon. It is rather a tool-kit of methodologically useful distinctions. Here, a ‘dumbing down’ of vocabulary is tantamount to giving a plastic hammer to a non-carpenter and expecting him/her to build a house. What practitioners need are not different tools, but rather education and community. With Adam, I would like to see an interdisciplinary community develop around concepts like validity. Such a community should not only aim to render those concepts more understandable, but also provide enrichment that would benefit even the researchers and scholars who so expertly wield them. From what I have briefly written here, I hope to have illustrated in some small way how a multitude of perspectives yields more complexity, not less. Through complexity, however, comes reflection, and exactly the kind of reflective practice that is required in the human sciences.

This Week in Learning Analytics (October 18 – October 24, 2014)

Chuck Dziuban at Emory University, speaking on the topic of “Teaching and Learning in an Evolving Educational Environment”

Latest News

19 October 2014
Data, Analytics, and Learning MOOC goes live
The long-awaited edX MOOC on Data, Analytics, and Learning went live this week. The #DALMOOC, which is taught by George Siemens, Carolyn Rosé, Dragan Gasevic, and Ryan Baker, provides an introduction to learning analytics, its tools and methods, and various ways in which it might be deployed in educational environments. It is also an experiment in its own right, allowing for multiple learning pathways: either a standard edX xMOOC format, or a social, competency-based, and self-directed cMOOC.

I have yet to engage much in the course but, at first glance, I have one small (or large, depending on how you look at it) criticism: The DALMOOC course agreement is confusing.

Data from participation in this Massive Open Online Course (MOOC) will be used for research purposes in order to gain knowledge for better design of support for student learning in MOOCs. When participants are logged in to this course, the information they enter into the course interface will be logged for analysis. The data will not be shared beyond the researchers who have approval to use this data. Personal identifiers will be replaced by unique identifiers. A possible risk is a breach of confidentiality. Participation is voluntary, and participants may stop participating at any time. There will be no cost to participants for participation in this study, and likewise no financial compensation will be offered. There may be no personal benefit from participation in the study beyond the knowledge received in the area of learning analytics, which is the topic of the course.

On the one hand, the course agreement (note: NOT a research participation agreement) is the first page that the student encounters when clicking the ‘Courseware’ tab (following registration), and it implies that participation in the course is contingent upon one’s agreement to participate in the research project. This implied contingency would seem to contradict the first ‘O’ in MOOC. On the other hand, it states that participation is voluntary and that it may be withdrawn at any time. What is not clear is whether withdrawal from participation means withdrawal from the study or from the course. The way that this agreement is structured strongly implies that course participation requires participation in the study. As a test, I have not clicked the “I have read the above and consent to participation” button and have, to date, not been limited in my ability to participate in the course. I wonder about the ethics of this approach to gaining consent and, at the very least, wish that the language of the DALMOOC Course Agreement were less equivocal. [Read more]

21 October 2014
Study will Teach Algebra with Student-Authored Stories that Draw on Their Own Interests
A new study by Candace Walkington (Southern Methodist University) will test the effectiveness of teaching algebra by embedding algebraic concepts into students’ day-to-day lives. The study uses a mixed methodology, employing qualitative methods and data mining to test the effects of personalized instruction on conceptual comprehension, retention, and attitudes toward math.

This is an approach that is often employed (or rather SHOULD often be employed) in the humanities (nothing like using love and sex to make sense out of Hegel’s master-slave dialectic), and resonates with the educational philosophy of John Dewey, for whom learning is a function of a concept’s importance, which, in turn, is a function of past experience, present necessity, and future aspiration. It is also an approach that might serve to ‘catch’ more humanistically oriented students who do not consider themselves very ‘math’ or ‘science.’ [Read more]

Latest Blogs

Social Learning, Blending xMOOCs & cMOOCs, and Dual Layer MOOCs by Matt Crosslin
A really nice discussion of the design methodology behind #DALMOOC. Specifically, Crosslin addresses three primary questions, of which only two are really interesting (the third involved color selection):

  • Don’t most MOOCs blend elements of xMOOCs and cMOOCs together? The xMOOC/cMOOC distinction is too simple and DALMOOC is not really doing anything different.
  • Isn’t it ironic to have a Google Hangout to discuss an interactive social learning course but not allow questions or interaction?

Learning analytics using business intelligence systems by Niall Sclater
A review of several generic Business Intelligence solutions (including Cognos, Qlikview, and Tableau), which are typically employed to gain operational insight, and of ways in which they might be leveraged to gain insight into the student learning experience as well.

Use of an Early Warning System by Stephen J. Aguilar
Video of a lightning talk version (~5 min) of a talk originally delivered at the 2014 Learning Analytics and Knowledge Conference, on “Perception and Use of an Early Warning System During a Higher Education Transition Program.”

Teaching and Learning in an Evolving Educational Environment by Charles Dziuban
Full video of the inaugural lecture in Emory University’s 2014-2015 Learning Analytics Speaker Series. Dziuban uses a variety of metaphors (including the Anna Karenina principle) to offer a perspective on learning analytics through the lens of the scholarship of teaching and learning, and explains the successful support model that he has implemented with faculty at the University of Central Florida.


On the Question of Validity in Learning Analytics by Adam Cooper
Cooper calls for a rethinking of the term ‘validity’ within the context of learning analytics. Although he covers himself by saying that “This post is a personal view, incomplete and lacking academic rigour,” what he nevertheless seems to call for is a conflation of methodological and ethical concerns, and a loosening of conceptual clarity in the name of facilitating practice by non-experts.

At the end of his post, Cooper asks: “what do you think?” When dealing with technologies that are likely to significantly affect human behavior, conceptual sophistication in both ethical and methodological matters is more, not less, important. In the absence of rigor, we run the risk of under-appreciating complexity and implementing interventions that cause harm. What non-expert practitioners need is not a ‘dumbed-down’ vocabulary (or technology that does the work), but rather a set of expert advisors capable of fully assessing problems and solutions from a wide variety of perspectives in order to arrive at solutions that, even if not perfect, are at least fully informed.

Recent Publications

Learning Analytics as a Metacognitive Tool
Eva Durall & Begoña Gros

The use of learning analytics is entering the field of educational research as a promising way to support learning. However, in many cases data are not transparent to the learner. In this regard, educational institutions should not escape the need to make transparent to learners how their personal data are being tracked and used to build inferences, as well as how that use will affect their learning. In this contribution, we maintain that learning analytics offers students opportunities to reflect on learning and to develop metacognitive skills. Student-centered analytics are highlighted as a useful approach for reframing learning analytics as a tool for supporting self-directed and self-regulated learning. The article also provides insights about the design of learning analytics and examples of experiences that challenge traditional implementations of learning analytics.

Premise of Learning Analytics for Educational Context: Through Concept to Practice
Yasemin Gülbahar & Hale Ilgaz

The idea of using recorded data to evaluate the effectiveness of the teaching-learning process, and of using the outcomes to improve and enhance quality, led to the emergence of the field known as “learning analytics”. Based on the analysis of these data, predictions can be made that inform suggestions and decisions about interventions to improve the quality of the process. Hence, “learning analytics” is a promising and important field of study, with its processes and potential to advance e-learning. In this study, learning analytics is defined in two contexts: business and e-learning environments. As an e-learning environment, the Moodle LMS was chosen and analyzed through the SAS (Statistical Analysis System) Level of Analytics. According to the analysis, some practical ideas were developed. However, learning analytics seems to be based mostly on quantitative data, whereas qualitative insights can also be gained through various approaches, which can strengthen the numerical data by providing detailed facts about a phenomenon. Thus, in addition to focusing on the learner, research studies at the course, program, and institutional levels should include instructors and administrators in order to reveal the best practices of instructional design and fulfill the premise of effective teaching.

Calls for Papers / Participation

Conferences

NEW! Open Learning Analytics Network – Summit Europe Amsterdam | 1 January 2015 (APPLICATION DEADLINE: None, but spaces are limited)

Third International Conference on Data Mining & Knowledge Management Process Dubai, UAE | 23-24 January, 2015 (APPLICATION DEADLINE: 31 October 2014)

Learning at Scale 2015 Vancouver, BC (Canada) | 14 – 15 March 2015 (SUBMISSION DEADLINE: 22 October 2014)

2015 Southeast Educational Data Symposium (SEEDS) Emory University (Atlanta, GA) | 20 Feb 2015 (APPLICATION DEADLINE: 14 November 2014)

11th International Conference on Computer Supported Collaborative Learning: “Exploring the material conditions of learning: Opportunities and challenges for CSCL” University of Gothenburg, Sweden | 7 – 11 June 2015 (SUBMISSION DEADLINE: 17 November 2014)

28th annual Florida AI Research Symposium (FLAIRS-28) on Intelligent Learning Technologies Hollywood, Florida, USA (SUBMISSION DEADLINE: 17 November 2014)

Journals / Book Chapters

NEW! Universities and Knowledge Society Journal (RUSC): Special Section on Learning Analytics (SUBMISSION DEADLINE: 20 January 2015)

Employment Opportunities

Simon Fraser University (Burnaby, BC, Canada)
Tenure Track Position In Educational Technology And Learning Design – The Faculty of Education, Simon Fraser University (http://www.sfu.ca/education.html) seeks applications for a tenure-track position in Educational Technology and Learning Design at the Assistant Professor rank beginning September 1, 2015, or earlier. The successful candidate will join an existing complement of faculty engaged in Educational Technology and Learning Design, and will contribute to teaching and graduate student supervision in our vibrant Masters program at our Surrey campus and PhD program at our Burnaby campus. DEADLINE FOR APPLICATION: December 1, 2014

University of Technology, Sydney (Sydney, AUS)
Research Fellow: Data Scientist – We invite applications from highly motivated data scientists wishing to work in a dynamic team, creating tools to provide insight into diverse datasets within the university and beyond. We welcome applicants from diverse backgrounds, although knowledge of educational theory and practice will be highly advantageous. You are a great communicator, bringing expertise in some combination of statistics, data mining, machine learning and visualisation, and a readiness to stretch yourself to new challenges. We are ready to consider academic experience from Masters level to several years’ Post-Doctoral research, as well as candidates who have pursued non-academic, more business-focused tracks. DEADLINE FOR APPLICATION: Posted Until Filled

University of Michigan (Ann Arbor, MI)
Senior Digital Media Specialist – The University of Michigan is seeking a qualified Senior Digital Media Specialist to create digital content in support of online and residential educational experiences for the Office of Digital Education & Innovation (DEI). DEADLINE FOR APPLICATION: Posted Until Filled

NYU Steinhardt School of Culture, Education, and Human Development’s Center for Research on Higher Education Outcomes (USA)
12-month postdoctoral position – available for a qualified and creative individual with interests in postsecondary assessment, learning analytics, data management, and institutional research. The Postdoctoral Fellow will be responsible for promoting the use of institutional data sources and data systems for the purpose of developing institutional assessment tools that can inform decision making and contribute to institutional improvement across New York University (NYU). DEADLINE FOR APPLICATION: Open Until Filled