Making a Case for Learning Analytics at Emory University

At Emory University, high retention rates and high levels of student performance have opened up opportunities for rethinking student success and for embedding analytics within learning environments in support of teaching, learning, and instructional design. Before throwing data and tools at practitioners, it is important that those practitioners have a strong grasp of how, why, and in what context analytics may be employed to best support desirable outcomes. This brief presentation will provide an overview of Emory’s short learning analytics journey, including early successes and failures, and ongoing efforts to educate stakeholders to make use of educational data in a way that is both meaningful and actionable.

Was the Data, Analytics, and Learning MOOC (aka DALMOOC) a Success?

DALMOOC Distortion
In a recent blog post (2015-01-29 – the site on which the original post was published has since been taken down, but a cached version of the piece is still available HERE), Matt Crosslin raised the question of how one might go about measuring the success of a MOOC. Matt is the developer for the recently completed Data, Analytics, and Learning MOOC (DALMOOC), offered by the University of Texas at Arlington using the edX platform. He concludes his piece with the following:

So, at the end of the day, I will be able to come up with some philosophical jargon to “prove” that DALMOOC was a success to the powers-that-be who ask – all of which will be true. But to be honest, the only thing I really want to do is shrug my shoulders and say “beats me – go look at the participant output itself and see if that looks like success to you.”

Looks like success to me.

There are several issues with Matt’s piece that I would like to address here:

  • Dissonance between course design and course content – it is remarkable to see such resistance to measuring the success of a course whose primary aim was to develop precisely that capacity for measurement
  • Resistance to the concept of ‘success’ in general – Matt implies (and elsewhere insists) that ‘success’ is a “linear instructivist concept” and, as such, should be avoided except as part of post hoc justifications to funders and university leadership
  • An insistence that evidence speaks for itself – success is always a function of the extent to which a thing achieves the aim(s) for which it was created. By shrugging his shoulders and leaving evaluation up to course consumers, Matt implies either that the DALMOOC was produced without any particular aim or set of objectives, or that those goals and objectives have very little to do with the course’s success. In light of the tremendous amount of effort that went into the DALMOOC’s complex course design, it is clear that neither of these is right. The dismissive way in which Matt concludes his post fails to do justice to the thoughtfulness of his DALMOOC course design.

Following a brief overview of the DALMOOC and the introduction of several key terms, I will critically examine each of these issues in an attempt to bring some clarity to the concept of success. As always, I mean this in the spirit of philosophical charity, and as a starting point for a discussion that I feel is worth having.

Is the DALMOOC Instructivist or Connectivist?

In his blog post, Matt states that the DALMOOC experience was successful because “everyone that work (sic) on DALMOOC lived and are still on speaking terms.” A well-designed Massive Open Online Course is a tremendous amount of work. The amount of time and effort involved in delivering the DALMOOC, however, increased geometrically as a consequence of the complexity of its design (it was fairly typical to see Matt put in 60+ hours per week while the course was live). The DALMOOC was implemented as both a non-linear, learner-centered connectivist MOOC (cMOOC) — the ‘Social Learning Path’ — and a linear, teacher-centered instructivist MOOC (xMOOC) — the ‘Guided Learner Path’. The reasons for this complex ‘dual-layer’ design are not particularly well-articulated, but, in light of Matt’s personal commitment to the connectivist perspective (and the fact that the course staff included George Siemens — one of the two ‘founding fathers’ of this perspective), they are presumably aligned with the values of connectivist instructional design in general. As George Siemens explains in Connectivism: A Learning Theory for the Digital Age (2005), the basic principles of connectivism are as follows:

  • Learning and knowledge rests in diversity of opinions.
  • Learning is a process of connecting specialized nodes or information sources.
  • Learning may reside in non-human appliances.
  • Capacity to know more is more critical than what is currently known.
  • Nurturing and maintaining connections is needed to facilitate continual learning.
  • Ability to see connections between fields, ideas, and concepts is a core skill.
  • Currency (accurate, up-to-date knowledge) is the intent of all connectivist learning activities.
  • Decision-making is itself a learning process. Choosing what to learn and the meaning of incoming information is seen through the lens of a shifting reality. While there is a right answer now, it may be wrong tomorrow due to alterations in the information climate affecting the decision.

Guided by these basic principles, George Siemens explains that the DALMOOC was designed in order to solve the following problems with MOOCs:

      1. Students often flounder in MOOCs as there is limited social contact between learners and limited timely support.
      2. Learners have limited engagement in developing knowledge together. Many MOOCs reflect a structured and linear process of content presentation. There is little alignment with the architecture of knowledge in a participative age.
      3. Learners have a difficult time getting to know each other or finding like others, as major platforms do not focus on developing learner profiles.
      4. The connection between learning and application is hampered as MOOC resources do not always persist after a course has ended and there is limited search functionality in MOOCs.
      5. Courses are not adaptive and serve the same content to all learners, regardless of prior knowledge.

In order to address these issues, the course design was meant to (1) facilitate timely access to resources through a tool called Quick Helper, (2) foster social embeddedness through a distributed information structure (learners were encouraged to produce their own learning spaces and exercise agency by connecting fragmented ‘information pieces’), (3) encourage persistence through the use of ProSolo as a platform for engaging course content and community following course completion, and (4) promote adaptability by providing learners with an assignment bank that allowed them to challenge themselves by selecting assignments with various levels of complexity.

On the surface, the DALMOOC is structured in such a way as to allow learners the option of navigating the course either as a connectivist cMOOC or as an instructivist xMOOC. In providing learners with the option, however, and actively encouraging them to bounce between paths in a way that best meets their needs and interests as individual learners, the dual-path format aims to nurture connection-making (between content, modalities, and people) and to encourage the learners to take responsibility for their learning as a result of embedding decision-making as an integral part of the MOOC’s design. These are both central principles of connectivism. This is not to say that the DALMOOC is a cMOOC (nor is it, of course, to call it an xMOOC). The DALMOOC was truly dual-layer in its design, in a way that permitted either an xMOOC or cMOOC experience. As is clear from the comments of both Matt and George, however, the motivations and values that underscore the way in which the dual-layer MOOC was implemented are connectivist through and through.

Evaluating DALMOOC’s Success

1. Dissonance between Course Design and Subject Matter

Here I would merely like to point out the irony of a class on learning analytics being designed by one who is skeptical of the possibility of measuring student success, and indeed who is ideologically resistant to such endeavors. In his blog post, Matt mentions the term ‘measurement’ only twice: once in the title and once in his introduction, to describe several ways in which MOOC success is typically demonstrated. In a follow-up tweet, however, he asserts that the idea of “measurable success” is behaviorist and therefore suspect.

He also equates the concept of success with instructivism, with the implication that it is incompatible with a connectivist view of education.

I will address the concept of success in the next section. For now, it is enough to simply note the tension between these views and the broadly accepted definition of learning analytics as “the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs.” ‘Measurement’ is clearly central to learning analytics. So, too, is ‘success,’ since the notion of optimization implies a standard of performance against which results are assessed. It is quite possible that learning analytics is opposed to connectivist values — i.e. that there is no such thing as a connectivist learning analytics. It is for this reason that I am not arguing that Matt is guilty of contradiction. There is, however, an obvious tension between his flavor of connectivism and the content of the course his connectivist course design is intended to support.

2. Resistance to Success

Matt is strongly opposed to efforts aimed at demonstrating the DALMOOC’s success (or of demonstrating the success of MOOCs in general). He reduces such efforts to “philosophical jargon,” the function of which is merely to provide post hoc evidence that would assure sponsors that their money was well spent. He equates the concept of ‘success’ with instructivism, and ‘measurable success’ with behaviorism. Is this right?

I would like to propose an (Aristotelian) definition of success: success is the extent to which the activity of a thing accords with the function for which that thing was produced. Given this definition, does the concept of success apply to connectivist teaching practices? Certainly! A connectivist teaching practice is successful to the extent that it accords with connectivist values. Does it promote diversity of opinion, foster the connecting of specialized information sources, offer a space for the non-human, cultivate the capacity to know, nurture continual learning, promote interdisciplinarity, maintain a view to the present, and encourage deliberation? If so, then SUCCESS!

Looking at the DALMOOC specifically, it is quite clear that the project set out to achieve a specific set of goals: to facilitate timely access to resources, foster social embeddedness, encourage persistence, and promote adaptability. The ‘right’ answer in response to the question “was the DALMOOC successful?” is NOT to shrug one’s shoulders and say “beats me.” Rather, the right answer would be something to the effect that “we designed the course with four objectives in mind, which we sought to achieve through the effective use of three different tools (Quick Helper, ProSolo, and an assignment bank), as well as a course design meant to foster community through a distributed information structure. Looking at student behavior (from course activity logs, etc.) and opinions (as expressed in discussion boards, blog posts, on social media, etc.), we found that some strategies demonstrably served to mitigate many problems that traditionally plague MOOCs. Other strategies had little effect, and still others were found to exacerbate problems while creating new ones. Overall, then, DALMOOC was successful in demonstrating the value of several of our approaches, while at the same time providing us with valuable feedback which we plan to incorporate in subsequent iterations of the course” … or something like that.

Success is not an idea to be resisted. Rather, the idea of success merely affirms the fact that our activities are intentioned. Can success be measured? Again, it is important to be clear about what we mean by ‘measurement.’ If by measurement we mean ‘quantified,’ then I wholeheartedly agree with Matt that there are some forms of success that cannot be measured. Looking again at the connectivist values listed above, values like ingenuity and personal responsibility are goals of teaching practice, and a connectivist would agree that a student who has embraced those values in thought and in action is a successful student (and will also be more likely to survive and thrive in the 21st century), but the achievement of each of these goals is difficult to measure. In fact, the attempt to measure a student’s ability in these areas may actually be antithetical to student achievement. If, on the other hand, we view measurement more broadly to refer simply to the comparison of something to a particular standard, then there is absolutely no reason why success cannot be measured. Indeed, judging the success of an activity is always a result of measurement in this sense, since it is only by comparing the performance of an activity against a particular standard of excellence that a performance may be judged at all.
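To make this broader sense of measurement concrete, consider the following minimal sketch (in Python). The objective names, observed values, and targets are entirely hypothetical and are not drawn from any actual DALMOOC evaluation; the point is only that ‘measurement’ here means comparison against a standard:

    # A minimal sketch of measurement as comparison against a standard.
    # Objective names, observed values, and targets are hypothetical
    # illustrations, not actual DALMOOC evaluation criteria.
    objectives = {
        "timely access to resources": {"observed": 0.82, "target": 0.75},
        "social embeddedness": {"observed": 0.41, "target": 0.60},
        "persistence after course end": {"observed": 0.18, "target": 0.25},
        "adaptability of assignments": {"observed": 0.67, "target": 0.50},
    }

    for name, m in objectives.items():
        verdict = "met" if m["observed"] >= m["target"] else "not met"
        print(f"{name}: {m['observed']:.2f} vs target {m['target']:.2f} -> {verdict}")

Nothing in this sketch requires that the standard itself be quantitative; a rubric-based judgment compared against a stated expectation would count as measurement in exactly the same sense.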

3. DALMOOC as a Dinglehopper

I would like to conclude this post by commenting on the idea that success is in the eye of the beholder. Matt concludes his blog post by asking the reader to “go look at the participant output itself and see if that looks like success to you.” In this remark, it seems to me that Matt has confused two perspectives: (1) the producer perspective, and (2) the consumer perspective. From the perspective of the producer, the success of the DALMOOC is a function of the extent to which its instructional design was seen to facilitate the achievement of clearly stated course objectives. From the perspective of the consumer, however, the goals of the course design are largely irrelevant. The consumer encounters the course as an artifact, and brings to it a set of personal values, goals, and expectations. What this means is that it is entirely possible for a course to be successful according to its own design objectives, while at the same time producing an experience that is perceived as unsuccessful by some of its students. It is likewise possible for a course to be successful by its own standard, but be perceived as unsuccessful when assessed by others who do not share the set of values that governed the course’s design (i.e. those for whom completion rates, enrollment numbers, and earned certificates are really all that is important).

The success of a course is something that can only be judged relative to the goals and objectives that motivated its production. I think here of the dinglehopper. For those who produced the dinglehopper, its function is that of a fork…because it is a fork. Its design is successful to the extent that it aids in the transportation of food from plate to mouth. For Scuttle the seagull, the dinglehopper is successful because it helps him to style his feathers. We see success in both cases. The latter is a successful experience, but only in the former do we see a successful object.

To conclude, then, questions about the success of the DALMOOC are questions about the values and objectives of the course designers, and the extent to which the implementation of the course design saw the expected results. To “shrug my shoulders and say ‘beats me'” is a terrible response, because it implies either a lack of awareness about, or reflection on, the DALMOOC objectives, or else a position that would put so little stock in those objectives that their achievement is not worth evaluating. From other accounts that have been written, and from my limited interaction with Matt, it is abundantly clear that neither is correct. Matt Crosslin and the other members of the DALMOOC design team have clearly done a great deal of work designing and implementing the course, and with a constant view to the course’s success. At the end of the day, the main issue with Matt’s post is that it fails to do justice to the work that he and the rest of the team have done. It is unfortunately flippant, and lamentably dismissive of many things both related and unrelated to the DALMOOC and its evaluation.

My goal in writing this piece has been to clarify several concepts while also coming to the defense of a project that is laudable in its goals, ambitious in its implementation, and (I suspect) successful in ways that will positively inform the design and effectiveness of future MOOCs. In the course of writing this piece, I have also had the great opportunity to learn more about the DALMOOC, and even more about connectivism (a term that I had heard, but about which I knew nothing). I am grateful to Matt for his blog post — which functioned as an artifact and as an opportunity to explore these subjects — and also for his willingness to engage me in a conversation motivated by the spirit of philosophical charity and for the sake of the community as a whole.

As always, comments are welcome.

This Week in Learning Analytics (January 17 – 24, 2015)

Let’s call this a ‘lite’ edition of This Week in Learning Analytics. I’m composing this from the passenger seat of a truck hauling three mustangs (horses, not cars) back from the Mustang Magic Trainer’s Challenge in Fort Worth, TX, where my wife earned a 4th place finish and the distinction of ‘fan favorite.’

Elisa Wallace and Hwin Take Home Fourth Place, Fan Favorite Award at Mustang Magic

In my dual capacity as both supportive husband and marketing director, I spent the weekend furiously taking pictures, producing video, managing social media, and working with sponsors, and didn’t have the time or ability to produce a standard post. I did, however, want to make sure that I put something up in service to the learning analytics community, and so — armed with my mobile phone and in the company of my wife, her groom, three horses, three dogs, and a hamster — I have produced a set of links that I think effectively highlight the major happenings from the field over the past week.


Headlines

January 19, 2015
Parents Step Up Fight Against Data Mining in Australian Schools

January 20, 2015
Google Changes Course, Signs Student Data Privacy Pledge

Strategic Tech List for 2015: Mobile and Data Analytics Dominate

Industry Updates

January 20, 2015
WeLearnedIt iPad App for Project-based Learning and Digital Portfolios Now Available to All Future Ready Schools and Districts

RapidMiner Opens Modern Analytics Platform to Academia Worldwide

Opinions

January 12, 2015
Developing Strategies for Implementing Analytics

January 19, 2015
The Art of Linking Social Media with Learning

Bring Your Own Device: The Next Big Trend in Education

Guidelines for using Student Data

January 20, 2015
What Standardized Testing and Schools could Learn from Target

I, for one, welcome our new analytics overlords

January 22, 2015
The Uneven Legacy of No Child Left Behind

A Straight Talking Guide to Trends and Vendors at Learning Technologies

The Missing Analytics from iTunes U Courses

January 23, 2015
Predictions for the Future of Student Data Privacy

Presentations

Creating an Action Plan for Learning Analytics

Events

Winter 2015 SLAM Series

Tools for Evidence-Based Action (TEA) Meeting – Day Two

The second day of the Tools for Evidence-Based Action (TEA) Meeting opened with a continuation of yesterday’s discussion about features and feature requests for the two tools introduced on day one: the Ribbon Flow Tool and the General Observation and Reflection Protocol (GORP) Tool.

Feature requests for the Ribbon Flow Tool included adding more stops, a CSV-to-JSON converter, and the ability to customize color codes and tooltips. Several security concerns were raised in discussion yesterday, which developer Matt Steinwachs resolved by the morning. Nevertheless, the general consensus was that comprehensive documentation will be required in order to gain the institutional approvals needed before hosting any kind of student data, aggregate or otherwise, on an external server.
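For readers unfamiliar with the kind of converter being requested, the transformation itself is simple. Here is a minimal, generic Python sketch of a CSV-to-JSON conversion; it is offered only as an illustration of the idea and has nothing to do with the Ribbon Flow Tool’s actual implementation:

    import csv
    import json

    def csv_to_json(csv_path, json_path):
        """Convert a CSV file with a header row into a JSON array of objects."""
        with open(csv_path, newline="", encoding="utf-8") as f:
            rows = list(csv.DictReader(f))  # each row becomes a dict keyed by column header
        with open(json_path, "w", encoding="utf-8") as f:
            json.dump(rows, f, indent=2)

    # Hypothetical usage: csv_to_json("observations.csv", "observations.json")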


As far as the GORP tool is concerned, Matt Steinwachs presented a kind of development roadmap that would see the app implemented in a way that is fully customizable, allowing users to record observations according to any observation protocol or variation. In the future, users will be able to fully customize the buttons presented (including text, images, colors, and position). Also customizable will be the time frame (i.e. interval or real-time logging). This is clearly going to be a powerful and flexible tool for recording classroom observations, but what really excites me is its potential as an active learning tool. I imagine, for example, customizing the app for use by students so that they can record affective and/or learning states during the course of a class period. The simplest example of this would be to strip down the app to include only a single ‘bored’ button and ask students to push it when they lose interest during a class period. Data collected from a course (especially from a large course) could produce a heatmap on a lecture recording, in order to associate classroom activity patterns with lags in student engagement.
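To illustrate the heatmap idea, here is a minimal sketch (in Python) that bins hypothetical ‘bored’ button presses into one-minute intervals. The data and format are invented for the purpose of illustration and do not reflect how GORP actually stores observations:

    from collections import Counter

    # Hypothetical data: seconds elapsed since the start of class at which
    # students pressed a single 'bored' button.
    presses = [312, 318, 960, 975, 981, 990, 2405, 2410, 2466]

    BIN_SECONDS = 60  # one-minute bins, aligned to the lecture recording

    counts = Counter(p // BIN_SECONDS for p in presses)
    for minute in sorted(counts):
        print(f"minute {minute:3d}: {'#' * counts[minute]} ({counts[minute]} presses)")

Aligned with the recording’s timeline, the bins with the highest counts would mark the moments in a lecture where engagement lagged.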

In the afternoon, Mary Huber and Pat Hutchings, evaluators for the Bay View Alliance, talked about the organization’s latest Research Action Cluster (RAC): Using Academic Analytics to Support and Catalyze Transformation. Led by Marco Molinaro and Chris Pagliarulo (UC Davis), the project will produce case studies of three institutions at different stages in their learning analytics program implementations. The emphasis of these case studies will be on data-driven intervention effectiveness. UC Davis has been selected as the first institution to be evaluated. The others (which are likely to include one other school from the Bay View Alliance, and one from outside) will be selected in the coming months.

The day concluded with discussions of how the TEA community might be sustained and grown, and how participants might bring insights back to their home institutions and promote tool adoption. The Helmsley Charitable Trust has committed to funding another TEA meeting in a year. I look forward to continuing to inform the development of the UC Davis learning analytics toolkit, and to continuing my active involvement in this initiative. Big thanks are due to Marco Molinaro, Chris Pagliarulo, Alan Gamage, and Margie Barr for their hard work in organizing this event and catalyzing this community of interest in approaches to evidence-based action.

This Week in Learning Analytics (December 26, 2014 – January 2, 2015)

Spent the first day of 2015 with the dogs and a helmet cam.
http://youtu.be/mrvNS6m_N58

HEADLINES

Student Success


Colleges Reinvent Classes to Keep More Students in Science
Richard Pérez-Peña | New York Times | 26 Dec 2014

In the Middle East and North Africa, a Very Gradual Shift Away From Caring Only About Exam Results
The World Bank | 29 Dec 2014

Tennessee’s Largest Public College System Wants to Eliminate Undecided Majors
Emily Siner | Nashville Public Radio | 29 Dec 2014

Government


Bipartisan Bill for Evidence Could Improve Access to Higher Ed Data
Clare McCann & Amy Laitinen | Ed Central | 2 Jan 2015

Bids Submitted for New Data Systems
Paul Shott | Greenwich Time | 26 Dec 2014

RESEARCH

Articles


Big Data in Higher Education: Opportunities and Challenges
Ben Daniel | British Journal of Educational Technology | 45(6)

Let’s Not Forget: Learning Analytics are About Learning [OPEN ACCESS]
Dragan Gašević, Shane Dawson, George Siemens | Tech Trends | 59(1)

Big Data in Educational Data Mining and Learning Analytics
B. R. Prakash, M. Hanumanthappa, Vasantha Kavitha | International Journal of Innovative Research in Computer and Communication Engineering | 2(12)

Book Chapters


Mapping Problems to Skills Combining Expert Opinion and Student Data
Juraj Nižnan, Radek Pelánek, Jiří Řihák | Mathematical and Engineering Methods in Computer Science | pp. 113-124

Conference Papers


Linked Data, Data Mining and External Open Data for Better Prediction of At-Risk Students
Farhana Sarker, Thanassis Tiropanis, Hugh C. Davis | 2014 International Conference on Control, Decision and Information Technologies (CoDIT) | 3-5 Nov 2014

OPINIONS

Blogs


Are Libraries Late to the Preventative Analytics Party?
Meredith Farkas | American Libraries | 26 Dec 2014

A Fresh Crop of California Data Privacy Laws
Lei Shen, Julian M. Dibbell | Mayer-Brown | 23 Dec 2014

Editorials


Teacher Hopefuls Go Through Big Data Wringer
Stephanie Simon | Politico | 29 Dec 2014

Colleges’ New Challenge: Keeping Students in School
Kelley Holland | NBC News | 31 Dec 2014

This Week in Learning Analytics (December 13 – 19, 2014)


How is it that small children, who are so cute, can produce art that is so incredibly creepy?

HAPPY CHRISTMAS EVERYONE!

News

Higher Ed


December 12, 2014
USC, IBM, and Fluor Corp to form Center for Applied Innovation in support of Personalized Learning
The University of South Carolina, IBM, and Fluor Corp. are forming the Center for Applied Innovation to leverage predictive analytic techniques in the development of personalized learning curricula. The center will make use of technology initially developed as a result of IBM’s work with Gwinnett County Public Schools in Georgia.

December 17, 2014
University of Iowa Joins Unizin Consortium
After eliciting broad campus input, the University of Iowa has joined the Unizin Consortium of schools and will participate in a small pilot of Canvas during the summer and fall of 2015.

K-12


December 11, 2014
Former School Data Czar Receives Jail Time
Former Columbus City Schools data czar Stephen B. Tankovich has received 15 days of jail time after pleading no contest to attempted tampering with school records. Tankovich denies directing anyone to change student data (specifically attendance records), but rather claims that any changes were the result of acceptable data cleaning practices.

Privacy & Ethics


December 11, 2014
Oregon Attorney General Pushes for Online ‘Bill of Rights’
Oregon Attorney General Ellen Rosenblum is urging state lawmakers to adopt a privacy bill on the model of the one signed by California Governor Jerry Brown in September, requiring schools’ contracts with technology vendors to prohibit the collection and dissemination of student information.

Research


December 10, 2014
Completing the Loop: Returning Meaningful Learning Analytic Data to Teachers
The Completing the Loop project was initiated in 2014 with support from the Australian Government, the University of Melbourne, Macquarie University, and the University of South Australia. The stated aim of the project is to “develop a better understanding of learning analytics and the ways in which analytics can be interpreted, applied, and actioned by teachers to improve teaching and learning practices.” Phase one of the project has now been completed, with results presented at the 2014 Australian Learning Analytics Summer Institute (November 2014). Preliminary findings from interview data include:

  • Participants had fairly basic requests concerning their needs and ideas of how learning analytics can be used and retrieved from their courses.
  • Such requests mainly focused on analytics around student engagement, specifically frequency of access to resources.
  • Due to the blended nature of their teaching, few participants made use of interactive online activities such as quizzes and discussions, limiting the availability of data.
  • Only a minority of participants currently monitor their students’ activities using learning analytics.

December 16, 2014
Results of Infamous HILT Attendance Study Released
The Harvard Initiative for Learning and Teaching (HILT) attendance study, which famously threw the university into hot water as a result of privacy concerns, found that courses requiring attendance saw higher attendance rates, and that attendance declined over the course of the semester. Was the study worth it?

Industry


December 11, 2014
2014 Educational and Training Content Trends
A recent survey fielded by Data Conversion Laboratory Inc. found that 47% of respondents identified a lack of analytics for measuring the effectiveness of training as one of the greatest challenges in developing and delivering training content. By far the most common way of measuring training effectiveness continues to be the use of surveys (59%), with analytics being used by only a quarter of respondents.

December 10, 2014
The Advisory Board Company buys Royall & Company for $850 Million
Since 2012, the Education Advisory Board (EAB) has made significant and high-profile moves into the educational analytics space. With the acquisition of Royall, EAB sees a significant increase in its market share in the data-driven student engagement and enrollment solutions space, while also building out its talent and expertise in support of future projects and initiatives.

Opinions

Blogs


A Student App for Learning Analytics by Niall Sclater
Sclater discusses a move on the part of the Joint Information Systems Committee (JISC) to commission a range of learning analytics services in support of higher education in the UK. One of the services commissioned as part of this ambitious initiative is an app for students. The requirements-gathering process will determine the kinds of features that will have the most value for students, but Sclater here anticipates likely results, important considerations, and possible issues.

Data Science: The New Skillset for Learning Technologists by Mark Aberdour
Aberdour observes the wealth of educational data that is increasingly at our fingertips, and emphasizes the need for increased data literacy…something that is easier to achieve thanks to a growing body of literature and the increased availability of analytical tools. That being said, data science is not something that can just be ‘picked up.’ It requires effort, expertise, and support from local communities of experts. At the end of the day, however, data are only as good as the questions that are asked, and it is in this aspect of data science that learning technologists are best equipped to start contributing, right now.

Big Data: Is Small Beautiful? by Terry Freedman
Freedman presents a cynical view of big data in education and advocates, instead, an emphasis on usefulness and relevance over succumbing to big data hype.

Averages Don’t Matter…and Other Common Mistakes in Data Analysis by Nick Sheltrown
Sheltrown offers five important lessons that function as a valuable introduction to methodological issues when working with educational data: “Generally, it is as much work to craft a poor analysis as it is a good one. Adhere to these lessons and you will save countless headaches in drawing value from your data.” Included in his list are (1) the importance of sanity checking, (2) suspicion of averages, (3) caution against using models you don’t understand, (4) acknowledging that actionable analysis requires comparisons, and (5) the fact that working with data is hard to do.
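Sheltrown’s second lesson, suspicion of averages, is easy to demonstrate. In the following sketch (in Python, with invented scores), two course sections share the same mean grade while telling very different stories:

    from statistics import mean, median, stdev

    # Hypothetical final scores for two course sections with the same average.
    section_a = [70, 72, 74, 76, 78]        # tightly clustered around 74
    section_b = [40, 44, 64, 96, 100, 100]  # widely spread; same mean of 74

    for name, scores in [("A", section_a), ("B", section_b)]:
        print(f"Section {name}: mean={mean(scores):.1f}, "
              f"median={median(scores):.1f}, stdev={stdev(scores):.1f}")

An intervention tuned to the ‘average’ student in Section B would miss nearly everyone actually in it.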

New Report on Emotional Presence in Online Education by Terry Anderson
Anderson summarizes and recommends a new report recently published by the Learning Analytics Community Exchange (LACE), which reviews the literature on emotions and learning, and extends the Community of Inquiry Model to include Emotional Presence. Anderson wonders why the researchers did not include more of an emphasis on teacher emotion, but notes that the piece represents a great starting point for future research.

Editorials


Murky Federal Privacy Law Puts MOOC Student Data in Questionable Territory by D. Frank Smith
Chief Privacy Officer for the U.S. Department of Education, Kathleen Styles, has said that MOOC data is seldom protected by FERPA. The reason for this is that FERPA only applies to schools receiving funding under an applicable program of the U.S. Department of Education, and MOOCs are rarely funded with Title IV dollars.

Walking the Student Data Tightrope by Dian Schaffhauser
An interview with attorney Bret Cohen on student privacy issues that should be considered by school districts. The interview highlights the importance of being attuned not simply to legal issues, but to political ones as well. When it comes to collecting student data, no notice to parents is technically required, since such data collection is in the service of improving education and student success. However, the success of data collection and analytics initiatives is contingent upon mitigating potential blowback from parents with an interest in protecting the safety and future prospects of their children. Full transparency and explicit consent may not be legally required, but they are nonetheless advisable, if not practically necessary.

How the US Government’s Tiny Statistical Error is Distorting the True Cost of College by Zach Wener-Fligner
The US Government places students into income brackets differently than many American colleges, with the result that students in the lowest income bracket appear to be paying more for their education than they actually are.

Publications

Articles


Assessing the Suitability of Student Interactions from Moodle Data Logs as Predictors of Cross-Curricular Competencies
Santiago Iglesias-Pradas, Carmen Ruiz-de-Azcárate, & Ángel F. Agudo-Peregrina

In the past decades, online learning has transformed the educational landscape with the emergence of new ways to learn. This fact, together with recent changes in educational policy in Europe aiming to facilitate the incorporation of graduate students to the labor market, has provoked a shift on the delivery of instruction and on the role played by teachers and students, stressing the need for development of both basic and cross-curricular competencies. In parallel, the last years have witnessed the emergence of new educational disciplines that can take advantage of the information retrieved by technology-based online education in order to improve instruction, such as learning analytics.

This study explores the applicability of learning analytics for prediction of development of two cross-curricular competencies – teamwork and commitment – based on the analysis of Moodle interaction data logs in a Master’s Degree program at Universidad a Distancia de Madrid (UDIMA) where the students were education professionals. The results from the study question the suitability of a general interaction-based approach and show no relation between online activity indicators and teamwork and commitment acquisition. The discussion of results includes multiple recommendations for further research on this topic.

Reports


Measuring and Understanding Learner Emotions: Evidence and Prospects
Bart Rienties and Bethany Alden Rivers for Learning Analytics Community Exchange (LACE)

Emotions play a critical role in the learning and teaching process because they impact on learners’ motivation, self-regulation and academic achievement. In this literature review of over 100 studies, we identify many different emotions that may have a positive, negative or neutral impact on learners’ attitudes, behaviour and cognition. We explore seven data gathering approaches to measure and understand emotions. With increased affordances of technologies to continuously measure emotions (e.g., facial and voice expressions with tablets and smart phones), in the near future it might become feasible to monitor learners’ emotions on a real-time basis.

Videos, Presentations, and Webinars

Learning Analytics: An Essential Tool for Learning in the Future
Simon Buckingham Shum

These Weeks in Learning Analytics (November 29 – December 12, 2014)

Apologies for not posting an update last week. I was in Canada visiting family, and so took a week off (fortunately, as we move into the December holidays, the news cycle slows considerably). The outpouring of messages I have received from followers who noted the absence of a post, however, has served to underline the value that these news roundups have for the learning analytics community. Thank you to everyone for your support! If you have any news worth sharing, or any suggestions for how this regular series of posts might be improved, please don’t hesitate to email me at mail@timothyharfield.com
Me and my new nephew Kingston, while visiting family in Canada last week | Photo by Karen Harfield, my sister and Kingston’s Mom

News

December 1, 2014
School Districts Pressure Publishers to Adopt Interoperability Standards
Several large US school districts are applying pressure to publishers to adopt interoperability standards, like those developed by the IMS Global Learning Consortium. The adoption of such standards would increase competition in the K-12 educational publishing space by preventing ‘lock-in,’ and increase freedom on the part of teachers to customize course content. Such standards would also lend themselves powerfully to the development of more versatile learning analytics tools.

December 4, 2014
White House Urges Colleges, Ed Tech Companies To Help Graduate More Students
The White House, during its second “College Opportunity Day of Action,” announces 600 new actions related to college preparation and completion. Important among its commitments is a push toward increasing investment and capability in the areas of predictive analytics and adaptive learning.

Student Privacy and Ethics

November 30, 2014
Whistle Blown on Womb to Workforce Data-Mining Scheme
Two groups, Pennsylvania Against Common Core and Pennsylvanians Restoring Education, are asking Gov. Tom Corbett to place a moratorium on data collection in the Pennsylvania Information Management System or PIMS. The system gathers information on students in all 500 school districts across the state and some schools have started collecting behavioral data that goes beyond testing for academic knowledge, according to the two organizations.

December 1, 2014
New South Wales Schools to Share Information on Expelled Students
New rules struck by NSW Education Minister Adrian Piccoli mean that public, private, and Catholic schools will share information about the background and past behavior of transfer students. The decision was justified by a sense of moral responsibility to students and stewardship in the use of taxpayer money. The new rules do not apply to independent schools.

Insights

December 1, 2014
3 Lessons From Data on Children’s Reading Habits: Data from Accelerated Reader, a program used in schools, highlights trends in children’s reading habits.
Report on the results of mining data collected by Renaissance Learning on the reading activity of students outside of the classroom. Three major findings include: (1) girls read more than boys, (2) 15 minutes of reading a day is a ‘sweet spot’ in terms of promoting optimum learning gains, and (3) students benefit from taking on the challenge of books above their reading level.

Opinions

Blogs

Learning Analytics and Educational Research — What’s New? by Simon Buckingham Shum
A wonderful exercise in clarifying the distinctions between (1) educational and learning sciences research, (2) learning analytics research, and (3) learning analytics systems. One of the claims here is that learning analytics only begins to take place with the automation of coding and other research processes. When learning analytics is viewed in the tradition inaugurated by the Decision Support System (as, indeed, I view it), I am inclined to agree that automation serves as a distinguishing feature.

ascilite 2014: ‘Rhetoric and Reality’ – critical perspectives on educational technology by Richard Walker
This summary of the ascilite 2014 conference opens with a summary of remarks made by Shane Dawson, Cathy Gunn, and Linda Corrin on the topic of learning analytics. Dawson delivered an event keynote, during which he warned of the danger associated with use of vendor-provided solutions without “some form of application to pedagogic interventions that change academic practice and enhance the student learning experience.” Gunn cautioned against the temptation to view data as providing a complete picture of students and their learning. Corrin mentioned that the use of student-facing dashboards raised ethical concerns, as well as concerns about mitigating ‘gaming’ behaviors, and observed that pilot data provided little evidence in support of the effectiveness of such tools in changing student behavior.

Editorials

Online Education Run Amok? Private Companies Want to Scoop Up Your Child’s Data by Caitlin Emma
A compelling piece, reviewing the ways in which government and ed tech companies in K-12, higher ed, and MOOCs are working to harvest student data amidst murky federal privacy laws. Citing David Hoffman (Global Privacy Officer, Intel), the piece concludes with the suggestion that the next phase of educational data may be marked not by progress in predictive or analytic capabilities, but rather by advances in ethics.

Your Data Lack Value, and What You Can Do About It by Nick Sheltrown
The piece opens with the claim that “When it comes to data use in schools, our rhetoric outpaces reality. Even though many school districts lay claim to data-driven instruction, too often the expression serves only as a convenient slogan for school improvement plans, conference presentations, and accreditation documents.” Conceptually, ‘data-driven ________’ points to a basic misunderstanding about data science, by suggesting that data come first. Rather than start with the data, Sheltrown offers four basic steps that should define work with data: (1) articulate the information need, (2) identify the best measures, (3) develop processes, and (4) monitor data use for the sake of making adjustments as necessary.

What’s Wrong with Using Data to Grade Teachers? by Mercer Hall & Gina Sipley
The authors detail pushback against New York State’s controversial new teacher evaluation system. Value-added models (VAMs) aim to move beyond evaluation strictly based on student test scores, and instead to identify effective and ineffective teaching. Whether this is a good idea or not, the implementation of the VAM rankings in New York has not only been reductionist, but also laughably poor in its execution.

Getting Privacy Policies Right…the First Time by Brenda Leong & Jules Polonetsky
Lessons that the education sector can learn from recent blunders from the private business sector: (1) Do not claim that you can simply change your policy at any time, (2) Do not simply say that if your company is sold, student data is an asset that will also be sold to the acquirer, and (3) Don’t disclaim responsibility for any third party code on your site.

Publications

Books

Using Evidence of Student Learning to Improve Higher Education
George D. Kuh, Stanley O. Ikenberry, Natasha Jankowski, Timothy Reese Cain, Peter T. Ewell, Pat Hutchings, Jillian Kinzie

American higher education needs a major reframing of student learning outcomes assessment
Dynamic changes are underway in American higher education. New providers, emerging technologies, cost concerns, student debt, and nagging doubts about quality all call out the need for institutions to show evidence of student learning. From scholars at the National Institute for Learning Outcomes Assessment (NILOA), Using Evidence of Student Learning to Improve Higher Education presents a reframed conception and approach to student learning outcomes assessment. The authors explain why it is counterproductive to view collecting and using evidence of student accomplishment as primarily a compliance activity.

Articles

Educational Data Crossroads: Data Literacy for Data-Driven Decision Making in Postsecondary Education
Carol Camp-Yeakey

This paper discusses how educational policies have shaped the development of large-scale educational data and reviews current practices on the educational data use in selected states. Our purposes are to: (1) analyze the common practice and use of educational data in postsecondary education institutions and identify challenges as the educational crossroads; (2) propose the concept of Data Literacy (DL) for teaching (Mandinach & Gummer, 2013a) and its relevance to researchers and stakeholders in postsecondary education; and (3) provide future implications for practices and research to increase educational DL among administrators, practitioners, and faculty in postsecondary education.

Student Privacy in Learning Analytics: An Information Ethics Perspective
Alan Rubel & Kyle M. L. Jones

In recent years, educational institutions have started using the tools of commercial data analytics in higher education. By gathering information about students as they navigate campus information systems, learning analytics “uses analytic techniques to help target instructional, curricular, and support resources” to examine student learning behaviors and change students’ learning environments. As a result, the information educators and educational institutions have at their disposal is no longer demarcated by course content and assessments, and old boundaries between information used for assessment and information about how students live and work are blurring. Our goal in this paper is to provide a systematic discussion of the ways in which privacy and learning analytics conflict and to provide a framework for understanding those conflicts.

Reports

Code of Practice for Learning Analytics: A Literature Review of the Ethical and Legal Issues
Niall Sclater for Joint Information Systems Committee

Consultation by Jisc with representatives from the UK higher and further education sectors has identified a requirement for a code of practice for learning analytics. The complex ethical and legal issues around the collection and processing of student data to enhance educational processes are seen by universities and colleges as barriers to the development and adoption of learning analytics (Sclater 2014a). Consequently, a literature review was commissioned by Jisc to document the main challenges likely to be faced by institutions and to provide the background for a sector-wide code of practice. This review incorporates many relevant issues raised in the literature and the legislation, though it is not intended to provide definitive legal advice for institutions. It draws from 86 publications, more than a third of them published within the last year, from a wide range of sources.

Videos, Presentations, and Webinars

Advancing University Teaching with Analytics: Linking Pedagogical Intent and Student Activity through Data-Based Reflection
Alyssa Wise

Calls for Papers / Participation

Conferences

Workshop: It’s About Time: 4th International Workshop on Temporal Analyses of Learning Data @LAK15 Poughkeepsie, NY | 16 – 20 March, 2015 (SUBMISSION DEADLINE: 11 January 2015)

EDM 2015: 8th International Conference on Education Data Mining Madrid, Spain | 26 – 29 June, 2015 (SUBMISSION DEADLINE: 12 January 2015)
Workshop: Ethics and Privacy in Learning Analytics (#EP4LA) @LAK15 Poughkeepsie, NY | 16 – 20 March, 2015 (SUBMISSION DEADLINE: 15 January 2015)

Workshop: LAK Data Challenge 2015 Poughkeepsie, NY | 16 – 20 March, 2015 (SUBMISSION DEADLINE: 31 January 2015)

EDEN Annual Conference Barcelona, Spain | 9 – 12 June, 2015 (SUBMISSION DEADLINE: 31 January 2015)

The Fourth International Conference on Data Analytics Poughkeepsie, NY | 19 – 24 July, 2015 (SUBMISSION DEADLINE: 27 February 2015)

Journals / Book Chapters

Universities and Knowledge Society Journal (RUSC): Special Section on Learning Analytics (SUBMISSION DEADLINE: 20 January 2015)

Journal of Learning Analytics: Special Section on Multimodal Learning Analytics (SUBMISSION DEADLINE: 1 March 2015)

Employment Opportunities

Post-Doctoral

NYU Steinhardt School of Culture, Education, and Human Development’s Center for Research on Higher Education Outcomes (USA)
12-month postdoctoral position – available for a qualified and creative individual with interests in postsecondary assessment, learning analytics, data management, and institutional research. The Postdoctoral Fellow will be responsible for promoting the use of institutional data sources and data systems for the purpose of developing institutional assessment tools that can inform decision making and contribute to institutional improvement across New York University (NYU). DEADLINE FOR APPLICATION: Open Until Filled

Faculty

University of Colorado Boulder (Boulder, Colorado, USA)
Multiple Tenure Track Positions in Computer Science – The openings are targeted at the level of Assistant Professor, although exceptional candidates at higher ranks may be considered. Research areas of particular interest include secure and reliable software systems, numerical optimization and high-performance scientific computing, and network science and machine learning. DEADLINE FOR APPLICATION: Posted Until Filled

Other

University of Michigan (Ann Arbor, MI)
Senior Digital Media Specialist – The University of Michigan is seeking a qualified Senior Digital Media Specialist to create digital content in support of online and residential educational experiences for the Office of Digital Education & Innovation (DEI). DEADLINE FOR APPLICATION: Posted Until Filled