This Week in Learning Analytics (January 25 – 30, 2015)

Speaking of learning analytics, here’s another picture of my dog, ‘Pocket.’

Headlines

Higher Education


26 January 2015
University of Michigan Considers Data Collection Policy Changes

University of Michigan Vice Provost for Digital Education and Innovation, James Hilton, is working with an informal group to draft a Standard Practice Guide policy to govern the use of collected student data.

SOURCE: The Michigan Daily

25 January 2015
Stanford VPTL Educational Data Sharing Portal Completed

Stanford University’s Vice Provost for Teaching and Learning (VPTL) has completed its portal for sharing archived learner data from select online courses with researchers within Stanford and from other institutions. The purpose of this portal and its accompanying documentation is to enable ethically responsible, peer-governed scientific inquiry. Researchers may apply for archived learner data from Stanford at vpol.stanford.edu/research.

SOURCE: Learning Analytics Google Group via George Siemens

25 January 2015
Student Tracking Initiatives at Ball State University

Ball State University is involved in several initiatives meant to track and store information about student behavior. They are currently working with Rapid Insight to correlate freshman survey results with behavioral patterns gleaned from swipe-card activity, and are exploring the possibility of working with the Educational Advisory Board and/or joining the University Innovation Alliance to improve graduation rates through data analysis.

SOURCE: Tracking students’ digital footprint at Ball State
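
For readers curious what this kind of analysis looks like in practice, here is a minimal sketch of correlating survey responses with swipe-card activity. It is not Rapid Insight’s actual workflow; the file and column names are hypothetical.

    import pandas as pd

    # Hypothetical exports: one row per student in each file.
    surveys = pd.read_csv("freshman_survey.csv")  # columns: student_id, survey_score
    swipes = pd.read_csv("swipe_activity.csv")    # columns: student_id, swipes_per_week

    # Join the two sources on a (pseudonymized) student identifier.
    merged = surveys.merge(swipes, on="student_id")

    # Pearson correlation between self-reported engagement and observed activity.
    r = merged["survey_score"].corr(merged["swipes_per_week"])
    print(f"Survey score vs. swipe activity: r = {r:.2f}")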

K-12


26 January 2015
Wisconsin Middle School Finds Success in Data-Driven Approach

Wrightstown Middle School tracks every detail of the student experience in order to achieve a uniform direction and constant improvement, and has become one of Wisconsin’s top schools. But Wrightstown is affluent. As Tim Kaufman, chair of the University of Wisconsin-Green Bay College of Education, points out, “for schools with less poverty the report cards may be a truer indication of teaching and learning.”

SOURCE: http://www.greenbaypressgazette.com/story/news/education/2015/01/25/data-driven-teaching-earns-wrightstown-school-top-grade/22324059/

25 January 2015
Phys Ed Fitness Tracking in Minnesota Schools

A project to develop fitness tracking technology for use in physical education classes has received $2.1 million in funding from the US Department of Education. Activity and heart-rate monitoring is already producing positive changes in student behavior.

SOURCE: http://www.twincities.com/localnews/ci_27392289/homeroom-new-technology-every-move-counts-phy-ed

25 January 2015
New VA Bills to Prohibit Collection of Student Social Security Numbers

There are two bills before the Virginia General Assembly meant to protect sensitive student information. The first, Bill 1307, would prevent schools from requiring the collection of Social Security numbers and would instead require the development of a separate set of identification numbers. The second, Bill 1334, would require the Department of Education to notify parents in cases where a security breach may have resulted in a FERPA violation.

SOURCE: http://www.tricities.com/news/local/article_958b2d7a-a50b-11e4-a4ed-6bea8f4fd0d7.html

Industry


27 January 2015
Unicon to Develop Integration for Sinclair Community College (SCC) between Apereo Notification Portlet and Student Success Plan (SSP)

Unicon will develop an integration between the Apereo uPortal and Student Success Plan (SSP) for Sinclair Community College. The project aims to make it possible for advisors, faculty, and students to view action items from within uPortal. The integration will be open source so that other institutions running uPortal and SSP can make use of it as well.

SOURCE: http://www.pressreleaserocket.net/unicon-to-develop-integration-for-sinclair-community-college-scc-between-apereo-notification-portlet-and-student-success-plan-ssp/51811/

25 January 2015
Learning Analytics Collaborative Seeks Partners to Transform Teaching Internationally

Sujoy Chaudhuri, data scientist at the American School of Bombay, has initiated a collaborative that aims to “take an active role in applying, researching, and developing learning analytics and data visualizations to transform teaching and learning.” To date, the collaborative has two partners, the American School of Bombay and the American International School – Chennai, and is involved in an incubator project with a high school to develop an interactive transcript.

SOURCE: http://asblac.org/index.htm

Events


27 January 2015
Over 200 People Take Part in LACE Seminars at Bett!
SOURCE: http://www.laceproject.eu/blog/200-people-take-part-lace-seminars-bett/

Research & Reports

Learning Analytics: A Survey
Usha Keshavamurthy & H S Guruprasad | International Journal of Computer Trends and Technology (IJCTT) | Vol. 18 No. 6 (Dec 2014)

ABSTRACT — Learning analytics is a research topic that is gaining increasing popularity in recent time. It analyzes the learning data available in order to make aware or improvise the process itself and/or the outcome such as student performance. In this survey paper, we look at the recent research work that has been conducted around learning analytics, framework and integrated models, and application of various models and data mining techniques to identify students at risk and to predict student performance.

SOURCE: http://www.ijcttjournal.org/Volume18/number-6/IJCTT-V18P155.pdf

“Twitter Archeology” of Learning Analytics and Knowledge Conferences
Bodong Chen, Xin Chen, Wanli Xing | LAK ’15 (In Press) | March 16 – 20, 2015

ABSTRACT — The goal of the present study was to uncover new insights about the learning analytics community by analyzing Twitter archives from the past four Learning Analytics and Knowledge (LAK) conferences. Through descriptive analysis, interaction network analysis, hashtag analysis, and topic modeling, we found: extended coverage of the community over the years; increasing interactions among its members regardless of peripheral and in-persistent participation; increasingly dense, connected and balanced social networks; and more and more diverse research topics. Detailed inspection of semantic topics uncovered insights complementary to the analysis of LAK publications in previous research.

SOURCE: http://meefen.github.io/public/files/Chen_LAK15_Twitter_Archeology.pdf
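
Of the methods the abstract mentions, the interaction network analysis is the easiest to illustrate. Below is a minimal sketch (not the authors’ code; the tweets and handles are invented) that builds a directed @-mention network from a conference hashtag archive and reports the kind of density measure the study describes.

    import re
    import networkx as nx

    # Invented tweet archive: (author, text) pairs from a conference hashtag.
    tweets = [
        ("alice", "Great keynote by @bob! #lak15"),
        ("bob", "@alice thanks — slides coming soon #lak15"),
        ("carol", "Trying to replicate @alice and @bob's analysis #lak15"),
    ]

    G = nx.DiGraph()
    for author, text in tweets:
        for mention in re.findall(r"@(\w+)", text):
            # Each @-mention adds (or strengthens) a directed tie author -> mention.
            w = G.get_edge_data(author, mention, default={"weight": 0})["weight"]
            G.add_edge(author, mention, weight=w + 1)

    print("nodes:", G.number_of_nodes(), "edges:", G.number_of_edges())
    print("density:", nx.density(G))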

Evidence Hub Review
Doug Clow, Simon Cross, Rebecca Ferguson, Bart Rienties | Learning Analytics Community Exchange (LACE) | D2.5 (Dec 2014)

ABSTRACT — This document sets out the background, development and launch for the LACE Evidence Hub (http://evidence.laceproject.eu/), a way of gathering the evidence about learning analytics in an accessible form. It describes the functionality of the site and analyses its quantitative and thematic content to date and the state of evidence. It then sets out plans for further work.

SOURCE: http://evidence.laceproject.eu/

Opinions

Editorials


Leveraging Learning Analytics
Niall Sclater | University Business | 27 January 2015

Sclater discusses how UK institutions are getting started with learning analytics, how they are grappling with ethical questions, and what they should consider as they move ahead. Motivations for investment in learning analytics are varied, but retention continues to be a driving concern. Ethical implications are often raised in conversation, but have yet to prove very problematic in practice. Exemplary here is the Open University, which worked with its student association to develop a policy for the ethical use of student data. Sclater emphasizes that the UK (along with everyone else) is still in the early stages, but UK institutions are increasingly interested in understanding how best to make use of the vast amounts of data they are accumulating.

SOURCE: http://universitybusiness.co.uk/Article/leveraging_learning_analytics

Colorado’s New Higher Education Funding Model
Mark Ferrandino | The Denver Post | 24 January 2015

Prescribed by HB 1319, Colorado’s new model for funding higher education is supported by real-time, student-level data. Ferrandino commends the stakeholders “who helped launch a more transparent higher education funding system based on data and incentives aligned with the policy goals of the General Assembly.”

SOURCE: http://www.denverpost.com/opinion/ci_27380544/new-higher-education-funding-model

Blogs


Teachers Turn to Data Analysis to Help Students
Kristen Nicole | Silicon Angle | 28 January 2015

Interview with Raj Chary, Vice President of Technology/Architecture at Triumph Learning, LLC. To help educators measure the skills K-12 students need to succeed in subsequent grades, Triumph Learning built Get Waggle, which launched in July 2014. Built from the ground up for grades 3-8, the platform supports Common Core subjects, including math and English language arts, and gives teachers an interactive platform for planning lessons that meet required standards.

SOURCE: http://siliconangle.com/blog/2015/01/28/teachers-turn-to-data-analysis-to-help-students/

A Taxonomy of the Ethical and Legal Issues of Learning Analytics v0.1
Niall Sclater | Jisc | 28 January 2015

Sclater provides a helpful summary of the literature review he conducted in December on behalf of Jisc. Following a workshop with the LACE Project and Apereo, during which the taxonomy will be clarified and refined, Sclater intends to use the taxonomy as the basis for work with the Code of Practice Advisory Group and the goal of developing a useful Code of Practice.

SOURCE: http://analytics.jiscinvolve.org/wp/2015/01/30/a-taxonomy-of-the-ethical-and-legal-issues-of-learning-analytics-v0-1/

K-12 Lessons and Higher Education Opportunity
Timothy D. Harfield | timothyharfield.com | 28 January 2015

Summary of a seminar at Emory University led by Ben Sayeski, Managing Partner of Education Strategy Consulting. Four major themes are identified, including (1) the importance of moving from the knowable to the known, (2) the value of transparency, (3) the priority of asking questions over making recommendations, and (4) maintaining a view to agility through platform agnosticism.

SOURCE: http://timothyharfield.com/blog/2015/01/25/was-the-data-analytics-and-learning-mooc-aka-dalmooc-a-success/

My First Course: Learning Analytics in the Knowledge Age
Bodong Chen | http://meefen.github.io/ | 27 January 2015

Bodong Chen (Assistant Professor in the Department of Curriculum and Instruction in the College of Education and Human Development at the University of Minnesota) discusses the details of a course that he is teaching in the Spring 2015 semester: Learning Analytics in the Knowledge Age. The course is distinct from previous learning analytics courses on account of (1) an emphasis on learning rather than a heavy focus on analytics and (2) the aim of producing new knowledge to solve real-world problems.

SOURCE: http://meefen.github.io/blog/2015/01/27/learning-analytics-course/

Was the Data, Analytics, and Learning MOOC (aka DALMOOC) a Success?
Timothy D. Harfield | timothyharfield.com | 25 January 2015

Response to a recent post in which the author claimed that the success of the Data, Analytics, and Learning MOOC (aka DALMOOC) was something that should not be measured, and implied that we should be resistant to the concepts of ‘measurement’ and ‘success’ in general. Harfield points out the irony of such a position coming from the designer of a MOOC about learning analytics, and uses the piece as an opportunity to reflect critically upon what ‘success’ and ‘measurement’ mean.

SOURCE: http://timothyharfield.com/blog/2015/01/25/was-the-data-analytics-and-learning-mooc-aka-dalmooc-a-success/

Presentations

Learning Analytics: Threats and Opportunities
Martin Hawksey | 28 January 2015

Read commentary on Martin’s presentation here: https://mashe.hawksey.info/2015/01/presentation-learning-analytics-threats-and-opportunities/

Technical Challenges for Realizing Learning Analytics
Ralf Klamma | 29 January 2015

Was the Data, Analytics, and Learning MOOC (aka DALMOOC) a Success?

DALMOOC Distortion
In a recent blog post (2015-01-29 – The site on which this original post was published has been taken down, but a cached version of the piece is still available HERE), Matt Crosslin raised the question of how one might go about measuring the success of a MOOC. Matt is the developer for the recently completed Data, Analytics, and Learning MOOC (DALMOOC), offered by the University of Texas at Arlington using the edX platform. He concludes his piece with the following:

So, at the end of the day, I will be able to come up with some philosophical jargon to “prove” that DALMOOC was a success to the powers-that-be who ask – all of which will be true. But to be honest, the only thing I really want to do is shrug my shoulders and say “beats me – go look the participant output itself and see if that looks like success to you.”

Looks like success to me.

There are several issues with Matt’s piece that I would like to address here:

  • Dissonance between course design and the course content – it is remarkable to see such resistance to measuring the success of a course whose primary aim was to facilitate the ability to accomplish exactly that
  • Resistance to the concept of ‘success’ in general – Matt implies (and elsewhere insists) that ‘success’ is a “linear instructivist concept” and, as such, should be avoided except as part of post hoc justifications to funders and university leadership
  • An insistence that evidence speaks for itself – success is always a function of the extent to which a thing achieves the aim(s) for which it was created. By shrugging his shoulders and leaving evaluation up to course consumers, Matt implies either that the DALMOOC was produced without any particular aim or set of objectives, or that those goals and objectives have very little to do with the course’s success. In light of the tremendous amount of effort that went into the DALMOOC’s complex course design, it is clear that neither of these is right. The dismissive way in which Matt concludes his post fails to do justice to the thoughtfulness of his DALMOOC course design.

Following a brief overview of the DALMOOC and the introduction of several key terms, I will critically examine each of these issues in an attempt to bring some clarity to the concept of success. As always, I mean this in the spirit of philosophical charity, and as a starting point for a discussion that I feel is worth having.

Is the DALMOOC Instructivist or Connectivist?

In his blog post, Matt states that the DALMOOC experience was successful because “everyone that work (sic) on DALMOOC lived and are still on speaking terms.” A well-designed Massive Open Online Course is a tremendous amount of work. The amount of time and effort involved in delivering the DALMOOC, however, increased geometrically as a consequence of the complexity of its design (It was fairly typical to see Matt put in 60+ hours per week while the course was live). The DALMOOC was implemented as both a non-linear, learner-centered connectivist MOOC (cMOOC) — The ‘Social Learning Path’ — and a linear, teacher-centered instructivist MOOC (xMOOC) — the ‘Guided Learner Path’. The reasons for this complex ‘dual-layer’ design are not particularly well-articulated, but, in light of Matt’s personal commitment to the connectivist perspective (and the fact that the course staff includes George Siemens — one of the two ‘founding fathers’ of this perspective), they are presumably aligned with the values of connectivist instructional design in general. As George Siemens explains in Connectivism: A Learning Theory for the Digital Age (2005), the basic principles of connectivism are as follows:

  • Learning and knowledge rests in diversity of opinions.
  • Learning is a process of connecting specialized nodes or information sources.
  • Learning may reside in non-human appliances.
  • Capacity to know more is more critical than what is currently known.
  • Nurturing and maintaining connections is needed to facilitate continual learning.
  • Ability to see connections between fields, ideas, and concepts is a core skill.
  • Currency (accurate, up-to-date knowledge) is the intent of all connectivist learning activities.
  • Decision-making is itself a learning process. Choosing what to learn and the meaning of incoming information is seen through the lens of a shifting reality. While there is a right answer now, it may be wrong tomorrow due to alterations in the information climate affecting the decision.

Guided by these basic principles, George Siemens explains that the DALMOOC was designed to solve the following problems with MOOCs:

      1. Students often flounder in MOOCs as there is limited social contact between learners and limited timely support.
      2. Learners have limited engagement in developing knowledge together. Many MOOCs reflect a structured and linear process of content presentation. There is little alignment with the architecture of knowledge in a participative age.
      3. Learners have a difficult time getting to know each other or finding like others as major platforms do not focus on developing learner profiles.
      4. The connection between learning and application is hampered as MOOC resources do not always persist after a course has ended and there is limited search functionality in MOOCs.
      5. Courses are not adaptive and serve the same content to all learners, regardless of prior knowledge.

In order to address these issues, the course design was meant to (1) facilitate timely access to resources through a tool called Quick Helper, (2) foster social embeddedness through a distributed information structure (learners were encouraged to produce their own learning spaces and exercise agency by connecting fragmented ‘information pieces’), (3) encourage persistence through the use of ProSolo as a platform for engaging course content and community following course completion, and (4) promote adaptability by providing learners with an assignment bank that allowed them to challenge themselves by selecting assignments with various levels of complexity.
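
To make the adaptability strategy concrete, here is a minimal sketch of what an assignment bank of the kind described might look like as a data structure. The field names and difficulty scale are my own assumptions, not the actual DALMOOC implementation.

    from dataclasses import dataclass

    @dataclass
    class Assignment:
        title: str
        difficulty: int  # hypothetical scale: 1 = introductory, 3 = advanced

    # A toy bank: learners self-select by difficulty rather than
    # receiving one fixed sequence of tasks.
    BANK = [
        Assignment("Explore a course dataset in Tableau", 1),
        Assignment("Model learner dropout from activity logs", 2),
        Assignment("Run a social network analysis of forum posts", 3),
    ]

    def options_for(self_assessed_level: int):
        """Offer every assignment at or within one step of the learner's level."""
        return [a for a in BANK if abs(a.difficulty - self_assessed_level) <= 1]

    print([a.title for a in options_for(1)])  # the two easier assignments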

On the surface, the DALMOOC is structured in such a way as to allow learners the option of navigating the course either as a connectivist cMOOC or as an instructivist xMOOC. In providing learners with the option, however, and actively encouraging them to bounce between paths in a way that best meets their needs and interests as individual learners, the dual-path format aims to nurture connection-making (between content, modalities, and people) and to encourage the learners to take responsibility for their learning as a result of embedding decision-making as an integral part of the MOOC’s design. These are both central principles of connectivism. This is not to say that the DALMOOC is a cMOOC (nor is it, of course, to call it an xMOOC). The DALMOOC was truly dual-layer in its design, in a way that permitted either an xMOOC or cMOOC experience. As is clear from the comments of both Matt and George, however, the motivations and values that underlie the way in which the dual-layer MOOC was implemented are connectivist through and through.

Evaluating DALMOOC’s Success

1. Dissonance between Course Design and Subject Matter

Here I would merely like to point out the irony of a class on learning analytics being designed by one who is skeptical of the possibility of measuring student success, and indeed who is ideologically resistant to such endeavors. In his blog post, Matt mentions the term ‘measurement’ only twice: once in the title and once in his introduction to describe several ways in which MOOC success is typically demonstrated. In a follow-up tweet, however, he asserts that the idea of “measurable success” is behaviorist and so suspect.

He also equates the concept of success with instructivism, with the implication that it is incompatible with a connectivist view of education.

I will address the concept of success in the next section. For now, it is enough to simply note the tension between these views and the broadly accepted definition of learning analytics as “the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs.” ‘Measurement’ is clearly central to learning analytics. So, too, is ‘success,’ since the notion of optimization implies a standard of performance against which results are assessed. It is quite possible that learning analytics is opposed to connectivist values — i.e. that there is no such thing as a connectivist learning analytics. It is for this reason that I am not arguing that Matt is guilty of contradiction. There is, however, an obvious tension between his flavor of connectivism and the content of the course his connectivist course design is intended to support.

2. Resistance to Success

Matt is strongly opposed to efforts aimed at demonstrating the DALMOOC’s success (or of demonstrating the success of MOOCs in general). He reduces such efforts to “philosophical jargon,” the function of which is merely to provide post hoc evidence that would assure sponsors that their money was well spent. He equates the concept of ‘success’ with instructivism, and ‘measurable success’ with behaviorism. Is this right?

I would like to propose an (Aristotelian) definition of success: success is the extent to which the activity of a thing accords with the function for which that thing was produced. Given this definition, does the concept of success apply to connectivist teaching practices? Certainly! A connectivist teaching practice is successful to the extent that it accords with connectivist values. Does it promote diversity of opinion, foster the connecting of specialized information sources, offer a space for the non-human, cultivate the capacity to know, nurture continual learning, promote interdisciplinarity, maintain a view to the present, and encourage deliberation? If so, then SUCCESS!

Looking at the DALMOOC specifically, it is quite clear that the project set out to achieve a specific set of goals: to facilitate timely access to resources, foster social embeddedness, encourage persistence, and promote adaptability. The ‘right’ answer in response to the question “was DALMOOC successful?” is NOT to shrug one’s shoulders and say “beats me.” Rather, the right answer would be something to the effect that “we designed the course with four objectives in mind, which we sought to achieve through the effective use of three different tools (Quick Helper, ProSolo, and an assignment bank), as well as a course design meant to foster community through a distributed information structure. Looking at student behavior (from course activity logs, etc.) and opinions (as expressed in discussion boards, blog posts, on social media, etc.), we found that some strategies demonstrably served to mitigate many problems that traditionally plague MOOCs. Other strategies had little effect, and others still were found to exacerbate problems while creating new ones. Overall, then, DALMOOC was successful in demonstrating the value of several of our approaches, while at the same time providing us with valuable feedback which we plan to incorporate in subsequent iterations of the course” … or something like that.

Success is not an idea to be resisted. Rather, the idea of success merely affirms the fact that our activities are intentioned. Can success be measured? Again, it is important to be clear about what we mean by ‘measurement.’ If by measurement we mean ‘quantified,’ then I wholeheartedly agree with Matt that there are some forms of success that cannot be measured. Looking again at the list of connectivist values above, values like ingenuity and personal responsibility are goals of teaching practice, and a connectivist would agree that a student who has embraced those values in thought and in action is a successful student (and will also be more likely to survive and thrive in the 21st century), but the achievement of each of these goals is difficult to measure. In fact, the attempt to measure a student’s ability in these areas may actually be antithetical to student achievement. If, on the other hand, we view measurement more broadly to refer simply to the comparison of something to a particular standard, then there is absolutely no reason why success cannot be measured. Indeed, judging the success of an activity is always a result of measurement in this sense, since it is only by comparing the performance of an activity against a particular standard of excellence that a performance may be judged at all.

3. DALMOOC as a Dinglehopper

I would like to conclude this post by commenting on the idea that success is in the eye of the beholder. Matt concludes his blog post by asking the reader to “go look at the participant output itself and see if that looks like success to you.” In this remark, it seems to me that Matt has confused two perspectives: (1) the producer perspective, and (2) the consumer perspective. From the perspective of the producer, the success of the DALMOOC is a function of the extent to which its instructional design was seen to facilitate the achievement of clearly stated course objectives. From the perspective of the consumer, however, the goals of the course design are largely irrelevant. The consumer comes to a course as an artifact, and brings to it a set of personal values, goals, and expectations. What this means is that it is entirely possible for a course to be successful according to its own design objectives, while at the same time producing an experience that is perceived as unsuccessful by some of its students. It is likewise possible for a course to be successful by its own standard, but be perceived as unsuccessful when assessed by others who do not share the set of values that governed the course’s design (i.e. those for whom completion rates, enrollment numbers, and earned certificates are really all that is important).

The success of a course is something that can only be judged relative to the goals and objectives that motivated its production. I think here of the dinglehopper. For those who produced the dinglehopper, its function is that of a fork…because it is a fork. Its design is successful to the extent that it aids in the transportation of food from plate to mouth. For Scuttle the seagull, the dinglehopper is successful because it helps him to style his feathers. We see success in both cases. The latter is a successful experience, but only in the former do we see a successful object.

To conclude, then, questions about the success of the DALMOOC are questions about the values and objectives of the course designers, and the extent to which the implementation of the course design saw the expected results. To “shrug my shoulders and say ‘beats me’” is a terrible response, because it implies either a lack of awareness about, or reflection on, the DALMOOC objectives, or else a position that would put so little stock in those objectives that their achievement is not worth evaluating. From other accounts that have been written, and from my limited interaction with Matt, it is abundantly clear that neither is correct. Matt Crosslin and the other members of the DALMOOC design team have clearly done a great deal of work designing and implementing the course, and with a constant view to the course’s success. At the end of the day, the main issue with Matt’s post is that it fails to do justice to the work that he and the rest of the team have done. It is unfortunately flippant, and lamentably dismissive of many things both related and unrelated to the DALMOOC and its evaluation.

My goal in writing this piece has been to clarify several concepts while also coming to the defense of a project that is laudable in its goals, ambitious in its implementation, and (I suspect) successful in ways that will positively inform the design and effectiveness of future MOOCs. In the course of writing this piece, I have also had the great opportunity to learn more about the DALMOOC, and even more about connectivism (a term that I had heard, but about which I knew nothing). I am grateful to Matt for his blog post — which functioned as an artifact and as an opportunity to explore these subjects — and also for his willingness to engage me in a conversation motivated by the spirit of philosophical charity and for the sake of the community as a whole.

As always, comments are welcome.

This Week in Learning Analytics (October 18 – October 24, 2014)

Chuck Dziuban at Emory University, speaking on the topic of “Teaching and Learning in an Evolving Educational Environment”

Latest News

19 October 2014
Data, Analytics, and Learning MOOC goes live
The long-awaited edX MOOC on Data, Analytics, and Learning went live this week. The #DALMOOC, which is taught by George Siemens, Carolyn Rosé, Dragan Gasevic, and Ryan Baker, provides an introduction to learning analytics, its tools and methods, and various ways in which it might be deployed in educational environments. It is also an experiment in its own right, allowing for multiple learning pathways: either in a standard edX xMOOC format, or as a social competency-based and self-directed cMOOC.

I have yet to engage much in the course but, at first glance, I have one small (or large, depending on how you look at it) criticism: The DALMOOC course agreement is confusing.

Data from participation in this Massive Open Online Course (MOOC) will be used for research purposes in order to gain knowledge for better design of support for student learning in MOOCs. When participants are logged in to this course, the information they enter into the course interface will be logged for analysis. The data will not be shared beyond the researchers who have approval to use this data. Personal identifiers will be replaced by unique identifiers. A possible risk is a breach of confidentiality. Participation is voluntary, and participants may stop participating at any time. There will be no cost to participants for participation in this study, and likewise no financial compensation will be offered. There may be no personal benefit from participation in the study beyond the knowledge received in the area of learning analytics, which is the topic of the course.

On the one hand, the course agreement (note: NOT a research participation agreement) is the first page that the student encounters when clicking the ‘Courseware’ tab (following registration), and implies that participation in the course is contingent upon one’s agreement to participate in the research project. This implied contingency would seem to contradict the first ‘O’ in MOOC. On the other hand, it states that participation is voluntary and that it may be withdrawn at any time. What is not clear is whether withdrawal from participation means withdrawal from the study or from the course. The way that this agreement is structured strongly implies that course participation requires participation in the study. As a test, I have not clicked the “I have read the above and consent to participation” button and have, to date, not been limited in my ability to participate in the course. I wonder about the ethics of this approach to gaining consent and, at the very least, wish that the language of the DALMOOC Course Agreement were less equivocal. [Read more]

21 October 2014
Study will Teach Algebra with Student-Authored Stories that Draw on Their Own Interests
A new study by Candace Walkington (Southern Methodist University) will test the effectiveness of teaching algebra by embedding algebraic concepts into students’ day-to-day lives. The study uses a mixed methodology, employing qualitative methods and data mining to test the effectiveness of personalized instruction on conceptual comprehension and retention, and attitudes toward math.

This is an approach that is often employed (or rather SHOULD often be employed) in the humanities (nothing like using love and sex to make sense out of Hegel’s master-slave dialectic), and resonates with the educational philosophy of John Dewey, for whom learning is a function of a concept’s importance, which, in turn, is a function of past experience, present necessity, and future aspiration. It is also an approach that might serve to ‘catch’ more humanistically oriented students who do not consider themselves very ‘math’ or ‘science.’ [Read more]

Latest Blogs

Social Learning, Blending xMOOCs & cMOOCs, and Dual Layer MOOCs by Matt Crosslin
A really nice discussion of the design methodology for #DALMOOC. Specifically, Crosslin addresses three primary questions, of which only two are really interesting (the third involved color selection):

  • Don’t most MOOCs blend elements of xMOOCs and cMOOCs together? The xMOOC/cMOOC distinction is too simple and DALMOOC is not really doing anything different.
  • Isn’t it ironic to have a Google Hangout to discuss an interactive social learning course but not allow questions or interaction?

Learning analytics using business intelligence systems by Niall Sclater
A review of several generic Business Intelligence solutions (including Cognos, Qlikview, and Tableau) which are typically employed for the sake of gaining operational insight, and ways in which they might be leveraged to gain insight into student learning experience as well.

Use of an Early Warning System by Stephen J. Aguilar
Video of a lightning talk version (~5 min) of a talk originally delivered at the 2014 Learning Analytics and Knowledge Conference, on “Perception and Use of an Early Warning System During a Higher Education Transition Program.”

Teaching and Learning in an Evolving Educational Environment by Charles Dziuban
Full video of the inaugural lecture in Emory University’s 2014-2015 Learning Analytics Speaker Series. Dziuban uses a variety of metaphors (including the Anna Karenina principle) to offer a perspective on learning analytics through the lens of the scholarship of teaching and learning, and explains the successful support model that he has implemented with faculty at the University of Central Florida.

On the Question of Validity in Learning Analytics by Adam Cooper
Cooper calls for a rethinking of the term ‘validity’ within the context of learning analytics. Although he covers himself by saying that “This post is a personal view, incomplete and lacking academic rigour,” what he nevertheless seems to call for is a conflation of methodological and ethical concerns, and a loosening of conceptual clarity in the name of facilitating practice by non-experts.

At the end of his post, Cooper asks: “what do you think?” When dealing with technologies with the likelihood of significantly affecting human behavior, conceptual sophistication in both ethical and methodological matters is more, not less, important. In the absence of rigor, we run the risk of under-appreciating complexity, and implementing interventions that cause harm. What non-expert practitioners need is not a ‘dumbed-down’ vocabulary (or technology that does the work), but rather a set of expert advisors capable of fully assessing problems and solutions from a wide variety of perspectives in order to arrive at solutions that, even if not perfect, are at least fully informed.

Recent Publications

Learning Analytics as a Metacognitive Tool
Eva Durall & Begoña Gros

The use of learning analytics is entering in the field of research in education as a promising way to support learning. However, in many cases data are not transparent for the learner. In this regard, Educational institutions shouldn’t escape the need of making transparent for the learners how their personal data is being tracked and used in order to build inferences, as well as how its use is going to affect in their learning. In this contribution, we sustain that learning analytics offers opportunities to the students to reflect about learning and develop metacognitive skills. Student-centered analytics are highlighted as a useful approach for reframing learning analytics as a tool for supporting self-directed and self-regulated learning. The article also provides insights about the design of learning analytics and examples of experiences that challenge traditional implementations of learning analytics.

Premise of Learning Analytics for Educational Context: Through Concept to Practice
Yasemin Gülbahar & Hale Ilgaz

The idea of using recorded data for evaluating the effectiveness of teaching-learning process and using the outcomes for improvement and enhancing quality lead to the emergence of the field known as “learning analytics”. Based on the analysis of this data, possible predictions could be reached to make suggestions and give decisions in order to implement interventions for the improvement of the quality of the process. Hence, the concept of “learning analytics” is a promising and important field of study, with its processes and potential to advance e-learning. In this study, learning analytics are defined in two ways – business and e-learning environments. As an e-learning environment, Moodle LMS was chosen and analyzed through SAS (Statistical Analysis System) Level of Analytics. According to the analysis, some practical ideas developed. However learning analytics seem to be mostly based on quantitative data, whereas qualitative insights can also be gained through various approaches which can be used to strengthen the numerical data by providing detailed facts about a phenomenon. Thus, in addition to focusing on the learner, for research studies at the course, program, and institutional level; the research should include instructors and administrators in order to reveal the best practices of instructional design and fulfill the premise of effective teaching.

Calls for Papers / Participation

Conferences

NEW! Open Learning Analytics Network – Summit Europe Amsterdam | 1 January 2015 (APPLICATION DEADLINE: None, but spaces are limited)

Third International Conference on Data Mining & Knowledge Management Process Dubai, UAE | 23-24 January, 2015 (APPLICATION DEADLINE: 31 October 2014)

Learning at Scale 2015 Vancouver, BC (Canada) | 14 – 15 March 2015 (SUBMISSION DEADLINE: 22 October 2014)

2015 Southeast Educational Data Symposium (SEEDS) Emory University (Atlanta, GA) | 20 Feb 2015 (APPLICATION DEADLINE: 14 November 2014)

11th International Conference on Computer Supported Collaborative Learning: “Exploring the material conditions of learning: Opportunities and challenges for CSCL” University of Gothenburg, Sweden | 7 – 11 June 2015 (SUBMISSION DEADLINE: 17 November 2014)

28th annual Florida AI Research Symposium (FLAIRS-28) on Intelligent Learning Technologies Hollywood, Florida, USA (SUBMISSION DEADLINE: 17 November 2014)

Journals / Book Chapters

NEW! Universities and Knowledge Society Journal (RUSC): Special Section on Learning Analytics (SUBMISSION DEADLINE: 20 January 2015)

Employment Opportunities

Simon Fraser University (Burnaby, BC, Canada)
Tenure Track Position In Educational Technology And Learning Design – The Faculty of Education, Simon Fraser University (http://www.sfu.ca/education.html) seeks applications for a tenure-track position in Educational Technology and Learning Design at the Assistant Professor rank beginning September 1, 2015, or earlier. The successful candidate will join an existing complement of faculty engaged in Educational Technology and Learning Design, and will contribute to teaching and graduate student supervision in our vibrant Masters program at our Surrey campus and PhD program at our Burnaby campus. DEADLINE FOR APPLICATION: December 1, 2014

University of Technology, Sydney (Sydney, AUS)
Research Fellow: Data Scientist – We invite applications from highly motivated data scientists wishing to work in a dynamic team, creating tools to provide insight into diverse datasets within the university and beyond. We welcome applicants from diverse backgrounds, although knowledge of educational theory and practice will be highly advantageous. You are a great communicator, bringing expertise in some combination of statistics, data mining, machine learning and visualisation, and a readiness to stretch yourself to new challenges. We are ready to consider academic experience from Masters level to several years’ Post-Doctoral research, as well as candidates who have pursued non-academic, more business-focused tracks. DEADLINE FOR APPLICATION: Posted Until Filled

University of Michigan (Ann Arbor, MI)
Senior Digital Media Specialist – The University of Michigan is seeking a qualified Senior Digital Media Specialist to create digital content in support of online and residential educational experiences for the Office of Digital Education & Innovation (DEI). DEADLINE FOR APPLICATION: Posted Until Filled

NYU Steinhardt School of Culture, Education, and Human Development’s Center for Research on Higher Education Outcomes (USA)
12-month postdoctoral position – available for a qualified and creative individual with interests in postsecondary assessment, learning analytics, data management, and institutional research. The Postdoctoral Fellow will be responsible for promoting the use of institutional data sources and data systems for the purpose of developing institutional assessment tools that can inform decision making and contribute to institutional improvement across New York University (NYU). DEADLINE FOR APPLICATION: Open Until Filled