Was the Data, Analytics, and Learning MOOC (aka DALMOOC) a Success?

In a recent blog post (2015-01-29 – The site on which this original post was published has been taken down, but a cached version of the piece is still available HERE), Matt Crosslin raised the question of how one might go about measuring the success of a MOOC. Matt is the developer for the recently completed Data, Analytics, and Learning MOOC (DALMOOC) offered by the University of Texas at Arlington using the edX platform. He concludes his piece with the following:

So, at the end of the day, I will be able to come up with some philosophical jargon to “prove” that DALMOOC was a success to the powers-that-be who ask – all of which will be true. But to be honest, the only thing I really want to do is shrug my shoulders and say “beats me – go look the participant output itself and see if that looks like success to you.”

Looks like success to me.

There are several issues with Matt’s piece that I would like to address here:

  • Dissonance between course design and the course content – it is remarkable to see such resistance to measuring the success of a course whose primary aim was to facilitate the ability to accomplish exactly that
  • Resistance to the concept of ‘success’ in general – Matt implies (and elsewhere insists) that ‘success’ is a “linear instructivist concept” and, as such, should be avoided except as part of post hoc justifications to funders and university leadership
  • An insistence that evidence speaks for itself – success is always a function of the extent to which a thing achieves the aim(s) for which it was created. By shrugging his shoulders and leaving evaluation up to course consumers, Matt implies either that the DALMOOC was produced without any particular aim or set of objectives, or that those goals and objectives have very little to do with the course’s success. In light of the tremendous amount of effort that went into the DALMOOC’s complex course design, it is clear that neither of these is right. The dismissive way in which Matt concludes his post fails to do justice to the thoughtfulness of his DALMOOC course design.

Following a brief overview of the DALMOOC and the introduction of several key terms, I will critically examine each of these issues in an attempt to bring some clarity to the concept of success. As always I mean this in the spirit of philosophical charity, and as a starting point for a discussion that I feel is worth having.

Is the DALMOOC Instructivist or Connectivist?

In his blog post, Matt states that the DALMOOC experience was successful because “everyone that work (sic) on DALMOOC lived and are still on speaking terms.” A well-designed Massive Open Online Course is a tremendous amount of work. The amount of time and effort involved in delivering the DALMOOC, however, increased geometrically as a consequence of the complexity of its design (It was fairly typical to see Matt put in 60+ hours per week while the course was live). The DALMOOC was implemented as both a non-linear, learner-centered connectivist MOOC (cMOOC) — The ‘Social Learning Path’ — and a linear, teacher-centered instructivist MOOC (xMOOC) — the ‘Guided Learner Path’. The reasons for this complex ‘dual-layer’ design are not particularly well-articulated, but, in light of Matt’s personal commitment to the connectivist perspective (and the fact that the course staff includes George Siemens — one of the two ‘founding fathers’ of this perspective), they are presumably aligned with the values of connectivist instructional design in general. As George Siemens explains in Connectivism: A Learning Theory for the Digital Age (2005), the basic principles of connectivism are as follows:

  • Learning and knowledge rests in diversity of opinions.
  • Learning is a process of connecting specialized nodes or information sources.
  • Learning may reside in non-human appliances.
  • Capacity to know more is more critical than what is currently known.
  • Nurturing and maintaining connections is needed to facilitate continual learning.
  • Ability to see connections between fields, ideas, and concepts is a core skill.
  • Currency (accurate, up-to-date knowledge) is the intent of all connectivist learning activities.
  • Decision-making is itself a learning process. Choosing what to learn and the meaning of incoming information is seen through the lens of a shifting reality. While there is a right answer now, it may be wrong tomorrow due to alterations in the information climate affecting the decision.

Guided by these basic principles, George Siemens explains that the DALMOOC was designed in order to solve the following problems with MOOCs:

      1. Students often flounder in MOOCs as there is limited social contact between learners and limited timely support.
      2. Learners have limited engagement in developing knowledge together. Many MOOCs reflect a structured and linear process of content presentation. There is little alignment with the architecture of knowledge in a participative age.
      3. Learners have a difficult time getting to know each other or finding like others as major platforms do not focus on developing learner profiles.
      4. The connection between learning and application is hampered as MOOC resources do not always persist after a course has ended and there is limited search functionality in MOOCs.
      5. Courses are not adaptive and serve the same content to all learners, regardless of prior knowledge.

In order to address these issues, the course design was meant to (1) facilitate timely access to resources through a tool called Quick Helper, (2) foster social embeddedness through a distributed information structure (learners were encouraged to produce their own learning spaces and exercise agency by connecting fragmented ‘information pieces’), (3) encourage persistence through the use of ProSolo as a platform for engaging course content and community following course completion, and (4) promote adaptability by providing learners with an assignment bank that allowed them to challenge themselves by selecting assignments with various levels of complexity.

On the surface, the DALMOOC is structured in such a way as to allow learners the option of navigating the course either as a connectivist cMOOC or as an instructivist xMOOC. In providing learners with the option, however, and actively encouraging them to bounce between paths in a way that best meets their needs and interests as individual learners, the dual-path format aims to nurture connection-making (between content, modalities, and people) and to encourage learners to take responsibility for their learning by embedding decision-making as an integral part of the MOOC’s design. These are both central principles of connectivism. This is not to say that the DALMOOC is a cMOOC (nor is it, of course, to call it an xMOOC). The DALMOOC was truly dual-layer in its design, in a way that permitted either an xMOOC or cMOOC experience. As is clear from the comments of both Matt and George, however, the motivations and values that underlie the way in which the dual-layer MOOC was implemented are connectivist through and through.

Evaluating DALMOOC’s Success

1. Dissonance between Course Design and Subject Matter

Here I would merely like to point out the irony of a class on learning analytics being designed by one who is skeptical of the possibility of measuring student success, and indeed who is ideologically resistant to such endeavors. In his blog post, Matt mentions the term ‘measurement’ only twice: once in the title and once in his introduction to describe several ways in which MOOC success is typically demonstrated. In a follow-up tweet, however, he asserts that the idea of “measurable success” is behaviorist and therefore suspect.

He also equates the concept of success with instructivism, with the implication that it is incompatible with a connectivist view of education.

I will address the concept of success in the next section. For now, it is enough to simply note the tension between these views and the broadly accepted definition of learning analytics as “the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs.” ‘Measurement’ is clearly central to learning analytics. So, too, is ‘success,’ since the notion of optimization implies a standard of performance against which results are assessed. It is quite possible that learning analytics is opposed to connectivist values — i.e. that there is no such thing as a connectivist learning analytics. It is for this reason that I am not arguing that Matt is guilty of contradiction. There is, however, an obvious tension between his flavor of connectivism and the content of the course his connectivist course design is intended to support.

2. Resistance to Success

Matt is strongly opposed to efforts aimed at demonstrating the DALMOOC’s success (or of demonstrating the success of MOOCs in general). He reduces such efforts to “philosophical jargon,” the function of which is merely to provide post hoc evidence that would assure sponsors that their money was well spent. He equates the concept of ‘success’ with instructivism, and ‘measurable success’ with behaviorism. Is this right?

I would like to propose an Aristotelian definition of success: Success is the extent to which the activity of a thing accords with the function for which that thing was produced. Given this definition, does the concept of success apply to connectivist teaching practices? Certainly! A connectivist teaching practice is successful to the extent that it accords with connectivist values. Does it promote diversity of opinion, foster the connecting of specialized information sources, offer a space for the non-human, cultivate the capacity to know, nurture continual learning, promote interdisciplinarity, maintain a view to the present, and encourage deliberation? If so, then SUCCESS!

Looking at the DALMOOC specifically, it is quite clear that the project set out to achieve a specific set of goals: to facilitate timely access to resources, foster social embeddedness, encourage persistence, and promote adaptability. The ‘right’ answer in response to the question “was DALMOOC successful?” is NOT to shrug one’s shoulders and say “beats me.” Rather, the right answer would be something to the effect that “we designed the course with four objectives in mind, which we sought to achieve through the effective use of three different tools (Quick Helper, ProSolo, and an assignment bank), as well as a course design meant to foster community through a distributed information structure. Looking at student behavior (from course activity logs, etc.) and opinions (as expressed in discussion boards, blog posts, on social media, etc.), we found that some strategies demonstrably served to mitigate many problems that traditionally plague MOOCs. Other strategies had little effect, and others still were found to exacerbate problems while creating new ones. Overall, then, DALMOOC was successful in demonstrating the value of several of our approaches, while at the same time providing us with valuable feedback which we plan to incorporate in subsequent iterations of the course” … or something like that.

Success is not an idea to be resisted. Rather, the idea of success merely affirms the fact that our activities are intentioned. Can success be measured? Again, it is important to be clear about what we mean by ‘measurement.’ If by measurement we mean ‘quantified,’ then I wholeheartedly agree with Matt that there are some forms of success that cannot be measured. Looking again to the list of connectivist values above, values like ingenuity and personal responsibility are goals of teaching practice. A connectivist would agree that a student who has embraced those values in thought and in action is a successful student (and will also be more likely to survive and thrive in the 21st century), but the achievement of each of these goals is difficult to measure. In fact, the attempt to measure a student’s ability in these areas may actually be antithetical to student achievement. If, on the other hand, we view measurement more broadly to refer simply to the comparison of something to a particular standard, then there is absolutely no reason why success cannot be measured. Indeed, judging the success of an activity is always a result of measurement in this sense, since it is only by comparing the performance of an activity against a particular standard of excellence that a performance may be judged at all.

3. DALMOOC as a Dinglehopper

I would like to conclude this post by commenting on the idea that success is in the eye of the beholder. Matt concludes his blog post by asking the reader to “go look at the participant output itself and see if that looks like success to you.” In this remark, it seems to me that Matt has confused two perspectives: (1) the producer perspective, and (2) the consumer perspective. From the perspective of the producer, the success of the DALMOOC is a function of the extent to which its instructional design was seen to facilitate the achievement of clearly stated course objectives. From the perspective of the consumer, however, the goals of the course design are largely irrelevant. The consumer comes to a course as an artifact, and brings to it a set of personal values, goals, and expectations. What this means is that it is entirely possible for a course to be successful according to its own design objectives, while at the same time producing an experience that is perceived as unsuccessful by some of its students. It is likewise possible for a course to be successful by its own standard, but be perceived as unsuccessful when assessed by others who do not share the set of values that governed the course’s design (e.g. those for whom completion rates, enrollment numbers, and earned certificates are really all that is important).

The success of a course is something that can only be judged relative to the goals and objectives that motivated its production. I think here of the dinglehopper. For those who produced the dinglehopper, its function is that of a fork…because it is a fork. Its design is successful to the extent that it aids in the transportation of food from plate to mouth. For Scuttle the seagull, the dinglehopper is successful because it helps him to style his feathers. We see success in both cases. The latter is a successful experience, but only in the former do we see a successful object.

To conclude, then, questions about the success of the DALMOOC are questions about the values and objectives of the course designers, and the extent to which the implementation of the course design saw the expected results. To “shrug my shoulders and say ‘beats me'” is a terrible response, because it implies either a lack of awareness about, or reflection on, the DALMOOC objectives, or else a position that would put so little stock in those objectives that their achievement is not worth evaluating. From other accounts that have been written, and from my limited interaction with Matt, it is abundantly clear that neither is correct. Matt Crosslin and the other members of the DALMOOC design team have clearly done a great deal of work designing and implementing the course, and with a constant view to the course’s success. At the end of the day, the main issue with Matt’s post is that it fails to do justice to the work that he and the rest of the team have done. It is unfortunately flippant, and lamentably dismissive of many things both related and unrelated to the DALMOOC and its evaluation.

My goal in writing this piece has been to clarify several concepts while also coming to the defense of a project that is laudable in its goals, ambitious in its implementation, and (I suspect) successful in ways that will positively inform the design and effectiveness of future MOOCs. In the course of writing this piece, I have also had the great opportunity to learn more about the DALMOOC, and even more about connectivism (a term that I had heard, but about which I knew nothing). I am grateful to Matt for his blog post — which functioned as an artifact and as an opportunity to explore these subjects — and also for his willingness to engage me in a conversation motivated by the spirit of philosophical charity and for the sake of the community as a whole.

As always, comments are welcome.

This Week in Learning Analytics: Privacy and Ethics


Another set of ethical issues raised this week involves the intersection of analytics and the humanities. Joshua Kim sparked a conversation about the place of analytics in the liberal arts. In the discussion following Kim’s post, greatest attention was paid to issues of definition: What is ‘assessment’? What are the ‘Liberal Arts’? (Mike Sharkey, for example, suggests that the liberal arts simply imply “small classes and a high-touch environment,” and argues that analytics offers very little value in such contexts. Timothy Harfield argues that the liberal arts provide a critical perspective on analytics, and are crucial to ensuring that educational institutions are learning-driven rather than data-driven). Lastly, in an article for Educause Review Online, James E. Willis discusses the failure of ethical discussions in learning analytics, and offers an ethical framework that highlights some of the complexities involved in the debate. He categorizes ethical questions in terms of three distinct philosophical perspectives, what he calls “Moral Utopianism,” “Moral Ambiguity,” and “Moral Nihilism.” The framework itself is at once overly pedantic and lacking in the clarity and sophistication that one would expect from a piece with tacit claims to a foundation in the history of philosophy, but it nevertheless represents an interesting attempt to push the debate outside of the more comfortable legal questions that most often frame conversations about data and privacy.

Recent Blog Posts

Featured Articles

Privacy, Anonymity, and Big Data in the Social Sciences
Jon P. Daries, Justin Reich, Jim Waldo, Elise M. Young, Jonathan Whittinghill, Daniel Thomas Seaton, Andrew Dean Ho, Isaac Chuang

Open data has tremendous potential for science, but, in human subjects research, there is a tension between privacy and releasing high-quality open data. Federal law governing student privacy and the release of student records suggests that anonymizing student data protects student privacy. Guided by this standard, we de-identified and released a data set from 16 MOOCs (massive open online courses) from MITx and HarvardX on the edX platform. In this article, we show that these and other de-identification procedures necessitate changes to data sets that threaten replication and extension of baseline analyses. To balance student privacy and the benefits of open data, we suggest focusing on protecting privacy without anonymizing data by instead expanding policies that compel researchers to uphold the privacy of the subjects in open data sets. If we want to have high-quality social science research and also protect the privacy of human subjects, we must eventually have trust in researchers. Otherwise, we’ll always have the strict tradeoff between anonymity and science illustrated here.

Using Learning Analytics to Analyze Writing Skills of Students: A Case Study in a Technological Common Core Curriculum Course
Chi-Un Lei, Ka Lok Man, and T. O. Ting

Pedagogy with learning analytics is shown to facilitate the teaching-learning process through analyzing student’s behaviours. In this paper, we explored the possibility of using learning analytics tools Coh-Metrix and Lightside for analyzing and improving writing skills of students in a technological common core curriculum course. In this study, we i) investigated linguistic characteristics of student’s essays, and ii) applied a machine learning algorithm for giving instant sketch feedback to students. Results illustrated the necessity of improving student’s writing skills in their university learning through e-learning technologies, so that students can effectively circulate their ideas to the public in the future.

Calls for Papers

CALL FOR CHAPTERS: Developing Effective Educational Experiences through Learning Analytics
Edge Hill University Press (ABSTRACT SUBMISSION DEADLINE: 15 September 2014)

CALL FOR PAPERS: 5th International Learning Analytics and Knowledge (LAK) Conference
Marist College (Poughkeepsie, NY) | 16-20 March 2015 (SUBMISSION DEADLINE: 14 October 2014)

Recommended Resources

swirl: Learn R, in R
swirl teaches you R programming and data science interactively, at your own pace, and right in the R console!

Upcoming Events

6-9 October 2014
Learning Analytics Week
École polytechnique fédérale de Lausanne

15 October 2014
ALE Speaker Series: Charles Dziuban on Engaging Students in an Engaging Educational Environment

Emory University (Streaming Available)

20 October 2014
Data, Analytics and Learning: An introduction to the logic and methods of analysis of data to improve teaching and learning

University of Texas at Arlington | edX