An important and fruitful area of discussion in learning analytics involves embedded student dashboards, which are most commonly sold and promoted as tools for leveraging peer pressure to increase student success (UMBC’s Check My Activity tool is a well-known example). In my experience with a similar tool over the last year, however, it has become abundantly clear that not all students respond to analytics in the same way. In two separate classes, instructors who piloted the tool found that otherwise high-performing students saw decreases in academic performance as a consequence of a kind of ‘gaming’ behavior: not intentional, but the result of confusing proxies (course accesses, minutes in course, interactions, and so on) with learning outcomes. Others have observed similarly negative results among poor performers, whose motivation drops after an ‘objective’ display of their performance relative to peers.

This is not a matter of learning styles, but it does point to the fact that students differ, and differ in such a way that we cannot expect them all to react identically in common learning environments. The task of the teacher, then, would seem to involve communicative strategies that mitigate damaging effects while enhancing positive ones. The worst thing an instructor can do with any educational technology is to “set it and forget it,” expecting it to achieve some glorious effect without support from good pedagogy and good teaching.
In other words, educational technology is not a rotisserie oven.