Signs of the apocalypse and learner dashboards

A sunny Monday morning in Edinburgh (one of the signs of the apocalypse) comes with the good news that a proposal to run a half-day workshop at the European Conference on Technology Enhanced Learning has been accepted.

Student-facing Learning Analytics – Principles for Design and Evaluation

The workshop aims to explore, through discussion, what a set of principles for student-facing learning analytics might look like in the context of higher and further education institutions. We will consider learning analytics systems designed to provide feedback to students on their engagement, progress, and likely success. The workshop will explore and critique existing examples and scholarship with a view to developing principles which can be applied to both the design and evaluation of student-facing learning analytics software, which improve on current visions for such systems, and which promote desirable qualities: being ethically defensible, effective at improving student outcomes, focussed on developing “graduate attributes”, aligned with regional or disciplinary norms, and so on.

The motivation for the workshop is a concern that conceptions of what student-facing learning analytics should be, on both the developer/supplier and user/adopter sides, are unsophisticated and either not evidence-based or run counter to evidence and theory from education and learning science research.

The objective of the workshop is to identify principles which will influence the future, based on a distillation of wisdom and experience from TEL research and practice perspectives. To this end, it will emphasise principles that are actionable by adopters, and where this is not possible, we will document outstanding research questions.

I will be working with a fine team – and much credit goes to Adam Cooper for organising us to get this submission in on time – basically he did all the work.

This work is very timely. I’ve been running some workshops recently to demo OnTask to various academic teams, and they have opened up some good discussions about how we identify the right points in our course learning design process at which to deliver coaching feedback, and how this kind of feedback fits within the larger set of communications a student receives on a course.

These workshops have also raised the question of why we don’t show students their own data, something I hear quite often. I think there is a big difference between being transparent about the data we collect about students* and building student-facing data dashboards and other such things, and I worry that the two are often conflated. Data as oil is an apt metaphor (including the environmental consequences of spillage inherent in it) because data is generally crude and unrefined. The expectation that students can easily interpret it may be very wrong. Some recent SRHE-funded research published by Dr Liz Bennett at Huddersfield bears out much of this:

“The findings show that the way that learner dashboards are currently being designed needs to be refined. Currently the dominant theoretical model that underpins the design of most Learner Dashboards is students’ self-regulated learning (Jivet et al. 2017).

However, the findings suggest that it would be valuable to think of dashboards as socio-material assemblages and that this would enable the messiness of the learning process, the complexity of individual dispositions and variety of contexts to be more completely represented.” (Students’ learning responses to receiving dashboard data)

Data dashboards are a form of feedback, and a recent paper from David Carless and David Boud emphasises more generally the need to invest in developing student feedback literacy. Poor understanding of what to do with feedback remains a significant barrier to feedback being effective.

“Sutton (Sutton, P. 2012. “Conceptualizing Feedback Literacy: Knowing, Being, and Acting.” Innovations in Education and Teaching International 49 (1): 31–40. doi:10.1080/14703297.2012.647781) put the notion of feedback literacy on the agenda from an academic literacies perspective and defined it as the ability to read, interpret and use written feedback. We extend this useful starting-point by defining student feedback literacy as the understandings, capacities and dispositions needed to make sense of information and use it to enhance work or learning strategies. Students’ feedback literacy involves an understanding of what feedback is and how it can be managed effectively; capacities and dispositions to make productive use of feedback; and appreciation of the roles of teachers and themselves in these processes. One of the main barriers to effective feedback is generally low levels of student feedback literacy.” (The development of student feedback literacy: enabling uptake of feedback)

Our own Learning Analytics Principles and Purposes includes Principle 6:

“The introduction of learning analytics systems will be supported by focused staff and student development activities to build our institutional capacity;” (Learning Analytics Principles and Purposes)

I’m looking forward to working with our academic teams on developing feedback exemplars for OnTask, and also to working with sector colleagues to develop guidelines and principles for student-facing data presentations. It seems to me that this speaks directly to conversations I had at LAK18 about taking back control of the narrative around learning analytics in operational contexts: widening the understanding that it can be more than predictive analytics, and grounding it more firmly in the research.

———-

* We definitely should do this btw – and in Europe the GDPR mandates it, along with being clear about what the data will be used for and the legal basis for holding it. Much of the data used for learning analytics purposes is data we have been collecting for years for one primary purpose (often system support or audit trails) that is now being put to a secondary use. We also need to draw finer distinctions between the collection of data and the purposes for which it is used.

2 thoughts on “Signs of the apocalypse and learner dashboards”

  1. This workshop looks awesome. I just graduated with my PhD from Brigham Young University in the Instructional Psychology and Technology program. I have previously worked with Katrien Verbert (KU Leuven), and I work on learning analytics dashboard research. We have published a few papers together on learner dashboards. Unfortunately, I will not be attending EC-TEL. Will any of the sessions from the workshop be recorded and posted online?

    1. Hi Robert – I don’t know whether the workshop itself will be recorded. Given it’s likely to be very discursive, I’m not sure how well it would work in practice. However, it is our intention to publish the output of the workshop.

      “The final outcome of the workshop will be a concise document which seeks to faithfully represent the union of the principles identified during the workshop, indicating which ideas resonated well across all the groups, which were seen as being most important, and which may be the subject of contention. This will be published online with a Creative Commons licence.”

      We’re just in the process of knocking together a site to support the workshop, so follow me on Twitter (@ammienoot) and I’ll tweet it out when it’s available.
