Tracking (Un)belonging: At the Intersections of Human-algorithmic Student Support

Corporate Author
Commonwealth of Learning (COL)

Central to this paper are the questions: How can open, distance and distributed learning use algorithmic decision-making systems to identify and respond to students who may experience anomie and disengage from their studies before dropping out? What are the dangers, benefits and ethical considerations of using such systems?

The extent to which students become assimilated into the institutional, pedagogical and disciplinary culture was, and continues to be, seen as key to understanding student success or failure. Research shows that students who do not ‘fit’, experience anomie, or have problems with assimilation have a greater probability of disengaging, failing or dropping out. Anomie or (un)belonging in open, distance and distributed learning environments may manifest in various ways, such as, but not limited to, the non-submission of assignments or tasks and absence from online discussion forums. In courses with large student enrolments, noticing these disengagements or evidence of (un)belonging is often difficult, and clarifying the reasons underlying this behaviour, as well as responding to the data, is almost humanly impossible. So, how do we identify these students? What data and systems will allow us to enter into a conversation with them, to explore ways of making them feel at home, whether in a disciplinary or delivery (online, blended or distance) context?

Higher education has always collected, measured, analysed and used student data for a variety of purposes, such as operational planning, reporting to a variety of stakeholders, allocation of resources, and student support. Access to increasing volumes of ever more nuanced and granular data, drawn from a variety of disparate sources, opens opportunities to proactively identify students who seem to have disengaged or who may experience feelings of anomie.
Despite well-documented concerns about, inter alia, the role of bias, the lack of accountability and the need for regulation, algorithmic decision-making systems offer huge potential to respond to evidence that students may be experiencing issues of (un)belonging. In this conceptual paper, I situate the potential and dangers of algorithmic decision-making systems in the context of the need to make the most of evidence that students may feel that they do not belong, or have difficulty in belonging.

Paper ID 87
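The abstract mentions observable disengagement signals such as non-submission of assignments and absence from online discussion forums. As a minimal illustration only (the paper proposes no specific implementation; all names, fields and thresholds below are hypothetical assumptions), a simple rule-based flagging pass over such signals might look like:

```python
from dataclasses import dataclass

# Hypothetical record of a student's observable activity in a course.
@dataclass
class StudentActivity:
    student_id: str
    assignments_due: int
    assignments_submitted: int
    days_since_last_forum_post: int

def flag_possible_disengagement(records, max_missed=2, max_silent_days=14):
    """Return IDs of students whose activity matches simple disengagement
    signals: missed assignments or prolonged forum absence.
    Thresholds are illustrative, not empirically derived."""
    flagged = []
    for r in records:
        missed = r.assignments_due - r.assignments_submitted
        if missed >= max_missed or r.days_since_last_forum_post >= max_silent_days:
            flagged.append(r.student_id)
    return flagged

cohort = [
    StudentActivity("s001", 4, 4, 3),    # active on both signals
    StudentActivity("s002", 4, 1, 21),   # missed work and long forum silence
    StudentActivity("s003", 4, 3, 16),   # quiet on forums only
]
print(flag_possible_disengagement(cohort))  # → ['s002', 's003']
```

A flag produced this way is only a prompt for a human conversation with the student, not a verdict, which is precisely where the paper's concerns about bias, accountability and ethics apply.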

Student Support Services, Open and Distance Learning (ODL), Education Data