Journal Club: Analyzing Learning Analytics

By Candice Dahl,
Learning Services Librarian, University of Saskatchewan

The topic of October’s C-EBLIP journal club at the University of Saskatchewan (USask) library was learning analytics in libraries, based on a blog post by April Hathcock and a YouTube video by Adam Beauchamp and Mallary Rawls of Florida State University. In the video, Beauchamp and Rawls define learning analytics as “the measurement, collection, analysis and reporting of data about learners in their contexts, for purposes of understanding and optimizing learning and the environment in which it occurs.” The use of learning analytics in libraries has been the subject of much critical discussion. While some champion learning analytics, practitioners of critical librarianship and others raise noteworthy criticisms of the movement.

One argument against the use of learning analytics rests on the core library value of privacy. Beauchamp and Rawls argue that the use of learning analytics breaches student privacy and is therefore unethical: students are typically not aware of the data being collected about them or how it is being used. April Hathcock (2018) opposes learning analytics on the basis that “metrics and assessment are for property and animals, and people considered property and animals,” while Rawls notes that our work and value in libraries and universities should not be quantified. At a time when education is being corporatized and library value is tied to measurable metrics, Hathcock proposes that we resist learning analytics by focusing instead on facilitating student agency – informing students of their institution’s practices, and educating and empowering them to resist or to participate in an informed way.

Beauchamp and Rawls also argue that learning analytics do not document learning but rather demonstrate students’ procedural compliance: they measure how well students follow our (library) procedures, as a way of proving our value to our institutions, rather than measuring what or how well students learn. Closely tied to the issue of compliance is the notion of power. Through surveillance and bureaucracy, learning analytics research functions as an exercise of power, represents a colonial approach to research, and overlooks the humanity of our students by turning them into data points (Rawls).

Our discussion centered on the need to develop clear goals for any investigation into optimizing student learning, and to ensure that we do not end up using learning analytics to measure ourselves and our performance (as a library) rather than student learning. We thought of many questions to ask ourselves before getting involved in such an undertaking: What could we learn about students’ interactions with libraries that would be helpful to us? How do we ensure that the data we collect are meaningful? And how committed are we to actually making positive changes in response to meaningful data?

We also wondered about ways to impact student learning in more personalized and human-centric ways. Can our face-to-face interactions with students achieve results similar to interventions based on learning analytics? Could we have a positive impact on the students we don’t see in libraries by using learning analytics? And, based on the work of Rawls, we considered how reliance on learning analytics could impact students from underrepresented and minority groups, who may already be wary of additional surveillance and who may use our spaces and services in very different ways. Will learning analytics help us discover useful interventions for these students, or reinforce inaccurate perceptions?

Overall, we concluded that libraries should consider how to define their value in non-corporatized, non-colonial ways, and how to engage students ethically and sensitively in optimizing their learning environments. Ultimately, the decision whether or not to employ learning analytics in libraries should be guided by our profession’s values and ethics, not just our desire for information.

References

Beauchamp, Adam, and Mallary Rawls. “In Search of a Just and Responsible Culture of Assessment.” YouTube, uploaded by Beauchamp and Rawls, 31 Aug. 2020, https://www.youtube.com/watch?v=t6f_1QvAu_A&feature=youtu.be.

Hathcock, April. “Learning Agency, Not Analytics.” At the Intersection, 24 Jan. 2018, https://aprilhathcock.wordpress.com/2018/01/24/learning-agency-not-analytics/. Accessed 24 Oct. 2020.

What’s Legal isn’t Always Ethical: Learning Analytics and Article 2.5

By Kathleen Reed, Assessment and Data Librarian, Instructor in the Department of Women’s Studies, and VP of the Faculty Association at Vancouver Island University

Recently I met with members of an institutional unit outside the library who are working on building a predictive learning analytics system to identify at-risk students so that interventions can be made. The desired model is expansive, pulling in many points of data. The group wanted access to library user records so that they could match student ID numbers with library account activations, check-outs, and physical use of library space. At my institution, like many others, students agree upon entering the university to have their institutional records accessed for internal research. But do students actually know what they’re agreeing to when they click the “accept” button? How many of us actually read the fine print when we access services, be they universities or social media platforms? While technically the students may have given consent, I walk away from these meetings feeling like I need a shower, and questioning why so many people in education are uncritically hopping on the learning analytics train.
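To make concrete what this kind of matching involves – and how little technical effort it takes once identified records are handed over – here is a minimal, purely hypothetical sketch in Python. The file names, column names, and the “at-risk” rule are all invented for illustration; they do not describe any real institutional system.

```python
import pandas as pd

# Hypothetical exports: a student roster and a library activity log,
# each keyed by student ID. All file and column names are invented.
students = pd.read_csv("student_records.csv")   # student_id, program, gpa, ...
library = pd.read_csv("library_activity.csv")   # student_id, checkouts, visits, ...

# A single join links each student's academic record to their library
# behaviour -- the matching the unit in this story was asking for.
merged = students.merge(library, on="student_id", how="left")

# Students with no library activity appear as missing values after the join.
merged[["checkouts", "visits"]] = merged[["checkouts", "visits"]].fillna(0)

# A toy "at-risk" flag of the sort predictive systems produce; the
# threshold is arbitrary and exists only to show the pattern.
merged["flagged_at_risk"] = (merged["gpa"] < 2.0) & (merged["visits"] == 0)

print(merged.loc[merged["flagged_at_risk"], ["student_id", "gpa", "visits"]])
```

The point of the sketch is its ordinariness: once identifiable library records leave the library, nothing technical stands between them and this kind of profiling – which is exactly why the decision has to be made on ethical rather than technical grounds.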

Librarians’ professional ethics mandate us to resist the panopticon-style student information systems being built by many post-secondary institutions in the name of “student success” – systems built into learning management systems like D2L and Moodle. The American Library Association has clear policies on Privacy and on the Confidentiality of Personally Identifiable Information about Library Users, and the ALA’s Code of Professional Ethics states: “We protect each library user’s right to privacy and confidentiality with respect to information sought or received and resources consulted, borrowed, acquired or transmitted” (ALA, Professional Ethics). Plenty of librarians have been talking about libraries and learning analytics; Zoe Fisher has a nice summary on her blog.

Don’t get me wrong – I’m not saying that learning analytics don’t have a place in education. But that place should be driven by the informed consent of the people whose data are being analyzed. To me, burying a clause in the fine print of a long legalistic document doesn’t count as “informed consent” (and IMHO it also doesn’t stand up to British Columbia’s strict privacy laws, but that’s a debate for another time). Students need to be told exactly which data are being accessed, and for what purpose. Right now, at least at my place of work, they’re not. I’m teaching in Gender Studies this term and using the learning management system D2L. When I mentioned to students that I don’t look at the learning analytics built into the instructor view in D2L, most were shocked that professors could look up when and what course information students were accessing.

I sit in Learning Analytics meetings and think, “if only we were subject to the Research Ethics Board (REB)…” Despite my early-career eye-rolling at some of the hoops I’ve had to jump through for REB approvals, I’ve come to appreciate these bodies as a valued voice in keeping researchers within solid ethical boundaries. REBs make us stop and think about the choices we make in our research design and about the rights of participants. Because REBs serve this function, the research they review is frequently held to a high ethical standard.

Contrast this with internal work that doesn’t require REB approval, or any training on ethics or privacy. Much of this work is done under the guise of Article 2.5 of the Tri-Council Policy Statement: Ethical Conduct for Research Involving Humans (2014), which exempts institutional research from Research Ethics Board approval. Article 2.5 states:

Article 2.5 Quality assurance and quality improvement studies, program evaluation activities, and performance reviews, or testing within normal educational requirements when used exclusively for assessment, management or improvement purposes, do not constitute research for the purposes of this Policy, and do not fall within the scope of REB review.

Application: Article 2.5 refers to assessments of the performance of an organization or its employees or students, within the mandate of the organization, or according to the terms and conditions of employment or training. Those activities are normally administered in the ordinary course of the operation of an organization where participation is required, for example, as a condition of employment in the case of staff performance reviews, or an evaluation in the course of academic or professional training. Other examples include student course evaluations, or data collection for internal or external organizational reports. Such activities do not normally follow the consent procedures outlined in this Policy.

What this means is that most of the assessment work done in the library – unless it’s research destined for articles or conference presentations later on – is not subject to REB review. The same goes for folks who are building learning analytics tools or monitoring the progress of students within the institution. From what I’ve witnessed, the projects that fall under Article 2.5 occupy some of the most ethically fraught ground in post-secondary education. I’m not arguing that everyone who does internal work should have to go through a full REB approval process, but they should have some training on ethics and privacy. Perhaps there should be the equivalent of a Research Ethics Officer for investigations that fall under Article 2.5, to help ensure that internal work is held to the same high ethical standard as research.

The guidance that REBs give, and the mindset they train us to adopt, are valuable and should be more widespread in post-secondary institutions, regardless of whether the work falls inside or outside Article 2.5.

REBs, I take back every shady comment and eye roll I ever threw your way.

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.