What’s Legal isn’t Always Ethical: Learning Analytics and Article 2.5

by Kathleen Reed, Assessment and Data Librarian, Instructor in the Department of Women’s Studies, and VP of the Faculty Association at Vancouver Island University

Recently I met with members of an institutional unit outside the library who are working on building a predictive learning analytics system to identify at-risk students so that interventions can be made. The desired model is expansive, pulling in many points of data. The group wanted access to library user records so they could match student ID numbers with library account activations, check-outs, and physical usage of library space. At my institution, like many others, students agree upon entering the university to have their institutional records accessed for internal research. But do students actually know what they’re agreeing to when they click the “accept” button? How many of us actually read the fine print when we sign up for services, be they universities or social media platforms? While technically the students may have given consent, I walk away from these meetings feeling like I need a shower, and questioning why so many people in education are uncritically hopping on the learning analytics train.

Librarians’ professional ethics mandate us to resist the panopticon-style student information systems that many post-secondary institutions are building in the name of “student success,” and that come baked into learning management systems like D2L and Moodle. The American Library Association has clear policies on Privacy and on the Confidentiality of Personally Identifiable Information about Library Users. The ALA’s Code of Professional Ethics states, “We protect each library user’s right to privacy and confidentiality with respect to information sought or received and resources consulted, borrowed, acquired or transmitted” (ALA, Professional Ethics). Plenty of librarians have been talking about libraries and learning analytics; Zoe Fisher has a nice summary on her blog.

Don’t get me wrong – I’m not saying that learning analytics don’t have a place in education. But that place should be driven by the informed consent of the people whose data are being analyzed. To me, burying fine print in a long, legalistic document doesn’t count as “informed consent” (and IMHO it also doesn’t stand up to strict British Columbia privacy laws, but that’s a debate for another time). Students need to be told exactly which data are being accessed, and for what purpose. Right now, at least at my place of work, they’re not. I’m teaching in Gender Studies this term, using the learning management system D2L. When I mentioned to students that I don’t look at the learning analytics built into the instructor view in D2L, most were shocked that professors could look up when and what course information students were accessing.

I sit in Learning Analytics meetings and think, “if only we were subject to the Research Ethics Board (REB)…” Despite my early-career eye-rolling at some of the hoops I’ve had to jump through for REB approvals, I’ve come to appreciate these bodies as a valued voice in keeping researchers within solid ethical boundaries. REBs make us stop and think about the choices we make in our research design, and about the rights of participants. Because REBs serve this function, the research that passes through them is frequently held to a high ethical standard.

Contrast this with internal work that requires neither REB approval nor any training on ethics or privacy. Much of this work is done under the guise of Article 2.5 of the Tri-Council Policy Statement: Ethical Conduct for Research Involving Humans (2014), which exempts quality assurance and other institutional activities from Research Ethics Board review. Article 2.5 states:

“Article 2.5 Quality assurance and quality improvement studies, program evaluation activities, and performance reviews, or testing within normal educational requirements when used exclusively for assessment, management or improvement purposes, do not constitute research for the purposes of this Policy, and do not fall within the scope of REB review.

Application: Article 2.5 refers to assessments of the performance of an organization or its employees or students, within the mandate of the organization, or according to the terms and conditions of employment or training. Those activities are normally administered in the ordinary course of the operation of an organization where participation is required, for example, as a condition of employment in the case of staff performance reviews, or an evaluation in the course of academic or professional training. Other examples include student course evaluations, or data collection for internal or external organizational reports. Such activities do not normally follow the consent procedures outlined in this Policy.”

What this means is that most of the assessment work done in the library – unless it becomes research for articles or conference presentations later on – is not subject to REB review. The same goes for folks who are building learning analytics tools or monitoring the progress of students within the institution. From what I’ve witnessed, the projects that fall under Article 2.5 occupy some of the most ethically fraught ground in post-secondary education. I’m not arguing that everyone who does internal work should have to go through a full REB approval process. But they should have some training on ethics and privacy. Perhaps there should be the equivalent of a Research Ethics Officer for investigations that fall under Article 2.5, to help ensure that internal work is held to the same high ethical standard as research.

The guidance that REBs give, and the mindset they train us to adopt, are valuable and should be more widespread in post-secondary institutions, regardless of whether a project falls inside or outside of Article 2.5.

REBs, I take back every shady comment and eye roll I ever threw your way.

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Gathering Evidence by Asking Library Users about Memorable Experiences

by Kathleen Reed
Assessment and Data Librarian, Vancouver Island University

For this week’s blog, I thought I’d share a specific question to ask library users that’s proving itself highly useful, but that I haven’t seen used much before in library assessment:

“Tell me about a memorable time in the library.”

Working with colleagues Cameron Hoffman-McGaw and Meg Ecclestone, I first used this question during the in-person interview phase of an ongoing study on information literacy (IL) practices in academic library spaces. In response, participants gave detailed accounts of studying with friends, of moments that increased or decreased their stress levels, and of the 24/7 Learning Commons environment – a world that librarians at my place of work see very infrequently, as the library proper is closed after 10pm. The main theme running through the answers was the importance of the supportive social networks that form and are maintained in the library.

The question was so successful in the qualitative phase of our IL study that I was curious how it might translate to another project – a major library survey to be sent to all campus library users in March 2016. Here’s the text of the survey question we used:

“Tell us about a memorable time in the library. It might be something that you were involved in, or that you witnessed. It might be a positive or negative experience.”

It wasn’t a required question; people were free to skip it. But 47% of survey takers (404 of 851) answered it, with responses ranging in length from a sentence to several paragraphs. While analysis of the data generated by this question isn’t complete, some obvious themes jump out. Library users wrote about how library services and spaces both ease and cause anxiety and stress, the importance of social connections and the accompanying support received in our spaces, the role of the physical environment, and the value placed on the library as a space where diverse people can be encountered, among many other topics.
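
If you’re wondering how to make a first pass over hundreds of open-ended responses, here’s a minimal sketch in Python, assuming the responses were exported to a CSV; the file name, column name, and keyword lists are hypothetical stand-ins for a real qualitative codebook:

```python
# First-pass triage of open-ended survey responses by keyword.
# "library_survey_2016.csv" and the "memorable_time" column are
# hypothetical; real theme coding still needs human readers.
import pandas as pd

responses = pd.read_csv("library_survey_2016.csv")

themes = {
    "stress": ["stress", "anxiety", "anxious", "overwhelmed", "calm"],
    "social": ["friend", "group", "together", "community"],
    "space": ["quiet", "noise", "crowded", "chairs", "lighting"],
}

for theme, keywords in themes.items():
    pattern = "|".join(keywords)
    responses[theme] = (
        responses["memorable_time"]
        .fillna("")  # skipped questions become empty strings
        .str.contains(pattern, case=False, regex=True)
    )

# Rough counts of how often each theme surfaces.
print(responses[list(themes)].sum())
```

Keyword matching like this only sorts responses for human coders to read; it’s no substitute for the actual qualitative analysis.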

To what end are we using data from this question? First, we’re doing the usual analysis – looking at the negative experiences and emotions users expressed and evaluating whether changes need to be made, policies created, etc. Second, the question helped surface some of the intangible benefits of the library that we hadn’t spent much time considering (emotional support networks, the library’s importance as a central place on campus where diverse groups interact). Now librarians are able to articulate a wider range of these benefits – backed up with evidence in the form of answers to the “memorable time” question – which helps when advocating for the library on campus and when connecting to key points in our Academic Plan.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Library Assessment Data Management ≠ Post-It Notes

by Kathleen Reed
Assessment and Data Librarian, Vancouver Island University

Over the last few years, data librarians have become increasingly focused on data management planning, as major funders and journals insist that researchers have data management plans (DMPs) in place. A DMP is a document that outlines how data will be taken care of over their life cycle. Lately I’ve spent a lot of time thinking about how my data service portfolio dovetails with library assessment activities. A lot of discussion in the library evidence-based practice community is about obtaining and analyzing data and stats, with little emphasis on the stewardship of that information. Recently I gave a talk reflecting on data management for assessment activities at a workshop put on by the Council of Post-Secondary Library Directors. This post introduces some of the steps in working out a DMP. Please note that while DMPs usually refer to data, not statistics, in my world the ‘D’ stands for both.

Step 1: Take an Inventory
You can’t manage what you don’t know about! Spend some time identifying all your sources of evidence and the formats they’re in. I like to group by theme – reference, instruction, e-resources, etc. While you’re doing this, it’s also helpful to reflect on whether the data you’re collecting are meeting your needs. Collecting something you don’t need? Ditch it. Not getting the evidence you need? Figure out how to collect it. Also think about the format in which the data arrive. Are you downloading a PDF that makes your life miserable when what you need is a .csv file? See if that option is available.
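
A small script can help with the stocktaking. Here’s a minimal sketch, assuming the evidence lives in theme-named folders on a shared drive; the root path is a placeholder, not a real location:

```python
# Walk a shared drive and tally evidence by theme folder and file format.
# ROOT is a hypothetical path; point it at your own stats directory.
import os
from collections import Counter

ROOT = r"S:\Library\Assessment"

inventory = Counter()
for dirpath, _, filenames in os.walk(ROOT):
    # Treat the top-level folder as the theme (reference, instruction, etc.).
    theme = os.path.relpath(dirpath, ROOT).split(os.sep)[0]
    for name in filenames:
        ext = os.path.splitext(name)[1].lower() or "(no extension)"
        inventory[(theme, ext)] += 1

# A PDF-heavy theme where you need .csv files is your cue to revisit the source.
for (theme, ext), count in sorted(inventory.items()):
    print(f"{theme:20} {ext:15} {count}")
```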

Step 2: The ‘Hit by the Bus’ Rule (aka Documentation)
If you’re going to archive assessment data, you need to give the data some context. I like to think of this as the ‘hit by a bus’ rule: if a bus hits me tomorrow, will one of my colleagues be able to step into my job and carry on with minimal problems? What documentation would someone else need to understand, collect, and use your data? Every single year when I’m working on stats for external bodies, I have to pull out my notes to see how I calculated various numbers in previous years. This is documentation that should be stored in a safe yet accessible place.
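
One low-effort way to build the habit is a script that drops a README template beside each dataset, so the context gets written down while it’s fresh. A sketch, with hypothetical field and file names:

```python
# Create a skeleton README next to a data file if one doesn't exist yet.
# The template fields are suggestions; adapt them to your own stats.
from datetime import date
from pathlib import Path

TEMPLATE = """\
Dataset: {name}
Maintainer: <who collects and updates this>
Source: <vendor report, gate counts, LibQUAL+, ...>
Collection: <frequency and method>
Calculations: <how derived numbers were computed, step by step>
Created: {today}
"""

def add_readme(data_file: str) -> None:
    path = Path(data_file)
    readme = path.with_name(path.stem + "_README.txt")
    if not readme.exists():  # never clobber existing documentation
        readme.write_text(TEMPLATE.format(name=path.name, today=date.today()))

add_readme("reference_stats_2016.csv")  # hypothetical file
```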

Post-its don’t count as ‘documentation.’ Neither does a random sheet of paper in a towering stack on your desk. Photo by Wade Morgan

Step 3: Storage and Backup
You’ve figured out what evidence and accompanying documentation you need to archive. Now what are you actually going to do with it? For routine data and stats, at my institution we use a shared drive. Within the shared drive I’ve built a simple website that sits on top of all the individual data files; instead of having to scroll through hundreds of file names, users can just click links that are nicely divided by theme, year, vendor, etc. IT backs up the shared drive, as do I on an external hard drive. If your institution has access to a Dataverse hosted on a Canadian server, that’s a good option too.
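
The ‘simple website’ doesn’t need to be fancy. Here’s a rough sketch of generating a static index page over theme folders – placeholder paths throughout, not my actual setup:

```python
# Build a static index.html with links grouped by top-level theme folder,
# so no one scrolls through hundreds of raw file names. ROOT is hypothetical.
import html
import os

ROOT = r"S:\Library\Assessment"

sections = []
for theme in sorted(os.listdir(ROOT)):
    theme_dir = os.path.join(ROOT, theme)
    if not os.path.isdir(theme_dir):
        continue  # skip loose files at the top level
    links = "".join(
        f'<li><a href="{html.escape(theme + "/" + name)}">{html.escape(name)}</a></li>'
        for name in sorted(os.listdir(theme_dir))
    )
    sections.append(f"<h2>{html.escape(theme)}</h2><ul>{links}</ul>")

with open(os.path.join(ROOT, "index.html"), "w", encoding="utf-8") as out:
    out.write("<html><body><h1>Assessment Data</h1>" + "".join(sections) + "</body></html>")
```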

Step 4: Preservation
For key documents, you might consider preservation in a broader university repository. LibQUAL+ results, documentation, and raw data are currently being archived via my institution’s instance of DSpace.

Step 5: Sharing
I always felt like a hypocrite imploring researchers to make their data open while I squirreled library data away on closed-access shared drives. Starting with LibQUAL+, this past year I’ve tried to make as much library data open as possible. This wasn’t just a matter of uploading files to our DSpace; it also involved anonymizing the data to ensure no one was identifiable.
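
What anonymizing looks like will vary by dataset, but as a rough sketch: drop direct identifiers, and pseudonymize any key you still need for linking rows. All column and file names here are hypothetical:

```python
# Strip identifiers before a dataset goes into the open repository.
# Salted hashes keep rows linkable within the file but not to a person.
import hashlib
import pandas as pd

SALT = "replace-with-a-secret-salt"  # keep the salt out of the shared file

df = pd.read_csv("libqual_raw.csv")  # hypothetical export

# 1. Remove direct identifiers entirely.
df = df.drop(columns=["name", "email", "student_id"], errors="ignore")

# 2. Pseudonymize the user key if internal linking is still needed.
if "user_id" in df.columns:
    df["user_id"] = df["user_id"].astype(str).map(
        lambda u: hashlib.sha256((SALT + u).encode()).hexdigest()[:12]
    )

df.to_csv("libqual_open.csv", index=False)
```

Even then, check that rare combinations of the remaining fields (say, program plus year plus status) can’t re-identify anyone.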

If you’re going to share data with your university community and/or the general public, keep in mind that you’ll need to identify this right away, when you’re designing your evidence-collection strategies. For example, if you’re doing a survey, participants need to be informed that their responses will be made available. Ethics boards will also want to know if you’re doing research with humans (monkeys too, but if you’ve got monkeys in your library, you’ve got bigger problems than figuring out a DMP…).

Perhaps the most important aspect of sharing is setting context around the stats and data that go out into the wild. If you’re going to post information, make sure there’s a story around it that explains what viewers are seeing. For example, there’s a very good reason that most institutions score below expectations in the “Information Control” category on LibQUAL+: there isn’t a library search tool that’s as good as Google, which is what our users expect. Adding context that explains that poor scores are part of a wider trend in libraries – and why that trend is happening – will help people understand that your library isn’t necessarily doing a bad job compared to other libraries.

Want more info on data management planning? Here are a few good resources:

DMPTool by the California Digital Library

VIU’s Data Management Guide

Research Data Management @ UBC

What are your thoughts about library assessment data and DMPs? Does your institution have a DMP for assessment data? If so, how does your institution keep assessment data and stats safe? Let’s keep the conversation going in the comments below, or contact me at kathleen.reed@viu.ca or on Twitter @kathleenreed.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.