Demystifying the Research Ethics Board

By Cara Bradley
University of Regina Library

I have been a member of my institution’s Research Ethics Board (REB) since 2015 and it has turned out to be one of the most valuable and interesting service opportunities of my career. In that time, I have come to realize that many in academia (librarians, graduate students, and faculty) don’t have a clear understanding of the workings of the REB, and some have expressed sheer terror at the prospect of applying for ethical approval of research. I thought I’d share some tips that I’ve learned in my time on the REB, in hopes of making the application process a little smoother for those with less experience in this arena.

Upon joining the REB, I was given a lot of training materials to read. And I mean A LOT! Among these materials was an online tutorial program, the TCPS 2 Tutorial Course on Research Ethics (CORE). This self-paced course was created by the Government of Canada’s Panel on Research Ethics (PRE) to educate ethics board members and researchers about ethical expectations of research funded by the Government of Canada’s various funding programs.* CORE is widely used as a resource for ethics training by Canadian universities, and its standards are often applied even to research not funded by the federal government. The course is freely available for anyone to complete online and it is a great way for researchers to get really comfortable with research ethics requirements. It takes a couple of hours, but you get a certificate of completion (for the tenure/promotion file!) and it may well save you considerable time in preparing/revising your application.

The second phase of my REB training was co-reviewing applications with a mentor. I was paired with an experienced REB member and we started by reviewing a few applications together, after which we reviewed separately, comparing and discussing our reviews. I learned a great deal in this process, and it might be useful for new REB applicants to consider replicating it. Contact a colleague who has successfully obtained REB approval and ask to read their application. Ask them to read your application. I guarantee that they will point out errors or omissions, and correcting them before submission will save you time down the road and give you greater confidence in submitting your application.

Another piece of advice—don’t be scared of us! I know many new researchers feel very intimidated by the REB process. You shouldn’t! REB members, in my experience, are eager to help move your research forward. They have nothing to gain by being difficult or mean (and I am truly very sorry if you encounter one of these rare meanies). I see myself as a critical friend to the researcher, pointing out where things might go awry so that they can avoid pitfalls and succeed in their work. It is extremely rare for an application to be outright rejected—instead, you can expect comments indicating where you need to elaborate on or rethink an aspect of your work. This guidance is intended to strengthen your work and ensure that you don’t inadvertently put your research subjects at risk.

Researchers new to the ethical approval process often think of it as something that happens right at the beginning of their project. And while REB approval is necessary before you make any contact with prospective research participants, your research needs to be thoroughly planned before you can apply for REB approval. We want to know exactly what you plan to do—who you are going to contact, how you are going to do so (including the text of any recruitment emails/posters), and we also want to see your data gathering tools (survey/focus group/interview questions, etc.). Your project should be pretty much “shovel ready,” to the point that your planning is complete and you can start your research immediately upon receiving REB approval.

And finally—fill out every relevant section of the form, and be sure to check the web site of your own institutional ethics board for additional guidance on filling out their specific form. Most university research offices have created significant supplemental content to accompany their forms and ensure that the ethical approval process is a smooth one.

I highly recommend serving on your institution’s Research Ethics Board. It is a great way to learn more about research ethics, gain confidence in completing your own applications, contribute in a small way to the success of research at your institution (I positively beam when I see participant recruitment notices, publications, etc. arising from applications that I have reviewed), and gain a good sense of the range of research conducted on your campus.

* I suspect similar training materials are available in other jurisdictions, but I don’t know the details—let me know in the comments if you can provide information on other countries.

(Editor’s note: Brain-Work is hosted by the University of Saskatchewan and there is a problem with the comments that cannot be resolved. If you try to comment on this or any blog post and you get a “forbidden to comment” error message, please send your comment to virginia.wilson@usask.ca and I will post the comment on your behalf and alert the author. We apologize for this annoying problem.)

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Teaching scholarly communication skills to graduate students

by Chris Chan
Head of Information Services at Hong Kong Baptist University Library

Part 1: 7 May 2019

I am writing this first part of the blog post from my comfortable hotel room in downtown Minneapolis, where I have arrived ahead of the LOEX Annual Conference that will start at the end of the week. Navigating US Immigration at Chicago after the longest flight I have ever taken (15 hours!!) has taken its toll, but I am hoping that arriving a few days early will give me a chance to more or less recover from jet lag ahead of the event starting on the 9th.

The time will also allow me to put the finishing touches to my breakout session presentation. I’ll be talking about the efforts we have been making at HKBU Library to ensure our graduate students are equipped with the scholarly communication knowledge and skills that they will need to be successful researchers. For several years we have had required library workshops for our research students that covered the basics of scholarly publishing. These sessions also sought to raise awareness of current issues in scholarly communication, such as open access and altmetrics. Although student feedback has generally been positive, we found it challenging to design sessions that were suitable for both novice researchers and those graduate students who already had prior publication experience. We also wanted to better assess the extent to which students were achieving the learning outcomes of the session, as the results of relatively simple in-class exercises could tell us only so much.

Our new approach, launched this year, has been to adapt our workshop content into a modular online course. The course is designed so that students can skip content that they are already familiar with. To fulfill the new course requirement, students need to achieve a passing grade on a short online quiz assessing their knowledge of course content. In my presentation, I’ll be sharing the results from our first year of implementation. I’m also hoping to find out what approaches other institutions are taking, and to this end I’ll be using Mentimeter for the entire presentation. I’m a little nervous about having to rely on an online service, but fingers crossed that it runs smoothly. Another benefit is that I will be able to share the results in the second part of this blog post.

Part 2: 11 May 2019

All done! The conference was excellent – there were so many things that I will be bringing back to my own institution. As for my own presentation, everything went smoothly in technological terms. Mentimeter worked as advertised, and having the interactive segments seemed to help keep things interesting for the audience. Their responses were incorporated into the presentation material in real time. For example, the results for this question supported a point that I had seen in the literature – that this type of support for graduate students is often not formally assessed (the full question was “How is support for graduate student scholarly communication skill development assessed at your institution?”).

I also used the built-in quiz function in Mentimeter to showcase some of the questions that we use to assess student learning. Shout out to Marcela for winning!

You can view the full presentation (including the results of the audience voting) here: https://www.mentimeter.com/s/e1451a492dd1d3a21747448a6ff3ce70

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Breaking up with ResearchGate: Streamlining Scholarly Profile Online

by Kathleen Reed
Vancouver Island University

I’ve finally had it with ResearchGate. After what feels like the hundredth time the site emailed me to ask “Did your colleague [name] publish [article]?”, I’m through. These nagging emails are annoying, and asking me to report on my colleagues crosses a line. Beyond my annoyance with spam email, though, lies a deeper question that I’ve been pondering lately: what does a manageable, well-curated online scholarly profile look like?

You’d think I would have a good answer to this question for myself, being a librarian who leads sessions on this very question. But up to this point, my profile is a mishmash of full-text and indexed publications across multiple profile platforms. These include my institution’s digital repository, Twitter, ORCID, Google Scholar, ResearchGate, and Academia.edu. I make all my work open access, but not in one central place.

I tell myself that this scatter-shot approach has persisted at least partially because I demonstrate multiple sites for other researchers as part of my job, and I need to be familiar with them. And I worry that I’ll be splitting my readership stats if I publish in an OA journal, and then turn around and put my work up OA somewhere else. Mostly, though, keeping all of my profiles updated is a time-consuming task and just doesn’t happen. Thus, I have a series of half-completed and stale profiles online – not exactly the scholarly image I wish to project, and certainly not what I preach in my sessions on the subject.

During the upcoming year I’m off on leave to start a PhD, and a scholarly profile seems more important than ever before. Add to that the idea of not getting annoying ResearchGate emails, and I’m finding motivation to change my online profile. Yes, I know I can opt out of ResearchGate emails and still have a presence on the site. But the monetizing of public scholarship on private platforms bothers me. I don’t want to suggest that ResearchGate and Academia.edu are acceptable places to deposit OA versions – they’re not, according to the Tri-Agencies. So I’ve decided to focus on my institution’s IR, ORCID, and Google Scholar. Three places to update seems more manageable, and I like getting away from for-profit companies at least a little. See ya, ResearchGate.

How do you manage your scholarly profile online? If you feel like you’ve got a system that works, what does that look like? Please share in the comments below.

This article gives the views of the author and not necessarily the views the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

The Beauty of ORCIDs

by Kristin Hoffmann, University of Western Ontario

A few months ago, I published an article that Jane Schmidt had written for the Canadian Journal of Academic Librarianship special issue on diversity (I am one of CJAL’s co-editors). Before I knew it, Jane had tweeted about her article’s publication. I was surprised; had she been constantly refreshing the journal’s web site? How had she found out about it so quickly?

I got my answer in a follow-up tweet:
Screenshot of a tweet from Jane Schmidt that says “Also, shout out to @ORCID_Org for notifying me that it was published! Shazam!”

And that, librarian-authors, is the beauty of ORCID:

• When you have an ORCID ID,
• and you give it to a journal you are publishing with,*
• and the journal registers DOIs for the articles it publishes (and many journals do),

then, when the journal publishes your article,
• the DOI registration sends the information about your ORCID to CrossRef,
• and CrossRef sends information to ORCID about your new article,
• and ORCID lets you know that it has added a new publication to your profile.

This all happens seamlessly. It’s a great example of technologies talking to each other and making our researching and authoring lives easier.

As an editor, it was gratifying to see an author promote her publication online so soon after it was published. I’ve also used my ORCID ID as an author, and getting the notification from ORCID that my publication was added to my profile was a good ego boost—even an automated email can be affirming.

Other benefits of an ORCID profile include:
• Pulling together publications with different names or name variants (e.g., initials, full first names, different last names)
• Helping you keep your CV up to date
• Communicating information to and from funding agencies, if you apply for grants
• Helping you demonstrate the attention and reach of your publications, by connecting ORCID with tools such as ImpactStory

If you don’t have an ORCID ID, take 30 seconds to sign up for one at the ORCID site, https://orcid.org/.

And take another ten minutes or so to add your previous publications to your profile.

Then include your ORCID ID with the next article you submit, and when your article is published, you too can have a Shazam! moment and experience the beauty of ORCID.

*Technically, in this case, I searched for Jane’s ORCID ID and added it to her article’s metadata before I published the article.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Troubleshooting or Trouble? When Research Tools Fail

By Elizabeth Stregger
Mount Allison University Library

Spoiler alert: by the end of this story, a Data and Digital Services Librarian finds joy in coding with paper.

When my research collaborator, Dr. Christiana MacDougall, asked about using RQDA (R package for Qualitative Data Analysis) to analyze our data, I was enthusiastic. Open source software, a different way to use R, and yes, it was listed on some library guides. There were detailed YouTube tutorials. I was confident that it would meet our needs.

Following the installation instructions for Mac OS X (last tested in June 2016) was not immediately successful. I found some helpful advice on GitHub (Kopf, 2014) and installed RQDA on the three computers we use most frequently, all Macs. Our first impression was that the interface was a bit clunky and slow. We could cope with it. After all, we’d said we would use RQDA in our Data Management Plan, and installing it had been quite a lot of work. I thought we were on track.

Then we started coding. The system lagged, making it very hard to select text. Was it bad wifi in the coffee shop? Would it work better with a mouse? Did the file location make a difference? I was determined to find a way to make this work, so that I could give other faculty members solid advice in the future.

I installed it on my work desktop computer, a Windows machine. Finally, RQDA worked as expected. At that point, I knew that my best advice for faculty members was to abandon any attempt to use RQDA on a Mac.

I did my coding using RQDA on my work computer. Christiana printed, cut, and manually sorted our data. While I was proud of trying my very best to get a system to work, I gained a lot of appreciation for analog methods. Moving strips of paper around on a wooden table is as satisfying as working on a puzzle or sorting shades of yarn into a fade.

I’m grateful that Christiana was open, curious, and patient with my persistence. In addition to good communication with your research partners, these are my recommendations for working with open source tools:
– Try using the research tools your library is promoting as open source options
– Keep an eye on how long it has been since an open source tool was last tested on your operating system
– Balance user experience with your commitment to open source
– Share stories about research tools
– Contribute to open source projects if you have the skills
– Have a backup plan!

Reference

Kopf, S. (2014). Installation information for R with GTK on Windows/Mac OS. Retrieved from https://gist.github.com/sebkopf/9405675

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Did They Learn Anything? Strategies for Evaluating Discussion Posts

By Tasha Maddison, Saskatchewan Polytechnic

Transitioning library instruction into an online platform allows for increased flexibility for content provision, as well as further opportunities for assessment, provided that the learners access and fully engage with the content. Saskatchewan Polytechnic has a new online library module for faculty intended to complement face-to-face information literacy sessions. The course covers the entire research process, from information need to writing and formatting academic papers. Opportunities to assess the students’ learning are built into each learning outcome through discussion posts and quizzes.

According to Arora, Evans, Gardner, Gulbrandsen, and Riley (2015), the key to a successful blended learning project is “purposefully integrated assessment, both formative and summative, into the actual design of the course” (p. 249); accordingly, they evaluated student learning by incorporating quizzes and discussions. Their goal was to “create an online community of active and engaged learners”, and through instructor prompts, “students were directed to state what they found valuable in the peers’ response, what questions they had … and also what support they would add to make their peers’ argument stronger” (Arora et al., 2015, p. 239). The researchers noted the assessment activities “layered the learning experience – helped to generate and sustain student interest and enthusiasm for the course material to fashion a vibrant community of active learners in the process of meaning- and knowledge-making” (p. 241). Perhaps their most compelling statement is that the “students saw the online classroom as an active learning space, not just a repository for materials or a place to take quizzes” (p. 248).

Students in our online course were informally evaluated on their responses to three discussion posts. Being present and available to the online audience is vital for both student success and engagement with the learning materials. Students are more likely to participate in a conversation if they feel that someone is reviewing their posts and is prepared to respond when necessary or appropriate. Discussion posts were scheduled at the mid-way point of each learning outcome so that students could reflect on the information just covered, as well as provide a foundation for the materials included later in the outcome. A librarian reviewed all content and responded to most of the discussion posts, providing feedback and suggestions. Anecdotally, responses were thoughtful and thorough, demonstrating a high level of comprehension and a strong connection with the content.

Participation in discussion posts allows students to synthesise their thoughts about a particular topic and, more importantly, to learn from their peer group. Students can review the sentiments shared by their colleagues and learn from their experiences. According to Arora et al. (2015), discussion posts help students to develop individual skills such as “critical reading and analysis through close reading and textual analysis” (p. 243) and build community by “encouraging students to work together on their interpretive skills and in negotiating knowledge” (p. 244). This form of assessment is meaningful to the instructor as well, as they can learn about the student and their understanding of the materials.

This course was piloted in July 2018. Discussions were included in the pilot, but were not formally assessed at the time. A rubric was developed to evaluate students’ output for the next iteration of the course this spring. The rubric will assist students in determining how they are doing in the course and in identifying any growth areas.

In order to perfect formative assessment of online library instruction, librarians should include a variety of measurement instruments that fully address student learning. As recognized in the work of Lockhart (2015), “it is very important that the academic programme continually tests and builds the information literacy skills of students” (p. 20). This can be accomplished by developing strong partnerships with instructors so that librarians can adequately analyse student learning and success.

The module discussed here was developed by Diane Zerr and Tasha Maddison, librarians at Saskatchewan Polytechnic, and is used by students in the Adult Teaching and Learning program.

References
Arora, A., Evans, S., Gardner, C., Gulbrandsen, K., & Riley, J. E. (2015). Strategies for success: Using formative assessment to build skills and community in the blended classroom. In S. Koç, X. Liu & P. Wachira (Eds.), Assessment in online and blended learning environments (pp. 235-251), [EBSCOhost eBook Collection]. Retrieved from https://ezproxy.saskpolytech.ca/login?url=https://search.ebscohost.com/login.aspx?direct=true&db=nlebk&AN=971840&site=ehost-live&scope=site

Lockhart, J. (2015). Measuring the application of information literacy skills after completion of a certificate in information literacy. South African Journal of Libraries and Information Science, 81(2), 19-25. doi:10.7553/81-2-1567

table of the rubric used in the assessment

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

When Research isn’t Counted

by Nicole Eva
University of Lethbridge

As the last six months of my two-year reporting period wind down, and the same amount of time remains until the start of my study leave, I have been reflecting on the research I’ve done in the recent past. It’s been an unusually high period of service for me – for 2016/2017 and 2017/2018 I was chair of our faculty association’s Gender, Equity, and Diversity Committee, during which time I conducted an extensive literature review on the potential biases in Student Evaluations of Teaching (statement can be found here; annotated literature review here), and I searched the literature for examples of faculty perception surveys to lead the creation of such a survey at our institution. This past year I’ve served as past chair on that committee, during which time a few of us have been involved in a deep qualitative analysis of those survey results. I am also chairing the President’s Committee on Diversity, Equity, and Inclusion this year, for which we conducted surveys, held campus consultation sessions, and interviewed local experts for their thoughts.

It’s been enlightening and valuable work, but there’s a lot of it. And a lot of the analysis is among the most rigorous research I’ve been involved with to date. While it will pay off in great experience for future research I might take on, it won’t be ‘counted’ in the traditional sense in terms of publication output. The survey work of both groups is highly confidential, and while reports are being produced, they won’t be published in a peer-reviewed journal. The same goes for the Teaching Evaluations work; while both the statement and the annotated literature review are published in our Institutional Repository, again they aren’t ‘published’ in the traditional sense.

I am fortunate that I did produce one other article last year which should be published this year, and some prior work finally came out in the last couple of years, so it’s not like I have nothing for publications; but still, there has been SO much time, effort, and actual rigorous research done on these projects that I hate to ignore them as research output.

Another element: I’ve been pulled into writing a grant which supports some of our President’s committee recommendations. But again, while normally grant applications would be counted under Research, in this case it’s more Service (which of course counts for little in evaluation, and not at all for promotion). But from my perspective, having limited experience applying for external grants, the learning is invaluable. And that value is quite invisible. I’ve also gained a lot in terms of the people I’ve worked with, relationships developed, and institutional knowledge gained. But again, intangible.

It made me curious: what ‘counts’ as research? If you’re doing research for committee work that results in internal documents, does it still ‘count’? If it’s highly confidential and you can never publish the results because you didn’t clear ethics for that purpose, does it ‘count’?

These are my thoughts as I was faced with writing this blog. What has changed since my last blog post, in terms of actual research effort? Well, quite a bit. And yet it looks like nothing at all.

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

[Editor’s note: I encourage reader comments on this issue. Brain-Work is hosted by the University of Saskatchewan and there is a problem with the comments that cannot be resolved. If you try to comment on this or any blog post and you get a “forbidden to comment” error message, please send your comment to virginia.wilson@usask.ca and I will post the comment on your behalf and alert the author. I apologize for this annoying problem.]

Getting Loud in the Archives

by Kristin Lee, Tisch Library, Tufts University

I got to spend last week in the Archives of The Ringling Museum, home of records related to circus and the Ringling Brothers. In addition to escaping chilly Boston for sunny Sarasota, FL, I also got to meet up with an amazing, brilliant group of women who are also doing research in various areas of circus and sideshow. We all met last summer at the Circus Historical Society (CHS) Convention in Baraboo, WI, and decided that a joint trip would be a way to get some research done and save some money.

I went into this trip without a well-defined research question. My interest is in collecting and pulling together data about the circus to create a foundation that other researchers can use for their work. I want the ledgers, routes, and receipts; basically, anything that comes in the form of a table. I have some parameters around my research (Illinois State University’s Milner Library’s Special Collections has contracts and permits for 1914, so that has become a focus), but mostly I just want to turn these facts into data that a computer can do something with. For my early forays into circus I used the resources on the CHS website, so it was nice to see the beautiful, flowing handwriting in the ledgers and the details on the route cards that I didn’t see in the transcribed routes.

The Archives at the Ringling are in the Tibbals Learning Center, which is also the home of circus exhibits and The Howard Bros. Circus, a miniature circus created by Howard Tibbals. When our group arrived at the Archives on the first of our three research days, we were greeted by the staff there with warm handshakes and enthusiasm. As someone who has a day job assisting researchers, it was a little overwhelming to be treated like I was a real researcher in my own right (cue impostor syndrome). Everyone was accommodating and helpful but also, I think more importantly, as excited about our research as we were. Files, journals, and scrapbooks were passed around, read through, and immediately replaced with some new treasure. Requests were revised on the fly, and I think we all got something special that we didn’t even know existed. I marvel at my colleagues who work in archives and can find things in their collections that answer questions you didn’t even know you had.

One of the great joys of this trip was being able to share in everyone’s delight as they came across information that they hadn’t previously known and documents that filled in pieces of the puzzles from centuries past that give us a picture of the people who inhabited the early American circus. Everyone helped each other and pointed out materials that we thought might help the others. Squeals of delight and gasps of astonishment were common, and everyone (including several of the people who worked at the archives) would gather around to discuss the item in question. When handwriting was unclear (P. T. Barnum had especially terrible penmanship), the letter was passed around to get more opinions. No one shushed us or gave us dirty looks, and for all our discussion I think we got some good research done.

I feel like I was especially fortunate to be there and get a better understanding of how my work can help other researchers and where my research fits in the broader field. I will probably never write a book, but I will make some cool maps, data visualizations, and tables that will provide access to facts about the circus that will be important building blocks in other projects. Even though I spend my days advocating that research data is a valid research product, it has taken me a while to recognize that my own work counts too. There is a lot of work involved in creating clean datasets, and I tend to dismiss that because this is research that I do “for fun”. Working alone, I sometimes worry that I am the only one in the world who will care about things like where the Sells-Floto Circus was playing in 1914, but getting the chance to talk circus for a week with other people who are as excited about their areas as I am about mine reminded me that I am definitely not alone and that what I do has value.

I have never been particularly good at solitary pursuits, so this format of research and exploration suited me very well. If you can find a group of people to storm an archive with I highly recommend it.

Thanks to my circus ladies for a great week: Betsy Golden Kellem; Amelia Osterud; Shannon Scott; and Kat Vecchio.

Enormous thanks to all the people at the Ringling Archives for your generosity and patience: Jennifer Lemmer Posey, Tibbals Curator of Circus; Heidi Connor, Archivist; Peggy Williams, Education Outreach Manager at Feld Entertainment.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Musings on Mandatory Copyright Training

by Gina Brander, Saskatchewan Polytechnic Library

The ongoing Access Copyright lawsuit against York University and anticipated amendments to the Copyright Act have made many institutions increasingly vigilant about copyright compliance. Some have responded by urging or requiring faculty to submit course materials for copyright review. Others have ramped up copyright training and education among faculty and staff.

With these approaches in mind, how can we ensure that faculty use, or even know about, the copyright services and training available to them? Instructors select materials and design their courses with varying levels of autonomy depending on their school, department and/or program. As a result, messaging isn’t always effectively communicated from the top down to everyone who needs to hear it, such as non-faculty and part-time instructors (Zerkee, 2017). And while mandatory copyright reviews of course materials ensure compliance, not all institutions have the resources to perform these reviews.

Mandatory copyright training is an alternative approach that has been cautiously explored. A recent survey of Canadian universities found that only 13.6% of respondents’ institutions required instructors to undertake copyright training or education (Zerkee, 2017). Canadian colleges and polytechnics appear to be following a similar track. A number of Ontario colleges, in partnership with Heads of Libraries and Learning Resources (HLLR), collaboratively developed copyright education online learning modules, the first of which launched in 2013 (Copyright Literacy Ontario Colleges, n.d.). Yet few of the participating colleges have made these modules required training (Buckley, Muller, Peters, & Shannon, n.d.).

It would seem that mandatory copyright training is the exception rather than the rule in Canadian post-secondary institutions. There are many reasons for this—lack of resources, other institutional priorities, difficulties enforcing non-legislated materials, service culture, etc. For copyright offices firmly planted in the library, enforced education of any kind may feel counterintuitive.

Yet still I must ask—what is the best way to ensure that faculty and staff are equipped with the knowledge and tools they need to make informed, deliberate copyright choices? Perhaps some institutions will find more success and buy-in by promoting rather than enforcing copyright training. Even so, making basic training a requirement of employment is a surefire way for an institution to send the unequivocal message (both outwardly and inwardly) that copyright compliance is everyone’s responsibility. And that’s an important message.

References
Buckley, P., Muller, J., Peters, J., & Shannon, M. (n.d.). Copyright literacy in Ontario colleges [PowerPoint presentation]. Retrieved from https://copyrightliteracy.wordpress.com/about/

Copyright Literacy Ontario Colleges. (n.d.). Implementing. Retrieved from https://copyrightliteracy.wordpress.com/implementing/

Zerkee, J. (2017). Approaches to copyright education for faculty in Canada. Partnership: The Canadian Journal of Library and Information Practice and Research, 11(2), 1–28. https://doi.org/10.21083/partnership.v11i2.3794

An Early Valentine for Journal Editors

by Selinda Berg
Leddy Library, University of Windsor

Currently, as a member of CAPAL’s Research and Scholarship Committee, I am working as one member of an editorial team building a special issue of the Canadian Journal of Academic Librarianship on research and scholarship in academic libraries. While I have so enjoyed learning about the array of scholarly interests related to the topic, the most significant thing I have learned is about the incredible and relentless work of being a journal editor. I have a long list of respected colleagues who have taken on this role. However, my foray into editorship brought me a new and profound admiration for editors.

The work that journal editors do was captured in Lori Kloda’s CEBLIP blog post from May 2016. I have since learned that the list is modest in its description of the work of the editor. I have long recognized that there was a lot of work behind the scenes for journal publications. However, the amount of invisible labour involved in editorial work truly is astounding. And perhaps even harder to capture is the work and effort required to be balanced, sensitive, and patient. Soliciting reviewers, tracking down reviews, mediating conflicting reviews, considering papers for rejection, and balancing the voice of the author with one’s own are only a few of the tasks that editors must consciously and carefully engage in. Editors recognize that the works submitted to them are those of respected colleagues who have made a personal and professional investment in their writings, and they treat them as such.

This post is intended to be a cheer for journal editors in our profession. They are supporting, encouraging, and facilitating research in our field. They are investing their time and efforts towards building our scholarly platforms. As such, with this valentine to the journal editors, I endeavor to be a better author and a better peer reviewer.

A Valentine for the journal editors…
Roses are red,
Violets are blue,
I have a better understanding,
Of all that you do.
So I promise to do better,
And proofread to the letter,
I will keep to the deadlines,
And read more closely the guidelines.
You read my articles with care,
And made them the best I could possibly share.

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.