An Ending and a Beginning

by Virginia Wilson
Director, Centre for Evidence Based Library and Information Practice (C-EBLIP)

Another year of Brain-Work blog posts is in the bag. A huge thank you to all our Brain-Work authors, who have given their time and expertise to provide stellar content for the C-EBLIP blog. And an equally big thanks to our readers. I’m writing on a lovely June day to tell you about some exciting changes coming to the Centre for Evidence Based Library and Information Practice (C-EBLIP). After six years, which included applying for and being granted University of Saskatchewan Type A centre status, opening the Centre at the 7th International Evidence Based Library and Information Practice conference (hosted at USask), three symposia, a long-running journal club, a blog, professional development activities, and so much more, I am stepping down as C-EBLIP Director.

I told someone about my plans the other day and the reply was “oh, but that’s your baby!” And it is. But I hoped very much that my baby would leave the nest and fly on its own, with others to carry the good work forward. I am very happy to report that is the case and that my colleague Catherine Boden, a health sciences librarian here at USask, will be assuming the Directorship of C-EBLIP starting July 1. I know that C-EBLIP is in good hands with Catherine. She has a strong background in research methodologies and an interest in promoting evidence-based research as a way to address topics relevant to professional practice and library services.

These past six years have been the best. I’ve been able to support librarians as researchers and have been fortunate to be involved in mentoring several librarians on their way through the tenure process. I’ve had challenging and fun discussions with colleagues during our journal club meetings. The C-EBLIP Fall Symposium was a space where librarians could meet and share research and research experiences. The Brain-Work blog gave librarians across Canada a place to share concepts, tips, and thoughts. I’ve seen my ideas come to fruition, thanks to the support of the University Library. A special thank you to Vicki Williamson, who saw potential in my idea as Dean of the Library back in 2012. Thanks as well to Melissa Just, our current Dean, who sees the continued value of C-EBLIP.

Also, I must not forget my dear colleagues who have ensured that C-EBLIP work has not happened in a vacuum. The outputs of the Centre are a result of collaboration, connection, conversation, and teamwork. The events and initiatives undertaken in the past six years would not have been possible without likeminded and generous co-workers. You know who you are!

So what does the future hold for C-EBLIP? I don’t know all the details and that’s good! I’m excited to see what great things Catherine will do with the Centre within the mandate of supporting librarians as researchers and promoting evidence based library and information practice. As for me, I will continue in my role as embedded librarian for the College of Agriculture and Bioresources and as liaison librarian for the School of Environment and Sustainability at USask. And I will always be grateful for the opportunities I’ve been given to do the work I was keen on doing.

Brain-Work will be going on hiatus for the summer months.

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Demystifying the Research Ethics Board

By Cara Bradley
University of Regina Library

I have been a member of my institution’s Research Ethics Board (REB) since 2015 and it has turned out to be one of the most valuable and interesting service opportunities of my career. In that time, I have come to realize that many in academia (librarians, graduate students, and faculty) don’t have a clear understanding of the workings of the REB, and some have expressed sheer terror at the prospect of applying for ethical approval of research. I thought I’d share some tips that I’ve learned in my time on the REB, in hopes of making the application process a little smoother for those with less experience in this arena.

Upon joining the REB, I was given a lot of training materials to read. And I mean A LOT! Among these materials was an online tutorial program, the TCPS 2 Tutorial Course on Research Ethics (CORE). This self-paced course was created by the Government of Canada’s Panel on Research Ethics (PRE) to educate ethics board members and researchers about the ethical expectations of research funded by the Government of Canada’s various funding programs.* CORE is widely used as a resource for ethics training by Canadian universities, and its standards are often applied even to research not funded by the federal government. The course is freely available for anyone to complete online and is a great way for researchers to get comfortable with research ethics requirements. It takes a couple of hours, but you get a certificate of completion (for the tenure/promotion file!) and it may well save you considerable time in preparing and revising your application.

The second phase of my REB training was co-reviewing applications with a mentor. I was paired with an experienced REB member, and we started by reviewing a few applications together; after that we reviewed separately, comparing and discussing our reviews. I learned a great deal in this process, and it might be useful for new REB applicants to replicate it. Contact a colleague who has successfully obtained REB approval and ask to read their application. Ask them to read yours. I guarantee that they will point out errors or omissions, and correcting these before submission will save you time down the road and give you greater confidence in your application.

Another piece of advice—don’t be scared of us! I know many new researchers feel very intimidated by the REB process. You shouldn’t! REB members, in my experience, are eager to help move your research forward. They have nothing to gain by being difficult or mean (and I am truly very sorry if you encounter one of these rare meanies). I see myself as a critical friend to the researcher, pointing out where things might go awry so that they can avoid pitfalls and succeed in their work. It is extremely rare for an application to be outright rejected—instead, you can expect comments indicating where you need to elaborate on or rethink an aspect of your work. This guidance is intended to strengthen your work and ensure that you don’t inadvertently put your research subjects at risk.

Researchers new to the ethical approval process often think of it as something that happens right at the beginning of a project. And while REB approval is necessary before you make any contact with prospective research participants, your research needs to be thoroughly planned before you can apply for REB approval. We want to know exactly what you plan to do—who you are going to contact, how you are going to do so (including the text of any recruitment emails/posters)—and we also want to see your data gathering tools (survey/focus group/interview questions, etc.). Your project should be pretty much “shovel ready,” to the point that your planning is complete and you can start your research immediately upon receiving REB approval.

And finally—fill out every relevant section of the form, and be sure to check the web site of your own institutional ethics board for additional guidance on filling out their specific form. Most university research offices have created significant supplemental content to accompany their forms and ensure that the ethical approval process is a smooth one.

I highly recommend serving on your institution’s Research Ethics Board. It is a great way to learn more about research ethics, gain confidence in completing your own applications, contribute in a small way to the success of research at your institution (I positively beam when I see participant recruitment notices, publications, etc. arising from applications that I have reviewed), and gain a good sense of the range of research conducted on your campus.

* I suspect similar training materials are available in other jurisdictions, but I don’t know the details—let me know in the comments if you can provide information on other countries.

(Editor’s note: Brain-Work is hosted by the University of Saskatchewan and there is a problem with the comments that cannot be resolved. If you try to comment on this or any blog post and you get a “forbidden to comment” error message, please send your comment to virginia.wilson@usask.ca and I will post the comment on your behalf and alert the author. We apologize for this annoying problem.)

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Teaching scholarly communication skills to graduate students

by Chris Chan
Head of Information Services at Hong Kong Baptist University Library

Part 1: 7 May 2019

I am writing this first part of the blog post from my comfortable hotel room in downtown Minneapolis, where I have arrived ahead of the LOEX Annual Conference that will start at the end of the week. Navigating US Immigration at Chicago after the longest flight I have ever taken (15 hours!!) has taken its toll, but I am hoping that arriving a few days early will give me a chance to more or less recover from jet lag ahead of the event starting on the 9th.

The time will also allow me to put the finishing touches to my breakout session presentation. I’ll be talking about the efforts we have been making at HKBU Library to ensure our graduate students are equipped with the scholarly communication knowledge and skills that they will need to be successful researchers. For several years we have had required library workshops for our research students that covered the basics of scholarly publishing. These sessions also sought to raise awareness of current issues in scholarly communication, such as open access and altmetrics. Although student feedback has generally been positive, we found it challenging to design sessions that were suitable both for novice researchers and for graduate students who already had publication experience. We also wanted to better assess the extent to which students were achieving the learning outcomes of the session, as the results of relatively simple in-class exercises could tell us only so much.

Our new approach, launched this year, has been to adapt our workshop content into a modular online course. The course is designed so that students can skip content that they are already familiar with. To fulfill the new course requirement, students need to achieve a passing grade on a short online quiz assessing their knowledge of course content. In my presentation, I’ll be sharing the results from our first year of implementation. I’m also hoping to find out what approaches other institutions are taking, and to this end I’ll be using Mentimeter for the entire presentation. I’m a little nervous about having to rely on an online service, but fingers crossed that it runs smoothly. Another benefit is that I will be able to share the results in the second part of this blog post.

Part 2: 11 May 2019

All done! The conference was excellent – there were so many things that I will be bringing back to my own institution. As for my own presentation, everything went smoothly in technological terms. Mentimeter worked as advertised, and having the interactive segments seemed to help keep things interesting for the audience. Their responses were incorporated into the presentation material in real time. For example, the results for this question supported a point that I had seen in the literature – that this type of support for graduate students is often not formally assessed (the full question was “How is support for graduate student scholarly communication skill development assessed at your institution?”).

I also used the built-in quiz function in Mentimeter to showcase some of the questions that we use to assess the student learning. Shout out to Marcela for winning!

You can view the full presentation (including the results of the audience voting) here: https://www.mentimeter.com/s/e1451a492dd1d3a21747448a6ff3ce70

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Breaking up with ResearchGate: Streamlining Scholarly Profile Online

by Kathleen Reed
Vancouver Island University

I’ve finally had it with ResearchGate. After what feels like the hundredth time the site emailed me to ask “Did your colleague [name] publish [article]?”, I’m through. These nagging emails are annoying, and asking me to report on my colleagues crosses a line. Beyond my annoyance with spam email, though, lies a deeper question that I’ve been pondering lately: what does a manageable, well-curated online scholarly profile look like?

You’d think I would have a good answer to this question for myself, being a librarian who leads sessions on this very topic. But up to this point, my profile has been a mishmash of full-text and indexed publications spread across multiple platforms: my institution’s digital repository, Twitter, ORCID, Google Scholar, ResearchGate, and Academia.edu. I make all my work open access, but not in one central place.

I tell myself that this scatter-shot approach has been at least partially because I demonstrate multiple sites for other researchers as part of my job, and I need to be familiar with them. And I worry that I’ll be splitting my readership stats if I publish in an OA journal, and then turn around and put my work up OA somewhere else. Mostly, though, keeping all of my profiles updated is a time-consuming task and just doesn’t happen. Thus, I have a series of half-completed and stale profiles online – not exactly the scholarly image I wish to project, and certainly not what I preach in my sessions on the subject.

During the upcoming year I’m off on leave to start a PhD, and my scholarly profile seems more important than ever before. Add to that the idea of not getting annoying ResearchGate emails, and I’m finding motivation to change my online presence. Yes, I know I can opt out of ResearchGate emails and still have a presence on the site. But the monetizing of public scholarship on private platforms bothers me. I don’t want to promote ResearchGate and Academia.edu as acceptable places to deposit OA versions – they’re not, according to the Tri-Agencies. So I’ve decided to focus on my institution’s IR, ORCID, and Google Scholar. Three places to update seems more manageable, and I like getting away from for-profit companies at least a little. See ya, ResearchGate.

How do you manage your scholarly profile online? If you feel like you’ve got a system that works, what does that look like? Please share in the comments below.

(Editor’s note: Brain-Work is hosted by the University of Saskatchewan and there is a problem with the comments that cannot be resolved. If you try to comment on this or any blog post and you get a “forbidden to comment” error message, please send your comment to virginia.wilson@usask.ca and I will post the comment on your behalf and alert the author. We apologize for this annoying problem.)

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

The Beauty of ORCIDs

by Kristin Hoffmann, University of Western Ontario

A few months ago, I published an article that Jane Schmidt had written for the Canadian Journal of Academic Librarianship special issue on diversity (I am one of CJAL’s co-editors). Before I knew it, Jane had tweeted about her article’s publication. I was surprised; had she been constantly refreshing the journal’s web site? How had she found out about it so quickly?

I got my answer in a follow-up tweet:
Screenshot of a tweet from Jane Schmidt that says “Also, shout out to @ORCID_Org for notifying me that it was published! Shazam!”

And that, librarian-authors, is the beauty of ORCID:

• When you have an ORCID ID,
• and you give it to a journal you are publishing with,*
• and the journal registers DOIs for the articles it publishes (and many journals do),

then, when the journal publishes your article,
• the DOI registration sends the information about your ORCID to CrossRef,
• and CrossRef sends information to ORCID about your new article,
• and ORCID lets you know that it has added a new publication to your profile.

This all happens seamlessly. It’s a great example of technologies talking to each other and making our researching and authoring lives easier.
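That chain runs on DOI metadata: when a journal registers a DOI, any ORCID iDs it supplied end up in the public work record that CrossRef serves at https://api.crossref.org/works/&lt;doi&gt;, and ORCID’s auto-update picks them up from there. As a rough sketch (not part of the original post), here is a bit of Python that pulls the iDs out of such a record; the DOI, title, author, and iD below are hypothetical placeholders, not a real deposit.

```python
# Sketch: extracting ORCID iDs from a CrossRef work record.
# The JSON below is a hypothetical, trimmed-down record; live records
# come from the public API at https://api.crossref.org/works/<doi>.
import json

sample_crossref_record = json.loads("""
{
  "message": {
    "DOI": "10.1234/example.5678",
    "title": ["A Hypothetical Article"],
    "author": [
      {"given": "Jane", "family": "Schmidt",
       "ORCID": "http://orcid.org/0000-0000-0000-0000"}
    ]
  }
}
""")

def orcid_ids_from_crossref(record):
    """Collect the ORCID iDs attached to the authors of a work record."""
    authors = record["message"].get("author", [])
    return [a["ORCID"] for a in authors if "ORCID" in a]

print(orcid_ids_from_crossref(sample_crossref_record))
```

Against a live record you would fetch the JSON from the CrossRef endpoint first; the extraction step is the same, which is why supplying your iD at submission time is enough to trigger the notification downstream.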

As an editor, it was gratifying to see an author promote her publication online so soon after it was published. I’ve also used my ORCID ID as an author, and getting the notification from ORCID that my publication was added to my profile was a good ego boost—even an automated email can be affirming.

Other benefits of an ORCID profile include:
• Pulling together publications with different names or name variants (e.g., initials, full first names, different last names)
• Helping you keep your CV up to date
• Communicating information to and from funding agencies, if you apply for grants
• Helping you demonstrate the attention and reach of your publications, by connecting ORCID with tools such as ImpactStory

If you don’t have an ORCID ID, take 30 seconds to sign up for one at the ORCID site, https://orcid.org/.

And take another ten minutes or so to add your previous publications to your profile.

Then include your ORCID ID with the next article you submit, and when your article is published, you too can have a Shazam! moment and experience the beauty of ORCID.

*Technically, in this case, I searched for Jane’s ORCID ID and added it to her article’s metadata before I published the article.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Ethical Publishing Choices and the Librarian Researcher

by DeDe Dawson @dededawson
Science Library, University of Saskatchewan

As librarians we have a unique vantage point on the scholarly publishing market – both as publishing researchers ourselves and as our institution’s agents in acquiring content from publishers. We are perfectly situated to appreciate the dysfunction and unsustainability of the current for-profit system. And I believe we have a professional obligation to raise the awareness of our university colleagues about this issue. Certainly many of our faculty colleagues already have some level of awareness, but the details and extent of the problem remain mostly hidden to the average person outside of libraries.

In the past month or so I have been riveted by the steady stream of news and analyses of the University of California (UC) system’s cancellation of all Elsevier journal titles. It is not that the UC system cannot afford the big deal subscription. UC is actually taking a principled stand with their key goal being “securing universal open access to UC research while containing the rapidly escalating costs associated with for-profit journals.”

The UC libraries have worked for a decade or so to raise awareness among faculty on their campuses of the problems with the current publishing system and the benefits of a transition to open access. So the faculty are largely supportive of the stance the UC libraries took with Elsevier. Some have even started a petition to boycott Elsevier in support of open access. Those who have signed resolve to publish their work elsewhere and to refuse to donate their time as reviewers and editorial board members. The free content and labour provided by authors, reviewers, and editors is why commercial scholarly publishers are so extremely profitable. As Adriane MacDonald and Nicole Eva of the University of Lethbridge note: It’s time to stand up to the academic publishing industry.

This is not just a library problem. And solutions need to come with the active involvement of the community of authors, reviewers, editors, and readers. Authors, reviewers, and editors in particular have real power! As Lorcan Dempsey ends his recent blog post on the UC cancellations:

“The UC action has galvanized attention. For Elsevier, the financial impact may be less of an issue than the potential loss of participation in their journals of UC authors, editors, and reviewers. This is because of the scale of the UC research enterprise. For faculty elsewhere, it is potentially important as an exemplary event – the example of UC authors may have more of an influence than the exhortation of their library. For other consortia and libraries it is a call to action.”

What about us? As librarian-researchers, those most aware of the problems in the current system, do we have an ethical obligation to lead by example with our publishing, editorial, and reviewing choices?

Personally, I think so. For years I have chosen to only publish my research in open access journals and I will not donate my time as a peer reviewer or editorial board member to closed-access, for-profit journals either. I consider this an ethical and values-driven decision. Having said that, I recognize I am in a privileged position as a tenured librarian (though I made this decision well before I achieved tenure), so I will not judge those who feel they need to publish in certain titles for career advancement. I only note that this in itself is the underlying reason for this dysfunctional market: the incentive structures in academia are extremely problematic. If we could let go of our addiction to “high impact” and “prestige” journals, and instead judge research by its own merits (not the package it comes in), then we could free ourselves from the grip of the Elseviers of the world. But I have already written an entire blogpost on that…

I’ll end with a reminder that the C-EBLIP website hosts a list of peer-reviewed LIS journals; those that are open access are identified by the orange open lock symbol!

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Troubleshooting or Trouble? When Research Tools Fail

By Elizabeth Stregger
Mount Allison University Library

Spoiler alert: by the end of this story, a Data and Digital Services Librarian finds joy in coding with paper.

When my research collaborator, Dr. Christiana MacDougall, asked about using RQDA (R package for Qualitative Data Analysis) to analyze our data, I was enthusiastic. Open source software, a different way to use R, and yes, it was listed on some library guides. There were detailed YouTube tutorials. I was confident that it would meet our needs.

Following the installation instructions for Mac OS X (last tested in June 2016) was not immediately successful. I found some helpful advice on GitHub (Kopf, 2014) and installed RQDA on the three computers we use most frequently, all Macs. Our first impression was that the interface was a bit clunky and slow. We could cope with it. After all, we’d said we would use RQDA in our Data Management Plan, and installing it had been quite a lot of work. I thought we were on track.

Then we started coding. The system lagged, making it very hard to select text. Was it bad wifi in the coffee shop? Would it work better with a mouse? Did the file location make a difference? I was determined to find a way to make this work, so that I could give other faculty members solid advice in the future.

I installed it on my work desktop computer, a Windows machine. Finally, RQDA worked as expected. At that point, I knew that my best advice for faculty members was to abandon any attempt to use RQDA on a Mac.

I did my coding using RQDA on my work computer. Christiana printed, cut, and manually sorted our data. While I was proud of trying my very best to get a system to work, I gained a lot of appreciation for analog methods. Moving strips of paper around on a wooden table is as satisfying as working on a puzzle or sorting shades of yarn into a fade.

I’m grateful that Christiana was open, curious, and patient with my persistence. In addition to good communication with your research partners, these are my recommendations for working with open source tools:
– Try using the research tools your library is promoting as open source options
– Keep an eye on how long it has been since open source tools have been tested for operating systems
– Balance user experience with your commitment to open source
– Share stories about research tools
– Contribute to open source projects if you have the skills
– Have a backup plan!

Reference

Kopf, S. (2014). Installation information for R with GTK on Windows/Mac OS. Retrieved from https://gist.github.com/sebkopf/9405675

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Did They Learn Anything? Strategies for Evaluating Discussion Posts

By Tasha Maddison, Saskatchewan Polytechnic

Transitioning library instruction to an online platform allows for increased flexibility in content provision, as well as further opportunities for assessment, provided that learners access and fully engage with the content. Saskatchewan Polytechnic has a new online library module for faculty, intended to complement face-to-face information literacy sessions. The course covers the entire research process, from identifying an information need to writing and formatting academic papers. Opportunities to assess students’ learning are built into each learning outcome through discussion posts and quizzes.

For Arora, Evans, Gardner, Gulbrandsen and Riley (2015), the key to a successful blended learning project was having “purposefully integrated assessment, both formative and summative, into the actual design of the course” (p. 249), evaluating student learning through quizzes and discussions. Their goal was to “create an online community of active and engaged learners”, and through instructor prompts, “students were directed to state what they found valuable in the peers’ response, what questions they had … and also what support they would add to make their peers’ argument stronger” (Arora et al., 2015, p. 239). The researchers noted that the assessment activities “layered the learning experience – helped to generate and sustain student interest and enthusiasm for the course material to fashion a vibrant community of active learners in the process of meaning- and knowledge-making” (p. 241). Perhaps their most compelling statement is that the “students saw the online classroom as an active learning space, not just a repository for materials or a place to take quizzes” (p. 248).

Students in our online course were informally evaluated on their responses to three discussion posts. Being present and available to the online audience is vital both for student success and for their engagement with the learning materials. Students are more likely to participate in a conversation if they feel that someone is reviewing their posts and is prepared to respond when necessary or appropriate. Discussion posts were scheduled at the midway point of each learning outcome so that students could reflect on the information just covered, as well as build a foundation for the materials that came later in the outcome. A librarian reviewed all content and responded to most of the discussion posts, providing feedback and suggestions. Anecdotally, responses were thoughtful and thorough, demonstrating a high level of comprehension and a strong connection with the content.

Participation in discussion posts allows students to synthesise their thoughts about a particular topic and, more importantly, to learn from their peer group. Students can review the sentiments shared by their colleagues and learn from their experiences. According to Arora et al. (2015), discussion posts help students to develop individual skills such as “critical reading and analysis through close reading and textual analysis” (p. 243) and build community by “encouraging students to work together on their interpretive skills and in negotiating knowledge” (p. 244). This form of assessment is meaningful to instructors as well, as they can learn about each student and their understanding of the materials.

This course was piloted in July 2018. Discussions were included in the pilot but were not formally assessed at the time. A rubric was then developed to evaluate student output in the next iteration of the course this spring. The rubric will help students gauge how they are doing in the course and identify areas for growth.

To perfect formative assessment of online library instruction, librarians should include a variety of measurement instruments that fully address student learning. As Lockhart (2015) recognizes, “it is very important that the academic programme continually tests and builds the information literacy skills of students” (p. 20), which can be accomplished by developing strong partnerships with instructors so that librarians can adequately analyse student learning and success.

The module discussed here was developed by Diane Zerr and Tasha Maddison, librarians at Saskatchewan Polytechnic, and is used by students in the Adult Teaching and Learning program.

References
Arora, A., Evans, S., Gardner, C., Gulbrandsen, K., & Riley, J. E. (2015). Strategies for success: Using formative assessment to build skills and community in the blended classroom. In S. Koç, X. Liu & P. Wachira (Eds.), Assessment in online and blended learning environments (pp. 235-251), [EBSCOhost eBook Collection]. Retrieved from https://ezproxy.saskpolytech.ca/login?url=https://search.ebscohost.com/login.aspx?direct=true&db=nlebk&AN=971840&site=ehost-live&scope=site

Lockhart, J. (2015). Measuring the application of information literacy skills after completion of a certificate in information literacy. South African Journal of Libraries and Information Science, 81(2), 19-25. doi:10.7553/81-2-1567

[Table: rubric used to assess the discussion posts]

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

When Research isn’t Counted

by Nicole Eva
University of Lethbridge

As the last 6 months of my 2-year reporting period wind down, and as the same time remains until the start of my study leave, I have been reflecting on the research I’ve done in the recent past. It’s been an unusually high period of service for me – for 2016/2017 & 2017/2018 I was chair of our faculty association’s Gender, Equity, and Diversity Committee, during which time I conducted an extensive literature review on the potential biases in Student Evaluations of Teaching (statement can be found here; annotated literature review here), and I searched the literature for examples of faculty perception surveys to lead the creation of such a survey at our institution. This past year I’ve served as past chair on that committee, during which time a few of us have been involved in a deep qualitative analysis of those survey results. I am also chairing the President’s Committee on Diversity, Equity, and Inclusion this year, for which we conducted surveys, held campus consultation sessions, and interviewed local experts for their thoughts. It’s been enlightening and valuable work, but there’s a lot of it. And a lot of the analysis is among the most rigorous research I’ve been involved with to date. While it will pay off in great experience for future research I might take on, it won’t be ‘counted’ in the traditional sense in terms of publication output. The survey work of both groups is highly confidential and while reports are being produced, they won’t be published in a peer-reviewed journal. The same goes for the Teaching Evaluations work; while both the statement and the annotated literature review are published in our Institutional Repository, again they aren’t ‘published’ in the traditional sense. 
I am fortunate that I did produce one other article last year which should be published this year, and some prior work finally came out in the last couple of years, so it’s not like I have nothing for publications; but still, there has been SO much time, effort, and genuinely rigorous research put into these projects that I hate to ignore them as research output.

Another element: I’ve been pulled into writing a grant that supports some of our President’s committee recommendations. While grant applications would normally be counted under Research, in this case the work is more Service (which is of course of limited value for evaluation purposes, and of none at all for promotion purposes). From my perspective, having limited experience applying for external grants, the experience is invaluable. But that value is quite invisible. I’ve also gained a lot in terms of the people I’ve worked with, the relationships developed, and the institutional knowledge gained. But again, intangible.

It made me curious: what ‘counts’ as research? If you’re doing research for committee work that results in internal documents, does it still ‘count’? If it’s highly confidential and you can never publish the results because you didn’t clear ethics for that purpose, does it ‘count’?

These were my thoughts as I faced writing this blog post. What has changed since my last post, in terms of actual research effort? Well, quite a bit. And yet it looks like nothing at all.

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

[Editor’s note: I encourage reader comments on this issue. Brain-Work is hosted by the University of Saskatchewan and there is a problem with the comments that cannot be resolved. If you try to comment on this or any blog post and you get a “forbidden to comment” error message, please send your comment to virginia.wilson@usask.ca and I will post the comment on your behalf and alert the author. I apologize for this annoying problem.]

Getting Loud in the Archives

by Kristin Lee, Tisch Library, Tufts University

I got to spend last week in the Archives of The Ringling Museum, home to records related to the circus and the Ringling Brothers. In addition to escaping chilly Boston for sunny Sarasota, FL, I got to meet up with an amazing, brilliant group of women who are doing research in various areas of circus and sideshow. We all met last summer at the Circus Historical Society (CHS) Convention in Baraboo, WI, and decided that a joint trip would be a way to get some research done and save some money.

I went into this trip without a well-defined research question. My interest is in collecting and pulling together data about the circus to create a foundation that other researchers can use for their work. I want the ledgers, routes, and receipts; basically, anything that comes in the form of a table. I have some parameters around my research (Illinois State University’s Milner Library’s Special Collections has contracts and permits for 1914, so that has become a focus), but mostly I just want to turn these facts into data that a computer can do something with. For my early forays into circus I used the resources on the CHS website, so it was nice to see the beautiful, flowing handwriting in the ledgers and the details on the route cards that I didn’t see in the transcribed routes.

The Archives at the Ringling are in the Tibbals Learning Center, which is also the home of circus exhibits and The Howard Bros. Circus, a miniature circus created by Howard Tibbals. When our group arrived at the Archives on our first of three research days, we were greeted by the staff there with warm handshakes and enthusiasm. As someone whose day job is assisting researchers, it was a little overwhelming to be treated like I was a real researcher in my own right (cue impostor syndrome). Everyone was accommodating and helpful but also, I think more importantly, as excited about our research as we were. Files, journals, and scrapbooks were passed around, read through, and immediately replaced with some new treasure. Requests were revised on the fly, and I think we all got something special that we didn’t even know existed. I marvel at my colleagues who work in archives and can find things in their collections that answer questions you didn’t even know you had.

One of the great joys of this trip was being able to share in everyone’s delight as they came across information they hadn’t previously known and documents that filled in pieces of the puzzles from centuries past, giving us a picture of the people who inhabited the early American circus. Everyone helped each other and pointed out materials that we thought might help the others. Squeals of delight and gasps of astonishment were common, and everyone (including several of the people who worked at the archives) would gather around to discuss the item in question. When handwriting was unclear (P. T. Barnum had especially terrible penmanship), the letter was passed around to get more opinions. No one shushed us or gave us a dirty look, and for all our discussion I think we got some good research done.

I feel like I was especially fortunate to be there and to get a better understanding of how my work can help other researchers and where my research fits in the broader field. I will probably never write a book, but I will make some cool maps, data visualizations, and tables that will provide access to facts about the circus and serve as important building blocks in other projects. Even though I spend my days advocating that research data is a valid research product, it has taken me a while to recognize that my own work counts too. There is a lot of work involved in creating clean datasets, and I tend to dismiss that because this is research that I do “for fun”. Working alone, I sometimes worry that I am the only one in the world who will care about things like where the Sells-Floto Circus was playing in 1914, but getting the chance to talk circus for a week with other people who are as excited about their areas as I am about mine reminded me that I am definitely not alone and that what I do has value.

I have never been particularly good at solitary pursuits, so this format of research and exploration suited me very well. If you can find a group of people to storm an archive with I highly recommend it.

Thanks to my circus ladies for a great week: Betsy Golden Kellem; Amelia Osterud; Shannon Scott; and Kat Vecchio.

Enormous thanks to all the people at the Ringling Archives for your generosity and patience: Jennifer Lemmer Posey, Tibbals Curator of Circus; Heidi Connor, Archivist; Peggy Williams, Education Outreach Manager at Feld Entertainment.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.