Breaking up with ResearchGate: Streamlining a Scholarly Profile Online

by Kathleen Reed
Vancouver Island University

I’ve finally had it with ResearchGate. After what feels like the hundredth time the site emailed me to ask “Did your colleague [name] publish [article]?”, I’m through. These nagging emails are annoying, and asking me to report on my colleagues crosses a line. Beyond my annoyance with spam email, though, lies a deeper question that I’ve been pondering lately: what does a manageable, well-curated online scholarly profile look like?

You’d think I would have a good answer to this question for myself, being a librarian who leads sessions on this very question. But up to this point, my profile has been a mishmash of full-text and indexed publications spread across multiple platforms: my institution’s digital repository, Twitter, ORCID, Google Scholar, ResearchGate, and Academia.edu. I make all my work open access, but not in one central place.

I tell myself that this scattershot approach has persisted at least partially because I demonstrate multiple sites for other researchers as part of my job, and I need to be familiar with them. And I worry that I’ll be splitting my readership stats if I publish in an OA journal, and then turn around and put my work up OA somewhere else. Mostly, though, keeping all of my profiles updated is a time-consuming task and just doesn’t happen. Thus, I have a series of half-completed and stale profiles online – not exactly the scholarly image I wish to project, and certainly not what I preach in my sessions on the subject.

During the upcoming year I’m off on leave to start a PhD, and my scholarly profile seems more important than ever before. Add to that the prospect of no more annoying ResearchGate emails, and I’m finding the motivation to change my online profile. Yes, I know I can opt out of ResearchGate emails and still have a presence on the site. But the monetizing of public scholarship on private platforms bothers me. I don’t want to suggest that ResearchGate and Academia.edu are acceptable places to deposit OA versions – they’re not, according to the Tri-Agencies. So I’ve decided to focus on my institution’s IR, ORCID, and Google Scholar. Three places to update seems more manageable, and I like getting away from for-profit companies at least a little. See ya, ResearchGate.

How do you manage your scholarly profile online? If you feel like you’ve got a system that works, what does that look like? Please share in the comments below.

(Editor’s note: Brain-Work is hosted by the University of Saskatchewan and there is a problem with the comments that cannot be resolved. If you try to comment on this or any blog post and you get a “forbidden to comment” error message, please send your comment to virginia.wilson@usask.ca and I will post the comment on your behalf and alert the author. We apologize for this annoying problem.)

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

The Beauty of ORCIDs

by Kristin Hoffmann, University of Western Ontario

A few months ago, I published an article that Jane Schmidt had written for the Canadian Journal of Academic Librarianship special issue on diversity (I am one of CJAL’s co-editors). Before I knew it, Jane had tweeted about her article’s publication. I was surprised; had she been constantly refreshing the journal’s website? How had she found out about it so quickly?

I got my answer in a follow-up tweet:
[Screenshot of a tweet from Jane Schmidt: “Also, shout out to @ORCID_Org for notifying me that it was published! Shazam!”]

And that, librarian-authors, is the beauty of ORCID:

• When you have an ORCID ID,
• and you give it to a journal you are publishing with,*
• and the journal registers DOIs for the articles it publishes (and many journals do),

then, when the journal publishes your article,
• the DOI registration sends the information about your ORCID to CrossRef,
• and CrossRef sends information to ORCID about your new article,
• and ORCID lets you know that it has added a new publication to your profile.

This all happens seamlessly. It’s a great example of technologies talking to each other and making our researching and authoring lives easier.
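For the curious, one link in that chain is even visible from the outside: CrossRef’s public REST API can filter registered works by the ORCID iD carried in their DOI metadata. Here is a minimal Python sketch (just an illustration, not part of the official workflow) that builds such a query, using the example iD from ORCID’s own documentation:

```python
import urllib.parse

def crossref_works_url(orcid_id: str, rows: int = 5) -> str:
    """Build a CrossRef REST API query for works whose DOI metadata
    includes the given ORCID iD -- the same linkage that lets ORCID
    notify an author when a new publication appears."""
    params = urllib.parse.urlencode({"filter": f"orcid:{orcid_id}", "rows": rows})
    return f"https://api.crossref.org/works?{params}"

# 0000-0002-1825-0097 is the sample iD used in ORCID's documentation.
url = crossref_works_url("0000-0002-1825-0097")
print(url)
```

Fetching that URL returns JSON listing the matching works, which is essentially the author–article connection that the notification email is built on.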

As an editor, it was gratifying to see an author promote her publication online so soon after it was published. I’ve also used my ORCID ID as an author, and getting the notification from ORCID that my publication was added to my profile was a good ego boost—even an automated email can be affirming.

Other benefits of an ORCID profile include:
• Pulling together publications with different names or name variants (e.g., initials, full first names, different last names)
• Helping you keep your CV up to date
• Communicating information to and from funding agencies, if you apply for grants
• Helping you demonstrate the attention and reach of your publications, by connecting ORCID with tools such as ImpactStory

If you don’t have an ORCID ID, take 30 seconds to sign up for one at the ORCID site, https://orcid.org/.

And take another ten minutes or so to add your previous publications to your profile.

Then include your ORCID ID with the next article you submit, and when your article is published, you too can have a Shazam! moment and experience the beauty of ORCID.

*Technically, in this case, I searched for Jane’s ORCID ID and added it to her article’s metadata before I published the article.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Ethical Publishing Choices and the Librarian Researcher

by DeDe Dawson @dededawson
Science Library, University of Saskatchewan

As librarians we have a unique vantage point on the scholarly publishing market – both as publishing researchers ourselves and as our institution’s agents in acquiring content from publishers. We are perfectly situated to appreciate the dysfunction and unsustainability of the current for-profit system. And I believe we have a professional obligation to raise the awareness of our university colleagues about this issue. Certainly many of our faculty colleagues already have some level of awareness, but the details and extent of the problem remain mostly hidden to the average person outside of libraries.

In the past month or so I have been riveted by the steady stream of news and analyses of the University of California (UC) system’s cancellation of all Elsevier journal titles. It is not that the UC system cannot afford the big deal subscription. UC is actually taking a principled stand with their key goal being “securing universal open access to UC research while containing the rapidly escalating costs associated with for-profit journals.”

The UC libraries have worked for a decade or so now to raise awareness among faculty on their campuses of the problems with the current publishing system and the benefits of a transition to open access. So the faculty are largely supportive of the stance the UC libraries took with Elsevier. Some have even started a petition to boycott Elsevier in support of open access. Those who have signed resolve to publish their work elsewhere and to refuse to donate their time as reviewers and editorial board members. The free content and labour provided by authors, reviewers, and editors is why commercial scholarly publishers are so extremely profitable. As Adriane MacDonald and Nicole Eva of the University of Lethbridge note: It’s time to stand up to the academic publishing industry.

This is not just a library problem. And solutions need to come with the active involvement of the community of authors, reviewers, editors, and readers. Authors, reviewers, and editors in particular have real power! As Lorcan Dempsey ends his recent blog post on the UC cancellations:

“The UC action has galvanized attention. For Elsevier, the financial impact may be less of an issue than the potential loss of participation in their journals of UC authors, editors, and reviewers. This is because of the scale of the UC research enterprise. For faculty elsewhere, it is potentially important as an exemplary event – the example of UC authors may have more of an influence than the exhortation of their library. For other consortia and libraries it is a call to action.”

What about us? As librarian-researchers, those most aware of the problems in the current system, do we have an ethical obligation to lead by example with our publishing, editorial, and reviewing choices?

Personally, I think so. For years I have chosen to only publish my research in open access journals and I will not donate my time as a peer reviewer or editorial board member to closed-access, for-profit journals either. I consider this an ethical and values-driven decision. Having said that, I recognize I am in a privileged position as a tenured librarian (though I made this decision well before I achieved tenure), so I will not judge those who feel they need to publish in certain titles for career advancement. I only note that this in itself is the underlying reason for this dysfunctional market: the incentive structures in academia are extremely problematic. If we could let go of our addiction to “high impact” and “prestige” journals, and instead judge research by its own merits (not the package it comes in), then we could free ourselves from the grip of the Elseviers of the world. But I have already written an entire blogpost on that…

I’ll end with a reminder that the C-EBLIP website hosts a list of peer-reviewed LIS journals; those that are open access are identified by the orange open-lock symbol!

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Troubleshooting or Trouble? When Research Tools Fail

By Elizabeth Stregger
Mount Allison University Library

Spoiler alert: by the end of this story, a Data and Digital Services Librarian finds joy in coding with paper.

When my research collaborator, Dr. Christiana MacDougall, asked about using RQDA (R package for Qualitative Data Analysis) to analyze our data, I was enthusiastic. Open source software, a different way to use R, and yes, it was listed on some library guides. There were detailed YouTube tutorials. I was confident that it would meet our needs.

Following the installation instructions for Mac OS X (last tested in June 2016) was not immediately successful. I found some helpful advice on GitHub (Kopf, 2014) and installed RQDA on the three computers we use most frequently, all Macs. Our first impression was that the interface was a bit clunky and slow. We could cope with it. After all, we’d said we would use RQDA in our Data Management Plan, and installing it had been quite a lot of work. I thought we were on track.

Then we started coding. The system lagged, making it very hard to select text. Was it bad wifi in the coffee shop? Would it work better with a mouse? Did the file location make a difference? I was determined to find a way to make this work, so that I could give other faculty members solid advice in the future.

I installed it on my work desktop computer, a Windows machine. Finally, RQDA worked as expected. At that point, I knew that my best advice for faculty members was to abandon any attempt to use RQDA on a Mac.

I did my coding using RQDA on my work computer. Christiana printed, cut, and manually sorted our data. While I was proud of trying my very best to get a system to work, I gained a lot of appreciation for analog methods. Moving strips of paper around on a wooden table is as satisfying as working on a puzzle or sorting shades of yarn into a fade.

I’m grateful that Christiana was open, curious, and patient with my persistence. In addition to good communication with your research partners, these are my recommendations for working with open source tools:
– Try using the research tools your library is promoting as open source options
– Keep an eye on how long it has been since open source tools have been tested for operating systems
– Balance user experience with your commitment to open source
– Share stories about research tools
– Contribute to open source projects if you have the skills
– Have a backup plan!

Reference

Kopf, S. (2014). Installation information for R with GTK on Windows/Mac OS. Retrieved from https://gist.github.com/sebkopf/9405675

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Did They Learn Anything? Strategies for Evaluating Discussion Posts

By Tasha Maddison, Saskatchewan Polytechnic

Transitioning library instruction to an online platform allows for increased flexibility in content provision, as well as further opportunities for assessment, provided that learners access and fully engage with the content. Saskatchewan Polytechnic has a new online library module for faculty intended to complement face-to-face information literacy sessions. The course covers the entire research process, from information need to writing and formatting academic papers. Opportunities to assess the students’ learning are built into each learning outcome through discussion posts and quizzes.

Arora, Evans, Gardner, Gulbrandsen and Riley (2015) argue that the key to a successful blended learning project is “purposefully integrated assessment, both formative and summative, into the actual design of the course” (p. 249) – that is, evaluating student learning by incorporating quizzes and discussions. Their goal was to “create an online community of active and engaged learners”, and through instructor prompts, “students were directed to state what they found valuable in the peers’ response, what questions they had … and also what support they would add to make their peers’ argument stronger” (p. 239). The researchers noted that the assessment activities “layered the learning experience – helped to generate and sustain student interest and enthusiasm for the course material to fashion a vibrant community of active learners in the process of meaning- and knowledge-making” (p. 241). Perhaps their most compelling statement is that the “students saw the online classroom as an active learning space, not just a repository for materials or a place to take quizzes” (p. 248).

Students in our online course were informally evaluated on their responses to three discussion posts. Being present and available to the online audience is vital for both student success and their corresponding engagement with the learning materials. Students are more likely to participate in a conversation if they feel that someone is reviewing their posts and is prepared to respond when necessary or appropriate. Discussion posts were scheduled at the mid-way point of each learning outcome so that students could reflect on the information that was just covered, as well as provide a foundation for the materials that were included later in the outcome. A librarian reviewed all content and responded to most of the discussion posts, providing feedback and suggestions. Anecdotally, responses were thoughtful and thorough, demonstrating a high level of comprehension and a strong connection with the content.

Participation in discussion posts allows students to synthesise their thoughts about a particular topic and, more importantly, to learn from their peer group. Students can review the sentiments shared by their colleagues and learn from their experiences. According to Arora et al. (2015), discussion posts help students develop individual skills such as “critical reading and analysis through close reading and textual analysis” (p. 243) and build community by “encouraging students to work together on their interpretive skills and in negotiating knowledge” (p. 244). This form of assessment is meaningful to the instructor as well, who can learn about the student and their understanding of the materials.

This course was piloted in July 2018. Discussions were included in the pilot, but were not formally assessed at the time. A rubric was developed to evaluate student output in the next iteration of the course this spring. The rubric will help students gauge how they are doing in the course and identify any growth areas.

In order to perfect formative assessment of online library instruction, librarians should include a variety of measurement instruments that fully address student learning. As Lockhart (2015) recognized, “it is very important that the academic programme continually tests and builds the information literacy skills of students” (p. 20). This can be accomplished by developing strong partnerships with instructors so that librarians can adequately analyse student learning and success.

The module discussed here was developed by Diane Zerr and Tasha Maddison, librarians at Saskatchewan Polytechnic, and is used by students in the Adult Teaching and Learning program.

References
Arora, A., Evans, S., Gardner, C., Gulbrandsen, K., & Riley, J. E. (2015). Strategies for success: Using formative assessment to build skills and community in the blended classroom. In S. Koç, X. Liu & P. Wachira (Eds.), Assessment in online and blended learning environments (pp. 235-251), [EBSCOhost eBook Collection]. Retrieved from https://ezproxy.saskpolytech.ca/login?url=https://search.ebscohost.com/login.aspx?direct=true&db=nlebk&AN=971840&site=ehost-live&scope=site

Lockhart, J. (2015). Measuring the application of information literacy skills after completion of a certificate in information literacy. South African Journal of Libraries and Information Science, 81(2), 19-25. doi:10.7553/81-2-1567

[Table: the rubric used in the assessment]

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

When Research isn’t Counted

by Nicole Eva
University of Lethbridge

As the last 6 months of my 2-year reporting period wind down, and as the same amount of time remains until the start of my study leave, I have been reflecting on the research I’ve done in the recent past. It’s been an unusually high period of service for me – for 2016/2017 and 2017/2018 I was chair of our faculty association’s Gender, Equity, and Diversity Committee, during which time I conducted an extensive literature review on the potential biases in Student Evaluations of Teaching (statement can be found here; annotated literature review here), and I searched the literature for examples of faculty perception surveys to lead the creation of such a survey at our institution. This past year I’ve served as past chair on that committee, during which time a few of us have been involved in a deep qualitative analysis of those survey results. I am also chairing the President’s Committee on Diversity, Equity, and Inclusion this year, for which we conducted surveys, held campus consultation sessions, and interviewed local experts for their thoughts. It’s been enlightening and valuable work, but there’s a lot of it. And a lot of the analysis is among the most rigorous research I’ve been involved with to date. While it will pay off in great experience for future research I might take on, it won’t be ‘counted’ in the traditional sense in terms of publication output. The survey work of both groups is highly confidential, and while reports are being produced, they won’t be published in a peer-reviewed journal. The same goes for the Teaching Evaluations work; while both the statement and the annotated literature review are published in our Institutional Repository, again they aren’t ‘published’ in the traditional sense.
I am fortunate that I did produce one other article last year which should be published this year, and some prior work which finally came out in the last couple of years, so it’s not like I have nothing for publications; but still, there has been SO much time, effort, and actual rigorous research done regarding these projects I hate to ignore them as research output.

Another element: I’ve been pulled into writing a grant which supports some of our President’s committee recommendations. But again, while normally grant applications would be counted under Research, in this case it’s more Service (which of course carries limited value for evaluation purposes, and none at all for promotion). Yet from my perspective, having limited experience applying for external grants, the experience is invaluable. But that value is quite invisible. I’ve also gained a lot in terms of the people I’ve worked with, relationships developed, and institutional knowledge gained. But again, intangible.

It made me curious: what ‘counts’ as research? If you’re doing research for committee work that results in internal documents, does it still ‘count’? If it’s highly confidential and you can never publish the results because you didn’t clear ethics for that purpose, does it ‘count’?

These are my thoughts as I was faced with writing this blog. What has changed since my last blog post, in terms of actual research effort? Well, quite a bit. And yet it looks like nothing at all.

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

[Editor’s note: I encourage reader comments on this issue. Brain-Work is hosted by the University of Saskatchewan and there is a problem with the comments that cannot be resolved. If you try to comment on this or any blog post and you get a “forbidden to comment” error message, please send your comment to virginia.wilson@usask.ca and I will post the comment on your behalf and alert the author. I apologize for this annoying problem.]

Getting Loud in the Archives

by Kristin Lee, Tisch Library, Tufts University

I got to spend last week in the Archives of The Ringling Museum, home of records related to circus and the Ringling Brothers. In addition to escaping chilly Boston for sunny Sarasota, FL, I also got to meet up with an amazing, brilliant group of women who are also doing research in various areas of circus and sideshow. We all met last summer at the Circus Historical Society (CHS) Convention in Baraboo, WI, and decided that a joint trip would be a way to get some research done and save some money.

I went into this trip without a well-defined research question. My interest is in collecting and pulling together data about the circus to create a foundation that other researchers can use for their work. I want the ledgers, routes, and receipts; basically, anything that comes in the form of a table. I have some parameters around my research (Illinois State University’s Milner Library’s Special Collections has contracts and permits for 1914, so that has become a focus), but mostly I just want to turn these facts into data that a computer can do something with. For my early forays into circus I used the resources on the CHS website, so it was nice to see the beautiful, flowing handwriting in the ledgers and the details on the route cards that I didn’t see in the transcribed routes.

The Archives at the Ringling are in the Tibbals Learning Center, which is also the home of circus exhibits and The Howard Bros. Circus, a miniature circus created by Howard Tibbals. When our group arrived at the Archives on the first of our three research days, we were greeted by the staff there with warm handshakes and enthusiasm. As someone who has a day job assisting researchers, it was a little overwhelming to be treated like I was a real researcher in my own right (cue impostor syndrome). Everyone was accommodating and helpful but also, I think more importantly, as excited about our research as we were. Files, journals, and scrapbooks were passed around, read through, and immediately replaced with some new treasure. Requests were revised on the fly and I think we all got something special that we didn’t even know existed. I marvel at my colleagues who work in archives and can find things in their collections that answer questions you didn’t even know you had.

One of the great joys of this trip was being able to share in everyone’s delight as they came across information that they hadn’t previously known and documents that filled in pieces of the puzzles from centuries past that give us a picture of the people who inhabited the early American circus. Everyone helped each other and pointed out materials that we thought might help the others. Squeals of delight and gasps of astonishment were common, and everyone (including several of the people who worked at the archives) would gather around to discuss the item in question. When handwriting was unclear (P. T. Barnum had especially terrible penmanship) the letter was passed around to get more opinions. No one shushed us or gave us dirty looks, and for all our discussion I think we got some good research done.

I feel like I was especially fortunate to be there and get a better understanding of how my work can help other researchers and where my research fits in the broader field. I will probably never write a book, but I will make some cool maps, data visualizations, and tables that will provide access to facts about the circus that will be important building blocks in other projects. Even though I spend my days advocating that research data is a valid research product, it has taken me a while to recognize that my own work counts too. There is a lot of work involved in creating clean datasets, and I tend to dismiss that because this is research that I do “for fun”. Working alone, I sometimes worry that I am the only one in the world who will care about things like where the Sells-Floto Circus was playing in 1914, but getting the chance to talk circus for a week with other people who are as excited about their areas as I am about mine reminded me that I am definitely not alone and that what I do has value.

I have never been particularly good at solitary pursuits, so this format of research and exploration suited me very well. If you can find a group of people to storm an archive with I highly recommend it.

Thanks to my circus ladies for a great week: Betsy Golden Kellem; Amelia Osterud; Shannon Scott; and Kat Vecchio.

Enormous thanks to all the people at the Ringling Archives for your generosity and patience: Jennifer Lemmer Posey, Tibbals Curator of Circus; Heidi Connor, Archivist; Peggy Williams, Education Outreach Manager at Feld Entertainment.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Musings on Mandatory Copyright Training

by Gina Brander, Saskatchewan Polytechnic Library

The ongoing Access Copyright lawsuit against York University and anticipated amendments to the Copyright Act have made many institutions increasingly vigilant about copyright compliance. Some have responded by urging or requiring faculty to submit course materials for copyright review. Others have ramped up copyright training and education among faculty and staff.

With these approaches in mind, how can we ensure that faculty use, or even know about, the copyright services and training available to them? Instructors select materials and design their courses with varying levels of autonomy depending on their school, department and/or program. As a result, messaging isn’t always effectively communicated from the top down to everyone who needs to hear it, such as non-faculty and part-time instructors (Zerkee, 2017). And while mandatory copyright reviews of course materials ensure compliance, not all institutions have the resources to perform these reviews.

Mandatory copyright training is an alternative approach that has been cautiously explored. A recent survey of Canadian universities found that only 13.6% of respondents’ institutions required instructors to undertake copyright training or education (Zerkee, 2017). Canadian colleges and polytechnics appear to be following a similar track. A number of Ontario colleges, in partnership with the Heads of Libraries and Learning Resources (HLLR), collaboratively developed online copyright education modules, the first of which launched in 2013 (Copyright Literacy Ontario Colleges, n.d.). Yet few of the participating colleges have made these modules required training (Buckley, Muller, Peters, & Shannon, n.d.).

It would seem that mandatory copyright training is the exception rather than the rule in Canadian post-secondary institutions. There are many reasons for this—lack of resources, other institutional priorities, difficulties enforcing non-legislated materials, service culture, etc. For copyright offices firmly planted in the library, enforced education of any kind may feel counterintuitive.

Yet still I must ask—what is the best way to ensure that faculty and staff are equipped with the knowledge and tools they need to make informed, deliberate copyright choices? Perhaps some institutions will find more success and buy-in by promoting rather than enforcing copyright training. Even so, making basic training a requirement of employment is a surefire way for an institution to send the unequivocal message (both outwardly and inwardly) that copyright compliance is everyone’s responsibility. And that’s an important message.

References
Buckley, P., Muller, J., Peters, J., & Shannon, M. (n.d.). Copyright literacy in Ontario colleges [PowerPoint presentation]. Retrieved from https://copyrightliteracy.wordpress.com/about/

Copyright Literacy Ontario Colleges. (n.d.). Implementing. Retrieved from https://copyrightliteracy.wordpress.com/implementing/

Zerkee, J. (2017). Approaches to copyright education for faculty in Canada. Partnership: The Canadian Journal of Library and Information Practice and Research, 11(2), 1-28. https://doi.org/10.21083/partnership.v11i2.3794

An Early Valentine for Journal Editors

by Selinda Berg
Leddy Library, University of Windsor

Currently, as a member of CAPAL’s Research and Scholarship Committee, I am working as part of an editorial team building a special issue of the Canadian Journal of Academic Librarianship on research and scholarship in academic libraries. While I have so enjoyed learning about the array of scholarly interests related to the topic, the most significant thing I have learned is about the incredible and relentless work of being a journal editor. I have a long list of respected colleagues who have taken on this role. However, my foray into editorship brought me a new and profound admiration for editors.

The work that journal editors do was captured in Lori Kloda’s C-EBLIP blog post from May 2016. I have since learned that the list is modest in its description of the work of the editor. I have long recognized that there was a lot of work behind the scenes for journal publications. However, the amount of invisible labour involved in editorial work truly is astounding. And perhaps even harder to capture is the work and effort required to be balanced, sensitive, and patient. Soliciting reviewers, tracking down reviews, mediating conflicting reviews, considering papers for rejection, and balancing the voice of the author with one’s own are only a few of the tasks that editors must consciously and carefully engage in. Editors recognize that the works submitted to them are those of respected colleagues who have made a personal and professional investment in their writings, and they treat them as such.

This post is intended to be a cheer for journal editors in our profession. They are supporting, encouraging, and facilitating research in our field. They are investing their time and efforts towards building our scholarly platforms. As such, with this valentine to the journal editors, I endeavor to be a better author and a better peer reviewer.

A Valentine for the journal editors…
Roses are red,
Violets are blue,
I have a better understanding,
Of all that you do.
So I promise to do better,
And proofread to the letter,
I will keep to the deadlines,
And read more closely the guidelines.
You read my articles with care,
And made them the best I could possibly share.

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Research in teams & groups…when it works!

by Jaclyn McLean
Electronic Resources Librarian
University of Saskatchewan

Collaborations can be hard. Successful collaborations are rare (IMHO). I’ve been on teams of different shapes and sizes, and for different purposes since I became a librarian a decade(!) ago. Since I joined the USask Library five years ago, I’ve been lucky to have both the time and the opportunity to do some formalized learning about leadership and team development. I can look specifically to the Library Leadership Development Program (LLDP) and two posts from this blog as a turning point in the way I work in, and set expectations for, collaborative teams.

I could do a bunch of further research into the topic (and I have; see below for some sources I’ve consulted), but I thought I’d rather share my experiences:

  • Take time to plan early in the project: What are everyone’s expectations for timelines and deliverables? What are your goals for the project? If you want to publish an article, is there an outlet in mind? Who will be lead author? Are there roles each member will play on the team (e.g., taking notes at meetings, booking meeting times/places)?
  • Talk about how you like to work: What makes you nutty? How do you measure success? How about others on the team? Where are your common values, and where are the potential conflicts? Identifying them early makes it easier to talk about them later—remember how I can think more clearly if we meet in the mornings? remember how I like to take detailed notes?—rather than having to bring up these preferences in the heat of the moment.
  • Communicate: Talk to each other often and keep good notes. Keep track of decisions about methodology or changes along the way and check in with each other throughout the project to build trust with your collaborators.
  • Admit when you’re going to miss a deadline: do this before the deadline comes. Be understanding when another team member needs some flexibility on timelines too. We’re all busy, and shit happens.
  • It doesn’t have to be all business, all the time: being able to talk about other projects, or things in your life outside the research team, not only lets your team members know when you will have reduced bandwidth (e.g., your cat is sick, or you’re going on vacation), but also builds relationships. Working on a team can’t be all about the working—it’s got to be about the team too.

I’ve always been a “get down to business” kind of person when it comes to work. It’s taken some hard lessons for me to remember to prioritize the more social elements of teamwork. They used to seem like a waste of time, time that could be spent getting the work done! I have now learned that making the time to build a foundation with your team and talking about how you want to work before you start doing the work is invaluable.

My apologies to anyone who was on a team with me before I realized this—I probably cut you off, or stifled your ideas, or rushed ahead with the task at hand without considering what you needed from the collaboration. Let’s be honest, I probably still do that sometimes. But I’m getting better 😊.

Further reading (if you only have time for one, start with the first):

Shneiderman, B. (2016, April 6). The advantages of doing research in teams. Inside Higher Ed. Retrieved December 21, 2018, from https://www.insidehighered.com/advice/2016/04/06/advantages-doing-research-teams-essay

Dunn, B. (2018). Leading a productive research group. University of Oxford. Retrieved December 21, 2018, from https://www.ox.ac.uk/research/support-researchers/principal-investigators/principal-investigations-blog-pis/leading-productive-research-group?wssl=1

Lee, T., & Mitchell, T. (2011). Working in research teams: Lessons from personal experiences. Management and Organization Review, 7(3), 461–469. https://doi.org/10.1111/j.1740-8784.2011.00224.x

McEwan, D., Ruissen, G., Eys, M., Zumbo, B., & Beauchamp, M. (2017). The effectiveness of teamwork training on teamwork behaviors and team performance: A systematic review and meta-analysis of controlled interventions. PLOS ONE, 12(1), e0169604. https://doi.org/10.1371/journal.pone.0169604

Other excellently informed posts on the topic from this blog:

https://words.usask.ca/ceblipblog/2017/08/22/research-groups-and-the-gift-of-spaciousness/

https://words.usask.ca/ceblipblog/2016/10/18/considering-collaborations/

https://words.usask.ca/ceblipblog/2016/07/05/a-book-editing-collaboration/

https://words.usask.ca/ceblipblog/2015/09/01/co-authoring2/

https://words.usask.ca/ceblipblog/2015/06/09/collaborating-for-research-experiences-and-lessons-learnt/

https://words.usask.ca/ceblipblog/2015/04/21/co-authoring-shared-work-%E2%89%A0-less-work/

https://words.usask.ca/ceblipblog/2014/08/19/to-boldly-go-the-research-collaboration/

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.