Writing for Scholarly Publication: an Editor’s Perspective

by David Fox, Librarian Emeritus
University of Saskatchewan

Not every manuscript submitted to a scholarly journal is a well-constructed, cogently written, polished work of prose. As Editor-in-Chief of Partnership: the Canadian Journal of Library and Information Practice and Research from 2011 to 2014, I evaluated more than 150 manuscripts of varying quality, and all of them required some editing or revision. This includes some of my own pieces for Partnership. I’m painfully aware that, as a writer, I’m just as inclined to slip-ups and omissions as anyone else. We sometimes seem to be blind to our own mistakes. That’s why we need editors. It takes many passes and many different eyeballs on a page to make it as clean as it should be.

When it comes to editing manuscripts, you can’t make a silk purse out of a sow’s ear, but with a bit of effort you can usually produce a pretty serviceable pigskin wallet – and that’s often good enough for publication. Manuscripts from first-time authors, authors with a limited command of English, and authors not familiar with the conventions of academic writing need more than the average amount of editorial work, but I’m proud to say that at Partnership we rarely rejected a manuscript due to deficiencies in the writing alone. If the author had something interesting and important to say, we worked with that author to make the article publishable. Faulty methodology is another matter. Editors can fix bad writing, but we can’t fix bad research.

Below are some tips on writing for submission to a scholarly journal based on my experience reading manuscripts at Partnership. A lot of this advice may seem obvious to readers of this blog, but many of the papers I reviewed overlooked some of these points. Journals are typically juggling a number of manuscripts simultaneously under tight timelines. Anything that interrupts or slows down work on a manuscript may delay its publication. Attention to the following suggestions may expedite acceptance and processing of a submission.

What to write and where to submit?
• Pick an appropriate topic. To justify publication, a manuscript must have something new and interesting to say to the target readership of the journal. At Partnership in recent years, the most frequently cited articles have dealt with the adaptation of new technologies, particularly social media applications, to library functions; development of new services, including services to specific communities or user populations; new scholarship and publishing models; and new approaches to traditional library competencies.
• Pick an appropriate journal for your topic. What audience are you trying to reach? Is your topic of wide, general interest or narrow and specialized? Read the “purpose and scope” notes associated with potential journals to determine whether your submission will be a good fit for the readership.
• Where possible (and it’s almost always possible), choose an open access journal. Remember that every time you publish behind a paywall, a kitten dies!
• Work must be original, not previously published, and not simultaneously submitted to another publication. If you are considering submitting something to a scholarly journal, don’t pre-post it to an open access repository or conference Web site. If an identical or similar version of your paper can be located via a Google search, then it has essentially already been published and will probably be rejected.

Pay attention to publisher’s guidelines
Journal publishers tend to be fairly strict about adherence to style guidelines, in order to promote consistency of presentation from article to article.
• Pay attention to your publication’s instructions for authors regarding manuscript length, spacing, etc.
• Follow your publication’s guidelines for bibliographic style and citation format. If the publisher’s instructions call for APA style and you submit your paper in MLA, Chicago, or some other format, it will likely be sent back to you for revision, and you will lose time.

Writing
• Write with the reader in mind. Avoid jargon, colloquialisms, and unexplained acronyms (unless you’re sure the audience will understand the reference).
• Adhere to the conventions for scholarly writing:

- Cite your sources. Every fact, idea, opinion, or quotation borrowed from another author needs to be documented (MLA 165). The editor cannot do this for you.
- Write clear, precise, simple, and straightforward prose.
- Use formal English (What is Academic English?). Avoid conversational language, e.g., “great”. “Fun” is not an adjective!
- Write in the third person. Avoid the use of personal pronouns: I, my, you, your. Refer to yourself as “the author”, “this researcher”, etc.
- Avoid using contractions: won’t, doesn’t.
- Exercise caution when expressing opinions and outcomes: use “may” rather than “is” unless you are completely certain of your claims.
- Unlike in creative writing, the passive voice is often appropriate in academic prose.

• Master basic punctuation and grammar. Poor grammar and punctuation, although fixable, convey a negative impression to the editor and will require more time and effort from the copyeditor. While reading manuscripts at Partnership, I was astonished to find that many librarian authors do not seem to have a good grasp of the rudiments of punctuation. In future blog posts I will discuss the most common punctuation mistakes and how to avoid them. It’s important for librarian authors to master these basic skills. Insistence on following standard punctuation rules is not just pedantry. Good punctuation helps to convey meaning, to avoid confusion, and to allow a manuscript to be read more quickly and efficiently.
• Avoid word repetition. Use a thesaurus!

Prior to submitting your manuscript…
• While working towards a submission deadline, make sure to leave time for quality control.
• Have one or more trusted colleagues read your paper for clarity and comprehension before submission. This is especially advisable if English is not your first language. If your closest colleagues don’t understand what you’re trying to say, then the average reader certainly won’t.
• Have another colleague with a good eye for detail proofread your work for spelling accuracy, typos, and word omissions. Sometimes it’s difficult to see one’s own mistakes.
• Leave time for revisions based on your colleagues’ suggestions.

Revision
• Assume that you will be asked to revise your manuscript. Editors rarely accept a manuscript without asking for changes, and peer-reviewers almost always suggest revisions. Don’t be discouraged by constructive criticism.
• Do take the comments of peer reviewers seriously, as peer review usually results in substantial improvements to a manuscript. However, reviewers of the same paper can sometimes have conflicting opinions, and some advice they give may be off the mark (Soule 14). A good editor will evaluate the fairness of reviews and decide which comments to share with the author, or recommend which comments the author should particularly focus on. Remember that ultimately you are responsible for the integrity and coherence of your own work. Make those recommended changes that seem appropriate and sensible, and let the editor decide whether your revisions are acceptable.

A writer’s best friends are a thesaurus, style guide, and punctuation and grammar manuals. Keep them within easy reach on your desktop (either physical or virtual) and consult them frequently!

Works Cited
MLA Style Manual and Guide to Scholarly Writing. 3rd ed. New York: Modern Language Association of America, 2008. Print.

Soule, Daniel P. J., Lucy Whiteley, and Shona McIntosh, eds. Writing for Scholarly Journals: Publishing in the Arts, Humanities and Social Sciences. Glasgow: eSharp, 2007. Web. 6 March 2015. http://www.gla.ac.uk/media/media_41223_en.pdf

What is Academic English? The Open University, 2015. Web. 6 March 2015.
http://www2.open.ac.uk/students/skillsforstudy/what-is-academic-english.php

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Librarians as Practitioner-Researchers: Constructive Concept

by Kristin Hoffmann
Associate Librarian, University of Western Ontario

Librarians as practitioner-researchers: constructive concept or limiting label? Last summer, my colleague Selinda Berg and I had an invigorating conversation about this question. We presented our reflections at the 2014 C-EBLIP Fall Symposium, and this post is my part of that presentation. Selinda’s part will be published here later this spring.

We want to share our conversation about librarians as practitioner-researchers because we see a link between researcher identity and research culture. Academic librarians, particularly in Canada, are in the process of establishing and shaping a research culture for ourselves. Part of establishing a research culture is having a clear sense of who we are as researchers and what it means to us to be researchers. We hope that our conversation can spark similar conversations for others.

Peter Jarvis developed the concept of practitioner-researcher in his 1999 book The practitioner-researcher: developing theory from practice. Rebecca Watson-Boone (2000) and Virginia Wilson (2013) have examined the concept specifically for librarianship.

I want to share two reasons why I believe that “practitioner-researcher” is a constructive concept for librarians.

1. We are both practitioners and researchers and so we need an identity that encompasses both of those roles, rather than trying to manage or embody two distinct identities.

The practitioner-researcher concept is a truer and better representation of who we are and what we do as academic librarians than either practitioner or researcher on their own. We often talk about the challenge of how to “fit” research into our workdays, and I think part of that is because we are separating our researcher selves from our practitioner selves and trying to create a separate place for each of those identities. Embracing the identity of practitioner-researcher can help us truly affirm the importance of both roles and the interplay between them.

2. Embracing the practitioner-researcher identity can bring us to a fuller, and unique, understanding of both practice and research.

Previous discussions of practitioner-researchers emphasize the practitioner role first, and research is seen as something that informs practice: we are practitioners who also happen to be researchers; therefore, we are practitioner-researchers.

However, our knowledge and understanding of our practice can also inform and enlighten our research. This may be a much more powerful and constructive concept for librarians. To illustrate this, I offer an example from my own research.

In a recent project, I worked with the sociological theory of strategic action fields. Very briefly, this is a theory that provides a framework for thinking about stability and change in social institutions. Since libraries are a social institution, applying this theory to librarianship can help us come to a deeper understanding of change in librarianship. Why do some things change in library-land, why do other things never seem to change even though we wish they would, and what might it take for those changes to happen?

My research looked at librarian-vendor relations and why there seems to be so much enthusiasm for librarians to stand up to vendors and yet so little apparent meaningful change in this aspect of collections. The theory of fields was the tool I used to analyze this situation in an objective, systematic way.

It was through the process of applying the theory of fields to this collections-related example that I really came to see myself as a practitioner-researcher. My research with this theory was deeply informed and influenced by my practice as a librarian. Because I’m an “insider”, intimately familiar with librarianship, I could see aspects of the theory that a so-called “pure” researcher couldn’t – I had unique insight from practice that informed my research.

The sociologists who developed the theory of fields came to it as researchers; their book (Fligstein and McAdam 2012) makes no mention of practice or how their ideas might shape or be shaped by real-life situations. Librarians who talk about implementing change management might have approached my topic as practitioners. I was seeing it as a practitioner-researcher.

My practice directly informed my approach to this research project. And, yes, my research also informed my practice: having a rigorous and systematic theoretical framework to apply to my practice gave me new insight that has influenced how I understand my profession.

In summary, therefore, practitioner-research is a constructive concept because:
• embracing the practitioner-researcher identity can bring us to a fuller understanding of and a unique perspective on both practice and research; and
• we are both practitioners and researchers and need an identity that encompasses both of those roles.

References

Fligstein, N. and McAdam, D. (2012). A theory of fields. Oxford: Oxford University Press.

Jarvis, P. (1999). The practitioner-researcher: developing theory from practice. San Francisco: Jossey-Bass.

Watson-Boone, R. (2000). Academic librarians as practitioner-researchers. The Journal of Academic Librarianship, 26(2), 85-93. DOI:10.1016/S0099-1333(99)00144-5

Wilson, V. (2013). Formalized curiosity: reflecting on the librarian practitioner-researcher. Evidence Based Library and Information Practice, 8(1), 111-117. http://ejournals.library.ualberta.ca/index.php/EBLIP/article/view/18901

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Serendipity, Algorithms, and Managing Down the Collective Print Collection

by Frank Winter, Librarian Emeritus
University of Saskatchewan

My first contribution to Brain-Work introduced the conceit of the Gone-Away World, and explored the mindset exhibited by Traditionalist librarians who believe that the traditional rationale for developing a local collection and the traditional collection-based services of reference, information literacy and so on are still in existence (Winter, 2014). The core belief of this mindset is the overarching importance of the local print collection. I asked, “[i]f the local collection of printed scholarly books by-and-large just Goes-Away as a useful and important service, what choice will we have as university libraries and university librarians but to adapt? Evidence-based solutions will be increasingly necessary for good decisions in this environment.” In the paragraphs below I explore in a bit more detail the use of serendipity by Traditionalist librarians as one specific example of resistance against any evidence-based reduction of the local collection.

The concept of the collective print collection (where print = books, rather than journals, which have long since migrated almost completely to digital format) has been a very significant development in the options available to research libraries as they consider how they might allocate their resources to meet the existing, evolving, and competing demands of their users within the overarching expectations of what the library must contribute to the mission of the university. Specifically, this concept offers options to “manage down” local print collections so that some of the substantial resources required to maintain them can be reallocated to address other priorities.

Originating with Lorcan Dempsey, the concept has been developed by OCLC as it investigates how and why some library services might now be more efficiently and effectively organized and delivered at supra-institutional or network levels (Dempsey, 2013). As part of its research, OCLC began to more systematically investigate the characteristics of the print holdings of its members with respect to overlap, uniqueness, and geographical distribution. What has emerged is a deeper and more nuanced understanding of how non-local print book collections could be organized and managed, including the importance of proximity as it relates to delivery options. OCLC has identified as many as twelve North American mega-regions as well as other, smaller regions. One of the smaller regions has been named Canada Extra-Regional, which includes libraries in the prairie provinces (Dempsey and Malpas, 2015). Careful reading of the research reveals many caveats, such as the awareness that actions that might work in larger mega-regions might not work as well in smaller ones such as the Canadian prairies, which have smaller aggregate print collections distributed across relatively larger geographic areas.[1]

Proposals to manage down the local collection – even ones to shift materials to an on-campus repository let alone into a regional print storage facility – typically encounter resistance from local users and librarians based on any number of factors. One factor that is often invoked is that removing materials from the open stacks will reduce opportunities for serendipitous discovery by users. Preserving the opportunity for serendipity is advanced by some librarians and some users as a positive value to be privileged when making decisions with respect to how and where collections are housed.

Although I once heard a very senior and accomplished professor tartly dismiss serendipity as “the tool of the lazy scholar,” we all have our serendipity stories. The most amusing serendipity story I ever heard was told by Dan Cohen, Founding Executive Director of the Digital Public Library of America (DPLA). A colleague went into the stacks and reached up to the top shelf to pull off his desired book. The book beside it dropped on his head and led to a lifelong program of research (Cohen, 2014). Well, who among us would not want to preserve the conditions for such fruitful accidents? In the same blog posting, Cohen described some interesting work at the DPLA to determine whether they could “engineer serendipity” into its user interface although, regrettably, without any capacity for virtual book bonking.

It has always struck me how contingent serendipity is, depending as it does on a chain of events: a physical book being published, being collected, being on the shelf in a particular spot, and a user actually going into the stacks and coming across the item, as well as the innumerable, unknowable factors taking place inside the mind of the user.[2] The deliberate incorporation into the local collection of hundreds of thousands of e-books, let alone hundreds of thousands of other digital objects, over the past decade seems to me only to further attenuate serendipity in the stacks as a factor to be given much weight.[3]

Librarians mask the complexity that underlies their operations so that their users can get their hands on the desired object in as friction-free a manner as possible. The invocation of serendipity when making decisions about the disposition of local collections in research libraries conceals and even, in some cases, denies the decades- and even centuries-long application of a long chain of information labour and expertise that brought about these local collections in the first place. Those books did not just happen to be there. The librarian-related actions that got them there, such as collection building, describing, organizing, and preserving, have been termed “epistemological engineering” (Guédon, 2001).

It is but a short step from engineering to algorithms if one considers the algorithm to be a codification of the innumerable operations involved in such engineering. Currently there is considerable discussion concerning the algorithms that permeate our lives, from Google’s 200+ “signals,” through Facebook, Amazon, and eHarmony, to the relevance ranking in online catalogs and discovery systems. I have found the work of Tarleton Gillespie especially helpful here, with his careful unpacking of the many parts of algorithmic culture (Gillespie, 2014). And Ian Bogost has written recently that, “Concepts like ‘algorithm’ have become sloppy shorthands, slang terms for the act of mistaking multipart complex systems for simple, singular ones. Of treating computation theologically rather than scientifically or culturally” (Bogost, 2015). For him, it is imperative to recognize all the other inputs (even preceding data, and many of them human) that are required in order for an algorithm to produce its answer. One of these multipart complex systems is the library collection and the services that surround it.

I have contrasted serendipity, a somewhat Latinish neologism originating in the 18th century, with the Arabic-derived algorithm as a way of foregrounding the information labour that is often concealed in the order of books on the shelves and their discovery by users. For me, however, the word fluke, a pithy 19th century, vaguely Anglo-Saxonish neologism, more accurately describes what happens in cases of serendipity in the stacks of the local print book collection. There are so many variables that determine whether a user stumbles across something relevant that they are almost impossible to identify. It is far less impressive to defend flukiness than to invoke serendipity as an operating principle when deciding how to manage down print collections.

____________________________________________________

[1] It is important to acknowledge the caution with which university librarians have approached any “managing down” of the local physical collection as a result of the existence of digital surrogates. It took at least ten years for most librarians to become comfortable that appropriate institutional arrangements were in place before acting to discard some local print runs of e-journals. It will take at least as long and probably much longer for equivalent actions to take place with respect to print books. And although some commentators invoke initiatives such as Google Books, the Internet Archive, and Project Gutenberg as replacements and supplements for print book collections, these resources are clearly not library collections in any sense recognized by librarians and not acceptable as long term solutions.

[2] Frederick Kilgour’s study examining the sources of user failure incurred in obtaining the desired item in the stacks included factors such as a book never being acquired, being on order, awaiting cataloguing and processing, being in circulation, awaiting reshelving, or being misshelved, as well as user error (Kilgour, 1989). One could add any number of other contingencies, such as being allowed into the stacks in the first place, books that disappeared into in-library graduate study carrels and faculty offices, and so on. Kilgour’s interest was estimating the impact that an automated library information system would have on these factors. It would be interesting to update Kilgour’s work and the literature of user failure generally to determine what effect the incorporation of e-books into the collection would have on those factors.

[3] See, for example, a recent article by David Woolwine (2014). Woolwine writes from the perspective of supporting undergraduate learning in a small to medium size academic library and includes references to studies involving serendipity. I wonder about the relevance of these earlier studies in an age of digital abundance, scholarly e-books, and changing user practices.

References

Bogost, Ian. 2015. Cathedral of computation. The Atlantic. January 15, 2015. http://www.theatlantic.com/technology/archive/2015/01/the-cathedral-of-computation/384300/

Cohen, Dan. 2014. Planning for Serendipity. http://dp.la/info/2014/02/07/planning-for-serendipity/.

Dempsey, Lorcan. 2013. The Emergence of the Collective Collection: Analyzing Aggregate Print Library Holdings. http://www.oclc.org/content/dam/research/publications/library/2013/2013-09intro.pdf. In: Dempsey, Lorcan, Brian Lavoie, Constance Malpas, Lynn Silipigni Connaway, Roger C. Schonfeld, JD Shipengrover, and Günter Waibel. 2013. Understanding the Collective Collection: Towards a System-wide Perspective on Library Print Collections. Dublin, Ohio: OCLC Research. http://www.oclc.org/content/dam/research/publications/library/2013/2013-09.pdf.

Dempsey, Lorcan and Malpas, Constance. 2015. Evolving Collection Directions. Collection Development Strategies in an Evolving Marketplace: An ALCTS Symposium. Chicago, 30 January 2015. http://www.slideshare.net/lisld/alctssymposium.

Gillespie, Tarleton. 2014. The relevance of algorithms, in Media Technologies: Essays on Communication, Materiality, and Society, ed. Tarleton Gillespie, Pablo Boczkowski and Kirsten Foot. Cambridge, MA: MIT Press.

Guédon, Jean-Claude. 2001. In Oldenburg’s Long Shadow: Librarians, Research Scientists, Publishers, and the Control of Scientific Publishing. Washington, D.C., Association of Research Libraries. http://www.arl.org/storage/documents/publications/in-oldenburgs-long-shadow.pdf.

Kilgour, Frederick G. 1989. Toward 100 percent availability. Library Journal, 114, 50-53.

Winter, Frank. 2014. Traditionalists, Progressives and the Gone-Away World. Brain-Work. http://words.usask.ca/ceblipblog/2014/10/14/traditionalistsprogressives-and-the-gone-away-world/.

Woolwine, David E. 2014. Collection development in the humanities and social sciences in a transitional age: Deaccession of print items. Library Philosophy and Practice. http://digitalcommons.unl.edu/libphilprac/1173/.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

EBLIP + IL = SoTL

by Margy MacMillan
Mount Royal University Library

I practiced SoTL for at least 5 years in blissful ignorance of its existence. You too may be a SoTList or have SoTList leanings and not even know it; it may well be time to explore the Scholarship of Teaching and Learning.

The research projects I’m currently involved in occur at the intersection of information literacy (IL) and SoTL, and like all intersections it’s an exciting, slightly unsettling place to be. There’s a lot of movement in many directions, a lot of choices on where to go next, and some things just have to wait until there’s a break in the traffic to get going. Standing at this intersection I’ve had some time to think about the links between SoTL and evidence based library and information practice (EBLIP)…
[Image: Hanoi, 2013. By D. MacMillan]

EBLIP and SoTL

SoTL might be described as evidence-based practice in teaching. It is focused, like EBLIP, on gathering evidence to understand different situations and/or the impact of different interventions. It uses a range of methodologies and works both within and across discipline boundaries. While it is most obviously akin to evidence-based research in IL, branches of SoTL concerned with technology or institutional cultures may resonate with other library researchers. Much like EBLIP conferences, where those who work with bioinformatics data discover common ground with public librarians working with citizen science initiatives, SoTL fosters conversations between people who might not otherwise meet. Academics working in SoTL don’t always get much support for their research at their own institutions (sound familiar?) or within their own disciplines, and they value conferences both for finding kindred spirits and for the interdisciplinarity that brings fresh ideas and approaches. Since arriving in this welcoming SoTLsphere, I have enjoyed exploring further – attending conferences, getting involved in SoTL on my campus, and currently supporting the SoTL work of colleagues through Mount Royal’s Institute for SoTL.

3 ways SoTL has helped me EBLIP

Methodologies – SoTL work rests on applying disciplinary research methods to understanding teaching and learning. I’ve encountered a really broad range of methods in SoTL work that also apply to EBLIP.

Understanding Threshold Concepts (TCs) – While I first heard of TCs at a library conference, this way of looking at learning is a major focus in SoTL, and I have been able to bring knowledge from SoTL folks into discussions around the new TC-informed Framework for IL.

Focus on building a community – Some SoTLers are involved with building communities on campuses by expanding relationships, providing support, and developing policy. There are many useful insights here for library initiatives and I have benefited from becoming part of a very supportive, cross disciplinary group of scholars.

3 ways EBLIP has helped me SoTL

Better understanding of diverse literatures and how to search them – This has helped me enter a new field, but also allows me to contribute back to the SoTL community on campus as I am aware of resources and tools for searching outside their disciplines.

Longer experience with evaluating the usefulness of small steps and interventions – IL is often assessed at micro levels: the use of a particular tool, or the effectiveness of a teaching strategy, often within a single class. We have developed a number of strategies to examine teaching and learning at this atomized level, which can be useful for instructors accustomed to thinking in course-sized chunks.

Understanding how dissemination works – Work like Cara Bradley’s is informing my work with SoTLers in identifying venues for publication, and my next project on studying dissemination patterns in SoTL.

Interest in SoTL among librarians is growing, as evidenced by increasing numbers at conferences and a colleague in the UK who is writing a book about SoTL and librarians (many thanks to Emma Coonan for a great conversation that clarified many of these thoughts, and if you aren’t reading her blog, The Mongoose Librarian, on a regular basis… well, you should be!). Explore a little, dip into their literature, maybe go to a conference or talk to the teaching and learning folks on your campus… they can use our help and we might be able to borrow a few things from them. Maybe we’re overdue for a change.

3 good reads about SoTL

Felten, P. (2013). Principles of good practice in SoTL. Teaching and Learning Inquiry: The ISSOTL Journal, 1(1), 121-125. http://muse.jhu.edu/journals/teaching_and_learning_inquiry__the_issotl_journal/v001/1.1.felten.html

Huber, Mary Taylor and Sherwyn P. Morreale, eds. Disciplinary Styles in the Scholarship of Teaching and Learning: Exploring Common Ground. Menlo Park, CA: Carnegie Foundation, 2002.

Hutchings, P. (2010). The scholarship of teaching and learning: From idea to integration. New Directions for Teaching and Learning, 2010(123), 63-72. http://fresnostate.edu/academics/csalt/documents/Hutchings2010.pdf

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Reflecting on Our Biases

by Denise Koufogiannakis
University of Alberta Libraries

Reflection is an important part of evidence based practice. The principle has been embraced by those who aim to practice in an evidence-based manner because being an evidence-based practitioner is not just about the evidence itself, but about the process of how and why we use that evidence. To date, reflection has generally been inserted into the evidence based process towards the end of the cycle, prompting one to look reflectively back on what was done in order to reach a decision. One reflects on such questions as: what evidence did I find and use; what evidence was lacking; what happened during the decision making process; did the chosen implementation work; what did I personally learn; what would I change next time? Reflection has also largely been considered an individual and private act that a professional undertakes for self-improvement.

I’d like to propose that we begin the process of reflection earlier in the process, specifically as it pertains to the biases we have in relation to a particular question or problem at hand. And, since library decisions are frequently made in groups, we should make reflection on our biases a shared act with colleagues who are also engaged in finding a solution to the problem. As we strive to incorporate evidence into our decision making, it is important to be aware of the biases that we all bring to finding, interpreting, weighing, and using evidence. We work in organizations, large or small, with others – we all have different perspectives, motivations, and desires. Decision making as part of a group is not easy! We need to be conscious of how the biases of each group member and the collective dynamic might influence the process. Through reflection and openness, we may be able to limit our biases and therefore make better decisions.

In practical terms, what this means is being upfront with our colleagues, and where a group has been tasked to make a decision or put forward recommendations on a specific new initiative or review of an existing area, that we have conversations about our biases from the very start. This requires that each person reflect on how they are considering and approaching the problem or question, what their initial reaction was, what they hope will be the outcome, and any other preconceived notions they have related to the issue. It also means that collectively, the group discusses and acknowledges the various biases, and consciously moves forward with the intent to address all biases so that they do not adversely affect the final decision. Doing this may be a bit risky for each individual, but it creates a climate in which trust can be built, and the group can proceed with an open and transparent approach to their decision making. It means that in all likelihood, more sources of evidence will be sought and considered, potential solutions will not be dismissed out of hand, and a sound approach will be chosen.

Here are some common biases that, when we are unaware of them, can adversely affect our decision making:
• overconfidence bias – when people think they know more than they actually do
• confirmation bias – when people gather information selectively in order to confirm what they already think
• framing bias – when people make different decisions depending upon how information is presented
• representative bias – when people rely on stereotypes and predict outcomes based on past situations
• anchoring bias – when people rely too heavily on one piece of information
(Robbins, 2005; Greenberg and Baron, 2008)

For more on biases within the workplace, I recommend this brief overview by Rykrsmith (2013), who provides a list of five biases in decision making based on the research of Lovallo and Sibony (2010). While drawn from business, the advice applies soundly to decision making within libraries and provides ways for us to spot these biases and overcome them.

Recognizing your own biases or those within your group is important. Here are some questions to ask yourself and your group, in order to identify possible biases and discuss them.
• What is my natural inclination with respect to this problem? Do I already think I know the answer for what is best?
• Am I picking and choosing evidence that only suits my predetermined notion?
• If I have passionate feelings about this topic, why is that? Is there an important ethical or professional principle that needs to be considered within the decision?
• Are there people with opposing views whom I find it difficult to discuss the problem with, and is this clouding my judgement?
• Am I reacting out of my own motivations or desires? Is a potential change going to affect me personally, and am I therefore afraid of it?
• Am I easily influenced by one particular piece of evidence? Why might that be? Why did that piece of evidence impress me?
• Do I stand to gain or lose based on the outcome of this decision? Is this potential change influencing me?
• Have I gathered the types of evidence that would help, or just what was easy? Have all possibilities been considered? Have all perspectives been represented?
• Is the evidence sound or just based on anecdote and sentiment?

Once a bias has been brought to light, it is much easier to deal with and proceed with a higher level of consciousness. Such reflection is sure to bring us closer to better decision making.

References

Greenberg, J., & Baron, R. A. (2008). Behavior in organizations (9th ed.). Upper Saddle River, NJ: Pearson Prentice Hall.

Lovallo, D., & Sibony, O. (2010). The case for behavioral strategy. McKinsey Quarterly. Retrieved 15 Feb. 2015 from http://www.i3-invest.com/uploads/pdf_file/c850526fa6f572915c3645199db25297.pdf

Robbins, S. P. (2005). Essentials of organisational behavior (8th ed.). Upper Saddle River, NJ: Pearson Education, Inc.

Rykrsmith, E. (2013). 5 biases in decision making – Part 2. The Fast Track Blog. Retrieved 15 Feb. 2015 from http://quickbase.intuit.com/blog/2013/06/07/5-biases-in-decision-making-part-2/

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

A Style Manual for the Rest of Us

by Christine Neilson
Information Specialist, St. Michael’s Hospital
Toronto, Ontario

I’ve decided I like writing. But the hard part about writing is making sure that it’s done well, and I’ve read enough library literature that was not well written that I get concerned about my work falling into that category, too. Before Christmas some colleagues told me about Steven Pinker, a cognitive psychologist/linguist, and his recent book The Sense of Style: The Thinking Person’s Guide to Writing in the 21st Century. They told me that Pinker’s books are very good and even entertaining. I thought, “The man writes about writing – how entertaining could it be?”, but I was curious. When I flipped through the book, I noticed that it included several cartoons: I took this to be a good sign. In the prologue Pinker wrote “By replacing dogma about usage with reason and evidence, I hope not just to avoid giving ham-fisted advice but to make the advice that I do give easier to remember than a list of dos and don’ts” (p. 6). If that doesn’t speak to an EBLIPer, I don’t know what does. I was sold.

It turns out the book was indeed entertaining. More importantly, it was easy to read and helped me pinpoint a few areas that I need to work on. When all is said and done, I took two things away from Pinker’s book. First: there are rules to follow, including but not limited to grammar and punctuation, but they are not an end in themselves. They are tools to get you closer to the goal that any writer should have in mind: composing clear prose that engages a reader in a way that makes the topic easy to understand. In fact, some of the rules we were taught in school are incorrect, and the application of some others leaves room for the writer’s discretion; learning this made me feel a bit better about my writing (maybe it’s not so bad after all!). But what I liked best about Pinker’s book was the use of concrete examples of good and bad writing, and of how the bad writing might be improved. It reminded me of the reality TV program “What Not to Wear”, where the hosts set out to improve participants’ wardrobes by showing them not only which elements work and which don’t, but also by giving the reasoning behind the advice so participants can continue to improve their style after the show is over. Pinker’s examples were drawn from a variety of sources, from academic papers to advice columns, and they illustrate that good (or bad!) writing is not limited to a specific area.

Pinker’s book also drove home for me that writing is an art form. There are rules and techniques to learn, but just like being able to follow a recipe doesn’t make you a master chef, knowing the rules does not necessarily make you a great writer. Any art form requires creativity, time, and effort. You have to develop a feel for what you’re doing that comes from experience: learning when to follow the rules and when to throw them away, and learning from others’ example. This may not be very encouraging for those of us who are not naturally inclined to be great authors and want a quick fix, but it shouldn’t come as a surprise to anyone who has ever tried to become proficient at anything, whether it’s writing, karate, mathematics, or Ukrainian dancing.

So how can we move our writing along the spectrum of “bad” to “good”? By practicing and reflecting on the good writing we come across. Reading Pinker’s book can’t hurt either. In fact, I believe I’ll read it again.

This article gives the views of the author and not necessarily the views of St. Michael’s Hospital, the Centre for Evidence Based Library and Information Practice, or the University Library, University of Saskatchewan.

Can I get a copy of your slides?: Sharing Conference Content Accurately and Reliably

by Selinda Berg
Schulich School of Medicine – Windsor Program
Leddy Library, University of Windsor

Librarians rely heavily on conferences as venues in which to share ideas, innovations, developments, and scholarly research. While conferences offer great opportunities to share information, it can sometimes be challenging when audience and community members want to make use of, build on, or even delve deeper into the content presented. Is there a way we can improve upon the ways that the information shared at conferences is disseminated and applied to our practice?

To the typical conference presenter, the conference process has become rather routine: submit a 250-300 word abstract of the planned work (sometimes four, five, or eight months in advance); receive an acceptance notice (hopefully); continue to work out the ideas presented in the abstract; and finally present the paper before both new and familiar colleagues. This is often followed by another predictable occurrence: two or three days after the conference, an email arrives from an audience member requesting the slides from the presentation. (I think it is also becoming increasingly common for conference organizers to be burdened with trying to retrieve slides from presenters to share with delegates and on public websites.) For me, the request for my slides is both exhilarating (“Wow! The content resonated enough for someone to follow up!”) and unsettling (“My slides? Oh no!”). This unease does not stem from any reluctance to share with others. My concern is that I am not sure how clearly my ideas are articulated, or how accurately my results are presented, through my slides alone.

Like many of us, I have embraced commonly accepted guidelines on effective PowerPoint slides:
• Minimal text: key words used only as a means to emphasize and highlight points for the audience
• An image: A pleasing aesthetic which complements and re-enforces the content presented
• A quotation: a passage that is central to the presentation but that may not even align with my ideas; it may instead serve as a point of reference to argue against
• Data: Tables of data for which I provide robust explanation

Because my slides in no way provide or capture the complexity of ideas that I have presented, I worry about providing my slides without my interpretation of what is on those slides. People have taken steps to share the wholeness of their ideas by posting conference scripts up on research blogs and other open sites. I really like and respect this movement, however, it is not my general practice to write the kind of script that would be appropriate to share in such a way.

The reality is that the far-reaching use of PowerPoint has long been questioned and criticized. I don’t agree with Tufte, one of the most often cited critics of PowerPoint, who compared the presentation software to a drug that is ‘‘making us stupid, degrading the quality and credibility of our communication’’ and ‘‘turning us into bores’’ (2003, p. 24). But I do agree with Doumont (2005), who offers a counter-argument: PowerPoint can indeed be valuable, but it is first and foremost a companion for oral presentations, which typically have a different purpose than written documents. Slides are designed to be “viewed while the presenter is speaking, not read in silence like written documents.” I very much see PowerPoint as a companion for the audience while I am speaking, not as a standalone document to be read.

So in the end, I am left wondering if there is value in exploring a more formal and consistent process for sharing conference content so it is more trustworthy and usable. Thinking about this challenge, I have thrown around a few ideas and come up with one possible solution. But I still wonder what other ideas are out there.

In addition to the 250-300 word abstract submitted for acceptance four to seven months in advance, perhaps conferences could also request or require that presenters provide a 500-700 word extended abstract (approximately one page, single spaced) either immediately before or after the conference, to be posted as a kind of modified conference proceedings of the sort common in other fields. Some library and LIS conferences, including but not limited to EBLIP, ALISE, and CAIS, have embraced a longer abstract, but I am most commonly asked to provide 250 words, which becomes the document of record for my presentation. Collecting the extended abstract during or after the conference allows the presenter to ensure the ideas captured are in their final form, and also allows the information to be shared accurately, in the tone and manner that the researcher/presenter intended. Libraries’ increasing role in managing institutional repository software, which often includes conference modules, makes such an initiative both simple and accessible for library conferences. It also aligns with our values of making information and research more open and accessible.

For me, a 600-word extended abstract seems much more reliable and robust than a set of visually pleasing slides or a brief abstract created months before. I think such a gesture would help us build on the important ideas, research, and evidence presented at conferences, and of course allow for better citation of these ideas. Here’s a concrete example of this need: an article reviewer asked me to cite a conference presentation directly related to my topic, one that I had not attended, though the slides were available online. I very much wanted to acknowledge the ideas, but I felt uncomfortable citing something of which I had such slight knowledge and only a visual glimpse. I would have felt much more comfortable had I been able to view a conference record composed as a written, not visual, document, created with the intention of sharing the ideas and interpretations as fully and clearly as possible.

I have been to so many incredible conferences where the presentations have been innovative, robust, and valuable; I worry that the ideas of these scholars are not as accessible, usable, and reliable as they deserve to be.

How do you think we can ensure the valuable knowledge presented at our professional conferences can be shared accurately and reliably?

Doumont, J. L. (2005). The cognitive style of PowerPoint: Slides are not all evil. Technical communication, 52(1), 64-70.
Tufte, E. R. (2003). The cognitive style of PowerPoint. Cheshire, CT: Graphics Press.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Library Assessment Data Management ≠ Post-It Notes

by Kathleen Reed
Assessment and Data Librarian, Vancouver Island University

Over the last few years, data librarians have become increasingly focused on data management planning as major funders and journals insist that researchers have data management plans (DMPs) in place. A DMP is a document that outlines how data will be cared for throughout its life cycle. Lately I’ve spent a lot of time thinking about how my data service portfolio dovetails nicely with library assessment activities. A lot of discussion in the library evidence-based practice community is about obtaining and analyzing data and stats, with little emphasis on stewardship of that information. Recently, I gave a talk at an assessment workshop put on by the Council of Post-Secondary Library Directors where I reflected on data management for assessment activities. This post introduces some of the steps in working out a DMP. Please note that while DMPs usually refer to data, not statistics, in my world the ‘D’ stands for both.

Step 1: Take an Inventory
You can’t manage what you don’t know about! Spend some time identifying all your sources of evidence and what format they’re in. I like to group by themes – reference, instruction, e-resources, etc. While you’re doing this, it’s also helpful to reflect on whether the data you’re collecting is meeting your needs. Collecting something that you don’t need? Ditch it. Not getting the evidence you need? Figure out how to collect it. Also think about the format in which the data are coming in. Are you downloading a PDF that’s making your life miserable when what you need is a .csv file? See if that option is available.
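The first pass of such an inventory can even be scripted. Here is a minimal sketch (the directory path is hypothetical; point it at your own shared drive) that walks a folder tree and tallies files by extension, which makes it easy to spot formats that aren’t serving you:

```python
import os
from collections import Counter

def inventory(root):
    """Tally all files under `root` by file extension."""
    counts = Counter()
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            # Normalize extensions so "stats.CSV" and "stats.csv" group together
            ext = os.path.splitext(name)[1].lower() or "(no extension)"
            counts[ext] += 1
    return counts

# Example (hypothetical path):
# for ext, n in inventory("/shared/assessment").most_common():
#     print(ext, n)
```

A listing dominated by `.pdf` when what you actually analyze is tabular data is a good prompt to go looking for a `.csv` download option.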

Step 2: The ‘Hit by the Bus’ Rule (aka Documentation)
If you’re going to archive assessment data, you need to give the data some context. I like to think of this as the ‘hit by a bus’ rule. If a bus hits me tomorrow, will one of my colleagues be able to step into my job and carry on with minimal problems? What documentation is required by someone else to understand, collect, and use your data? Every single year when I’m working on stats for external bodies, I have to pull out my notes and see how I calculated various numbers in previous years. This is documentation that should be stored in a safe, yet accessible place.

Post-its don’t count as ‘documentation.’ Neither does a random sheet of paper in a towering stack on your desk. Photo by Wade Morgan

Step 3: Storage and Backup
You’ve figured out what evidence and accompanying documentation you need to archive. Now what are you actually going to do with it? For routine data and stats, at my institution we use a shared drive. Within the shared drive I’ve built a simple website that sits on top of all the individual data files; instead of having to scroll through hundreds of file names, users can just click links that are nicely divided by themes, years, vendors, etc. IT backs up the shared drive, as do I on an external hard drive. If your institution has access to Dataverse hosted on a Canadian server, this is a good option.
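A "simple website on top of the files" can be as little as a generated index page. The sketch below is one possible approach, not a description of my actual setup: it assumes the shared drive is organized into top-level theme folders and writes a plain `index.html` with links grouped by theme:

```python
import html
import os

def build_index(root, out_path):
    """Write a minimal HTML index linking to every file under `root`,
    grouped by its top-level (theme) folder."""
    sections = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        rel = os.path.relpath(dirpath, root)
        theme = rel.split(os.sep)[0] if rel != "." else "(top level)"
        for name in sorted(filenames):
            href = name if rel == "." else os.path.join(rel, name)
            sections.setdefault(theme, []).append(href)
    parts = ["<html><body><h1>Assessment data index</h1>"]
    for theme in sorted(sections):
        parts.append("<h2>%s</h2><ul>" % html.escape(theme))
        for href in sections[theme]:
            parts.append('<li><a href="%s">%s</a></li>'
                         % (html.escape(href), html.escape(href)))
        parts.append("</ul>")
    parts.append("</body></html>")
    with open(out_path, "w", encoding="utf-8") as f:
        f.write("\n".join(parts))
```

Re-running the script after adding files keeps the index current, which is easier to maintain than hand-edited links.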

Step 4: Preservation
For key documents, you might consider archiving them in a larger university archive. LibQUAL+ results, documentation, and raw data are currently being archived via my institution’s instance of DSpace.

Step 5: Sharing
I always felt like a hypocrite, imploring researchers to make their data open while I squirreled library data away on closed-access shared drives. Starting with LibQUAL+, this past year I’ve tried to make as much library data open as possible. This meant not just uploading the files to our DSpace, but also anonymizing the data to ensure no one was identifiable.
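Scripting the anonymization step makes it repeatable the next time the data are released. The column names below are hypothetical, and note that dropping obvious identifiers is only a first pass: real de-identification may also need to handle quasi-identifiers such as department or rank. A minimal sketch using only the standard library:

```python
import csv

def drop_columns(in_path, out_path, drop):
    """Copy a CSV file, omitting the named identifying columns."""
    with open(in_path, newline="", encoding="utf-8") as fin, \
         open(out_path, "w", newline="", encoding="utf-8") as fout:
        reader = csv.DictReader(fin)
        kept = [f for f in reader.fieldnames if f not in drop]
        writer = csv.DictWriter(fout, fieldnames=kept)
        writer.writeheader()
        for row in reader:
            writer.writerow({k: row[k] for k in kept})

# Example (hypothetical file and column names):
# drop_columns("libqual_raw.csv", "libqual_open.csv",
#              drop={"name", "email", "student_id"})
```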

If you’re going to share data with your university community and/or the general public, keep in mind that you’ll need to identify this right away when you’re designing your evidence-collection strategies. For example, if you’re doing a survey, participants need to be informed that their responses will be made available. Ethics boards will also want to know if you’re doing research with humans (monkeys too, but if you’ve got monkeys in your library you’ve got bigger problems than figuring out a DMP…)

Perhaps the most important aspect of sharing is setting context around stats and data that go into the wild. If you’re going to post information, make sure there’s a story around it to explain what viewers are seeing. For example, there’s a very good reason that most institutions score below expectations in the “Information Control” category on LibQUAL+ – there isn’t a library search tool that’s as good as Google, which is what our users expect. Adding some context that explains that the poor scores are part of a wider trend in libraries and why this trend is happening will help people understand it’s not that your library is necessarily doing a bad job compared to other libraries.

Want more info on data management planning? Here are a few good resources:

DMP Builder by the California Digital Library

VIU’s Data Management Guide

Research Data Management @ UBC

What are your thoughts about library assessment data and DMPs? Does your institution have a DMP for assessment data? If so, how does your institution keep assessment data and stats safe? Let’s keep the conversation going below in the comments, or contact me at kathleen.reed@viu.ca or on Twitter @kathleenreed

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

I can’t make bricks without clay: A Sherlock Holmesian approach to Research and EBLIP

by Marjorie Mitchell
Librarian, Learning and Research Services
UBC Okanagan Library

Sherlock Holmes was invoked in the inaugural post of Brain-Work, the C-EBLIP blog, and I would like to revisit Conan Doyle for the inspiration for this post. So, being a curious librarian, I searched “Sherlock Holmes and research” and came across the heading “What user researchers can learn from Sherlock Holmes.” The author, Dr. Philip Hodgson, took quotes from a variety of the Sherlock Holmes novels and laid out a five-step research process for investigators working in the user-experience field. As I read Dr. Hodgson’s article, it struck me that there was a strong kinship between the user-experience community and the library community, one that extends beyond the electronic world. I also believe Hodgson’s steps provide a reasonable starting point for novice evidence-based library and information practice (EBLIP) researchers to follow.

Step 1 – According to Hodgson, the first step is understanding the problem, or formulating the question. I would adapt it even further and suggest being curious is the very first step. If we’re not curious, we can’t identify what we want to know, or the question we hope to answer. Perhaps a mindset of being open to the discomfort of not knowing motivates researchers to embark on the adventure of inquiry. Once my curiosity has been aroused, I move to formulating a question. Personally, my question remains somewhat fluid as I begin my research because there are times I really don’t have enough information to formulate an answerable question at the beginning of my research.

Step 2 – Collecting the facts, or as I prefer to call it, gathering the evidence, follows. This is one of the juicy, tingly, exciting parts of research. Once I have a question, I think about what information will answer the question. Sometimes simply reading the literature will give me enough of an answer. At other times, I have to go further. Even just thinking about methods can send a shiver of excitement through me. Administering surveys, or conducting interviews, or running the reports from the ILS in hopes it will illuminate some arcane or novel library user behavior are all ways of collecting juicy evidence; it is exciting to see initial results come in and begin to decipher what the results are actually saying. Sometimes the results are too skimpy, or inconclusive, and the evidence gathering net needs to be cast again in a different spot for better results.

Step 3 – Hodgson suggests the next step should be developing a hypothesis to explain the facts you have gathered. This step, as much as or more than the others, requires our brain-work. Here we bring our prior knowledge to bear on the results and how they relate to the question. It is a time for acute critical thinking as we take the results of our evidence gathering and determine their meaning(s). Several possible meanings may arise at this stage. Hodgson implies it is important to remain open to the multiple meanings and work to understand the evidence gathered in preparation for the next step.

Step 4 – In this step, Hodgson is especially Holmesian. He suggests it is now time to eliminate the weaker hypotheses in order to come closer to a solution. The focus on user-experience research is especially strong here: specific, actionable solutions are being sought to the question identified in the first step. Here he recommends evaluating your evidence and eliminating the weaker in favor of the stronger. He is also cognizant of the need for solutions that are affordable and can be implemented in a given situation. While the whole of this step may not apply to all research, much of it will.

Step 5 – Implementation or action now takes its turn. Again, Hodgson is speaking directly to the user-experience audience here. However, implementation or action based on research may lead to a decision not to implement or act upon a suggestion. The strength lies in the process around reaching this decision. Questions were asked; evidence was gathered; analysis took place; judgment was applied. As Hodgson pointed out, this is a much better process than proceeding by intuition.

Finally, I would like to add a Step 6 to Hodgson’s list. In order to really know whether the action implemented had the desired effect, or an unintended effect, it is important to evaluate the results of the action or change. In the effort to publish results of research, timeliness is an issue. It is not often possible to have the luxury of the amount of time it would take to be able to measure an effect. However, even in those cases, I am interested in what type of evaluation might take place at a later date. Sometimes researchers will address their future evaluation plans, sometimes they don’t. Even if they aren’t being shared, I hope they are being considered.

This is a simple and elegant plan for research. In its simplicity, it glosses over many of the messy complications that arise when conducting research. That said, I hope this post encourages librarians to follow their curiosity down the path of research.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Open Data and EBLIP – How open are we?

by Pam Ryan
Director, Collections & Technology at Edmonton Public Library

When we talk about evidence based library and information practice (EBLIP), we’re most often talking about research – conducting our own or finding and integrating research and best-available evidence into our practice. Part of this continuum should also include working towards making our own library service and operations data openly available for analysis and re-use.

This isn’t out of line with current library initiatives. Academic libraries have long supported the open access movement, and for many, services around managing institutional research data are a current priority. Influenced by open government developments in their municipalities, public libraries are increasingly working to build open data literacy through programming, encouraging citizens to think critically about government services and to learn how to unlock the value of open data.

Why does open data matter for libraries? It aligns with our core values of access to information, sharing, openness, transparency, accountability, and stewardship. It supports our missions to provide information and data literacy, it can provide others with information to help us in our advocacy and value of libraries initiatives, and maybe most importantly, it can fuel research and initiatives we ourselves haven’t yet thought of.

My own place of work has a current business plan goal to: Develop an open data policy that includes how we will use and share our own data; participate in Edmonton’s Open Data community and support data literacy initiatives. We’ve begun to make progress in these areas by developing a statement on open data and collaborating with the City of Edmonton on public programs:

• Edmonton Public Library’s Statement on Open Data:
http://www.epl.ca/opendata

• EPL’s 2014 Open Data Day program:
http://www.epl.ca/odd2014

Has your library started a discussion about what your library’s approach to open data will be?

Further Reading:

Thompson, B. (2014). The open library and its enemies. Insights, 27(3), 229–232. DOI: http://dx.doi.org/10.1629/2048-7754.172

Data is Law / Civic Innovations: The Future is Open http://civic.io/2014/12/28/data-is-law/

pryan@epl.ca / Twitter: @pamryan

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.