“I miss math…”: Strengths & Comfort Zones When Choosing Research Methods

by Laura Newton Miller, on sabbatical from Carleton University

I have the great fortune to be on a one-year sabbatical. I love to learn, and I’ve moved out of my comfort zone by doing more qualitative research. I am interpreting a lot of open-ended comments from many interesting people, and have gone from being overwhelmed to kind-of/sort-of comfortable amid the mounds of data I’ve collected. I really do appreciate and love the learning.

So, a little story: In late spring, I was helping my 11-year-old son with his homework to find the surface area of triangular prisms. After watching some YouTube videos, we eventually started working through a practice sheet until he finally got the hang of it. While working on some problems myself in order to help him understand, I had a bit of an epiphany: I miss math.
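As an aside, the calculation we were practicing is easy to sketch in a few lines of Python. This is purely illustrative (the function name and the 3-4-5 example are mine, not from the homework): the surface area of a triangular prism is twice the area of the triangular base plus the perimeter of the base times the prism’s length, with Heron’s formula giving the base area from the three side lengths.

```python
import math

def triangular_prism_surface_area(a, b, c, length):
    """Surface area = two triangular bases + three rectangular sides.

    a, b, c are the side lengths of the triangular base;
    length is the distance between the two bases.
    """
    s = (a + b + c) / 2  # semi-perimeter of the base
    base_area = math.sqrt(s * (s - a) * (s - b) * (s - c))  # Heron's formula
    return 2 * base_area + (a + b + c) * length

# A 3-4-5 right-triangle base (area 6, perimeter 12) with length 10:
# 2*6 + 12*10 = 132
print(triangular_prism_surface_area(3, 4, 5, 10))  # → 132.0
```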

You see, in “real life” I’m an assessment librarian. This started as mainly collections assessment and eventually broadened to also include service and space. If anyone ever thought that they would like to become a librarian to avoid math, they had best not work in collections, administration, or assessment. I do math all the time in my job. Does it drive me crazy sometimes? Yep. But I like it; I’ve always been pretty good at it.

For the most part, my research so far this year does not include much math. And that’s ok; it doesn’t work for what I’m trying to do at the moment. I have been stretching out of my comfort zone, working my way through learning new skills. I guess this is nothing new; I get out of my comfort zone a lot in my regular job too (e.g., I never knew I’d use Excel so much). With learning any new skill, there are overwhelming moments, the “what have I gotten myself into” kinds of moments. They are happening less and less now, but I sometimes find myself comparing this sabbatical to my last one in 2010. At that time, I was just getting used to the idea of doing research at all. One of the things I did was a bibliographic study on graduate biology theses at Carleton University (shameless plug here: http://www.istl.org/11-winter/refereed3.html). There was a lot of math involved. It was a very new process for me and I’m sure I had my doubts at the time, but I also remember saying out loud, “I LOVE this”. Not that I’m NOT loving what I’m doing now…I’ve certainly had my “ooh” moments…I just find it more…difficult, maybe?

I love Selinda Berg’s blog post (https://words.usask.ca/ceblipblog/2016/03/22/capacity-not-competencies/) focusing on capacities for research, not just research competencies. I have to keep reminding myself that this is a learning process. I’m definitely growing as a researcher. I remember being part of the Librarians’ Research Institute (2014) (http://www.carl-abrc.ca/strengthening-capacity/workshops-and-training/librarians-research-institute/). Although I can’t find it in my notes (and I still refer to them 🙂 ), I do remember us talking about choosing research methods to answer your questions: understanding the advantages and disadvantages of choosing quantitative, qualitative, or critical/theoretical methods. In the end, though, someone said you do have to feel comfortable with your choice of research method. As an example, if you are a complete introvert, you have to ask yourself if you really want to conduct focus groups or interviews. Just how much do you want to get out of your comfort zone?

I’m happy to be out of my comfort zone, but I have also learned that when I’m looking at future ways to answer my research questions, I need to remember the strengths and skills that I do have. I purposely did not say “weaknesses” because those are opportunities to learn. I do think that librarians can sometimes be a little “judgey” about some methods (e.g., “not another survey”), and this is not helpful.

Ultimately, choose the research method that is right for your research question, and when weighing the pros and cons of each method, remember your strengths and the learning curve that might be involved. Next time (if it makes sense to do so), I know that I won’t necessarily leave math out of the equation (bad pun intended).

This article gives the views of the author and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

More Data Please! Research Methods, Libraries, and Geospatial Data Catalogs: C-EBLIP Journal Club, August 25, 2016

by Kristin Bogdan
Engineering and GIS Librarian
University Library, University of Saskatchewan

Article: Kollen, C., Dietz, C., Suh, J., & Lee, A. (2013). Geospatial Data Catalogs: Approaches by Academic Libraries. Journal of Map & Geography Libraries, 9(3), 276-295.

I was excited to have the opportunity to kick-off the C-EBLIP Journal Club after a brief summer hiatus with a topic that is close to my heart – geospatial data! This article was great in the context of C-EBLIP Journal Club because it introduced the basics of geospatial data catalogs and the services around them, and provided an opportunity to look at the methods used by the authors as part of an ALA Map and Geospatial Information Round Table (MAGIRT) subcommittee research project.

Most of the group was unfamiliar with geospatial data catalogs, so the introductory material provided a good base for further discussion. There was good material about the breadth of the different metadata standards involved and how they are applied at the different levels of data detail. There was also good discussion about the importance of collaboration and the OpenGeoportal consortium in developing geospatial data catalogs.

One of the key themes of our discussion was that we would have liked to see more information about the research design and more data. We would have liked to see mention of the ethics process that the authors went through before carrying out their study. Our group had questions about the process the subcommittee used to choose their sample, as it seemed fairly limited. The authors acknowledge that this was not meant “to create a complete inventory” (p. 281), but it seemed like it could have been broader to be more representative. We would also have liked to see the questions that were asked during the interviews and more of the qualitative data from the interviews themselves. It was unclear how structured the conversations with the catalog managers were and how the data presented in the tables and the conclusions were derived. The information presented in the tables was not consistently organized and seemed like it would have been more useful in the context of the interview. The pie chart on page 283 showing the “Approaches to Developing Geospatial Data Catalogs” was less useful than a table of the same information would have been, as there are only 5 pie sections to represent 11 data points.

In light of the questions around the data collection, the leap from the tables of responses to the recommendations seemed fairly large. In general, the lists of questions to consider when determining how to implement a geospatial data catalog were helpful, but they aren’t really recommendations. The cases the authors present provide some ideas about the staffing and skills required to create a geospatial catalog, but they are vague. The first case seemed unnecessary, as it states: “The library has determined that there is a clear need to provide access to the library’s spatial data and other spatial data needed by the library’s customers. However, the library does not have the technology, staffing, or funding needed to develop a spatial data catalog.” It would have been nice to see some alternative solutions for libraries without the ability to create a full-blown data catalog, such as practices that could be put in place to start building its foundation: specific cataloging practices or file-type considerations, for example.

Our discussion concluded with reflection on how carefully and critically we read articles in our general research lives. One of the great things about Journal Club is that we have the opportunity to really interrogate and dissect what we are reading. The ensuing discussion is an opportunity to see the article from many different perspectives. This makes us better researchers in two ways: we are trained to more thoroughly evaluate the things we read and we take that into consideration in the research that we produce.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.

Describing a phenomenon through experience: C-EBLIP Journal Club, February 16, 2016

by Carolyn Pytlyk
Research Facilitator
University Library, University of Saskatchewan

Article:
Forster, M. (2015). Phenomenography: A methodology for information literacy research. Journal of Librarianship and Information Science, 1–10. doi: 10.1177/0961000614566481.

Way back in October at the C-EBLIP Fall Symposium, Margy MacMillan from Mount Royal University talked about phenomenography as a new methodology for conducting research on information literacy. Phenomenography is “the empirical study of the limited number of qualitatively different ways in which various phenomena in, and aspects of, the world around us are experienced” (Marton, quoted in Forster, p. 1). Margy’s enthusiasm and excitement for phenomenography certainly piqued my interest. In my conversations with library researchers, research methodology is often the topic of discussion when planning research projects, applying for grants, or developing research budgets, and it can sometimes be a stumbling block for researchers. As such, when my turn came to convene Journal Club, I thought Forster’s review article might be a good opportunity to explore phenomenography as a viable methodology for library researchers.

The majority of our conversation revolved around whether or not phenomenography is indeed a useful new methodology for conducting library research. For the most part, we agreed that, from the perspective of the review article, it seemed a rather complex and involved methodology. However, in the end, we couldn’t really tell without actually following a researcher through the process. The review article was a fairly good introduction to and overview of phenomenography, but we agreed that to really understand its complexity, we would need to read research employing phenomenography as a methodology to see how it works and whether it is really as complex as it seems at the outset.

While presenting an intriguing possible methodological alternative, this article left us with many more questions than answers. Some of the questions stemming from it include:
1. Is this a useful methodology? Would library researchers use it?
2. Is it a methodology about how we think?
3. How do researchers unobtrusively interview people without priming the participants? Is it even possible?
4. Is it a complex methodology, or does it just seem like it?
5. What are the steps involved? How does someone actually do it?
6. Could it be appropriate for library research other than information literacy (like usability or librarians as researchers)?
7. What other methodologies are out there in other disciplines that are possible for library research?
8. What sorts of learning/training would researchers need before undertaking phenomenography?
9. Do researchers have to be experienced interviewers to use it?

Still, despite the numerous unanswered questions, we were not deterred and agreed that we are all keen to learn more about phenomenography and its process.

Finally, we rounded out our conversation with a discussion of the value of review articles, although not all of us are keen on them. (Don’t worry; I won’t name names.) Forster’s article not only opened our eyes to phenomenography as a new methodology; it also opened our eyes to the value of review articles in providing overviews of new methodologies, for us both as consumers and as producers of knowledge.

This article gives the views of the author(s) and not necessarily the views of the Centre for Evidence Based Library and Information Practice or the University Library, University of Saskatchewan.