Category: Data Management

Strongest Signal That I’ve Seen

I’ve added a privacy policy page to my site. I am not going to pretend that I fully understand the European Union’s General Data Protection Regulation (GDPR), mostly because I haven’t bothered to read it. Not to say I am completely oblivious: I’ve mostly read the emails I’ve received from every website I’ve ever submitted my email address to, which has been hilarious, especially when I got emails from websites I had forgotten about.

Technically speaking, the GDPR shouldn’t affect this site, but I made sure to include a privacy policy on our Eurovision blog, especially because we use Google Analytics and AdSense there. WordPress provided a template, but to be frank, I found it a little too bloated for my liking.

https://twitter.com/Fiona_Bradley/status/983783222399119361

Inspired by the above tweet from Fiona Bradley, I wrote up a brief description of how our site uses cookies and what Google and Automattic (WordPress’s developer) do with them. I know I have a tendency to be verbose, so I tried to keep it as simple as possible, then linked out to the Silicon Valley jargon repositories for more information.

That done, I could sit back and pop some popcorn to enjoy while reading about non-European websites caught off guard when the regulation went into effect on May 25.

We Want Qualitative Information

I am quite desperate for a way to manage and analyse qualitative data.

My main job task at work is managing quantitative data. If it’s a number we can plug into a spreadsheet, I know how to collect it. There is plenty of room for improvement and, to be honest, I am cursing the fact that I didn’t pay more attention in my Access class in grad school. But generally speaking, I have a good handle on the types of quantitative data we collect, the flaws and the needed improvements to our processes and to our datasets, and the ways we can use that data to tell our story to whoever asks.

But we have access to all sorts of qualitative data as well. For example:

  • Reports from foreign service officers;
  • Cables from posts;
  • Reports submitted to an internal reporting system;
  • Posts in the community forum on our website;
  • Posts in our Facebook group;
  • Newsletters and other activity reports that are either sent to us directly or shared via our email lists.

All of this is spread out over a variety of disconnected locations. We have troves of information stashed in mattresses all over our house and seemingly no good way to tie it all together.

So that is my holy quest: to research and compile ideas for managing qualitative data and figure out how best to implement those ideas. I am told that it better not just result in a word cloud.
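One direction this quest could take, sketched very loosely: pull items from each of those disconnected sources into a common record format, then tag them by simple keyword matching so they can at least be counted and filtered together (a step or two beyond a word cloud). Everything below is hypothetical — the source labels, sample texts, and keywords are made up for illustration.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Record:
    source: str  # hypothetical label, e.g. "community-forum", "newsletter"
    text: str

# Hypothetical sample records standing in for the real feeds
records = [
    Record("community-forum", "Held a training session on the eResources platform"),
    Record("facebook-group", "Students used the databases for a research project"),
    Record("newsletter", "Training session for new staff went well"),
]

KEYWORDS = ["training", "research", "databases"]

def tag(record: Record) -> set[str]:
    """Return the keywords that appear in a record's text."""
    lowered = record.text.lower()
    return {kw for kw in KEYWORDS if kw in lowered}

# Count how often each theme appears across all sources combined
theme_counts = Counter(kw for r in records for kw in tag(r))
print(theme_counts)  # e.g. training appears twice across two different sources
```

The point of the common `Record` shape is that once forum posts, cables, and newsletters all look the same to the code, any analysis you write works across all of them at once.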

Things That Might Go Click With Me

I have to admit that my mind is a bit of a swirl right now. It’s hard to explain why yet, but perhaps I am dropping hints below. Or maybe I am just summarizing three interesting pieces of reading that I recently pored over.

Tanya Golash-Boza. “Writing a Literature Review: Six Steps to Get You from Start to Finish.” Wiley Exchanges (2015).

Golash-Boza lists steps to help dissertation writers organize and write their literature reviews. The post summarizes the literature review section of Sonja Foss and William Waters’ book Destination Dissertation: A Traveler’s Guide to a Done Dissertation.

Katherine Brown and Chris Hensman (editors). “Data-Driven Public Diplomacy: Progress Towards Measuring the Impact of Public Diplomacy and International Broadcasting Activities” (PDF). U.S. Advisory Commission on Public Diplomacy: Reports (2014).

The U.S. Advisory Commission on Public Diplomacy report names five areas of public diplomacy evaluation at the U.S. Department of State and the Broadcasting Board of Governors that need to be changed and makes recommendations on how to modernize and systemize evaluation in those areas.

Kylie Hutchinson. “The Demise of the Lengthy Report.” AEA365 (2017).

Hutchinson describes how “layering” (her term) data into different types of reporting formats, such as newsletters, infographics, and presentations, can expand the value of data, extend its reach, and replace an ominous final report. The post is a bit of a promo for Hutchinson’s new book, but it also succinctly encourages you to think about different ways to present your data to different audiences.

Where Do I Begin?

Here is as clear a mission statement for this blog as you are going to get:

This blog will explore how librarians use data to understand audiences and improve services.

What does that specifically mean? I don’t know. We’ll see.

To start, I’ve written brief summaries of a few articles and reports that have been influential in my work over the past six months or so.

Pip Christie. “Are Librarians Becoming Data Analysts?” Vable (2016).

Christie points to potential opportunities librarians have to market themselves as data analysts and discusses ways to use data analysis tools to one’s advantage. Useful for identifying new ways librarians can put their skills to work.

Mahesh Kelkar et al. “Data-Driven Decision Making in Government.” Deloitte Center for Government Insights (2016).

A team from the Deloitte Center for Government Insights describes best practices in U.S. government data-driven decision-making and outlines techniques government offices can use to improve their analytics capabilities. It’s a really nice report that offers a thoughtful road map for building program evaluation capacity.

Bill Pardi. “If You Want to Be Creative, Don’t Be Data Driven.” Microsoft Design (2017).

Pardi discusses potential problems with being too reliant on data to drive decision-making. Reminiscent of Darrell Huff’s “How to Lie with Statistics.”

On a Need to Know Basis

One of the assumptions that librarians make is that the value of our services is self-evident. We see the value every day, but we take it for granted that others (ones who have a say over library budgets, for example) see things our way.

How do we justify our existence to our stakeholders? Do we present numbers from usage reports? Do we present anecdotal evidence? What can we use to incontrovertibly prove our worth to those who hold our fates in their hands?

As mentioned in Measurement Points, my office has been trying to determine which of the usage data we’ve collected best shows the success of our electronic resources platform. The conclusion we came to is that, while we have a lot of good data, we need to make some tweaks to improve its quality.

Certainly, it should be easy enough to enhance the data we get from the systems we built ourselves. We can also use code to improve the type of data that we collect using Google Analytics.

We don’t have a say in the types of data we get from our resource providers, but we can request (and have been requesting) additional reports beyond the canned ones in the client services modules. Some of these reports exist; some have become enhancement requests. Honestly, not all of the reports we’ve asked for would be useful to other clients, so it’s probably easier for our reps to just run them when we ask.

There are other types of analysis that we can do to enhance the data we already have. We’re going to do cost analysis reports from a couple of different angles to figure out where we’re getting bang for our buck and where we can get more bang.
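As a sketch of what one angle of that cost analysis might look like: cost per use is just the subscription cost divided by a usage count, and comparing it across resources shows where the bang for the buck actually is. The resource names and figures below are entirely made up.

```python
# Hypothetical subscription costs and annual usage counts
resources = {
    "Database A": {"cost": 12000.0, "uses": 48000},
    "Database B": {"cost": 9000.0, "uses": 1500},
    "Database C": {"cost": 3000.0, "uses": 6000},
}

def cost_per_use(cost: float, uses: int) -> float:
    """What each recorded use actually costs; infinite if a resource is never used."""
    return cost / uses if uses else float("inf")

# List resources from worst value to best value
for name, r in sorted(resources.items(),
                      key=lambda kv: cost_per_use(kv[1]["cost"], kv[1]["uses"]),
                      reverse=True):
    print(f"{name}: ${cost_per_use(r['cost'], r['uses']):.2f} per use")
```

Running the same numbers against different usage measures (downloads versus searches versus sessions, say) is one way to look at the costs “from a couple of different angles,” since a resource can look cheap by one measure and expensive by another.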

We also need to do more analysis of qualitative data, but we have to figure out how to get it. There’s a central system where spaces in the field report on their work and activities, and just about all of the items related to our platform that have been submitted are about training sessions. I’m glad to see the training, but as we enter the fifth year of this project, I really want to see more about how users are using the resources.

That is going to be our biggest challenge moving forward, because, while the hard numbers can be impressive, they don’t measure the impact the resources have had on our users. And that is really what is going to show the value of our services.

© 2018 Chris Zammarelli
