Measures and Metrics: Domain: Reference Transactions

Update: The comment period for Version 1 closed on August 22, 2016. Version 2 will be released for comments in January 2017. Archivists and special collections librarians should direct further comments to Task Force co-chairs Amy Schindler (amycschindler@gmail.com, SAA) and Christian Dupont (christian.dupont@bc.edu, ACRL/RBMS).

 

Basic measure (“Reference Questions”)

Count the number of unique Reference Questions received from Users regardless of the method used to submit the request.

Guidelines for collection:

  • Count requests on different topics from the same individual as separate questions.

  • Do not include directional/general information questions, tours, or instruction sessions.

  • Do not count as a Reference Transaction a request from a User working in the Reading Room to use the next folder, box, item, etc. in a collection.

  • Do include an inquiry from a Reading Room User who asks a follow-up question.

  • Do not include multiple emails, social media interactions, or other conversations on the same question as separate questions.

  • Do include email, social media, or other replies with follow-up questions on a new topic as separate questions.

  • Some repositories may find it more practical to collect statistics for only a limited period of time rather than continuously. Academic libraries, for example, are sometimes asked to collect and report the average number of Reference Questions received in a typical week, with a typical week defined according to locally established criteria.

Application and examples:

  • Tally the number of unique Reference Questions.

  • If a User contacts the Repository via email with a Reference Question and then follows up with a clarifying or related question within a reasonable period of time, count this as a single Reference Question. If a User and staff exchange multiple emails related to the same research topic, the Repository may wish to record this as a single Reference Question, but with a higher “Complexity of Transaction.”

Advanced Measure (“Method”)

Categorize the method by which a Reference Question is received.

Guidelines for collection:

  • Count the methods by which Reference Questions are received: for example, in-person consultation, telephone, email, social media, website, letter, or other contact.

Application and examples:

  • Tally Reference Questions by the method by which they were received.

  • At the annual university strategic planning forum, the university archivist is seated with the director of study abroad and is asked to investigate the history of international programs at the institution. The archivist replies via email with a summary of information and links to digital material available online. This is counted as one Reference Question asked in person by an internal User for an administrative purpose.
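The tallying rule above can be sketched with a simple counter. This is an illustrative sketch only: the log structure and field names are hypothetical, not part of the standard.

```python
from collections import Counter

# Hypothetical log: one entry per unique Reference Question,
# recording only the method by which it was received.
questions = [
    {"method": "email"},
    {"method": "in-person"},
    {"method": "email"},
    {"method": "telephone"},
]

# Tally questions by receipt method.
by_method = Counter(q["method"] for q in questions)
print(by_method["email"])  # 2
```

Note that, per the example above, the method recorded is the one by which the question was *received*, regardless of how staff reply.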

Advanced Measure (“Time Spent”)

Track the time spent by staff on a Reference Transaction with a User.

Guidelines for collection:

  • Include time spent in in-person consultation, on the phone, responding to email, etc. as well as time staff spend investigating or conducting research as part of the Reference Transaction.

  • Establish a local policy to record either the actual time spent responding to Reference Questions or an estimated amount of time according to fixed intervals (e.g., 15- or 20-minute time blocks).

Application and examples:

  • Measure the time spent on each transaction. This will provide more detail for resource allocation and help delineate the level of assistance or consultation needed to meet Users’ needs.
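A fixed-interval recording policy of the kind described above might be implemented by rounding actual minutes up to the nearest block. This is a sketch under one possible local policy; whether to round up or to the nearest block is a local decision.

```python
import math

def round_to_block(minutes: float, block: int = 15) -> int:
    """Round actual time spent up to the nearest fixed interval,
    e.g. a 7-minute email reply is recorded as one 15-minute block."""
    return math.ceil(minutes / block) * block

print(round_to_block(7))   # 15
print(round_to_block(22))  # 30
```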

Advanced Measure (“Purpose of Transaction”)

Record the purpose of the Reference Transaction according to a defined rubric. The rubric may be institutionally defined or defined by another body, and may distinguish several different types of purposes.

Guidelines for collection:

  • Use a rubric appropriate to your customer base to determine the purpose of the transaction. A simple rubric could distinguish internal and external purposes. A more complex rubric could be aligned with specific research topics or archival functions.

Application and examples:

  • A Repository may wish to record whether the User was seeking information about a Repository’s services, the nature of the collection use that it involves, or the intended product or outcome of the consultation. If more than one categorization scheme is employed, care should be taken to avoid conflating statistics from different categories. For example, a tally of the number of genealogical queries should not be added to the number of requests for Reproductions.

  • Repository services: For example, the purpose of the transaction may be to request information, a Reproduction, or an Interlibrary Loan.

  • Nature of the use: For example, the purpose of the transaction may be internal administrative use, genealogical research, or research for a class.

  • Product: The purpose could also be phrased in terms of the end product: completing a class assignment, conducting research for a publication, or researching a company’s branding over time.

Advanced Measure (“Complexity of Transaction”)

Record the level of complexity of the transaction based on a predefined scale or category.

Guidelines for collection:

  • Predefine a scale or category to ensure consistency over time of the measure collected. An example of a predefined scale is the READ (Reference Effort Assessment Data) Scale. Examples of predefined categories include: ready reference, research assistance, and research consultation meeting.

  • Predefined scales or categories may be developed for specific Repository types such as government archives, business archives, academic archives, etc.

Application and examples:

  • To assist in creating a complexity scale unique to your repository, determine the types of materials used to respond to the question and the level of staff knowledge needed.

Recommended metrics

Total number of Reference Questions per day/week/month/year

  • Graphing the total number of Reference Transactions over a given period of time can reveal usage patterns. For instance, daily Reference Transactions might increase before an academic institution’s annual alumni weekend.

  • Comparing the total number of Reference Transactions per day/week/month for multiple years in succession can reveal fluctuations in usage levels and trends.
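As an illustrative sketch (the dates are invented), per-week and per-month totals can be derived from the dates on which questions were received:

```python
from collections import Counter
from datetime import date

# Hypothetical receipt dates, one per unique Reference Question.
question_dates = [date(2016, 3, 1), date(2016, 3, 2), date(2016, 3, 8)]

# Totals per ISO week number and per calendar month.
per_week = Counter(d.isocalendar()[1] for d in question_dates)
per_month = Counter((d.year, d.month) for d in question_dates)

print(per_month[(2016, 3)])  # 3
```

Grouping by (year, month) rather than month alone keeps multi-year comparisons from conflating, say, March 2016 with March 2017.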

Average number of minutes spent responding to Reference Questions

  • Divide the total number of minutes spent assisting Users by the total number of Reference Transactions.

  • Comparing the average length of time spent responding to Reference Questions may point to a need to review staffing allocations.
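The division above can be written out as follows, with a guard for reporting periods in which no transactions occurred (a sketch; the function name is illustrative):

```python
def average_minutes(total_minutes: float, total_transactions: int) -> float:
    """Total staff minutes divided by total Reference Transactions."""
    if total_transactions == 0:
        return 0.0  # avoid division by zero for a period with no activity
    return total_minutes / total_transactions

print(average_minutes(1250, 50))  # 25.0
```

The same calculation serves the internal vs. external comparison below, applied separately to each User group.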

Average number of minutes spent responding to internal vs. external Users

  • Divide the total number of minutes spent responding to internal Users by the total number of internal Users; do the same for external Users.

  • Comparing the average length of time spent responding to internal vs. external Users may point to a need to review a Repository’s mission or customer service philosophy.

Relationship of time spent with Users and time spent in Reading Room

  • Correlate staff interaction time with Users (not including retrievals) with the actual length of time Users spent in the Reading Room using Collection Units.

Total number of Users in each demographic per day/week/month/year

  • Comparing the number of Users in each demographic category tracked by the Repository can reveal usage patterns and trends.

  • Repositories may identify demographic groups whose needs are not being met by the Repository’s holdings or staff outreach. For instance, if few alumni make use of an academic archives it may point to the need to market to that demographic group or review collection development in institutional topics of interest to alumni.

  • Repositories may be able to attribute year-to-year changes in Users by demographic to the effectiveness of outreach and marketing programs.

Total number of Reference Questions per day/week/month/year by each User demographic

  • Comparing the number of Reference Questions received in each User category tracked by the Repository can reveal usage patterns and trends.

Total number of Reference Questions per week/month/year via each method

  • Comparing the number of Reference Questions via each method tracked by the Repository can reveal usage patterns and trends.

  • Repositories may identify a need for changes in staffing patterns for public service desks.

 

 
