Guidelines for Metrics and Value for ETDs

From IMLS

This is a placeholder workspace for draft and final documents related to this project deliverable.


About this Document

This document will provide guidance for institutions on the critical issue of assessing ETD usage, and on how communicating such assessment metrics can demonstrate the benefits of the program to stakeholders. Practical examples of how to document and convey usage metrics will be provided.

A final working outline for this Guidance document is available to all project steering committee authors for review and comment at Google Docs.

Demonstrations of ETD Value

  • Open access ETD

Benefits to scholarly communication:

  • Scholarly communication: the overall impact of open access, then ETD-specific impacts

http://rivendell.lib.uic.edu/news/2011/10/24/open-access-journals-gain-attention-and-impact/ Experience indicates that open access articles may receive more attention and exposure than those released via subscription-based journal models.

Benefits to a university: the university's teaching and research role, where ETDs can be seen as adding value for the university (the Graduate College and the libraries) and for students.

  • teaching and research: ETDs add value to the graduate study experience.
  • research: mainly library resources and measures of usage of library resources (ETDs add visibility for universities, e.g., ARL status, repository rankings, etc.)

Benefits to students: core values:

  • furthering career goals
  • negative impacts?
  • graduate study experience
  • cost savings

Collecting Usage Metrics

Measure the impacts described above.

Graduate Students / Value-added Experience

Introduction to Evaluation of Library Resources

Methods and Issues of Evaluation of Traditional Library Resources

Libraries have a long history of evaluating and studying the use of library resources and collections. For traditional print materials, assessment data were collected, analyzed, and organized by the library: librarians counted outputs such as collection size, collection usage (e.g., circulation numbers), the number and types of reference questions, interlibrary loan volume, and so on. It is a challenge to collect these numbers in a consistent and reliable way because collection methods vary and the work is difficult. “The gross data available … have been too global in character and too imprecise in nature to serve as an adequate basis for reformulation of acquisitions policies. …. It is useless to tell the acquisitions librarian that half the monographs ordered will never be used, unless we can specify which 50 percent to avoid buying.” (Galvin and Kent, 1977) Traditional library metrics fail to demonstrate a library's academic value (Redefining the Library, 2011).

As digital materials prevail, driven by changes in content delivery and user behavior, assessment methods and tools have changed fundamentally. Digital collections can be regarded as both a resource and a service. As a resource, assessment focuses on how the collection is used; as a service, assessment focuses on how users use digital resources. <Franklin, Kyrillidou, Plum, 2006?>

In addition to the problems of collecting traditional library metrics, a recent study published by OCLC shows that 0% of undergraduate students start their information searches on a library's website, and merely 4% of faculty start their research from a library building (OCLC, 2010). Academic libraries can no longer rely on these traditional metrics to show their importance; they need to demonstrate their value through other qualitative and quantitative measures.

History and Initiatives in the Evaluation of Digital Resources

As the Internet fundamentally changed the way people communicate and share information, libraries have seen a profound shift from traditional materials toward acquiring and serving networked digital resources. Digital resources have become the de facto standard for delivering information. The Association of Research Libraries (ARL) began work on new measures for evaluating electronic resources: the E-Metrics project, launched in 1999, treated measurement of electronic resources as one of its highest priorities. Efforts were made to define variables to track data on searches, sessions, and downloads. (ARL, 2010)

Evaluation of Digital Resources: A Brief Overview of Approaches

It is critical that both quantitative and qualitative approaches be taken in evaluating digital resources, both as a collection and as a service.

Quantitative approaches:

Quantitative approaches are the most frequently used evaluation methods for collections and usage because the data are relatively easy to collect and the data ...

  • Library methodology:
    • E-metrics project
    • ARL stats
    • Other efforts.
  • Web analytics methodology:
    • logfile analysis
    • page analysis
    • Google Analytics

ETD collections are generally accessible through institutional repository systems such as DSpace, CONTENTdm, and Fedora, where collection data and web analytics data are easy to collect. Some institutions also set up third-party web analytics software, such as Google Analytics, to collect detailed data on usage and user behavior.
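As a concrete illustration of logfile analysis, download counts can be extracted directly from web server access logs. The sketch below is a minimal example, assuming a combined-format log and a hypothetical URL pattern (`/etd/<id>/download`) for ETD files; actual paths and log layouts vary by repository system.

```python
import re
from collections import Counter

# Hypothetical combined-format log lines; real repositories differ in URL layout.
SAMPLE_LOG = """\
203.0.113.5 - - [01/Mar/2013:10:01:02 -0500] "GET /etd/1234/download HTTP/1.1" 200 51234
203.0.113.5 - - [01/Mar/2013:10:05:09 -0500] "GET /etd/1234/download HTTP/1.1" 200 51234
198.51.100.7 - - [01/Mar/2013:11:00:00 -0500] "GET /etd/5678/download HTTP/1.1" 200 20480
198.51.100.7 - - [01/Mar/2013:11:00:05 -0500] "GET /etd/9999/download HTTP/1.1" 404 512
"""

# Match only successful (2xx) GET requests for the assumed download path.
DOWNLOAD = re.compile(r'"GET /etd/(\d+)/download HTTP/[\d.]+" 2\d\d')

def count_downloads(log_text):
    """Return a Counter mapping ETD id -> number of successful downloads."""
    counts = Counter()
    for line in log_text.splitlines():
        match = DOWNLOAD.search(line)
        if match:
            counts[match.group(1)] += 1
    return counts

if __name__ == "__main__":
    print(count_downloads(SAMPLE_LOG))
```

Note that the regular expression deliberately skips failed requests (the 404 line above), one of several filtering decisions (robots, double-clicks, internal traffic) that any real logfile analysis has to make explicit.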

Qualitative approaches

Qualitative research involves studying and collecting a variety of empirical materials, such as case studies and interviews, along with interactional and visual observations, with the purpose of identifying meaning for individuals. Because each individual is different, the research has to draw on more than one interpretive practice (Denzin and Lincoln, 2011). The use of multiple methods in qualitative research attempts to gain an in-depth understanding of a research topic. Each method has its own history, uses, meanings, context, and implementation.

  • Methodology:
    • Methodology used by libraries: see biblio.
    • Survey: see biblio.
    • Focus group: see biblio
    • LibQual+

Interpretation of Evaluation Data

  • Interpretation of web statistics

When reviewing web analytics, be aware that each tool or system may define the same term differently: visit, hit, and page view do not necessarily mean the same thing. For example, “visit” is defined as “...” in Google Analytics.
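To make the distinction concrete, the sketch below counts page views (every request) versus visits (a visitor's requests grouped into sessions, with a new session starting after 30 minutes of inactivity, the default timeout in many tools, including Google Analytics). The visitor ids and timestamps are invented for illustration.

```python
from datetime import datetime, timedelta

# (visitor_id, timestamp) pairs for page requests; invented sample data.
hits = [
    ("visitor-a", datetime(2013, 3, 1, 9, 0)),
    ("visitor-a", datetime(2013, 3, 1, 9, 10)),   # same session (10 min gap)
    ("visitor-a", datetime(2013, 3, 1, 10, 30)),  # new session (80 min gap)
    ("visitor-b", datetime(2013, 3, 1, 9, 5)),
]

SESSION_TIMEOUT = timedelta(minutes=30)

def page_views(hits):
    """Every request counts as one page view."""
    return len(hits)

def visits(hits):
    """Group each visitor's requests into sessions separated by the timeout."""
    last_seen = {}
    count = 0
    for visitor, ts in sorted(hits):
        previous = last_seen.get(visitor)
        if previous is None or ts - previous >= SESSION_TIMEOUT:
            count += 1  # a new visit begins for this visitor
        last_seen[visitor] = ts
    return count

print(page_views(hits), visits(hits))  # prints "4 3"
```

The same four requests yield four page views but only three visits, which is exactly why a download count from one system and a visit count from another cannot be compared directly.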

Use of Evaluation Data

  • ROI: with the cost model (see the other doc) and qualitative and quantitative data, an ROI statistic can be established.
  • Comparison with other collections, such as licensed databases or other online collections, where data are available.
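A minimal sketch of the ROI idea: given an annual program cost (from the cost model document) and an estimated value per use (for example, what an avoided interlibrary loan or document purchase would have cost), a simple ratio can be computed. All numbers below are invented placeholders, not figures from any cost model.

```python
def simple_roi(annual_cost, uses, value_per_use):
    """Return ROI as (value delivered - cost) / cost."""
    value = uses * value_per_use
    return (value - annual_cost) / annual_cost

# Invented example: $20,000/year program cost, 10,000 downloads,
# each valued at $5 (a stand-in for an avoided ILL or purchase charge).
roi = simple_roi(20_000, 10_000, 5.0)
print(f"ROI: {roi:.0%}")  # prints "ROI: 150%"
```

The hard part is not the arithmetic but defending the value-per-use assumption, which is where the qualitative data above come in.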