What resources don’t we have stats for?
ABC-CLIO, hoopla audiobooks and ebooks, ProQuest Research Companion, North American Women's Letters and Diaries (starting in June 2020), and Statistical Abstract (starting in January 2020). No consortial reports are available. Individual libraries can download statistics for all of these resources except hoopla; consortial reports for hoopla will not be available until hoopla rolls out its consortial platform functionality.
Why are abstract numbers lower with COUNTER 5 than COUNTER 4 reports?
NC LIVE used Result Clicks for Abstracts in COUNTER 4, and we use Total Item Investigations for Abstracts plus Full Text Views in COUNTER 5. “Result clicks (COUNTER 4 metric) is defined as a click resulting from search results. This can include article links, ILL, and catalog links in the search results. Total item investigations (COUNTER 5 metric) is defined as a retrieval (viewing, downloading, printing, emailing items – whether abstract or full-text).”
COUNTER 5 Usage Reports
Starting with January 2020 reports, NC LIVE is using COUNTER 5 reports for Gale and ProQuest, instead of COUNTER 4 reports. We are using the COUNTER 5 DR D1 (Database Search and Item Usage) Report.
For more information about COUNTER 5 Item Requests vs. Total Item Investigations, please see: https://www.
Can my library see our vendor reports?
Any library is welcome to access usage statistics directly from vendors, including title-level and other detailed reports not included in the NC LIVE usage reports. More information is available here.
How is NC LIVE tracking usage statistics to make decisions?
The Resource Advisory Committee (RAC) reviews usage data reports to make resource purchasing recommendations to the NC LIVE Librarians Council. Vendor-reported usage statistics are just one measure the RAC uses to make resource selection decisions. Other data sources are reviewed to get a holistic picture of member library needs including member library feedback, discovery service statistics, available funding, and resource cost.
Auto Repair Reference Center Statistics
Starting with February 2019 reports, EBSCO changed the way that Auto Repair Reference Center usage statistics are calculated. In January 2019 and prior, data was calculated in a way that attributed multiple sessions to different vehicle makes and models within Auto Repair Reference Center. Now, EBSCO de-duplicates session IDs, which represent unique user sessions, in a more consistent manner. This inevitably results in a perceived decrease in Auto Repair Reference Center usage compared to previous months. However, the de-duplication ensures a true and consistent count of unique user sessions by allotting one "session" per user login, regardless of the amount of content accessed within the interface once logged in.
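The de-duplication described above can be illustrated with a short sketch. The log format (library, session ID, item) is hypothetical and chosen for illustration only; the point is that repeated content accesses within one session ID collapse to a single session.

```python
from collections import defaultdict

def count_unique_sessions(events):
    """Count one session per unique session ID per library, regardless
    of how many items (e.g., vehicle makes/models) were viewed in that
    session. `events` is a list of (library, session_id, item) tuples;
    this log format is a hypothetical example, not EBSCO's actual data."""
    seen = defaultdict(set)
    for library, session_id, _item in events:
        seen[library].add(session_id)  # a set de-duplicates session IDs
    return {library: len(ids) for library, ids in seen.items()}

events = [
    ("Lib A", "s1", "2015 Ford F-150"),
    ("Lib A", "s1", "2012 Honda Civic"),   # same login, second vehicle
    ("Lib A", "s2", "2018 Toyota Camry"),
    ("Lib B", "s3", "2010 Subaru Outback"),
]
print(count_unique_sessions(events))  # {'Lib A': 2, 'Lib B': 1}
```

Under the pre-2019 approach, "Lib A" above might have been credited with three sessions (one per vehicle accessed); the de-duplicated count is two.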
Ebook Central - Ebrary Comparison
Ebrary and Ebook Central usage statistics are most accurately compared by using Ebrary “Sessions” and Ebook Central “FT-Views” as reported on NC LIVE’s usage reports.
Ebrary data listed on NC LIVE reports under “Sessions” was reported on Ebrary reports as “User Sessions”. Ebrary data listed on NC LIVE reports as “FT-Views” and “AbsPlusViews” was reported on Ebrary reports as “Full-Title Downloads”.
The Ebook Central “Usage Report” lists one row for each time an ebook is accessed, regardless of what actions (reading, downloading, printing, etc.) were taken in the ebook. The sum of these rows, by library, shows how many times ebooks were accessed in Ebook Central and is reported on NC LIVE reports as “FT-Views” and “AbsPlusViews”.
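The row-summing step described above amounts to a count of rows grouped by library. A minimal sketch, assuming a hypothetical row format (the real Usage Report columns may differ):

```python
from collections import Counter

def ft_views_by_library(usage_rows):
    """Each row represents one ebook access, whatever action was taken.
    Counting rows per library yields the figure reported as 'FT-Views'
    and 'AbsPlusViews'. The dict row format here is illustrative only."""
    return Counter(row["library"] for row in usage_rows)

rows = [
    {"library": "Lib A", "title": "Book 1", "action": "read"},
    {"library": "Lib A", "title": "Book 2", "action": "download"},
    {"library": "Lib B", "title": "Book 1", "action": "print"},
]
print(ft_views_by_library(rows))  # Counter({'Lib A': 2, 'Lib B': 1})
```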
Ebook Central does not provide a user sessions report or a more accurate full-text metric. Ebook Central does provide COUNTER Book Report 2 reports. However, that report contains "the sum of Pages Viewed, Pages Printed, Pages Copied, Chapter Downloads, and Full Downloads". The inclusion of page metrics in this summation creates an inflated view of full-text usage, so NC LIVE does not use it.
Due to a technical issue on Ancestry's side, the session count data in customer COUNTER Database reports is incorrect for the time period of December 10, 2018 through January 22, 2019. These incorrect session counts overestimate the number of sessions during that time period for both Ancestry Library Edition and HeritageQuest Online.
As of January 23, 2019, Ancestry has fixed this technical issue and the session counts from January 23 forward are correct and accurate. Unfortunately, Ancestry has advised that corrected session counts for the time period in question cannot be provided or recovered.
Effective November 2017, HeritageQuest changed the customer hierarchy reporting methodology used to measure library usage on HeritageQuest Online and Ancestry Library Edition. No change was made to the information displayed in the reports or to the usage recorded. The new methodology uses a revised parent/child hierarchy to ensure usage is counted only once, which may result in lower usage counts for some libraries. Usage data for October 2017 and prior uses the old methodology, which may show higher usage counts; usage data from November 2017 to the present uses the new methodology, which more accurately represents usage and avoids duplication.
Infobase - Ferguson’s Career Guidance Center and Films on Demand Statistics
Infobase underwent a COUNTER certification review with LibLynx that involved an extensive review of how usage is logged by Infobase. This resulted in several reporting changes that are reflected in NC LIVE reports for Ferguson’s Career Guidance Center and Films on Demand starting January 1, 2019. Overall, lower usage should be expected for Ferguson’s and a slight increase in usage for Films on Demand.
Searches in both platforms will be lower after January 1, 2019. Previously, going to additional pages of search results via pagination links or "load more" buttons registered additional searches. These actions are no longer recorded as searches.
For Ferguson’s, index pages and other browsable pages were inaccurately logging record views when those pages were accessed. No actual content is viewable on these pages, so these events should not have been counted. Additionally, page tool accesses were incorrectly logged as record views; only tools built around downloading or printing should, in certain circumstances, log record views or multimedia views. Lastly, when two requests were made for the same article within specified time limits (10 seconds for HTML, 30 seconds for PDF), the earlier request should be removed and the later one retained; any additional requests for the same article within these time limits are treated the same way. Our legacy reports had no mechanism for filtering out these types of requests. Starting January 1, 2019, these changes will likely result in noticeably lower ‘full-text views’ in NC LIVE reports for Ferguson’s.
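The time-limit filtering described above (remove the first request, retain the second) can be sketched as follows. This is an illustrative implementation of the rule as stated, assuming a simple sorted log of (timestamp, article, format) tuples, not Infobase's or LibLynx's actual code.

```python
WINDOWS = {"HTML": 10, "PDF": 30}  # seconds, per the time limits above

def filter_double_clicks(requests):
    """Collapse repeated requests for the same article made within the
    format's time window, always discarding the earlier request and
    keeping the later one. `requests` is a list of
    (timestamp_sec, article_id, fmt) tuples, sorted by timestamp."""
    kept = []  # filtered request list
    last = {}  # (article_id, fmt) -> index into `kept` of last request
    for ts, article, fmt in requests:
        key = (article, fmt)
        if key in last and ts - kept[last[key]][0] <= WINDOWS[fmt]:
            # within the window: replace the earlier request with this one
            kept[last[key]] = (ts, article, fmt)
        else:
            kept.append((ts, article, fmt))
            last[key] = len(kept) - 1
    return kept

reqs = [(0, "a1", "HTML"), (5, "a1", "HTML"), (40, "a1", "HTML"),
        (0, "a2", "PDF"), (20, "a2", "PDF")]
print(filter_double_clicks(reqs))
# [(5, 'a1', 'HTML'), (40, 'a1', 'HTML'), (20, 'a2', 'PDF')]
```

Note that the second HTML request (at 5s) replaces the first (at 0s), while the request at 40s is outside the 10-second window and counts separately.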
For Films on Demand, Infobase discovered that usage was undercounted in some places. For example, watching a video via the video preview page tool was not registering record views, and now it does. Starting January 1, 2019, this may result in higher ‘full-text views’ in NC LIVE reports for Films on Demand.
ProQuest Search Statistics
The total ProQuest searches shown on NC LIVE reports is the deduplicated number of federated searches performed across the entire ProQuest platform (listed on NC LIVE reports as “ProQuest Unique Searches”), plus searches from SIRS Knowledge Source and the Statistical Abstract of the United States. NC LIVE uses the ProQuest COUNTER Database 1 Report to compile these statistics.
Neither SIRS Knowledge Source nor the Statistical Abstract of the United States are included in the ProQuest federated search and are therefore not included in “ProQuest Unique Searches”.
All other searches in ProQuest are performed across multiple databases as a federated search and are deduplicated for accuracy. For example, when a user performs a search in the Psychology Database, it also counts as a search in each of the individual databases across the ProQuest platform, such as the Science Database, U.S. Newsstream, and all other ProQuest databases. To avoid inflating statistics, search statistics across ProQuest are deduplicated, which removes the count of the same search being performed in each ProQuest database. These deduplicated searches are labeled "ProQuest Unique Searches" and are included in the ProQuest search totals.
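The deduplication described above can be sketched briefly. The log format here (one entry per query per database it fanned out to) is a hypothetical illustration of the concept, not ProQuest's actual reporting pipeline:

```python
def unique_searches(search_log):
    """Each user query appears once per database the federated search
    ran it against. De-duplicating by query ID yields the figure
    analogous to 'ProQuest Unique Searches'. Log format is illustrative."""
    return len({query_id for query_id, _database in search_log})

log = [
    ("q1", "Psychology Database"),
    ("q1", "Science Database"),   # same query, fanned out by federation
    ("q1", "U.S. Newsstream"),
    ("q2", "Science Database"),
]
print(unique_searches(log))  # 2 unique queries, not 4 raw search events
```

This is also why summing per-database search counts overstates activity: the four log entries above represent only two user queries.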
The ProQuest administrative portal offers Database Activity reports in addition to COUNTER reports. Usage statistics may differ between these two reports because ProQuest measures searches differently than is required by COUNTER standards.
Testing & Education Reference Center - Learning Express Comparison
Given the disparity in full-text data reported by Learning Express and Testing & Education Reference Center (TERC), it is recommended to compare “sessions” when evaluating these two resources, even though “sessions” does not fully capture user activity within either resource.
TERC data listed on the NC LIVE reports under “FT-Views” and “AbsPlusViews” is measured on TERC reports as “retrievals”, the number of times a PDF book was opened within TERC. TERC data listed on the NC LIVE reports under “Sessions” is measured on TERC reports as “sessions”, the number of times users log in, regardless of what actions they take after logging in. Neither TERC metric, “retrievals” nor “sessions”, fully captures user activity, as many resources are accessed without downloading a PDF, and users may view more than one resource during each session.
Learning Express “Recorded Sessions” are shown in NC LIVE data under “FT-Views” and “AbsPlusViews”. “Recorded Sessions” is the sum of tests added, eBooks added, tutorials added, and computer tutorials added. Learning Express “User Sessions” are shown on NC LIVE reports as “Sessions”.
Why do 2015 usage reports have very different numbers as compared to 2014?
Beginning in January 2015, NC LIVE began creating library usage reports using COUNTER-compliant vendor data when available. This change was made to allow for more consistent and comparable usage definitions across resource vendors. NC LIVE provided updated usage data definitions that detail which vendor reports are used and what constitutes a search, session, full-text view, and full-text view plus abstract view for each NC LIVE resource.
This change, combined with discovery service changes made in 2015, means that reliable direct comparisons cannot be made between resources NC LIVE licensed in 2012-2014 (such as EBSCOhost research databases), and different resources licensed in 2015-2017 (such as ProQuest research databases). This is because the usage definitions (i.e. what constitutes a search, session, full-text view, or full-text view plus abstract) for different vendors are not necessarily the same.
Why are my library’s search numbers extremely low compared to 2014?
Differences in the way that NC LIVE’s previous discovery service (EDS) and current discovery service (Summon) affect search totals in individual databases account for the very different search totals some libraries see for some resources in 2015. Where EDS searched every database in a profile every time a search or search refinement was performed, the structure of Summon does not result in as many instances of all databases being searched at once. For some libraries, this greatly reduced the number of searches reported.
Because of the impact of discovery on search totals, libraries should be cautious about adding up searches across different databases to report as a single search total. Because one user’s query may register as a search for multiple databases, adding up those numbers will count one query multiple times.
Additionally, NC LIVE usage reports only show searches for individual libraries across our databases, but do not include Summon searches. We do not currently have a reliable method to differentiate Summon users by institution, so those search numbers are omitted from usage reports.
Does this mean usage reports are not useful for comparing ‘old’ and ‘new’ resources?
It is always difficult to compare usage of one vendor’s resource to another. Factors including discovery methods, vendor data definitions, and local integrations can mean a comparison between two resources from different vendors is not informative, like comparing “apples to oranges.”
However, comparing usage of a resource to itself over time with reliable and consistent data can be very useful. The change to COUNTER-compliant vendor data when available ensures that this type of comparison is reliable, and could even allow for more accurate cross-vendor comparisons in the future.
Which resources have the same data definitions for both 2014 and 2015?
The following resources use the same measures both before and after 2015:
Alexander Street Press
CQ Researcher and CQ Weekly
eBooks on EBSCOhost
NC LIVE Video Collection
The following resources were available before 2015, but the data definitions used to measure them were changed in 2015:
Gale Infotrac Newsstand - changed to COUNTER data
Gale Virtual Reference Library - changed to COUNTER data
Wall Street Journal - changed to COUNTER data ***this resource is now a title in ABI-INFORM and title-level data is available via the ProQuest Administrator Module***