COUNTER Code of Practice Release 5.0.3
COUNTER’s library and content provider members have contributed to the development of Release 5 (R5) of the COUNTER Code of Practice.
The Code of Practice enables content providers to produce consistent, comparable and credible usage data for their online content. This allows librarians and other interested parties to compare the usage data they receive, and to understand and demonstrate the value of the electronic resources to which they subscribe.
Release 5.0.3 (published 28 April 2023) is the current Code of Practice and the requirement for COUNTER compliance, effective immediately.
The Code of Practice is available from the COUNTER website as an interactive code. This online version is the version of record for Release 5 of the Code of Practice.
Foreword
Librarians spend considerable amounts of money licensing different types of online content and want to measure the return on this investment and to ensure that library budgets are spent as productively as possible. One of the ways to measure this return on investment is to assess usage statistics.
This release of the COUNTER Code of Practice is designed to balance changing reporting needs with the need to make things simpler so that all content providers can achieve compliance and librarians can have usage statistics that are credible, consistent and comparable.
Consistency in report formats
Release 5 consists of four Master Reports. Each Master Report is associated with several pre-set filtered Standard Views, but can also be examined from different viewpoints to suit the needs of the person working with the report. Librarians can use the Master Reports to customize their analysis to meet their specific reporting needs.
Consistency and clarity in metrics
Release 5 also introduces new Metric Types, which ensure flexibility and depth of reporting.
Flexibility
Flexibility is built into Release 5 with the introduction of attributes, pieces of information which can be associated with multiple metrics. Providing information about matters such as year of publication, access type, and data types means that users can roll up or drill down through reports with ease, eliminating the need for special purpose reports.
How do I use this Code of Practice?
You can download each of the sections in the Code of Practice.
In the navigation bar immediately below Search, clicking on Glossary opens a pop-up window with terms and definitions.
You can use the + or - controls to increase or decrease the font size in the Code of Practice.
The Code of Practice will be of interest to both content providers and librarians; however, some sections are more relevant to particular use cases.
Sections 1 and 2 provide an introduction and outline of the scope of the COUNTER Code of Practice.
Sections 3 and 4 provide an explanation of the Master Reports and Standard Views that are a requirement for COUNTER compliance and that librarians can filter and configure to create customized “views” of their usage data. Section 3 also explains Metric Types and Attributes.
Content Providers implementing Release 5
Sections 5 to 7 provide essential information. These sections give detail on the delivery of COUNTER-compliant reports and views, logging usage and processing rules. You will also want to refer to the Friendly Guide To Release 5 Technical Notes for Providers.
COUNTER compliance requires content hosts to implement COUNTER_SUSHI (the standardised model for harvesting online usage data). Section 8 provides the specifications for the RESTful COUNTER_SUSHI API and the methods that must be supported. Appendix F explains handling errors and exceptions.
Content Providers preparing for COUNTER audit
An important feature of the COUNTER Code of Practice is that compliant content providers must be independently audited on a regular basis in order to maintain their COUNTER compliant status. If you are preparing for a COUNTER audit, Section 9 explains the audit process and procedures. Appendix E explains audit requirements and tests.
COUNTER would like to acknowledge the support of UKSG in the publication of the Code of Practice Release 5.
Conventions
This Code of Practice is implemented using the following conventions:
The keywords MUST (or REQUIRED), MUST NOT, SHOULD (or RECOMMENDED), SHOULD NOT (or NOT RECOMMENDED), and OPTIONAL in this document are to be interpreted as described in RFC 2119.
Note that the force of these words is modified by the requirement level of the document in which they are used.
MUST (or REQUIRED) means that the definition is an absolute requirement of the specification.
MUST NOT means that the definition is an absolute prohibition of the specification.
SHOULD (or RECOMMENDED) means that there may be valid reasons in certain circumstances to ignore a particular item, but the full implications should be understood and carefully weighed before choosing a different course.
SHOULD NOT (or NOT RECOMMENDED) means that there may be valid reasons in certain circumstances when the particular behaviour is acceptable or even useful, but the full implications should be understood and the case carefully weighed before implementing any behaviour described with this label.
Content providers implementing the Code of Practice who feel they have a valid disagreement with a requirement of the code are requested to present their case in writing to the COUNTER Project Director and ask for clarification on interpretation of the code.
Text appearing in italics will be replaced with appropriate values at implementation time; terms enclosed in curly brackets are variables. For example, an Exception in the format “{Exception Code}: {Exception Message}” might resolve to “3030: No Usage Available for Requested Dates”.
Introduction
Since its inception in 2002, COUNTER has been focused on providing a code of practice that helps ensure librarians have access to consistent, comparable, and credible usage reporting for their online scholarly information. COUNTER serves librarians, content providers, and others by facilitating the recording and exchange of online usage statistics. The COUNTER Code of Practice provides guidance on data elements to be measured and definitions of these data elements, as well as guidelines for output report content and formatting and requirements for data processing and auditing. To have their usage statistics and reports designated COUNTER compliant, content providers MUST provide usage statistics that conform to the current Code of Practice.
General Information
Purpose
The purpose of the COUNTER Code of Practice is to facilitate the recording, exchange, and interpretation of online usage data by establishing open international standards and protocols for the provision of content-provider-generated usage statistics that are consistent, comparable, and credible.
Scope
This COUNTER Code of Practice provides a framework for the recording and exchange of online usage statistics for the major categories of e-resources (journals, databases, books, reference works, and multimedia databases) at an international level. In doing so, it covers the following areas: data elements to be measured, definitions of these data elements, content and format of usage reports, requirements for data processing, requirements for auditing, and guidelines to avoid duplicate counting.
Application
COUNTER is designed for librarians, content providers and others who require reliable online usage statistics. The guidelines provided by this Code of Practice enable librarians to compare statistics from different platforms, to make better-informed purchasing decisions, and to plan more effectively. COUNTER also provides content providers with the detailed specifications they need to follow to generate data in a format useful to their customers, to compare the relative usage of different delivery channels, and to learn more about online usage patterns. COUNTER also provides guidance to others interested in information about online usage statistics.
Strategy
COUNTER provides an open Code of Practice that evolves in response to the demands of the international library and content provider communities. The Code of Practice is continually under review; feedback on its scope and application is actively sought from all interested parties. See Section 12 below.
Governance
The COUNTER Code of Practice is owned and developed by Counter Online Metrics (COUNTER), a non-profit-distributing company registered in England. A Board of Directors governs Counter Online Metrics. An Executive Committee reports to the Board, and the day-to-day management of COUNTER is the responsibility of the Project Director.
Definitions
This Code of Practice provides definitions of data elements and other terms that are relevant, not only to the usage reports specified in Release 5 (R5), but also to other reports that content providers may wish to generate. Every effort has been made to use existing ISO, NISO, etc. definitions where appropriate, and these sources are cited (see Appendix A).
Versions
The COUNTER Code of Practice will be extended and upgraded as necessary based on input from the communities it serves. Each new version will be made available as a numbered release on the COUNTER website; users will be alerted to its availability. R5 of the Code of Practice replaces Release 4 (R4) of the Code of Practice. The deadline date for implementation of this Release is 01-Jan-2019. After this date, only those content providers compliant with R5 will be deemed compliant with the Code of Practice.
COUNTER R5 introduces a continuous maintenance process (see Section 12 below) that will allow the Code of Practice to evolve over time, minimizing the need for major version changes.
Auditing and COUNTER Compliance
An independent annual audit is REQUIRED of each content provider’s reports and processes to certify that they are COUNTER compliant. The auditing process is designed to be simple, straightforward and not unduly burdensome or costly to the content provider while providing reassurance to customers of the reliability of the COUNTER usage data. See Section 9 below and Appendix E for more details.
Relationship to other Standards, Protocols and Codes
The COUNTER Code of Practice builds on several existing industry initiatives and standards that address content provider-based online performance measures. Where appropriate, definitions of data elements and other terms from these sources have been used in this Code of Practice, and these are identified in Appendix A.
Making Comments on the Code of Practice
The COUNTER Executive Committee welcomes comments on the Code of Practice (see Section 12 below).
Changes from COUNTER Release 4
Changes in the nature of online content and how it is accessed have resulted in the COUNTER Code of Practice evolving in an attempt to accommodate those changes. This evolution resulted in some ambiguities and, in some cases, conflicts and confusion within the Code of Practice. R5 of the COUNTER Code of Practice is focused on improving the clarity, consistency, and comparability of usage reporting.
List of Reports
R5 of the COUNTER Code of Practice reduces the overall number of reports by replacing many of the special-purpose reports that are seldom used with a small number of flexible generic reports. All COUNTER R4 reports have either been renamed or eliminated in favour of other COUNTER R5 report options.
See Appendix B, Section B.1.1 for more details.
Report Format
The Standardized Usage Statistics Harvesting Initiative (SUSHI) protocol used in R4 was designed to simplify the gathering of usage statistics by librarians. In R5 the SOAP/XML-based SUSHI protocol is replaced with the RESTful COUNTER_SUSHI API, which uses JavaScript Object Notation (JSON) for a more lightweight data interchange. The JSON format is not only easy for humans to read and write but also easy for machines to parse and generate. Support of the COUNTER_SUSHI API is mandatory for compliance with R5 (see Section 8 below).
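As a sketch of what harvesting looks like in practice, the snippet below builds a COUNTER_SUSHI request URL for the TR_J1 Standard View and reads a minimal JSON response header. The base URL and credential values are hypothetical placeholders; the path shape and parameter names follow the COUNTER_SUSHI API Specification (see Section 8 below).

```python
import json
from urllib.parse import urlencode

# Hypothetical base URL; each provider publishes its own COUNTER_SUSHI endpoint.
BASE_URL = "https://example.com/sushi"

def report_url(report_id, customer_id, requestor_id, begin_date, end_date):
    """Build a COUNTER_SUSHI request URL for a report such as TR_J1."""
    query = urlencode({
        "customer_id": customer_id,
        "requestor_id": requestor_id,
        "begin_date": begin_date,
        "end_date": end_date,
    })
    return f"{BASE_URL}/reports/{report_id.lower()}?{query}"

print(report_url("TR_J1", "exampleCustomer", "exampleRequestor",
                 "2019-01-01", "2019-12-31"))

# A minimal slice of a JSON response; element names mirror the
# tabular report header discussed later in this document.
sample = json.loads("""
{
  "Report_Header": {
    "Report_Name": "Journal Requests (Excluding OA_Gold)",
    "Report_ID": "TR_J1",
    "Release": "5",
    "Created_By": "Example Provider"
  },
  "Report_Items": []
}
""")
print(sample["Report_Header"]["Report_ID"])
```

Because the payload is plain JSON, any HTTP client can consume it; no SOAP tooling is needed, which is the practical benefit of the R5 change.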
With R5, all COUNTER reports are structured the same way to ensure consistency, not only between reports, but also between the JSON and tabular versions of the reports. Now, all reports share the same format for the header, the report body is derived from the same set of element names, total rows have been eliminated, and data values are consistent between the JSON and tabular versions. R5 also addresses the problems of terminology and report layouts varying from report to report, as well as JSON and tabular versions of the same report producing different results while still being compliant.
Metric Types
R5 strives for simplicity and clarity by reducing the number of Metric_Types and applying them across all reports, as applicable. R4 Book Reports had metric types that differed from those in Journal Reports, as well as metric types attempting to reflect additional attributes such as mobile usage, usage by format, etc. Most R4 metric types have either been renamed or eliminated in favour of new R5 Metric_Types.
See Appendix B, Section B.1.3 for a table showing the R4 metric types and their R5 equivalents or status.
New Elements and Attributes Introduced
With R4 the nature of the usage sometimes had to be inferred based on the name of the report. To provide more consistent and comparable reporting, R5 introduces some additional attributes that content providers can use to create breakdowns and summaries of usage.
| Element/Attribute | Description |
|---|---|
| Access_Type | Used to track usage of content that is either OA_Gold (Gold Open Access) or Controlled (requires a license). |
| Access_Method | Used to track if the purpose of the access was for regular use or for text and data mining (TDM). This attribute allows TDM usage to be excluded from Standard Views and reported on separately. |
| Data_Type | Identifies the type of content usage being reported on. Expanded to include additional Data_Types, including Article, Book, Book_Segment, Database, Dataset, Journal, Multimedia, Newspaper_or_Newsletter, Other, Platform, Report, Repository_Item, and Thesis_or_Dissertation. |
| Publisher_ID | Introduced to improve matching and reporting by publisher. |
| Section_Type | Identifies the type of section that was accessed by the user, including Article, Book, Chapter, Other and Section. Used primarily for reporting on book usage where content is delivered by section. |
| YOP | Year of publication as a single element; simplifies reporting by content age. |
The above items are covered in more detail in Section 3 below as well as in Appendix B, Section B.1.4.
Overview
This section provides an overview of the scope of the COUNTER Code of Practice.
Section 3 Technical Specifications for COUNTER Reports introduces the REQUIRED reports, describes the common format shared by all COUNTER reports, and defines the COUNTER report attributes and their values.
Section 4 COUNTER reports provides detailed specifications for each COUNTER report. Use this section to understand what elements are included in each report.
Section 5 Delivery of COUNTER Reports outlines the options a content provider MUST provide to enable customers to access their reports.
Section 6 Logging Usage describes various options used for logging usage transactions.
Section 7 Processing Rules for Underlying COUNTER Reporting Data discusses topics such as which HTTP status codes to count, double-click filtering, calculating unique items and unique titles accessed in a session, classifying searches (regular, federated, automated, or platform), robots and internet crawlers, tools that cause bulk downloads, and text and data mining.
Section 8 SUSHI for Automated Report Harvesting offers a more in-depth description of the REQUIRED COUNTER_SUSHI API support.
Section 9 Audit provides the requirements for the COUNTER audit.
Section 10 Other Compliance Topics talks about license language to require COUNTER usage statistics, confidentiality of data, and supporting consortia in their need to obtain usage data for their members.
Section 11 Extending the Code of Practice offers suggestions for content providers who may want to create custom reports or include additional elements and attribute values in COUNTER reports.
Section 12 Continuous Maintenance outlines the procedures that have been put in place to allow the Code of Practice to be amended and expanded on an incremental basis in a controlled and managed way.
Section 13 Transitioning from Previous Releases or to New Reporting Services describes the procedures and requirements for transitioning to a new reporting service or underlying logging system and for transitioning to a new COUNTER release, in particular from R4 to R5.
Section 14 Change History provides a list of the Code of Practice releases.
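One processing rule listed for Section 7, double-click filtering, can be sketched briefly: two clicks on the same link by the same user within a 30-second window count as a single request, with the later click retained. The sketch below is illustrative only (the event tuples and field layout are hypothetical); Section 7 defines the exact rule.

```python
DOUBLE_CLICK_WINDOW = 30  # seconds

def filter_double_clicks(events):
    """Collapse repeat clicks by the same user on the same item within
    the window into one countable request, retaining the later click."""
    retained = []
    last = {}  # (user, item) -> index of the most recent retained click
    for user, item, t in sorted(events, key=lambda e: e[2]):
        key = (user, item)
        if key in last and t - retained[last[key]][2] <= DOUBLE_CLICK_WINDOW:
            retained[last[key]] = (user, item, t)  # retain the later click
        else:
            last[key] = len(retained)
            retained.append((user, item, t))
    return retained

# Clicks at t=0 and t=12 collapse to one request; t=70 is a new request.
events = [("u1", "doi:10.9999/x", 0),
          ("u1", "doi:10.9999/x", 12),
          ("u1", "doi:10.9999/x", 70)]
print(len(filter_double_clicks(events)))  # 2
```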
Technical Specifications for COUNTER Reports
COUNTER Reports for Libraries
Reports for R5 consist of four Master Reports that librarians can filter and configure to create customized views of their usage data. R5 also specifies Standard Views (pre-set filters/configuration).
To achieve compliance, a content provider MUST offer the Master Reports and Standard Views that are applicable to their Host_Types, with the exception of Standard Views that would always be empty (e.g. an Access Denied Standard View if denials cannot occur). An independent audit is required for these reports.
Content providers may offer additional Master Reports and Standard Views that are not required for compliance, as well as custom reports (see Section 11.2), provided these follow the rules set for reports by the Code of Practice. An audit is not required for these reports.
Master Reports
Master Reports include all relevant metrics and attributes; they are intended to be customizable through the application of filters and other configuration options, allowing librarians to create a report specific to their needs. The four Master Reports are shown in Table 3.a along with their Report_ID, Report_Name and Host_Types who are REQUIRED to provide these reports. See Section 3.3.1 below for details on Host_Types.
Table 3.a (below): Master Reports
| Report_ID | Report_Name | Details | Host_Types |
|---|---|---|---|
| PR | Platform Master Report | A customizable report summarizing activity across a content provider’s platforms that allows the user to apply filters and select other configuration options. | All Host_Types |
| DR | Database Master Report | A customizable report detailing activity by database that allows the user to apply filters and select other configuration options. | A&I_Database |
| TR | Title Master Report | A customizable report detailing activity at the title level (journal, book, etc.) that allows the user to apply filters and select other configuration options. | Aggregated_Full_Content |
| IR | Item Master Report | A granular, customizable report showing activity at the level of the item (article, chapter, media object, etc.) that allows the user to apply filters and select other configuration options. | Data_Repository* |
* Data repositories may choose to conform to the Code of Practice Release 5 or, alternatively, may wish to work with the Code of Practice for Research Data.
Figure 3.a (below) provides an example of how the user interface could look. The user will be presented with an interface that allows them to select usage dates, one or more Metric_Types, Data_Types, Access_Types, etc. and indicate if the filter columns are to be included. Including the column will cause usage to be broken out by individual values for the selected filter, whereas not including the column will result in usage being summarized for the selected filter.

Figure 3.a: Example of a user interface
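The roll-up versus break-out behaviour described above can be illustrated with a small sketch. The usage records below are hypothetical; the point is that including an attribute column breaks usage out by that attribute’s values, while omitting it summarizes across them.

```python
from collections import defaultdict

# Hypothetical raw usage records behind a Title Master Report.
records = [
    {"Title": "Journal of Economics", "Access_Type": "Controlled",
     "Metric_Type": "Total_Item_Requests", "Count": 10},
    {"Title": "Journal of Economics", "Access_Type": "OA_Gold",
     "Metric_Type": "Total_Item_Requests", "Count": 4},
]

def aggregate(records, group_by):
    """Sum counts over the selected grouping columns."""
    totals = defaultdict(int)
    for r in records:
        key = tuple(r[col] for col in group_by)
        totals[key] += r["Count"]
    return dict(totals)

# Access_Type column included: usage broken out by its values.
print(aggregate(records, ["Title", "Access_Type"]))
# Access_Type column omitted: usage summarized across its values.
print(aggregate(records, ["Title"]))
```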
Standard Views
The goal of Standard Views is to provide a set of pre-filtered views of the Master Reports covering the most common set of library needs. Report_IDs for Standard Views are derived from the Report_ID of the Master Report that they are based on. The format is {Master Report_ID}_{View ID}.
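A sketch of the naming rule (the helper function is illustrative, not part of the Code of Practice):

```python
def standard_view_id(master_report_id, view_id):
    """Standard View Report_IDs follow {Master Report_ID}_{View ID}."""
    return f"{master_report_id}_{view_id}"

print(standard_view_id("TR", "J1"))  # TR_J1
print(standard_view_id("PR", "P1"))  # PR_P1
```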
Platform Usage Standard Views
The Platform Usage Standard Views are derived from the Platform Master Report and provide a summary of activity on a given platform to support the evaluation of platforms and to provide high-level statistical data to support surveys and reporting to funders.
Table 3.b (below): Platform Usage Standard Views
| Report_ID | Report_Name | Details | Host_Types |
|---|---|---|---|
| PR_P1 | Platform Usage | Platform-level usage summarized by Metric_Type. | All Host_Types |

* Data repositories may choose to conform to the Code of Practice Release 5 or, alternatively, may wish to work with the Code of Practice for Research Data.
See Section 4.1 below for details on Platform Usage Reports.
Database Usage Standard Views
The Database Usage Standard Views support the evaluation of the value of a given database of resources (e.g. a full-text database, an A&I database, or a multimedia collection).
Table 3.c (below): Database Usage Standard Views
| Report_ID | Report_Name | Details | Host_Types |
|---|---|---|---|
| DR_D1 | Database Search and Item Usage | Reports on key Searches, Investigations and Requests metrics needed to evaluate a database. | A&I_Database |
| DR_D2 | Database Access Denied | Reports on Access Denied activity for databases where users were denied access because simultaneous-use licenses were exceeded or their institution did not have a license for the database. | A&I_Database |
See Section 4.2 below for details on Database Usage Reports.
Title Usage Standard Views
Title Usage Standard Views are used to support the evaluation of the value of a given serial (e.g. journal, magazine, or newspaper) or monograph (e.g. book, eBook, textbook, or reference work) title.
Table 3.d (below): Title Usage Standard Views
| Report_ID | Report_Name | Details | Host_Types |
|---|---|---|---|
| TR_B1 | Book Requests (Excluding OA_Gold) | Reports on full-text activity for books, excluding Gold Open Access content, as Total_Item_Requests and Unique_Title_Requests. The Unique_Title_Requests provides comparable usage across book platforms. The Total_Item_Requests shows overall activity; however, numbers between sites will vary significantly based on how the content is delivered (e.g. delivered as a complete book or by chapter). | Aggregated_Full_Content |
| TR_B2 | Book Access Denied | Reports on Access Denied activity for books where users were denied access because simultaneous-use licenses were exceeded or their institution did not have a license for the book. | Aggregated_Full_Content |
| TR_B3 | Book Usage by Access Type | Reports on book usage showing all applicable Metric_Types broken down by Access_Type. | Aggregated_Full_Content |
| TR_J1 | Journal Requests (Excluding OA_Gold) | Reports on usage of journal content, excluding Gold Open Access content, as Total_Item_Requests and Unique_Item_Requests. The Unique_Item_Requests provides comparable usage across journal platforms by reducing the inflationary effect that occurs when an HTML full text automatically displays and the user then accesses the PDF version. The Total_Item_Requests shows overall activity. | Aggregated_Full_Content |
| TR_J2 | Journal Access Denied | Reports on Access Denied activity for journal content where users were denied access because simultaneous-use licenses were exceeded or their institution did not have a license for the title. | Aggregated_Full_Content |
| TR_J3 | Journal Usage by Access Type | Reports on usage of journal content for all Metric_Types broken down by Access_Type. | Aggregated_Full_Content |
| TR_J4 | Journal Requests by YOP (Excluding OA_Gold) | Breaks down the usage of journal content, excluding Gold Open Access content, by year of publication (YOP), providing counts for the Metric_Types Total_Item_Requests and Unique_Item_Requests. Provides the details necessary to analyze usage of content in backfiles or covered by perpetual access agreements. Note that COUNTER reports do not provide access model or perpetual access rights details. | Aggregated_Full_Content |
See Section 4.3 below for details on Title Usage Standard Views.
Item Usage Standard Views
The Standard Views for item-level reporting are designed to support the most common reporting needs. The Standard View for repositories (Journal Article Requests) provides insight into the usage of individual journal articles. The Standard View for multimedia (Multimedia Item Requests) allows evaluation of multimedia at the title level.
Table 3.e (below): Item Usage Standard Views
| Report_ID | Report_Name | Details | Host_Types |
|---|---|---|---|
| IR_A1 | Journal Article Requests | Reports on journal article requests at the article level. This report is limited to content with a Data_Type of Article, Parent_Data_Type of Journal, and Metric_Types of Total_Item_Requests and Unique_Item_Requests. This Standard View must be provided only if (a) it is clear for all articles in IR whether they are journal articles or not and (b) the parent item is known for all journal articles. | Repository |
| IR_M1 | Multimedia Item Requests | Reports on multimedia requests at the item level. | Multimedia |
See Section 4.4 below for details on Item Usage Reports.
Formats for COUNTER Reports
R5 reports can be delivered in tabular form, or as machine-readable data (JSON) via the COUNTER_SUSHI API. The tabular form MUST be provided as either an Excel or a tab-separated-value (TSV) file, or both. Additional file formats that can be easily imported into spreadsheet programs without loss or corruption may be offered at the vendor’s discretion. The reports in JSON, TSV and other text formats MUST be encoded using UTF-8. The JSON format MUST comply with the COUNTER_SUSHI API Specification (see Section 8 below).
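A minimal sketch of producing a UTF-8 encoded TSV, assuming hypothetical header rows (a real report carries the full 12-row header and column headings described below):

```python
import csv
import io

# Hypothetical fragment of a report header; a compliant TR_J1 file
# contains the complete header and body described in this section.
rows = [
    ["Report_Name", "Journal Requests (Excluding OA_Gold)"],
    ["Report_ID", "TR_J1"],
    ["Release", "5"],
]

buf = io.StringIO()
writer = csv.writer(buf, delimiter="\t", lineterminator="\n")
writer.writerows(rows)
tsv = buf.getvalue()

# Encode as UTF-8, as required for TSV and other text formats.
data = tsv.encode("utf-8")
print(data.decode("utf-8"))
```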
All COUNTER reports have the same layout and structure. Figure 3.b (below) provides an example of the “Journal Requests (Excluding OA_Gold)” Standard View. Figure 3.c (below) shows the layout for tabular reports, which will be the focus of the discussions throughout this document. Note that the COUNTER_SUSHI API Specification includes the same elements with the same or similar names; therefore, understanding the tabular reports translates to an understanding of what is REQUIRED in reports retrieved via the COUNTER_SUSHI API.

Figure 3.b: Sample “Journal Requests (Excluding OA_Gold)” Standard View

Figure 3.c: Layout for tabular COUNTER reports
All COUNTER reports have a header. In tabular reports, the header is separated from the body with a blank row (to facilitate sorting and filtering in Excel). Beneath that is the body of the report with column headings. The contents of the body will vary by report. Figure 3.c (above) identifies the different kinds of information you may find in the report and the relative positioning of this information. All of this is discussed in more detail below.
Report Header
The first 12 rows of a tabular COUNTER report contain the header, and the 13th row is always blank. The header information is presented as a series of name-value pairs, with the names appearing in Column A and the corresponding values appearing in Column B. All tabular COUNTER reports have the same names in Column A. Column B entries will vary by report.

Figure 3.d: Common Report Header Information
Figure 3.d (above) shows the layout of the common header. The 12 elements in Column A and the values in Column B are discussed in more detail in the table below. Note that the element names (Column A) MUST appear in the COUNTER report exactly as they are shown here. Capitalization, spelling, and punctuation MUST match exactly.
Table 3.f (below): COUNTER Report Header Elements
| Element Name | Description of value to provide | Example |
|---|---|---|
| Report_Name | The name of the report as it appears in Section 3.1. | Journal Requests (Excluding OA_Gold) |
| Report_ID | The unique identifier for the report as it appears in Section 3.1. | TR_J1 |
| Release | The COUNTER release this report complies with. | 5 |
| Institution_Name | The name of the organization to which the usage is attributed. This can be a higher education institution, or for example a country for a country-wide contract, or a publisher if an aggregator or discovery service wants to report usage of a publisher’s content to the publisher. For reports including usage of open content that cannot be attributed to an institution, the Institution_Name should be “The World”. Note that such a report would include all global usage, whether attributed to institutions or not, but it could be filtered and broken down as usual, including by using Attributed and other extensions (see Section 11.5). | Mt. Laurel University |
| Institution_ID | A series of identifiers that represent the institution, in tabular reports in the format of {namespace}:{value}. Include multiple identifiers separated with a semicolon-space (“; ”), but only one value per namespace. In JSON reports multiple values per namespace can be included, separated by the vertical pipe (“\|”) character. Permitted identifier namespaces are ISIL, ISNI, OCLC, ROR and, for local identifiers assigned by the content provider, the platform ID of the content provider. | ISNI:0000000419369078; ROR:00hx57361; pubsiteA:PrncU |
| Metric_Types | A semicolon-space delimited list of Metric_Types requested for this report. Note that even though a Metric_Type was requested, it might not be included in the body of the report if no report items had usage of that type. | Unique_Item_Investigations; Unique_Item_Requests |
| Report_Filters | A series of zero or more report filters applied on the reported usage, excluding Metric_Type, Begin_Date and End_Date (which appear in separate rows in the tabular reports for easier reading). Typically, a report filter affects the amount of usage reported. Entries appear in the form of {filter name}={filter value} with multiple filter name-value pairs separated with a semicolon-space (“; ”) and multiple filter values for a single filter name separated by the vertical pipe (“\|”) character. | Access_Type=Controlled; Access_Method=Regular |
| Report_Attributes | A series of zero or more report attributes applied to the report. Typically, a report attribute affects how the usage is presented but does not change the totals. Entries appear in the form of {attribute name}={attribute value} with multiple attribute name-value pairs separated with a semicolon-space (“; ”) and multiple attribute values for a single attribute name separated by the vertical pipe (“\|”) character. | Attributes_To_Show=Access_Type |
| Exceptions | An indication of some difference between the usage that was requested and the usage that is being presented in the report. The format for the exception values is “{Exception Code}: {Exception Message} ({Data})” with multiple exception values separated by semicolon-space (“; ”). The Exception Code and Exception Message MUST match values provided in Table F.1 of Appendix F. For some exceptions further information MUST be provided in the Data element as indicated in Table F.1, otherwise the Data is optional. Note that for tabular reports usually only the limited set of exceptions which indicate that usage is not, not yet or no longer available will occur. | 3031: Usage Not Ready for Requested Dates (request was for 2016-01-01 to 2016-12-31; however, usage is only available to 2016-08-31) |
| Reporting_Period | The date range for the usage represented in the report, in the form of: “Begin_Date=yyyy-mm-dd; End_Date=yyyy-mm-dd”. | Begin_Date=2016-01-01; End_Date=2016-08-31 |
| Created | The date and time the usage was prepared, in RFC3339 date-time format (yyyy-mm-ddThh:mm:ssZ). | 2016-10-11T14:37:15Z |
| Created_By | The name of the organization or system that created the COUNTER report. | EBSCO Information Services |
| (blank row) | Row 13 MUST be blank. | |
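The delimiter conventions used in the header lend themselves to straightforward parsing. A minimal sketch using the tabular-report conventions described above (the function names are illustrative, not part of the Code of Practice):

```python
def parse_id_list(value):
    """Split a '{namespace}:{value}' list delimited by '; '
    (tabular form; one value per namespace)."""
    ids = {}
    for entry in value.split("; "):
        namespace, _, val = entry.partition(":")
        ids[namespace] = val
    return ids

def parse_filters(value):
    """Split '{name}={value}' pairs delimited by '; ';
    multiple values for one name are '|'-separated."""
    filters = {}
    for entry in value.split("; "):
        name, _, val = entry.partition("=")
        filters[name] = val.split("|")
    return filters

print(parse_id_list("ISNI:0000000419369078; ROR:00hx57361; pubsiteA:PrncU"))
print(parse_filters("Access_Type=Controlled|OA_Gold; Access_Method=Regular"))
```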
Report Body
Figures 3.b and 3.c (above) show the body of the COUNTER reports containing an extensive array of data elements. Not all reports will include all elements. When formatting a report, maintain the order of elements described below, but only include those elements relevant to that report. Where practical, the discussion below will provide guidance as to which reports an element may be included in. See Section 4 below for an extensive mapping of elements to reports.
Report Item Description
Every COUNTER report will have columns that describe its report items.
Table 3.g (below): Elements that Describe the Report Item
| Element Name | Description | Reports | Examples |
|---|---|---|---|
| Database | Name of the database for which usage is being reported. Applies only to Database Reports. | DR | MEDLINE |
| Title | Name of the book or journal for which usage is being reported. Applies only to Title Reports. | TR | Journal of Economics |
| Item | Name of the article, book chapter, multimedia work, or repository item for which usage is being reported. Applies only to Item Reports. | IR | CRISPR gene-editing tested in a person for the first time |
| Publisher | Name of the publisher of the content item. Note that when the content item is a database, the publisher would be the organization that creates that database. | DR, TR, IR | Taylor & Francis |
| Publisher_ID | A unique identifier for the publisher, in tabular reports in the form of {namespace}:{value}. When multiple identifiers are available for a given publisher, include all identifiers separated with semicolon-space (“; ”), but only one value per namespace. In JSON reports multiple values per namespace can be included, separated by the vertical pipe (“\|”) character. Permitted identifier namespaces are ISNI, ROR and, for local identifiers assigned by the content provider, the platform ID of the content provider. | DR, TR, IR | ISNI:1234123412341234; ROR:012a3bc45; ebscohost:PubX |
For Database the value MUST NOT be empty. For Title, Item and Publisher the value SHOULD NOT be empty, and if the value for Title or Item is empty at least one DOI, ISBN, Online_ISSN, Print_ISSN, Proprietary_ID or URI MUST be provided so that the report item can be identified. Note that content providers are expected to make all reasonable efforts to provide this information and that using an empty value may affect the result of an audit (see Section 3.3.10).
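As an illustration of the {namespace}:{value} encoding, this non-normative Python sketch splits a tabular Publisher_ID cell and enforces the one-value-per-namespace rule (the function name and error handling are assumptions for illustration only):

```python
def parse_publisher_id(cell):
    """Split a tabular Publisher_ID cell: {namespace}:{value} pairs
    separated by "; ", at most one value per namespace."""
    ids = {}
    for entry in cell.split("; "):
        namespace, _, value = entry.partition(":")
        if namespace in ids:
            raise ValueError(f"duplicate namespace: {namespace}")
        ids[namespace] = value
    return ids

print(parse_publisher_id("ISNI:1234123412341234; ROR:012a3bc45; ebscohost:PubX"))
# {'ISNI': '1234123412341234', 'ROR': '012a3bc45', 'ebscohost': 'PubX'}
```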
Platform
The next column in the report identifies the platform where the activity happened.
Table 3.h (below): Elements that Identify the Platform
| Element Name | Description | Reports | Examples |
|---|---|---|---|
| Platform | Identifies the platform/content host where the activity took place. Note that in cases where individual titles or groups of content have their own branded user experience but reside on a common host, the identity of the underlying common host MUST be used as the Platform. | All reports | EBSCOhost |
Report Item Identifiers
The item being reported on is further identified by the columns to the right of the platform.
Table 3.i (below): Elements for Report Item Identifiers
| Element Name | Description | Reports | Examples |
|---|---|---|---|
| Authors | Authors of the work for which usage is being reported in the format {author name} ({author identifier}) with one OPTIONAL author identifier in the format {namespace}:{value}. Permitted identifier namespaces are ISNI and ORCID. A maximum of three authors should be included, with multiple authors separated by semicolon-space (“; ”). Note that this element is only used in tabular reports; in JSON reports authors are represented as Item_Contributors with Type Author. | IR | John Smith (ORCID:0000-0001-2345-6789) |
| Publication_Date | Date of publication for the work in the format yyyy-mm-dd. | IR | 2018-09-05 |
| Article_Version | ALPSP/NISO code indicating the version of the work. Possible values are the codes for Accepted Manuscript, Version of Record, Corrected Version of Record, and Enhanced Version of Record. | IR | VoR |
| DOI | Digital Object Identifier for the item being reported on in the format {DOI prefix}/{DOI suffix}. | TR, IR | 10.1629/uksg.434 |
| Proprietary_ID | A proprietary ID assigned by the content provider for the item being reported on. Format as {namespace}:{value} where the namespace is the platform ID of the host which assigned the proprietary identifier. | DR, TR, IR | publisherA:jnrlCode123 |
| ISBN | International Standard Book Number in the format ISBN-13 with hyphens. | TR, IR | 978-3-16-148410-0 |
| Print_ISSN | International Standard Serial Number assigned to the print instance of a serial publication in the format nnnn-nnn[nX]. | TR, IR | 0953-1513 |
| Online_ISSN | International Standard Serial Number assigned to the online instance of a serial publication in the format nnnn-nnn[nX]. | TR, IR | 2048-7754 |
| Linking_ISSN | International Standard Serial Number that links together the ISSNs assigned to all instances of a serial publication in the format nnnn-nnn[nX] (JSON reports only). | TR, IR | 0953-1513 |
| URI | Universal Resource Identifier, a valid URL or URN according to RFC 3986. | TR, IR | |
At least one DOI, ISBN, Online_ISSN, Print_ISSN, Proprietary_ID or URI SHOULD be provided for each report item. Note that only one value per identifier is permitted, unless specified otherwise.
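The identifier requirement and the nnnn-nnn[nX] ISSN format lend themselves to a simple validation sketch (non-normative Python; the dict representation of a report item is an assumption for illustration):

```python
import re

# nnnn-nnn[nX]: four digits, hyphen, three digits, then a digit or X
ISSN_PATTERN = re.compile(r"^\d{4}-\d{3}[\dX]$")

def has_required_identifier(item):
    """Check that a report item carries at least one of the
    identifiers named above (dict keys are the column names)."""
    keys = ("DOI", "ISBN", "Online_ISSN", "Print_ISSN", "Proprietary_ID", "URI")
    return any(item.get(k) for k in keys)

item = {"Title": "Insights", "Online_ISSN": "2048-7754"}
assert ISSN_PATTERN.match(item["Online_ISSN"])
assert has_required_identifier(item)
```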
Parent Item Description and Identifiers
When reporting usage on content items like articles and book chapters, it is often desirable to identify the item’s parent item, such as the journal or book it is part of. This next grouping of columns identifies the parents and is used by a small subset of reports.
Table 3.j (below): Elements that Describe a Parent Item
| Element Name | Description | Reports | Examples |
|---|---|---|---|
| Parent_Title | Title of the parent item. | IR | The Serials Librarian |
| Parent_Authors | Authors of the parent work. See the Authors element in Table 3.i for the format. | IR | |
| Parent_Publication_Date | Date of publication for the parent work in the format yyyy-mm-dd. | IR | |
| Parent_Article_Version | ALPSP/NISO code indicating the version of the parent work. Possible values are the codes for Accepted Manuscript, Version of Record, Corrected Version of Record, and Enhanced Version of Record. | IR | VoR |
| Parent_Data_Type | Identifies the nature of the parent. | IR | Journal |
| Parent_DOI | DOI assigned to the parent item in the format {DOI prefix}/{DOI suffix}. | IR | |
| Parent_Proprietary_ID | A proprietary ID that identifies the parent item. Format as {namespace}:{value} where the namespace is the platform ID of the host which assigned the proprietary identifier. | IR | TandF:wser20 |
| Parent_ISBN | ISBN of the parent item in the format ISBN-13 with hyphens. | IR | |
| Parent_Print_ISSN | Print ISSN assigned to the parent item in the format nnnn-nnn[nX]. | IR | 0361-526X |
| Parent_Online_ISSN | Online ISSN assigned to the parent item in the format nnnn-nnn[nX]. | IR | 1541-1095 |
| Parent_URI | URI (valid URL or URN according to RFC 3986) for the parent item. | IR | https://www.tandfonline.com/action/journalInformation?journalCode=wser20 |
At least one DOI, ISBN, Online_ISSN, Print_ISSN, Proprietary_ID or URI MUST be included if parent information is provided for a report item. Note that only one value per identifier is permitted, unless specified otherwise.
Component Item Description and Identifiers
Repositories often store multiple components for a given repository item. These components could take the form of multiple files or datasets, which can be identified, and their usage reported, separately in Item Master Reports. Note that component usage may only be reported for Total_Item_Investigations and Total_Item_Requests. For other Metric_Types the usage cannot be broken down by component and the corresponding cells MUST be empty.
Table 3.k (below): Elements that Describe a Component Item
| Element Name | Description | Reports | Examples |
|---|---|---|---|
| Component_Title | Name or title of the component item. | IR | |
| Component_Authors | Authors of the component item. See the Authors element in Table 3.i for the format. | IR | |
| Component_Publication_Date | Date of publication for the component item in the format yyyy-mm-dd. | IR | |
| Component_Data_Type | Data type of the component item. | IR | |
| Component_DOI | DOI assigned to the component item in the format {DOI prefix}/{DOI suffix}. | IR | |
| Component_Proprietary_ID | A proprietary ID assigned by the repository to uniquely identify the component. Format as {namespace}:{value} where the namespace is the platform ID of the repository which assigned the proprietary identifier. | IR | |
| Component_ISBN | ISBN that is assigned to the component item in the format ISBN-13 with hyphens. | IR | |
| Component_Print_ISSN | Print ISSN that is assigned to the component item in the format nnnn-nnn[nX]. | IR | |
| Component_Online_ISSN | Online ISSN that is assigned to the component item in the format nnnn-nnn[nX]. | IR | |
| Component_URI | URI (valid URL or URN according to RFC 3986) assigned to the component item. | IR | |
At least one DOI, ISBN, Online_ISSN, Print_ISSN, Proprietary_ID or URI per component MUST be included if component information is provided for a report item. Note that only one value per identifier is permitted, unless specified otherwise.
Item and Report Attributes
Table 3.l (below): Elements for Item and Report Attributes
| Element Name | Description | Reports | Examples |
|---|---|---|---|
| Data_Type | Nature of the content that was used. See Section 3.3.2 for more detail. | PR, DR, TR, IR | Book |
| Section_Type | When content is accessed in chunks or sections, this attribute describes the nature of the content unit. See Section 3.3.3 for more detail. | TR | Article |
| YOP | Year of publication for the item being reported on. See Section 3.3.7 for more detail. | TR, IR | 1997 |
| Access_Type | See Section 3.3.5 for more detail. | TR, IR | Controlled |
| Access_Method | See Section 3.3.6 for more detail. | PR, DR, TR, IR | Regular |
If one of the elements is included in a report, either because it is mandatory for a Standard View (as specified in Section 4) or it is requested for a Master Report, a permissible value MUST be specified for each report item. The only exception is Section_Type which MUST be empty (tabular reports) or omitted (JSON reports) for Data_Type Book and Unique_Title metrics, since it is not applicable in this case. Note that this results in two report items in JSON reports, one for the Total_Item and Unique_Item metrics with Section_Type and one for the Unique_Title metrics without Section_Type.
Metric Type
Table 3.m (below): Report Element for Metric_Type
| Element Name | Description | Reports | Examples |
|---|---|---|---|
| Metric_Type | The type of activity that is being counted. See Section 3.3.4 for more detail. | All reports | Total_Item_Investigations |
Usage Data
Table 3.n (below): Elements for Usage Data
| Element Name | Description | Reports | Examples |
|---|---|---|---|
| Reporting_Period_Total | Total usage in this row for all months covered. Note that this element does NOT appear in the JSON reports; instead the JSON format offers a Granularity report attribute (see Section 3.3.8 for details). | All reports | 123456 |
| Mmm-yyyy | A series of columns with usage for each month covered by the report. The format is Mmm-yyyy. Note: In the JSON format this is represented by Begin_Date and End_Date date elements for each month. | All reports | May-2016 |
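The correspondence between the tabular Mmm-yyyy columns and the JSON Begin_Date/End_Date pairs can be sketched as follows (non-normative Python; the helper name is illustrative):

```python
import calendar
from datetime import date

# Map "Jan".."Dec" to month numbers (index 0 of month_abbr is empty)
MONTHS = {m: i for i, m in enumerate(calendar.month_abbr) if m}

def month_column_to_dates(header):
    """Convert a tabular month header such as "May-2016" to the
    Begin_Date/End_Date pair a JSON report would use for that month."""
    mmm, yyyy = header.split("-")
    year, month = int(yyyy), MONTHS[mmm]
    last_day = calendar.monthrange(year, month)[1]
    return date(year, month, 1), date(year, month, last_day)

print(month_column_to_dates("May-2016"))
# (datetime.date(2016, 5, 1), datetime.date(2016, 5, 31))
```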
COUNTER Report Common Attributes and Elements
Early releases of the COUNTER Code of Practice focused on usage statistics related to journals. That was expanded to books, and later articles and multimedia collections were added. R5 further expands the scope of COUNTER into the area of research data and social media. In order to help organize this increased scope in a single, consistent, and coherent Code of Practice, several new elements and attributes have been added.
Host Types
Usage reports are provided by many different types of content hosts, ranging from eBook to A&I_Database, eJournal, Discovery_Service, Multimedia, etc. Usage reporting needs vary by Host_Type. To accommodate this variance, R5 defines a set of Host_Type categories. Although the Host_Type does not appear on the COUNTER report, the Code of Practice uses Host_Types throughout this document to help content providers identify which reports, elements, metric types, and attributes are relevant to them. The Host_Types are:
Table 3.o (below): List of Host_Type Values
| Host_Type | Description | Examples |
|---|---|---|
| A&I_Database | Provides access to databases containing abstract and index information on scholarly articles intended to support discovery. | APA |
| Aggregated_Full_Content | Provides access to aggregated pre-set databases of full text and other content where content is accessed in the context of the licensed database. | EBSCOhost |
| Data_Repository | Provides access to research data. Includes subject repositories, institutional repositories, etc. | UK Data Service - ReShare |
| Discovery_Service | Assists users with discovery of scholarly content by providing access to a central index of articles, books, and other metadata. | EBSCOhost (EDS) |
| eBook | Provides access to eBook content made available as individual eBooks or eBook packages. | EBL |
| eBook_Collection | Provides access to eBook content that is sold as fixed collections and behaves like databases. | EBSCOhost |
| eJournal | Provides access to online serial (journals, conferences, newspapers, etc.) content made available as individual titles or packages. | ScienceDirect |
| Full_Content_Database | Provides access to databases that are a collection of content items that are not otherwise part of a serial or monograph (i.e. non-aggregated). | Cochrane |
| Multimedia | Provides access to audio, video, or other multimedia content. | Alexander Street Press |
| Multimedia_Collection | Provides access to multimedia materials sold as and accessed like databases. | |
| Repository | Provides access to an institution’s research output. Includes subject repositories, institutional repositories, departmental repositories, etc. | Cranfield CERES |
| Scholarly_Collaboration_Network | A service used by researchers to share information about their work. | Mendeley |
Note that a given content host may be classified as having multiple Host_Types and would be expected to provide reports, metric types, elements, and attributes applicable to all. For example, EBSCOhost would be classified as A&I_Database, Aggregated_Full_Content, Discovery_Service, eBook, and eBook_Collection.
Data Types
R5 reports on many kinds of scholarly information. These major groupings, referred to as Data_Types, are listed in the table below along with the Host_Types and reports that they apply to. All Data_Types apply to the Platform Reports since they summarize the usage on the platform. Note that the table lists only the Host_Types required to provide one or more reports for compliance, but content providers may offer additional reports. For example, Host_Type eJournal might also offer IR and IR_A1 and would then use Data_Type Article in these reports.
Table 3.p (below): List of Data_Type Values
| Data_Type | Description | Host_Types | Reports |
|---|---|---|---|
| Article | An article, typically published in a journal or reference work. Note that Data_Type Article is only applicable for Item Reports when the article is the item; in Title Reports this is represented by the Section_Type. | Repository | PR, IR |
| Book | A monograph text. | A&I_Database | PR, DR, TR, IR |
| Book_Segment | A book segment (e.g. chapter, section, etc.). Note that Data_Type Book_Segment is only applicable for Item Reports when the book segment is the item; in Title Reports this is represented by the Section_Type. | Repository | PR, IR |
| Database | A fixed database where content is searched and accessed in the context of the database. A given item on the host may be in multiple databases but a transaction must be attributed to a specific database. Note that Data_Type Database is only applicable for Searches and Access Denied at the database level and for Investigations and Requests for Full_Content_Databases*. | A&I_Database | PR, DR |
| Dataset | A data set. | Data_Repository | PR, IR |
| Journal | A serial that is a branded and continually growing collection of original articles within a particular discipline. | A&I_Database | PR, DR, TR, IR |
| Multimedia | Multimedia content such as audio, image, streaming audio, streaming video, and video. | Multimedia | PR, DR, IR |
| Newspaper_or_Newsletter | Textual content published serially in a newspaper or newsletter. | A&I_Database | PR, DR, TR, IR |
| Other | Content that cannot be classified by any of the other Data_Types. Note that Data_Type Other MUST NOT be used if there isn’t sufficient information available to classify the content. | A&I_Database | PR, DR, TR, IR |
| Platform | A content platform that may reflect usage from multiple Data_Types. Note that Data_Type Platform is only applicable for Searches_Platform. | All Host_Types | PR |
| Report | A report. | A&I_Database | PR, DR, TR, IR |
| Repository_Item | A generic classification used for items stored in a repository. | Repository | PR, IR |
| Thesis_or_Dissertation | A thesis or dissertation. | A&I_Database | PR, DR, TR, IR |
| Unspecified | It is not possible to classify the content because there isn’t sufficient information available. Note that content providers are expected to make all reasonable efforts to classify the content and that using Data_Type Unspecified may affect the result of an audit, see Section 3.3.10 for details. | A&I_Database | PR, DR, TR, IR |
*Full_Content_Databases may also use Data_Type Database in the Master Title Report if this report is offered. All other Host_Types MUST report Investigations and Requests either with the title-level Data_Types (e.g. Journal for a journal article or Book for a book, from Host_Type A&I_Database, Aggregated_Full_Content, Discovery_Service, eBook, eBook_Collection and eJournal), or with the item-level Data_Types (e.g. Article for an article or Multimedia for a video, from Host_Type Data_Repository, Multimedia, Multimedia_Collection, Repository and Scholarly_Collaboration_Network). These Data_Types MUST be used across all reports required for compliance to ensure consistent reporting.
Section Types
Some scholarly content is accessed in sections. For example, a user may access a chapter or section at a time. Section_Type was introduced to provide a categorization of the transaction based on the type of section accessed. For example, a librarian could use a Title Master Report to see a breakdown of usage by Title and Section_Type. The following table lists the Section_Types defined by COUNTER and the Host_Types and reports to which they apply.
Table 3.q (below): List of Section_Type Values
| Section_Type | Description | Host_Types | Reports |
|---|---|---|---|
| Article | An article from a compilation, such as a journal, encyclopedia, or reference book. | Aggregated_Full_Content | TR |
| Book | A complete book, accessed as a single file. | Aggregated_Full_Content | TR |
| Chapter | A chapter from a book. | Aggregated_Full_Content | TR |
| Other | Content delivered in sections not otherwise represented on the list. | Aggregated_Full_Content | TR |
| Section | A group of chapters or articles. | Aggregated_Full_Content | TR |
Metric Types
Metric_Types, which represent the nature of activity being counted, can be grouped into the categories of Searches, Investigations, Requests, and Access Denied. The Tables 3.r, 3.s and 3.t (below) list the Metric_Types and the Host_Types and reports they apply to.
Searches
Table 3.r (below): List of Metric_Types for Searches
| Metric_Type | Description | Host_Types | Reports |
|---|---|---|---|
| Searches_Regular | Number of searches conducted against a database where results are returned to the user on the host UI and either a single database is searched, or multiple databases are searched and the user has the option of selecting the databases to be searched. This metric only applies to usage tracked at the database level and is not represented at the platform level. | A&I_Database | DR |
| Searches_Automated | Number of searches conducted against a database on the host site or discovery service where results are returned in the host UI, multiple databases are searched and the user does NOT have the option of selecting the databases to be searched. This metric only applies to usage that is tracked at the database level and is not represented at the platform level. | A&I_Database | DR |
| Searches_Federated | Searches conducted by a federated search engine where the search activity is conducted remotely via client-server technology. This metric only applies to usage that is tracked at the database level and is not represented at the platform level. | A&I_Database | DR |
| Searches_Platform | Searches conducted by users and captured at the platform level. Each user-initiated search can only be counted once regardless of the number of databases involved in the search. This metric only applies to Platform Reports. | All Host_Types | PR |
*Repositories should provide these Metric_Types if they are able to.
Investigations and Requests of Items and Titles
This group of Metric_Types represents activities where content items were retrieved (Requests) or information about a content item (e.g. an abstract) was examined (Investigations). Any user activity that can be attributed to a content item is counted as an Investigation, including downloading or viewing the item. Requests are limited to user activity related to retrieving or viewing the content item itself. The figure below provides a graphical representation of the relationship between Investigations and Requests.

Figure 3.e: The relationship between Investigations and Requests
Totals, Unique Items and Unique Titles
R5 also introduces the concept of unique items and unique titles. The Metric_Types that begin with Total are very similar to the metrics of R4, i.e. if a given article or book or book chapter was accessed multiple times in a user session, the metric would increase by the number of times the content item was accessed (minus any adjustments for double-clicks).
Unique_Item metrics have been introduced in R5 to help eliminate the effect different styles of user interfaces may have on usage counts. With R5, if a single article is accessed multiple times in a given user session, the corresponding Unique_Item metric can only increase by 1 to simply indicate that the content item was accessed in the session. Unique_Item metrics provide comparable usage across journal platforms by reducing the inflationary effect that occurs when an HTML full text automatically displays and the user then accesses the PDF version.
Unique_Title metrics have been introduced in R5 to help normalize eBook metrics. Since eBooks can be downloaded as an entire book in a single PDF or as separate chapters, the counts for R4’s BR1 (book downloads) and BR2 (section downloads) are not comparable. With R5, the book’s Unique_Title metrics are only increased by 1 no matter how many (or how many times) chapters or sections were accessed in a given user session. Unique_Title metrics provide comparable eBook metrics regardless of the nature of the platform and how eBook content was delivered.
The Unique_Title metrics MUST NOT be used for Data_Types other than Book, as they are not meaningful for other Data_Types. If a book contains both OA_Gold and Controlled sections, or sections with different YOPs, the usage MUST be broken down by Access_Type and YOP so that the total counts are consistent between reports that include these columns/elements and reports that do not.
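The distinction between the Total, Unique_Item and Unique_Title metrics can be sketched for a single user session (non-normative Python; the event representation is an assumption, and double-click filtering is assumed to have been applied already):

```python
def session_request_metrics(events):
    """events: list of (title, item) pairs requested in ONE user
    session, already double-click filtered. Returns the increments
    each Metric_Type receives for that session."""
    return {
        # Every request counts, repeats included
        "Total_Item_Requests": len(events),
        # Each distinct item counts once per session
        "Unique_Item_Requests": len({item for _, item in events}),
        # Each distinct title (e.g. book) counts once per session,
        # no matter how many chapters or repeats were requested
        "Unique_Title_Requests": len({title for title, _ in events}),
    }

# One book, three chapters requested, one of them twice:
events = [("Book A", "Ch 1"), ("Book A", "Ch 2"), ("Book A", "Ch 2"), ("Book A", "Ch 3")]
print(session_request_metrics(events))
# {'Total_Item_Requests': 4, 'Unique_Item_Requests': 3, 'Unique_Title_Requests': 1}
```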
Table 3.s (below): List of Metric_Types for Requests and Investigations
| Metric_Type | Description | Host_Types | Reports |
|---|---|---|---|
| Total_Item_Investigations | Total number of times a content item or information related to a content item was accessed. Double-click filters are applied to these transactions. Examples of content items are articles, book chapters, or multimedia files. | All Host_Types | PR, DR, TR, IR |
| Unique_Item_Investigations | Number of unique content items investigated in a user-session. Examples of content items are articles, book chapters, or multimedia files. | All Host_Types | PR, DR, TR, IR |
| Unique_Title_Investigations | Number of unique titles investigated in a user-session. This Metric_Type is only applicable for Data_Type Book. | A&I_Database | PR, DR, TR |
| Total_Item_Requests | Total number of times a content item was requested (i.e. the full text or content was downloaded or viewed). Double-click filters are applied to these transactions. Examples of content items are articles, book chapters, or multimedia files. | Aggregated_Full_Content | PR, DR, TR, IR |
| Unique_Item_Requests | Number of unique content items requested in a user-session. Examples of content items are articles, book chapters, or multimedia files. | Aggregated_Full_Content | PR, DR, TR, IR |
| Unique_Title_Requests | Number of unique titles requested in a user-session. This Metric_Type is only applicable for Data_Type Book. | Aggregated_Full_Content | PR, DR, TR |
*Repositories should provide these Metric_Types if they are able to.
Access Denied
Table 3.t (below): List of Metric_Types for Access Denied
| Metric_Type | Description | Host_Types | Reports |
|---|---|---|---|
| No_License | Number of times access was denied because the user’s institution did not have a license to the content. Double-click filtering applies to this Metric_Type. Note that if the user is automatically redirected to an abstract, that action will be counted as a No_License and also as an Item_Investigation. | A&I_Database | DR, TR, IR |
| Limit_Exceeded | Number of times access was denied because the licensed simultaneous-user limit for the user’s institution was exceeded. Double-click filtering applies to this Metric_Type. | A&I_Database | DR, TR, IR |
Access Types
In order to track the value of usage for licensed content, librarians want to know how much Open Access or other freely available content was used and how much content was behind a paywall. To accommodate this, R5 introduces an Access_Type attribute with values of Controlled, OA_Gold, OA_Delayed, and Other_Free_To_Read. The table below lists the Access_Types and the Host_Types and reports they apply to. Note that Access_Type relates to access on the platform where the usage occurs: if access to a Gold Open Access article is restricted on a platform (for example, because the article is included in an aggregated full-text database available to subscribers only) the Access_Type is Controlled.
Table 3.u (below): List of Access_Type Values
| Access_Type | Description | Host_Types | Reports |
|---|---|---|---|
| Controlled | At the time of the Request or Investigation the content item was not open (e.g. behind a paywall) because access is restricted to authorized users. Access of content due to a trial subscription/license would be considered Controlled. Content that has been made freely available but is not OA_Gold (e.g. free for marketing purposes or because the title offers free access after a year) MUST be tracked as Controlled. | Aggregated_Full_Content | TR, IR |
| OA_Gold | At the time of the user Request or Investigation the content item was available under a Gold Open Access license (content that is immediately and permanently available as Open Access because an article processing charge applies or the publication process was sponsored by a library, society, or other organization). Content items may be in hybrid publications or fully Open Access publications. Note that content items offered as Delayed Open Access (open after an embargo period) MUST currently be classified as Controlled, pending the implementation of OA_Delayed. | Data_Repository | TR, IR |
| OA_Delayed | *** RESERVED FOR FUTURE USE - DO NOT IMPLEMENT *** At the time of the user Request or Investigation the content item was available as Open Access after an embargo period had expired (Delayed Open Access). Note that author-archived works hosted in institutional repositories where access is restricted from public access for an embargo period will report usage as OA_Delayed for content accessed after the embargo period expires. NOTE: This value is not to be used until its inclusion has been approved by COUNTER and a timeframe for implementation published by COUNTER. | | |
| Other_Free_To_Read | At the time of the transaction the content item was available as free-to-read (no license required) and did not qualify under the OA_Gold Access_Type. NOTE: This value is for institutional repositories only. Institutional repositories may also use Access_Type Other_Free_To_Read in the Master Title Report if this report is offered. | Data_Repository | IR |
Access Methods
In order to track content usage that was accessed for the purpose of text and data mining (TDM) and to keep that usage separate from normal usage, R5 introduces the Access_Method attribute, with values of Regular and TDM. The table below lists the Access_Methods and the Host_Types and reports they apply to.
Table 3.v (below): List of Access_Method Values
| Access_Method | Description | Host_Types | Reports |
|---|---|---|---|
| Regular | Refers to activities on a platform or content host that represent typical user behaviour. | All Host_Types | All reports |
| TDM | Content and metadata accessed for the purpose of text and data mining, e.g. through a specific API used for TDM. Note that usage representing TDM activity is to be included in Master Reports only. | All Host_Types | PR, DR, TR, IR |
YOP
Librarians also want to analyze collection usage by the age of the content. The YOP report attribute represents the year of publication, and it MUST be tracked for all Investigations, Requests and Access Denied metrics in the Title and Item Reports. The table below lists the Host_Types and reports the YOP attribute applies to.
Table 3.w (below): YOP Values
| YOP | Description | Host_Types | Reports |
|---|---|---|---|
| yyyy | The year of publication for the item as a four-digit year. If a content item has a different year of publication for an online version than for the print version, use the year of publication for the Version of Record. If the year of publication is not known, use a value of 0001. For articles in press (not yet assigned to an issue), use the value 9999. | Aggregated_Full_Content | TR, IR |
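The YOP rules above (0001 when the year is not known, 9999 for articles in press) can be sketched as follows (non-normative Python; the function name and the "in_press" sentinel are assumptions for illustration):

```python
def yop_value(year_of_publication):
    """Map a publication year to the YOP attribute value:
    "in_press" (illustrative sentinel) -> 9999, unknown (None) -> 0001,
    otherwise the four-digit year."""
    if year_of_publication == "in_press":
        return "9999"  # article not yet assigned to an issue
    if year_of_publication is None:
        return "0001"  # year of publication not known
    return f"{int(year_of_publication):04d}"

assert yop_value(1997) == "1997"
assert yop_value(None) == "0001"
assert yop_value("in_press") == "9999"
```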
Report Filters and Report Attributes
Customized views of the usage data are created by applying report filters and report attributes to the Master Reports. The Standard Views specified by R5 are examples of such views. Report attributes define the columns (elements) and report filters the rows (values) included in the reports. For Master Reports the user can choose from specific sets of filters and attributes depending on the report, while for Standard Views the filters and attributes are pre-set except for an optional Platform filter.
The filters and attributes used to create a report are included in the report header (unless the default value is used, in which case the filter/attribute MUST be omitted): in JSON reports as name/value pairs in the Report_Filters and Report_Attributes elements, and in tabular reports encoded in the Metric_Types, Reporting_Period, Report_Filters and Report_Attributes elements (see Section 3.2.1 for the encoding). For the COUNTER_SUSHI API each filter/attribute corresponds to a method parameter with the same name in lower case (see the COUNTER_SUSHI API Specification for details).
The tables below show the attributes and filters and the reports where they (might) appear in the header (excluding Standard Views using the default values).
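The rule that each filter/attribute name maps to a lower-case COUNTER_SUSHI API parameter can be sketched as follows (non-normative Python; the base URL and endpoint path are hypothetical, see the COUNTER_SUSHI API Specification for the actual endpoints and required credentials):

```python
from urllib.parse import urlencode

def sushi_report_url(base_url, report_id, filters):
    """Build a COUNTER_SUSHI report request: each filter/attribute
    name becomes a method parameter with the same name in lower case."""
    params = {name.lower(): value for name, value in filters.items()}
    return f"{base_url}/reports/{report_id.lower()}?{urlencode(params)}"

url = sushi_report_url(
    "https://example.com/sushi",  # hypothetical service root
    "TR",
    {"Begin_Date": "2016-01", "End_Date": "2016-08", "Access_Type": "Controlled"},
)
print(url)
# https://example.com/sushi/reports/tr?begin_date=2016-01&end_date=2016-08&access_type=Controlled
```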
Table 3.x (below): Report Attributes
Report Attribute |
Description |
Reports |
---|---|---|
Attributes_To_Show |
List of additional columns/elements to include in the report (default: none). See Section 4.1.2, Section 4.2.2, Section 4.3.2 and Section 4.4.2 for permissible values. Note that the component and parent columns/elements cannot be selected individually and MUST NOT be included in the list (see the Include_Component_Details and Include_Parent_Details attributes below). |
PR, DR, TR, IR |
Exclude_Monthly_Details |
Specifies whether to exclude the columns with the monthly usage from the report. Permissible values are False (default) and True. This attribute is only applicable for tabular reports. The corresponding attribute for JSON reports is Granularity. |
PR, DR, TR, IR |
Granularity |
Specifies the granularity of the usage data to include in the report. Permissible values are Month (default) and Totals. This attribute is only applicable to JSON reports; the corresponding attribute for tabular reports is Exclude_Monthly_Details. For Totals each Item_Performance element represents the aggregated usage for the reporting period. Support for Month is REQUIRED for COUNTER compliance; support for Totals is optional. |
PR, DR, TR, IR |
Include_Component_Details |
Specifies whether to include the component columns/elements (see Table 3.k) in the report. Permissible values are False (default) and True. |
IR |
Include_Parent_Details |
Specifies whether to include the parent columns/elements (see Table 3.j) in the report. Permissible values are False (default) and True. |
IR |
Table 3.y (below): Report Filters
Report Filter |
Description |
Reports |
---|---|---|
Access_Method |
List of Access_Methods for which to include usage (default: all). See Section 4.1.3, Section 4.2.3, Section 4.3.3 and Section 4.4.3 for permissible/pre-set values. |
All reports: |
Access_Type |
List of Access_Types for which to include usage (default: all). See Section 4.3.3 and Section 4.4.3 for permissible/pre-set values. |
TR, IR |
Begin_Date |
Beginning and end of the reporting period. Note that the COUNTER_SUSHI API allows the format yyyy-mm for the method parameters, which must be expanded with the first/last day of the month for the report header. For the tabular reports these filters are included in the Reporting_Period header instead of the Report_Filters header for easier reading. |
All reports: |
Database |
Name of a specific database for which usage is being requested (default: all). Support for this filter is optional but recommended for the reporting website. |
DR |
Data_Type |
List of Data_Types for which to include usage (default: all). See Section 4.1.3, Section 4.2.3, Section 4.3.3 and Section 4.4.3 for permissible/pre-set values. |
PR, DR, TR, IR |
Item_Contributor |
Identifier of a specific contributor (author) for which usage is being requested (default: all). Support for this filter is optional but recommended for the reporting website. |
IR |
Item_ID |
Identifier of a specific item for which usage is being requested. Support for this filter is optional but recommended for the reporting website. |
TR, IR |
Metric_Type |
List of Metric_Types for which to include usage (default: all). See Section 4.1.3, Section 4.2.3, Section 4.3.3 and Section 4.4.3 for permissible/pre-set values. For the tabular reports this filter is included in the Metric_Types header instead of the Report_Filters header for easier reading. |
All reports: |
Platform |
The Platform filter is intended only for cases where there is a single endpoint for multiple platforms; that is, the same base URL for the COUNTER_SUSHI API is used for multiple platforms and the platform parameter is required for all API calls. In the web interface this would correspond to first selecting one platform and then creating reports only for that platform. |
All reports: |
Section_Type |
List of Section_Types for which to include usage (default: all). See Section 4.3.3 for permissible values. |
TR |
YOP |
Range of years of publication for which to include usage (default: all). For the COUNTER_SUSHI API more complex filter values (list of years and ranges) MUST be supported. |
TR, IR |
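As noted for the Begin_Date/End_Date filters above, the yyyy-mm form allowed by the COUNTER_SUSHI API must be expanded to full dates in the report header. One way a report provider might implement that expansion, sketched in Python (function name is hypothetical):

```python
import calendar
from datetime import date

def expand_reporting_period(begin: str, end: str) -> tuple[date, date]:
    """Expand yyyy-mm method parameters to the full dates used in the
    report header: first day of the begin month, last day of the end month."""
    by, bm = (int(p) for p in begin.split("-"))
    ey, em = (int(p) for p in end.split("-"))
    return date(by, bm, 1), date(ey, em, calendar.monthrange(ey, em)[1])

begin_date, end_date = expand_reporting_period("2022-01", "2022-06")
# Header would read: Begin_Date=2022-01-01; End_Date=2022-06-30
```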
Zero Usage
Not all content providers or other COUNTER report providers link their COUNTER reporting tool to their subscription database, so R5 reports cannot include zero-usage reporting based on subscription records. Inclusion of zero-usage reporting for everything, including unsubscribed content, could make reports unmanageably large. The need for libraries to identify subscribed titles with zero usage will be addressed by the KBART Automation Working Group initiative.
For tabular reports
Omit any row where the Reporting_Period_Total would be zero.
If the Reporting_Period_Total is not zero, but usage for an included month is zero, set the cell value for that month to 0.
For JSON reports
Omit any Instance element with a Count of zero.
Omit Performance elements that don’t have at least one Instance element.
Omit Report_Items elements that don’t have at least one Performance element.
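The three JSON rules above amount to a single bottom-up pruning pass over the Report_Items of a report. A minimal sketch in Python (function name is hypothetical; element names follow the COUNTER_SUSHI report structure):

```python
def prune_zero_usage(report_items: list[dict]) -> list[dict]:
    """Apply the R5 zero-usage rules to JSON Report_Items: drop Instances
    with a Count of zero, then Performances without Instances, then
    Report_Items without Performances."""
    pruned = []
    for item in report_items:
        performances = []
        for perf in item.get("Performance", []):
            instances = [i for i in perf.get("Instance", []) if i.get("Count", 0) > 0]
            if instances:
                performances.append({**perf, "Instance": instances})
        if performances:
            pruned.append({**item, "Performance": performances})
    return pruned
```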
Missing and Unknown Values
The value for an element might be missing or unknown, for example a title might not have an ISBN or the ISBN might be unknown. In COUNTER reports this is expressed as follows:
For tabular reports the cell MUST be left blank.
For JSON reports
If the COUNTER_SUSHI API Specification (see Section 8) indicates the element is REQUIRED, the value of the element MUST be expressed as empty as appropriate for the data type.
If the element is not REQUIRED according to the COUNTER_SUSHI API Specification, the element MUST be omitted.
For clarity, values such as “unknown”, “n/a” or “-” MUST NOT be used.
If a non-empty value is required for an element and the value is empty or the element is omitted, the COUNTER Release 5 Validation Tool reports a (Critical) Error which would cause the report to fail an audit. If Title, Item or Publisher is empty or Data_Type Unspecified is used, the COUNTER Release 5 Validation Tool reports a Warning which might affect the result of an audit. See Section 9.2 for details on the error levels used by the COUNTER Release 5 Validation Tool.
COUNTER reports
Platform Reports
Platform Reports provide a summary of activity on a given platform to support the evaluation of platforms and to provide high-level statistical data to support surveys and reporting to funders.
Table 4 (below): Platform Master Report and Standard Views
Report_ID |
Report_Name |
Details |
Host_Types |
---|---|---|---|
PR |
Platform Master Report |
A customizable report summarizing activity across a content provider’s platforms that allows the user to apply filters and select other configuration options. |
All Host_Types: |
PR_P1 |
Platform Usage |
Platform-level usage summarized by Metric_Type. |
All Host_Types: |
*Data repositories may choose to conform to the Code of Practice Release 5 or, alternatively, may wish to work with the Code of Practice for Research Data.
Report Header
The table below shows the header details for the Platform Master Report and its Standard Views. For the tabular reports, elements MUST appear in the exact order shown, and spelling, casing, and punctuation of labels (Column A) and fixed data elements such as report names (Column B) MUST match exactly. The JSON version of the report MUST comply with the Report_Header definition in the COUNTER_SUSHI API Specification (see Section 8 below). Entries in the table appearing in italics describe the values to include.
Table 4.a (below): Header for Platform Master Report and Standard Views
Row in Tabular Report |
Label for Tabular Report (Column A) |
Value for Tabular Report (Column B) |
|
---|---|---|---|
PR |
PR_P1 |
||
1 |
Report_Name |
Platform Master Report |
Platform Usage |
2 |
Report_ID |
PR |
PR_P1 |
3 |
Release |
5 |
5 |
4 |
Institution_Name |
Name of the institution the usage is attributed to. |
|
5 |
Institution_ID |
Identifier(s) for the institution in the format of {namespace}:{value}. Leave blank if identifier is not known. Multiple identifiers may be included by separating with semicolon-space (“; ”). |
|
6 |
Metric_Types |
Semicolon-space delimited list of Metric_Types included in the report. |
Searches_Platform; Total_Item_Requests; Unique_Item_Requests; Unique_Title_Requests |
7 |
Report_Filters |
Semicolon-space delimited list of filters applied to the data to generate the report. |
Access_Method=Regular* |
8 |
Report_Attributes |
Semicolon-space delimited list of report attributes applied to the data to generate the report. |
(blank) |
9 |
Exceptions |
Any exceptions that occurred in generating the report, in the format “{Exception Code}: {Exception Message} ({Data})” with multiple exceptions separated by semicolon-space (“; ”). |
|
10 |
Reporting_Period |
Date range requested for the report in the form of “Begin_Date=yyyy-mm-dd; End_Date=yyyy-mm-dd”. The “dd” of the Begin_Date is 01. The “dd” of the End_Date is the last day of the month. |
|
11 |
Created |
Date and time the report was run in RFC3339 date-time format (yyyy-mm-ddThh:mm:ssZ). |
|
12 |
Created_By |
Name of organization or system that generated the report. |
|
13 |
(blank) |
(blank) |
(blank) |
*If a Platform filter is used (see Section 3.3.8 for details), it MUST be included in Report_Filters.
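For concreteness, a PR_P1 header filled in according to Table 4.a might look like this in a tab-separated tabular report (institution name, identifier, dates, and provider name are illustrative; rows 8, 9 and 13 are left blank here):

```
Report_Name	Platform Usage
Report_ID	PR_P1
Release	5
Institution_Name	Example University
Institution_ID	ISNI:1234123412341234
Metric_Types	Searches_Platform; Total_Item_Requests; Unique_Item_Requests; Unique_Title_Requests
Report_Filters	Access_Method=Regular
Report_Attributes	
Exceptions	
Reporting_Period	Begin_Date=2022-01-01; End_Date=2022-12-31
Created	2023-01-15T09:30:00Z
Created_By	Example Platform
	
```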
Column Headings/Elements
The following elements MUST appear in the tabular report in the order they appear in the table below. For guidance on how these elements appear in the JSON format, refer to the COUNTER_SUSHI API Specification (see Section 8 below). Mandatory (M) elements MUST be included in the report. The other elements MUST only be included in the Master Report if requested (R), and if included they MUST be listed in Attributes_To_Show in the Report_Attributes header.
Table 4.b (Below): Column Headings/Elements for Platform Master Report and Standard Views
Element Name (Tabular) |
PR |
PR_P1 |
---|---|---|
Platform |
M |
M |
Data_Type |
R |
|
Access_Method |
R |
|
Metric_Type |
M |
M |
Reporting_Period_Total |
M |
M |
Mmm-yyyy |
M* |
M |
*unless Exclude_Monthly_Details=True is used
Filters and Attributes
The following table presents the values that can be chosen for the Platform Master Report and that are pre-set for the Standard Views. If a filter is not included in the request, the default applies. For the Standard Views an empty cell indicates that the filter is not applied.
Table 4.c (below) Filters/Attributes for Platform Master Report and Standard Views
Filter/Attribute |
Filters available (options for Master Report and required for Standard Views) |
|
---|---|---|
PR |
PR_P1 |
|
Data_Type |
One or more or all (default) of the Data_Types applicable to the platform. |
|
Access_Method |
One or all (default) of: |
Regular |
Metric_Type |
One or more or all (default) of: |
Searches_Platform |
Exclude_Monthly_Details |
False (default) or True |
If a filter is applied to a column that doesn’t show on the report, usage for all selected attribute values is summed and the totals are presented in the report.
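This roll-up behaviour can be sketched as a group-and-sum over the columns that remain visible; hidden attribute columns are aggregated away (data and function name are hypothetical):

```python
from collections import defaultdict

def roll_up(rows: list[dict], visible: list[str]) -> dict[tuple, int]:
    """Sum usage counts over all rows sharing the same values in the
    visible columns; filtered-but-hidden columns are summed away."""
    totals: dict[tuple, int] = defaultdict(int)
    for row in rows:
        key = tuple(row[col] for col in visible)
        totals[key] += row["Count"]
    return dict(totals)

rows = [
    {"Platform": "ExamplePlatform", "Data_Type": "Book", "Metric_Type": "Total_Item_Requests", "Count": 2},
    {"Platform": "ExamplePlatform", "Data_Type": "Journal", "Metric_Type": "Total_Item_Requests", "Count": 5},
]
# Data_Type filtered but not shown: usage is summed across the selected Data_Types.
roll_up(rows, ["Platform", "Metric_Type"])
# → {("ExamplePlatform", "Total_Item_Requests"): 7}
```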
Database Reports
Database Reports provide a summary of activity related to a given database or fixed collection of content that is packaged like a database. These reports provide a means of evaluating the impact a database has for an institution’s users.
Table 4.d (below): Database Master Report and Standard Views
Report_ID |
Report_Name |
Details |
Host_Types |
---|---|---|---|
DR |
Database Master Report |
A customizable report detailing activity by database that allows the user to apply filters and select other configuration options. |
A&I_Database |
DR_D1 |
Database Search and Item Usage |
Reports on key Searches, Investigations and Requests metrics needed to evaluate a database. |
A&I_Database |
DR_D2 |
Database Access Denied |
Reports on Access Denied activity for databases where users were denied access because simultaneous-use licenses were exceeded or their institution did not have a license for the database. |
A&I_Database |
Report Header
The table below shows the header details for the Database Master Report and its Standard Views. For the tabular reports, elements MUST appear in the exact order shown, and spelling, casing, and punctuation of labels (Column A) and fixed data elements such as report names (Column B) MUST match exactly. The JSON version of the report MUST comply with the Report_Header definition in the COUNTER_SUSHI API Specification (see Section 8 below). Entries in the table appearing in italics describe the values to include.
Table 4.e (below): Header for Database Master Report and Standard Views
Row in Tabular Report |
Label for Tabular Report (Column A) |
Value for Tabular Report (Column B) |
||
---|---|---|---|---|
DR |
DR_D1 |
DR_D2 |
||
1 |
Report_Name |
Database Master Report |
Database Search and Item Usage |
Database Access Denied |
2 |
Report_ID |
DR |
DR_D1 |
DR_D2 |
3 |
Release |
5 |
5 |
5 |
4 |
Institution_Name |
Name of the institution the usage is attributed to. |
||
5 |
Institution_ID |
Identifier(s) for the institution in the format of {namespace}:{value}. Leave blank if identifier is not known. Multiple identifiers may be included by separating with semicolon-space (“; ”). |
||
6 |
Metric_Types |
Semicolon-space delimited list of Metric_Types included in the report. |
Searches_Automated; Searches_Federated; Searches_Regular; Total_Item_Investigations; Total_Item_Requests |
Limit_Exceeded; No_License |
7 |
Report_Filters |
Semicolon-space delimited list of filters applied to the data to generate the report. |
Access_Method=Regular* |
Access_Method=Regular* |
8 |
Report_Attributes |
Semicolon-space delimited list of report attributes applied to the data to generate the report. |
(blank) |
(blank) |
9 |
Exceptions |
Any exceptions that occurred in generating the report, in the format “{Exception Code}: {Exception Message} ({Data})” with multiple exceptions separated by semicolon-space (“; ”). |
||
10 |
Reporting_Period |
Date range requested for the report in the form of “Begin_Date=yyyy-mm-dd; End_Date=yyyy-mm-dd”. The “dd” of the Begin_Date is 01. The “dd” of the End_Date is the last day of the month. |
||
11 |
Created |
Date and time the report was run in RFC3339 date-time format (yyyy-mm-ddThh:mm:ssZ). |
||
12 |
Created_By |
Name of organization or system that generated the report. |
||
13 |
(blank) |
(blank) |
(blank) |
(blank) |
*If a Platform filter is used (see Section 3.3.8 for details), it MUST be included in Report_Filters.
Column Headings/Elements
The following elements MUST appear in the tabular report in the order they appear in the table below. For guidance on how these elements appear in the JSON format, refer to the COUNTER_SUSHI API Specification (see Section 8 below). Mandatory (M) elements MUST be included in the report. The other elements MUST only be included in the Master Report if requested (R), and if included they MUST be listed in Attributes_To_Show in the Report_Attributes header.
Table 4.f (below): Column Headings/Elements for Database Master Report and Standard Views
Element Name (Tabular) |
DR |
DR_D1 |
DR_D2 |
---|---|---|---|
Database |
M |
M |
M |
Publisher |
M |
M |
M |
Publisher_ID |
M |
M |
M |
Platform |
M |
M |
M |
Proprietary_ID |
M |
M |
M |
Data_Type |
R |
||
Access_Method |
R |
||
Metric_Type |
M |
M |
M |
Reporting_Period_Total |
M |
M |
M |
Mmm-yyyy |
M* |
M |
M |
*unless Exclude_Monthly_Details=True is used
Filters and Attributes
The following table presents the values that can be chosen for the Database Master Report and that are pre-set for the Standard Views. If a filter is not included in the request, the default applies. For the Standard Views an empty cell indicates that the filter is not applied.
Table 4.g (below): Filters/Attributes for Database Master Report and Standard Views
Filter/Attribute |
Filters available (options for Master Report and required for Standard Views) |
||
---|---|---|---|
DR |
DR_D1 |
DR_D2 |
|
Data_Type |
One or more or all (default) of the Data_Types applicable to the platform. |
||
Access_Method |
One or all (default) of: |
Regular |
Regular |
Metric_Type |
One or more or all (default) of: |
Searches_Automated |
Limit_Exceeded |
Exclude_Monthly_Details |
False (default) or True |
If a filter is applied to a column that doesn’t show on the report, usage for all selected attribute values is summed and the totals are presented in the report.
Title Reports
Title Reports provide a summary of activity related to content at the title level and provide a means of evaluating the impact a title has for an institution’s patrons.
Table 4.h (below): Title Master Report and Standard Views
Report_ID |
Report_Name |
Details |
Host_Types |
---|---|---|---|
TR |
Title Master Report |
A customizable report detailing activity at the title level (journal, book, etc.) that allows the user to apply filters and select other configuration options. |
Aggregated_Full_Content |
TR_B1 |
Book Requests (Excluding OA_Gold) |
Reports on full-text activity for books, excluding Gold Open Access content, as Total_Item_Requests and Unique_Title_Requests. The Unique_Title_Requests provides comparable usage across book platforms. The Total_Item_Requests shows overall activity; however, numbers between sites will vary significantly based on how the content is delivered (e.g. delivered as a complete book or by chapter). |
Aggregated_Full_Content |
TR_B2 |
Book Access Denied |
Reports on Access Denied activity for books where users were denied access because simultaneous-use licenses were exceeded or their institution did not have a license for the book. |
Aggregated_Full_Content |
TR_B3 |
Book Usage by Access Type |
Reports on book usage showing all applicable Metric_Types broken down by Access_Type. |
Aggregated_Full_Content |
TR_J1 |
Journal Requests (Excluding OA_Gold) |
Reports on usage of journal content, excluding Gold Open Access content, as Total_Item_Requests and Unique_Item_Requests. The Unique_Item_Requests provides comparable usage across journal platforms by reducing the inflationary effect that occurs when an HTML full text automatically displays and the user then accesses the PDF version. The Total_Item_Requests shows overall activity. |
Aggregated_Full_Content |
TR_J2 |
Journal Access Denied |
Reports on Access Denied activity for journal content where users were denied access because simultaneous-use licenses were exceeded or their institution did not have a license for the title. |
Aggregated_Full_Content |
TR_J3 |
Journal Usage by Access Type |
Reports on usage of journal content for all Metric_Types broken down by Access_Type. |
Aggregated_Full_Content |
TR_J4 |
Journal Requests by YOP (Excluding OA_Gold) |
Breaks down the usage of journal content, excluding Gold Open Access content, by year of publication (YOP), providing counts for the Metric_Types Total_Item_Requests and Unique_Item_Requests. Provides the details necessary to analyze usage of content in backfiles or covered by perpetual access agreements. Note that COUNTER reports do not provide access model or perpetual access rights details. |
Aggregated_Full_Content |
Report Header
The table below shows the header details for the Title Master Report and its Standard Views. For the tabular reports, elements MUST appear in the exact order shown, and spelling, casing, and punctuation of labels (Column A) and fixed data elements such as report names (Column B) MUST match exactly. The JSON version of the report MUST comply with the Report_Header definition in the COUNTER_SUSHI API Specification (see Section 8 below). Entries in the table appearing in italics describe the values to include.
Table 4.i (below) Header for Title Master Report and Standard Views - Part 1 (for Books)
Row in Tabular Report |
Label for Tabular Report (Column A) |
Value for Tabular Report (Column B) |
|||
---|---|---|---|---|---|
TR |
TR_B1 |
TR_B2 |
TR_B3 |
||
1 |
Report_Name |
Title Master Report |
Book Requests (Excluding OA_Gold) |
Book Access Denied |
Book Usage by Access Type |
2 |
Report_ID |
TR |
TR_B1 |
TR_B2 |
TR_B3 |
3 |
Release |
5 |
5 |
5 |
5 |
4 |
Institution_Name |
Name of the institution the usage is attributed to. |
|||
5 |
Institution_ID |
Identifier(s) for the institution in the format of {namespace}:{value}. Leave blank if identifier is not known. Multiple identifiers may be included by separating with semicolon-space (“; ”). |
|||
6 |
Metric_Types |
Semicolon-space delimited list of Metric_Types included in the report. |
Total_Item_Requests; |
Limit_Exceeded; |
Total_Item_Investigations; |
7 |
Report_Filters |
Semicolon-space delimited list of filters applied to the data to generate the report. |
Data_Type=Book; |
Data_Type=Book; |
Data_Type=Book; |
8 |
Report_Attributes |
Semicolon-space delimited list of report attributes applied to the data to generate the report. |
(blank) |
(blank) |
(blank) |
9 |
Exceptions |
Any exceptions that occurred in generating the report, in the format “{Exception Code}: {Exception Message} ({Data})” with multiple exceptions separated by semicolon-space (“; ”). |
|||
10 |
Reporting_Period |
Date range requested for the report in the form of “Begin_Date=yyyy-mm-dd; End_Date=yyyy-mm-dd”. The “dd” of the Begin_Date is 01. The “dd” of the End_Date is the last day of the month. |
|||
11 |
Created |
Date and time the report was run in RFC3339 date-time format (yyyy-mm-ddThh:mm:ssZ). |
|||
12 |
Created_By |
Name of organization or system that generated the report. |
|||
13 |
(blank) |
(blank) |
(blank) |
(blank) |
(blank) |
*If a Platform filter is used (see Section 3.3.8 for details), it MUST be included in Report_Filters.
Table 4.j (below): Header for Title Master Report and Standard Views - Part 2 (for Journals)
Row in Tabular Report |
Label for Tabular Report (Column A) |
Value for Tabular Report (Column B) |
|||
---|---|---|---|---|---|
TR_J1 |
TR_J2 |
TR_J3 |
TR_J4 |
||
1 |
Report_Name |
Journal Requests (Excluding OA_Gold) |
Journal Access Denied |
Journal Usage by Access Type |
Journal Requests by YOP (Excluding OA_Gold) |
2 |
Report_ID |
TR_J1 |
TR_J2 |
TR_J3 |
TR_J4 |
3 |
Release |
5 |
5 |
5 |
5 |
4 |
Institution_Name |
Name of the institution the usage is attributed to. |
|||
5 |
Institution_ID |
Identifier(s) for the institution in the format of {namespace}:{value}. Leave blank if identifier is not known. Multiple identifiers may be included by separating with semicolon-space (“; ”). |
|||
6 |
Metric_Types |
Total_Item_Requests; |
Limit_Exceeded; |
Total_Item_Investigations; |
Total_Item_Requests; |
7 |
Report_Filters |
Data_Type=Journal; |
Data_Type=Journal; |
Data_Type=Journal; |
Data_Type=Journal; |
8 |
Report_Attributes |
(blank) |
(blank) |
(blank) |
(blank) |
9 |
Exceptions |
Any exceptions that occurred in generating the report, in the format “{Exception Code}: {Exception Message} ({Data})” with multiple exceptions separated by semicolon-space (“; ”). |
|||
10 |
Reporting_Period |
Date range requested for the report in the form of “Begin_Date=yyyy-mm-dd; End_Date=yyyy-mm-dd”. The “dd” of the Begin_Date is 01. The “dd” of the End_Date is the last day of the month. |
|||
11 |
Created |
Date and time the report was run in RFC3339 date-time format (yyyy-mm-ddThh:mm:ssZ). |
|||
12 |
Created_By |
Name of organization or system that generated the report. |
|||
13 |
(blank) |
(blank) |
(blank) |
(blank) |
(blank) |
*If a Platform filter is used (see Section 3.3.8 for details), it MUST be included in Report_Filters.
Column Headings/Elements
The following elements MUST appear in the tabular report in the order they appear in the table below. For guidance on how these elements appear in the JSON format, refer to the COUNTER_SUSHI API Specification (see Section 8 below). Mandatory (M) elements MUST be included in the report. The other elements MUST only be included in the Master Report if requested (R), and if included they MUST be listed in Attributes_To_Show in the Report_Attributes header.
Table 4.k (below): Column Headings/Elements for Title Master Report and Standard Views
Element Name (Tabular) |
TR |
TR_B1 |
TR_B2 |
TR_B3 |
TR_J1 |
TR_J2 |
TR_J3 |
TR_J4 |
---|---|---|---|---|---|---|---|---|
Title |
M |
M |
M |
M |
M |
M |
M |
M |
Publisher |
M |
M |
M |
M |
M |
M |
M |
M |
Publisher_ID |
M |
M |
M |
M |
M |
M |
M |
M |
Platform |
M |
M |
M |
M |
M |
M |
M |
M |
DOI |
M |
M |
M |
M |
M |
M |
M |
M |
Proprietary_ID |
M |
M |
M |
M |
M |
M |
M |
M |
ISBN |
M |
M |
M |
M |
||||
Print_ISSN |
M |
M |
M |
M |
M |
M |
M |
M |
Online_ISSN |
M |
M |
M |
M |
M |
M |
M |
M |
URI |
M |
M |
M |
M |
M |
M |
M |
M |
Data_Type |
R |
|||||||
Section_Type |
R |
|||||||
YOP |
R |
M |
M |
M |
M |
|||
Access_Type |
R |
M |
M |
|||||
Access_Method |
R |
|||||||
Metric_Type |
M |
M |
M |
M |
M |
M |
M |
M |
Reporting_Period_Total |
M |
M |
M |
M |
M |
M |
M |
M |
Mmm-yyyy |
M* |
M |
M |
M |
M |
M |
M |
M |
*unless Exclude_Monthly_Details=True is used
Filters and Attributes
The following table presents the values that can be chosen for the Title Master Report and that are pre-set for the Standard Views. If a filter is not included in the request, the default applies. For the Standard Views an empty cell indicates that the filter is not applied.
Table 4.l (below): Filters/Attributes for Title Master Report and Standard Views - Part 1 (for Books)
Filter/Attribute |
Filters available (options for Master Report and required for Standard Views) |
|||
---|---|---|---|---|
TR |
TR_B1 |
TR_B2 |
TR_B3 |
|
Data_Type |
One or more or all (default) of the Data_Types applicable to the platform. |
Book |
Book |
Book |
Section_Type |
One or more or all (default) of the Section_Types applicable to the platform. |
|||
YOP |
All years (default), a specific year in the format yyyy, or a range of years in the format yyyy-yyyy. Use 0001 for unknown or 9999 for articles in press. Note that the COUNTER_SUSHI API allows the specification of multiple years and ranges separated by the vertical pipe (“|”) character. |
|||
Access_Type |
One or more or all (default) of: |
Controlled |
||
Access_Method |
One or all (default) of: |
Regular |
Regular |
Regular |
Metric_Type |
One or more or all (default) of: |
Total_Item_Requests |
Limit_Exceeded |
Total_Item_Investigations |
Exclude_Monthly_Details |
False (default) or True |
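The YOP filter values accepted by the COUNTER_SUSHI API (single years and yyyy-yyyy ranges, separated by the vertical pipe) can be expanded with a short helper, sketched here (function name is hypothetical):

```python
def parse_yop_filter(value: str) -> set[int]:
    """Expand a COUNTER_SUSHI yop parameter such as "2010|2015-2018"
    into the set of years of publication it selects."""
    years: set[int] = set()
    for part in value.split("|"):
        if "-" in part:
            start, end = (int(p) for p in part.split("-"))
            years.update(range(start, end + 1))
        else:
            years.add(int(part))
    return years

parse_yop_filter("2010|2015-2018")
# → {2010, 2015, 2016, 2017, 2018}
```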
Table 4.m (below): Filters/Attributes for Title Master Report and Standard Views - Part 2 (for Journals)
Filter/Attribute |
Filters available (options for Master Report and required for Standard Views) |
|||
---|---|---|---|---|
TR_J1 |
TR_J2 |
TR_J3 |
TR_J4 |
|
Data_Type |
Journal |
Journal |
Journal |
Journal |
Section_Type |
||||
YOP |
||||
Access_Type |
Controlled |
Controlled |
||
Access_Method |
Regular |
Regular |
Regular |
Regular |
Metric_Type |
Total_Item_Requests |
Limit_Exceeded |
Total_Item_Investigations |
Total_Item_Requests |
Exclude_Monthly_Details |
If a filter is applied to a column that doesn’t show on the report, usage for all selected attribute values is summed and the totals are presented in the report.
Item Reports
Item Reports provide a summary of activity related to content at the item level and provide a means of evaluating the impact an item has for an institution’s patrons.
Table 4.n (below): Item Master Report and Standard Views
Report_ID |
Report_Name |
Details |
Host_Types |
---|---|---|---|
IR |
Item Master Report |
A granular, customizable report showing activity at the level of the item (article, chapter, media object, etc.) that allows the user to apply filters and select other configuration options. |
Data_Repository* |
IR_A1 |
Journal Article Requests |
Reports on journal article requests at the article level. This report is limited to content with a Data_Type of Article, Parent_Data_Type of Journal, and Metric_Types of Total_Item_Requests and Unique_Item_Requests. This Standard View MUST only be provided if (a) it is clear for all articles in IR whether they are journal articles or not and (b) the parent item is known for all journal articles. |
Repository |
IR_M1 |
Multimedia Item Requests |
Reports on multimedia requests at the item level. |
Multimedia |
*Data repositories may choose to conform to the Code of Practice Release 5 or, alternatively, may wish to work with the Code of Practice for Research Data.
Report Header
The table below shows the header details for the Item Master Report and its Standard Views. For the tabular reports, elements MUST appear in the exact order shown, and spelling, casing, and punctuation of labels (Column A) and fixed data elements such as report names (Column B) MUST match exactly. The JSON version of the report MUST comply with the Report_Header definition in the COUNTER_SUSHI API Specification (see Section 8 below). Entries in the table appearing in italics describe the values to include.
Table 4.o (below): Header for Item Master Report and Standard Views
Row in Tabular Report |
Label for Tabular Report (Column A) |
Value for Tabular Report (Column B) |
||
---|---|---|---|---|
IR |
IR_A1 |
IR_M1 |
||
1 |
Report_Name |
Item Master Report |
Journal Article Requests |
Multimedia Item Requests |
2 |
Report_ID |
IR |
IR_A1 |
IR_M1 |
3 |
Release |
5 |
5 |
5 |
4 |
Institution_Name |
Name of the institution the usage is attributed to. |
||
5 |
Institution_ID |
Identifier(s) for the institution in the format of {namespace}:{value}. Leave blank if identifier is not known. Multiple identifiers may be included by separating with semicolon-space (“; ”). |
||
6 |
Metric_Types |
Semicolon-space delimited list of Metric_Types included in the report. |
Total_Item_Requests; Unique_Item_Requests |
Total_Item_Requests |
7 |
Report_Filters |
Semicolon-space delimited list of filters applied to the data to generate the report. |
Data_Type=Article; Parent_Data_Type=Journal; Access_Method=Regular* |
Data_Type=Multimedia; Access_Method=Regular* |
8 |
Report_Attributes |
Semicolon-space delimited list of report attributes applied to the data to generate the report. |
(blank) |
(blank) |
9 |
Exceptions |
Any exceptions that occurred in generating the report, in the format “{Exception Code}: {Exception Message} ({Data})” with multiple exceptions separated by semicolon-space (“; ”). |
||
10 |
Reporting_Period |
Date range requested for the report in the form of “Begin_Date=yyyy-mm-dd; End_Date=yyyy-mm-dd”. The “dd” of the Begin_Date is 01. The “dd” of the End_Date is the last day of the month. |
||
11 |
Created |
Date and time the report was run in RFC3339 date-time format (yyyy-mm-ddThh:mm:ssZ). |
||
12 |
Created_By |
Name of organization or system that generated the report. |
||
13 |
(blank) |
(blank) |
(blank) |
(blank) |
*If a Platform filter is used (see Section 3.3.8 for details), it MUST be included in Report_Filters.
Column Headings/Elements
The following elements MUST appear in the tabular report in the order they appear in the table below. For guidance on how these elements appear in the JSON format, refer to the COUNTER_SUSHI API Specification (see Section 8 below). Mandatory (M) elements MUST be included in the report. The Parent and Component elements MUST only be included in the Master Report if requested (R) via Include_Parent_Details and Include_Component_Details, respectively (they cannot be selected individually). If they are included, the corresponding Include_Parent_Details=True or Include_Component_Details=True MUST be included in the Report_Attributes header. The other elements also MUST only be included if requested (R), and if included they MUST be listed in Attributes_To_Show in the Report_Attributes header.
Table 4.p (below): Column Headings/Elements for Item Master Report and Standard Views
Element Name (Tabular) | IR | IR_A1 | IR_M1
---|---|---|---
Item | M | M | M
Publisher | M | M | M
Publisher_ID | M | M | M
Platform | M | M | M
Authors | R | M |
Publication_Date | R | M |
Article_Version | R | M |
DOI | M | M | M
Proprietary_ID | M | M | M
ISBN | M | |
Print_ISSN | M | M |
Online_ISSN | M | M |
URI | M | M | M
Parent_Title | R | M |
Parent_Authors | R | M |
Parent_Publication_Date | R | |
Parent_Article_Version | R | M |
Parent_Data_Type | R | |
Parent_DOI | R | M |
Parent_Proprietary_ID | R | M |
Parent_ISBN | R | |
Parent_Print_ISSN | R | M |
Parent_Online_ISSN | R | M |
Parent_URI | R | M |
Component_Title | R | |
Component_Authors | R | |
Component_Publication_Date | R | |
Component_Data_Type | R | |
Component_DOI | R | |
Component_Proprietary_ID | R | |
Component_ISBN | R | |
Component_Print_ISSN | R | |
Component_Online_ISSN | R | |
Component_URI | R | |
Data_Type | R | |
YOP | R | |
Access_Type | R | M |
Access_Method | R | |
Metric_Type | M | M | M
Reporting_Period_Total | M | M | M
Mmm-yyyy | M* | M | M
*unless Exclude_Monthly_Details=True is used
Filters and Attributes
The following table presents the values that can be chosen for the Item Master Report and that are pre-set for the Standard Views. If a filter is not included in the request, the default applies. For the Standard Views an empty cell indicates that the filter is not applied.
Table 4.q (below): Filters/Attributes for Item Master Report and Standard Views
Filter/Attribute | Filters available: IR (options for Master Report) | IR_A1 (required for Standard View) | IR_M1 (required for Standard View)
---|---|---|---
Data_Type | One or more or all (default) of the Data_Types applicable to the platform. | Article | Multimedia
YOP | All years (default), a specific year in the format yyyy, or a range of years in the format yyyy-yyyy. Use 0001 for unknown or 9999 for articles in press. Note that the COUNTER_SUSHI API allows the specification of multiple years and ranges separated by the vertical pipe (“\|”) character. | |
Access_Type | One or more or all (default) of: | |
Access_Method | One or all (default) of: | Regular | Regular
Metric_Type | One or more or all (default) of: | Total_Item_Requests | Total_Item_Requests
Include_Parent_Details | False (default) or True | |
Include_Component_Details | False (default) or True | |
Exclude_Monthly_Details | False (default) or True | |
If a filter is applied to a column that is not shown in the report, usage for all selected attribute values is summed and the totals are presented in the report.
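As an illustration of the YOP filter syntax in Table 4.q, the following minimal Python sketch (the function name is illustrative, not part of the Code of Practice) expands a pipe-separated YOP filter value into the set of years it covers:

```python
def parse_yop_filter(yop: str) -> set:
    """Expand a YOP filter value into the set of years it covers.

    Accepts single years (yyyy) and ranges (yyyy-yyyy); the
    COUNTER_SUSHI API allows multiple values separated by the
    vertical pipe character. 0001 (unknown) and 9999 (articles
    in press) are treated as ordinary year values.
    """
    years = set()
    for part in yop.split("|"):
        part = part.strip()
        if "-" in part:
            start, end = part.split("-", 1)
            years.update(range(int(start), int(end) + 1))
        else:
            years.add(int(part))
    return years

print(sorted(parse_yop_filter("2016|2018-2020")))  # [2016, 2018, 2019, 2020]
```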
Delivery of COUNTER Reports
Content providers MUST make tabular versions of COUNTER reports available from an administrative/reporting site accessible by members of the institution requesting the report. All COUNTER reports provided by the content provider MUST also be available via the COUNTER_SUSHI API. Delivery requirements are:
Reports MUST be provided in the following formats:
In tabular form as either an Excel or a tab-separated-value (TSV) file, or both. Additional file formats that can be easily imported into spreadsheet programs without loss or corruption may be offered at the vendor’s discretion.
JSON formatted in accordance with the COUNTER_SUSHI API Specification (see Section 8 below).
The reports in JSON, TSV and other text formats MUST be encoded using UTF-8. Tabular reports in text formats SHOULD include a byte order mark, so that spreadsheet programs can automatically detect the encoding. JSON reports and other SUSHI server responses MUST NOT include a byte order mark (according to RFC 8259, Section 8.1).
Each report MUST be delivered as a separate file to facilitate automated processing of usage reports into ERM and usage consolidation systems. For clarity, multiple reports MUST NOT be included in the same Excel file as separate worksheets.
Tabular reports MUST be made available through a website.
The website may be password-controlled.
Email alerts may be sent when data is updated.
The report interface MUST provide filter and configuration options for the Master Reports that apply to the content provider.
The report interface MUST offer all Standard Views the content provider is required to provide, and Standard Views options MUST automatically apply the REQUIRED filter and configuration options and not allow the user to alter the filters or configuration options except for the usage begin and end dates.
The date range fields on the user interface MUST default to the latest month with complete usage. For example, if the current date is 15 May 2019 and April usage has been processed, the begin date would default to 01 April 2019 and the end date would default to 30 April 2019. If the April usage has not yet been processed, the start and end dates would default to 01 March 2019 and 31 March 2019.
Master Reports MUST include the option to Exclude_Monthly_Details. Item Master Reports MUST include the options to Include_Parent_Details and Include_Component_Details (see Section 3.3.8 for details).
Reports MUST be provided monthly.
Data MUST be updated within 4 weeks of the end of the reporting period.
Usage MUST be processed for the entire month before any usage for that month can be included in reports. If usage for a given month is not available yet, the content provider MUST NOT return usage for that month and MUST include exception 3031 in the report/response to indicate that usage is not ready for requested dates.
A minimum of the current year plus the prior 24 months of usage data MUST be available, unless the content provider is newly COUNTER compliant.
When content providers become compliant with a new release of the Code of Practice, they begin compiling usage compliant with the new release from the time they become compliant; and they MUST continue to provide the older usage that complies with the previous release(s) of the Code of Practice to fulfil the requirement.
The reports MUST allow the customer the flexibility to specify a date range, in terms of months, within the most recent 24-month period.
Reports MUST be available for harvesting via the COUNTER_SUSHI API within 4 weeks of the end of the reporting period.
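The default date-range rule described above (the interface defaults to the latest month with complete usage) can be sketched as follows; the function name is illustrative, not part of the Code of Practice:

```python
import calendar
import datetime as dt

def default_date_range(year: int, month: int):
    """Given the latest month with complete, processed usage, return
    the default begin and end dates for the report interface
    (the first and last day of that month)."""
    last_day = calendar.monthrange(year, month)[1]
    return dt.date(year, month, 1), dt.date(year, month, last_day)

# Example from above: on 15 May 2019 with April usage processed,
# the defaults are 01 April 2019 and 30 April 2019.
print(default_date_range(2019, 4))
```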
Access to Usage for Consortia
Separate consortium reports are not provided under R5. Consortium managers MUST be able to access any R5 report for their members. To facilitate this:
The consortium administrator MUST be able to access the usage statistics for individual consortium member institutions, from a single login, using the same user id and password (i.e. without having to log out and back in for each individual institution).
COUNTER_SUSHI API implementations MUST support the /members path (see Section 10.3 below) to facilitate consortium managers retrieving usage for all members.
Logging Usage
Usage data can be generated in a number of ways, and COUNTER does not prescribe which approach should be taken. The two most common approaches are:
Log file analysis, which reads the log files containing the web server records of all its transactions
Page tagging, which uses JavaScript on each page to notify a third-party server when a page is rendered by a web browser.
Another option is to leverage Distributed Usage Logging (DUL) to capture content activity that happens on other websites. Each of these approaches has advantages and disadvantages, summarised below.
Log File Analysis
The main advantages of log file analysis over page tagging are:
Web servers normally produce log files, so the raw data are already available. No changes to the website are required.
The data is on the organization’s own servers and is in a standard, rather than a proprietary, format. This makes it easy for an organization to switch programs later, use several different programs, and analyse historical data with a new program.
Log files contain information on visits from search engine spiders. Although these MUST NOT be reported as part of user activity, it is useful information for search engine optimization.
Log files require no additional DNS lookups. Thus, there are no external server calls which can slow page load speeds or result in uncounted page views.
The web server reliably records every transaction it makes, including items such as serving PDF documents and content generated by scripts, and does not rely on the visitor’s browser.
Page Tagging
The main advantages of page tagging over log file analysis are:
Counting is activated by opening the page, not requesting it from the server. If a page is cached it will not be counted by the server. Cached pages can account for a significant proportion of page views.
Data is gathered via a component (tag) in the page, usually written in JavaScript, although Java can also be used. jQuery and AJAX can also be used in conjunction with a server-side scripting language (such as PHP) to manipulate the data and store it in a database, allowing complete control over how the data is represented.
The script may have access to additional information on the web client or on the user, not sent in the query.
Page tagging can report on events that do not involve a request to the web server.
Page tagging is available to companies who do not have access to their own web servers.
The page-tagging service manages the process of assigning cookies to visitors; with log file analysis, the server must be configured to do this.
In recent years, page tagging has become a standard in web analytics.
Log file analysis is almost always performed in-house. Page tagging can be done in-house, but is more often provided as a third-party service. The cost differences between these two models can also be a consideration.
Distributed Usage Logging
Distributed Usage Logging (DUL) was an initiative sponsored by Crossref that provided a framework for publishers to capture usage of DOI-identified content items that occurs on other websites, such as aggregators, repositories, and scholarly information-sharing sites. The premise behind DUL was that publishers could register a DUL usage logging end-point with Crossref, which was then mapped to all of the publisher’s DOIs. A content site, such as a repository, could use a content item’s DOI to look up where the publisher wants a transaction to be logged, then use the standard DUL message structure to log the activity. Using DUL could allow a publisher to capture a more complete picture of content usage. The following points cover how DUL may be used with COUNTER statistical reporting:
DUL is not a replacement for log file analysis or page-tagging approaches. DUL can supplement a publisher’s normal usage logging mechanisms, but not replace them.
DUL-captured usage MUST NOT appear on Standard Views.
DUL-captured usage may appear on Master Reports.
DUL-captured usage that appears on Master Reports MUST be reported under the platform name where the transaction occurred.
The platform name MUST include the namespace DUL (i.e. MUST be in the format of DUL:{platform name}), so that DUL-captured usage can be identified and excluded when creating a Standard View from a Master Report.
An organization that supplies usage transactions using DUL MUST include their platform ID with each transaction, and their platform MUST be registered with COUNTER.
Reporting usage through DUL is OPTIONAL.
The publisher receiving transactions through DUL is responsible for performing COUNTER processing to eliminate double-clicks, eliminate robot/crawler or other rogue usage, and perform the actions to identify unique items and unique titles.
Publishers that plan to include usage reported through DUL in their COUNTER Master Reports are responsible for ensuring that DUL-reported usage is included in the audit.
Although the Crossref DUL project has been terminated, this process could still be used by content providers to capture usage activity related to their content that happens on sites other than their own.
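To illustrate the DUL namespace rule above, here is a minimal sketch (the platform names are hypothetical) of excluding DUL-captured usage when deriving a Standard View from a Master Report:

```python
def is_dul_usage(platform: str) -> bool:
    """DUL-captured usage is reported under a platform name of the
    form 'DUL:{platform name}'."""
    return platform.startswith("DUL:")

master_report_rows = [
    {"platform": "ExamplePlatform", "total_item_requests": 10},
    {"platform": "DUL:ExampleRepository", "total_item_requests": 3},
]
# Standard Views MUST NOT contain DUL-captured usage.
standard_view_rows = [r for r in master_report_rows
                      if not is_dul_usage(r["platform"])]
print([r["platform"] for r in standard_view_rows])  # ['ExamplePlatform']
```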
Processing Rules for Underlying COUNTER Reporting Data
Usage data collected by content providers for the usage reports to be sent to customers should meet the basic requirement that only intended usage is recorded and that all requests that are not intended by the user are removed.
Because the way usage records are generated can differ across platforms, it is impractical to describe all the possible filters and techniques used to clean up the data. This Code of Practice, therefore, specifies only the requirements to be met by the data to be used for building the usage reports.
HTTP Status Codes
Only successful and valid requests MUST be counted. For web server log files successful requests are those with specific HTTP status codes (200 and 304). The standards for HTTP status codes are defined and maintained by the IETF HTTP working group in a series of RFCs (most notably RFC 7231). If key events are used, their definition MUST match the HTTP standards. (For more information see The Friendly Guide to Release 5: Technical Notes for Content Providers.)
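For log file analysis, the status-code rule amounts to a simple filter over the server log; a sketch (the log record fields are illustrative):

```python
# Per the rule above, only requests completed with HTTP status
# 200 (OK) or 304 (Not Modified) are successful and may be counted.
COUNTABLE_STATUS_CODES = {200, 304}

def is_countable(status_code: int) -> bool:
    return status_code in COUNTABLE_STATUS_CODES

log_entries = [
    {"url": "/article/1", "status": 200},  # counted
    {"url": "/article/2", "status": 404},  # excluded: not found
    {"url": "/article/3", "status": 304},  # counted: cache revalidation
]
countable = [e for e in log_entries if is_countable(e["status"])]
print(len(countable))  # 2
```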
Double-Click Filtering
The intent of double-click filtering is to remove the potential of over-counting which could occur when a user clicks the same link multiple times, typically due to a slow internet connection. Double-click filtering applies to Total_Item_Investigations, Total_Item_Requests, No_License and Limit_Exceeded. See Section 7.3 and Section 7.4 below for information about counting unique items and titles. The double-click filtering rule is as follows:
Double-clicks, i.e. two clicks in succession, on a link by the same user within a 30-second period MUST be counted as one action. For the purposes of COUNTER, the time window for a double-click on any page is set at a maximum of 30 seconds between the first and second mouse clicks. For example, a click at 10:01:00 and a second click at 10:01:29 would be considered a double-click (one action); a click at 10:01:00 and a second click at 10:01:35 would count as two separate single clicks (two actions).
A double-click may be triggered by a mouse-click or by pressing a refresh or back button. When two actions are made for the same URL within 30 seconds the first request MUST be removed and the second retained.
Any additional requests for the same URL within 30 seconds (between clicks) MUST be treated identically: always remove the first and retain the second.
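The double-click rule above can be sketched as follows, assuming click records sorted by time and a pre-computed user key (how that key is derived is covered by the tracking options in this section):

```python
from datetime import datetime, timedelta

DOUBLE_CLICK_WINDOW = timedelta(seconds=30)

def filter_double_clicks(clicks):
    """Collapse double-clicks: for repeated clicks on the same URL by
    the same user within 30 seconds, remove the earlier click and
    retain the later one. `clicks` is a time-sorted iterable of
    (user_key, url, timestamp) tuples; returns the retained clicks."""
    kept = []
    last_index = {}  # (user_key, url) -> index of last retained click
    for user_key, url, ts in clicks:
        key = (user_key, url)
        if key in last_index and ts - kept[last_index[key]][2] <= DOUBLE_CLICK_WINDOW:
            kept[last_index[key]] = (user_key, url, ts)  # drop first, keep second
        else:
            last_index[key] = len(kept)
            kept.append((user_key, url, ts))
    return kept

clicks = [
    ("user1", "/article/1", datetime(2019, 5, 1, 10, 1, 0)),
    ("user1", "/article/1", datetime(2019, 5, 1, 10, 1, 29)),  # double-click: one action
    ("user1", "/article/1", datetime(2019, 5, 1, 10, 2, 10)),  # > 30 s later: new action
]
print(len(filter_double_clicks(clicks)))  # 2
```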
There are different ways to track whether two requests for the same URL are from the same user and session. These options are listed in order of increasing reliability, with Option 4 being the most reliable.
1. If the user is authenticated only through an IP address, that IP address combined with the browser’s user-agent (logged in the HTTP header) MUST be used to trace double-clicks. Where multiple users on a single IP address share the same browser user-agent, this can occasionally lead to separate clicks from different users being logged as a double-click from one user. This will only happen if the multiple users are clicking on exactly the same content within a few seconds of each other.
2. When a session cookie is implemented and logged, the session cookie MUST be used to trace double-clicks.
3. When a user cookie is available and logged, the user cookie MUST be used to trace double-clicks.
4. When an individual has logged in with their own profile, their username MUST be used to trace double-clicks.
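The options can be sketched as a fallback chain, most reliable first; the record field names below are illustrative, not prescribed by the Code of Practice:

```python
def double_click_user_key(record: dict) -> str:
    """Choose the most reliable available identifier for tracing
    double-clicks, per the options above (the logged-in username
    being the most reliable)."""
    if record.get("username"):            # logged-in profile
        return "user:" + record["username"]
    if record.get("user_cookie"):         # user cookie
        return "ucookie:" + record["user_cookie"]
    if record.get("session_cookie"):      # session cookie
        return "scookie:" + record["session_cookie"]
    # fall back to IP address combined with browser user-agent
    return "ipua:" + record["ip"] + "|" + record["user_agent"]

print(double_click_user_key({"ip": "192.1.1.168", "user_agent": "Mozilla/5.0"}))
# ipua:192.1.1.168|Mozilla/5.0
```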
Counting Unique Items
Some COUNTER Metric_Types count the number of unique items that had a certain activity, such as Unique_Item_Requests or Unique_Item_Investigations.
For the purpose of COUNTER metrics, an item is the typical unit of content being accessed by users, such as articles, book chapters, book segments, whole books (if delivered as a single file), and multimedia content. The item MUST be identified using the unique ID which identifies the work (e.g. chapter or article) regardless of format (e.g. PDF, HTML, or EPUB). If no item-level identifier is available, then use the item name in combination with the identifier of the parent item (i.e. the article title + ISSN of the journal, or chapter name + ISBN of the book).
The rules for calculating the unique item counts are as follows:
If multiple transactions qualifying for the Metric_Type in question represent the same item and occur in the same user session, only one unique activity MUST be counted for that item.
A user session is defined any of the following ways: by a logged session ID + transaction date, by a logged user ID (if users log in with personal accounts) + transaction date + hour of day (day is divided into 24 one-hour slices), by a logged user cookie + transaction date + hour of day, or by a combination of IP address + user agent + transaction date + hour of day.
To allow for simplicity in calculating session IDs, when a session ID is not explicitly tracked, the day will be divided into 24 one-hour slices and a surrogate session ID will be generated by combining the transaction date + hour time slice + one of the following: user ID, cookie ID, or IP address + user agent. For example, consider the following transaction:
Transaction date/time: 2017-06-15 13:35
IP address: 192.1.1.168
User agent: Mozilla/5.0
Generated session ID: 192.1.1.168|Mozilla/5.0|2017-06-15|13
The above replacement for a session ID does not provide an exact analogy to a session. However, statistical studies show that using such a surrogate for a session ID results in unique counts within 1-2% of unique counts generated with actual sessions.
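The surrogate session ID construction and the unique-count rule can be sketched together; the function names are illustrative:

```python
from datetime import datetime

def surrogate_session_id(identifier: str, when: datetime) -> str:
    """Combine an identifier (user ID, cookie ID, or IP address +
    user agent) with the transaction date and the one-hour time
    slice, as described above."""
    return "|".join([identifier, when.strftime("%Y-%m-%d"), str(when.hour)])

def unique_item_requests(transactions) -> int:
    """Count each item at most once per user session."""
    seen = set()
    for t in transactions:
        sid = surrogate_session_id(t["identifier"], t["when"])
        seen.add((sid, t["item_id"]))
    return len(seen)

# Example from above: IP address + user agent at 2017-06-15 13:35
sid = surrogate_session_id("192.1.1.168|Mozilla/5.0", datetime(2017, 6, 15, 13, 35))
print(sid)  # 192.1.1.168|Mozilla/5.0|2017-06-15|13
```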
Counting Unique Titles
Some COUNTER Metric_Types count the number of unique titles that had a certain activity, such as Unique_Title_Requests or Unique_Title_Investigations.
For the purpose of COUNTER metrics, a title represents the parent work that the item is part of. When the item is a chapter or section, the title is the book. The title MUST be identified using a unique identifier (e.g. an ISBN for a book) regardless of format (e.g. PDF or HTML).
The rules for calculating the unique title counts are as follows:
If multiple transactions qualifying for the Metric_Type in question represent the same title and occur in the same user session, only one unique activity MUST be counted for that title.
A user session is defined any of the following ways: by a logged session ID + transaction date, by a logged user ID (if users log in with personal accounts) + transaction date + hour of day (day is divided into 24 one-hour slices), by a logged user cookie + transaction date + hour of day, or by a combination of IP address + user agent + transaction date + hour of day.
To allow for simplicity in calculating session IDs, when a session ID is not explicitly tracked, the day will be divided into 24 one-hour slices and a surrogate session ID will be generated by combining the transaction date + hour time slice + one of the following: user ID, cookie ID, or IP address + user agent. For example, consider the following transaction:
Transaction date/time: 2017-06-15 13:35
IP address: 192.1.1.168
User agent: Mozilla/5.0
Generated session ID: 192.1.1.168|Mozilla/5.0|2017-06-15|13
The above replacement for a session ID does not provide an exact analogy to a session. However, statistical studies show that using such a surrogate for a session ID results in unique counts within 1-2% of unique counts generated with actual sessions.
Attributing Usage when Item Appears in More Than One Database
Content providers that offer databases where a given content item (e.g. an article) is included in multiple databases MUST attribute the Investigations and Requests metrics to just one database. The following recommendations may be helpful when ambiguity arises:
Give priority to databases that the institution has rights to access.
If there is a priority order for databases for search or display within the platform, credit usage to the highest priority database.
Beyond that, use a consistent method of prioritizing databases, such as by database ID or name.
If none of the above, pick randomly.
Federated Searches
Search activity generated by federated search engines MUST be categorized separately from searches conducted by users on the host platform.
Any searches generated from a federated search system MUST be included in the separate Searches_Federated counts within Database Reports and MUST NOT be included in the Searches_Regular or Searches_Automated counts.
The most common ways to recognize federated search activity are as follows:
A federated search engine may be using its own dedicated IP address, which can be identified and used to separate out the activity.
If the standard HTML interface is being used (e.g. for screen scraping), the user agent within the web log files can be used to identify the activity as coming from a federated search.
For Z39.50 activity, authentication is usually through a username/password combination. Create a unique username/password that just the federated search engine will use.
If an API or XML gateway is available, set up an instance of the gateway that is for the exclusive use of federated search tools. It is RECOMMENDED that you also require the federated search to include an identifying parameter when making requests to the gateway.
COUNTER provides lists of user agents that represent the most common federated search tools. See Appendix G.
Discovery Services and Other Multiple-Database Searches
Search activity generated by discovery services and other systems where multiple databases are searched simultaneously and the user does not have the option of selecting the databases being searched MUST be counted as Searches_Automated on Database Reports. Such searches MUST be included on the Platform Reports as Searches_Platform, but only as a single search regardless of the number of databases searched.
Example: A user searches a content site where the librarian has pre-selected 20 databases for business and economics searches and the user does not have the option to change the selection. For each search conducted by the user:
In the Database Report, each of the 20 databases gets credit for 1 Searches_Automated.
In the Platform Report, Searches_Platform gets credited by 1.
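The worked example above can be expressed as a short sketch (the names are illustrative):

```python
def record_automated_search(database_counts: dict, platform_counts: dict,
                            databases, platform: str) -> None:
    """One user search against a pre-selected set of databases: each
    database is credited with 1 Searches_Automated, and the platform
    is credited with a single Searches_Platform."""
    for db in databases:
        database_counts[db] = database_counts.get(db, 0) + 1
    platform_counts[platform] = platform_counts.get(platform, 0) + 1

db_counts, platform_counts = {}, {}
preselected = ["db%02d" % i for i in range(1, 21)]  # 20 pre-selected databases
record_automated_search(db_counts, platform_counts, preselected, "ExamplePlatform")
print(sum(db_counts.values()), platform_counts["ExamplePlatform"])  # 20 1
```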
Internet Robots and Crawlers
Activity generated by internet robots and crawlers MUST be excluded from all COUNTER usage reports. COUNTER provides a list of user agent values that represent the crawlers and robots that MUST be excluded. Any transaction with a user agent matching one on the list MUST NOT be included in COUNTER reports.
COUNTER maintains the current list of internet robots and crawlers at https://github.com/atmire/COUNTER-Robots
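Exclusion is typically implemented by matching each transaction’s user agent against the published patterns; a sketch with a few patterns in the style of the COUNTER-Robots list (the repository itself is the authoritative source):

```python
import re

# Illustrative patterns only; a real implementation MUST use the full
# list maintained in the COUNTER-Robots repository.
ROBOT_PATTERNS = [re.compile(p, re.IGNORECASE)
                  for p in (r"bot", r"spider", r"crawl")]

def is_robot(user_agent: str) -> bool:
    """True if the user agent matches a robot/crawler pattern and the
    transaction MUST be excluded from COUNTER reports."""
    return any(p.search(user_agent) for p in ROBOT_PATTERNS)

print(is_robot("Googlebot/2.1 (+http://www.google.com/bot.html)"))  # True
print(is_robot("Mozilla/5.0 (Windows NT 10.0; Win64; x64)"))        # False
```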
Tools and Features that Enable Bulk Downloading
Only genuine, user-driven usage MUST be reported. COUNTER reports MUST NOT include usage that represents requests of full-text content when it is initiated by automatic or semi-automatic bulk download tools where the downloads occur without direct user action.
Products like Quosa or Pubget MUST be recorded only when the user has clicked on the downloaded full-text article in order to open it.
Full text retrieved by automated processes such as reference manager software or robots (see Section 7.8 above) MUST be excluded.
Usage that occurs through emailing of a list of articles (Requests) or citations (Investigations) that was not as a result of a user explicitly selecting the items for sharing MUST be excluded. Note that the act of a user explicitly sharing an item would be considered an Investigation, and a user downloading and then emailing a PDF would also be considered a Request.
Text and Data Mining
Text and data mining (TDM) is a computational process whereby text or datasets are crawled by software that recognizes entities, relationships, and actions. (STM Statement on Text and Data Mining)
TDM does NOT include straightforward information retrieval, straightforward information extraction, abstracting and summarising activity, automated translation, or summarising query-response systems.
A key feature of TDM is the discovery of unknown associations based on categories that will be revealed as a result of computational and linguistic analytical tools.
Principles for reporting usage:
COUNTER does not record TDM itself, as most of this activity takes place after an article has been downloaded. All we can do is track the count of articles downloaded for the purposes of mining.
Usage associated with TDM activity (e.g. articles downloaded for the purpose of TDM) MUST be tracked by assigning an Access_Method of TDM.
Usage associated with TDM activity MUST be reported using the Title, Database, and Platform Master Reports by identifying such usage as Access_Method=TDM.
Usage associated with TDM activity MUST NOT be reported in Standard Views (TR_J1, TR_B1, etc.).
Detecting activity related to TDM:
TDM activity typically requires a prior agreement between the content provider and the individual or organization downloading the content for the purpose of text mining. The content provider can isolate TDM-related traffic using techniques like:
Providing a dedicated end-point that is specifically for TDM data harvesting.
Requiring the use of a special account or profile for TDM data harvesting.
Assigning an API key that would be used for the harvesting.
Registering the IP address of the machine harvesting content.
Harvesting of content for TDM without permission or without using the endpoint or protocol supplied by the content provider MUST be treated as robot or crawler traffic and MUST be excluded from all COUNTER reports.
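Any of the isolation techniques above reduces to classifying each transaction before reporting; a sketch, with hypothetical registered credentials (field names are illustrative):

```python
# Hypothetical credentials registered under a TDM agreement.
TDM_API_KEYS = {"example-tdm-key"}
TDM_CLIENT_IPS = {"203.0.113.7"}

def classify_access_method(request: dict) -> str:
    """Return the Access_Method for a transaction: 'TDM' if it came
    through a registered TDM credential or address, otherwise
    'Regular'."""
    if request.get("api_key") in TDM_API_KEYS:
        return "TDM"
    if request.get("ip") in TDM_CLIENT_IPS:
        return "TDM"
    return "Regular"

print(classify_access_method({"api_key": "example-tdm-key"}))  # TDM
print(classify_access_method({"ip": "198.51.100.9"}))          # Regular
```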
SUSHI for Automated Report Harvesting
Content providers MUST support automatic harvesting of COUNTER reports via the COUNTER_SUSHI API. The specification for the RESTful COUNTER_SUSHI API is maintained by COUNTER on SwaggerHub:
https://app.swaggerhub.com/apis/COUNTER/counter-sushi_5_0_api/
The Swagger files are a comprehensive reference containing a detailed description of the entire COUNTER_SUSHI API. It is expected that reporting services will use only the parts relevant to them, or make local tailored copies relevant to their particular circumstances, for example by removing methods detailing reports they don’t support.
COUNTER_SUSHI API Paths to Support
The following paths (methods) MUST be supported:
HTTP Method | Path | Description
---|---|---
GET | /status | Returns the current status of the COUNTER_SUSHI API service. This path returns a message that includes the operating status of the API, the URL to the service’s entry in the Register of COUNTER Compliant Content Providers, and an array of service alerts (if any).
GET | /reports | Returns a list of reports supported by the COUNTER_SUSHI API service. The response includes an array of reports, including the report identifier, the release number, the report name, a description, and (optional but recommended for custom reports) the path to use when requesting the report.
GET | /reports/{Report_ID in lower case} | Each supported report has its own path, e.g. GET /reports/tr_b1 for “Book Requests (Excluding OA_Gold)”, GET /reports/tr_j1 for “Journal Requests (Excluding OA_Gold)”.
GET | /members | Returns the list of consortium members or sites for multi-site customers. The response includes an array of customer account information, including for each the customer ID (to use when requesting COUNTER reports), the requestor ID (to use when requesting COUNTER reports), the customer account name, and additional identifiers for the organization (if any). Note that if the customer ID specified in the parameter for the /members path is not a multi-site organization, then the response will simply return the details for that customer.
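A report request is then an ordinary HTTPS GET against one of these paths; a sketch that builds such a URL (the base URL and credentials are hypothetical, and report paths use the lower-case Report_ID):

```python
from urllib.parse import urlencode

def sushi_report_url(base_url: str, report_id: str, customer_id: str,
                     requestor_id: str, begin_date: str, end_date: str) -> str:
    """Build a COUNTER_SUSHI report request URL."""
    query = urlencode({
        "customer_id": customer_id,
        "requestor_id": requestor_id,
        "begin_date": begin_date,  # yyyy-mm or yyyy-mm-dd
        "end_date": end_date,
    })
    return "%s/reports/%s?%s" % (base_url, report_id.lower(), query)

print(sushi_report_url("https://sushi.example.com/counter/r5",
                       "TR_J1", "inst01", "req01", "2019-01", "2019-12"))
# https://sushi.example.com/counter/r5/reports/tr_j1?customer_id=inst01&requestor_id=req01&begin_date=2019-01&end_date=2019-12
```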
Authentication and Security for COUNTER_SUSHI API
The COUNTER_SUSHI API MUST be implemented using TLS (HTTPS).
The API MUST be secured using one or more of the following methods:
Combination of customer ID and requestor ID
IP address of the SUSHI client
API key assigned to the organization harvesting the usage
Non-standard techniques for authentication (techniques not specified in the COUNTER_SUSHI API specification) MUST NOT be used.
If IP address authentication is implemented, it MUST allow the same SUSHI client (a single IP address) to harvest usage for multiple customer accounts (e.g. hosted ERM services).
Report Filters and Report Attributes
The COUNTER_SUSHI API specification allows report responses to be customized to the caller’s needs using report filters and report attributes. For Standard Views, these filters and attributes are implicit. For the Master Reports, the filters and attributes will be explicitly included as parameters on the COUNTER_SUSHI request.
Refer to Section 3.3.8 and the COUNTER_SUSHI API Specification for the list of filters and attributes supported by the various COUNTER reports.
Errors and Exceptions
Implementations of the COUNTER_SUSHI API MUST comply with the warnings, exceptions and errors described in Appendix F.
Audit
An important feature of the COUNTER Code of Practice is that compliant content providers (including third-party services providing stats on behalf of content providers) MUST be independently audited on an annual basis in order to maintain their COUNTER-compliant status. To facilitate this, a set of auditing standards and procedures has been published in Appendix E of this Code of Practice. COUNTER has tried to meet the need of customers for credible usage statistics without placing an undue administrative or financial burden on content providers. For this reason, audits will be conducted online in accordance with the program included in the auditing standards and procedures (Appendix E).
The independent audit is REQUIRED within six months of a content provider’s first self-certifying their compliance with the COUNTER Code of Practice, and annually thereafter. COUNTER will recognize an audit carried out by any Certified Public Accountant (CPA) in the USA, by any Chartered Accountant (CA) in the UK, or by their equivalent in other countries. Alternatively, the audit may be done by a COUNTER-approved auditor which is not a CA or a CPA. (Contact COUNTER for a list of approved auditors.)
The Audit Process
COUNTER-compliant content providers are required to schedule an audit in time for the audit due date listed on their entry on the COUNTER website (https://www.projectcounter.org/about/register/).
At least one month before the audit due date, content providers MUST advise COUNTER of the name of the organization that will carry out the audit. Any queries about the audit process may be raised at this time.
Irrespective of the auditor selected, the audit MUST adhere to the requirements and use the program specified in Appendix E of this Code of Practice. The audit is carried out in three stages. Stage 1 covers the format and structure of the usage reports. In Stage 2 the auditor tests the integrity of the reported usage statistics by creating their own usage on a sample basis and subsequently reviewing the usage reports for this activity. In Stage 3 the auditor checks that the delivery of the usage reports adheres to the COUNTER requirements.
Upon completion of the audit, the auditor is REQUIRED to send a signed copy of the audit report to the Project Director (tasha.mellins-cohen@counterusage.org). On receipt of the successful audit report, the content provider will be sent a dated COUNTER logo, which they can display on their website.
The dated logo MUST link to the content provider’s entry on the COUNTER website.
Failure to complete a successful audit by the due date may result in COUNTER removing that content provider from the list of compliant content providers on the COUNTER website.
COUNTER Release 5 Validation Tool
The COUNTER Release 5 Validation Tool allows content providers and auditors to quickly perform compliance checks related to the format and consistency of both tabular and JSON reports. Content providers SHOULD use this free tool to check their reports and COUNTER_SUSHI API implementation and fix issues detected by the tool before they begin the audit. It is RECOMMENDED not only to use the tool when preparing for the audit, but also to integrate the testing into regular QA processes, or at least to test regularly (for example once per month), to make sure the reports stay compliant.
The COUNTER Release 5 Validation Tool uses the following error levels to report issues with different severity:
Fatal error - The validation was aborted because an unrecoverable error was encountered, for example a missing or invalid Reporting_Period. The fatal error MUST be fixed before the report can be fully validated.
Critical error - The validation has detected a serious error that MUST be fixed, for example inconsistent data like a title with more Unique_Item_Requests than Total_Item_Requests or missing data. A critical error indicates that there could be a serious issue that would cause the audit to fail.
Error - The validation has detected an error that MUST be fixed to pass the audit.
Warning - The validation has detected an issue that needs to be checked by the auditor and might affect the result of the audit.
Notice - Additional information, for example about deprecations or amendments that must be addressed at some point in the future but currently won’t affect the result of an audit.
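As a minimal sketch of how a report pipeline might triage these levels before an audit (the level names are from the Code of Practice; the function and its return values are illustrative, not part of the standard):

```python
# Levels whose issues MUST be fixed to pass the audit, per the Code of
# Practice; Warning and Notice do not automatically block the audit.
BLOCKING_LEVELS = {"Fatal error", "Critical error", "Error"}

def blocks_audit(level: str) -> bool:
    """Return True if an issue at this severity level must be fixed
    before the report can pass a COUNTER audit (illustrative helper)."""
    return level in BLOCKING_LEVELS
```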
Categories of Audit Result
There are three categories of audit result, as follows:
Pass - No further action is required by the content provider as a result of the audit. In some cases, the auditor may add observations to the audit report, which are intended to help the content provider improve its COUNTER usage reports but are not required for compliance.
Qualified Pass - The content provider has passed the audit, but the auditor raises a minor issue requiring further action to maintain COUNTER-compliant status. A minor issue does not affect the reported figures but MUST be resolved within three months of the audit to maintain COUNTER-compliant status. An example of a minor issue is where a report format does not conform to the COUNTER specifications.
Fail - The auditor has identified an issue that MUST be resolved within three months for the content provider to maintain COUNTER-compliant status.
Timetable and Procedure
R5 of the COUNTER Code of Practice, published in July 2017, will become the only valid version of the Code of Practice from 1 January 2019.
Applications for COUNTER-compliant status
A register of content providers and their platforms for which COUNTER-compliant usage reports are available is maintained by COUNTER and posted on the COUNTER website - https://www.projectcounter.org/about/register/
Content providers may apply to the Project Director (tasha.mellins-cohen@counterusage.org) for their products to be included on the register. Content providers must provide proof of initial compliance by including the results of COUNTER Release 5 Validation Tool tests showing compliance for each of their reports, covering both the upload of the tabular reports and SUSHI harvesting of the same reports. Upon receipt of the application and proof of compliance, content providers MUST allow at least one of the COUNTER library test sites to evaluate their usage reports.
When the usage reports are deemed to comply with the COUNTER Code of Practice, the content provider will be asked to sign a Declaration of COUNTER Compliance (Appendix C), after which the content provider and its platforms will be added to the register.
Within six months a report from an independent auditor confirming that the usage reports and data are indeed COUNTER-compliant will be required. See Appendix E for a description of the auditing program.
The signed declarations MUST be sent to the Project Director (tasha.mellins-cohen@counterusage.org) as email attachments.
Right to Use COUNTER-Compliance Logo and Designation
Content providers who have had their application accepted by COUNTER but have not yet completed a successful audit may use the designation “COUNTER Compliance Pending”. Only content providers that have passed the audit can use the designation “COUNTER Compliant” and the dated COUNTER logo.
Content providers who have not applied for compliance or whose compliance has lapsed MUST NOT claim or imply COUNTER compliance on their site, in licenses, or in their marketing and do not have the rights to use the COUNTER name or logo.
Other Compliance Topics
Content providers seeking COUNTER compliance are expected to comply with the following.
Including COUNTER in License Agreements
To encourage widespread implementation of the COUNTER Code of Practice, customers are urged to include the following clause in their license agreements with content providers:
‘The licensor confirms to the licensee that usage statistics covering the online usage of the products covered by this license will be provided. The licensor further confirms that such usage statistics will adhere to the specifications of the COUNTER Code of Practice, including data elements collected and their definitions; data processing guidelines; usage report content, format, frequency and delivery method’.
Confidentiality of Usage Data
Privacy and User Confidentiality
Statistical reports or data that reveal information about individual users will not be released or sold by content providers without the permission of that individual user, the consortium, and its member institutions (ICOLC Guidelines, October 2006).
It is the responsibility of the Content Providers to be aware of and ensure that they meet security and privacy requirements, including GDPR and other standards and requirements that may be applicable.
Institutional or Consortia Confidentiality
Content providers do not have the right to release or sell statistical usage information about specific institutions or the consortium without permission, except to the consortium administrators and other member libraries, and to the original content provider and copyright holder of the content. Use of institutional or consortium data as part of an aggregate grouping of similar institutions for purposes of comparison does not require prior permission as long as specific institutions or consortia are not identifiable. When required by contractual agreements, content providers, such as aggregators, may furnish institutional use data to the original content providers. (Based on ICOLC Guidelines, October 2006).
COUNTER Reporting for Consortia
Consortia license content for their members, and consortium administrators need access to COUNTER statistics that show how each member has used the licensed resources.
Access to SUSHI Credentials for Member Sites
Content providers MUST support the /members COUNTER_SUSHI API path to provide the consortium with the list of their members on the platform and the SUSHI credentials for each member. This will enable tools to be created to efficiently retrieve member usage and create separate or consolidated reporting.
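A consortium tool might consume the /members response as in this sketch. The member-object field names (Customer_ID, Requestor_ID, Name) follow the COUNTER_SUSHI API schema; the function itself is illustrative, not part of the Code:

```python
import json

def member_credentials(members_json: str) -> dict:
    """Map each member's Name to the SUSHI credentials needed to harvest
    that member's usage (field names per the COUNTER_SUSHI /members schema)."""
    members = json.loads(members_json)
    return {
        m["Name"]: {
            "customer_id": m["Customer_ID"],
            "requestor_id": m.get("Requestor_ID"),
        }
        for m in members
    }

# Example (abridged) response body from the /members path:
sample = '[{"Customer_ID": "C1", "Requestor_ID": "R1", "Name": "Member A"}]'
creds = member_credentials(sample)
```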
Privacy and Confidentiality
COUNTER acknowledges that some organizations treat their usage data as sensitive and private information. Content providers may include the option for consortium members to opt-out of consortium reporting. COUNTER recommends the default setting for an organization is to opt-in to consortium reporting.
Content to Report Usage On
When a COUNTER report is harvested by a consortium administrator, a content provider may choose to limit member usage to include only content acquired through the consortium. Note that when such a limitation is in place the resulting report may differ from the member site's own version of the report. Since not all content providers can apply such limits, the consortium will be responsible for ensuring usage is filtered to the content they license for members.
When the content provider chooses to limit member usage to only content acquired through the consortium, they MUST include a message to this effect in the Notes element in their implementation of the /members path in the COUNTER_SUSHI API (see Section 8 above).
Detailed versus Summary Reports
A content provider MUST offer the option to provide a consortium-level summary of usage for the consortium. For a consortium summary report (usage for all members of the consortium rolled up at the consortium level), COUNTER acknowledges that the totals on the summary report may differ from the sum of the totals on individual member reports for the same items if an authentication method identifies the user to multiple member sites and usage is attributed to each such site (e.g. overlapping IP ranges).
Note that it is possible to create Master Reports for a consortium with usage broken down by member sites with extensions (see Section 11.5), but this isn’t required for compliance.
SUSHI Service Limits
The content provider MUST NOT place limits on the SUSHI service (such as requests per day or amount of data transferred) that would prevent a consortium from retrieving reports for all its members.
Extending the Code of Practice
COUNTER recognises that some content providers may want to provide customized versions of COUNTER reports to address reporting needs specific to their platform and content. This section describes a method of extending the Code of Practice that avoids creating conflicting custom implementations between content providers.
Platform as a Namespace
Content providers and other organizations providing COUNTER reports wishing to create custom reports or introduce custom elements or values can do so by using their platform identifier (platform ID) as a namespace. For example, if EBSCO wanted to create a customized version of the “Journal Requests (Excluding OA_Gold)” Standard View for their link resolver product that includes a new Metric_Type for counting link-outs, they could do this by naming the report EBSCOhost:LR1 and creating a new Metric_Type of EBSCOhost:Total_Linkouts. Note that the platform ID is also used as a namespace for local Institution_IDs and Publisher_IDs assigned by the content provider and for Proprietary_IDs (see Section 3.2).
The platform ID MUST only contain ASCII letters (a–z, A–Z), digits (0–9), underscores (_), dots (.) and forward slashes (/), and the length MUST NOT exceed 17 characters. Note that the platform ID is used in various columns and therefore should be as short as possible, but still recognizable. The platform ID usually should be based on the name, a well-known unique abbreviation or the domain of the publisher or platform. A short standard identifier like GRID or ROR (without the https://) also could be used.
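The platform ID constraints translate directly into a validation pattern. The character set and 17-character limit are from the Code of Practice; the helper name is illustrative:

```python
import re

# Allowed: ASCII letters, digits, underscore, dot, forward slash;
# length 1-17 characters (per the Code of Practice).
PLATFORM_ID_RE = re.compile(r"^[A-Za-z0-9_./]{1,17}$")

def is_valid_platform_id(platform_id: str) -> bool:
    """Check a proposed platform ID against the Code's character
    and length rules (illustrative helper)."""
    return bool(PLATFORM_ID_RE.fullmatch(platform_id))
```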
COUNTER will assign the platform ID when adding the platform to their Registry of Compliance (content providers can suggest a value to be used for their platform ID). Other organizations providing COUNTER reports, such as consortia or ERM providers, may contact COUNTER to register a namespace if they desire to create extensions and customizations. COUNTER will maintain a list of approved namespaces.
Creating Custom COUNTER Reports
Custom COUNTER reports can be created as long as the general layout for COUNTER reports is followed. Custom reports MUST be given an identifier and a name in the format of {namespace}:{report ID} and {namespace}:{report name}. An example of a custom report could be:
| Report_ID | Report_Name |
|---|---|
| EBSCOhost:LR1 | EBSCOhost:Link-out Report 1 |
It is recommended to make custom reports available both from the administrative/reporting site and via the COUNTER_SUSHI API and to include them in the response for the /reports path in the COUNTER_SUSHI API (see Section 8).
Creating Custom Elements/Columns Headings
Custom elements/column headings can be added to the Master Reports (PR, DR, TR, IR) and custom reports. The element name MUST take the form of {namespace}:{element name}. An example of a custom elements/column heading could be:
| Element Name |
|---|
| EBSCOhost:Total_Linkouts |
Custom elements/column headings MUST only be included in Master Reports if requested, and if included they MUST be listed in Attributes_To_Show in the Report_Attributes header.
Creating Custom Values for Enumerated Elements
Several elements in COUNTER reports include a controlled list of possible values. On occasion, content providers may want to introduce additional custom values that better reflect their content and platform. For Master reports (PR, DR, TR, IR) and custom reports the element value lists can be extended by including additional custom values in the form of {namespace}:{element value}. An example would be a custom Metric_Type value EBSCOhost:Total_Linkouts. The following is the list of elements that can be extended in this manner:
Data_Type
Section_Type
Access_Type
Access_Method
Metric_Type
Custom values MUST only be included in Master Reports if requested, and if included they MUST be listed in the corresponding report filters in the Report_Filters or Metric_Types header.
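As a sketch, constructing a namespaced custom value for one of the extensible elements might look like this (the element list is from the Code; the helper and its error handling are illustrative):

```python
# Elements whose controlled value lists may be extended with
# {namespace}:{element value} entries (per the Code of Practice).
EXTENSIBLE_ELEMENTS = {
    "Data_Type", "Section_Type", "Access_Type",
    "Access_Method", "Metric_Type",
}

def custom_value(element: str, namespace: str, value: str) -> str:
    """Build a namespaced custom value, refusing elements whose value
    lists are not extensible (illustrative helper)."""
    if element not in EXTENSIBLE_ELEMENTS:
        raise ValueError(f"{element} does not allow custom values")
    return f"{namespace}:{value}"
```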
Reserved Elements and Values Available for Extending Reports
COUNTER recognizes that there are some common extensions that content providers might want to include in Master Reports or when creating custom reports; therefore the following element names and values have been reserved for this common use:
| Element Name | Description | Reports | Examples |
|---|---|---|---|
| Customer_ID | When a Master Report contains usage for multiple organizations, this element can be used to break the usage down by institution. The Customer_ID MUST match the Customer_ID in the report header in a JSON report for that institution and the Customer_ID for the institution returned by the /members COUNTER_SUSHI API path. If Customer_ID is included in a tabular report, it MUST be the second column if Institution_Name is also included, or the first column if Institution_Name is not included. | PR, DR, TR, IR | C12345 |
| Institution_Name | When a Master Report contains usage for multiple organizations, this element can be used to break the usage down by institution. The Institution_Name MUST match the Institution_Name in the report header in a report for that institution and the Name for the institution returned by the /members COUNTER_SUSHI API path. If Institution_Name is included in a tabular report, it MUST be the first column. | PR, DR, TR, IR | Mt. Laurel University |
| Format | By tracking the Format, content providers can generate R4 usage reports from R5 usage during the transition period. Reserved values for Format are HTML and PDF. The Format element MUST only be used in the Title Master Report (or custom reports) and for Metric_Type Total_Item_Requests. If Format is included in a tabular report, it MUST be the last column before Metric_Type, and for Metric_Types other than Total_Item_Requests the cells in the Format column MUST be empty. | TR | |
| Country_Name | Name of the country according to ISO 3166-1. Note that the standard allows country names in different languages. The name is included for easier reading; for processing the reports, the Country_Code should be used. | PR, DR, TR, IR | Canada |
| Country_Code | ISO 3166-1 alpha-2 code of the country. | PR, DR, TR, IR | CA |
| Subdivision_Name | Name of the country subdivision according to ISO 3166-2. Note that the standard allows country subdivision names in different languages. The name is included for easier reading; for processing the reports, the Subdivision_Code should be used. | PR, DR, TR, IR | Quebec |
| Subdivision_Code | ISO 3166-2 code of the country subdivision. | PR, DR, TR, IR | CA-QC |
| Attributed | Whether the content provider was able to attribute the usage to an institution. Valid values are Yes and No. With this extension, usage of open content that could not be attributed to an institution can be reported. The extension usually would be used in a report for “The World” (see Section 3.2.1), which could be broken down by geolocation with the Country and Subdivision extensions. | PR, DR, TR, IR | No |
Note that by supporting the Institution_Name and Customer_ID extensions content providers can offer COUNTER Master Reports to consortia with usage broken down by their members. If a consortium requests a report with Institution_Name and/or Customer_ID, the usage would be broken down by institution if the extension is supported by the content provider, otherwise the usage would be summarised over all consortium members as usual.
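For illustration, a consumer of a Master Report that includes the Customer_ID extension could roll usage up per member like this (the row layout is an assumed simplification of a tabular report, not a prescribed format):

```python
from collections import defaultdict

def usage_by_member(rows):
    """Sum report counts per consortium member, keyed by the reserved
    Customer_ID extension element (illustrative; row dicts are assumed)."""
    totals = defaultdict(int)
    for row in rows:
        totals[row["Customer_ID"]] += row["Count"]
    return dict(totals)

rows = [
    {"Customer_ID": "C1", "Count": 10},
    {"Customer_ID": "C2", "Count": 4},
    {"Customer_ID": "C1", "Count": 5},
]
```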
Restrictions in Using Custom Elements and Values
Extensions MUST only be used with Master Reports and custom reports; they MUST NOT be used with Standard Views. Note, however, that a report with extensions that is similar to a Standard View can be created by applying the Standard View’s filters and attributes to the corresponding Master Report and adding the extensions.
Custom elements and values MUST only be included in Master Reports if requested. So for reports requested via the COUNTER_SUSHI API, custom elements MUST only be included if requested via attributes_to_show, and custom values MUST only be included if requested via the corresponding report filters (e.g. metric_type or data_type). On the administrative/reporting site the custom elements and values can be preselected for Master Reports, but the user MUST have the option to exclude the custom elements and values from the Master Reports.
Continuous Maintenance
With R5, the COUNTER Code of Practice will operate under a continuous maintenance procedure to allow incremental changes to be made to the Code of Practice without creating a completely new release. This section describes those procedures.
Instructions for Submittal of Proposed Change
Changes and updates to the COUNTER Code of Practice can be submitted by anyone. Submissions MUST be made via email and directed to the Project Director (tasha.mellins-cohen@counterusage.org). Each idea for submission MUST include:
Submitter contact information:
Name
Email
Phone
Affiliation
Description of the enhancement/adjustment (include the section and paragraph number of the current Code of Practice if applicable)
Reason for the change (use case and/or goals to be accomplished)
Any relevant attachments
Review of Change Requests
All submissions received will be acknowledged and forwarded to the COUNTER Executive Committee for consideration within 30 days of receipt.
Resolution of Proposed Changes
Responding to Submissions
The COUNTER Executive Committee (EC) will review submissions and provide a response within 90 days of receipt (to allow discussion at a regularly scheduled EC meeting). The EC will respond to every submission with one of the following, providing clarity when needed:
Proposed change accepted without modification
Proposed change accepted with modification
Proposed change accepted for further study
Proposed change rejected
If further study is needed, the EC may convene a separate working group to study the proposal and make recommendations related to the suggested change.
Approval of Changes
Changes that are substantive in nature (i.e. would require changes in how reports are generated or consumed) will be presented to COUNTER membership for comments for a period of at least 45 calendar days. All member comments MUST be considered and responded to by the EC or the designated working group.
After the comment period, changes to the COUNTER Code of Practice MUST be voted upon by the COUNTER Executive Committee and approved by committee majority. EC Members can respond to a ballot by voting Yes, No or Abstain. For clarity, the number of affirmative votes MUST be greater than 50% of the total number of EC members minus abstentions (a non-vote is considered a “No” vote.)
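The ballot arithmetic can be made concrete as follows (an illustrative sketch; the function is not part of the Code):

```python
def ballot_passes(yes: int, abstain: int, total_members: int) -> bool:
    """A ballot passes when affirmative votes exceed 50% of the total
    number of EC members minus abstentions; non-votes count as "No"
    and therefore need no separate parameter (illustrative helper)."""
    eligible = total_members - abstain
    return yes > eligible / 2
```

For example, with 10 EC members and 2 abstentions, 5 affirmative votes are needed to exceed half of the 8 counted members.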
Communication of Changes
COUNTER will inform the COUNTER membership about upcoming changes to the COUNTER Code of Practice through email and on the COUNTER website and through posting on listservs that discuss usage topics.
Version and Change Control
Each update to the COUNTER Code of Practice will generate a new version number (e.g. the initial release of R5 is designated version 5.0). A non-substantive change (such as fixing typographical errors) would increment the version by .0.1, creating version 5.0.1. A substantive change (requiring changes in implementation of the Code of Practice) would increment the version by .1, creating version 5.1.
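A sketch of this numbering scheme (illustrative helper, not part of the Code):

```python
def next_version(version: str, substantive: bool) -> str:
    """Bump a Code of Practice version string: substantive changes
    increment the second component (5.0.1 -> 5.1), non-substantive
    changes increment the third (5.0 -> 5.0.1)."""
    parts = [int(p) for p in version.split(".")]
    while len(parts) < 3:          # pad "5.0" to (5, 0, 0)
        parts.append(0)
    major, minor, patch = parts
    if substantive:
        return f"{major}.{minor + 1}"
    return f"{major}.{minor}.{patch + 1}"
```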
All changes included in each release will be included in the Change History section of the Code of Practice. The prior release will be archived as a PDF document and access to that release provided via the COUNTER website.
Implementation Schedule
Changes to the COUNTER Code of Practice may be non-substantive or substantive. A non-substantive change may be a clarification or correction of typographical errors that does not affect how the Code of Practice is implemented. A substantive change is one that would affect the implementation of the COUNTER Code of Practice. Examples of substantive changes are adding a new metric type or report, changing the requirement for including a data element from “may” to “MUST”, or changing processing rules.
Non-substantive changes can become effective immediately upon publication of the new version of the Code of Practice.
Substantive changes become effective for a given content provider within 12 months of publication of the new release or with the next audit, whichever date is later.
Substantive changes will be clearly marked in the change log in Appendix B to ensure they can be easily identified.
All other requirements of the Code of Practice will remain in effect during the implementation period for changes brought about by a new release.
Transitioning from Previous Releases or to New Reporting Services
A requirement of the COUNTER Code of Practice is that content providers offer libraries access to usage for the current year plus the prior 24 months, or from the date they first became compliant, whichever is later. This requirement must continue to be met while a provider is transitioning to a new release of the COUNTER Code of Practice or moving to a new reporting service.
Transitioning to a New Reporting Service
When a content provider implements a new reporting service, underlying logging system, or approach, they:
MUST continue to meet the requirement to offer valid COUNTER reports for the current year plus the prior 24 months (or from the date they first became compliant, whichever is later) via a web interface and via a SUSHI server.
MUST support COUNTER reports that may include a range of months that span the transition period. If the new reporting service was deployed in August of 2017, a customer could request a report for January-December 2017 and receive a single report.
When it is not practical to support a single report with date ranges that span the transition period, the content provider MUST perform the transition on the first day of a month. If the new reporting service was deployed in August 2017, a customer wanting January-December 2017 usage would request January-July 2017 from the previous reporting service and August-December 2017 from the new reporting service. For clarity, a provider MUST NOT perform the transition mid-month such that the customer is required to run reports on both the old and new reporting services for the same month and merge and sum the results to obtain actual monthly usage.
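The required month-boundary transition can be sketched as a date-range split (illustrative; assumes the transition date is the first day of a month, as the Code requires):

```python
from datetime import date, timedelta

def split_at_transition(start: date, end: date, transition: date):
    """Split a requested reporting period into the ranges to request from
    the old and new reporting services. `transition` MUST be the first
    day of a month. Either half may be None (illustrative helper)."""
    if end < transition:
        return (start, end), None          # entirely on the old service
    if start >= transition:
        return None, (start, end)          # entirely on the new service
    # Old service covers up to the last day of the month before transition.
    return (start, transition - timedelta(days=1)), (transition, end)
```

Using the spec's example of an August 2017 deployment, a January-December 2017 request splits into January-July on the old service and August-December on the new one.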
Transitioning to a New Code of Practice
New releases of the COUNTER Code of Practice will typically be assigned an effective date after which a content provider must be compliant. In such cases, a content provider may choose to implement the new release before the effective date. New releases of the COUNTER Code of Practice may come with specific transition instructions, but, in general, content providers:
May implement the new release prior to the effective date of the new release.
Are not required to release reports for usage transacted prior to the implementation date; however, they may choose to do so at their discretion.
MUST continue to meet the requirement to offer valid COUNTER reports for the current year plus the prior 24 months (or from the date they first became compliant, whichever is later) via a web interface and via a SUSHI server.
MUST provide a means for customers to receive prior-release reports for usage transacted from the content provider’s transition date through to 3 full months after the effective date of the new release. For clarity, if a new release becomes effective 1 February 2019 and a content provider implements the new release 1 October 2018, a customer must be able to obtain the prior-release usage reports for usage prior to the transition period as well as for usage that occurred in October 2018 to April 2019. A content provider can meet this requirement in one of the following ways:
Maintain two reporting systems such that usage is logged to the old and new reporting services and customers can access current-release reports on the new reporting service and prior-release reports on the old reporting service.
Support the prior-release reports on the new reporting service. This may involve using the metrics from the new release to produce reports formatted to the prior release; or it may involve logging additional data to the new reporting service such that the prior release reports can continue to be supported.
If the new release offers metrics compatible with the prior release, offer only new-release reports, provided customers have access to freely available tools that automatically generate the required prior-release report from an equivalent new-release report, and that these reports are available in tabular form and via the COUNTER_SUSHI API.
May choose to support COUNTER reports that include a range of months that span the transition period. E.g. if the new reporting service compliant with a new COUNTER release was deployed in October of 2018, a customer could request a report for January-December 2018 and receive a single report in either the new release or the previous release (see previous point on the transition period).
When it is not practical to support a single report with date ranges that span the transition period, the content provider MUST perform the transition on the first day of a month. E.g. if the new reporting service was deployed in October 2018, a customer wanting January-December 2018 usage would request January-September 2018 from the previous reporting service and October-December 2018 from the new reporting service. For clarity, a provider MUST NOT perform the transition mid-month such that the customer is required to run reports on both the old and new reporting services for the same month and merge and sum the results to obtain actual monthly usage.
Transitioning from COUNTER R4 to R5
The transition from R4 to R5 meets the general requirements outlined in Section 13.2.
Content providers MUST be compliant by February 2019 for delivery of R5 reports starting with January 2019 usage.
Content providers may choose to release their R5 compliant reporting service before February 2019.
A content provider’s customers MUST be able to obtain R4-compliant reports for that content provider from the time the content provider’s R5 reporting service was released through to April 2019 (providing access to March 2019 usage). A content provider may provide access to R4 reports beyond April 2019 at their discretion.
Content providers may choose to meet the requirement to provide R4 reports based on R5 metrics. The following R4 reports must be supported (when applicable to the platform): BR1, BR2, BR3, DB1, DB2, JR1, JR2, JR5, and PR1. The following table presents the equivalent R4 metric types and R5 Metric_Types and filters by report.
| R4 Report | R4 metric | R5 equivalent |
|---|---|---|
| BR1 | Full-text requests (at the book level) | Unique_Title_Requests AND Data_Type=Book AND Section_Type=Book |
| BR2 | Full-text requests (at the chapter/section level) | Total_Item_Requests AND Data_Type=Book AND Section_Type=Chapter\|Section |
| BR3 | Access denied - concurrent/simultaneous user limit exceeded | Limit_Exceeded AND Data_Type=Book |
| BR3 | Access denied - content item not licensed | No_License AND Data_Type=Book |
| DB1 | Regular searches | Searches_Regular |
| DB1 | Searches - federated and automated | SUM(Searches_Automated, Searches_Federated) |
| DB1 | Result clicks | Total_Item_Investigations attributed to the database |
| DB1 | Record views | Total_Item_Investigations attributed to the database. (Note that the resulting result click and record view counts will be the same. Librarians should use one or the other and not add them up.) |
| DB2 | Access denied - concurrent/simultaneous user limit exceeded | Limit_Exceeded AND Data_Type=Database |
| DB2 | Access denied - content item not licensed | No_License AND Data_Type=Database |
| JR1 | Full-text requests | Total_Item_Requests AND Data_Type=Journal |
| JR1 | HTML requests | Leave blank unless the HTML and PDF formats are also logged, in which case: Total_Item_Requests AND Data_Type=Journal AND Format=HTML |
| JR1 | PDF requests | Leave blank unless the HTML and PDF formats are also logged, in which case: Total_Item_Requests AND Data_Type=Journal AND Format=PDF |
| JR2 | Access denied - concurrent/simultaneous user limit exceeded | Limit_Exceeded AND Data_Type=Journal |
| JR2 | Access denied - content item not licensed | No_License AND Data_Type=Journal |
| JR5 | Full-text requests (by year of publication) | Total_Item_Requests AND Data_Type=Journal, pivot on YOP |
| PR1 | Regular searches | Searches_Platform |
| PR1 | Searches - federated and automated | Leave blank (searches performed on the platform via federated and automated searching are included in Searches_Platform). |
| PR1 | Result clicks | SUM(Total_Item_Investigations attributed to the databases) |
| PR1 | Record views | SUM(Total_Item_Investigations attributed to the databases). (Note that the resulting result click and record view counts will be the same. Librarians should use one or the other and not add them up.) |
Change History
| Release | Description of Change | Substantive? | Date approved | Date for compliance |
|---|---|---|---|---|
| 5.0 | New Code of Practice to replace Release 4. | Yes | 2017-07-01 | 2019-02-28 (with support for January 2019 usage) |
| 5.0.1 | Amendments, corrections and clarifications based on feedback and questions from the community. | Yes | 2018-12-10 | 2019-02-28 (with support for January 2019 usage) |
| 5.0.2 | Amendments, corrections and clarifications based on feedback and questions from the community. | No | 2021-09-28 | 2022-02-28 (with support for January 2022 usage) |
| 5.0.3 | Corrections and clarifications based on feedback and questions from the community. | No | 2023-03-30 | 2023-04-28 (with support for March 2023 usage) |
A detailed description of the changes from Release 4 is provided in Appendix B.
Starting with Release 5.0.2 the COUNTER Code of Practice Release 5 is maintained in the GitHub repository Project-Counter/cop5, and all changes are tracked with GitHub issues and linked pull requests. The change log lists all changes by type and impact of the change.
Appendices
Appendix A: Glossary of Terms
Note: The main Code of Practice document takes precedence in the case of any conflicts between it and this appendix.
| Term | Definition | Examples |
|---|---|---|
| Abstract | A short summary of an article or content item. A detailed view of article metadata that includes the summary but not the full text. Accessing the abstract/detailed view falls into the usage category of Investigations. | |
| Accepted manuscript | The version of a journal article that has been accepted for publication in a journal. This version includes any pre-publication revisions, but it does not include any formatting or copyediting changes or corrections. | |
| Access Denied | The user is denied access to a content item because their institution lacks a proper license or because simultaneous-user limits specified in the license have been exceeded. | Limit_Exceeded, No_License |
| Access_Method | A COUNTER report attribute indicating whether the usage related to investigations and requests was generated by a human user browsing and searching a website (Regular) or by Text and Data Mining processes (TDM). | Regular, TDM |
| Access_Type | A COUNTER report attribute used to report on the nature of access control restrictions, if any, placed on the content item at the time when the content item was accessed. | Controlled, OA_Gold, OA_Delayed, Other_Free_to_Read |
| Aggregated_Full_Content | A COUNTER Host_Type for content providers that offer aggregated pre-set databases of full text and other content, where content is accessed in the context of the licensed database. | |
| Aggregated full-text database | A full-text database that includes content from multiple titles, usually from multiple publishers. | Academic Search Complete |
| Aggregator | A type of content provider that hosts content from multiple publishers, delivers content directly to customers, and is paid for this service by customers. | EBSCOhost, Gale, Lexis Nexis, ProQuest |
| A&I database | A database that primarily contains bibliographic metadata and descriptive abstracts to support search, discovery, and selection of the described items. The majority of A&I databases center on articles, books, and book chapters. A&I_Databases do not host the full text of the described items. For databases that contain A&I and full text, see Full-text database, Aggregated full-text database, Aggregated_Full_Content and Full_Content_Database. A COUNTER Host_Type. | PubMed, PsycInfo |
| AJAX | Asynchronous JavaScript And XML. AJAX allows web pages to be updated asynchronously by exchanging data with a web server behind the scenes. | |
| ALPSP | The Association of Learned and Professional Society Publishers is an international trade association of non-profit publishers. | |
| APC | See Article processing charge. | |
| API | Application Programming Interface. | |
| Archive | Non-current collections of journals, books, articles, or other publications that are preserved because of their continuing value and which are frequently made available by publishers as separate acquisitions. | Oxford Journals Archive |
| Article | An item of original written work published in a journal, other serial publication, or in a book. An article is complete, but usually cites other relevant published works in its list of references, if it has one. A COUNTER Data_Type. A COUNTER Section_Type for Title Reports. | |
| Article processing charges | An article processing charge (APC), also known as a publication fee, is a fee which is sometimes charged to authors to make a work available Open Access in either an Open Access journal or hybrid journal. …They are the most common funding method for professionally published Open Access articles. [Wikipedia] | |
Article_Version |
Defined by ALPSP and NISO as a classification of the version of an Article as it goes through its publication life-cycle. An element in COUNTER Item Reports that identifies the version of the Article being accessed. Of the 7 versions defined by the ALPSP/NISO JAV Technical Working Group, COUNTER usage reporting typically only reflects the Accepted Manuscript (AM), Version of Record (VoR), Corrected Version of Record (CVoR), and Enhanced Version of Record (EVoR). |
AM, VoR, CVoR, EVoR |
Articles in press |
Full-text articles that have been accepted for publication in a journal and have been made available online to customers and that will be assigned a publication date of the current year or a future year. |
|
Attribute |
See Report Attributes. |
|
Author(s) |
The person/people who wrote/created the items whose usage is being reported. |
|
Automated search |
1. A search from a host site or discovery service where multiple databases are searched simultaneously with a single query from the user interface and the end user does not have the option of selecting the databases being searched. Usage of this nature is reported as Searches_Automated. 2. A search run repeatedly (e.g. daily or weekly) by a script or automated process. Usage of this nature must not be included in COUNTER reports. |
|
Automated search agent |
A script or automated process that runs a search repeatedly, usually at pre-set intervals such as daily or weekly. |
|
Backfile |
See Archive. |
Oxford Journals Archive |
Begin_Date |
The first date in the range for the usage represented in a COUNTER report. |
|
Book |
A non-serial publication of any length available in print (in hard or soft covers or in loose-leaf format) or in electronic format. A COUNTER Data_Type. A COUNTER Section_Type for Title Reports. |
|
Book chapter |
A subdivision of a book or of some categories of reference work; usually numbered and titled. |
|
Book Requests |
Book content items retrieved. |
|
Book segment |
Part of a book. A COUNTER Data_Type. |
|
Bulk download |
A single event where multiple content items are downloaded to the user’s computer. |
|
Cache |
An automated system that collects items from remote servers to serve them closer and more efficiently to a given population of users. Often populated by robots or modern browsers. Note: Publishers take steps to prevent local caching of their content, e.g. by including appropriate response headers on their site to restrict caching. |
|
Central Index |
Also known as a Discovery Index. A collection of locally-hosted, consistently indexed metadata and content harvested from multiple external metadata and content sources, frequently including a library’s catalog and repository metadata, and usually representing a significant portion of the library’s collection. |
|
Certified Public Accountant (CPA) |
An accounting designation granted to accounting professionals in the United States. |
|
Chapter |
A subdivision of a book or of some categories of reference work, usually numbered and titled. A COUNTER Section_Type. |
|
Chartered Accountant (CA) |
An international accounting designation granted to accounting professionals in many countries around the world, aside from the United States. |
|
Citation |
A reference to a published or unpublished source. |
|
Collection |
A subset of the content of a service. A collection is a branded group of online information products from one or more vendors that can be subscribed to/licensed and searched as a complete group. For the COUNTER reporting this term is restricted to pre-set collections that are defined like databases. See Database. Note: A package or bundle provided by a publisher is not considered a database or a collection. |
|
Component |
A uniquely identifiable constituent part of a content item composed of more than one file (digital object). |
|
Consortium |
A group of institutions joining together to license content. |
OhioLINK |
Consortium member |
An institution that has obtained access to online information resources as part of a consortium. A consortium member is defined by a subset of the consortium’s range of IP addresses or by other specific authentication details. |
Ohio State University |
Content host |
A website that provides access to content typically accessed by patrons of libraries and other research institutions. |
|
Content item |
A generic term describing a unit of content accessed by a user of a content host. Typical content items include articles, books, chapters, multimedia, etc. |
|
Content provider |
An organisation, such as a publisher, aggregator or subscriptions agent, who provides access to resources on a subscription basis. [Knowledge Base+] |
Science Direct, Clarivate, JSTOR |
Controlled |
A COUNTER Access_Type. At the time of the transaction, the content item was not open (e.g. was behind a paywall) because access is restricted to authorized users. Access of content due to a trial subscription would be considered Controlled. |
|
Copyright holder |
A person or a company who owns any one of the Exclusive Rights of copyright in a work. |
|
Corrected Version of Record |
A version of the Version of Record of a journal article in which errors in the VoR have been corrected. The errors could be author errors, publisher errors, or other processing errors. |
|
COUNTER compliance pending |
Status of a vendor who is currently not compliant but whose audit is in progress or scheduled. |
|
COUNTER Report Validation Tool |
An online tool to validate COUNTER reports in JSON and tabular format. |
|
COUNTER_SUSHI API |
A RESTful implementation of SUSHI automation intended to return COUNTER Release 5 reports and snippets of COUNTER usage in JSON format. |
|
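As an illustration of the entry above: a COUNTER_SUSHI request is an ordinary HTTP GET against a provider's SUSHI endpoint, with the report selected by path and the customer, requestor, and date range passed as query parameters. The sketch below only assembles such a URL; the base URL and credential values are invented for the example, and real endpoints and required parameters vary by content provider.

```python
from urllib.parse import urlencode

def sushi_report_url(base_url, report_id, customer_id, requestor_id,
                     begin_date, end_date):
    """Build a GET URL for a COUNTER_SUSHI report request.

    `base_url` here is a made-up example value; real SUSHI base URLs
    are published by each content provider.
    """
    params = urlencode({
        "customer_id": customer_id,
        "requestor_id": requestor_id,
        "begin_date": begin_date,
        "end_date": end_date,
    })
    # Report paths use the lower-case Report_ID, e.g. /reports/tr_j1
    return f"{base_url}/reports/{report_id.lower()}?{params}"

url = sushi_report_url("https://example.com/sushi", "TR_J1",
                       "12345", "req-1", "2018-01-01", "2018-06-30")
```

The response body is the requested report as a JSON document.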
Crawler |
See Internet robot, crawler, spider. |
|
Created |
COUNTER element name. The date and time the usage was prepared, in RFC3339 date-time format (yyyy-mm-ddThh:mm:ssZ). |
|
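A minimal sketch of producing the Created timestamp in the RFC3339 date-time shape given above (yyyy-mm-ddThh:mm:ssZ), assuming Python and that times are reported in UTC:

```python
from datetime import datetime, timezone

def counter_created(dt: datetime) -> str:
    """Format a datetime in the RFC3339 profile used by COUNTER (UTC, 'Z' suffix)."""
    return dt.astimezone(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")

# Example with a fixed instant:
stamp = counter_created(datetime(2023, 4, 28, 9, 30, 0, tzinfo=timezone.utc))
# stamp == "2023-04-28T09:30:00Z"
```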
Created_By |
COUNTER element name. The name of the organization or system that created the COUNTER report. |
|
Crossref |
A not-for-profit membership organization for publishers. |
|
Customer |
An individual or organization that can access a specified range of the content provider’s services and/or content that is subject to the agreed terms and conditions. |
|
Customer_ID |
The element in the COUNTER reports that indicates whose usage is being reported. May be a proprietary or standard value such as ISNI. |
ISNI:000000012150090X |
Data harvesting |
Automated processes used for extracting data from websites. |
|
Data_Repository |
An online database service; an archive that manages the long-term storage and preservation of digital resources and provides a catalogue for discovery and access. A COUNTER Host_Type. |
Figshare |
Data_Type |
The element identifying the type of content. |
Article, Book, Book_Segment, Database, Dataset, Journal, Multimedia, Newspaper_Or_Newsletter, Other, Platform, Report, Repository_Item, Thesis_Or_Dissertation |
Database |
A collection of electronically stored data or unit records (facts, bibliographic data, texts) with a common user interface and software for the retrieval and manipulation of data. (NISO) A COUNTER Data_Type. |
Social Science Abstracts, Reaxys |
Database Master Report |
A COUNTER report that contains additional filters and breakdowns beyond those included in the Database Standard Views and is aggregated to the database level. |
|
Database Reports |
A series of COUNTER reports that provide usage aggregated to the database level. |
|
Dataset |
A collection of data. A COUNTER Data_Type. |
|
Delayed Open Access |
See OA_Delayed. |
|
Digital Object Identifier |
See DOI. |
|
Discovery Layer |
A web-accessible interface for searching, browsing, filtering, and otherwise interacting with indexed metadata and content. The searches produce a single, relevancy-ranked results set, usually displayed as a list with links to full content, when available. Typically, discovery layers are customizable by subscribing libraries and may be personalized by individual users. |
|
Discovery service |
A pre-harvested central index coupled with a fully featured discovery layer. A COUNTER Host_Type. |
EDS, Primo, Summon |
Distributed Usage Logging (DUL) |
A peer-to-peer channel for the secure exchange and processing of COUNTER-compliant private usage records from hosting platforms to publishers. |
|
DNS lookups |
Domain Name System lookups. |
|
DOI (digital object identifier) |
A standard identifier (ANSI/NISO Z39.84). The digital object identifier is a means of identifying a piece of intellectual property (a creation) on a digital network, irrespective of its current location. DOIs may be assigned at the title, article/chapter, or component level. |
|
Double-click |
Two clicks in succession on the same link by the same user within a period of 30 seconds. COUNTER requires that double-clicks must be counted as a single click. |
|
Double-click filtering |
A process to remove the potential of over-counting which could occur when a user clicks the same link multiple times. Double-click filtering applies to Total_Item and Access Denied Metric_Types. |
|
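The 30-second rule above can be sketched as a small filter. This is an illustrative implementation, not COUNTER's reference code: it assumes click events arrive as (user, item, time-in-seconds) tuples sorted by time, and follows the convention of retaining the time of the most recent click when a double-click is suppressed.

```python
def filter_double_clicks(events, window=30):
    """Drop repeat clicks on the same link by the same user within `window` seconds.

    events: iterable of (user_id, item_id, timestamp_in_seconds), sorted by time.
    Returns the list of events that should be counted.
    """
    last_click = {}   # (user_id, item_id) -> time of most recent click
    counted = []
    for user, item, t in events:
        key = (user, item)
        if key in last_click and t - last_click[key] <= window:
            # Double-click: count once only; retain the time of the latest click.
            last_click[key] = t
        else:
            last_click[key] = t
            counted.append((user, item, t))
    return counted
```

With this convention, three clicks at 0 s, 10 s, and 45 s yield two counted clicks: the second is suppressed, and the third is counted because it falls more than 30 seconds after the retained (second) click.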
DR |
Database Master Report. |
|
DR_D1 |
Database Search and Item Usage. A pre-set Standard View of DR showing Total_Item_Investigations and Requests, as well as Searches_Regular, Automated and Federated. |
|
DR_D2 |
Database Access Denied. A pre-set Standard View of DR showing where users were denied access because simultaneous-use (concurrency) licenses were exceeded, or their institution did not have a license for the database. |
|
DUL |
See Distributed Usage Logging (DUL). |
|
eBook |
Monographic content that is published online. A COUNTER Host_Type. |
|
eBook_Collection |
A branded group of eBooks that can be subscribed to/licensed and searched as a complete group. A COUNTER Host_Type. |
|
eBook host |
A content host that provides access to eBook and reference work content. |
EBL, EBSCOhost, ScienceDirect |
EC |
See Executive Committee. |
|
eJournal |
Serial content that is published online. A COUNTER Host_Type. |
|
eJournal host |
A content host that provides access to online serial publications (journals, conferences, newspapers, etc.). |
ScienceDirect |
Element |
A piece of information to be reported on, displayed as a column heading (and/or in the report header) in a COUNTER report. |
|
Embargo period |
The period of time before an article is moved out from behind the paywall, i.e. from Controlled to OA_Delayed. |
|
End_Date |
The last date in the range for the usage represented in a COUNTER report. |
|
Enhanced Version of Record |
A version of the Version of Record of a journal article that has been updated or enhanced by the provision of supplementary material. For example, multimedia objects such as audio clips and applets; additional XML-tagged sections, tables, or figures or raw data. |
|
e-Resources |
Electronic resources. |
|
Exception |
An optional element that may be included within a COUNTER report indicating some difference between the usage that was requested and the usage that is being presented in the report. An Exception includes the Exception Code and Exception Message and may include additional Data that further describes the error. |
3031: Usage Not Ready for Requested Dates (request was for 2016-01-01 to 2016-12-31, but usage is only available to 2016-08-31). |
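In JSON reports and COUNTER_SUSHI responses, an Exception such as the example above is carried as a small object. The snippet below is a hand-written illustration in the style of the COUNTER_SUSHI API; field spellings and the Severity vocabulary should be checked against the API specification.

```json
{
  "Code": 3031,
  "Severity": "Warning",
  "Message": "Usage Not Ready for Requested Dates",
  "Data": "Request was for 2016-01-01 to 2016-12-31; usage is only available to 2016-08-31"
}
```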
Exception Code |
A unique numeric code included as part of an Exception that identifies the type of error. |
|
Exception Message |
A short description of the Exception encountered. The Message is normally a standard message for the Exception Code concerned. See Appendix F. |
|
Exclude_Monthly_Details |
A COUNTER report attribute for tabular reports that specifies whether the columns with the month-by-month breakdown of the usage are excluded from the report. |
|
Executive Committee |
The committee which deals with the day-to-day activities of COUNTER’s business. |
|
Federated search |
A search conducted by a federated search application that allows users to simultaneously search multiple content sources, typically hosted by different vendors, with a single query from a single user interface. The federated search application typically presents the user with a single set of results collected from the content sources searched. The end user is not responsible for selecting the content sources being searched. The content sources being searched will report such activity as Searches_Federated. See Appendix G. |
MetaLib, EBSCOhost Connection |
Filter |
See Report filters. |
|
Format |
A COUNTER element for extending reports, used to identify the format of the content. Reserved values include: HTML, PDF, Other. |
|
Full_Content_Database |
A COUNTER Host_Type for content providers that offer databases that are a collection of content items that are not otherwise part of a serial or monograph (i.e. non-aggregated). Note: In contrast to A&I_Databases and Aggregated_Full_Content the Investigations and Requests for Full_Content_Databases (like for example Cochrane Database of Systematic Reviews) are reported with Data_Type Database. |
|
Full-text article |
The complete text - including all references, figures, and tables - of an article, plus links to any supplementary material published with it. |
|
Full-text database |
A database that contains the complete text of books, dissertations, journals, magazines, newspapers or other kinds of textual documents. [Wikipedia] |
|
GDPR |
General Data Protection Regulation. |
|
Gold Open Access |
See OA_Gold. |
|
Host |
See Content host. |
Ingenta, Semantico, SpringerLink |
Host Site |
See Content host. |
|
Host_Type |
A categorization of content hosts used by COUNTER to facilitate implementation of the Code of Practice. The Code of Practice identifies the Host_Types that apply to the various artefacts in the Code of Practice, allowing a content host to quickly identify the areas of the Code of Practice to implement by identifying the Host_Types that apply to them. |
A&I_Database, Aggregated_Full_Content, Data_Repository, Discovery_Service, eBook, eBook_Collection, eJournal, Full_Content_Database, Multimedia, Multimedia_Collection, Repository, Scholarly_Collaboration_Network |
Host UI |
User interface that an end user would use to access content on the content host. |
|
HTTP |
Hypertext Transfer Protocol. |
|
Hybrid publication |
A publication that is available via a subscription license but also contains articles available as Gold Open Access. |
|
Institution |
The organization for which usage is being reported. |
|
Institution_ID |
A unique identifier for an institution. In COUNTER reports the Institution_ID is presented as a combination of the identifier namespace and its value. Proprietary identifiers that identify the content platform can be used. |
ISNI:000000012150090X, EBSCOhost:s12345 |
Institution_Name |
The element in the COUNTER reports that indicates the name of the institution. |
|
Institutional identifier |
See Institution_ID. |
|
Internet robot, crawler, spider |
Any automated program or script that visits websites and systematically retrieves information from them, often to provide indexes for search engines. See Appendix I. |
|
Investigation |
A category of COUNTER Metric_Types that represent a user accessing information related to a content item (e.g. an abstract or detailed descriptive metadata of an article) or a content item itself (e.g. full text of an article). |
|
IP |
Internet Protocol. |
|
IP address |
The Internet Protocol (IP) address of the computer on which the session is conducted, i.e. the identifying network address (typically four 8-bit numbers separated by “.” for IPv4, or eight groups of up to four hexadecimal digits separated by “:” for IPv6) of the user’s computer or proxy. May be used by content providers as a means of authentication and authorization and for identifying the institution with which a user is affiliated. |
|
IR |
Item Master Report. |
|
IR_A1 |
Journal Article Requests. A pre-set Standard View of IR showing Total and Unique_Item_Requests for journal articles. |
|
IR_M1 |
Multimedia Item Requests. A pre-set Standard View of IR showing Total_Item_Requests for multimedia items. |
|
ISBN (International Standard Book Number) |
A unique standard identifier (ISO 2108) used to identify monographic publications (books). |
|
ISIL |
International Standard Identifier for Libraries and Related Organizations (ISO 15511). In COUNTER reports ISILs can be used as identifiers for institutions. |
|
ISNI |
International Standard Name Identifier (ISO 27729). A unique number used to identify authors, contributors, and distributors of creative works, including researchers, inventors, writers, artists, visual creators, performers, producers, publishers, aggregators, etc. In COUNTER reports ISNIs can be used as identifiers for institutions, publishers and item contributors (authors). |
|
ISO |
International Organization for Standardization. |
|
ISSN (International Standard Serial Number) |
A unique standard identifier (ISO 3297) used to identify a print or electronic periodical publication. A periodical published in both print and electronic form may have two ISSNs, a print ISSN and an electronic ISSN. |
|
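The ISSN's final character is a check digit (modulus 11, with weights 8 down to 2 and X standing for 10), so identifiers reported as Print_ISSN or Online_ISSN can be sanity-checked. A small sketch of that standard calculation, assuming Python (hyphens are ignored):

```python
def issn_check_digit(issn: str) -> str:
    """Compute the check character for an ISSN from its first seven digits."""
    digits = issn.replace("-", "")[:7]
    # Weights 8, 7, 6, 5, 4, 3, 2 applied to the seven data digits
    total = sum(int(d) * w for d, w in zip(digits, range(8, 1, -1)))
    remainder = total % 11
    if remainder == 0:
        return "0"
    check = 11 - remainder
    return "X" if check == 10 else str(check)

# The glossary's own example ISSNs validate:
# issn_check_digit("0028-4793") == "3"
# issn_check_digit("1533-4406") == "6"
```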
Issue |
A collection of journal articles that share a specific issue number and are presented as an identifiable unit online and/or as a physically bound and covered set of numbered pages in print. |
|
Item |
Collective term for content that is reported at a high level of granularity, e.g. a full-text article (original or a review of other published work), an abstract or digest of a full-text article, a sectional HTML page, supplementary material associated with a full-text article (e.g. a supplementary data set), or non-textual resources such as an image, a video, audio, a dataset, a piece of code, or a chemical structure or reaction. |
Full-text article, Abstract, Database record, Dataset, Thesis |
Item Master Report |
A COUNTER report that provides usage data at the item or item-component level. |
|
Item Reports |
A series of COUNTER reports that provide usage data at the item or item-component level. |
|
JavaScript Object Notation |
See JSON. |
|
Journal |
A serial that is a branded and continually growing collection of original articles within a particular discipline. A COUNTER Data_Type. |
Tetrahedron Letters |
Journal Requests |
Journal content items retrieved. |
|
jQuery |
A JavaScript library. |
|
JSON |
JavaScript Object Notation (JSON) is an open standard file format that uses human-readable text to transmit data objects consisting of attribute–value pairs and array data types. [Wikipedia] |
|
License |
A contract or agreement that provides an organization or individual (licensee) with the right to access certain content. |
|
Limit_Exceeded |
A COUNTER Metric_Type. A user is denied access to a content item because the simultaneous-user limit for their institution’s license would be exceeded. |
|
Linking_ISSN |
A COUNTER report item identifier for the International Standard Serial Number that links together the ISSNs assigned to all instances of a serial publication (ISSN-L) in the format nnnn-nnn[nX] (JSON reports only). |
|
Log file analysis |
A method of collecting usage data in which the web server records all of its transactions. |
|
Master Reports |
COUNTER reports that contain additional filters and breakdowns beyond those included in the Standard Views. |
|
Metadata |
A series of textual elements that describes a content item but does not include the item itself. For example, metadata for a journal article would typically include publisher, journal title, volume, issue, page numbers, copyright information, a list of names and affiliations of the authors, author organization addresses, the article title and an abstract of the article, and keywords or other subject classifications. |
|
Metadata provider |
An organization, such as a publisher, that provides descriptive article/item-level metadata to an online search service. |
|
Metric_Type |
A COUNTER report attribute that identifies the nature of the usage activity. |
Total_Item_Requests, Searches_Regular, Limit_Exceeded, Unique_Title_Requests |
Monograph Text |
See Book. |
|
Multimedia |
Non-textual media such as images, audio, and video. A COUNTER Host_Type. A COUNTER Data_Type. |
|
Multimedia collection |
A grouping of multimedia items that are hosted and searched as a single unit and behave like a database. A COUNTER Host_Type. See also Database. |
|
Multimedia item |
An item of non-textual media content such as an image or streaming or downloadable audio or video files. (Does not include thumbnails or descriptive text/metadata.) |
|
Namespace |
A term primarily used in programming languages where the same name may be used for different objects. It is created to group together those names that might be repeated elsewhere within the same or interlinked programs, objects and elements. For example, an XML namespace consists of element types and attribute names. Each of the names within that namespace is only related/linked to that namespace. The name is uniquely identified by the namespace identifier ahead of the name. For example, Namespace1:John and Namespace2:John are the same names but within different namespaces. |
|
Newspaper or Newsletter |
Textual content published serially in a newspaper or newsletter. A COUNTER Data_Type. |
|
NISO |
The National Information Standards Organization is a United States non-profit standards organization that develops, maintains and publishes technical standards related to publishing, bibliographic and library applications. [Wikipedia] |
|
No_License |
A COUNTER Metric_Type. A user is denied access to a content item because the user or the user’s institution does not have access rights under an agreement with the vendor. |
|
OA |
See Open Access. |
|
OA_Delayed |
A COUNTER Access_Type that is reserved for future use and must not be implemented. At the time of the transaction, the content item was available as Open Access because the publisher’s embargo period had expired (delayed Open Access). |
|
OA_Gold |
A COUNTER Access_Type. At the time of the transaction, the content item was available under a Gold Open Access license (content that is immediately and permanently available as Open Access because an article processing charge applies or the publication process was sponsored by a library, society, or other organization). Content items may be in hybrid publications or fully Open Access publications. Note that content items offered as delayed Open Access (open after an embargo period) currently must be classified as Controlled, pending the implementation of OA_Delayed. |
|
OCLC |
OCLC (Online Computer Library Center). An American non-profit cooperative organization “dedicated to the public purposes of furthering access to the world’s information and reducing information costs”. It was founded in 1967 as the Ohio College Library Center. [Wikipedia] |
|
Online_ISSN |
A COUNTER report item identifier for the ISSN assigned to the online manifestation of a serial work. See also ISSN. |
1533-4406 |
Open Access |
Open Access (OA) refers to online research outputs that are free of all restrictions on access (e.g. access tolls) and free of many restrictions on use (e.g. certain copyright and license restrictions). Open Access can be applied to all forms of published research output, including peer-reviewed and non-peer-reviewed academic journal articles, conference papers, theses, book chapters, and monographs. [Wikipedia] |
|
ORCID |
An international standard identifier for individuals (i.e. authors) to use with their name as they engage in research, scholarship, and innovation activities. See https://orcid.org/. A COUNTER identifier type for item contributors. |
|
Other |
A content item or section that cannot be classified by any of the other Data_Types or Section_Types. A COUNTER Data_Type. A COUNTER Section_Type for Title Reports. |
|
Other_Free_to_Read |
A COUNTER Access_Type for institutional repositories. At the time of the transaction, the content item was freely available for reading (no license required) and did not qualify under the OA_Gold Access_Type. |
|
Page tag |
Page-tagging is a method of collecting usage data that uses, for example, JavaScript on each page to notify a third-party server when a page is rendered by a web-browser. |
|
Parent |
In COUNTER Item Reports the parent is the publication an item is part of. For a journal article, the parent is the journal, and for a book chapter it is the book. |
|
Paywall |
A term used to describe the fact that a user attempting to access a content item must be authorized by license or must pay a fee before the content can be accessed. |
|
PDF |
Portable Document Format, a standard file format for representing electronic documents (ISO 32000). Items such as full-text articles or journals published in PDF format tend to replicate the printed page in appearance. |
 |
PHP |
PHP is a general-purpose programming language originally designed for web development. The PHP reference implementation is now produced by The PHP Group. [Wikipedia] |
|
Platform |
The content host of an aggregator, publisher, or other online service that delivers the content to the user and that counts and provides the COUNTER usage reports. Individual titles or groups of content might have their own branded user experience but reside on a common host. A COUNTER Data_Type. |
Wiley Online Library, HighWire |
Platform Master Report |
A COUNTER report that contains additional filters and breakdowns beyond those included in the Platform Standard Views, and which is aggregated to the platform level. |
|
Platform Reports |
A series of COUNTER reports that provide usage aggregated to the platform level. |
|
Platform search |
A search conducted at the platform level. |
|
Platform usage |
Activity across all metrics for entire platforms. |
|
PR |
Platform Master Report. |
|
PR_P1 |
Platform Usage. A pre-set Standard View of PR showing Total and Unique_Item_Requests and Unique_Title_Requests, as well as Searches_Platform. |
|
Print_ISSN |
A COUNTER report item identifier for the ISSN assigned to the print manifestation of a work. See also ISSN. |
0028-4793 |
Proprietary_ID |
A COUNTER report item identifier for a unique identifier given by publishers and other content providers to a product or collection of products. |
|
Proprietary Identifier |
See Proprietary_ID. |
|
Publication date |
The date of release by the publisher to customers of a content item. An element in COUNTER Item Reports. |
|
Publisher |
An organization whose function is to commission, create, collect, validate, host, distribute and trade information online and/or in printed form. |
Sage, Cambridge University Press |
Publisher_ID |
An element in COUNTER reports for a publisher’s unique identifier. In COUNTER reports the Publisher_ID is presented as a combination of identifier namespace and value. |
|
R4 |
Release 4. |
|
R5 |
Release 5. |
|
Reference work |
An authoritative source of information about a subject used to find quick answers to questions. The content may be stable or updated over time. |
Dictionary, encyclopedia, directory, manual, guide, atlas, index |
References |
A list of works referred to in an article or chapter with sufficient detail to enable the identification and location of each work. |
|
Registry of compliance |
The COUNTER register of content providers compliant with the COUNTER Code of Practice. |
|
Regular |
A COUNTER Access_Method. Indicates that usage was generated by a human user browsing/searching a website, rather than by text and data mining processes. |
|
Regular search |
A search conducted by a user on a host where the user has the option of selecting the databases being searched. |
|
Release |
Version of the COUNTER Code of Practice. |
|
Report |
A document that presents information in an organized format. A COUNTER Data_Type. |
|
Report attributes |
Report attributes are elements in COUNTER reports that describe the nature of usage for an item or affect how the usage is broken down. In COUNTER Master Reports the Report_Attributes report header includes a series of report attributes applied to the report. This affects how the usage is presented (i.e. which columns/elements are included in the report), but it does not change the totals. |
Attributes_To_Show=Access_Type|YOP |
Report filters |
Report filters can be used to limit the usage returned in a COUNTER report. For Standard Views the report filters are pre-set, for Master Reports they can be used to customize the report. The Report_Filters report header includes a series of report filters applied to the report. |
Data_Type=Journal |
Report_ID |
The alphanumeric identifier of a specific Master Report or Standard View. |
PR, DR_D1, TR_J3 |
Report name |
The name of a COUNTER Master Report or Standard View. |
Journal Requests (Excluding OA_Gold) |
Report validation tool |
See COUNTER Report Validation Tool. |
|
Reporting period |
The total time period covered in a usage report. |
Begin_Date=2018-01-01; End_Date=2018-06-30 |
Repository |
A host that provides access to an institution’s research output. Includes subject, institutional, and departmental repositories. A COUNTER Host_Type. |
Cranfield CERES |
Repository item |
A content item hosted in a repository, including one that consists of one or more digital objects such as text files, audio, video or data, described by associated metadata. A COUNTER Data_Type. |
|
Request |
A category of COUNTER Metric_Types that represents a user accessing content (e.g. full text of an article). |
|
Requestor ID |
An identifier for the organization or individual requesting a COUNTER report via the COUNTER_SUSHI API. |
|
Required reports |
The COUNTER reports that Host_Types are required to provide. |
|
Research data |
Data that supports research findings and may include databases, spreadsheets, tables, raw transaction logs, etc. |
|
RESTful COUNTER_SUSHI API |
A RESTful implementation of SUSHI automation intended to return COUNTER Release 5 reports and snippets of COUNTER usage in JSON format. RESTful is based on representational state transfer (REST) technology, an architectural style and approach to communications often used in web services development. |
|
Robot |
See Internet robot, crawler, spider. |
|
ROR (Research Organization Registry) |
ROR is a community-led registry of open, sustainable, usable, and unique identifiers for every research organization in the world. See https://ror.org/. In COUNTER reports ROR IDs can be used as identifiers for institutions and publishers. |
|
Scholarly Collaboration Network |
A service used by researchers to share information about their work. A COUNTER Host_Type. |
Mendeley, Reddit/Science |
Screen scraping |
The action of using a computer program to copy data from a website. |
|
Search |
A user-driven intellectual query, typically equated to submitting the search form of the online service to the server. For COUNTER reports a search is counted any time a system executes a search to retrieve a new set of results. This means that systems that perform multiple searches (e.g. search for exact match, search for words in subject, general search) to return a single set of results must only count a single search, not multiple searches. Note that link resolution never counts as a search. |
|
Search engine |
A service that allows users to search for content via the World Wide Web. |
|
Searches_Automated |
A COUNTER Metric_Type used to report on searches conducted on a host site or discovery service where multiple databases are searched simultaneously with a single query and the end user does not have the option of selecting the databases being searched. See also Automated search. |
|
Searches_Federated |
A COUNTER Metric_Type used to report on searches conducted by a federated search application. See Appendix G. See also Federated search. |
|
Searches_Platform |
A COUNTER Metric_Type used to report on searches conducted at the platform level. Note: Searches conducted against multiple databases on the platform will only be counted once. |
|
Searches_Regular |
A COUNTER Metric_Type used to report on searches conducted by a user on a host site where the user has the option of selecting the databases being searched. Note: If a search is conducted across multiple databases, each database searched will count that search. See also Regular search. |
|
Section |
A group of chapters or articles. A COUNTER Section_Type. |
|
Section_Type |
A COUNTER report attribute that identifies the type of section that was accessed by the user. |
Article, Book, Chapter, Other |
Serial |
A publication in any medium issued in successive parts bearing numerical or chronological designations and intended to be continued indefinitely. This definition includes periodicals, journals, magazines, electronic journals, ongoing directories, annual reports, newspapers, monographic series, and also those journals, magazines, and newsletters of limited duration that otherwise bear all the characteristics of serials (e.g. newsletter of an event). [NISO] |
|
Server-side scripting language |
Server-side scripting is a technique used in web development which involves employing scripts on a web server which produce a response customized for each user’s request to the website. The alternative is for the web server itself to deliver a static web page. [Wikipedia] |
|
Service |
See Content host. |
ScienceDirect, Academic Universe |
Session |
A successful use of an online service. A session begins when a single user connects to the service or database and ends when the user terminates activity, either explicitly (by leaving the service through exit or logout) or implicitly (timeout due to user inactivity). [NISO] |
|
Session cookie |
A data file that a web server can place on a browser to track activity by a user and attribute that usage to a session. |
|
Session ID |
A unique identifier for a single user session. If the content provider’s website does not assign and capture a unique identifier for each user session, then a surrogate session ID can be generated using the browser user agent, the user’s IP address, and a one-hour time slice (see Section 7 for details). The Session ID is used for double-click filtering and for computing Unique_Item and Unique_Title metrics. |
|
Sites |
See Hosts. |
|
Spider |
See Internet robot, crawler, spider. |
|
Standard View |
A predefined version of a Master report, designed to meet the most common needs. |
Book Requests (Excluding OA_Gold), Journal Article Requests |
Standardized Usage Statistics Harvesting Initiative |
See SUSHI. |
|
Status code |
HTTP response status code. Status codes are issued by a server in response to a client’s request made to the server. [Wikipedia] |
|
SUSHI |
An international standard (ANSI/NISO Z39.93) used by COUNTER R4 that describes a method for automating the harvesting of reports. Also the short form for the COUNTER_SUSHI API used in COUNTER R5 for harvesting COUNTER reports. COUNTER compliance requires content hosts to implement the COUNTER_SUSHI API. |
|
Tab Separated Value |
See TSV. |
|
TDM |
Text and data mining (TDM) is a computational process whereby text or datasets are crawled by software that recognizes entities, relationships, and actions. [STM Publishers] A COUNTER Access_Method used to separate regular usage from usage that represents access to content for the purposes of text and data mining. |
|
Text and data mining |
See TDM. |
|
Thesis or Dissertation |
Dissertation: a long essay on a particular subject, especially one written as a requirement for a degree. Thesis: a long essay or dissertation involving personal research, written by a candidate for a college degree. A COUNTER Data_Type. |
|
Title |
The name of a book, journal, or reference work. |
|
Title Master Report |
A COUNTER report that contains additional filters and breakdowns beyond those included in the Title Standard Views and is aggregated to the publication title level rather than the individual article/chapter level. |
|
Title Reports |
A series of COUNTER reports where usage is aggregated to the publication title level. |
|
TLS (HTTPS) |
Transport Layer Security (TLS) protocol, Hypertext Transfer Protocol Secure (HTTPS) protocol. |
|
Total_Item_Investigations |
A COUNTER Metric_Type that represents the number of times users accessed the content (e.g. a full text) of an item, or information describing that item (e.g. an abstract). |
|
Total_Item_Requests |
A COUNTER Metric_Type that represents the number of times users requested the full content (e.g. a full text) of an item. Requests may take the form of viewing, downloading, emailing, or printing content, provided such actions can be tracked by the content provider. |
|
TR |
Title Report. |
|
TR_B1 |
Book Requests (Excluding OA_Gold). A pre-set Standard View of TR showing full text activity for all book content which is not Gold Open Access. |
|
TR_B2 |
Book Access Denied. A pre-set Standard View of TR showing where users were denied access because simultaneous-use (concurrency) licenses were exceeded, or their institution did not have a license for the book. |
|
TR_B3 |
Book Usage by Access Type. A pre-set Standard View of TR showing all applicable Metric_Types broken down by Access_Type. |
|
TR_J1 |
Journal Requests (Excluding OA_Gold). A pre-set Standard View of TR showing full text activity for all journal content which is not Gold Open Access. |
|
TR_J2 |
Journal Access Denied. A pre-set Standard View of TR showing where users were denied access because simultaneous-use licenses were exceeded, or their institution did not have a license for the journal. |
|
TR_J3 |
Journal Usage by Access Type. A pre-set Standard View of TR showing all applicable Metric_Types broken down by Access_Type. |
|
TR_J4 |
Journal Requests by YOP (Excluding OA_Gold). A pre-set Standard View of TR breaking down the full text usage of non-Gold Open Access content by year of publication (YOP). |
|
Transaction |
A usage event. |
|
TSV |
A tab-separated values (TSV) file is a simple text format for storing data in a tabular structure, e.g. database table or spreadsheet data. Each record in the table is one line of the text file. Each field value of a record is separated from the next by a tab character. [Wikipedia] |
|
Turnaway |
See Access denied. |
|
Unique item |
A content item assessed during a session. Each unique content item accessed in a session is counted once per user session, even if there are multiple requests for the same content item during a session. |
|
Unique_Item_Investigations |
A COUNTER Metric_Type that represents the number of unique content items investigated in a user session. Examples of content items are articles, books, book chapters, and multimedia files. |
|
Unique_Item_Requests |
A COUNTER Metric_Type that represents the number of unique content items requested in a user session. Examples of content items are articles, books, book chapters, and multimedia files. |
|
Unique title |
A book assessed during a session. Each unique book title accessed in a session is counted once per user session, even if there are multiple requests for the same title during a session. |
|
Unique_Title_Investigations |
A COUNTER Metric_Type that represents the number of unique titles investigated in a user session. This Metric_Type is only applicable for Data_Type Book. |
|
Unique_Title_Requests |
A COUNTER Metric_Type that represents the number of unique titles requested in a user session. This Metric_Type is only applicable for Data_Type Book. |
|
URI |
In information technology, a Uniform Resource Identifier (URI) is a string of characters that unambiguously identifies a particular resource. To guarantee uniformity, all URIs follow a predefined set of syntax rules, but also maintain extensibility through a separately defined hierarchical naming scheme (e.g. http://). [Wikipedia] An element in COUNTER reports used to identify the item for which usage is being reported. |
|
URL |
Uniform Resource Locator. The address of a World Wide Web page. |
|
URN |
Uniform Resource Name, which identifies a resource by name in a particular namespace. |
|
User |
A person who accesses the online resource. |
|
User agent |
An identifier that is part of the HTTP protocol that identifies the software (e.g. browser) being used to access the site. May be used by robots to identify themselves. |
|
User cookie |
A small piece of data sent from a website and stored on the user’s computer by the user’s web browser while the user is browsing. |
|
User session |
See Session. |
|
UTF-8 |
UTF-8 is a variable width character encoding capable of encoding all 1,112,064 valid code points in Unicode using one to four 8-bit bytes. The encoding is defined by the Unicode Standard, and was originally designed by Ken Thompson and Rob Pike. The name is derived from Unicode Transformation Format - 8-bit. [Wikipedia] |
|
Vendor |
A publisher or other online information provider who delivers licensed content to the customer and with whom the customer has a contractual relationship. |
Taylor & Francis, EBSCO |
Version of Record |
A fixed version of a journal article that has been made available by any organization that acts as a publisher that formally and exclusively declares the article “published”. |
|
W3C |
The World Wide Web Consortium is the main international standards organization for the World Wide Web. [Wikipedia] |
|
XML |
A mark-up language that defines a set of rules for encoding documents in a format that is both human-readable and machine-readable. [Wikipedia] |
|
Year of Publication |
See YOP. |
|
YOP |
Year of publication. Calendar year in which an article, item, issue, or volume is published. For the COUNTER report attribute YOP, use the year of publication of the Version of Record if the year of publication differs between the print and online versions. |
|
Z39.50 |
An international standard protocol created by NISO for search. A Z39.50 client can search any Z39.50-compatible online service. Often used by federated search applications to facilitate searching content at other sites. |
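The surrogate Session ID described in the glossary entry above (browser user agent, IP address, and a one-hour time slice) can be sketched as follows. The concatenation order and the use of SHA-256 here are illustrative assumptions, not the normative algorithm; see Section 7 for the normative rules.

```python
import hashlib
from datetime import datetime, timezone

def surrogate_session_id(user_agent: str, ip_address: str, when: datetime) -> str:
    """Build a surrogate session ID from user agent, IP address, and a
    one-hour time slice. Hash choice and field order are illustrative."""
    hour_slice = when.strftime("%Y-%m-%dT%H")  # truncate to one-hour granularity
    raw = "|".join([user_agent, ip_address, hour_slice])
    return hashlib.sha256(raw.encode("utf-8")).hexdigest()

# Two events by the same client within the same hour map to the same session;
# an event in the next hour starts a new surrogate session.
ua, ip = "Mozilla/5.0", "192.0.2.10"
t1 = datetime(2018, 1, 15, 9, 10, tzinfo=timezone.utc)
t2 = datetime(2018, 1, 15, 9, 55, tzinfo=timezone.utc)
t3 = datetime(2018, 1, 15, 10, 5, tzinfo=timezone.utc)
assert surrogate_session_id(ua, ip, t1) == surrogate_session_id(ua, ip, t2)
assert surrogate_session_id(ua, ip, t1) != surrogate_session_id(ua, ip, t3)
```

The one-hour slice means a session boundary falls at the top of each clock hour, which keeps the surrogate deterministic without any server-side state.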
Appendix B: Changes from Previous Releases
Note: The main Code of Practice document takes precedence in the case of any conflicts between it and this appendix.
Starting with Release 5.0.2 the COUNTER Code of Practice Release 5 is maintained in the GitHub repository Project-Counter/cop5, and all changes are tracked with GitHub issues and linked pull requests. The change log lists all changes by type and impact of the change.
B.1 Changes from COUNTER Release 4 (R4)
Changes in the nature of online content and how it is accessed have resulted in the COUNTER Code of Practice evolving in an attempt to accommodate those changes. This evolution resulted in some ambiguities and, in some cases, conflicts and confusions within the Code of Practice. Release 5 (R5) of the COUNTER Code of Practice is focused on improving the consistency, credibility, and comparability of usage reporting.
B.1.1 List of Reports
R5 reduces the overall number of reports by replacing many of the special-purpose reports that are seldom used with four Master Reports and a number of Standard Views that are more flexible. All COUNTER R4 reports have either been renamed or eliminated in favour of R5 Master Report or Standard View options.
R4 report |
R5 Report/Status |
Comments |
---|---|---|
Book Report 1: Number of Successful Title Requests by Month and Title |
Book Requests (Excluding OA_Gold) |
The Unique_Title_Requests metric is equivalent to the full-text requests in Book Report 1. |
Book Report 2: Number of Successful Section Requests by Month and Title |
Book Requests (Excluding OA_Gold) |
The Total_Item_Requests metric is equivalent to full text requests in Book Report 2. |
Book Report 3: Access Denied to Content Items by Month, Title and Category |
Book Access Denied |
Limit_Exceeded and No_License metrics are equivalent to those found in Book Report 3. |
Book Report 4: Access Denied to Content items by Month, Platform and Category |
Eliminated (no equivalent) |
“Book Access Denied” can be used to provide summary statistics by platform. For book collections the denials would be reported in “Database Access Denied”. |
Book Report 5: Total Searches by Month and Title |
Eliminated (no equivalent) |
For most platforms, attempting to track searches by titles is not reasonable since all titles are included in most searches. |
Book Report 7: Number of Successful Unique Title Requests by Month and Title in a Session |
Book Requests (Excluding OA_Gold) |
The Unique_Title_Requests metric is equivalent to the full-text requests in Book Report 7. |
Consortium Report 1: Number of Successful Full-Text Journal Article or Book Chapter Requests by Month and Title |
Eliminated |
Consortium administrators will request “Journal Requests (Excluding OA_Gold)” for each member. This can be automated via the COUNTER_SUSHI API using the /members path. Tools will be provided to create consolidated reports that are functionally equivalent to Consortium Report 1. |
Consortium Report 2: Total Searches by Month and Database |
Eliminated |
Consortium administrators will request “Database Search and Item Usage” for each member. This can be automated via the COUNTER_SUSHI API using the /members path. Tools will be provided to create consolidated reports that are functionally equivalent to Consortium Report 2. |
Consortium Report 3: Number of Successful Multimedia Full Content Unit Requests by Month and Collection |
Eliminated |
For multimedia collections that are equivalent to databases, consortium administrators will request “Database Search and Item Usage” for each member. This can be automated via the COUNTER_SUSHI API using the /members path. Tools will be provided to create consolidated reports that are functionally equivalent to Consortium Report 3. |
Database Report 1: Total Searches, Result Clicks and Record Views by Month and Database |
Database Search and Item Usage |
Result Clicks and Record Views have been replaced by Total_Item_Investigations. Metrics for regular searches remain unchanged, and federated and automated searches are now reported separately. The report also includes Requests metrics. |
Database Report 2: Access Denied by Month, Database and Category |
Database Access Denied |
Report renamed and updated Metric_Types used. |
Journal Report 1: Number of Successful Full-Text Article Requests by Month and Journal |
Journal Requests (Excluding OA_Gold) |
Total_Item_Requests is equivalent to the full-text total. HTML and PDF totals have been eliminated, but Unique_Item_Requests can be used to evaluate the effect of the user interface on statistics and offers a comparable statistic for cost-per-unique-use analysis. |
Journal Report 1 GOA: Number of Successful Gold Open Access Full-Text Article Requests by Month and Journal |
Title Master Report |
The Title Master Report can be filtered by “Access_Type=OA_Gold; Metric_Type=Total_Item_Requests” to obtain equivalent results. |
Journal Report 1a: Number of Successful Full-Text Article Requests from an Archive by Month and Journal |
Journal Requests by YOP (Excluding OA_Gold) |
The R5 report breaks out usage by year of publication (YOP) to enable evaluation of usage of content for which perpetual access rights are available. |
Journal Report 2: Access Denied to Full-Text Articles by Month, Journal and Category |
Journal Access Denied |
The Limit_Exceeded and No_License metrics are equivalent to the corresponding metrics in the R4 report. |
Journal Report 3: Number of Successful Item Requests by Month, Journal and Page-type |
Title Master Report |
The Title Master Report can be configured to show Section_Types, which provides details similar to JR3. Other details like the audio and video usage can be reported in the Item Master Report (using the Component elements where appropriate). |
Journal Report 3 Mobile: Number of Successful Item Requests by Month, Journal and Page-type for usage on a mobile device |
Eliminated (no equivalent) |
Capturing usage by mobile devices is less relevant with the responsive design of most sites. The variety of mobile devices also makes it difficult, as does the fact that today’s smartphones have screen resolutions that exceed those of some desktops. |
Journal Report 4: Total Searches Run By Month and Collection |
Eliminated (no equivalent) |
To the extent that a journal collection is organized for searching as a discrete collection (rare), usage would be reported in “Database Search and Item Usage”. |
Journal Report 5: Number of Successful Full-Text Article Requests by Year-of-Publication (YOP) and Journal |
Journal Requests by YOP (Excluding OA_Gold) |
This R5 report offers a breakdown of journal usage by year of publication (YOP) and the resulting report can be analysed using filters or pivot tables. |
Multimedia Report 1: Number of Successful Full Multimedia Content Unit Requests by Month and Collection |
Database Search and Item Usage |
Multimedia usage, where multimedia is packaged and accessed as separate collections, would be reported using “Database Search and Item Usage”. |
Multimedia Report 2: Number of Successful Full Multimedia Content Unit Requests by Month, Collection and Item Type |
Multimedia Item Requests |
The R5 report provides a more detailed breakdown by item and includes attributes such as Data_Type. This report can be used to provide summary statistics by type. |
Platform Report 1: Total Searches, Result Clicks and Record Views by Month and Platform |
Platform Usage |
The R5 report provides equivalent metrics as well as additional metrics related to item full-text requests. |
Title Report 1: Number of Successful Requests for Journal Full-Text Articles and Book Sections by Month and Title |
Title Master Report |
The Title Master Report offers a single report for books and journals and can show the usage broken down by Section_Type. |
Title Report 1 Mobile: Number of Successful Requests for Journal Full-Text Articles and Book Sections by Month and Title (formatted for normal browsers/delivered to mobile devices AND formatted for mobile devices/delivered to mobile devices) |
Eliminated (no equivalent) |
Capturing usage by mobile devices is less relevant with the responsive design of most sites. The variety of mobile devices also makes it difficult, as does the fact that today’s smartphones have screen resolutions exceeding those of some desktops. |
Title Report 2: Access Denied to Full-Text Items by Month, Title and Category |
Title Master Report |
The Title Master Report offers a single report for books and journals and includes the options to show Access Denied metrics. |
Title Report 3: Number of Successful Item Requests by Month, Title and Page Type |
Title Master Report |
The Title Master Report offers a single report for books and journals and can show Requests metrics. |
Title Report 3 Mobile: Number of Successful Item Requests by Month, Title and Page Type (formatted for normal browsers/delivered to mobile devices AND formatted for mobile devices/delivered to mobile devices) |
Eliminated (no equivalent) |
Capturing usage by mobile devices is less relevant with the responsive design of most sites. The variety of mobile devices also makes it difficult, as does the fact that today’s smartphones have screen resolutions exceeding those of some desktops. |
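Several of the mappings above note that consortium reporting is automated via the COUNTER_SUSHI API `/members` path. A minimal sketch of how a consortium administrator might construct the member list and per-member report URLs; the base URL and credentials are hypothetical, while the `/members` and `/reports` paths and the `customer_id`/`requestor_id`/`begin_date`/`end_date` parameters follow the COUNTER_SUSHI API:

```python
from urllib.parse import urlencode

def members_url(base_url: str, customer_id: str, requestor_id: str) -> str:
    """URL for the list of consortium members (COUNTER_SUSHI /members path)."""
    query = urlencode({"customer_id": customer_id, "requestor_id": requestor_id})
    return f"{base_url}/members?{query}"

def member_report_url(base_url: str, report_id: str, member_customer_id: str,
                      requestor_id: str, begin: str, end: str) -> str:
    """URL for one member's report, e.g. TR_J1 (Journal Requests (Excluding OA_Gold))."""
    query = urlencode({
        "customer_id": member_customer_id,
        "requestor_id": requestor_id,
        "begin_date": begin,
        "end_date": end,
    })
    return f"{base_url}/reports/{report_id.lower()}?{query}"

# Hypothetical endpoint and credentials, for illustration only.
base = "https://example.com/sushi"
url_members = members_url(base, "consortium1", "req-123")
url_tr_j1 = member_report_url(base, "TR_J1", "member42", "req-123",
                              "2018-01-01", "2018-06-30")
```

Fetching `url_members`, iterating over the returned member list, and requesting `url_tr_j1`-style URLs for each member yields data functionally equivalent to the retired Consortium Reports.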
B.1.2 Report Format
With R5, all COUNTER reports are structured the same way to ensure consistency, not only between reports, but also between the JSON and tabular versions of the reports. All reports now share the same format for the header, the report body is derived from the same set of element names, total rows have been eliminated, and data values are consistent between the JSON and tabular versions (see Section 3.2). R5 also addresses the problems of terminology and report layouts varying from report to report, and of JSON and tabular versions of the same report producing different results while still being compliant.
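In the tabular versions, the shared header appears as Name/Value rows at the top of the file, separated from the column headings by a blank row. A sketch of reading that header from a TSV report; the sample values are illustrative, and the exact set and order of header rows is defined in Section 3.2:

```python
import csv
import io

# Illustrative header rows for a TR_J1 report; a real report may include
# further rows (e.g. Institution_ID) as specified in Section 3.2.
SAMPLE = """Report_Name\tJournal Requests (Excluding OA_Gold)
Report_ID\tTR_J1
Release\t5
Institution_Name\tExample University
Metric_Types\tTotal_Item_Requests; Unique_Item_Requests
Report_Filters\tData_Type=Journal; Access_Type=Controlled
Reporting_Period\tBegin_Date=2018-01-01; End_Date=2018-06-30
Created\t2018-07-01T12:00:00Z
Created_By\tExample Platform
"""

def read_header(tsv_text: str) -> dict:
    """Parse the Name/Value header rows of a tabular COUNTER report.
    The blank row that separates header from body ends the scan."""
    header = {}
    for row in csv.reader(io.StringIO(tsv_text), delimiter="\t"):
        if not row or not row[0]:
            break
        header[row[0]] = row[1] if len(row) > 1 else ""
    return header

h = read_header(SAMPLE)
assert h["Report_ID"] == "TR_J1"
assert h["Release"] == "5"
```

Because every report shares this header shape, the same parser works for PR, DR, TR, and IR reports alike.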
B.1.3 Metric Types
Release 5 of the COUNTER Code of Practice strives for simplicity and clarity by reducing the number of metric types and standardizing them across all reports, as applicable. With R4, Book Reports had different metric types from those in Journal Reports, and some metric types carried additional attributes such as mobile usage or usage by format. Most COUNTER R4 metric types have either been renamed or eliminated in favour of new R5 Metric_Types. The table below shows the R4 metric types as documented for SUSHI and their R5 status.
R4 Metric Types |
R5 Equivalence or Status |
Comments |
---|---|---|
abstract |
Total_Item_Investigations |
Actions against an item are tracked using the more generic Total_Item_Investigations metric. Due to the variety of types of item attributes that can be investigated, COUNTER no longer attempts to track them with separate Metric_Types. |
audio |
Eliminated |
This metric was only used in JR3/TR3 reports which saw little implementation or use. The intent was to represent activity of objects embedded in articles. |
data_set |
Eliminated |
When a content item is a dataset, the Total_Item_Requests metric is used in combination with a Data_Type of Dataset. |
ft_epub |
Total_Item_Requests |
More generic Total_Item_Requests are now used in place of format-specific metrics. |
ft_html |
Total_Item_Requests |
More generic Total_Item_Requests are now used in place of format-specific metrics. |
ft_html_mobile |
Eliminated |
Tracking of activity by mobile devices is no longer required for COUNTER compliance. |
ft_pdf |
Total_Item_Requests |
More generic Total_Item_Requests are now used in place of format-specific metrics. |
ft_pdf_mobile |
Eliminated |
Tracking of activity by mobile devices is no longer required for COUNTER compliance. |
ft_ps |
Total_Item_Requests |
More generic Total_Item_Requests are now used in place of format-specific metrics. |
ft_ps_mobile |
Eliminated |
Tracking of activity by mobile devices is no longer required for COUNTER compliance. |
ft_total |
Total_Item_Requests |
Total_Item_Requests is a comparable metric. |
image |
Eliminated |
This metric was only used in JR3/TR3 reports which saw little implementation or use. The intent was to represent activity of objects embedded in articles. |
multimedia |
Total_Item_Requests |
More generic Total_Item_Requests are now used in place of format-specific metrics. |
no_license |
No_License |
No change. |
other |
Eliminated |
Other usage provides no value. |
podcast |
Eliminated |
This metric was only used in JR3/TR3 reports which saw little implementation or use. The intent was to represent activity of objects embedded in articles. |
record_view |
Total_Item_Investigations |
Actions against an item are tracked using the more generic Total_Item_Investigations metrics. Due to the variety of types of item attributes that can be investigated, COUNTER no longer attempts to track them with separate Metric_Types. |
reference |
Total_Item_Investigations |
Actions against an item are tracked using the more generic Total_Item_Investigations metrics. Due to the variety of types of item attributes that can be investigated, COUNTER no longer attempts to track them with separate Metric_Types. |
result_click |
Total_Item_Investigations |
Actions against an item are tracked using the more generic Total_Item_Investigations metrics. Due to the variety of types of item attributes that can be investigated, COUNTER no longer attempts to track them with separate Metric_Types. |
search_fed |
Searches_Federated |
The R4 automated and federated search metrics have been separated into two separate metrics since the nature of the activity is very different. |
search_reg |
Searches_Regular |
For database reports, use Searches_Regular. When reporting at the platform level use Searches_Platform. |
sectioned_html |
Total_Item_Requests |
More generic Total_Item_Requests are now used in place of format-specific metrics. |
toc |
Total_Item_Investigations |
Actions against an item are tracked using the more generic Total_Item_Investigations metrics. Due to the variety of types of item attributes that can be investigated, COUNTER no longer attempts to track them with separate Metric_Types. Note that for journals TOCs aren’t item-level objects, therefore TOC usage MUST NOT be reported for journals. |
turnaway |
Limit_Exceeded |
Renamed to provide more clarity into the nature of the access-denied event. |
video |
Eliminated |
This metric was only used in JR3/TR3 reports which saw little implementation or use. The intent was to represent activity of objects embedded in articles. |
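When migrating an R4 reporting pipeline, the table above can be captured directly as a lookup. A sketch, grounded in the mappings listed above, where `None` marks an R4 metric type that was eliminated with no R5 equivalent:

```python
# R4 metric types mapped to their R5 Metric_Type, per the table above.
R4_TO_R5 = {
    "abstract": "Total_Item_Investigations",
    "audio": None,
    "data_set": None,
    "ft_epub": "Total_Item_Requests",
    "ft_html": "Total_Item_Requests",
    "ft_html_mobile": None,
    "ft_pdf": "Total_Item_Requests",
    "ft_pdf_mobile": None,
    "ft_ps": "Total_Item_Requests",
    "ft_ps_mobile": None,
    "ft_total": "Total_Item_Requests",
    "image": None,
    "multimedia": "Total_Item_Requests",
    "no_license": "No_License",
    "other": None,
    "podcast": None,
    "record_view": "Total_Item_Investigations",
    "reference": "Total_Item_Investigations",
    "result_click": "Total_Item_Investigations",
    "search_fed": "Searches_Federated",
    "search_reg": "Searches_Regular",
    "sectioned_html": "Total_Item_Requests",
    "toc": "Total_Item_Investigations",
    "turnaway": "Limit_Exceeded",
}

def r5_metric(r4_metric: str):
    """Return the R5 Metric_Type for an R4 metric type, or None if eliminated."""
    return R4_TO_R5.get(r4_metric.lower())

assert r5_metric("ft_total") == "Total_Item_Requests"
assert r5_metric("turnaway") == "Limit_Exceeded"
assert r5_metric("audio") is None
```

Note that search_reg maps to Searches_Regular for database reports; when reporting at the platform level, Searches_Platform is used instead, so the lookup alone is not sufficient for platform-level rows.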
B.1.4 New elements and attributes introduced
With R4 the nature of the usage sometimes had to be inferred based on the name of the report. In an effort to provide more consistent and comparable reporting, R5 introduces some additional attributes that content providers can track with the usage and use to create breakdowns and summaries of usage.
Attribute |
Description |
Values |
---|---|---|
Access_Type |
Used in conjunction with Investigations and Requests, this attribute indicates if, at the time of the investigation or request, access to the item was controlled (e.g. subscription or payment required) or was available as Open Access or another free-to-read option. |
Controlled |
Access_Method |
This attribute is used to distinguish between regular usage (users accessing scholarly information for research purposes) and usage for the purpose of Text and Data Mining (TDM). |
Regular |
Data_Type |
Used to generally classify the nature of the item the usage is being presented for. |
Article |
Publisher_ID |
A unique identifier for the publisher, preferably a standard identifier such as ISNI. For the JSON version of the report, the type (namespace) and value are separate. For tabular, the format is {namespace}:{value}. |
ISNI:1233344455678889 |
Section_Type |
Used in conjunction with Data_Type, this attribute tracks requests to the level of the section requested. Used mostly with books where content may be delivered by chapter or section, this element defines the nature of the section retrieved. |
Article |
YOP |
This attribute records the year of publication of the item. The YOP attribute replaces the year-of-publication ranges in R4’s JR5 report and is tracked for all metrics in Title and Item Reports. |
A 4-digit year, e.g. 2012 |
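These attributes surface in the COUNTER_SUSHI API as report filters and as the `attributes_to_show` parameter on Master Report requests. A sketch of composing a Title Master Report request that breaks usage down by Access_Type and YOP; the base URL and credentials are hypothetical, while the parameter names follow the COUNTER_SUSHI API:

```python
from urllib.parse import urlencode

def tr_request_url(base_url: str, customer_id: str, requestor_id: str,
                   begin: str, end: str) -> str:
    """Title Master Report (TR) request filtered to Controlled journal usage,
    with Access_Type and YOP shown as breakdown columns."""
    params = {
        "customer_id": customer_id,
        "requestor_id": requestor_id,
        "begin_date": begin,
        "end_date": end,
        "data_type": "Journal",
        "access_type": "Controlled",
        "attributes_to_show": "Access_Type|YOP",
    }
    return f"{base_url}/reports/tr?{urlencode(params)}"

# Hypothetical endpoint and credentials, for illustration only.
url = tr_request_url("https://example.com/sushi", "inst1", "req-123",
                     "2018-01-01", "2018-06-30")
```

Dropping the `access_type` filter and adding `"yop": "2012"` would instead restrict the report to a single publication year, mirroring the drill-down that R4 needed special-purpose reports to achieve.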
Appendix C: Vendor/Aggregator/Gateway Declaration of COUNTER Compliance
We <name of Content Provider> (‘The Company’) hereby confirm the following:
- That the following online usage reports supplied by The Company to its customers, which The Company claims to be ‘COUNTER-compliant’, conform to Release 5 of the COUNTER Code of Practice: <insert list of COUNTER-compliant reports>
The Company agrees that it will implement the protocols specified in Section 7 of Release 5 of the Code of Practice to correct for the effects of federated searches and internet robots on usage statistics.
Where The Company supplies to customers online usage statistics not included in the usage reports covered in 1 above, but which use terms defined in the COUNTER Code of Practice, that the definitions used by The Company are consistent with those provided in the COUNTER Code of Practice.
The Company will pay to COUNTER the Vendor Registration Fee (£350/US$500), unless The Company is a Member of COUNTER in good standing, for whom this fee is waived.
That to maintain COUNTER-compliant status, the usage reports provided by The Company to its customers will be independently audited according to a schedule and standards specified by COUNTER.
Appendix D: Guidelines for Implementation
Note: The main Code of Practice document takes precedence in the case of any conflicts between it and this appendix.
Our Friendly Guide To Release 5 Technical Notes for Providers provides guidelines for implementation.

Appendix E: Audit Requirements and Tests
Note: The main Code of Practice document takes precedence in the case of any conflicts between it and this appendix.
E.1 Audit Requirements
The COUNTER audit seeks to mirror the activity of an institution (a customer) carrying out usage on the content provider’s platform. The COUNTER audit procedures and tests ensure the usage reports provided by content providers comply with the COUNTER R5 Code of Practice.
Third-party hosts and vendors must be taken into account for usage reporting. Both hosts and vendors have additional audit requirements. The details are below:
Third-party hosts: Some publishers have their online content hosted by a third party that provides standard usage statistics reporting as part of a broader hosting service. In these cases, it is the third-party host that is audited. For the audit the third-party host must provide the auditor with a list of all publishers hosted by them and the related COUNTER Reports and Standard Views. The auditor will then select a minimum of two publishers at random from the list and audit accordingly.
Third-party vendors: Some publishers use third-party companies that provide bespoke usage-statistics reporting services, where the solutions used may differ significantly for each client publisher. In this case the third-party vendor must provide the auditor with a list of all their client publishers and the COUNTER Reports and Standard Views offered by each. The auditor will then select 10% of the publishers (up to a maximum of 5, with a minimum of 2) from this list and carry out the audit tests specified below.
Prior to an audit any third-party host/vendor must discuss with COUNTER how they provide usage statistics reporting. COUNTER can advise which category applies.
The auditor will request written confirmation from COUNTER prior to proceeding.
E.1.1 Audit Tests and Process
COUNTER defines specific audit tests for each of the COUNTER required usage reports. As content providers may work with different auditors, the audit test scripts help to ensure each auditor follows a common auditing procedure. This is covered in greater depth in Section E.2 Audit Tests.
There are three audit compliance stages:
Stage 1: Format. The usage reports are reviewed by the auditor to confirm they adhere to the COUNTER R5 Code of Practice. This review includes format and presentation of all relevant COUNTER reports.
Stage 2: Data Integrity. The usage statistics reported by the content provider are verified by the auditor to accurately record the activity carried out during the audit. The auditor will check that the content provider supplies consistent usage statistics when reports are accessed using different browsers, including Google Chrome, Internet Explorer/Microsoft Edge, and Mozilla Firefox as a minimum. Note: COUNTER will review the selected browsers annually.
To ensure reports count correctly per the COUNTER R5 Code of Practice, the browser caches of the test machines must be disabled. The content provider must also confirm before the audit period whether or not they operate a cache server. If they do, tests may not report as the COUNTER R5 Code of Practice expects, and some under-reporting is likely.
Stage 3: Report Delivery. The auditor tests that the content provider has implemented the COUNTER_SUSHI API correctly and reports can be accessed using SUSHI according to the instructions supplied by the content provider (which must comply with the COUNTER_SUSHI API specification). Note: Implementation of the COUNTER_SUSHI API is a requirement for compliance and is covered by the Declaration of COUNTER Compliance signed by all compliant content providers. Delivery of reports via Excel or tab separated value (TSV) file will still be required as specified in the COUNTER R5 Code of Practice.
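As an illustration of the report-delivery check, the sketch below builds a COUNTER_SUSHI request URL for a Standard View. The parameter names follow the COUNTER_SUSHI API specification; the base URL and credential values are invented placeholders, and a real audit would use the instructions supplied by the content provider.

```python
from urllib.parse import urlencode

def sushi_report_url(base_url, report_id, customer_id, requestor_id,
                     begin_date, end_date, api_key=None):
    """Build a COUNTER_SUSHI GET URL for a report, e.g. /reports/tr_j1.

    The query parameters (customer_id, requestor_id, begin_date, end_date,
    api_key) are those defined by the COUNTER_SUSHI API specification; the
    base_url and credential values passed in below are placeholders.
    """
    params = {
        "customer_id": customer_id,
        "requestor_id": requestor_id,
        "begin_date": begin_date,   # first day of the audit month
        "end_date": end_date,       # last day of the audit month
    }
    if api_key:
        params["api_key"] = api_key
    return f"{base_url.rstrip('/')}/reports/{report_id.lower()}?{urlencode(params)}"

# Hypothetical endpoint and test-account credentials:
url = sushi_report_url("https://sushi.example.com/counter/r5",
                       "TR_J1", "cust-001", "req-001",
                       "2023-01-01", "2023-01-31")
print(url)
```

The auditor would then compare the JSON returned from such a request against the tabular (Excel/TSV) version of the same report.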
The COUNTER auditor cannot express an opinion regarding usage reported in any other accounts/institutions, or regarding aspects of the COUNTER R5 Code of Practice not specifically tested.
E.1.2 Frequency of the Audit
Once a content provider is listed in the Register of COUNTER Compliant Content Providers, an independent audit is required within 6 months to maintain COUNTER-compliant status. Independent audits must be completed annually thereafter. Content providers that are members of COUNTER in the Smaller Publisher category may be audited biennially with permission from COUNTER. Failure to meet these audit requirements will result in a content provider losing its COUNTER-compliant status.
If COUNTER does not receive a satisfactory auditor’s report within the specified timeframe, the following control procedures apply:
New content providers having signed the Declaration of Compliance:
| Timing | Action |
|---|---|
| 6 months after signing | A reminder from COUNTER that the first auditor’s report is required |
| 8 months after signing | A final reminder from COUNTER that the first auditor’s report is required |
| 9 months after signing | The content provider is removed from the registry and is notified by COUNTER that they are non-compliant and must not make reference to COUNTER or use the COUNTER logo. |
Content providers previously audited:
| Timing | Action |
|---|---|
| 3 months following the due audit date | A reminder from COUNTER that an auditor’s report is required |
| 4 months following the due audit date | A further reminder from COUNTER that an auditor’s report is required |
| 5 months following the due audit date | A final reminder from the Chair of the COUNTER Executive Committee that an auditor’s report is required |
| 6 months following the due audit date | The content provider is removed from the registry and is notified by COUNTER that they are non-compliant and must not make reference to COUNTER or use the COUNTER logo. |
E.1.3 Host Types and COUNTER Reports
Independent audits are required for COUNTER reports according to Host_Types. See Table 1 (below).
Table 1: COUNTER Reports Requiring Audit
| Category | Report_ID | R5 Report_Name | Master Report / Standard View | Host_Type |
|---|---|---|---|---|
| Platform | PR | Platform Master Report | Master Report | All |
| Platform | PR_P1 | Platform Usage | Standard View | All |
| Database | DR | Database Master Report | Master Report | Aggregated_Full_Content |
| Database | DR_D1 | Database Searches and Item Usage | Standard View | Aggregated_Full_Content |
| Database | DR_D2 | Database Access Denied | Standard View | Aggregated_Full_Content |
| Title | TR | Title Master Report | Master Report | Aggregated_Full_Content |
| Title | TR_B1 | Book Requests (excluding OA_Gold) | Standard View | Aggregated_Full_Content |
| Title | TR_B2 | Book Access Denied | Standard View | Aggregated_Full_Content |
| Title | TR_B3 | Book Usage by Access Type | Standard View | Aggregated_Full_Content |
| Title | TR_J1 | Journal Requests (excluding OA_Gold) | Standard View | Aggregated_Full_Content |
| Title | TR_J2 | Journal Access Denied | Standard View | Aggregated_Full_Content |
| Title | TR_J3 | Journal Usage by Access Type | Standard View | Aggregated_Full_Content |
| Title | TR_J4 | Journal Requests by YOP (excluding OA_Gold) | Standard View | Aggregated_Full_Content |
| Item | IR | Item Master Report | Master Report | Data_Repository |
| Item | IR_A1 | Journal Article Requests | Standard View | Repository |
| Item | IR_M1 | Multimedia Item Requests | Standard View | Multimedia |
E.1.4 Audit Test Requirements
COUNTER defines the basic reporting period as a calendar month. A report run for any given month MUST reflect all activity of a customer for the entire selected audit month. The auditor must also conduct and conclude all audit tests within the audit month.
To prevent any collision of reported data, an auditor must be allowed to set up and maintain separate accounts for each of the audit tests. During the audit month, there should not be any activity on the audit accounts other than activity generated by the auditor. Any non-auditor activity on the test accounts will make the test reports unreliable, may result in further audit tests and may incur additional costs.
Prior to the audit, the content provider must supply to the auditor:
Account details for at least 4 separate accounts with access to all areas required to be tested (or specific restrictions for turn-away testing).
Links to download usage reports in all required formats. COUNTER reports must be provided as tabular versions, which can be easily imported into Microsoft Excel.
SUSHI credentials for the test accounts to enable verification of SUSHI harvesting and formatting of the harvested reports.
A declaration confirming federated and automated searches have been disaggregated from any searches reported. See the COUNTER R5 Code of Practice for further information on the protocols regarding federated and automated searches.
If server-side caching is implemented, information on cache settings used should be provided.
Note: Server-side caching can cause a discrepancy between the usage recorded in the audit tests and the usage reported by the content provider. Information on cache settings enables the auditor to take them into account when evaluating the results of the report tests. If the content provider does not provide this information, the auditor is likely to require further audit tests that may incur additional costs.
E.2 Audit Tests
The audit tests required by the COUNTER R5 Code of Practice are detailed below.
E.2.1 Platform Reports
E.2.1.1 Master Report: PR
The Platform Master Report will be COUNTER-compliant if the following Standard View passes the COUNTER R5 Audit. The figures reported in the Standard View must match the figures reported in the Platform Master Report.
E.2.1.2 Standard View: PR_P1
This Standard View presents platform-level usage summarized by Metric_Type.
An audit of this Standard View requires the following for all audit tests, unless otherwise specified:
The auditor must have access to all available content on the platform of the content provider.
All searches, including those returning 0 results, must be reported as Searches_Platform in the PR_P1 Standard View.
The auditor must allow at least 31 seconds between each request.
Each time a search is conducted, the auditor will record the search term, the database searched (if applicable), and the number of results returned.
Audit tests P1-1, P1-2 and P1-3 must take place in separate accounts so that each audit test can be separately reported.
For each applicable audit test to achieve an audit pass, all metrics reported by the content provider in PR_P1 Standard View from the auditor’s test account must be within a -8% and +3% reliability window of the same audit metrics on the auditor’s report.
The audit test requirements may vary depending on the set up of the platform and any related database(s).
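The -8%/+3% reliability window used throughout these audit tests can be expressed as a simple check. This helper is illustrative only, not part of the Code of Practice:

```python
def within_reliability_window(reported, expected, lower=-0.08, upper=0.03):
    """Return True if a content provider's reported count falls within the
    COUNTER audit reliability window (-8% to +3%) of the auditor's count.

    `expected` is the count from the auditor's own test log; the default
    bounds mirror the window stated in the audit requirements above.
    """
    return expected * (1 + lower) <= reported <= expected * (1 + upper)

# With an auditor count of 100, reported values of 92-103 pass:
print(within_reliability_window(92, 100))   # True
print(within_reliability_window(104, 100))  # False
```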
Audit Test P1-1: Searches_Platform
The auditor will perform audit tests based on the relevant option detailed below.
Option 1: The platform has multiple databases and the user can search: All databases, a selected subset of databases, and a single database.
The auditor must run 100 searches on the platform, including 50 searches against only 1 selected database, 25 against 2 selected databases, and 25 against all databases. Each of these searches must report 1 Searches_Platform in the PR_P1 Standard View.
Option 2: The platform has multiple databases and the user can search: All databases and a single database.
The auditor must run 100 searches on the platform, including 50 searches against only 1 selected database and 50 against all databases. Each of these searches must report 1 Searches_Platform in the PR_P1 Standard View.
Option 3: The platform has multiple databases and the user can only search all databases.
The auditor must run 100 searches. Each of these searches can only be run on all databases. Each of the searches must report 1 Searches_Platform in the PR_P1 Standard View.
Option 4: The platform has multiple databases and the user can search a selected single database.
The auditor must run 100 searches, each of these searches can only be run on a single database, and the testing should cover a reasonable amount of the available databases. Each of these searches must report 1 Searches_Platform in the PR_P1 Standard View.
Option 5: The platform has a single database.
The auditor must run 50 searches on the platform, with all 50 searches run against the 1 database. Each of these searches must report 1 Searches_Platform in the PR_P1 Standard View.
Audit Test P1-2: Total_Item_Requests, Unique_Item_Requests and Unique_Title_Requests
The auditor will perform audit tests based on the relevant option detailed below.
Multiple paths should be used to make the audit requests. When possible, 50% of items requested should be via browsing the platform and 50% via searching the platform.
If browsing to items or accessing items via searching is not possible, then 100% of requested items can be accessed via the only available option.
Option 1: Platform has multiple databases that include books.
The auditor must make a total of 100 requests on a subset of unique items, including 50 requests against items not within books (if available) and 50 requests against items within books (if available).
If the platform only has content that is within books, then all 100 requests must be made to items within books.
The auditor requests must result in:
100 Total_Item_Requests and 100 Unique_Item_Requests in the PR_P1 Standard View.
Each book must have 5 items within the book requested. This will report 5 Total_Item_Requests, 5 Unique_Item_Requests and 1 Unique_Title_Requests.
The Unique_Title_Requests being reported in the PR_P1 Standard View will be determined by the number of unique books noted by the auditor during the testing.
Option 2: The Platform has multiple databases that do not include books.
The auditor must make 100 requests on a subset of the unique items available to them.
This must result in 100 Total_Item_Requests and 100 Unique_Item_Requests reported in the PR_P1 Standard View. The number of Unique_Title_Requests will be 0 and therefore not reported.
Option 3: Platform has a single database that includes books.
The auditor must make 50 requests on items available to them, including 25 requests against items not within books (if available) and 25 requests against items within books (if available).
If the platform only has content that is within books, then all 50 requests must be made to items within books.
This must result in 50 Total_Item_Requests being reported in the PR_P1 Standard View.
Each book must have 5 items within the book requested. This will report 5 Total_Item_Requests, 5 Unique_Item_Requests, and 1 Unique_Title_Requests.
The Unique_Title_Requests being reported in the PR_P1 Standard View will be determined by the number of unique books noted by the auditor during the testing.
Option 4: Platform has a single database that does not include books.
The auditor must make 50 requests on items.
This must result in 50 Total_Item_Requests and 50 Unique_Item_Requests being reported in the PR_P1 Standard View. The number of Unique_Title_Requests will be 0 and therefore not reported.
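The relationship between the three request metrics in these options can be illustrated with a small tally over a hypothetical request log (the book and item identifiers below are invented for the example):

```python
def tally_requests(requests):
    """Tally the three request metrics used in audit test P1-2.

    `requests` is a list of (title_id, item_id) pairs, where title_id is
    None for items that are not within a book. The 30-second double-click
    filter is assumed to have already been applied.
    """
    total_item_requests = len(requests)
    unique_item_requests = len({(t, i) for t, i in requests})
    unique_title_requests = len({t for t, _ in requests if t is not None})
    return total_item_requests, unique_item_requests, unique_title_requests

# Five distinct items requested within one book -> 5 / 5 / 1,
# matching the per-book expectation stated above:
book = [("book-1", f"chapter-{n}") for n in range(1, 6)]
print(tally_requests(book))  # (5, 5, 1)
```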
Audit Test P1-3: Total_Item_Requests and Unique_Item_Requests - 30-second filters
The audit test consists of clicking links to an item twice in succession (double-clicks). If the two clicks occur within a 30-second time-span, only the second Total_Item_Requests must be recorded. If the two clicks occur with more than 30 seconds between them, then 2 Total_Item_Requests must be counted. In both cases only 1 Unique_Item_Requests will be reported.
The auditor must carry out a total of 30 tests on the platform, each test will consist of 2 requests. There are 2 types of tests that must be carried out:
“Inside” tests (Two identical requests are made, and the second request is made within 30 seconds of the first).
“Outside” tests (Two identical requests are made, and the second request is made more than 30 seconds after the first).
The auditor must carry out 15 inside tests.
This must result in:
15 Total_Item_Requests and 15 Unique_Item_Requests in the PR_P1 Standard View.
The auditor must carry out 15 outside tests.
This must result in:
30 Total_Item_Requests and 15 Unique_Item_Requests in the PR_P1 Standard View.
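The inside/outside behaviour tested above can be sketched as a double-click filter over click timestamps on a single item. This is a simplified model for illustration; the Code of Practice defines the full rule:

```python
def count_item_requests(click_times, window=30):
    """Apply the 30-second double-click filter to clicks on one item.

    `click_times` are timestamps in seconds, in order. A click within
    `window` seconds of the previously retained click replaces it (only
    the second of a double-click is counted); a click more than `window`
    seconds later counts separately.
    Returns (total_item_requests, unique_item_requests).
    """
    total = 0
    last = None
    for t in click_times:
        if last is None or t - last > window:
            total += 1          # outside the window: a new request
        # inside the window: the later click replaces the earlier one,
        # so the total is unchanged
        last = t
    unique = 1 if click_times else 0  # a single item yields at most 1
    return total, unique

print(count_item_requests([0, 10]))  # inside test  -> (1, 1)
print(count_item_requests([0, 40]))  # outside test -> (2, 1)
```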
E.2.2 Database Reports
E.2.2.1 Master Report: DR
The Database Master Report will be COUNTER-compliant if the following Standard Views pass the COUNTER R5 audit. The figures reported in the Standard Views must match the figures reported in the Database Master Report.
Any Standard View not applicable to the content provider does not require auditing. Any exclusions must be confirmed by COUNTER prior to testing and the auditor notified.
E.2.2.2 Standard View: DR_D1
Database Searches and Item Usage: This Standard View reports on the key search and request metrics needed to evaluate a database.
An audit of this Standard View requires the following for all audit tests unless otherwise specified:
The auditor must have access to all databases available on the platform of the content provider. Any exclusions must be agreed prior to the audit by COUNTER and communicated to the auditor.
The auditor must allow at least 31 seconds between each request.
Each time a search is conducted, the auditor will record the search term, the databases searched, and the number of results returned.
All searches, including those returning 0 results, must be reported as a Searches_Regular or Searches_Automated in the DR_D1 Standard View.
Audit tests D1-1, D1-2, D1-3, D1-4 and D1-5 must take place in separate accounts so that each audit test can be separately reported.
For each applicable audit test to achieve an audit pass, all metrics reported by the content provider in DR_D1 Standard View from the auditor’s test account must be within a -8% and +3% reliability window of the same audit metric(s) on the auditor’s report.
Audit Test D1-1: Searches_Regular and Searches_Automated
The auditor will perform audit tests based on the relevant option detailed below.
Option 1: The content provider has multiple databases and the user can search: All databases, a selected subset of databases, and a single database.
The auditor must run 100 searches, including 50 against only 1 selected database, 25 against 2 selected databases, and 25 against all databases.
Each of the searches on a single database must report 1 Searches_Regular in the DR_D1 Standard View.
Each of the searches over 2 auditor selected databases must report 1 Searches_Regular against each of the selected databases in the DR_D1 Standard View.
Each of the searches over all databases must report 1 Searches_Regular against each of the databases offered by the content provider in the DR_D1 Standard View.
Option 2: The content provider has multiple databases and the user can search: All databases and a single database.
The auditor must run 100 searches, including 50 against only 1 selected database and 50 against all databases.
Each of the searches on a single database must report 1 Searches_Regular in the DR_D1 Standard View.
Each of the searches over all databases must report 1 Searches_Regular against each of the databases offered by the content provider in the DR_D1 Standard View.
Option 3: The content provider has multiple databases and the user can only search all databases.
The auditor must run 100 searches. Each of these searches can only be run on all databases. Each of the searches must report 1 Searches_Automated against each of the databases offered by the content provider in the DR_D1 Standard View.
Option 4: The content provider has multiple databases and the user can search a single database only.
The auditor must run 100 searches, each of which can only be run on a single database, and the testing should cover a reasonable number of the available databases.
Each of the searches must report 1 Searches_Regular in the DR_D1 Standard View.
Option 5: The content provider has a single database.
The auditor must run 50 searches against the single database.
Each of the searches must report 1 Searches_Regular in the DR_D1 Standard View.
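Across these options, the key point is that one user search is attributed to every database it runs against (whether it is reported as Searches_Regular or Searches_Automated depends on the option). A minimal tally, with database names invented for the example:

```python
from collections import Counter

def tally_searches(search_events, all_databases):
    """Attribute searches to databases as in audit test D1-1.

    Each event is either a list of the databases the user selected, or the
    string "ALL" when the search ran over every database on the platform.
    One search over N databases reports 1 search against each of the N
    databases in the DR_D1 Standard View.
    """
    counts = Counter()
    for selection in search_events:
        targets = all_databases if selection == "ALL" else selection
        for db in targets:
            counts[db] += 1
    return counts

dbs = ["DB-A", "DB-B", "DB-C"]
events = [["DB-A"], ["DB-A", "DB-B"], "ALL"]
print(tally_searches(events, dbs))  # DB-A: 3, DB-B: 2, DB-C: 1
```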
Audit Test D1-2: Total_Item_Requests
The auditor must make 100 requests on a subset of available unique items.
This must result in 100 Total_Item_Requests reported in the DR_D1 Standard View.
Multiple paths should be used to make the requests. When possible, 50% of items requested should be via browsing the platform and 50% via searching the platform.
If browsing to items or accessing items via searching is not possible, then 100% of requested items can be accessed via the only available option.
Audit Test D1-3: Total_Item_Requests - 30-second filters
The audit test consists of making an Item_Request twice in succession (double-clicks). If the two clicks occur within a 30-second time-span, only the second Total_Item_Requests must be recorded. If the two clicks occur with more than 30 seconds between them, then 2 Total_Item_Requests must be counted. The auditor must carry out a total of 30 tests, and each test will consist of 2 requests. There are 2 types of tests that must be carried out:
“Inside” tests (The 2 requests are made to the same item, and the second request is made within 30 seconds of the first).
“Outside” tests (The 2 requests are made to the same item, and the second request is made more than 30 seconds after the first).
The auditor must carry out 15 inside tests.
This must result in:
15 Total_Item_Requests being reported in the DR_D1 Standard View.
The auditor must carry out 15 outside tests.
This must result in:
30 Total_Item_Requests being reported in the DR_D1 Standard View.
Audit Test D1-4: Total_Item_Investigations
IMPORTANT NOTE: This test is required when investigations can be reported independently of a request. It is not required if all investigations have a matching request, but this must be verified during the audit. Any exclusion of tests must be confirmed by COUNTER prior to testing and the auditor notified.
The auditor must make 100 Investigations on a subset of available unique items. This must result in 100 Total_Item_Investigations.
Multiple paths should be used to make the investigations. When possible, 50% of item investigations should be via browsing and 50% via searching.
If either browsing to item investigations or accessing item investigations via searching is not possible, then 100% of item investigations can be made via the only available option.
Audit Test D1-5: Total_Item_Investigations - 30-second filters
IMPORTANT NOTE: This test is required when investigations can be reported independently of a request. It is not required when all investigations have a matching request. Any exclusion of tests must be confirmed by COUNTER prior to testing and the auditor notified.
The audit test consists of making an Item_Investigation twice in succession (double-clicks). If the two clicks occur within a 30-second time-span, only the second Total_Item_Investigations made must be recorded. If the two clicks occur with more than 30 seconds between, then 2 Total_Item_Investigations must be counted.
The auditor must carry out a total of 30 tests, and each test will consist of 2 item investigations. There are 2 types of tests that must be carried out:
“Inside” tests (Two item investigations are made to the same item, and the second item investigation is made within 30 seconds of the first).
“Outside” tests (Two item investigations are made to the same item, and the second item investigation is made more than 30 seconds after the first).
The auditor must carry out 15 inside tests.
This must result in:
15 Total_Item_Investigations being reported in the DR_D1 Standard View.
The auditor must carry out 15 outside tests.
This must result in:
30 Total_Item_Investigations being reported in the DR_D1 Standard View.
E.2.2.3 Standard View: DR_D2
Databases Access Denied: This Standard View reports on access-denied activity for databases where a user is denied access because simultaneous user licenses are exceeded or the institution does not have a license for the database.
An audit of this Standard View and related tests requires the following:
The auditor must allow at least 31 seconds between each request unless otherwise specified.
Each time a user is denied access, the auditor will record the database on which the denial was produced.
Audit tests D2-1 and D2-2 must take place in separate accounts so that each audit test can be separately reported.
For each applicable audit test to achieve an audit pass, all metrics reported by the content provider in DR_D2 Standard View from the auditor’s test account must be within a -8% and +3% reliability window of the same audit metrics on the auditor’s report.
Audit Test D2-1: Limit_Exceeded
IMPORTANT NOTE: This test can only be carried out if the content provider has a concurrent/simultaneous user limit. Any exclusion of tests must be confirmed by COUNTER prior to testing and the auditor notified.
The auditor will perform audit tests based on the relevant option detailed below.
The account used for this testing must have the concurrent/simultaneous-user limit set at a single user, so that a second user attempting to access the database is denied.
Option 1: The content provider denies the user access when the concurrent/simultaneous-user limit is exceeded upon login.
The auditor must force 50 Limit_Exceeded access denials.
The auditor will log into the site causing the user limit to reach the maximum allowance. The auditor will then attempt to log into the site using a different computer.
The second login should be refused access. Each time access is refused, the auditor will record this as 1 Limit_Exceeded.
Each of these concurrent/simultaneous turnaways must report 1 Limit_Exceeded in the DR_D2 Standard View.
Option 2: The content provider denies the user access when the concurrent/simultaneous user limit is exceeded upon searching or accessing a database.
The auditor must force 50 Limit_Exceeded turnaways.
The auditor will log into the site. They will select and make a search on a database or browse to a database causing the user limit to reach the maximum allowance. The auditor will then log into the same site using a different computer. The auditor will then repeat the action made on the previous computer (select and make a search on a database or browse to a database).
The user should then be refused access as the concurrent/simultaneous-user limit has been exceeded. Each time access is refused, the auditor will record this as 1 Limit_Exceeded.
Each of these concurrent/simultaneous access denials must report 1 Limit_Exceeded in the DR_D2 Standard View.
Option 3: The content provider denies the user access when the concurrent/simultaneous-user limit is exceeded upon accessing an item within a database.
The auditor must force 50 Limit_Exceeded turnaways.
The auditor will log into the site and will navigate to and request an item. This will cause the user limit to reach the maximum allowance. The auditor will log into the site again using a different computer. The auditor will then repeat the action made on the previous computer (navigate to and request an item).
After the item has been requested the user should then be denied access. Each time access is refused, the auditor will record this as 1 Limit_Exceeded.
Each of these concurrent/simultaneous turnaways must report 1 Limit_Exceeded in the DR_D2 Standard View.
Audit Test D2-2: No_License
IMPORTANT NOTE: This test can only be carried out if the content provider restricts site content or if restricted content is not displayed. Any exclusion of tests must be confirmed by COUNTER prior to testing and the auditor notified.
The account used for this testing must have restricted access to content. The content the user has no license to access must be declared by the content provider prior to the testing. The auditor will attempt to access content using the account set up with restricted access. Each time access is refused, the auditor will record 1 No_License.
Each of these No_License turnaways must report 1 No_License in the DR_D2 Standard View.
E.2.3 Title Reports
E.2.3.1 Master Report: TR
The Title Master Report will be COUNTER-compliant if the following Standard Views pass the COUNTER R5 audit. The figures reported must match the figures reported in the Title Master Report.
Any Standard View not applicable to the content provider does not require auditing. Any exclusions must be agreed prior to the audit by COUNTER.
E.2.3.2 Standard View: TR_B1
Book Requests (excluding OA_Gold): Reports on full-text activity for non-Gold Open Access books as Total_Item_Requests and Unique_Title_Requests.
The Unique_Title_Requests provide comparable usage across book platforms. The Total_Item_Requests show overall activity.
An audit of this Standard View requires the following:
The auditor must have access to all book content made available by the content provider.
The Access_Type for all requests must be Controlled and not OA_Gold.
The auditor must allow at least 31 seconds between each request, unless otherwise specified.
Audit tests B1-1 and B1-2 must take place in separate accounts so that each audit test can be separately reported.
For each applicable audit test to achieve an audit pass, all metrics reported by the content provider in TR_B1 Standard View from the auditor’s test account must be within a -8% and +3% reliability window of the same audit metrics on the auditor’s report.
Audit Test B1-1: Total_Item_Requests and Unique_Title_Requests
The auditor must make a total of 100 requests on a subset of unique items within books.
Each book must have 5 items requested within it. This will report 5 Total_Item_Requests and 1 Unique_Title_Requests.
This must result in:
100 Total_Item_Requests being reported in the TR_B1 Standard View.
20 Unique_Title_Requests being reported in the TR_B1 Standard View.
Audit Test B1-2: Total_Item_Requests and Unique_Title_Requests - 30-second filters
The audit test consists of clicking links to an item within a book twice in succession (double-clicks). If the two clicks occur within a 30-second time-span, only the second Total_Item_Requests must be recorded. If the two clicks occur with more than 30 seconds between, then 2 Total_Item_Requests must be counted. In both cases only 1 Unique_Title_Requests will be reported.
The auditor must carry out a total of 32 tests, and each test will consist of 2 requests. There are 2 types of tests that must be carried out:
“Inside” tests (Two requests are made to the same Item, and the second request is made within 30 seconds of the first).
“Outside” tests (Two requests are made to the same item and the second request is made more than 30 seconds after the first).
The auditor must carry out 16 inside tests.
Where possible, each book must have 2 item tests reporting 1 Total_Item_Requests and 1 Unique_Title_Requests.
This must result in 16 Total_Item_Requests and 8 Unique_Title_Requests in the TR_B1 Standard View.
The auditor must carry out 16 outside tests.
Where possible, each book must have 2 items requested reporting 2 Total_Item_Requests and 1 Unique_Title_Requests.
This must result in 32 Total_Item_Requests and 8 Unique_Title_Requests in the TR_B1 Standard View.
E.2.3.3 Standard View: TR_B2
Book Access Denied: This Standard View reports on access-denied activity for books where a user is denied access because simultaneous-user licenses are exceeded or their institution does not have a license for the book.
An audit of this Standard View and related tests requires the following:
Each time a user is denied access, the auditor will record the book where the denial was produced.
The auditor must allow at least 31 seconds between each request unless otherwise specified.
Audit tests B2-1 and B2-2 must take place in separate accounts so that each audit test can be separately reported.
For each applicable audit test to achieve an audit pass, all metrics reported by the content provider in TR_B2 Standard View from the auditor’s test account must be within a -8% and +3% reliability window of the same audit metrics on the auditor’s report.
Audit Test B2-1: Limit_Exceeded
IMPORTANT NOTE: This test can only be carried out if the content provider has a concurrent/simultaneous user limit. Any exclusion of tests must be confirmed by COUNTER prior to testing and the auditor notified.
The auditor will log into the site and access a book item, causing the user limit to reach the maximum allowance. The auditor will then log into the site using a different computer and attempt to access a book item.
After the item has been requested the auditor should be refused access. Each time access is refused, the auditor will record this as 1 Limit_Exceeded.
The auditor must force 50 Limit_Exceeded turnaways during testing.
Each of these concurrent/simultaneous turnaways must report 1 Limit_Exceeded in the TR_B2 Standard View.
Audit Test B2-2: No_License
IMPORTANT NOTE: This test can only be carried out if the content provider restricts site content or if restricted content is not displayed. Any exclusion of tests must be confirmed by COUNTER prior to testing and the auditor notified. The account used for this testing must have restricted access to book content. The book content the user has no license to access must be declared by the content provider prior to the testing.
The auditor will attempt to access book content using the account specified with no access. Each time access is refused, the auditor will record 1 No_License.
The auditor must force 50 No_License turnaways during testing. Each of these turnaways must report 1 No_License in the TR_B2 Standard View.
E.2.3.4 Standard View: TR_B3
Book Usage by Access Type: Reports on book usage showing all applicable metric types broken down by Access_Type.
An audit of this Standard View and related tests requires the following:
The auditor must have access to all book content made available by the content provider. Any exclusions must be confirmed by COUNTER prior to testing and the auditor notified.
The auditor must allow at least 31 seconds between each request unless otherwise specified.
Audit tests B3-1, B3-2, B3-3, and B3-4 must take place in separate accounts so that each audit test can be separately reported.
The following metrics reported as a result of the B3-1 and B3-2 audit tests must match in the TR_B3 Standard View:
Unique_Item_Requests must match Unique_Item_Investigations
Unique_Title_Requests must match Unique_Title_Investigations
For each applicable audit test to achieve an audit pass, all metrics reported by the content provider in TR_B3 Standard View from the auditor’s test account must be within a -8% and +3% reliability window of the same audit metrics on the auditor’s report.
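The -8% and +3% reliability window above can be sketched as a simple pass/fail check. This is an illustrative reading (assuming the window is measured relative to the auditor's own count); the function name is hypothetical and not part of the Code of Practice:

```python
def within_reliability_window(auditor_count: int, provider_count: int) -> bool:
    """Return True if the provider's reported figure falls inside the
    -8% / +3% reliability window around the auditor's own count.

    Interpretation (an assumption, not mandated wording): the provider
    may under-report by at most 8% or over-report by at most 3%
    relative to the auditor's figure.
    """
    lower = auditor_count * 0.92   # -8%
    upper = auditor_count * 1.03   # +3%
    return lower <= provider_count <= upper

# Example: the auditor forced 100 Total_Item_Requests
print(within_reliability_window(100, 95))   # True  (within -8%)
print(within_reliability_window(100, 104))  # False (more than +3% over)
```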
Audit Test B3-1: Total_Item_Requests, Unique_Item_Requests and Unique_Title_Requests
The auditor will perform audit tests based on the relevant options detailed below.
Option 1: Content provider offers OA_Gold items in addition to Controlled.
The auditor must make a total of 100 requests on a subset of unique items within books. There must be 50 requests to book items where the Access_Type is Controlled and 50 requests to book items where the Access_Type is OA_Gold.
Each book must have 5 items within it requested. This must report 5 Total_Item_Requests, 5 Unique_Item_Requests and 1 Unique_Title_Requests.
This must result in:
50 OA_Gold Total_Item_Requests and 50 Controlled Total_Item_Requests being reported in the TR_B3 Standard View.
50 OA_Gold Unique_Item_Requests and 50 Controlled Unique_Item_Requests being reported in the TR_B3 Standard View.
10 OA_Gold Unique_Title_Requests and 10 Controlled Unique_Title_Requests being reported in the TR_B3 Standard View.
Option 2: Content provider does not offer OA_Gold items.
The auditor must make a total of 100 requests on a subset of unique items within books.
Where the site allows, each book must have 5 items requested, reporting 5 Total_Item_Requests, 5 Unique_Item_Requests, and 1 Unique_Title_Requests.
This must result in:
100 Controlled Total_Item_Requests being reported in the TR_B3 Standard View.
100 Controlled Unique_Item_Requests being reported in the TR_B3 Standard View.
20 Controlled Unique_Title_Requests being reported in the TR_B3 Standard View.
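The per-book arithmetic in these options (5 item requests per book yielding 5 Total_Item_Requests, 5 Unique_Item_Requests, and 1 Unique_Title_Requests) can be sketched as a rollup over a request log. The log format and function name are hypothetical, used only to illustrate how the three metrics relate:

```python
def expected_metrics(requests):
    """Roll a list of (book_id, item_id) requests up into the three
    request metrics used in TR_B3. Hypothetical log format; each tuple
    is one counted request after 30-second filtering."""
    total = len(requests)
    unique_items = len({(book, item) for book, item in requests})
    unique_titles = len({book for book, _ in requests})
    return total, unique_items, unique_titles

# 20 books x 5 distinct items each = 100 requests
log = [(f"book{b}", f"item{i}") for b in range(20) for i in range(5)]
print(expected_metrics(log))  # (100, 100, 20)
```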
Audit Test B3-2: Total_Item_Requests, Unique_Item_Requests and Unique_Title_Requests - 30-second filters
The auditor will perform audit tests based on the relevant options detailed below.
The audit test consists of clicking links to an item within a book twice in succession (double-clicks). If the two clicks occur within a 30-second time-span, only the second Total_Item_Requests must be recorded. If the two clicks occur with more than 30 seconds between, then 2 Total_Item_Requests must be counted. In both cases, only 1 Unique_Item_Requests and 1 Unique_Title_Requests will be reported.
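The double-click rule described above can be sketched as a filter over click timestamps. A minimal sketch, assuming timestamps in seconds and reading "within a 30-second time-span" as a gap of 30 seconds or less; when two clicks collapse, the second click's time is kept:

```python
def count_total_item_requests(click_times):
    """Apply the 30-second double-click filter to successive clicks on
    the same item (one user session). Two clicks 30 seconds or less
    apart collapse into one counted request, keeping the second click's
    time; wider gaps count separately. click_times must be sorted."""
    if not click_times:
        return 0
    counted = 1
    last_counted = click_times[0]
    for t in click_times[1:]:
        if t - last_counted <= 30:
            last_counted = t          # second click replaces the first
        else:
            counted += 1
            last_counted = t
    return counted

# "Inside" test: two clicks 10 seconds apart -> 1 Total_Item_Requests
print(count_total_item_requests([0, 10]))   # 1
# "Outside" test: two clicks 40 seconds apart -> 2 Total_Item_Requests
print(count_total_item_requests([0, 40]))   # 2
```

In both cases only 1 Unique_Item_Requests results, since all clicks are on the same item.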
Option 1: Content provider offers OA_Gold items in addition to Controlled items.
The auditor must carry out a total of 32 tests, and each test will consist of 2 requests. There are 2 types of tests that must be carried out:
“Inside” tests (Two requests are made to the same book item, and the second request is made within 30 seconds of the first).
“Outside” tests (Two requests are made to the same book item, and the second request is made over 30 seconds after the first).
The auditor must carry out 16 inside tests.
8 tests to book items where the Access_Type is Controlled and 8 tests to book items where the Access_Type is OA_Gold.
Where the site allows, each book must have 2 item tests. This will report 2 Total_Item_Requests, 2 Unique_Item_Requests, and 1 Unique_Title_Requests.
This must result in:
8 Controlled Total_Item_Requests and 8 OA_Gold Total_Item_Requests in the TR_B3 Standard View.
8 Controlled Unique_Item_Requests and 8 OA_Gold Unique_Item_Requests in the TR_B3 Standard View.
4 Controlled Unique_Title_Requests and 4 OA_Gold Unique_Title_Requests in the TR_B3 Standard View.
The auditor must carry out 16 outside tests.
8 tests to book items where the Access_Type is Controlled and 8 tests to book items where the Access_Type is OA_Gold.
Where the site allows, each book must have 2 item tests. This will report 4 Total_Item_Requests, 2 Unique_Item_Requests, and 1 Unique_Title_Requests.
This must result in:
16 Controlled Total_Item_Requests and 16 OA_Gold Total_Item_Requests in the TR_B3 Standard View.
8 Controlled Unique_Item_Requests and 8 OA_Gold Unique_Item_Requests in the TR_B3 Standard View.
4 Controlled Unique_Title_Requests and 4 OA_Gold Unique_Title_Requests in the TR_B3 Standard View.
Option 2: Content provider does not offer OA_Gold items.
The auditor must carry out a total of 32 tests, and each test will consist of 2 requests. There are 2 types of tests that must be carried out:
“Inside” tests (Two requests are made to the same book item and the second request is made within 30 seconds of the first).
“Outside” tests (Two requests are made to the same book item, and the second request is made over 30 seconds after the first).
The auditor must carry out 16 inside tests.
Where the site allows, each book must have 2 item tests. This will report 2 Total_Item_Requests and 2 Unique_Item_Requests and 1 Unique_Title_Requests.
This must result in:
16 Controlled Total_Item_Requests in the TR_B3 Standard View.
16 Controlled Unique_Item_Requests in the TR_B3 Standard View.
8 Controlled Unique_Title_Requests in the TR_B3 Standard View.
The auditor must carry out 16 outside tests.
Where the site allows, each book must have 2 item tests. This will report 4 Total_Item_Requests, 2 Unique_Item_Requests, and 1 Unique_Title_Requests.
This must result in:
32 Controlled Total_Item_Requests in the TR_B3 Standard View.
16 Controlled Unique_Item_Requests in the TR_B3 Standard View.
8 Controlled Unique_Title_Requests in the TR_B3 Standard View.
Audit Test B3-3: Total_Item_Investigations, Unique_Item_Investigations, and Unique_Title_Investigations
IMPORTANT NOTE: This test is required when investigations can be reported independently of a request. It is not required when all investigations have a matching request. Any exclusion of tests must be confirmed by COUNTER prior to testing and the auditor notified.
The auditor will perform audit tests based on the relevant options detailed below.
Option 1: Content provider offers OA_Gold items in addition to Controlled.
The auditor must make a total of 50 item investigations within a subset of books. There must be 25 Investigations of items within a book where the Access_Type is Controlled and 25 investigations of items within a book where the Access_Type is OA_Gold.
Each book must have 5 investigations of unique items. This will report 5 Total_Item_Investigations, 5 Unique_Item_Investigations, and 1 Unique_Title_Investigations.
This must result in:
25 OA_Gold Total_Item_Investigations and 25 Controlled Total_Item_Investigations being reported in the TR_B3 Standard View.
25 OA_Gold Unique_Item_Investigations and 25 Controlled Unique_Item_Investigations being reported in the TR_B3 Standard View.
5 OA_Gold Unique_Title_Investigations and 5 Controlled Unique_Title_Investigations being reported in the TR_B3 Standard View.
Option 2: Content provider does not offer OA_Gold items.
The auditor must make a total of 50 Investigations within a subset of books.
Each book must have 5 investigations of unique items. This will report 5 Total_Item_Investigations, 5 Unique_Item_Investigations, and 1 Unique_Title_Investigations.
This must result in:
50 Controlled Total_Item_Investigations being reported in the TR_B3 Standard View.
50 Controlled Unique_Item_Investigations being reported in the TR_B3 Standard View.
10 Controlled Unique_Title_Investigations being reported in the TR_B3 Standard View.
Audit Test B3-4: Total_Item_Investigations, Unique_Item_Investigations, and Unique_Title_Investigations - 30-second filters
IMPORTANT NOTE: This test is required when investigations can be reported independently of a request. It is not required when all investigations have a matching request. Any exclusion of tests must be confirmed by COUNTER prior to testing and the auditor notified.
The auditor will perform audit tests based on the relevant options detailed below.
The audit test consists of clicking links to an investigation of an item within a book twice in succession (double-clicks). If the two clicks occur within a 30-second time-span, only the second Total_Item_Investigations must be recorded. If the two clicks occur with more than 30 seconds between, then 2 Total_Item_Investigations must be counted. In both cases only 1 Unique_Item_Investigations and 1 Unique_Title_Investigations will be reported.
Option 1: Content provider offers OA_Gold items in addition to Controlled.
The auditor must carry out a total of 32 tests. Each test will consist of 2 item investigations. There are 2 types of tests that must be carried out:
“Inside” tests (Two investigations are made to the same book item, and the second investigation is made within 30 seconds of the first).
“Outside” tests (Two investigations are made to the same book item, and the second investigation is made more than 30 seconds after the first).
The auditor must carry out 16 inside tests. This requires 8 Investigations to book items where the Access_Type is Controlled and 8 investigations to book items where the Access_Type is OA_Gold.
Each book must have 2 item tests. This will report 2 Total_Item_Investigations, 2 Unique_Item_Investigations, and 1 Unique_Title_Investigations.
This must result in:
8 Controlled Total_Item_Investigations and 8 OA_Gold Total_Item_Investigations in the TR_B3 Standard View.
8 Controlled Unique_Item_Investigations and 8 OA_Gold Unique_Item_Investigations in the TR_B3 Standard View.
4 Controlled Unique_Title_Investigations and 4 OA_Gold Unique_Title_Investigations in the TR_B3 Standard View.
The auditor must carry out 16 outside tests. This requires 8 tests to book items where the Access_Type is Controlled and 8 tests to book items where the Access_Type is OA_Gold.
Each book must have 2 item tests. This will report 4 Total_Item_Investigations, 2 Unique_Item_Investigations, and 1 Unique_Title_Investigations.
This must result in:
16 Controlled Total_Item_Investigations and 16 OA_Gold Total_Item_Investigations in the TR_B3 Standard View.
8 Controlled Unique_Item_Investigations and 8 OA_Gold Unique_Item_Investigations in the TR_B3 Standard View.
4 Controlled Unique_Title_Investigations and 4 OA_Gold Unique_Title_Investigations in the TR_B3 Standard View.
Option 2: Content provider does not offer OA_Gold items.
The auditor must carry out a total of 32 tests. Each test will consist of 2 item investigations. There are 2 types of tests that must be carried out:
“Inside” tests (Two investigations are made to the same book item, and the second investigation is made within 30 seconds of the first).
“Outside” tests (Two investigations are made to the same book item, and the second investigation is made more than 30 seconds after the first).
The auditor must carry out 16 inside tests.
Each book must have 2 item tests. This will report 2 Total_Item_Investigations, 2 Unique_Item_Investigations, and 1 Unique_Title_Investigations.
This must result in:
16 Controlled Total_Item_Investigations in the TR_B3 Standard View.
16 Controlled Unique_Item_Investigations in the TR_B3 Standard View.
8 Controlled Unique_Title_Investigations in the TR_B3 Standard View.
The auditor must carry out 16 outside tests.
Each book must have 2 item tests. This will report 4 Total_Item_Investigations, 2 Unique_Item_Investigations, and 1 Unique_Title_Investigations.
This must result in:
32 Controlled Total_Item_Investigations in the TR_B3 Standard View.
16 Controlled Unique_Item_Investigations in the TR_B3 Standard View.
8 Controlled Unique_Title_Investigations in the TR_B3 Standard View.
E.2.3.5 Standard View: TR_J1
Journal Requests (excluding OA_Gold): Reports on usage of non-Gold Open Access journal content as Total_Item_Requests and Unique_Item_Requests.
An audit of this Standard View and related tests requires the following:
The auditor must have access to all journal content made available by the content provider. Any exclusions must be confirmed by COUNTER prior to testing and the auditor notified. The Access_Type for all requests must be Controlled and not OA_Gold.
The auditor must allow at least 31 seconds between each request unless otherwise specified.
Audit tests J1-1 and J1-2 must take place in separate accounts so that each audit test can be separately reported.
For each applicable audit test to achieve an audit pass, all metrics reported by the content provider in TR_J1 Standard View from the auditor’s test account must be within a -8% and +3% reliability window of the same audit metrics on the auditor’s report.
Audit Test J1-1: Total_Item_Requests and Unique_Item_Requests
The auditor must make a total of 100 requests on a subset of unique journal items.
This must result in:
100 Total_Item_Requests being reported in the TR_J1 Standard View.
100 Unique_Item_Requests being reported in the TR_J1 Standard View.
Audit Test J1-2: Total_Item_Requests and Unique_Item_Requests - 30-second filters
The audit test consists of clicking links to a journal item twice in succession (double-clicks). If the two clicks occur within a 30-second time-span, only the second Total_Item_Requests must be recorded. If the two clicks occur with more than 30 seconds between, then 2 Total_Item_Requests must be counted. In both cases only 1 Unique_Item_Requests will be reported.
The auditor must carry out a total of 30 tests. Each test will consist of 2 requests.
There are 2 types of tests that must be carried out:
“Inside” tests (Two requests are made to the same journal item, and the second request is made within 30 seconds of the first).
“Outside” tests (Two requests are made to the same journal item, and the second request is made over 30 seconds after the first).
The auditor must carry out 15 inside tests.
This must result in:
15 Total_Item_Requests and 15 Unique_Item_Requests in the TR_J1 Standard View.
The auditor must carry out 15 outside tests.
This must result in:
30 Total_Item_Requests and 15 Unique_Item_Requests in the TR_J1 Standard View.
E.2.3.6 Standard View: TR_J2
Journal Access Denied: This Standard View reports on access denied activity for journal content. A user is denied access because simultaneous-user licenses are exceeded, or their institution does not have a license for the journal.
An audit of this Standard View and related tests requires the following:
The auditor must allow at least 31 seconds between each request unless otherwise specified.
Each time a user is denied access, the auditor will record the journal on which the denial was produced.
Audit tests J2-1 and J2-2 must take place in separate accounts so that each audit test can be separately reported.
For each applicable audit test to achieve an audit pass, all metrics reported by the content provider in TR_J2 Standard View from the auditor’s test account must be within a -8% and +3% reliability window of the same audit metrics on the auditor’s report.
Audit Test J2-1: Limit_Exceeded
IMPORTANT NOTE: This test can only be carried out where the content provider offers a concurrent/simultaneous-user limit. Any exclusion of tests must be confirmed by COUNTER prior to testing and the auditor notified.
The account used for this testing must have a concurrent/simultaneous-user limit set for journal items. The account should allow a single active user to access journals. A second user accessing journals will be turned away. The number of registered users concurrently allowed must be declared by the content provider prior to the testing.
The content provider denies the user access when the concurrent/simultaneous-user limit is exceeded for journals.
The auditor will log into the site and access a journal item. The user limit is now at its maximum of active users.
The auditor will then log into the site using a different computer and repeat the action of accessing a journal.
The user should be refused access.
Each time access is refused, the auditor will record this as 1 Limit_Exceeded.
The auditor must force 50 Limit_Exceeded turnaways during testing.
Each of these concurrent/simultaneous access denials must report 1 Limit_Exceeded in the TR_J2 Standard View.
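The behaviour this test exercises (a concurrent-user limit of one, with every refused access counted as 1 Limit_Exceeded) can be modelled as follows. This is a toy illustration, not a platform implementation; real platforms enforce the limit server-side, and all names are hypothetical:

```python
class SessionLimiter:
    """Toy model of a concurrent/simultaneous-user limit. Each refused
    access is recorded as 1 Limit_Exceeded, as in audit test J2-1."""
    def __init__(self, max_concurrent: int):
        self.max_concurrent = max_concurrent
        self.active = set()
        self.limit_exceeded = 0

    def access(self, user: str) -> bool:
        if user in self.active:
            return True                # already in an active session
        if len(self.active) >= self.max_concurrent:
            self.limit_exceeded += 1   # turnaway -> 1 Limit_Exceeded
            return False
        self.active.add(user)
        return True

    def logout(self, user: str):
        self.active.discard(user)

limiter = SessionLimiter(max_concurrent=1)
limiter.access("auditor-pc-1")        # first machine admitted
for _ in range(50):
    limiter.access("auditor-pc-2")    # second machine refused each time
print(limiter.limit_exceeded)         # 50
```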
Audit Test J2-2: No_License
IMPORTANT NOTE: This test can only be carried out if the content provider restricts site content or where restricted content is not displayed. Any exclusion of tests must be confirmed by COUNTER prior to testing and the auditor notified.
The account used for this testing must have restricted access to journal content. The content provider must declare the content the user does not have a license to access.
The auditor will attempt to access the restricted journal content. Each time access is refused, the auditor will record 1 No_License.
The auditor must force 50 No_License turnaways during testing.
Each of these journal content not licensed denials must report 1 No_License in the TR_J2 Standard View.
E.2.3.7 Standard View: TR_J3
Journal Usage by Access Type: This Standard View reports on usage of journal content for all metric types broken down by access type.
An audit of this Standard View and related tests requires the following:
The auditor must have access to all journal content made available by the content provider. Any exclusions must be confirmed by COUNTER prior to testing and the auditor notified.
The auditor must allow at least 31 seconds between each request unless otherwise specified.
Audit tests J3-1, J3-2, J3-3, and J3-4 must take place in separate accounts so that each audit test can be separately reported.
For each applicable audit test to achieve an audit pass, all metrics reported by the content provider in TR_J3 Standard View from the auditor’s test account must be within a -8% and +3% reliability window of the same audit metrics on the auditor’s report.
Audit Test J3-1: Total_Item_Requests and Unique_Item_Requests
The auditor will perform audit tests based on the relevant options detailed below.
Option 1: Content provider offers OA_Gold items in addition to Controlled.
The auditor must make a total of 100 requests on a subset of unique journal items. 50 requests to journal items where the Access_Type is Controlled and 50 requests to journal items where the Access_Type is OA_Gold.
This must result in:
50 OA_Gold Total_Item_Requests and 50 Controlled Total_Item_Requests being reported in the TR_J3 Standard View.
50 OA_Gold Unique_Item_Requests and 50 Controlled Unique_Item_Requests being reported in the TR_J3 Standard View.
Option 2: Content provider does not offer OA_Gold items.
The auditor must make a total of 100 requests on a subset of unique journal items.
This must result in:
100 Controlled Total_Item_Requests being reported in the TR_J3 Standard View.
100 Controlled Unique_Item_Requests being reported in the TR_J3 Standard View.
Audit Test J3-2: Total_Item_Requests and Unique_Item_Requests - 30-second filters
The auditor will perform audit tests based on the relevant options detailed below.
The audit test consists of clicking links to a journal item twice in succession (double-clicks). If the two clicks occur within a 30-second time-span, only the second Total_Item_Requests must be recorded. If the two clicks occur with more than 30 seconds between them, then 2 Total_Item_Requests must be counted. In both cases only 1 Unique_Item_Requests will be reported.
Option 1: Content provider offers OA_Gold items in addition to Controlled.
The auditor must carry out a total of 30 tests, and each test will consist of 2 requests. There are 2 types of tests that must be carried out:
“Inside” tests (Two requests are made to the same journal item, and the second request is made within 30 seconds of the first).
“Outside” tests (Two requests are made to the same journal item and the second request is made over 30 seconds after the first).
The auditor must carry out 15 inside tests.
8 tests to journal items where the Access_Type is Controlled and 7 tests to journal items where the Access_Type is OA_Gold.
This must result in:
8 Controlled Total_Item_Requests and 7 OA_Gold Total_Item_Requests in the TR_J3 Standard View.
8 Controlled Unique_Item_Requests and 7 OA_Gold Unique_Item_Requests in the TR_J3 Standard View.
The auditor must carry out 15 outside tests.
8 tests to journal items where the Access_Type is Controlled and 7 tests to journal items where the Access_Type is OA_Gold.
This must result in:
16 Controlled Total_Item_Requests and 14 OA_Gold Total_Item_Requests in the TR_J3 Standard View.
8 Controlled Unique_Item_Requests and 7 OA_Gold Unique_Item_Requests in the TR_J3 Standard View.
Option 2: Content provider does not offer OA_Gold items.
The auditor must carry out a total of 30 tests, and each test will consist of 2 requests. There are 2 types of tests that must be carried out:
“Inside” tests (Two requests are made to the same journal item, and the second request is made within 30 seconds of the first).
“Outside” tests (Two requests are made to the same journal item, and the second request is made more than 30 seconds after the first).
The auditor must carry out 15 inside tests.
This must result in:
15 Controlled Total_Item_Requests in the TR_J3 Standard View.
15 Controlled Unique_Item_Requests in the TR_J3 Standard View.
The auditor must carry out 15 outside tests.
This must result in:
30 Controlled Total_Item_Requests in the TR_J3 Standard View.
15 Controlled Unique_Item_Requests in the TR_J3 Standard View.
Audit Test J3-3: Total_Item_Investigations and Unique_Item_Investigations
The auditor will perform audit tests based on the relevant options detailed below.
Option 1: Content provider offers OA_Gold items in addition to Controlled.
The auditor must make a total of 50 investigations to a subset of unique journal items. Where the site allows, there must be 25 Investigations of journal items where the Access_Type is Controlled and 25 Investigations of journal items where the Access_Type is OA_Gold.
This must result in:
25 OA_Gold Total_Item_Investigations and 25 Controlled Total_Item_Investigations being reported in the TR_J3 Standard View.
25 OA_Gold Unique_Item_Investigations and 25 Controlled Unique_Item_Investigations being reported in the TR_J3 Standard View.
Option 2: Content provider does not offer OA_Gold items.
The auditor must make a total of 50 investigations to a subset of unique journal items.
This must result in:
50 Controlled Total_Item_Investigations being reported in the TR_J3 Standard View.
50 Controlled Unique_Item_Investigations being reported in the TR_J3 Standard View.
Audit Test J3-4: Total_Item_Investigations and Unique_Item_Investigations - 30-second filters
The audit test consists of clicking links to an investigation of a journal item twice in succession (double-clicks). If the two clicks occur within a 30-second time-span, only the second Total_Item_Investigations must be recorded. If the two clicks occur with more than 30 seconds between them, then 2 Total_Item_Investigations must be counted. In both cases only 1 Unique_Item_Investigations will be reported.
Option 1: Content provider offers OA_Gold items in addition to Controlled.
The auditor must carry out a total of 30 tests. Each test will consist of 2 Investigations. There are 2 types of tests that must be carried out:
“Inside” tests (Two investigations are made to the same journal item, and the second investigation is made within 30 seconds of the first).
“Outside” tests (Two investigations are made to the same journal item, and the second investigation is made more than 30 seconds after the first).
The auditor must carry out 15 inside tests.
8 tests to journal items where the Access_Type is Controlled and 7 tests to journal items where the Access_Type is OA_Gold.
This must result in:
8 Controlled Total_Item_Investigations and 7 OA_Gold Total_Item_Investigations in the TR_J3 Standard View.
8 Controlled Unique_Item_Investigations and 7 OA_Gold Unique_Item_Investigations in the TR_J3 Standard View.
The auditor must carry out 15 outside tests.
8 tests to journal items where the Access_Type is Controlled and 7 tests to journal items where the Access_Type is OA_Gold.
This must result in:
16 Controlled Total_Item_Investigations and 14 OA_Gold Total_Item_Investigations in the TR_J3 Standard View.
8 Controlled Unique_Item_Investigations and 7 OA_Gold Unique_Item_Investigations in the TR_J3 Standard View.
Option 2: Content provider does not offer OA_Gold items.
The auditor must carry out a total of 30 tests. Each test will consist of 2 Investigations.
There are 2 types of tests that must be carried out:
“Inside” tests (Two investigations are made to the same journal item, and the second investigation is made within 30 seconds of the first).
“Outside” tests (Two investigations are made to the same journal item, and the second investigation is made more than 30 seconds after the first).
The auditor must carry out 15 inside tests.
This must result in:
15 Controlled Total_Item_Investigations in the TR_J3 Standard View.
15 Controlled Unique_Item_Investigations in the TR_J3 Standard View.
The auditor must carry out 15 outside tests.
This must result in:
30 Controlled Total_Item_Investigations in the TR_J3 Standard View.
15 Controlled Unique_Item_Investigations in the TR_J3 Standard View.
E.2.3.8 Standard View: TR_J4
Journal Requests by YOP (excluding OA_Gold): Breaks down the usage of non-Gold Open Access journal content by year of publication (YOP) providing counts for the metric types Total_Item_Requests and Unique_Item_Requests. Note: COUNTER reports do not provide access model or perpetual access rights details.
An audit of this Standard View requires the following:
The auditor must have access to all journal content made available by the content provider. Any exclusions must be confirmed by COUNTER prior to testing and the auditor notified.
The Access_Type for all requests must be Controlled and not OA_Gold.
The auditor must record the Year of Publication (YOP) of every item accessed during audit testing.
The auditor must confirm the Year of Publication (YOP) of articles covered in J4-1 with appropriate and proportionate spot checks; where an article's YOP is unknown, the auditor must check that it is reported as YOP 0001.
Audit tests J4-1 and J4-2 must take place in separate accounts so that each audit test can be separately reported.
The auditor must ensure that some full-text articles from different years of the same journal are requested during the J4-1 and J4-2 tests. The auditor should know the numbers expected to appear against each Year of Publication (YOP) in the TR_J4 report.
The auditor must allow at least 31 seconds between each request unless otherwise specified.
For each applicable audit test to achieve an audit pass, all metrics reported by the content provider in TR_J4 Standard View from the auditor’s test account must be within a -8% and +3% reliability window of the same audit metrics on the auditor’s report.
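To compare against TR_J4, the auditor's recorded requests can be grouped by Year of Publication, with unknown years reported under YOP 0001 as required above. A minimal sketch; the log format and function name are hypothetical:

```python
from collections import Counter

def expected_yop_breakdown(accesses):
    """Group the auditor's recorded requests by Year of Publication for
    comparison against the TR_J4 Standard View. Items whose YOP is
    unknown must appear under "0001". Hypothetical log format: a list
    of YOP strings, with None for unknown."""
    return Counter(yop if yop is not None else "0001" for yop in accesses)

log = ["2019", "2019", "2021", None, "2021", "2021"]
print(expected_yop_breakdown(log))
# 2021 -> 3, 2019 -> 2, unknown -> 1 (reported under "0001")
```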
Audit Test J4-1: Total_Item_Requests and Unique_Item_Requests
The auditor must make a total of 100 requests on a subset of unique journal items.
This must result in:
100 Total_Item_Requests being reported in the TR_J4 Standard View.
100 Unique_Item_Requests being reported in the TR_J4 Standard View.
Audit Test J4-2: Total_Item_Requests and Unique_Item_Requests - 30-second filters
The audit test consists of clicking links to a journal item twice in succession (double-clicks). If the two clicks occur within a 30-second time-span, only the second Total_Item_Requests must be recorded. If the two clicks occur with more than 30 seconds between, then 2 Total_Item_Requests must be counted. In both cases only 1 Unique_Item_Requests will be reported.
The auditor must carry out a total of 30 tests. Each test will consist of 2 requests. There are 2 types of tests that must be carried out:
“Inside” tests (Two item requests are made to the same journal item and the second request is made within 30 seconds of the first).
“Outside” tests (Two item requests are made to the same journal item, and the second request is made more than 30 seconds after the first).
The auditor must carry out 15 inside tests.
This must result in:
15 Total_Item_Requests and 15 Unique_Item_Requests in the TR_J4 Standard View.
The auditor must carry out 15 outside tests.
This must result in:
30 Total_Item_Requests and 15 Unique_Item_Requests in the TR_J4 Standard View.
E.2.4 Item Reports
E.2.4.1 Master Report: IR
The Item Master Report will be COUNTER compliant if the following Standard Views pass the COUNTER R5 audit. The figures reported in the Standard Views must match the figures reported in the Item Master Report.
Any Standard View not applicable to the content provider does not require auditing. Any exclusions must be agreed prior to the audit by COUNTER.
E.2.4.2 Standard View: IR_A1
This Standard View reports on journal article requests at the article level. This report is limited to content with a Parent_Data_Type of Journal, Data_Type of Article, and metric types of Total_Item_Requests and Unique_Item_Requests.
An audit of this Standard View requires the following:
The auditor must have access to all journal article content made available by the content provider. Any exclusions must be confirmed by COUNTER prior to testing and the auditor notified.
The auditor must allow at least 31 seconds between each request, unless otherwise specified.
Audit tests A1-1 and A1-2 must take place in separate accounts so that each audit test can be separately reported.
An audit pass requires all metrics reported by the content provider in IR_A1 Standard View for the auditor’s test account to be within a -8% and +3% reliability window of the same metrics on the auditor’s report.
Audit Test A1-1: Total_Item_Requests and Unique_Item_Requests
The auditor must make a total of 100 requests on a subset of journal article items.
This must result in:
100 Total_Item_Requests being reported in the IR_A1 Standard View.
100 Unique_Item_Requests being reported in the IR_A1 Standard View.
Audit Test A1-2: Total_Item_Requests and Unique_Item_Requests - 30-second filters
The audit test consists of clicking links to a journal article item twice in succession (double-clicks). If the two clicks occur within a 30-second time-span, only the second Total_Item_Requests must be recorded. If the two clicks occur with more than 30 seconds between them, then 2 Total_Item_Requests must be counted. In both cases, only 1 Unique_Item_Requests will be reported.
The auditor must carry out a total of 30 tests, and each test will consist of 2 requests. There are 2 types of tests that must be carried out:
“Inside” tests (Two requests are made to the same journal article item, and the second request is made within 30 seconds of the first).
“Outside” tests (Two requests are made to the same journal article item, and the second request is made more than 30 seconds after the first).
The auditor must carry out 15 inside tests.
This must result in:
15 Total_Item_Requests and 15 Unique_Item_Requests in the IR_A1 Standard View.
The auditor must carry out 15 outside tests.
This must result in:
30 Total_Item_Requests and 15 Unique_Item_Requests in the IR_A1 Standard View.
E.2.4.3 Standard View: IR_M1
Multimedia Item Requests: Reports on multimedia requests at the item level.
An audit of this Standard View requires the following:
The auditor must have access to all multimedia content made available by the content provider.
The auditor must allow at least 31 seconds between each request, unless otherwise specified.
Audit tests M1-1 and M1-2 must take place in separate accounts so that each audit test can be separately reported.
For each applicable audit test to achieve an audit pass, all metrics reported by the content provider in IR_M1 Standard View from the auditor’s test account must be within a -8% and +3% reliability window of the same audit metrics on the auditor’s report.
Audit Test M1-1: Total_Item_Requests
The auditor must make a total of 100 requests on a subset of multimedia items.
This must result in:
100 Total_Item_Requests being reported in the IR_M1 Standard View.
Audit Test M1-2: Total_Item_Requests - 30-second filters
The audit test consists of clicking links to a multimedia item twice in succession (double-clicks). If the two clicks occur within a 30-second time-span, only the second Total_Item_Requests must be recorded. If the two clicks occur with more than 30 seconds between, then 2 Total_Item_Requests must be counted.
The auditor must carry out a total of 30 tests, and each test will consist of 2 requests. There are 2 types of tests that must be carried out:
“Inside” tests (Two requests are made to the same multimedia item and the second request is made within 30 seconds of the first).
“Outside” tests (Two requests are made to the same multimedia item, and the second request is made more than 30 seconds after the first).
The auditor must carry out 15 inside tests.
This must result in:
15 Total_Item_Requests in the IR_M1 Standard View.
The auditor must carry out 15 outside tests.
This must result in:
30 Total_Item_Requests in the IR_M1 Standard View.
E.3 Audit Compliance
E.3.1 Data Integrity
During the audit testing, the auditor’s activities must be isolated from other activities on the content provider’s site. Depending on the site being tested, the auditor must conduct the audit test from a computer with a unique IP address and/or using a unique account number.
The auditor must accept user/machine and session cookies when prompted.
To ensure usage is counted correctly as per the COUNTER R5 Code of Practice, it is important that browser caching is disabled on the machines used for testing. It is also important that the auditee confirms, before the audit period, whether they operate a cache server.
E.3.2 Report Delivery
The auditor will check the following:
The delivery of reports in tabular format.
The reports are downloadable using the SUSHI protocol.
This testing of report delivery and formats may be undertaken using the COUNTER Report Validation Tool (see Section 9.2 of the COUNTER R5 Code of Practice).
The JSON-formatted reports produced by SUSHI must match the total usage counted on the equivalent tabular report. A report should produce the same results irrespective of the delivery format.
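One way to check this equivalence is to sum a metric across the JSON report's Report_Items and compare the result with the grand total of the tabular report. The traversal below follows the Report_Items / Performance / Instance structure of COUNTER R5 JSON reports; the sample data and the tabular total are invented for illustration.

```python
def sum_metric(report_json, metric_type="Total_Item_Requests"):
    """Sum one Metric_Type across all Report_Items of a COUNTER R5
    JSON report (Report_Items -> Performance -> Instance)."""
    total = 0
    for item in report_json.get("Report_Items", []):
        for performance in item.get("Performance", []):
            for instance in performance.get("Instance", []):
                if instance["Metric_Type"] == metric_type:
                    total += instance["Count"]
    return total

# Minimal invented sample mimicking a TR_J1 JSON report.
sample = {
    "Report_Items": [
        {"Title": "Journal A", "Performance": [
            {"Instance": [{"Metric_Type": "Total_Item_Requests", "Count": 12}]}]},
        {"Title": "Journal B", "Performance": [
            {"Instance": [{"Metric_Type": "Total_Item_Requests", "Count": 30}]}]},
    ]
}
tabular_total = 42  # grand total read from the equivalent TSV/Excel report
assert sum_metric(sample) == tabular_total
```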
E.3.3 Report Format
The auditor will confirm each of the audit reports complies with the COUNTER R5 Code of Practice.
The following items will be checked:
The layout of the report (headers, number of fields, field sequence, totals field, and format of reported numbers)
The conformity of identifiers to the required standard (e.g. ISSNs must be provided as nine characters: eight digits with a hyphen as the middle character)
The presence of all required file formats (a Microsoft Excel file, a tab-separated-value (TSV) file, or both; additional file formats that can be easily imported into spreadsheet programs without loss or corruption may be offered at the vendor’s discretion)
That email alerts announcing updated usage reports are sent in a timely manner
Flexibility in the reporting period so customers can specify the start and end months of data reported in the COUNTER reports
That COUNTER reports are available in JSON format in accordance with the COUNTER_SUSHI API Specification (the specification is maintained by COUNTER on SwaggerHub, see Section 8 of the COUNTER R5 Code of Practice for details)
That all required COUNTER reports are available via the COUNTER_SUSHI API
That the JSON-formatted reports produced via SUSHI match the total of the relevant usage counted on the equivalent TSV/Excel report offered by the content provider, i.e. a report should produce the same results irrespective of the format in which it is delivered
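The identifier check on ISSNs, for instance, can be expressed as a simple pattern test. This is a sketch of the format check only; a full audit would also cover other identifier types, and the function name is invented.

```python
import re

# An ISSN is nine characters: four digits, a hyphen, then three digits and
# a final digit or the check character "X", e.g. "0028-0836".
ISSN_PATTERN = re.compile(r"\d{4}-\d{3}[\dX]")

def has_valid_issn_format(issn):
    """True if the string is formatted as an ISSN per the COUNTER check."""
    return bool(ISSN_PATTERN.fullmatch(issn))
```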
E.4 Audit Conclusion
If the auditor identifies one or more issues, the content provider MUST resolve them and pass the audit within 3 months to maintain COUNTER-compliant status. Please see Section 9.3 in the COUNTER R5 Code of Practice.
The auditor will provide to the COUNTER Project Director a summary report including, as a minimum, the following information:
The name of the content provider
The audit period and date
The usage report(s) tested
For each usage report tested, the test results, expressed as a percentage of the reported figures over the expected figures
A summary of any material issues noted with the format/structure, data integrity, and/or delivery of the content provider’s reports. If there are no issues, a PASS should be noted.
A clear indication of the outcome of the audit: PASS, QUALIFIED PASS, or FAIL.
Any other comments that relate to the audit and are worthy of consideration by the COUNTER Executive Committee.
Sample Audit Report:
| Content Provider | &lt;name&gt; |
|---|---|
| Audit Period | &lt;mmm/yyyy&gt; |
| Date | &lt;mmm/yyyy&gt; |

| Report | Usage Activity Result | Report Format (Tabular) | Report Format (SUSHI/JSON) | Data Integrity | Delivery (Reports Interface) | Delivery (SUSHI Server) | Opinion | Comments |
|---|---|---|---|---|---|---|---|---|
| TR_J1 | 100% | PASS | PASS | PASS | PASS | PASS | PASS | |
| TR_B1 | 112% | PASS | REPORT TOTALS included | PASS | PASS | PASS | FAIL | SUSHI versions of reports must not have totals. |
A content provider may need to submit multiple audit reports, some of which may PASS and some of which may FAIL. The results of each report’s tests should be submitted on a separate line. For a content provider to maintain COUNTER-compliant status, each audited report must PASS.
Appendix F: Handling Errors and Exceptions
Note: The main Code of Practice document takes precedence in the case of any conflicts between it and this appendix.
Exceptions are used both for reporting errors that occur while responding to a COUNTER_SUSHI API call and, when generating a report, for indicating that the report differs from what might be expected. While the COUNTER_SUSHI API Specification (see Section 8) defines the API methods and the JSON response formats, including the format for Exceptions (SUSHI_error_model), this appendix defines the permissible Exceptions, that is the Exception Codes, the corresponding Exception Messages and HTTP status codes, and how these Exceptions are expected to be used. Some of the Exceptions also can occur when generating tabular reports at an administrative/reporting site.
There are four types of errors that can occur while responding to COUNTER_SUSHI API calls:
The base URL, for example the release, or the method is wrong, resulting in an invalid path. In this case the SUSHI server MUST respond with HTTP status code 404. The Exceptions 3000 and 3010 used in Release 4 for indicating that the report or report version isn’t supported still exist, but they are deprecated and will be removed in the next major release.
While processing a COUNTER_SUSHI API call an error occurs that usually prohibits generating the requested report, report list, consortium member list or server status. The SUSHI server MUST respond with the appropriate non-200 HTTP status code and a single Exception in JSON format (see below).
The SUSHI server detects errors in a report request that can be ignored and processing can continue. The SUSHI server SHOULD continue processing the request and return HTTP status code 200 and the report in JSON format with the appropriate Exceptions in the report header.
The report differs from what might be expected, for example the report is empty because there was no usage. In this case the report in JSON format MUST be returned with the appropriate Exceptions in the report header.
When requesting a tabular report at an administrative/reporting site, only the last type of error should occur and be included in a report. The website is expected to gracefully handle other errors that might occur while generating the report.
While only a single Exception can be returned with a non-200 HTTP status code, the Exceptions element in the report header makes it possible to return multiple Exceptions with HTTP status code 200, both in JSON and tabular reports. If the SUSHI server detects multiple errors, including some that would require a non-200 HTTP status code, it MUST return only a single Exception with a non-200 HTTP status code, preferably the one with the lowest Exception Code.
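That selection rule can be sketched as follows. The record layout (in particular the http_status field) is illustrative and not part of the SUSHI_error_model.

```python
def build_response(detected):
    """Decide what to return given the Exceptions detected while servicing
    a request. Each entry is a dict with 'Code' plus an illustrative
    'http_status' field. Per Appendix F, at most one Exception may
    accompany a non-200 status, preferably the lowest-numbered one."""
    fatal = [e for e in detected if e["http_status"] != 200]
    if fatal:
        chosen = min(fatal, key=lambda e: e["Code"])
        return chosen["http_status"], [chosen]
    # No fatal error: return 200 with all Exceptions in the report header.
    return 200, detected

status, exceptions = build_response([
    {"Code": 3030, "http_status": 200},
    {"Code": 1030, "http_status": 400},
    {"Code": 2000, "http_status": 401},
])
# Exception 1030 wins: it is fatal and has the lowest Code.
```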
The COUNTER_SUSHI API Specification defines the JSON format for Exceptions as follows:
```json
"SUSHI_error_model": {
  "type": "object",
  "description": "Generalized format for presenting errors and warnings.",
  "required": [ "Code", "Severity", "Message" ],
  "properties": {
    "Code": {
      "type": "integer",
      "format": "int32",
      "description": "Exception Code. See Table F.1 in the Code of Practice, Appendix F.",
      "example": 3031
    },
    "Severity": {
      "type": "string",
      "description": "Severity of the Exception (deprecated).",
      "example": "Warning",
      "enum": [ "Warning", "Error", "Fatal", "Debug", "Info" ]
    },
    "Message": {
      "type": "string",
      "description": "Exception Message. See Table F.1 in the Code of Practice, Appendix F.",
      "example": "Usage Not Ready for Requested Dates"
    },
    "Help_URL": {
      "type": "string",
      "description": "URL to a help page that explains the Exception in more detail."
    },
    "Data": {
      "type": "string",
      "description": "Additional data provided by the server to clarify the Exception.",
      "example": "Request was for 2016-01-01 to 2016-12-31; however, usage is only available to 2016-08-31."
    }
  }
}
```
For tabular reports the format for the Exceptions header is defined in Section 3.2.1 of the Code of Practice, Table 3.f, as "{Exception Code}: {Exception Message} ({Data})" with multiple Exceptions separated by semicolon-space ("; ").
As indicated in the code above, Exceptions in JSON format have the following elements:
Code: The Code is a number that identifies the Exception. See Table F.1 below for permissible values.
Severity: In Release 4 the Severity element was used to indicate the severity of the Exception (Fatal, Error, Warning, Info or Debug). The RESTful COUNTER_SUSHI API in Release 5 instead uses HTTP status codes to indicate if the response is a (fatal) error (non-200 HTTP status code) or not (HTTP status code 200). The Severity element therefore is deprecated and will be removed in the next major release. SUSHI clients should stop relying on Severity and use the HTTP status code and Exception Code instead.
Message: The Message element contains a textual description of the Exception. For standard Exceptions with Codes > 999 the Message MUST exactly match the Message in Table F.1 below.
Data: The Data element contains additional information that further describes the Exception. For some Exceptions this additional information MUST be provided (as indicated in Table F.1 below), for other Exceptions it is optional.
Help_URL: An optional element that contains a URL to a help page that explains the Exception in more detail.
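Rendering the tabular Exceptions header from these elements is mechanical. The sketch below assumes each Exception is represented as a dict keyed by the element names listed above.

```python
def format_exceptions_header(exceptions):
    """Render Exceptions for the tabular report header as
    '{Exception Code}: {Exception Message} ({Data})', joined by '; '."""
    parts = []
    for e in exceptions:
        text = f"{e['Code']}: {e['Message']}"
        if e.get("Data"):
            text += f" ({e['Data']})"
        parts.append(text)
    return "; ".join(parts)

header = format_exceptions_header([
    {"Code": 3031, "Message": "Usage Not Ready for Requested Dates",
     "Data": "usage is only available to 2016-08-31"},
    {"Code": 3040, "Message": "Partial Data Returned"},
])
```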
Table F.1 lists all Exceptions permissible for the COUNTER_SUSHI API. Note that the standard Exceptions with Code > 999 MUST be used for the indicated invocation conditions; it is not permitted to use custom Exceptions with Code <= 999 in their place, nor to define additional Exceptions with Code > 999.
Table F.1 (below): Exceptions
| Exception Message | Severity | Exception Code | HTTP Status Code | Invocation Conditions |
|---|---|---|---|---|
| {Info or Debug Message} | Info | 0 | 200 | Any. These Messages will never be standardized and service providers can design them as they see fit. |
| {Warning Message} | Warning | 1-999 | 200 | Any. This range is reserved for the use of service providers to supply their own custom warnings. |
| Service Not Available | Fatal | 1000 | 503 | The service is executing a request, but due to internal errors cannot complete the request. If possible, the server should provide an explanation in the additional Data element. |
| Service Busy | Fatal | 1010 | 503 | The service is too busy to execute the incoming request. The client should retry the request after some reasonable time. |
| Report Queued for Processing | Warning | 1011 | 202 | Services queueing incoming report requests must return a response with this Exception and no payload to inform the client about the processing status. The client should retry the request after some reasonable time. Note: This Exception was included in the amendments published on 11 December 2018 but initially was missing from Release 5.0.1. |
| Client has made too many requests | Fatal | 1020 | 429 | If the service sets a limit on the number of requests a client can make within a given timeframe, the server will return this Exception when the client exceeds that limit. The server would provide an explanation of the limit in the additional Data element (e.g. "Client has made too many requests. This server allows only 5 requests per day per requestor_id and customer_id."). |
| Insufficient Information to Process Request | Fatal | 1030 | 400 | There is insufficient data in the request to begin processing (e.g. missing requestor_id, no customer_id, etc.). |
| Requestor Not Authorized to Access Service | Error | 2000 | 401 | If requestor_id is not recognized or not authorized by the service. |
| Requestor is Not Authorized to Access Usage for Institution | Error | 2010 | 403 | If requestor_id has not been authorized to harvest usage for the institution identified by the customer_id, or if the customer_id is not recognized. |
| APIKey Invalid | Error | 2020 | 401 | The service requires a valid APIKey to access usage data and the key provided was not valid or not authorized for the data being requested. |
| IP Address Not Authorized to Access Service | Error | 2030 | 401 | The service requires IP authorization, and the IP address used by the client is not authorized. The server MUST include information on how this issue can be resolved in the Data element or include a Help_URL that points to the information. |
| Report Not Supported | Error | 3000 | 404 | The requested report name, or other means of identifying a report that the service can process, is not matched against the supported reports. In Release 5 the requested report is part of the URL path, and for RESTful APIs the HTTP status code 404 is used to signal that a path isn't supported. Therefore this Exception is deprecated and will be removed in the next major release. SUSHI clients should stop relying on this Exception and use the HTTP status code instead. |
| Report Version Not Supported | Error | 3010 | 404 | The requested version of the report is not supported by the service. In Release 5 the requested report version is part of the URL path, and for RESTful APIs the HTTP status code 404 is used to signal that a path isn't supported. Therefore this Exception is deprecated and will be removed in the next major release. SUSHI clients should stop relying on this Exception and use the HTTP status code instead. |
| Invalid Date Arguments | Error | 3020 | 400 | Any format or logic errors involving date computations (e.g., end_date cannot be less than begin_date). |
| No Usage Available for Requested Dates | Error | 3030 | 200 | The service did not find any data for the specified date range and other filters (if any). Note: If the usage for a requested month either hasn't been processed yet or is no longer available, only Exception 3031 or 3032 must be returned for that month. |
| Usage Not Ready for Requested Dates | Error, Warning | 3031 | 200 | The service has not yet processed the usage for one or more of the requested months; if some months are available, that data should be returned. The Exception should include the months not processed in the additional Data element. Note: If the requested begin_date is the current or a future month, the server should return Exception 3020. If the requested end_date is the current or a future month, the server may continue processing the request and include Exception 3031; the End_Date Report_Filter then should be set to the previous month (the last month that could have been processed). |
| Usage No Longer Available for Requested Dates | Warning | 3032 | 200 | The service does not have the usage for one or more of the requested months because the requested begin_date is earlier than the available data. If some months are available, that data should be returned. The Exception should include the months not processed in the additional Data element. Note: This Exception was included in the amendments published on 11 December 2018 but initially was missing from Release 5.0.1. |
| Partial Data Returned | Warning | 3040 | 200 | The request could not be fulfilled in its entirety, since some of the requested data is missing. The server should return the available data and provide an explanation in the additional Data element. Note: This Exception is not intended for the conditions already covered by Exceptions 3030, 3031 and 3032. A use case for this Exception, for example, would be that usage data is missing because the logging has failed. Usually this Exception indicates a permanent error. |
| Parameter Not Recognized in this Context | Warning | 3050 | 200 | The request contained one or more parameters that are not recognized by the server in the context of the report being serviced. The server should list the names of unsupported parameters in the additional Data element. Note: The server is expected to ignore unsupported parameters and continue to process the request, returning data that is available without the parameter being applied. Note: This Exception is only applicable for report requests. For report list, member list and server status requests, parameters not recognized by the server should be ignored. |
| Invalid ReportFilter Value | Warning | 3060 | 200 | The request contained one or more filter values that are not supported by the server. The server should list the names of unsupported filter values in the additional Data element. Note: The server is expected to ignore unsupported filters and continue to process the request, returning data that is available without the filter being applied. Note: If the begin_date or end_date value is invalid, the server must return Exception 3020. If the service requires a platform parameter, and the platform value is invalid, the server should return Exception 1030. |
| Incongruous ReportFilter Value | Warning | 3061 | 200 | A filter element includes multiple values in a pipe-delimited list; however, the supplied values are not all of the same scope (e.g., an item_id filter includes article-level DOIs and journal-level DOIs or ISSNs). Note: The server is expected to ignore the invalid filters and continue to process the request, returning data that is available without the filter being applied. |
| Invalid ReportAttribute Value | Warning | 3062 | 200 | The request contained one or more report attribute values that are not supported by the server. The server should list the names of unsupported report attribute values in the additional Data element. Note: The server is expected to ignore unsupported report attributes and continue to process the request, returning data that is available without the report attribute being applied. |
| Required ReportFilter Missing | Warning | 3070 | 200 | A required filter was not included in the request. Which filters are required will depend on the report and the service being called. The server should list the names of the missing filters in the additional Data element. Note: If begin_date or end_date is missing, the server must return Exception 1030. If the service requires a platform parameter, and platform is missing, the server also should return Exception 1030. Note: Currently there are no other required report filters, so this Exception should not occur. |
Appendix G: List of Federated Search Products
Note: The main Code of Practice document takes precedence in the case of any conflicts between it and this appendix.
The following are lists of known (to COUNTER) federated search products and user-agent values that may be used to identify federated search activity for reporting as Searches_Federated in Database Reports.
NOTE: These lists are for reference purposes only and may not represent all current Federated Search Products (please contact COUNTER with updates).
Table G.1: Federated Search Products
| Federated Search Product | Vendor |
|---|---|
| 360 Search | |
| EBSCOhost Integrated Search | |
| Enterprise (Federated Search) | |
| EOS.Web | |
| MetaLib | |
| SEARCHit | |
Table G.2: Federated Search Agent “User Agent” values
| Federated Search User Agent |
|---|
| AGENTPORT-SCOCIT |
| AGENTPORT-SDICIT |
| AHMKEYS-SCOCIT |
| AHMKEYS-SCOFUL |
| ARCHIMINC-SCOCIT |
| ARCHIMINC-SDICIT |
| CITAVI-SCOCIT |
| CITAVI-SDICIT |
| COSMADRALI-SCOCIT |
| COSMADRALI-SDICIT |
| DEEPEX-SCOCIT |
| DEEPEX-SDIABS |
| DEEPEX-SDICIT |
| EDINGET-SCOCIT |
| EDINGET-SDICIT |
| ENCOMP-SCOCIT |
| ENCOMP-SDIABS |
| ENCOMP-SDICIT |
| GROGRO-SDICIT |
| HENKINTRA-SCOCIT |
| INERAEX-SCOCIT |
| INTELLIFED-SCOCIT |
| INTELLIFED-SDICIT |
| MEKPAPERS-SCOCIT |
| MEKPAPERS-SDICIT |
| METALIB-SCOCIT |
| METALIB-SDICIT |
| MUSESEARCH-SCOCIT |
| MUSESEARCH-SDICIT |
| NJIT-SCOCIT |
| NRLNAVY-SCOCIT |
| OCLCPICAZ2-SCOCIT |
| OCLCPICAZ2-SDICIT |
| OOIPSDWID-SDICIT |
| POTIRORDY-SCOCIT |
| POTIRORDY-SDICIT |
| QES-SCOCIT |
| QES-SDICIT |
| QINETIQ-SCOCIT |
| RIGHTS-SDIABS |
| RITENSE-SCOCIT |
| SERSOL-SCOCIT |
| SERSOL-SDICIT |
| SYSONEMCKIN-SCOFUL |
| SYSONEMCKIN-SDIABS |
| TDNETDF-SCOCIT |
| TDNETDF-SDICIT |
| TDNSRCHR-SCOCIT |
| TDNSRCHR-SDICIT |
| UAG-SCOCIT |
| UMIARERES-SCOCIT |
| UWASOCR-SCOCIT |
| UWASOCR-SCOFUL |
| VSPACES-SCOCIT |
| VSPACES-SDICIT |
| WEBFEAT-SCOCIT |
| WEBFEAT-SDICIT |
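A content provider's log processor can classify activity as Searches_Federated by matching the request's user-agent string against this table. The sketch below seeds the lookup with only a few of the values above; a real implementation would load the complete table.

```python
# A small subset of user-agent values from Table G.2 (illustrative only).
FEDERATED_SEARCH_AGENTS = {
    "METALIB-SCOCIT",
    "SERSOL-SCOCIT",
    "WEBFEAT-SDICIT",
}

def is_federated_search(user_agent):
    """True if the request should be counted as Searches_Federated."""
    return user_agent.strip().upper() in FEDERATED_SEARCH_AGENTS
```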
Appendix H: Sample COUNTER Master Reports and Standard Views
Note: The main Code of Practice document takes precedence in the case of any conflicts between it and this appendix.
The Master Reports and Standard Views in the following table are organized by reporting level with Platform first followed by Database, Title and ending with Item. Within the reporting level, the Master Report appears first followed by the Standard Views. Click the Excel, TSV and JSON links to download the corresponding sample reports.
Table H.1: Sample COUNTER Master Reports and Standard Views
| Report_ID | Report_Name | Sample Report |
|---|---|---|
| PR | Platform Master Report | |
| PR_P1 | Platform Usage | |
| DR | Database Master Report | |
| DR_D1 | Database Search and Item Usage | |
| DR_D2 | Database Access Denied | |
| TR | Title Master Report | |
| TR_B1 | Book Requests (Excluding OA_Gold) | |
| TR_B2 | Book Access Denied | |
| TR_B3 | Book Usage by Access Type | |
| TR_J1 | Journal Requests (Excluding OA_Gold) | |
| TR_J2 | Journal Access Denied | |
| TR_J3 | Journal Usage by Access Type | |
| TR_J4 | Journal Requests by YOP (Excluding OA_Gold) | |
| IR | Item Master Report | |
| IR_A1 | Journal Article Requests | |
| IR_M1 | Multimedia Item Requests | |
Appendix I: List of internet robots, crawlers and spiders
Note: The main Code of Practice document takes precedence in the case of any conflicts between it and this appendix.
The growing use of internet robots, crawlers and spiders has the potential to artificially inflate usage statistics. Only genuine, user-driven usage should be reported in COUNTER usage reports. Usage of full-text articles that is initiated by automatic or semi-automatic bulk download tools, such as Quosa or Pubget, should only be recorded when the user has clicked on the downloaded full-text article in order to open it.
Activity generated by internet robots, crawlers and spiders must be excluded from all COUNTER usage reports.
This list of internet robots, crawlers and spiders was published in April 2016 and updated in July 2016. Please note it has been rationalised by removing some previously redundant entries (e.g. msnbot, awbot, bbot, turnitinbot, etc. are now collapsed into the single entry ‘bot’).
The list is displayed below and is also available at https://github.com/atmire/COUNTER-Robots. That page always shows the README, which gives potential users and contributors of the list more information on how to integrate it. Please let us know of any user agents that should be included in this list, or suggest other amendments.
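The repository publishes the list as machine-readable regular-expression patterns. A matcher can be sketched as follows; the JSON layout (an array of objects each with a "pattern" key) reflects the repository at the time of writing and may change.

```python
import json
import re

def load_robot_patterns(path):
    """Load case-insensitive regexes from the COUNTER-Robots JSON list,
    assumed to be an array of objects each with a 'pattern' key."""
    with open(path) as fh:
        return [re.compile(entry["pattern"], re.IGNORECASE)
                for entry in json.load(fh)]

def is_robot(user_agent, patterns):
    """True if the user agent matches any robot pattern; such usage
    must be excluded from all COUNTER usage reports."""
    return any(p.search(user_agent) for p in patterns)

# Example with inline patterns rather than the downloaded file:
patterns = [re.compile("bot", re.IGNORECASE)]
assert is_robot("Googlebot/2.1 (+http://www.google.com/bot.html)", patterns)
```

Because the list is maintained upstream, loading it at runtime (rather than hard-coding entries) keeps the exclusion logic current as new robots are added.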