2011
This report evaluates the ability of UK repositories to generate metrics related to full-text downloads, as requested by the SCONUL community. A survey conducted among repository managers revealed that while 53.2% could generate such metrics with manageable effort, a significant 46.8% faced challenges in doing so. The preferred metric among respondents for assessing repository performance was identified as full-text downloads, though there was consensus that combining this metric with others offers a more comprehensive evaluation of repository activity.
2007
Recent studies point to an increased focus on metrics and measurement in regard to open access repositories (OAR). Whereas historically the display of download figures for popular content was a way of promoting an OAR within its institution, interest has more recently shifted to how OAR usage statistics can be put to more formalised use. The authors examine the nature of the more commonly used web-based repository approaches, identify their major shortcomings, and describe recent initiatives to enhance functionality. They then discuss citation analysis and proposals to link it to web-based usage statistics, particularly in the context of high-level drivers focused on research quality and effectiveness. The paper concludes by outlining opportunities for institutions to engage with new metric models.
2006
A study was undertaken of download and usage statistics for the institutional repository at the University of Wollongong, Australia, over the six-month period January–June 2006. The study quantified the degree to which research output made available via open access was discoverable through Internet search engines. Google was identified as the primary access and referral point, generating 95.8% of the measurable full-text downloads of repository content.
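The 95.8% figure above comes from analysing the referrer information recorded alongside full-text downloads. As a rough, purely illustrative sketch of that kind of referrer analysis (not the Wollongong study's actual method), the snippet below tallies successful PDF downloads by referring search engine from a web server access log in combined format; the log path and referrer patterns are assumed placeholders.

```python
# Illustrative referrer-share tally for full-text downloads.
# Assumes an Apache/Nginx "combined" access log; path and patterns are hypothetical.
import re
from collections import Counter

LOG_PATH = "access.log"  # hypothetical location of the repository's access log

# Combined log format: ... "GET /path HTTP/1.1" status size "referer" "user-agent"
LINE_RE = re.compile(
    r'"(?P<method>GET|HEAD) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+ '
    r'"(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)

SEARCH_ENGINES = {"Google": "google.", "Yahoo": "yahoo.", "Bing": "bing."}

def classify(referer: str) -> str:
    """Map a referrer string to a search engine label, or 'Direct/other'."""
    for label, needle in SEARCH_ENGINES.items():
        if needle in referer.lower():
            return label
    return "Direct/other"

counts = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        m = LINE_RE.search(line)
        if not m:
            continue
        # Count only successful full-text (PDF) downloads.
        if m["status"] == "200" and m["path"].lower().endswith(".pdf"):
            counts[classify(m["referer"])] += 1

total = sum(counts.values()) or 1
for source, n in counts.most_common():
    print(f"{source}: {n} downloads ({100 * n / total:.1f}%)")
```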
A primary impact metric for institutional repositories (IR) is the number of file downloads, commonly measured through third-party Web analytics software. Google Analytics, a free service used by most academic libraries, relies on HTML page tagging to log visitor activity on Google's servers. However, Web aggregators such as Google Scholar link directly to high-value content (usually PDF files), bypassing the tagged HTML page, so these direct access events are never registered. This article presents evidence from a study of four institutions demonstrating that the majority of IR activity is not counted by page-tagging Web analytics software, and proposes a practical solution for significantly improving the relevance and accuracy of IR performance metrics reported through Google Analytics.
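One common server-side workaround for the blind spot described above, offered here only as an illustrative sketch rather than the article's own solution, is to detect direct file requests and report them to Google Analytics as events through the legacy Universal Analytics Measurement Protocol (GA4 uses a different protocol). The tracking ID, event names and file path below are placeholders.

```python
# Sketch: report a direct PDF download to Google Analytics as an event,
# using the legacy Universal Analytics Measurement Protocol.
# Tracking ID, event names and file path are placeholders.
import uuid
import requests

GA_ENDPOINT = "https://www.google-analytics.com/collect"
TRACKING_ID = "UA-XXXXXXX-Y"  # hypothetical Universal Analytics property

def report_download(file_path: str, client_id: str | None = None) -> None:
    """Send a 'Download' event for a PDF fetched without loading the tagged HTML page."""
    payload = {
        "v": "1",                               # protocol version
        "tid": TRACKING_ID,                     # GA property
        "cid": client_id or str(uuid.uuid4()),  # anonymous client identifier
        "t": "event",                           # hit type
        "ec": "Repository download",            # event category
        "ea": "Direct PDF access",              # event action
        "el": file_path,                        # event label: which file was fetched
    }
    requests.post(GA_ENDPOINT, data=payload, timeout=5)

# Example: called from the repository's download handler or a log-replay script.
report_download("/bitstream/handle/10536/1234/article.pdf")
```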
Increasingly there is a need for quantitative evidence in order to help demonstrate the value of online services. Such evidence can also help to detect emerging patterns of usage and identify associated operational best practice. This paper seeks to initiate a discussion on approaches to metrics for institutional repositories by providing a high-level overview of the benefits of metrics for a variety of stakeholders. The paper outlines the potential benefits which can be gained from providing richer statistics related to the use of institutional repositories and also reviews related work in this area. The authors describe a JISC-funded project which harvested a large number of repositories in order to identify patterns of use of metadata attributes and summarise the key findings. The paper provides a case study which reviews plans to provide a richer set of statistics within one institutional repository as well as requirements from the researcher community. An example of how third-party aggregation services may provide metrics on behalf of the repository community is given. The authors conclude with a call for repository managers, developers and policy makers to be pro-active in providing open access to metrics for open repositories.
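As an indication of how the metadata-attribute profiling mentioned above can be approached (a sketch under assumed conditions, not the JISC project's actual harvester), the snippet below fetches one page of Dublin Core records from a repository's OAI-PMH interface and counts how often each element is populated; the endpoint URL is hypothetical.

```python
# Sketch: tally which Dublin Core elements are populated in an OAI-PMH feed.
# The base URL is a placeholder; only the first page of results is examined.
import xml.etree.ElementTree as ET
from collections import Counter
import requests

BASE_URL = "https://repository.example.ac.uk/oai/request"  # hypothetical endpoint

NS = {
    "oai": "http://www.openarchives.org/OAI/2.0/",
    "oai_dc": "http://www.openarchives.org/OAI/2.0/oai_dc/",
    "dc": "http://purl.org/dc/elements/1.1/",
}

resp = requests.get(
    BASE_URL,
    params={"verb": "ListRecords", "metadataPrefix": "oai_dc"},
    timeout=30,
)
root = ET.fromstring(resp.content)

element_counts = Counter()
records = root.findall(".//oai:record", NS)
for record in records:
    dc = record.find(".//oai_dc:dc", NS)
    if dc is None:  # deleted records carry no metadata
        continue
    # Count each Dublin Core element once per record in which it appears.
    present = {child.tag.split("}")[1] for child in dc}
    element_counts.update(present)

print(f"Records examined: {len(records)}")
for element, n in element_counts.most_common():
    print(f"dc:{element}: present in {n} records")
```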
2017
Since 2002, Project COUNTER has led the way in developing and maintaining systems of measurement for download counts. While these counts have often been used as a proxy measure in determining journal and article value for libraries and publishers, they miss an important post-download secondary usage factor – namely, that of sharing. Likewise, altmetrics, while accounting for the impact of social media, misses some aspects of sharing as distribution often occurs via email. This creates difficulty in quantifying an exact measure of use. One aim of the Beyond Downloads project was to develop a calculator for measuring total digital usage – including sharing. Through an examination of a range of sharing systems, we identified the most commonly used platforms for sharing scholarly articles, while an international survey provided data on access, download, saving, and sharing behavior. Survey results indicated that a range of sharing patterns can be estimated, but post-download usage o...
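The idea of a "total digital usage" calculator can be illustrated with a toy multiplier model; the structure and figures below are invented placeholders and do not reflect the Beyond Downloads survey results or its actual calculator.

```python
# Toy model of "total digital usage" including post-download sharing.
# All multipliers are illustrative placeholders, not Beyond Downloads findings.
def estimated_total_usage(counter_downloads: int,
                          shares_per_download: float = 0.2,
                          readers_per_share: float = 1.5) -> float:
    """Estimate downloads plus secondary readership reached through sharing (e.g. by email)."""
    secondary_readers = counter_downloads * shares_per_download * readers_per_share
    return counter_downloads + secondary_readers

# Example: 10,000 recorded downloads under the assumed sharing behaviour.
print(round(estimated_total_usage(10_000)))  # -> 13000
```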
Performance Measurement and Metrics, 2012
Increased use and changes in the way e-resources are delivered led some libraries to question the detail of some of the definitions used and particularly to suggest that statistics required by SCONUL did not always match the requirements or practice of the libraries themselves.
Libraries are currently passing through a challenging phase as the entire scholarly publishing process, from production through to users accessing the information, moves onto publishers' remote servers. The server logs created in this process measure usage in several dimensions. As libraries strive to become more user-centric, usability becomes increasingly important for the development of collections as well as services. Consequently, methods are needed to evaluate the usefulness of these new types of resources, which have become an integral part of the library collection. Subscribing libraries, organisations and consortia rely on such data for quantitative as well as qualitative assessment. The chapter discusses the characteristics of electronic information, the genesis of server logs and their transformation into usage metrics, and the role of relevant standards in formatting usage data, along with their advantageous features. It also highlights the objectivity of data mining and the research applications of usage metrics in supporting library management and in establishing the credibility of authors, institutions, journals and databases.
From the beginning of time, measurement has been part of the essence of being human, whether consciously or unconsciously. The purposes for which we measure, as summarized by Behn (2003) and duly cited by Stuart in the introduction to his book, are enlightening in themselves: to evaluate, to check, to budget, to motivate, to promote, to celebrate, to learn, and to improve. Humans, whether they are involved in scientific or non-scientific activity, need to measure in order to know. Advances in instrumentation, coupled with the development of applied statistics, have heralded genuine revolutions in the collection, measurement and analysis of empirical data in different
2012
IRUS-UK is a new national standards-based statistics aggregation service for institutional repositories in the UK. The service processes raw usage data from repositories, consolidating those data into COUNTER-compliant statistics by following the rules of the COUNTER Code of Practice, the same code adhered to by the majority of scholarly publishers. This will, for the first time, enable UK repositories to provide consistent, comparable and trustworthy usage data, as well as supporting opportunities for benchmarking at a national level. This article provides some context to the development of the service, the benefits and opportunities it offers, an institutional repository perspective, and future plans.
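A central step in COUNTER-style processing of raw repository events is filtering, for example excluding known robots and collapsing "double clicks", repeated requests for the same item by the same user within a short window (the Code of Practice specifies 30 seconds for file downloads), into a single count. The sketch below applies those two rules to a handful of raw events; the event structure and robot list are simplified assumptions, not IRUS-UK's actual ingest pipeline.

```python
# Sketch of COUNTER-style robot exclusion and double-click filtering.
# Event structure, robot list and field names are simplified assumptions.
from datetime import datetime

DOUBLE_CLICK_WINDOW = 30  # seconds, per the COUNTER Code of Practice

ROBOT_AGENTS = ("googlebot", "bingbot", "crawler")  # illustrative exclusion list

raw_events = [
    # (timestamp, user identifier, item identifier, user agent)
    ("2012-05-01 10:00:00", "1.2.3.4", "hdl:10536/1234", "Mozilla/5.0"),
    ("2012-05-01 10:00:12", "1.2.3.4", "hdl:10536/1234", "Mozilla/5.0"),  # double click
    ("2012-05-01 10:05:00", "1.2.3.4", "hdl:10536/1234", "Mozilla/5.0"),
    ("2012-05-01 10:06:00", "5.6.7.8", "hdl:10536/9999", "Googlebot/2.1"),  # robot
]

def countable_downloads(events):
    """Drop robot traffic and collapse double clicks, returning per-item counts."""
    last_seen = {}   # (user, item) -> datetime of the most recent request
    counts = {}      # item -> number of countable downloads
    for ts, user, item, agent in sorted(events):
        if any(bot in agent.lower() for bot in ROBOT_AGENTS):
            continue
        when = datetime.strptime(ts, "%Y-%m-%d %H:%M:%S")
        previous = last_seen.get((user, item))
        if previous and (when - previous).total_seconds() <= DOUBLE_CLICK_WINDOW:
            last_seen[(user, item)] = when  # still within the window; do not count again
            continue
        last_seen[(user, item)] = when
        counts[item] = counts.get(item, 0) + 1
    return counts

print(countable_downloads(raw_events))
# -> {'hdl:10536/1234': 2}
```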