Bulletin of the American Society for Information Science and Technology       Vol. 30, No. 1     October/November 2003



International Column

Citation Analysis and Research Assessment in the United Kingdom
by Julian Warner

Julian Warner serves as international liaison to the ASIS&T Board of Directors. He is affiliated with the School of Management and Economics at The Queen’s University of Belfast, Northern Ireland. He can be reached by e-mail at j.warner@qub.ac.uk.

Evaluation of publicly funded university research in the United Kingdom has been conducted through a series of Research Assessment Exercises (RAEs) in 1992, 1996 and 2001. The results of the RAEs have been used as a factor in the calculation of public funds distributed to universities for research. The primary mechanism used for the research evaluation has been direct peer review of submissions and publications from university departments. The conduct of the RAEs, the public debate they generated and a recent central government report on future research evaluation have raised issues relevant to other jurisdictions and to the interest in research evaluation within information science. In particular, a crucial transformation in the value of citation analysis in research evaluation can be detected.

Two antithetical viewpoints on the appropriate relation of citation analysis to the RAEs have coexisted within information science since the mid-1990s. The dominant approach established strong correlations between rankings of entities for research assessment derived from citation analyses and those derived from RAE grades. Replacement of the peer review process embodied in RAE procedures by an ordering determined by the results of citation analyses (with the ordering converted into grades) was then advocated on the grounds of broad equivalence of results and lower relative costs.
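The correlation studies underlying the dominant approach typically report a rank correlation, such as Spearman's rho, between a citation-based ordering of departments and their RAE grades. A minimal sketch of that calculation follows; all figures are invented for illustration (the 5* grade is coded here as 6), and the tie-handling uses standard fractional ranking rather than any procedure specific to the studies discussed.

```python
def rank(values):
    # Fractional ranking: tied values share the average of their positions.
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # ranks are 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    # Spearman's rho = Pearson correlation of the two rank vectors.
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

citations = [820, 430, 210, 95, 50, 600]  # invented citation counts
rae_grade = [6, 5, 4, 3, 3, 5]            # invented RAE grades (5* coded as 6)
rho = spearman(citations, rae_grade)
print(round(rho, 3))  # → 0.971
```

A rho near 1 for data like these is the kind of "broad equivalence" the dominant approach invoked; the alternative perspective did not dispute such correlations, only the inference that citation-derived orderings should therefore determine grades.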

An alternative perspective (articulated by the current author) acknowledged the correlations established but argued that citation analysis may be used to inform, rather than to determine, judgment – and need not be used even for this purpose. Advocacy of the use of citation analysis to determine grades has been strongest within information science, as reflected in the number of authors and publications advocating it, the status of the journals in which it has been published and the number of citations received – although those citations have come predominantly from within information science rather than from other disciplinary literatures. Dialogue between the two viewpoints has been apparent but has not resulted in an accepted synthesis.

The dialogue could be regarded as occurring between incommensurable paradigms or, at least, as not being fully resolvable – from a Kuhnian perspective – by participants from either paradigm. Resolution by an external observer, acknowledged to have cognitive authority for this purpose and concerned not only with the internal or logical coherence of each approach but with their real-world effects and value, would then be welcome. An analogy could be made with court judgments between disputants, and further with the process of referral to higher courts.

The recent central government report, Review of Research Assessment, was conducted by privileged observers credited with cognitive authority and concerned with real-world implementation. The report does not directly engage with the literature of information science (and lacks a scholarly apparatus of citations and bibliography), but was informed by submissions, the understandings of participants in workshops and the published literature on the RAEs.

Citation Analysis and Research Assessment

The congruence between the position developed by the critique of citation analysis and the recommendations of the report for future research assessment is striking. The critique had concluded:

    citation analysis can … be employed as one element used to inform judgment of research quality, with judgment underdetermined by any single element.

Similarly, the first recommendation of the report insists:

    Recommendation 1. Any system of research assessment designed to identify the best research must be based upon the judgment of experts, who may, if they choose, employ performance indicators to inform their judgment.

Performance indicators should be understood to include the results of citation analyses. There are indications of movement over time in the understandings of some members of the steering group of the review, from a determining to an informing use of performance indicators:

    Some of us believed, at the outset of the process, that there might be some scope for assessing research on the basis of performance indicators – we are now convinced that the only system that will enjoy both the confidence and the consent of the academic community is one based ultimately upon expert review.

The preference for peer review over the determining use of performance indicators is based principally on its ability to resist behavioral distortions, while acknowledging the contrasting levels of labor and cost involved.

The risk of the unproductive distortion of research behavior to conform with evaluative measures forms a persistent theme of the report and is emphasized in the Preface:

    More important, I urge the funding councils to remember that all evaluation mechanisms distort the processes they purport to evaluate.

Only peer review can detect and resist unwanted behavioral distortions:

    We are also convinced that only a system based ultimately upon expert judgment is sufficiently resistant to unintended behavioral consequences to prevent distorting the very nature of research activity.

Bibliometric measures are understood to be among the performance indicators that might promote undesirable results. The possibility that the use of citation analysis in evaluation would distort citation practices had been anticipated by the critique and by other commentaries.

The contrasting levels of direct human labor, and the costs of that labor, involved in peer review and in the determining use of performance indicators are acknowledged. The burden of peer review is accepted by the report. It is also recognized that the promise of reduced labor offered by citation analysis might be betrayed by the magnitude of the task of editing data into a form acceptable for comparisons between entities for assessment in a large country such as the United Kingdom. The argument from relative costs for using citation analysis to determine judgment has thus been considered and rejected.

The report’s own use of citation analysis and other performance indicators, and its proposals for their future use, are consistent with an informative role. For current purposes, the increase in citations received by United Kingdom papers since the introduction of the RAEs is used to support the review’s judgment that United Kingdom research has improved. For future use, peer reviewers are to be informed by performance indicators but not compelled to reflect them in the grades awarded (in the Research Quality Analysis proposed to replace the RAEs). Performance indicators would form the basis of assessment only for those entities with quantities of research below the level commensurate with the costs of peer review (the Research Capacity Analysis).


Conclusion

A dispute within information science has been decisively resolved by privileged external observers, in favor of the view that citation analysis may be, and does not have to be, used to inform expert judgment. The congruence in conclusions between the report and the alternative perspective within information science is matched by similarities in reasoning, particularly the recognition of the risk of distortion of behavior. The report’s preference for peer review is partly a product of changes in the reviewers’ understandings, through their consideration of relevant evidence. The resolution of the dispute must be acknowledged by future work in citation analysis, if it is to retain its relevance to wider public discourse.


American Society for Information Science and Technology
8555 16th Street, Suite 850, Silver Spring, Maryland 20910, USA
Tel. 301-495-0900, Fax: 301-495-0810

Copyright © 2003, American Society for Information Science and Technology