Information Research, Vol. 6 No. 2, January 2001
A citation study was carried out to predict the outcome of the Higher Education Funding Councils' 2001 Research Assessment Exercise (RAE). Using the Institute for Scientific Information's citation databases, we assessed the correlation between the scores achieved by UK academic departments in the 1996 Research Assessment Exercise and the number of citations received by academics in those departments for articles published in the period 1994-2000. The study covered all three hundred and thirty-eight academics who teach in the UK library and information science schools. Between them, these authors received two thousand three hundred and one citations for articles published between 1994 and the present. The results were ranked by Department and compared to the ratings awarded to the departments in the 1996 RAE. On the assumption that RAE scores and citation counts are correlated, predictions were made for the likely scores in the 2001 RAE. Comments were also made on the impact of staff movements from one Higher Education Institution to another.
According to Popper's well-established philosophy of science, a discipline can only be considered a "science" if it makes falsifiable predictions, i.e., the theories underlying the discipline can be applied to new and untested situations and a prediction made of the likely outcome. Although the term "information science" is well established, under these criteria there are few, if any, theories in information science, outside of the field of modelling and system prediction in the information retrieval area, that allow researchers to predict the outcome of a particular experiment (Summers, et al., 1999). For example, the well-known inverse relationship between Recall and Precision does not allow one to predict what the precision or recall of a particular retrieval experiment will be, and the Bradford-Zipf "law" does not allow one to predict the size of a core corpus of literature in a particular subject. A cursory examination of the information science literature demonstrates the rarity of the words "predict", "predicting", "predicted" or "prediction" in the titles of papers. If one follows the Popperian approach, the second word in the term "information science" appears to be unjustified. In one area of information science research other than information retrieval, namely citation analysis, predictions have, however, been made (Oppenheim, 1979). The work described here follows in that tradition.
The Higher Education Funding Councils in the UK carry out a Research Assessment Exercise (RAE) every few years (the last one was in 1996) to grade the research output of University Departments in the UK. The next Exercise will take place in 2001. The RAE is conducted jointly by the Higher Education Funding Council for England (HEFCE), the Scottish Higher Education Funding Council (SHEFC), the Higher Education Funding Council for Wales (HEFCW) and the Department of Education for Northern Ireland (DENI).
The nominal purpose of the RAE is to enable the higher education funding bodies to distribute some of their public funds to Universities selectively on the basis of the quality of research carried out in each Department. The RAE assesses the quality of research in Higher Education Institutions (HEIs) in the UK. Around £5 billion of research funds per annum will be distributed in response to the results of the 2001 RAE (HEFCE, 1994). The RAE has become one of the most important features of UK academic life, and in practice, the results are widely used for promotional purposes.
The RAE provides quality ratings for research across all disciplines. Panels use a standard scale to award a rating for each submission. Ratings range (HEFCE, 1995) from 1 to 5*:
RATING | DEFINITION |
---|---|
5* | Quality that equates to attainable levels of international excellence in more than half of the research activity submitted and attainable levels of national excellence in the remainder. |
5 | Quality that equates to attainable levels of international excellence in up to half of the research activity submitted and to attainable levels of national excellence in virtually all of the remainder. |
4 | Quality that equates to attainable levels of national excellence in virtually all of the research activity submitted, showing some evidence of international excellence. |
3a | Quality that equates to attainable levels of national excellence in over two thirds of the research activity submitted, possibly showing evidence of international excellence. |
3b | Quality that equates to attainable levels of national excellence in more than half of the research activity submitted. |
2 | Quality that equates to attainable levels of national excellence in up to half of the research activity submitted. |
1 | Quality that equates to attainable levels of national excellence in none, or virtually none, of the research activity submitted. |
Panel members are appointed from the list of nominations of experienced and well-regarded members of the research community, and users of research. The inclusion of user representatives on certain panels is intended to enhance the assessment process by making available a user perspective. Panels use their professional judgement to form a view about the overall quality of research activity described in each submission in the round, taking into account all the evidence presented.
Institutions have been invited to make submissions containing information on staff in post on the census date, 31 March 2001. These submissions include a selection of publications emanating from the Department in question during the assessment period (1 January 1994 to 31 December 2000 for Arts and Humanities subjects, and 1 January 1996 to 31 December 2000 for other subjects).
The outcomes of the assessment will be published in December 2001, and will provide public information on the quality of research in universities and higher education colleges throughout the UK. These results will show the number and proportion of staff submitted for assessment in each case, and the rating awarded to each submission. The funding bodies will also publish further material:
Those parts of submissions that contain factual data and textual information about the research environment will be published on the Internet. This will include the names of selected staff and their research publication output.
A large amount of public money is spent on the RAE. It involves considerable opportunity costs, as well as actual financial expense, for both the HEIs concerned and the panels of subject experts.
Many studies have demonstrated that there is a strong correlation between citation counts and ratings of academic excellence as measured by other means, for example receipt of honours, receipt of research funding, editorships of major journals and peer group ratings. The subject has been discussed in some depth by Baird and Oppenheim (1994), Garfield (1979) and Cole and Cole (1973).
Until now, only three studies have been carried out regarding the correlation between RAE ratings and citation counts. In the first study, Oppenheim (1995) demonstrated the correlation between citation counts and the 1992 RAE ratings for British library and information science University Departments. He concluded that the cost and effort of the RAE might not be justified when a simpler and cheaper alternative, namely a citation counting exercise, could be undertaken.
Seng and Willett (1995) reported a citation analysis of the 1989 and 1990 publications of seven UK library schools. The total number of citations, the mean number of citations per member of staff, and the mean number of citations per publication were all strongly correlated with the ratings that these departments had achieved in the 1992 RAE. An analysis of the citedness of different types of research output from these departments was also carried out.
Oppenheim (1997) conducted the third study, which showed the correlation between citation counts and the 1992 RAE ratings for British research in genetics, anatomy and archaeology. He found that in all three cases there was a statistically significant correlation between the total number of citations received, or the average number of citations per member of staff, and the RAE score.
Recent reviews on citation counting can be found in Baird and Oppenheim (1994) and Liu (1994). An overview of aspects of research evaluation is provided by Hemlin (1996), who pointed out that there is a strong positive correlation between citation counts and other means of measuring research excellence, despite justified concerns about the validity of citation counting.
The correlation of RAE ratings with other measures has not been studied much. Colman et al. (1995) carried out an evaluation of the research performance of forty-one British university politics departments, based on an analysis of articles published between 1987 and 1992 in the nine European politics journals with the highest citation impact factors. Annual performance scores were obtained by dividing each department's number of publications in these journals in each year (departmental productivity) by the corresponding departmental size; these scores were then summed to give a research performance score for each department over the period of assessment. The scores correlate significantly with research performance scores from two previous studies using different methodologies: Crewe's per capita simple publication count for the years 1978-1984, and the Universities Funding Council's research selectivity ratings covering the years 1989-1992.
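In symbols (the notation here is ours, offered as a hedged reading of Colman et al.'s description, not their own formulation), the performance score for department $d$ over the assessment period was:

```latex
% P_{d,y}: number of articles department d published in the nine
%          selected journals in year y; n_d: departmental size.
S_d = \sum_{y=1987}^{1992} \frac{P_{d,y}}{n_d}
```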
The correlation between a 1985 University Grants Committee research ranking exercise and a so-called influence weight measure of the journals in which the academics publish was considered by Carpenter et al. (1988).
Citation studies have been used to compare different departments in the same subject area within one country (Winclawska, 1996), and different departments around the world (Nederhof & Noyons, 1992).
Ajibade and East (1997) investigated the correlation between the RAE scores and the usage of BIDS ISI databases. They found that there was a strong correlation between the two factors. There is, of course, no implication that using the ISI databases on BIDS will lead to a high RAE score. Rather, research intensive departments that are likely to score well in the RAE are also likely to be intensive users of electronic information resources.
Zhu et al. (1991) carried out a review of research in a few chemistry and chemical engineering departments in British universities. They found that both citation counts and publication counts correlated with peer review ratings.
Of itself, the correlation between citation counts and RAE scores is not particularly surprising. It is well established that high citation counts reflect the impact, and by implication the importance, of a particular publication or group of publications. It is therefore hardly surprising to find that the RAE scores, which reflect peer group assessment of the quality of research, are correlated with citation counts, which reflect the same phenomenon. What is perhaps surprising is that the correlation applies in such a wide range of subject areas, including those where citations occur relatively rarely, and ranging from those with small publication outputs to those with large ones.
The results of the three studies discussed earlier (Oppenheim, 1995, 1997; Seng & Willett, 1995) demonstrated that there is a strong correlation between citation counts and the RAE ratings of UK academic departments. However, all three studies were analyses after the event. To our knowledge, no research has ever been undertaken to predict the outcome of a forthcoming RAE by means of citation counting.
The aim of this research was to predict the order of RAE ratings that will be achieved by Departments of Librarianship and Information Management in 2001, using citation analysis alone. When the RAE results are announced in 2001, comparison with the predictions made here can then be carried out.
A list was made of the members of staff in the Information and Library schools in the UK likely to be making a return under Unit of Assessment (UoA) 61, Library and Information Management.
The sixteen institutions examined were those listed in Table 1 below.
There were a number of other institutions without library schools, departments of information science or similar titles that put in submissions to the last RAE under the heading of Library and Information Management. These were: University of Bath, De Montfort University, University of Central Lancashire, University of Leicester, and the Royal Free Hospital School of Medicine. We did not know which individuals had been put forward under these submissions, and so these institutions were not considered for this study.
It is not clear yet who will put in submissions under Library and Information Management for the 2001 RAE, but the list of institutions we examined includes the most probable ones. Clearly, if many other HEIs submit under UoA 61, or if some of the Departments we examined do not, the predictions made in this paper will be weakened.
The web sites of each department were visited to obtain a list of all current academic staff. The data were collected over two days in November 1999. Professors, readers, senior lecturers, tutors, lecturers, part-time lecturers, research fellows, and heads of departments were included. Honorary professors or lecturers, visiting professors or lecturers, computer supervisors, research staff, computer technicians, secretaries and administration officers were ignored. Academic staff who might be returned under the UoA by the HEI but were not noted in the departmental list were therefore not included. However, drawing on our personal knowledge, research-active individuals who were Deans or Pro-Vice Chancellors and whose "home" was one of the departments checked were included, even if their names were not on the departmental home page in question. For example, J. Elkin, former Head of Department at University of Central England, Birmingham, is now Dean of Faculty there and is not on the list of staff in the Department. However, it is likely that she will be included in the RAE, and so the analysis included her.
Next, the names of staff from this list were used for Science Citation Index (SCI) and Social Sciences Citation Index (SSCI) searches on MIMAS [1] or (in just two cases - see below) on BIDS [2]. A search was carried out to see how many times each individual had been cited. Only papers they had published since 1994 were examined for citation counting purposes. MIMAS automatically de-duplicated hits from SCI and SSCI. For the two BIDS searches, we de-duplicated by inspection. When doing the citation counts, reviews of books written by the target individuals were ignored. The citation searches were carried out over a short period in January 2000.
Citations to each individual as first-named author of a paper were used. This was both because full publication lists for many of the individuals were not available, and because of the complexities of trying to share a single citation between the authors of a multi-author paper. MIMAS permits searching for citations to an author irrespective of where his or her name appears in the order of authors of a multi-authored paper.
However, the methodology described in this paper relies purely on the ranking of Departments by RAE rating and by citation count, not on the actual numbers involved. A previous study (Seng & Willett, 1995) showed that this approach is robust.
We used BIDS only for J. Day (University of Northumbria at Newcastle) and A. Morris (Loughborough University), because of the slow response time of MIMAS when handling these two very common names. As an aside, it is a matter of some concern that the new system, MIMAS, has much greater difficulty processing searches involving authors with common names than the older system, BIDS.
It should be noted that this research was carried out at the end of 1999 and in early 2000. Since then, no doubt many of the authors have received more citations, some academics have moved departments, some new staff have been recruited and others have retired.
One of the main problems with this methodology was carrying out searches on common names. Other authors with exactly the same surnames and initials, working in quite different disciplines, had been cited, and their citations were duly recorded in the search. Such false drops were removed by inspection where the bibliographic details of the articles cited were clearly outside the field of librarianship or information science. However, in many cases there was genuine reason for doubt, as the journal or book title was ambiguous, or was in a cognate area to the one being considered. In such cases, the citations were included. There can be no doubt that some error is introduced by this, as demonstrated by one of our referees, who tested our results and found that, in his or her judgement, the numbers of hits were not identical to the numbers we found. However, the alternatives (obtaining copies of the cited articles in doubt to check the cited author's affiliation, or obtaining master lists of publications directly from each author in the library and information science departments) would have significantly added to the time and resources needed for the research. For this reason, the results presented below must be treated with caution.
On the other hand, it could be that citations for a different individual (say a US author) with the same name and initials who is active in librarianship or information management were credited to the UK academic under study. These two effects will, to some extent, cancel each other out. The errors introduced by the methodology are likely to be minor, bearing in mind that the research was not to find absolute figures, but rankings. However, as will be demonstrated below, some of the citation counts for Departments we obtained were very similar to each other. Our results, therefore, can only offer a fairly coarse differentiation between the Departments we studied.
The surnames and initials as supplied by the departmental web sites were used. Hyphenated names were merged for the citation search; for example, Yates-Mercer became Yatesmercer, following the conventions of ISI (the producer of SCI and SSCI).
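A minimal sketch of this normalisation step follows (the function name and the exact key format are ours, for illustration only, not ISI's documented specification):

```python
def isi_author_key(surname: str, initials: str) -> str:
    """Build a cited-author search key in the style of ISI's conventions:
    hyphens and spaces are dropped from the surname, so that, e.g.,
    Yates-Mercer becomes YATESMERCER."""
    return surname.replace("-", "").replace(" ", "").upper() + " " + initials.upper()

print(isi_author_key("Yates-Mercer", "P"))  # YATESMERCER P
```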
When the citation counts had been obtained, the total number of citations for each department and the average number of citations per member of staff for each department were calculated. The Departments were then ranked in order of total citations, and in order of the average number of citations per member of staff.
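As an illustration of this aggregation step, the sketch below (our own; the department names and counts are hypothetical placeholders, since the real counts were compiled by hand from the MIMAS and BIDS search results) ranks departments by total citations and by average citations per member of staff:

```python
# One (department, citations) pair per member of staff; hypothetical data.
staff_citations = [
    ("Dept A", 439), ("Dept A", 78), ("Dept A", 0),
    ("Dept B", 410), ("Dept B", 25), ("Dept B", 0), ("Dept B", 0),
]

totals: dict[str, int] = {}
staff_counts: dict[str, int] = {}
for dept, cites in staff_citations:
    totals[dept] = totals.get(dept, 0) + cites
    staff_counts[dept] = staff_counts.get(dept, 0) + 1

# Rank by total citations, and separately by mean citations per staff member.
by_total = sorted(totals.items(), key=lambda item: item[1], reverse=True)
by_mean = sorted(
    ((dept, totals[dept] / staff_counts[dept]) for dept in totals),
    key=lambda item: item[1],
    reverse=True,
)

for rank, (dept, total) in enumerate(by_total, start=1):
    print(f"{rank}. {dept}: {total} citations")
for rank, (dept, mean) in enumerate(by_mean, start=1):
    print(f"{rank}. {dept}: {mean:.2f} citations per member of staff")
```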
Three hundred and thirty-eight staff were checked for citations. Two hundred and thirty-five of these received no citations at all during the period in question. As an aside, one might wonder what the Higher Education Funding Councils would make of a Unit of Assessment where two thirds of the academics appear to have had no impact on their fellow academics at all. Twenty-six academics received one citation, twelve received two citations, seven received three citations, six received four citations and two received five citations. Fifty academics received more than five citations. In all, 2,301 citations were noted. Full details of our results, including individual scores, can be obtained from the senior author.
There have been recent moves by E. Davenport and H. Hall from Queen Margaret College, Edinburgh, to Napier University; by D. Ellis from Sheffield to Aberystwyth; and by M. Collier (formerly in the private sector, and cited seven times in the period) to Northumbria University. We analysed how vulnerable an institution is to the departure of such individuals, and how much it benefits from their arrival.
Table 1 shows the rankings for numbers of citations received before these changes and Table 2 shows the rankings for numbers of citations received after these changes.
Table 1: Rankings by number of citations received, before recent staff changes
RANK | INSTITUTION | NUMBER OF CITATIONS | PERCENTAGE OF TOTAL (%) |
---|---|---|---|
1 | University of Sheffield | 843 | 36.5 |
2 | City University, London | 605 | 26.2 |
3 | University of Loughborough | 338 | 14.6 |
4 | Queen Margaret College, Edinburgh | 104 | 4.5 |
5 | University of Wales, Aberystwyth | 78 | 3.4 |
6 | Queen's University, Belfast | 62 | 2.7 |
7 | University of Strathclyde, Glasgow | 57 | 2.5 |
8 | Northumbria at Newcastle | 52 | 2.3 |
9 | Manchester Metropolitan University | 46 | 2.0 |
10 | Leeds Metropolitan University | 40 | 1.7 |
11 | Robert Gordon University, Aberdeen | 27 | 1.2 |
12 | University of Central England, Birmingham | 25 | 1.1 |
13 | University College, London | 11 | 0.5 |
14 | University of Brighton | 10 | 0.4 |
14 | University of North London | 10 | 0.4 |
16 | Liverpool John Moores University | 3 | 0.1 |
TOTAL | 2311 | 100 |
Table 2: Rankings by number of citations received, after recent staff changes
RANK | INSTITUTION | NUMBER OF CITATIONS | PERCENTAGE OF TOTAL (%) |
---|---|---|---|
1 | University of Sheffield | 726 | 32.1 |
2 | City University, London | 605 | 26.7 |
3 | University of Loughborough | 338 | 14.9 |
4 | University of Wales, Aberystwyth | 195 | 8.6 |
5 | Queen's University, Belfast | 62 | 2.7 |
6 | Northumbria at Newcastle | 59 | 2.6 |
7 | University of Strathclyde, Glasgow | 57 | 2.5 |
8 | Queen Margaret College, Edinburgh | 56 | 2.5 |
9 | Manchester Metropolitan University | 46 | 2.0 |
10 | Leeds Metropolitan University | 40 | 1.8 |
11 | Robert Gordon University, Aberdeen | 27 | 1.2 |
12 | University of Central England, Birmingham | 25 | 1.2 |
13 | University College, London | 11 | 0.5 |
14 | University of Brighton | 10 | 0.4 |
14 | University of North London | 10 | 0.4 |
16 | Liverpool John Moores University | 3 | 0.1 |
TOTAL | 2268 | 100 |
The departure of E. Davenport and H. Hall from Queen Margaret College, Edinburgh, to Napier University had a dramatic effect on the College, which dropped from fourth to eighth place. D. Ellis's departure from Sheffield to Aberystwyth did not affect Sheffield's top position, but helped Aberystwyth. We do not know if Napier University will return Davenport and Hall under this UoA. Collier's arrival at Northumbria has an effect, moving the University from eighth to sixth place, but this is primarily because it sits in a group of departments with very similar citation totals.
Table 3: Rankings by average number of citations per member of staff, before recent staff changes
RANK | INSTITUTION | AVERAGE CITATIONS PER STAFF MEMBER |
---|---|---|
1 | City University, London | 50.42 |
2 | University of Sheffield | 49.6 |
3 | University of Loughborough | 16.9 |
4 | University of Strathclyde, Glasgow | 6.33 |
5 | Queen Margaret College, Edinburgh | 5.78 |
6 | University of Wales, Aberystwyth | 4.33 |
7 | Queen's University, Belfast | 2.48 |
8 | Northumbria at Newcastle | 1.93 |
9 | Manchester Metropolitan University | 1.92 |
10 | University of Central England, Birmingham | 1.67 |
11 | Leeds Metropolitan University | 0.98 |
12 | Robert Gordon University, Aberdeen | 0.93 |
13 | University College, London | 0.61 |
14 | University of North London | 0.4 |
15 | Liverpool John Moores University | 0.33 |
16 | University of Brighton | 0.3 |
TOTAL | 146 |
A comparison of Table 1 with Table 3 shows some minor changes in order, in particular, a switch at the top between Sheffield University and City University, but there is little practical effect.
The average number of citations per member of staff taking into account the recent major staff changes is shown in Table 4.
Table 4: Rankings by average number of citations per member of staff, after recent staff changes
RANK | INSTITUTION | AVERAGE CITATIONS PER STAFF MEMBER |
---|---|---|
1 | City University, London | 50.42 |
2 | University of Sheffield | 45.38 |
3 | University of Loughborough | 16.9 |
4 | University of Wales, Aberystwyth | 10.26 |
5 | University of Strathclyde, Glasgow | 6.33 |
6 | Queen Margaret College, Edinburgh | 3.5 |
7 | Queen's University, Belfast | 2.48 |
8 | Northumbria at Newcastle | 2.11 |
9 | Manchester Metropolitan University | 1.92 |
10 | University of Central England, Birmingham | 1.67 |
11 | Leeds Metropolitan University | 0.98 |
12 | Robert Gordon University, Aberdeen | 0.93 |
13 | University College, London | 0.61 |
14 | University of North London | 0.4 |
15 | Liverpool John Moores University | 0.33 |
16 | University of Brighton | 0.3 |
A comparison of Table 3 with Table 4 shows that the only significant change in order as a result of recent staff changes was the improvement in Aberystwyth's position.
Table 5 shows the thirty most heavily cited authors identified in this study.
Table 5: The thirty most heavily cited authors identified in this study
RANK | NAME | INSTITUTION | NUMBER OF CITATIONS |
---|---|---|---|
1 | S. E. Robertson | City University, London | 439 |
2 | P. Willett | University of Sheffield | 410 |
3 | D. Ellis | University of Wales, Aberystwyth | 117 |
4 | M. Hancock-Beaulieu | University of Sheffield | 114 |
5 | C. Oppenheim | University of Loughborough | 102 |
6 | T. D. Wilson | University of Sheffield | 80 |
7 | D. Bawden | City University, London | 78 |
8 | E. Davenport | Queen Margaret College, Edinburgh (since moved) | 47 |
9 | M. Lynch | University of Sheffield | 46 |
10 | G. McMurdo | Queen Margaret College, Edinburgh | 41 |
11 | A. J. Meadows | University of Loughborough | 38 |
12 | G. Philip | Queen's University, Belfast | 37 |
13 | J. Feather | University of Loughborough | 37 |
14 | E. M. Keen | University of Wales, Aberystwyth | 36 |
15 | D. Nicholas | City University, London | 34 |
16 | M. H. Heine | Northumbria at Newcastle | 33 |
17 | P. Brophy | Manchester Metropolitan University | 27 |
18 | S. Jones | City University, London | 26 |
19 | N. Ford | University of Sheffield | 25 |
20 | S. Walker | Leeds Metropolitan University | 24 |
20 | A. Morris | University of Loughborough | 24 |
22 | F. Gibb | University of Strathclyde | 23 |
22 | C. McKnight | University of Loughborough | 23 |
24 | P. F. Burton | University of Strathclyde | 22 |
24 | B. Usherwood | University of Sheffield | 22 |
26 | B. Loughbridge | University of Sheffield | 18 |
27 | J. Harrison | University of Loughborough | 17 |
28 | I. Rowlands | City University, London | 16 |
28 | A. Goulding | University of Loughborough | 16 |
28 | L. A. Tedd | University of Wales, Aberystwyth | 16 |
It should be noted that Professor Robertson only spends one day a week at City University, but he will, no doubt, be returned by City University for the 2001 RAE, as he is considered to be on its staff list.
Based upon our results, taking into account the recent changes of staff, and assuming that total numbers of citations, or average numbers of citations per member of staff, can be used to predict the score order, we suggest that the order shown in Table 6 is likely to be seen in the 2001 RAE.
One can speculate about the actual RAE scores to be achieved by these HEIs. Twenty-three institutions were considered in this UoA in the 1996 RAE. The breakdown of RAE scores was as shown in Table 7 below:
Table 7: Breakdown of scores in the 1996 RAE
1996 RAE RATING | NUMBER OF INSTITUTIONS | PERCENTAGE (%) |
---|---|---|
5* | 2 | 8.6 |
5 | 1 | 4.4 |
4 | 2 | 8.6 |
3a | 3 | 13.0 |
3b | 7 | 30.4 |
2 | 5 | 22.0 |
1 | 3 | 13.0 |
TOTAL | 23 | 100 |
We analysed just sixteen HEIs for this study. Assuming a similar percentage breakdown of 5*, 5, 4, 3a, 3b, 2 and 1 scores leads to the following prediction (Table 8) for the distribution of RAE scores in 2001.
Table 8: Predicted distribution of RAE scores in 2001
RATING | NO. OF INSTITUTIONS WITH THIS RAE SCORE |
---|---|
5* | 1 |
5 | 1 |
4 | 1 |
3a | 2 |
3b | 5 |
2 | 4 |
1 | 2 |
TOTAL | 16 |
On this basis, a possible distribution of scores across the sixteen institutions, taken in the order predicted in Table 6, can be constructed.
However, it must be stressed that there is no particular reason why the 2001 distribution of scores should be similar to that for the 1996 RAE. We emphasise that the prediction from our work is shown in Table 6.
We have already noted that moves by key members of staff can have an impact on likely RAE performance. Hancock-Beaulieu (cited 114 times) recently moved from City University to Sheffield University. This increased the total number of citations for Sheffield from 612 to 726, and decreased City's from 719 to 605. The cases of Davenport and Ellis are particularly pertinent, as they are also highly cited, but moved either from, or to, middle-ranking HEIs. Indeed, even less heavily cited individuals can have an important impact on the middle-ranking and weaker departments.
A key factor is the different citation patterns in different parts of what is a very broad discipline. Robertson, Ellis and Hancock-Beaulieu all work at the computer science end of LIS, which is likely to be more heavily cited than more traditional library-oriented research, and this comment applies even more to the work in chemoinformatics undertaken by Willett. This will tend to skew the figures substantially.
We suggest that some departments may be dependent upon one individual. We have calculated that S.E. Robertson of City University has received 439 citations. City University currently has a total of 605 citations; the loss of S.E. Robertson would reduce City's total to 166, moving it down that ranking to fourth place. The average number of citations per member of staff would decrease from 50.42 to 15.1, so City would move down that ranking from first to third. City University is therefore vulnerable to the loss of this one individual. Similar analyses demonstrate the dependence of Sheffield on P. Willett, Aberystwyth on D. Ellis, Belfast on G. Philip, Leeds Metropolitan on S. Walker, University of North London on R.E. Burton, Manchester Metropolitan on P. Brophy, and Northumbria on M.H. Heine.
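The vulnerability calculation is simple arithmetic. The sketch below (ours; the staff count of twelve is inferred from the published total and mean, since 605 / 50.42 is approximately 12) reproduces the "what if Robertson left" step:

```python
# What-if analysis: remove one heavily cited individual from a department.
# City's figures are taken from the tables above; the staff count is inferred.
dept_total = 605      # City University's total citations
dept_staff = 12       # inferred number of staff counted
author_cites = 439    # citations to S.E. Robertson

new_total = dept_total - author_cites      # 166: fourth in the totals ranking
new_mean = new_total / (dept_staff - 1)    # ~15.1: third in the averages ranking

print(new_total, round(new_mean, 1))       # 166 15.1
```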
Table 5 showed the current thirty most cited UK authors in library and information science. In 1992, D. Ellis was the most cited author, with 71 citations; M. Hancock-Beaulieu was second, C. Oppenheim third and P. Willett fourth (Oppenheim, 1995). There has been a large increase since 1992 in the number of citations gained, but the order of the top few has hardly changed: the four most cited authors in 1992 are amongst the top five most cited authors currently.
Comparing the citation counts in 1992 (Oppenheim, 1997) with our results, it is clear that Loughborough has remained in third place, but the difference between the scores for Loughborough and the top two Departments (City and Sheffield) has grown.
One can use the methodology to identify those individuals with very low citation counts. These might be considered for exclusion from the RAE return by their HEIs, though how useful such an approach would be is open to debate. On the one hand, it could ensure a higher RAE score. On the other hand, funding from the Funding Councils is based on a calculation that multiplies a figure linked to the RAE rating by the number of research-active staff returned, so returning fewer staff means that less money may be received. There is an argument that the psychological impact of a higher RAE score, both internally and externally, is more important than the risks involved in returning fewer research-active staff. This is just one of many factors in RAE gamesmanship that HEIs have to consider.
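In outline (this is our simplification of the funding calculation just described; the actual model also includes subject cost weights and other adjustments), the quality-related funding a department attracts takes the form:

```latex
% F_d: funding for department d; w(r_d): a weight attached to the RAE
% rating r_d (the lowest ratings attract no funding); N_d: the number
% of research-active staff returned. Notation ours, for illustration.
F_d \propto w(r_d) \times N_d
```

Excluding weakly cited individuals raises $r_d$ but lowers $N_d$; the gamesmanship lies in judging which effect dominates.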
Other analyses could be carried out on the actual publications selected for return in the RAE. At the end of the 2001 RAE, the Funding Councils will put a list of all publications submitted in the RAE on their Web site. A citation count could be carried out on just those publications, and compared to other publications authored by the same member of staff, to see if other publications might have been more appropriate for the return. Such a method should be used with great caution, however, as recently published papers will have received fewer citations than older ones.
Another analysis could be by Impact Factor of the journals in which the articles published by the academics have appeared. The Impact Factor is defined as the average number of citations received per paper published by a journal. Journal Citation Reports, published by ISI, lists journals by Impact Factor within each discipline. Again, such data must be used with caution (see Amin & Mabe, 2000, for a recent critical review of the topic): review journals attract extremely high Impact Factors, yet review articles are not appropriate for return in the RAE. Nonetheless, there is an argument that an RAE return should include a high proportion of papers in high Impact Factor journals.
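For reference, ISI computes a journal's Impact Factor over a two-year window (the symbols below are ours):

```latex
% IF_j(y): Impact Factor of journal j in year y.
% C_j(t): citations received in year y to items journal j published in
%         year t; P_j(t): citable items journal j published in year t.
IF_j(y) = \frac{C_j(y-1) + C_j(y-2)}{P_j(y-1) + P_j(y-2)}
```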
In future, citation counting could be used to assess how departments are currently performing, and to predict their performance in the RAE relative to close competitor departments. This in turn would create possibilities for remedial action, if considered appropriate, in weak departments.
Further work is underway for a similar study in another subject area, sports science (Iannone & Oppenheim, [unpublished]).
This research is open to a wide variety of criticisms, ranging from the fundamental to the detailed. For example, we became aware of the inadequacy of some of the web sites we inspected. Some were not complete and may not have been up to date. For example, E. Davenport and H. Hall were still listed on the Queen Margaret College staff list at the time of our research, even though they had both departed. Another problem was the effect of changing names. A number of authors were cited under both maiden and married names. For example, K. Henderson of Strathclyde was previously K. de Placido. We knew of a few such examples, and in those cases we searched under all the alternative names, but undoubtedly there were more that we did not recognise. Our methodology can take no account of name changes such as these.
At times, we had to make arbitrary judgements regarding whether a citation was to our chosen academic, or to someone in a different subject area who happened to have the same name. This remains an important weakness of any such study that does not have to hand a comprehensive master list of all publications by the academics under study.
Another criticism is that the research was undertaken some time ago, and would be better done much closer to the timing of the RAE.
A potential criticism is that this technique usurps the peer group evaluation task. We believe that citation counts should be used to create a rough first cut at the rank ordering, but not to assign the actual RAE scores; that must remain a task for the panel of experts. However, we believe the use of citation counts would reduce the overall costs of the RAE, help speed up the panels' work, and reduce the amount of paperwork that the HEIs submit.
Another potential criticism is that the exercise is too heavily tied to traditional print published materials, and that any such assessment should also include analyses of Web links to Departmental Web pages. Thomas and Willett (2000) have described the first attempt at such an analysis.
A final potential criticism must be addressed: by publishing these results in advance of the RAE panel's decision-making, we may influence the UoA 61 RAE panel either to follow our suggested order, or to deliberately choose a different order in order to disprove our predictions. However, this criticism assumes that this paper would influence the RAE panel. We believe that the panel's methods of working, and the integrity of its approach to its work, mean that it would not take the slightest notice of this article. In other words, we are convinced that our results should not, and will not, affect the peer judgement of the RAE panel either positively or negatively.
We would like to thank our two anonymous referees for their helpful comments.
How to cite this paper:
Holmes, Alison and Oppenheim, Charles (2001) "Use of citation analysis to predict the outcome of the 2001 Research Assessment Exercise for Unit of Assessment (UoA) 61: Library and Information Management". Information Research, 6(2) Available at: http://InformationR.net/ir/6-2/paper103.html
© the authors, 2001. Updated: 2nd January 2001