University managers are constantly seeking simple ways to measure
and evaluate the research output of their university’s academics. While peer
review of scholarly research papers is arguably the best way to determine the
quality of any individual research output, it is also acknowledged that peer
review is time-consuming, expensive and subjective. Journal-level metrics, such as a journal quality
list, present managers with a convenient, objective and inexpensive tool for determining
the quality of scholarly articles. However,
managers relying on journal-level metrics to evaluate articles may be committing the ecological fallacy. The ecological fallacy occurs when conclusions are drawn about individuals based only on analyses of group data; in this case, judging the quality of an individual article based only on the journal in which it is published.
Using journal-level metrics to evaluate research quality is not a new phenomenon; the earliest examples of journal quality lists date back to the late 1960s and early 1970s. Journal-level metrics often take the form of lists of scholarly journals that have been ranked against particular criteria. While there is no consensus on how a journal list should be compiled, many lists have been created using methodologies ranging from perceptual and peer-review-based rankings through to objective citation-based rankings.
The use and misuse of journal rankings is well documented in the literature.
Within Australia, and internationally, the academic community is shifting away
from the use of journal metrics to evaluate research. Australian academics were introduced to the ranked journal list as part of the national research evaluation exercise,
Excellence in Research for Australia (ERA). The rankings were considered highly
influential in determining a university’s ERA outcome, so many institutions began to provide incentives for staff to publish in ‘A*’ and ‘A’ ranked journals. The ranked journal list quickly became the most contentious issue of the ERA, and in 2011 the then Minister for Innovation, Industry, Science and Research, Kim Carr, announced that it would
be discontinued because its ‘existence
was focusing ill-informed undesirable behaviour in the management of
research’. In 2010, Australia’s
other major research funding agency, the National Health and Medical Research Council (NHMRC), released a statement saying that
the Journal Impact Factor would no longer be accepted in applications for
funding or used in the peer review of individual applications. The statement went
on to say that the Journal Impact Factor is ‘not a sound basis upon which to
judge the impact of individual papers’.
Internationally, the San
Francisco Declaration on Research Assessment (DORA),
originating from the December 2012 meeting of the American Society for Cell Biology,
put forward a number of recommendations for funding agencies, universities and
researchers regarding the use of metrics for research evaluation. Amongst its other recommendations, DORA aims to halt the use of journal-based metrics in the research evaluation of individual researchers. As of August 2015, the
declaration had over 12,500 individual and 588 institutional signatories.
While there are some compelling reasons to use journal quality lists to evaluate the research performance of academics, including convenience, objectivity and cost savings, there are also disadvantages. These include reduced academic freedom, the promotion of outlet targeting, the driving of research in the direction of publisher preference, and the disadvantaging of specialist journals and specialist fields of research.
Whether we like them or not, journal quality lists have been part of research evaluation for the past 50 years, and their legacy persists today. As the requirement for convenient and cost-effective research evaluation mechanisms increases, it is possible that journal quality lists will continue to play a part in
research evaluation into the future. For examples of journal lists from around
the world, visit www.researchimpact.com.au/viewforum.php?f=20.