
Saturday, December 14, 2013

Gender imbalance in scientific research

In a recent article in Nature, researchers used bibliometrics to highlight the gender disparity in publication output within the sciences around the world. The study reveals that female scientists publish a lower volume of papers than their male counterparts, and that their publications have a lower citation impact as well. The authors touch on a number of well-documented imbalances between the genders in the sciences, including funding, earnings, hiring and patenting.

One of the issues the researchers had to work around in their study was how to determine the gender of the authors of journal articles indexed by Thomson Reuters in the Web of Science. They used a combination of sources to match names to genders, including social security databases, Wikipedia and even Facebook (the interesting methodology can be read in their supplementary material).

It would be interesting to see the same data normalised for funding. In many cases funding agencies award more money to male scientists than to female scientists (e.g. the ARC's website reports the number of participants on all funded projects as 2,280 male and 622 female). It would be interesting to know whether female scientists are achieving "more with less" when it comes to publication output relative to funding and opportunity.

While there is no single answer to the problem these researchers describe, they do make a good point about improving the ability of female scientists to travel and collaborate internationally:

"For a country to be scientifically competitive, it needs to maximize its human intellectual capital. Our data suggest that, because collaboration is one of the main drivers of research output and scientific impact, programmes fostering international collaboration for female researchers might help to level the playing field."

Researcher mobility in all fields is a good strategy for any organisation that can afford it - and it is certainly critical for any developed nation's strategic research plan.


http://www.nature.com/news/bibliometrics-global-gender-disparities-in-science-1.14321

Tuesday, November 17, 2009

UCD Bibliometrics Booklet

University College Dublin has released this interesting and useful guide to bibliometrics.

It gives concise explanations of citations, benchmarks, the h-index, journal rankings and Eigenfactors.


Sunday, October 18, 2009

ARC-supported research: the impact of journal publication output 2001-2005

This week the Australian Research Council (ARC) released a report prepared by Bev Biglia and Linda Butler of the Research Evaluation and Policy Project (REPP) on the impact of journal publications from ARC-funded research. The study focuses on the number of citations each paper received relative to the average number of citations a paper in the same field received worldwide in the same period. In other words: are these particular papers cited more often than average? If the papers in a particular field are cited more often than the world average, then you can say that those papers, and that field of research, have had above-average impact. The number of citations a paper receives can be determined through citation data suppliers such as Thomson Reuters' Web of Science or Elsevier's Scopus, and these counts can be benchmarked against Australian or world average citation rates for the same field through various products offered by the same companies.
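
The relative-impact calculation described above is simple enough to sketch. The figures below are invented for illustration; the report itself uses field- and period-matched world averages supplied by the citation data vendors:

```python
def relative_citation_impact(paper_citations, world_average):
    """Ratio of the average citations per paper in a set to the
    world-average citations per paper for the same field and period.
    A value above 1.0 means the set outperforms the world average."""
    mean_citations = sum(paper_citations) / len(paper_citations)
    return mean_citations / world_average

# Five hypothetical papers in a field whose world average is
# 8 citations per paper over the same window:
impact = relative_citation_impact([12, 9, 15, 7, 11], world_average=8.0)
print(round(impact, 2))  # 1.35 - these papers are cited ~35% above average
```

As the post notes, the ratio only becomes meaningful when the paper set is large enough; for a handful of papers a single highly cited outlier can dominate the mean.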

It is encouraging to see that publications resulting from ARC-funded research are generally above world citation rates in almost all fields of research. Similar bibliometrics will be used as one of a suite of quality indicators in the Excellence in Research for Australia (ERA) initiative currently underway in the higher education sector. Universities can measure the impact of their research publications against world benchmarks using the same methodology as this report. In fact, individual researchers can also rate their own output against world benchmarks, provided their publication volume is high enough to make the metrics significant.

The media release and report can be found at the ARC's webpage here: http://www.arc.gov.au/media/releases/media_15Oct09.htm

Tuesday, January 20, 2009

Eigenfactor

Eigenfactor is a web-based tool that ranks research journals. A number of journal rankings are available, including Thomson Reuters' impact factor, the Australian Research Council (ARC) journal ranking list and various discipline-specific ranking lists. Each list attempts to rank journals by their quality within a field of research, and each has its own ranking criteria.

Eigenfactor uses citation metrics to rank journals, weighting citations from specific journals according to their quality. It works in much the same way as Google's PageRank algorithm. Eigenfactor also includes an indicator of the value for money a journal delivers; this feature is unique to Eigenfactor.
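
The PageRank-style idea can be sketched in a few lines. This is a simplified illustration, not the actual Eigenfactor calculation (which also discards self-citations and adjusts for article counts); the journals and citation counts are invented:

```python
def pagerank_style_rank(cites, damping=0.85, iters=100):
    """cites[i][j] = citations from journal i to journal j.
    Returns one score per journal; a citation from a highly
    ranked journal counts for more than one from a lowly ranked
    journal, because weight flows along citation links."""
    n = len(cites)
    out_totals = [sum(row) for row in cites]  # each journal's outgoing citations
    scores = [1.0 / n] * n
    for _ in range(iters):
        new = [(1 - damping) / n] * n
        for i in range(n):
            if out_totals[i] == 0:
                continue  # journal cites nothing; distributes no weight
            for j in range(n):
                # journal i passes its score out in proportion to its citations
                new[j] += damping * scores[i] * cites[i][j] / out_totals[i]
        scores = new
    return scores

journals = ["A", "B", "C"]
cites = [
    [0, 5, 1],  # A cites B heavily
    [2, 0, 1],
    [1, 3, 0],  # C also cites B heavily
]
scores = pagerank_style_rank(cites)
print(max(zip(scores, journals))[1])  # B: it attracts the most weighted citations
```

Raw citation counts would rank journals the same whoever cited them; the iteration above is what lets "quality" citations count for more, which is the distinction the post draws between Eigenfactor and a plain citation tally.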

To see a short article I wrote on the Eigenfactor for Collection Management click here.

To visit the Eigenfactor website and try it for yourself, click here.

Information on the ARC's journal ranking lists can be found at the ARC's ERA website.