Citation analyses, so-called bibliometric methods, allow a quantitative evaluation of scholarly journals (via the impact factor) as well as of the publishing activities of persons and institutions. They play an increasingly large role in the evaluation of research performance in the light of scarce financial resources.
Database of the Institute for Scientific Information (ISI, today part of Clarivate Analytics), which includes the Science Citation Index, published since the 1960s. Primarily represented are disciplines that normally publish in English and in journals. Conference proceedings are not analyzed! For subject areas of the TUHH that publish mainly in German, no concrete statements about the evaluation of publications via Web of Science are possible.
Multidisciplinary abstract and citation database of the publisher Elsevier. As with the competitor database Web of Science, citation analyses as well as further sophisticated search strategies are possible with Scopus.
Google service for scholarly research, which also shows citations, which can be tracked. It includes especially scholarly journals with indexed full texts from commercial publishers. Users without access authorization to the corresponding full text can only view the abstract.
Since 2011 scholars can create a personal profile with the service “Google Scholar Citations” (GSC). Using the freely available software Publish or Perish by Anne-Wil Harzing, Google Scholar can also be used for further citation analyses.
The often used indicator “impact factor” refers to the evaluation of a journal. This “Journal Impact Factor” is calculated by dividing the number of current-year citations to the source items published in that journal during the previous two years by the total number of citable items published in those two years. So an impact factor of 15 for a journal implies that articles of that journal from the previous two years were cited 15 times on average in the current year. Using the database “Journal Citation Reports” you can search for and compare impact factors. An indicator of personal research performance is the so-called h-index (Hirsch index): a scholar has an h-index of h if h of their papers have each been cited at least h times.
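As a sketch, the two indicators can be computed as follows; the citation counts used in the example are invented purely for illustration:

```python
def impact_factor(citations_this_year, items_previous_two_years):
    """Journal Impact Factor: citations received in the current year to
    articles from the previous two years, divided by the number of
    citable items published in those two years."""
    return citations_this_year / items_previous_two_years

def h_index(citation_counts):
    """Hirsch index: the largest h such that the scholar has h papers
    with at least h citations each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # this paper still has at least `rank` citations
        else:
            break
    return h

# Invented example data:
print(impact_factor(citations_this_year=3000, items_previous_two_years=200))  # 15.0
print(h_index([10, 8, 5, 4, 3]))  # 4
```

In the h-index example, four papers have at least four citations each, but not five papers with at least five citations, so the index is 4.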
- A general challenge is that citations are sometimes made without a sound rational basis, and that papers whose results are wrong or later corrected may nevertheless be cited frequently.
- Common subject-specific publishing practices should be kept in mind when interpreting citation results (average number of publications per scholar, the practice of publishing mainly via conference proceedings, the language of publication).
Links to further information
Contact: Thomas Hapke
Concerning the quantitative evaluation of publications, the following statement by Per O. Seglen at a conference on peer review hosted by the German Max Planck Society may be valid:
“So, the take-home-lesson of our exercise in bibliometric evaluation is that no matter how considerate and extensive an evaluation is, it will be implemented only to the extent that is in consonance with the prevailing power structure.”
(Seglen, Per O.: Bibliometric analysis – what can it tell us? pp. 139-151. In: Science between evaluation and innovation: a conference on peer review. Ringberg-Symposium 2002. München: Max-Planck-Gesellschaft, 2003.)