Impact factor
The impact factor, often abbreviated IF, is a measure reflecting the average number of citations to articles published in science and social science journals. It is frequently used as a proxy for the relative importance of a journal within its field; journals with higher impact factors are deemed more important than those with lower ones. The impact factor was devised by Eugene Garfield, the founder of the Institute for Scientific Information (ISI), now part of Thomson Reuters. Impact factors are calculated yearly for journals indexed in Thomson Reuters' Journal Citation Reports.

Calculation

In a given year, the impact factor of a journal is the average number of citations to those papers that were published during the two preceding years. For example, the 2003 impact factor of a journal would be calculated as follows:

A = the number of times articles published in 2001 and 2002 were cited by indexed journals during 2003
B = the total number of "citable items" published in 2001 and 2002. ("Citable items" are usually articles, reviews, proceedings, or notes; not editorials or Letters-to-the-Editor.)


2003 impact factor = A/B


(Note that 2003 impact factors are actually published in 2004; they cannot be calculated until all of the 2003 publications have been received by the indexing agency.)
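A minimal sketch of the two-year calculation in Python, using hypothetical counts (the figures below are illustrative, not real journal data):

    # Hypothetical counts for one journal, illustrating the 2003 impact factor.
    citations_2003_to_2001_2002_items = 210   # A: citations made during 2003 to items from 2001-2002
    citable_items_2001 = 120                  # articles, reviews, proceedings, notes
    citable_items_2002 = 130

    A = citations_2003_to_2001_2002_items
    B = citable_items_2001 + citable_items_2002

    impact_factor_2003 = A / B
    print(f"2003 impact factor = {impact_factor_2003:.2f}")   # 210 / 250 = 0.84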

New journals, which are indexed from their first published issue, receive an impact factor after two years of indexing; in this case, the citations to, and the number of articles published in, the year prior to Volume 1 are known to be zero. Journals that are indexed starting with a volume other than the first do not get an impact factor until they have been indexed for three years. Annuals and other irregular publications will sometimes publish no items in a particular year, which affects the count. The impact factor relates to a specific time period; it is possible to calculate it for any desired period, and the Journal Citation Reports (JCR) also includes a 5-year impact factor. The JCR shows rankings of journals by impact factor, optionally by discipline, such as organic chemistry or psychiatry.

Use

The IF is used to compare different journals within a certain field. The Web of Knowledge indexes 9000 science and social science journals from 60 countries and the results are widely (though not freely) available. In addition, the IF is an objective measure.

Criticisms

Numerous criticisms have been made of the use of an impact factor. Besides the more general debate on the usefulness of citation metrics, criticisms mainly concern the validity of the impact factor, possible manipulation, and its misuse.

Validity

  • The impact factor could not be reproduced in an independent audit (but see Thomson Scientific's reply).
  • The impact factor is based on the arithmetic mean number of citations per paper, yet citation counts do not follow a normal distribution; they follow a Bradford distribution, as predicted by theory. Being an arithmetic mean, the impact factor is therefore not a valid representation of this skewed distribution and is unfit for citation evaluation.
  • In the short term, especially in the case of low-impact-factor journals, many of the citations to a certain article are made in papers written by the author(s) of the original article, so counting citations may be independent of the real “impact” of the work among investigators. Garfield, however, maintains that this phenomenon hardly influences a journal's impact factor, and a study of author self-citations in the diabetes literature found that the frequency of author self-citation was not associated with the quality of publications. Similarly, journal self-citation is common in journals dealing with specialized topics that have high overlap in readership and authors, and is not necessarily a sign of low quality or manipulation.


Manipulation

A journal can adopt editorial policies that increase its impact factor. These editorial policies may not solely involve improving the quality of published scientific work.
  • Journals may publish a larger percentage of review articles, which are generally cited more often than research reports. Review articles can therefore raise a journal's impact factor, and review journals often have the highest impact factors in their respective fields.
  • Journals may change the fraction of "citable items" compared to front matter in the denominator of the IF equation. Which types of articles are considered "citable" is largely a matter of negotiation between journals and Thomson Scientific, and impact factor variations of more than 300% have been observed as a result of such negotiations. For instance, editorials in a journal are not considered citable items and therefore do not enter into the denominator of the impact factor; citations to such items, however, still enter into the numerator, thereby inflating the impact factor. In addition, if such items cite other articles (often from the same journal), those citations are counted and increase the citation count for the cited journal. This effect is hard to evaluate, because the distinction between editorial comment and short original articles is not always obvious; "letters to the editor" might fall into either class. A brief numerical sketch of the front-matter effect follows this list.
  • Several methods, not necessarily with nefarious intent, exist for a journal to cite articles in the same journal which will increase the journal's impact factor.
  • In 2007 a specialist journal with an impact factor of 0.66 published an editorial that cited all of its articles from 2005 to 2006 in protest against the absurd use of the impact factor. The large number of citations meant that the journal's impact factor increased to 1.44. As a result, the journal was not included in the 2008 Journal Citation Reports.
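As a rough numerical sketch of the front-matter effect described above (all counts are hypothetical), the same citation totals yield a noticeably higher impact factor once editorials are excluded from the denominator while citations to them remain in the numerator:

    # Hypothetical one-year counts for a single journal.
    citations_to_citable_items = 200
    citations_to_editorials = 50    # still counted in the numerator
    citable_items = 250
    editorials = 40                 # excluded from the denominator

    # If editorials were counted as citable items:
    if_counting_editorials = (citations_to_citable_items + citations_to_editorials) / (citable_items + editorials)
    # Under the actual counting rules (editorials excluded from the denominator):
    if_excluding_editorials = (citations_to_citable_items + citations_to_editorials) / citable_items

    print(round(if_counting_editorials, 2))   # 0.86
    print(round(if_excluding_editorials, 2))  # 1.0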


Misuse

  • The impact factor is often misused to evaluate the importance of an individual publication or of an individual researcher. This does not work well, because a small number of publications are cited much more than the majority; for example, about 90% of Nature's 2004 impact factor was based on only a quarter of its publications, so the importance of any one publication will be different from, and in most cases less than, the journal-wide average. The impact factor averages over all articles and thus underestimates the citations of the most cited articles while exaggerating the number of citations of the majority of articles (a small simulation of this skew follows this list). Consequently, the Higher Education Funding Council for England was urged by the House of Commons Science and Technology Select Committee to remind Research Assessment Exercise panels that they are obliged to assess the quality of the content of individual articles, not the reputation of the journal in which they are published.
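The skew mentioned above can be illustrated with a small sketch using made-up citation counts for a single journal; the arithmetic mean (what the impact factor reports) sits well above what a typical article actually receives:

    # Hypothetical citation counts for 20 articles published by one journal.
    citations = [0, 0, 1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 5, 5, 6, 8, 40, 60, 90]

    mean = sum(citations) / len(citations)           # what the impact factor reflects
    median = sorted(citations)[len(citations) // 2]  # a "typical" article

    top_quarter = sorted(citations, reverse=True)[:len(citations) // 4]
    share_from_top_quarter = sum(top_quarter) / sum(citations)

    print(f"mean = {mean:.1f}, median = {median}")                                     # mean = 12.0, median = 3
    print(f"top 25% of articles supply {share_from_top_quarter:.0%} of all citations") # 85%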


Other measures of impact

Related indices

Some related values, also calculated and published by the same organization, are:
  • the immediacy index: the number of citations the articles in a journal receive in a given year, divided by the number of articles published in that same year (a small sketch follows this list).
  • the cited half-life: the median age of the articles that were cited in Journal Citation Reports each year. For example, if a journal's cited half-life in 2005 is 5, then half of the citations received by that journal in 2005 are to articles published in 2001-2005, and the other half are to older articles.
  • the aggregate impact factor for a subject category: it is calculated taking into account the number of citations to all journals in the subject category and the number of articles from all the journals in the subject category.
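A minimal sketch of the immediacy index, again with hypothetical counts:

    # Hypothetical 2005 figures for one journal.
    citations_in_2005_to_2005_articles = 30
    articles_published_in_2005 = 120

    immediacy_index_2005 = citations_in_2005_to_2005_articles / articles_published_in_2005
    print(round(immediacy_index_2005, 2))   # 0.25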


These measures apply only to journals, not individual articles or individual scientists (unlike the H-index). The relative number of citations an individual article receives is better viewed as citation impact.

It is, however, possible to measure the impact factor of the journals in which a particular person has published articles. This use is widespread but controversial. Garfield warns about the "misuse in evaluating individuals" because there is "a wide variation from article to article within a single journal". Impact factors have a large, but controversial, influence on the way published scientific research is perceived and evaluated.

PageRank algorithm

In 1976 a recursive impact factor, which gives citations from journals with high impact greater weight than citations from low-impact journals, was proposed. Such a recursive impact factor resembles the PageRank algorithm of the Google search engine, though the original Pinski and Narin paper uses a "trade balance" approach in which journals score highest when they are often cited but rarely cite other journals. A number of subsequent authors have proposed related approaches to ranking scholarly journals. In 2006, Johan Bollen, Marko A. Rodriguez, and Herbert Van de Sompel also proposed using the PageRank algorithm. From their paper:

Rank | ISI Impact Factor                      | PageRank                                       | Combined
1    | 52.28 ANNU REV IMMUNOL                 | 16.78 Nature                                   | 51.97 Nature
2    | 37.65 ANNU REV BIOCHEM                 | 16.39 Journal of Biological Chemistry          | 48.78 Science
3    | 36.83 PHYSIOL REV                      | 16.38 Science                                  | 19.84 New England Journal of Medicine
4    | 35.04 NAT REV MOL CELL BIO             | 14.49 PNAS                                     | 15.34 Cell
5    | 34.83 New England Journal of Medicine  | 8.41 PHYS REV LETT                             | 14.88 PNAS
6    | 30.98 Nature                           | 5.76 Cell                                      | 10.62 Journal of Biological Chemistry
7    | 30.55 Nature Medicine                  | 5.70 New England Journal of Medicine           | 8.49 JAMA
8    | 29.78 Science                          | 4.67 Journal of the American Chemical Society  | 7.78 The Lancet
9    | 28.18 NAT IMMUNOL                      | 4.46 J IMMUNOL                                 | 7.56 NAT GENET
10   | 28.17 REV MOD PHYS                     | 4.28 APPL PHYS LETT                            | 6.53 Nature Medicine


The table shows the top 10 journals by ISI Impact Factor, PageRank, and a modified system that combines the two (based on 2003 data). Nature and Science are generally regarded as the most prestigious journals, and in the combined system they come out on top.
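The recursive idea behind such PageRank-style rankings can be sketched on a toy citation matrix; the journals, counts, and damping factor below are hypothetical illustrations of the general approach, not the exact method of Pinski and Narin or of Bollen et al.:

    # Toy recursive journal ranking: a citation from a highly ranked journal counts more.
    # cites[i][j] = number of citations journal j receives from journal i (made-up data).
    journals = ["Journal A", "Journal B", "Journal C"]
    cites = [
        [0, 30, 10],   # citations given by Journal A
        [20, 0, 5],    # citations given by Journal B
        [40, 15, 0],   # citations given by Journal C
    ]

    d = 0.85                      # damping factor, as in PageRank
    n = len(journals)
    rank = [1.0 / n] * n          # start with equal weight for every journal

    for _ in range(100):          # iterate until the ranking settles
        new_rank = []
        for j in range(n):
            incoming = 0.0
            for i in range(n):
                out_total = sum(cites[i])
                if out_total:
                    # journal i passes its rank to j in proportion to its citations to j
                    incoming += rank[i] * cites[i][j] / out_total
            new_rank.append((1 - d) / n + d * incoming)
        rank = new_rank

    for name, score in sorted(zip(journals, rank), key=lambda pair: -pair[1]):
        print(f"{name}: {score:.3f}")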

The Eigenfactor is another PageRank-type measure of journal influence, with rankings freely available online.

See also

  • H-index, a comparable measure for individual scientists rather than journals.
  • PageRank, the algorithm used by Google, based on similar principles.
  • Eigenfactor, another journal citation ranking method.
  • SCImago Journal Rank, an open access journal metric which is based on Scopus data and uses an algorithm similar to PageRank.


