Impact Factor Spam

I received the following unsolicited email (slightly curtailed) from the Royal Society of Chemistry:

Dear Dr Murray Rust

Quality is the focus at RSC Publishing: the recently published 2010 Journal Citation Reports ® prove that our quality is better than ever. And that is thanks to our authors and referees.

Our average impact factor (IF) now stands at 5.5. It's an impressive figure, especially when compared with the average for a chemistry journal* of 2.54.

But if you're thinking that there's nothing special about this, as most chemistry publishers are celebrating an overall rise in their impact factors, think again. RSC Publishing figures have risen by 63% since 2003 – almost double the average rise.

Of the top 20 journals in the multidisciplinary chemistry category, six are from RSC Publishing. No other publisher has more.

83% of our journals listed in this year's report have an IF above 3. No other publisher can boast such a large proportion of titles at this level, demonstrating just how well-cited our entire portfolio truly is.

*Data based on 2010 Journal Citation Reports® (Thomson Reuters, 2011).

I have two concerns – one with the impact factor (see below) and one with the RSC's use of bulk unsolicited email (SPAM). Dealing with the second first:

A European Directive makes it clear that the RSC's activity is illegal:

One of the key points of this legislation is that it is unlawful to send direct marketing to someone who has not specifically granted permission (via an opt-in agreement). Organisations cannot merely add people's details to their marketing database and offer an opt-out after they have started sending direct marketing. For this reason the regulations offer consumers more protection from direct marketing.

I will be interested to hear from them why they have broken this directive and why I should not report them. I am not on any of their mailing lists, and this type of mail wastes my time and fills up my mailbox. Even if it turns out that there is a legal loophole, it is unethical to waste scientists' time in this manner. But it was the RSC itself which opined that Open Access publishing was "ethically flawed" – have they ever formally retracted that opinion?

The main issue, however, is more general – the growing and mindless use of Impact Factors as some measure of "quality". There are many reasons why IFs are frequently meaningless. Bjorn Brembs at #okcon2011 gave us a presentation showing how IFs are fundamentally flawed and how publishers can negotiate to get them adjusted favourably (this is an old slideshow, and if Bjorn reads this maybe he can update anything). Objectively I see the following:

  • There is no objective definition of what a citation is. As far as I can see it's a mixture of what the closed commercial indexing organization thinks it is and what the negotiating publisher can argue for. If we are going down the mindless citation route then at least we need Open Citations. But if we extract lists of bibliographic references (citations) from publications then we will be sued by the publishers. So citations are whatever the powerful forces in the publishing industry want them to be.
  • IFs are per journal. This is about as meaningful a measure of worth as deciding that a person is well-dressed because they shop at a given store. You can be badly dressed in expensive clothes and vice versa. And the worth of academic publications is about as hard to measure as style. It's what we collectively think, not what we write in citation lists. The journal is an outdated concept in the current world – it exists only to brand publications (its use as a collection for disseminating a subject is disappearing). It's like saying "X is a good blogger because their blog is hosted by Y and lots of people read Y". No, people say "X writes good blog posts". There's enough technology in the world that we could have per-author metrics, but it won't suit the publishers, because then we would evaluate science by the worth of individuals rather than the strength of the marketing department of a money-making institution. And that's anathema to the publishers.

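For concreteness, the standard two-year journal impact factor is nothing more than a ratio, and both its numerator and denominator depend on what the indexer chooses to count. A minimal sketch (the numbers are hypothetical, purely for illustration):

```python
def impact_factor(citations, citable_items):
    """Two-year Journal Impact Factor: citations received this year to
    items the journal published in the previous two years, divided by
    the number of 'citable items' from those two years. What counts as
    a citation, and as 'citable', is decided by the indexer -- which is
    exactly the negotiability complained about above."""
    return citations / citable_items

# Hypothetical figures: 1100 citations to 200 citable items gives an IF of 5.5
print(impact_factor(1100, 200))  # 5.5
```

Shrinking the denominator (by reclassifying editorials or news items as "non-citable") raises the IF just as effectively as attracting more citations, which is one of the levers available in negotiation.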
The sad thing is that young people have now been terrified by the Impact and H factors, and I can't give them much hope. When I published my first paper in 1967 (J. Chem. Soc. (now the RSC), Chemical Communications) I did it because I had a piece of science I was excited about and wanted to tell the world about. That ethos has gone. It's now "I have to publish X first-author papers in Y journals with impact factors greater than Z".

I can't see how to change that other than by disruptive action in the publishing world. When I have fully worked out what that is I will start doing it and persuading other people to do it. Hopefully it will be legal. If not I shall be prepared to take the consequences.



15 Responses to Impact Factor Spam

  1. "It’s now “I have to publish X first-author papers in Y journals with impact factors greater than Z”."

    This is not just the young scientists. This is science. Departments get funding based on the JIF; universities rank candidates (at any level) using the JIF.

    I am foolish enough to publish in a journal that has no JIF. I have asked them to add the journal quickly, because it not being included in Web of Science lowers my chances of getting a job. Really. This is Science.

    If we want to change this, we have to do it at the source. Instead of big funding agencies focusing on a glamorous new journal, they should focus on fixing these kinds of problems in science.

    • The misused 'them' where I requested the journal to be included is of course Thomson Reuters, the owners of Web of Science.

      That reminds me... Open Bibliography must be used to set up an "Open Network of Science" to replace the proprietary Web of Science, using CiTO and taking citations seriously. That, too, is something big funders can focus on if they are serious about improving Science.

      • pm286 says:

        This is the whole idea of Open Scholarship: to create first a spine for bibliography and then branches for citations, annotations, etc.

  2. Will try to update the slides within the next two weeks.

  3. Ben O'Steen says:

    Just to add some data to your citation index vs corpus point: in “The rate of growth in scientific publication and the decline in coverage provided by Science Citation Index” (Peder Olesen Larsen and Markus von Ins, Scientometrics, September 2010; 84(3): 575–603) there is a lot of interesting data.

    I used some of it in writing about Open Citation Data, but the key parts for me are:

    - Published corpus (in STM/related) doubles every 18 years on average.
    - Citation indexes (root of many IFs) are growing at a markedly lower rate.
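The two bullet points above can be made concrete with a little arithmetic: a corpus that doubles every 18 years grows at about 3.9% per year, so an index growing more slowly than that must cover an ever-shrinking fraction of the literature. A minimal sketch (the 1.5% index growth rate below is an illustrative assumption, not a figure from the paper):

```python
# Convert a doubling time T (years) into an annual growth rate: 2**(1/T) - 1.
corpus_growth = 2 ** (1 / 18) - 1   # corpus doubles every 18 years, ~3.9%/yr

# Hypothetical slower growth rate for a citation index, for illustration only.
index_growth = 0.015

# Relative coverage after 20 years if both rates held: index size / corpus size.
coverage = (1 + index_growth) ** 20 / (1 + corpus_growth) ** 20
print(f"corpus grows {corpus_growth:.1%}/yr; relative coverage after 20 years: {coverage:.2f}")
```

Under these assumed rates the index would track only about 60% as much of the literature after two decades, which is the "decline in coverage" of the paper's title.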

  4. Henry Rzepa says:

    It's not just citations that are getting out of control. We recently published in Angew Chemie. Just before the article came out, they sent us an email asking if we minded being highlighted on the inside front cover. In effect, the article is pre-selected to "appear before it's been published" (whatever that means!). All pretty innocuous, you might think?

    Well no, because such "special selection" has become highly regarded by heads of department etc. At reviews, they might even ask you "how many articles have you had highlighted recently?" (as opposed to "done any good science recently?").

    So now, superseding even the referees is the "special selection or highlight", often made by journal editors or sub-editors (perhaps on the basis of the science, but possibly because, visually, it might make for a good cover, or a controversial item for a blog). A particularly snappy title can also do it, such as "Illusory Molecular Expression of Penrose Stairs by an Aromatic Hydrocarbon", also, as it happens, published in a Wiley journal. I pick this because it does sound intriguing, does it not? In fact, it boils down to a dissymmetric molecule with D2 symmetry (which, IMHO, is not great new science). And it's not easy to get published in Angew!

    Like the rest of the world, science now is very much in part about how it's spun, and not about the impact it has (or will have) down the line!

    • pm286 says:

      Thanks Henry,
      Yes - there is lots of spin going on. OK, it's probably always been there but not pushed by commercial interests. If scientists wanted to promote themselves at least they did it and not some journal apparatchik.

      Things are getting worse.

  5. Elizabeth Brown says:

    It's not just the RSC: Elsevier, AIP, AAAS and others have sent similar announcements regarding increases in Impact Factor rankings in the last couple of weeks. This is the first time I remember getting so many at the same time. What's frustrating to me as a librarian is that the impact factor is not a primary criterion in deciding whether to purchase (or cancel) a journal; the scope, price and other factors (editorial presence, etc.) are more important.

  6. Tim says:

    Competition between professional journals has become very obvious and unhealthy. Many journals publicize their impact factors to show how much better they are now than before. The irony is that most of the journals that send out emails citing their impact factor are reputable ones. Journals should be focusing on quality rather than on increasing their impact factor by a few points. An interesting read is a two-part article:

  7. Pedro S says:

    Even more egregious is the shameless self-promotion by authors and journals: an editor of J Biomol Struct Dyn has had one of his papers in that journal cited 34 times, of which only three are (at first sight) bona fide (21 citations in the same journal, plus 11 citations by the author him/herself). This journal's IF has increased 5-fold in the last year, mostly because of a HUGE increase in self-citation. However, science funding organs still rely on metrics as flawed as the IF :-(
