Blog Archives

The tyranny of the top science journals?

Are top journals such as Science and Nature according themselves so much importance that they are distorting the scientific process? A Nobel prize winner seems to think so (via The Guardian).

Journal Impact Factor: why you should not care… too much

I have been publishing scientific manuscripts for the past 22 years. My educated comment with regard to the journal impact factor has always been the same (if you do not know what the JIF is, please have a look at this Wikipedia entry). To a first order, you should publish in the most important journals for your field. If their JIFs are low, who cares, as long as your work is important to your field and well cited? For example, the scientific discovery of 2012 according to Science (very high JIF) was the experimental finding of the Higgs boson… published in Phys Lett B (low JIF relative to Science)!

Do not forget that, from a historical perspective, we are awfully bad at predicting what the next important discovery down the road will be. A number of fundamental discoveries and early engineering feats were dismissed at first. Similarly, there are numerous examples of scientists having had tremendous trouble getting game-changing results published, including some who went on to win Nobel prizes. Kary Mullis's PCR work is one of many examples: it was rejected by journals with top JIFs, yet applications of that very technique were later published in Science and Nature, and those second-generation papers ended up with higher citation counts than the original, award-winning work!

Now, you do not have to agree with this lone scientist's opinion, but you should certainly have a look at the San Francisco Declaration on Research Assessment (DORA) petition, which is supported by the “big boys” (no discrimination intended). The declaration itself is actually a very interesting read: it covers the historical origin of the JIF (which was never meant for evaluating researchers) and further calls for dropping journal-based metrics when assessing scientific productivity for funding and promotion. Over 240 organizations and 6,000 individuals have already signed the declaration.

In conclusion, do not lose a good night's sleep over your favorite journals' impact factors…

Mendeley is joining Elsevier…

Mendeley is a serious option for those looking for a PDF management (and in-text citation) system for the scientific literature. It is a rather good alternative to the old-timer EndNote.

Remember that EndNote is a product of Thomson Reuters; you know, the company behind Web of Science, the Impact Factor and so on.

Well, Mendeley is now part of the Elsevier family, another major player. It will be interesting to see where this leads Mendeley in the longer term.

Read the link here: Team Mendeley is joining Elsevier. Good things are about to happen! | Mendeley Blog.

Exit impact factor and h-index, welcome real-time reputation metrics?

An interesting read at TechCrunch on new forms of dissemination and measurement of scientific impact: Reputation Metrics Startups Aim To Disrupt The Scientific Journal Industry.

In a similar vein, you might want to read the excellent editorial by John R. Adler from Stanford entitled “A New Age of Peer Reviewed Scientific Journals”, published in the open-access journal Surgical Neurology International. The manuscript is also available on the Cureus blog.


Most of the crackpot papers which are submitted to The Physical Review are rejected, not because it is impossible to understand them, but because it is possible. Those which are impossible to understand are usually published. When the great innovation appears, it will almost certainly be in a muddled, incomplete and confusing form. To the discoverer himself it will be only half-understood; to everybody else it will be a mystery. For any speculation which does not at first glance look crazy, there is no hope.
(Freeman Dyson, “Innovation in Physics”, Scientific American, 1958)