Systematic characterization of the response of semiconductor colloidal quantum dots (cQDs) to ionizing radiation must be performed before they can be used in radiation detection. In this study, the robustness of multi-shell (MS) and core/shell (CS) cQDs was investigated under irradiation. Radioluminescence (RL) measurements with kV and MV photon beams revealed a better resistance of MS cQDs to ionizing radiation, with their spectra fluctuating by barely ∼ 1 nm. A systematic signal recovery between subsequent irradiations was noticed for MS cQDs only. A beam energy dependence of the RL stability was detected between kV and MV energies. At the same cumulative dose, the RL signal loss for the kV beams was observed to be ∼6-7% smaller than that of the MV beam, for both types of cQDs. These results demonstrate that MS cQDs are better candidates as ionizing radiation sensors than CS cQDs, especially in the kV energy range.
I have been publishing scientific manuscripts for the past 22 years. My educated comments with regard to the journal impact factor have always been the same (if you do not know what the JIF is, please have a look at this Wikipedia entry). To first order, you should publish in the most important journals for your field. If their JIFs are low, who cares, as long as your work is important to your field and well cited. For example, the scientific discovery of 2012 according to Science (very high JIF) was the publication of the experimental finding of the Higgs boson… in Phys Lett B (low JIF relative to Science)!
Do not forget that, from a historical perspective, we are awfully bad at predicting what the next important discovery down the road will be. A number of fundamental discoveries and early engineering feats were discarded at first. Similarly, there are numerous examples of scientists having had tremendous issues getting game-changing results published, even those who ended up winning Nobel prizes. Kary Mullis's PCR work is one of many examples of work that was rejected by journals with top JIFs, yet applications of this very technique were published in Science and Nature, and the citation counts of these second-generation papers ended up higher than those of the original, award-winning work!
Now, you do not have to agree with this lone scientist's opinion, but you should certainly have a look at the San Francisco Declaration on Research Assessment, or DORA petition, which is supported by the “big boys” (no discrimination intended). The declaration statement is actually a very interesting read: it covers the historical origin of the JIF (which was not meant for evaluating researchers at all) and further calls for dropping journal-based metrics when assessing scientific productivity for funding and promotion. Over 240 organizations and 6000 individuals have already signed the declaration.
In conclusion, do not lose a good night's sleep over your favorite journals' impact factors…
An interesting read at TechCrunch on new forms of dissemination and measurement of scientific impact: Reputation Metrics Startups Aim To Disrupt The Scientific Journal Industry.
In a similar vein, you might want to read the excellent editorial by John R. Adler from Stanford entitled “A New Age of Peer Reviewed Scientific Journals”, published in the open access journal Surgical Neurology International. The manuscript is available on the Cureus blog.
Most of the crackpot papers which are submitted to The Physical Review are rejected, not because it is impossible to understand them, but because it is possible. Those which are impossible to understand are usually published. When the great innovation appears, it will almost certainly be in a muddled, incomplete and confusing form. To the discoverer himself it will be only half-understood; to everybody else it will be a mystery. For any speculation which does not at first glance look crazy, there is no hope.