If a tree falls in a forest and no one is around to hear it, does it make a sound?*

Norvin Green State Forest in New Jersey. Photo from Wikipedia


[We want to know what you think! Please click on the link at the bottom of the post to complete a short survey and/or share your thoughts about the publishing process in the comments section below]
For better or worse, as scientists, our success depends not only on the merit of our work at face value, but also on where we publish and how our work is received after it is published. If a scientist discovers something new and exciting but no one cites the paper, does it matter?
Although journal impact factors** have been shown to be a poor predictor of the success of any given paper (Wang et al. 2013), scientists often strive to publish their work in the most prestigious journal possible in the hopes that their papers will reach a wide audience and be highly cited. Furthermore, because universities and agencies consider journal impact factors when making hiring, promotion, and funding decisions (Adam 2002, Smith 2006), the quality of the journals in which you publish matters in addition to the quantity of papers you publish.
Another element scientists must consider when choosing where to send a paper is the length of time between manuscript submission and publication. Publications are our currency, and as an early career scientist working to build my CV, I appreciate the challenge of choosing between a solid journal likely to accept a manuscript quickly and a more prestigious journal, where a rejection could delay publication (and citations!). Needless to say, deciding where to submit a manuscript is an important decision.
In a recent PLOS ONE paper, Salinas and Munch (2015) present a framework that uses a Markov decision process to determine to which journal an author should submit a manuscript:

We derive the optimal submission sequence for a scientist attempting to maximize the expected number of citations obtained over some finite period [model 1]. Recognizing that revising and resubmitting a manuscript multiple times is both time-consuming and demoralizing, in our second approach we solve the dual-objective problem posed by balancing the trade-off between the expected number of citations and either the expected number of revisions or the time to publication [model 2].

Some of the parameters in the models include the expected number of citations a manuscript receives over a finite time interval ending at time T (where T might be, for example, the time to tenure), the acceptance rate of each journal, the expected number of citations for a paper in that journal, the time in days from submission to publication in the journal, the revision time, and my favorite, the probability of getting scooped.
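To make the first model concrete, here is a minimal sketch in Python of how an expected-citation calculation along these lines could work. To be clear, this is not the authors' code: the journal names, acceptance rates, review times, and citation rates below are invented for illustration, whereas Salinas and Munch estimated these quantities from real journal data.

```python
# Toy version of the expected-citation calculation behind model 1.
# All numbers are made up for illustration.

T = 365 * 10      # citation horizon in days (e.g., time to tenure)
t_revision = 30   # days spent revising between submissions
s = 0.002         # per-submission probability of getting scooped

# journal: (acceptance probability, days from submission to publication,
#           expected citations per day once published)
journals = {
    "Journal A": (0.20, 150, 0.020),  # selective, slow, high-impact
    "Journal B": (0.50, 100, 0.010),
    "Journal C": (0.70, 60, 0.005),   # safe, fast, lower-impact
}

def expected_citations(sequence, horizon=T):
    """Expected citations for a fixed submission sequence.

    Walk down the sequence: each journal accepts with probability p,
    and an accepted paper accrues citations from publication until the
    horizon. A rejection costs review plus revision time and passes the
    manuscript to the next journal; getting scooped zeroes what remains.
    """
    total, p_reach, elapsed = 0.0, 1.0, 0.0
    for name in sequence:
        p_accept, t_review, cites_per_day = journals[name]
        p_reach *= 1 - s                            # survive being scooped
        days_cited = max(horizon - (elapsed + t_review), 0)
        total += p_reach * p_accept * cites_per_day * days_cited
        p_reach *= 1 - p_accept                     # continue only if rejected
        elapsed += t_review + t_revision
    return total

print(expected_citations(["Journal A", "Journal B", "Journal C"]))
print(expected_citations(["Journal C"]))
```

Whether aiming high first beats going straight to a "safe" journal depends entirely on the parameter values; finding the submission sequence that maximizes this expectation is exactly the job of the Markov decision process.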
Among the 61 ecology journals considered, Ecology Letters was the optimal journal for maximizing citations over all time periods (T), although changing T (i.e., increasing or decreasing the time allowed for accruing citations) changed the optimal submission ranking (see Figure 1 in Salinas and Munch). When T was fixed at 10 years (see Figure 2), Molecular Ecology Resources was the top-ranked journal for most values of s, the probability of getting scooped, followed by PLOS ONE and Ecology Letters (although I wonder how useful this information is, given that Molecular Ecology Resources is a more specialized journal that doesn't publish regular empirical research, focusing instead on new tools and methods).
Salinas and Munch also evaluated the trade-off between the expected number of citations and 1) the expected number of times the manuscript will be submitted before acceptance and 2) the length of time in review before acceptance. Simulations showed that submissions to Ecology Letters obtained the highest number of citations (Figures 3 and 4 in Salinas and Munch, recreated below). However, if authors are willing to sacrifice 4 to 14 citations by submitting to PLOS ONE instead, they can save 0.5 to 1.5 resubmissions (left panel) and between 30 and 150 days in review (right panel); a toy sketch of how such a frontier arises follows the figure below.


Left: 3,200,000 different submission strategies (each grey dot) are evaluated in terms of expected number of citations (over 5 years) and number of submissions needed before acceptance (s = 0.002, tR = 30 days). Highlighted are the top journals for citation-maximizing strategies that minimize resubmissions (efficiency frontier). Ecology Letters dominates the high expected number of citations area, while PLOS ONE is the clear optimal choice at intermediate citations. Right: Expected number of citations (over 5 years) and time spent in review for 3,200,000 different submission strategies (s = 0.002, tR = 30 days). Highlighted are the top journals for citation-maximizing strategies that minimize time spent in review. Figures and captions reproduced from Salinas and Munch 2015.
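To give a feel for where an efficiency frontier like this comes from, here is a sketch that continues the toy example above (it reuses the journals dictionary and expected_citations function from the earlier snippet). It scores every ordering of the example journals on both objectives and keeps only the strategies that no other strategy beats on both axes; again, this illustrates the idea rather than reproducing the authors' 3,200,000-strategy computation.

```python
from itertools import permutations

def expected_time_in_review(sequence):
    """Expected days in review, summed over submissions actually made."""
    total, p_reach = 0.0, 1.0
    for name in sequence:
        p_accept, t_review, _ = journals[name]
        total += p_reach * t_review  # each submission reached costs its review time
        p_reach *= 1 - p_accept      # we only continue if rejected
    return total

# Treat every ordered subset of the example journals as a strategy.
strategies = []
for r in range(1, len(journals) + 1):
    for seq in permutations(journals, r):
        strategies.append(
            (expected_citations(seq), expected_time_in_review(seq), seq)
        )

# Efficiency frontier: keep a strategy only if no other strategy offers
# both more expected citations and less expected time in review.
frontier = [a for a in strategies
            if not any(b[0] > a[0] and b[1] < a[1] for b in strategies)]

for cites, days, seq in sorted(frontier, reverse=True):
    print(f"{cites:7.1f} citations | {days:6.1f} days in review | {' -> '.join(seq)}")
```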


There are some limitations to keep in mind when interpreting the results of Salinas and Munch. For example, the authors received submission and publication data from only 61 of the 131 ecology journals they contacted, so not all journals are represented in the analysis. The authors also assumed, for simplicity's sake, that time to publication, acceptance rate, and the rate of being scooped were consistent across journals, and that impact factor correctly predicts the expected number of citations of a paper. Independent peer review options like Axios and Peerage of Science were not considered in the model, nor was the price per article. Cost is an important consideration for authors, whether they are deciding to submit to a journal that charges for color figures or to an open access journal that charges a publication fee. For a great in-depth perspective on the cost of open access publishing, check out Tim Vines' TME post "Why is science publishing so damn expensive?"
We want to know how you decide where to send a paper. Would you be inclined to use an objective, model-based method such as the one outlined by Salinas and Munch? Or is your decision about where to submit a paper based more on a gut feeling? Or on the advice of advisors or colleagues? Follow this link to complete a short survey and/or tell us your thoughts about the publishing process in the comments section!
LINK TO SURVEY
*I’ve always thought of this as a philosophical question, but apparently you can view it through a scientific lens as well (which is great because I love science!). A not-so-recent issue of Scientific American addressed the question, concluding that “sound is vibration, transmitted to our senses through the mechanism of the ear, and recognized as sound only at our nerve centers. The falling of the tree or any other disturbance will produce vibration of the air. If there be no ears to hear, there will be no sound.” – Scientific American (1884) April 5, page 218
**The impact factor of a journal is a metric reflecting the average number of citations to recent articles published in that journal. For example, a journal's 2015 impact factor is the number of 2015 citations to the articles it published in 2013 and 2014, divided by the number of articles it published in those two years.
References:
Salinas, S., & Munch, S. B. (2015) Where should I send it? Optimizing the submission decision process. PLOS ONE, 10(1), e0115451. DOI: 10.1371/journal.pone.0115451
Wang, D., Song, C., & Barabási, A. L. (2013). Quantifying long-term scientific impact. Science, 342(6154), 127-132. DOI: 10.1126/science.1237825
Adam, D. (2002) Citation analysis: The counting house. Nature, 415(6873), 726-729. DOI: 10.1038/415726a
Smith, R. (2006) Commentary: The power of the unrelenting impact factor—Is it a force for good or harm? International Journal of Epidemiology, 35(5), 1129-1130. DOI: 10.1093/ije/dyl191
 
