Eeny, meeny, miny, mo: Deciding where to submit a manuscript for publication

I am one of the instructors for the MyResearch graduate seminar series, which just ended for the autumn semester; it will be offered again in the winter. Scholarly publishing is one of the topics we discuss in MyResearch, including which factors to consider when choosing the best journal for a manuscript.

I use the following multi-step approach for selecting a journal:

1- With the database search results I exported to EndNote, I sort the relevant references in my EndNote Library by the “Journal” column and pick out a few journals (usually 3-5) that published the most articles on my topic.  You can also click on “analyze results” in Web of Science and Scopus to see the list of journals in your results set, listed from the ones with the most articles to the least.

2- I visit each journal’s website, where I read about the scope of the journal and look at the instructions to the authors.  This helps me determine whether my article would be considered for publication by the journal.  If yes, I move on to the next step.

3- I look up the Journal Impact Factor in Journal Citation Reports (you can also click on a journal name in a Web of Science result to view its Journal Impact Factor). I then use the “Compare sources” option in Scopus to search for each journal, which lets me look at the journal’s SJR (SCImago Journal Rank) score, its SNIP (Source Normalized Impact per Paper, a measure of citation potential) score, its % not cited, and its % reviews. Review articles usually generate more citations, so a journal that publishes many reviews may have inflated SJR and Journal Impact Factor scores. These metrics are all calculated differently, but viewed together they tell me which journal has the higher impact, based on more than one criterion.

4- If the journals I’m comparing rank about the same, e.g., both fall within the top 25-35% of highly cited journals, I consider other factors, such as:

i) How long will it take before I receive a decision about my manuscript, and what is the journal’s manuscript acceptance rate? (Look at the journal’s website for this information. If you’re short on time to meet a deadline that requires the work to be published, this factor might be very important because you want to give yourself enough time to resubmit to a second journal if the first journal rejects your manuscript.)

ii) In which databases is this journal indexed? (The more databases that index the journal, the better the odds of someone finding a reference to your article and citing it.)

iii) Will the journal allow me to make at least a post-print (the final text of the manuscript, incorporating changes from the peer-review process) freely available on the web within 12 months of publication? (This lets readers without a personal or institutional subscription to the journal read your article without paying for it, and it satisfies the Tri-Agency Open Access Policy on Publications for research funded by SSHRC, NSERC, and CIHR. Search for the journal in Sherpa Romeo or check the journal’s website for this information.)

5- By this stage, I have usually narrowed my list down to two journals, so I turn to personal preference: which of the two do I read or use the most, and which would make me proudest if I received a letter of acceptance?
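Step 1 can even be automated outside EndNote. As a minimal sketch (assuming you export your database search results in RIS format, where tags such as JF, JO, or T2 carry the journal name), the following Python snippet tallies how often each journal appears in the export:

```python
from collections import Counter

# RIS tags that commonly hold the journal name
# (JF = full name, JO = abbreviation, T2 = secondary title).
JOURNAL_TAGS = ("JF  - ", "JO  - ", "T2  - ")

def top_journals(ris_text, n=5):
    """Tally journal names in a RIS export and return the n most frequent."""
    counts = Counter()
    for record in ris_text.split("ER  -"):  # ER marks the end of each record
        for line in record.splitlines():
            if line.startswith(JOURNAL_TAGS):
                counts[line.split("- ", 1)[1].strip()] += 1
                break  # count each record once, even if several tags appear
    return counts.most_common(n)
```

The journals at the top of the returned list are the ones that published the most articles on your topic, i.e., the same shortlist of 3-5 candidates you would get by sorting the “Journal” column in EndNote.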

This is the seventh in a series of posts about topics relating to research skills and ethics. Happy Halloween!

Image from the Laugh-Out-Loud Cats cartoon strip by Adam “Ape Lad” Koford (creative commons license)

Potential, possible, or probable predatory publishers

There is a lot to keep in mind when deciding where to publish, and it takes time to investigate individual journals and explore their websites. Not everyone weighs the same factors equally: for some it is a journal’s prestige, while for others it is the audience they can reach, or ensuring that their work is open access and freely available to all.

Unfortunately, there are publishers out there that are less than honest and provide false or misleading information about who they are and the impact that they have on a field. There are also publishers with hidden fees that send out invoices to authors after publishing their papers. These are commonly referred to as predatory publishers and they have fooled many a researcher.

One site that can help is Beall’s List of potential, possible, or probable predatory scholarly open access publishers. Jeffrey Beall uses a sensible set of criteria to compile this list. I urge you to take a close look at these publishers before deciding to send them your work.

Publishing your research 101

ACS (American Chemical Society) Publications has created a series of videos to help authors and reviewers with the process of writing, editing, or reviewing articles. Below is the first episode in the series. It stars Chemistry Professor George M. Whitesides of Harvard University, who has published over 1100 articles and has served on the advisory boards of multiple peer-reviewed journals.

Are video abstracts the latest trend in scientific publishing?

Hundreds of journals allow authors to submit a video abstract, i.e., a short video describing their research, along with their article.  Consequently, results of scientific experiments are now appearing on YouTube and attracting a larger audience.  Read this informative article by Jacob Berkowitz to find out more.

This article has been retracted

Here is an interesting site, Retraction Watch, which documents papers that are pulled from published scientific journals. Papers can be withdrawn from the literature for various reasons (fraud, misconduct, errors, etc.). Their tagline is “Tracking retractions as a window into the scientific process.” For example, they recently covered article retractions due to faked peer reviews submitted through Elsevier’s editorial system. Check it out.

There is a new service called CrossMark from CrossRef, the organization that provides Digital Object Identifiers (DOIs). The goal is to place a CrossMark logo on digital documents (Web or PDF) that links to information about corrections, updates, and retractions. Be sure to click on the logo when it appears.

Validating scientific research

The New York Times reported earlier this year that the number of published scientific articles retracted by journals has increased over the years. The articles were withdrawn due to false claims or errors in research data.

The Reproducibility Initiative was recently launched to improve the quality of preclinical biological research.  According to its website, “the Reproducibility Initiative is a new program to help scientists validate studies for publication or commercialization. Simply submit your study, and we’ll match you to one of our 1000+ expert providers for validation. Validations are conducted blind, on a fee-for-service basis.”

Image from Microsoft Office Clipart

Putting Scientific Information to Work in 1972

I love this cartoon about the exchange of information between scientists, from word of mouth and written letters, to the first scientific journals and, some time later, information overload. Forty years after this was created we are still plagued with a growing number of science and technology journals and are challenged with making full use of the literature.

One of the strategies in the 1960s from the mind of Eugene Garfield and the Institute for Scientific Information (ISI) was to cover a selective but important portion of the world’s journals.
