Citation politics

We may want to believe that citation practices in STEM are unbiased, but the evidence says otherwise. It speaks to the politics and privilege that pervade peer review and the published literature.

Here is just a handful of recent examples:

What can we do?

There are a few things that we can do in our citation practices to make a difference. First, if we recognize that citations are power, then we can exercise that power in our own reference lists. When we write articles or otherwise disseminate our research outputs, we can choose to break away from citing the usual suspects and cite responsibly. One easy-to-remember rule of thumb is the Gray test. Our works pass the Gray test if they cite and discuss the scholarship of at least two women and two non-white people. We can also consider including citation diversity statements.

Second, we can be representative in the works that we recommend to others. For example, the Gender Balance Assessment Tool is available to check that our reading lists are not composed mostly of works by male authors.

Third, when we find ourselves in a position to judge the work of others, we can let go of our reliance on citation metrics. We can commit to evaluating works on their scientific merit and consider alternative metrics not based on citation counts.

Of course, we can also talk about citation politics with our colleagues. There is lots to think about and discuss!

Find more citation politics resources and readings, along with sources for alternative metrics on the Impact Measurements guide.

Voting with your feet: choosing which publishers deserve your time

[Image: overhead view of feet on black asphalt, with two white arrows painted just above them, one pointing diagonally up and the other diagonally down]
Photo by Jon Tyson on Unsplash

Written by guest contributor Jessica Lange

Academic publishing relies on the voluntary participation of scholars as peer reviewers and editors. Editors typically handle reviewing initial submissions to a journal, finding peer reviewers, and assessing the final submission for acceptance. Peer reviewers serve an equally important role, evaluating a scholarly work for its validity, impact, and relevance to the field.

Researchers choose to serve as editors and peer reviewers to contribute back to the scholarly community and advance research. This is also considered part of the “job” of an academic for the purposes of tenure and promotion. However, as precarious academic positions are on the rise, this underlying principle is being reconsidered. Even for academics with a full-time position, the squeeze of increasing administrative responsibilities, alongside a heavier weighting of research in tenure and promotion, has led some to make strategic choices about where to devote their energies.

Despite the importance of these roles to the scientific community, many people outside academia are surprised to learn that neither peer reviewers nor editors receive compensation for their work. Given that large, commercial publishers post hefty profit margins and may have questionable privacy practices, researchers are starting to wonder if these corporations should benefit from their voluntary labour and scholarly expertise.  

If the above applies to you, I’d recommend the KU Leuven framework, which is based on the Fair Open Access Alliance principles. When evaluating a publication for editorial or peer review duties, ask if:

“The supplier of the infrastructure for scholarly communication has a transparent ownership structure, and is not profit-driven and accountable to shareholders, but mission-driven and accountable to the academic community (e.g. an editorial board or scholarly society).” (Fair Open Access) 

This framework privileges “scholar-led” operations: those run and led by academics themselves, supported in many cases by universities, societies, libraries, or associations. For example, Seismica, an open access journal hosted by the McGill Library that is free for both authors and readers, was launched in direct response to the for-profit nature of scientific publishing in its discipline.

How can I assess a journal?  

Journals will typically post this kind of information on their “About” page. Review their website to see if they are published by a commercial publisher (e.g., Wiley, Elsevier), a non-profit (e.g., Cambridge University Press, University of Toronto Press), or independently supported by a university, library, or association. Does the journal provide a mission statement? What are the publisher’s mission and goals? If the journal charges article processing charges, is it transparent about the fees?

What else would you add for consideration?  

Additional resources

Jessica Lange is the Scholarly Communications Librarian at McGill University. In this role, she provides services to the campus community in the areas of open access, publishing, author rights, and open educational resources (OERs). She also manages McGill’s open access repository eScholarship and its scholarly publishing program. Her research interests include scholarly publishing and open access.  

Student recommendation for the NHL

Congratulations goes out to Mark Kumhyr, winner of the Winter/Summer 2018 Communication in Engineering (CCOM 206) Excellence in Written Communication Award!

Alternate Refrigeration Systems for Improving Ice Quality in NHL Arenas

The National Hockey League is a multibillion-dollar industry, and yet suffers from a recent uptick in complaints over sub-par ice quality, largely due to a warming climate and higher average ice rink temperatures. The objective of this paper is to demonstrate the superiority of an indirect ammonia/CO2 refrigeration system over a direct CO2 system, all in relation to the current indirect ammonia/brinewater system. The comparison will be made based on three criteria: efficiency, represented by the coefficient of performance value; cost, taking into account short- and long-term investments; and environmental effect, presented as a Global Warming Potential value. The results show that the indirect ammonia/CO2 system is 56% more efficient than the current system, and 20% more efficient than the direct CO2 system, and is less costly in the long term. The environmental effects of each refrigerant system were shown to be negligible. It is recommended that the NHL implement an indirect ammonia/CO2 system in order to counter the warming arena temperatures and ensure that the NHL remains a powerhouse in the sports industry.

Read the full paper in eScholarship, McGill’s open access repository.

Eeny, meeny, miny, mo: Deciding where to submit a manuscript for publication

I am one of the instructors for the MyResearch graduate seminar series, which just ended for the autumn semester. It will be offered again in the winter. Issues in scholarly publishing are among the topics we discuss in MyResearch, such as which factors to consider when determining the best journal for submitting a manuscript.

I use the following multi-step approach for selecting a journal:

1- Using the database search results I exported to EndNote, I sort the relevant references in my EndNote library by the “Journal” column and pick out a few journals (usually 3-5) that published the most articles on my topic. You can also click on “analyze results” in Web of Science and Scopus to see the journals in your results set, listed from the most articles to the fewest.

2- I visit each journal’s website, where I read about the journal’s scope and look at the instructions for authors. This helps me determine whether my article would be considered for publication by the journal. If yes, I move on to the next step.

3- I search for the Journal Impact Factor in Journal Citation Reports (you can also click on a journal name in a Web of Science result to view its Journal Impact Factor). I then use the “Compare sources” option in Scopus to look at each journal’s SJR (SCImago Journal Rank) score, SNIP score (which normalizes citations against the citation potential of the field, i.e., the average number of times an article in that journal can expect to be cited in a given year), % not cited, and % reviews (review articles usually generate more citations, so a journal that publishes many reviews may have inflated SJR and Journal Impact Factor scores). These metrics are all calculated differently, but viewed together they tell me which journal has the higher impact, based on more than one criterion.

4- If the journals I’m comparing rank about the same, e.g., both fall within the top 25-35% of highly cited journals, I consider other factors, such as:

i) How long will it take before I receive a decision about my manuscript, and what is the journal’s manuscript acceptance rate? (Look at the journal’s website for this information. If you’re short on time to meet a deadline that requires the work to be published, this factor might be very important because you want to give yourself enough time to resubmit to a second journal if the first journal rejects your manuscript.)

ii) In which databases is this journal indexed? (If there are more databases that provide references to articles in the journal, this increases the odds of someone finding a reference to your article and citing it.)

iii) Will the journal allow me to make at least a post-print (the final text of your manuscript, incorporating changes from the peer-review process) freely available on the web within 12 months of publication? (This lets individuals without a personal or institutional subscription to the journal read your article without paying for it, and satisfies the Tri-Agency Open Access Policy on Publications for research funded by SSHRC, NSERC, and CIHR. Search for the journal in Sherpa Romeo or check the journal’s website for this information.)

5- By this stage, I have usually narrowed down my list of multiple journals to two journals.  At this step, I consider personal preferences.  Which of the two journals do I read or use the most, or which one would make me the most proud if I received a letter of acceptance?
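The metric-comparison steps above (step 3 and step 4) can be sketched as a simple weighted composite score. This is a hypothetical illustration only: the journal names, metric values, and weights below are invented, and no official ranking formula combines these metrics this way.

```python
# Hypothetical sketch: combining several journal metrics (scaled to 0-1)
# into one composite score so that journals can be compared on more than
# one criterion. All data and weights here are invented for illustration.

def composite_score(metrics, weights):
    """Weighted sum of a journal's metrics (higher is better)."""
    return sum(metrics[name] * weight for name, weight in weights.items())

# Invented example data, not real journal metrics.
journals = {
    "Journal A": {"sjr": 0.80, "snip": 0.70, "pct_cited": 0.90},
    "Journal B": {"sjr": 0.85, "snip": 0.60, "pct_cited": 0.75},
}

# Weights reflect personal priorities; adjust them to taste.
weights = {"sjr": 0.4, "snip": 0.3, "pct_cited": 0.3}

ranked = sorted(
    journals,
    key=lambda name: composite_score(journals[name], weights),
    reverse=True,
)
print(ranked)  # journals ordered from highest to lowest composite score
```

The point of the sketch is only that a single metric can mislead, while a transparent combination of several metrics, with weights you choose yourself, makes the trade-offs explicit.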

This is the seventh in a series of posts about topics relating to research skills and ethics. Happy Halloween!

Image from the Laugh-Out-Loud Cats cartoon strip by Adam “Ape Lad” Koford (creative commons license)

Potential, possible, or probable predatory publishers

There is a lot to keep in mind when deciding where to publish and it takes time to investigate individual journals and explore their websites. Not everyone considers the same things as important to them. For some it is about a journal’s prestige while for others it is about the audience that they can reach or about ensuring that their work is open access and available freely to all.

Unfortunately, there are publishers out there that are less than honest and provide false or misleading information about who they are and the impact that they have on a field. There are also publishers with hidden fees that send out invoices to authors after publishing their papers. These are commonly referred to as predatory publishers and they have fooled many a researcher.

One site that can help is Beall’s List of potential, possible, or probable predatory scholarly open access publishers. Jeffrey Beall uses a sensible set of criteria to compile this list. I urge you to take a close look at these publishers before deciding to send them your work.

Publishing your research 101

ACS (American Chemical Society) Publications has created a series of videos to help authors and reviewers with the process of writing, editing, or reviewing articles. Below is the first episode in the series. It stars Chemistry Professor George M. Whitesides from Harvard University, who has published over 1,100 articles and has served on the advisory boards of multiple peer-reviewed journals.

Are video abstracts the latest trend in scientific publishing?

Hundreds of journals allow authors to submit a video abstract, i.e., a short video describing their research, along with their article. Consequently, results of scientific experiments are now appearing on YouTube and attracting a larger audience. Read this informative article by Jacob Berkowitz to find out more.

This article has been retracted

Here is an interesting site, Retraction Watch, that documents papers that are pulled from published scientific journals. Papers can be withdrawn from the literature for various reasons (fraud, misconduct, errors, etc.). Their tagline is “Tracking retractions as a window into the scientific process.” For example, they recently covered article retractions due to faked peer reviews using Elsevier’s editorial system. Check it out.

There is a new service called CrossMark from CrossRef, the organization that provides Digital Object Identifiers (DOIs). The goal is to have a CrossMark logo on digital documents (Web or PDF) that links to information about corrections, changes, and withdrawals. Be sure to click on the logo when it appears.
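Because CrossRef assigns DOIs, each document’s metadata record can also be looked up programmatically through CrossRef’s public REST API. Here is a minimal sketch of building the lookup URL for a DOI; the DOI shown is a placeholder example, and the live request is left commented out so the snippet runs offline.

```python
# Minimal sketch: constructing a CrossRef REST API URL to retrieve the
# metadata record for a DOI. The DOI below is a placeholder example.
from urllib.parse import quote

CROSSREF_WORKS = "https://api.crossref.org/works/"

def crossref_url(doi):
    """Return the CrossRef REST API URL for a DOI's metadata record."""
    return CROSSREF_WORKS + quote(doi, safe="/")

url = crossref_url("10.1000/xyz123")
print(url)

# To fetch the record (requires network access):
# import json, urllib.request
# record = json.load(urllib.request.urlopen(url))
# print(record["message"]["title"])
```

Checking the DOI record, like clicking the CrossMark logo, is a way to confirm you are looking at the publisher-maintained version of a document’s status rather than a stale copy.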

Validating scientific research

The New York Times reported earlier this year that the number of published scientific articles that were retracted by journals has increased over the years.  The articles were withdrawn due to false claims or errors in research data.

The Reproducibility Initiative was recently launched to improve the quality of preclinical biological research.  According to its website, “the Reproducibility Initiative is a new program to help scientists validate studies for publication or commercialization. Simply submit your study, and we’ll match you to one of our 1000+ expert providers for validation. Validations are conducted blind, on a fee-for-service basis.”

Image from Microsoft Office Clipart