Yoder, Stephen, and Brittany H. Bramlett. 2011. “What Happens at the Journal Office Stays at the Journal Office: Assessing Journal Transparency and Record-Keeping Practices.” PS: Political Science & Politics 44 (2): 363–373. doi:10.1017/S1049096511000217
This article explores the contentious world of journal acceptance rates in political science. The authors surveyed the top 30 journals in the field to investigate the publications’ practices regarding submissions. They assign each publication a “transparency score” and also list acceptance rates and average turnaround times (ranging from 21 days for Foreign Affairs to 120 for World Politics).
From the abstract:
Dissemination of journal submission data is critical for identifying editorial bias, creating an informed scholarly marketplace, and critically mapping the contours of a discipline’s scholarship. However, our survey and case study investigations indicate that nearly a decade after the Perestroika movement began, political science journals remain reserved in collecting and releasing submission data. We offer several explanations for this lack of transparency and suggest ways that the profession might address this shortcoming.
Continuing with a recent theme of citation tracking: Google Scholar keeps adding new features. One of the newer ones is Scholar Metrics, which displays h-index measures for journals in all subject areas. You can search or browse by publication title and then click on a journal’s h-index score to see its most highly cited articles.
A previous post describes how individual researchers can set up their own profile in Google Scholar’s My Citations, which displays their work and citation counts and can send alerts when new citations appear.
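The h-index behind these scores has a simple definition: the largest h such that h papers have at least h citations each (Scholar Metrics applies it over a recent publication window). A minimal sketch, with invented citation counts:

```python
def h_index(citations):
    """Return the h-index: the largest h such that the list contains
    h papers with at least h citations each."""
    h = 0
    # Walk the counts from most- to least-cited; h grows as long as
    # the i-th paper has at least i citations.
    for i, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= i:
            h = i
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # 4 papers have >= 4 citations -> 4
print(h_index([25, 8, 5, 3, 3]))  # -> 3
```

The single highly cited paper in the second list (25 citations) does not raise the score, which is why the h-index is often described as rewarding sustained rather than one-off impact.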
My last post was on ISI’s journal citation reports, but there are other ranking calculations as well. Along with straight citation counts, Eigenfactor brings in the number of articles published, disciplinary citation patterns, and references from non-journal sources. As the site explains it, the Eigenfactor score is a “measure of the journal’s total importance to the scientific community,” while “a journal’s Article Influence score is a measure of the average influence of each of its articles over the first five years after publication.”
The scores can be searched by journal title and ISI subject category.
In Political Science, the journals with the top Eigenfactor are:
1. American Journal of Political Science
2. American Political Science Review
3. Journal of Politics
When ranked by Article Influence, Political Analysis goes to the top.
In International Relations, the Eigenfactor ranking is:
1. International Organization
2. Journal of Conflict Resolution
3. Foreign Affairs
The Eigenfactor Project also has a cost-effectiveness score that assesses journals’ Eigenfactor and Article Influence scores in relation to subscription costs.
A free site, the Eigenfactor Project also includes further background information and an interactive mapping tool that tracks citations across disciplines.
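At its core, the Eigenfactor score applies an eigenvector-centrality idea (similar to PageRank) to the journal citation network: citations from influential journals count for more. The real algorithm uses a five-year window, excludes self-citations, and weights its teleportation term by article counts; none of that is modelled in the stripped-down sketch below, and the journals and citation counts are invented.

```python
# Toy sketch of the eigenvector-centrality idea behind Eigenfactor.
journals = ["A", "B", "C"]
# cites[i][j] = citations from journal j to journal i (invented numbers)
cites = [
    [0, 4, 2],
    [3, 0, 1],
    [1, 2, 0],
]
n = len(journals)
alpha = 0.85  # damping factor, as in PageRank

# Column-normalise so each citing journal distributes one unit of influence.
col_sums = [sum(cites[i][j] for i in range(n)) for j in range(n)]
P = [[cites[i][j] / col_sums[j] for j in range(n)] for i in range(n)]

# Power iteration: repeatedly push influence along citation links.
score = [1.0 / n] * n
for _ in range(100):
    score = [
        alpha * sum(P[i][j] * score[j] for j in range(n)) + (1 - alpha) / n
        for i in range(n)
    ]

for name, s in sorted(zip(journals, score), key=lambda t: -t[1]):
    print(f"{name}: {s:.3f}")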
It’s time again for the annual release of ISI’s Journal Citation Reports. See this post for more background information. The reports are intended to provide an evaluative ranking of scholarly journals based on citation data.
The Social Sciences section includes separate rankings for Political Science and International Relations.
Based on ISI’s formulas, the Political Science journals with the highest impact factor in 2011 were:
1. American Political Science Review
2. American Journal of Political Science
3. Public Opinion Quarterly
4. Journal of Conflict Resolution
5. Political Analysis
*American Political Science Review was again cited most frequently overall.
In International Relations, the publications with the highest impact factor were:
1. World Politics
2. International Organization
3. Common Market Law Review
4. International Security
5. Journal of Conflict Resolution
*International Organization was again cited most frequently overall.
The reports also have information about broader citation trends, showing how the journals have been cited over time.
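For context, the two-year impact factor behind these rankings is a simple ratio: citations received in the JCR year to a journal’s items from the previous two years, divided by the number of citable items it published in those two years. A sketch with invented figures:

```python
def impact_factor(cites_to_prev_two_years, citable_items_prev_two_years):
    """Two-year impact factor: citations received in the JCR year to items
    published in the previous two years, divided by the number of citable
    items published in those two years."""
    return cites_to_prev_two_years / citable_items_prev_two_years

# Invented example: a journal's 2009-2010 articles were cited 450 times
# during 2011, and it published 150 citable items in 2009-2010.
print(impact_factor(450, 150))  # 3.0
```

Note that what counts as a “citable item” is itself a point of contention, since editorials and letters attract citations without entering the denominator.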
Open access journals are increasingly common on the academic publishing scene, but open access monographs are often less visible.
OAPEN (Open Access Publishing in European Networks) is one of many notable emerging resources that provide access to academic monographs that can be freely read online. It is a particularly good source of material in languages other than English.
At the time of writing, there are 186 works in the “politics and government” category, although of course a field as diverse as political science overlaps with other disciplines.
Publishers include (among many others!):
- Aarhus University Press
- Amsterdam University Press
- Firenze University Press
- Humboldt-Universität zu Berlin
- KITLV Press
- Manchester University Press
- Palgrave Macmillan
- Taylor & Francis
- Universitätsverlag Göttingen
- University of Wales Press
From the official description:
OAPEN (Open Access Publishing in European Networks) is a collaborative initiative to develop and implement a sustainable Open Access publication model for academic books in the Humanities and Social Sciences. The OAPEN Library aims to improve the visibility and usability of high quality academic research by aggregating peer reviewed Open Access publications from across Europe.
Journals’ websites are usually the best place to find out about editorial procedures and subject scope, but periodical directories can offer additional details.
Ulrich’s Periodical Directory, for example, includes information about the publisher, frequency, databases that index the publication, online availability, type (e.g., scholarly or popular), and circulation.
The MLA International Bibliography is a database primarily for journals in literature and linguistics, but it also contains a periodical directory (under the Advanced Search options). A handful of journals related to political science are included, and for several, the acceptance rate of the journal is listed.
The last few weeks have seen a lot of online discussion related to a movement to boycott journals published by publishing giant Elsevier. Mathematician Timothy Gowers set things in motion with a blog post. A website called The Cost of Knowledge appeared soon after, hosting a list of academics who have pledged not to contribute editorial work, serve as peer reviewers, or publish in Elsevier journals. At the time of writing, there are 7,353 signatories.
Barbara Fister offers a concise argument for the reasoning behind the boycott at Inside Higher Ed. Although other publishers engage in practices that are destructive to libraries and academic publishing, Elsevier, targeted because of its size and prominence, is emblematic of problems pervasive across the industry. In addition, the company initially supported proposed legislation in the United States seen as detrimental to the dissemination of research and the production of creative works, namely the Research Works Act. [February 27 update: Elsevier has withdrawn its support for the act.]
Elsevier produces publications in all fields, including a few journals in the social and political sciences (e.g., Electoral Studies, Political Geography, European Journal of Political Economy, Social Science Research).
I’ve posted before about the complex and contentious topic of counting citations and ranking journals. The issue is further complicated by social media, open access publishing, self-archiving, and other new channels of scholarly communication.
A group of researchers has put together a website (and is conducting research) on the idea of altmetrics: new tools that aim to track scholarly impact more accurately in today’s environment.
Altmetrics expand our view of what impact looks like, but also of what’s making the impact. This matters because expressions of scholarship are becoming more diverse. Articles are increasingly joined by:
- The sharing of “raw science” like datasets, code, and experimental designs
- Semantic publishing or “nanopublication,” where the citeable unit is an argument or passage rather than entire article.
- Widespread self-publishing via blogging, microblogging, and comments or annotations on existing work.
Because altmetrics are themselves diverse, they’re great for measuring impact in this diverse scholarly ecosystem. In fact, altmetrics will be essential to sift these new forms, since they’re outside the scope of traditional filters. This diversity can also help in measuring the aggregate impact of the research enterprise itself.
The website and the idea are still in development, but the site links to some early tools.
The library has access to a very large collection of e-books published by Springer, whose full text can be downloaded in PDF format, chapter by chapter.
In addition, our subscription includes access to special rates for McGill folks who want to purchase their own print copy of a book. When the print copy option is available, the price appears on the main listing for the item.
I’ve mentioned before ways of using Google Scholar to uncover which articles cite each other.
For several months now, Google has offered a new dimension to its citation searching: authors can create a profile in Google Scholar that gathers the papers indexed under their name, counts how many times each has been cited, and computes other citation metrics.
As the help files explain:
Google Scholar Citations provides a simple way for authors to keep track of citations to their articles. You can check who is citing your publications, graph citations over time, and compute several citation metrics. You can also make your profile public, so that it may appear in Google Scholar results when people search for your name, e.g., richard feynman.
Best of all, it’s quick to set up and simple to maintain – even if you have written hundreds of articles, and even if your name is shared by several different scholars. You can add groups of related articles, not just one article at a time; and your citation metrics are computed and updated automatically as Google Scholar finds new citations to your work on the web. You can even choose to have your list of articles updated automatically – but, of course, you can also choose to review the updates yourself, or to manually update your articles at any time.
My page is one example of the tool in action, and another example is below.