Non-profit publishers and citational politics

This week we discussed this article on citational politics. There's a lot to unpack, from algorithmic bias in search engines to the motivation for "firsting" in science. The article also links to additional blog posts that delve into some of these topics. The further you go, the more you see how this problem is entangled with the other inequities that surround us. In our conversation yesterday, one question that came up was a pragmatic one: when doing literature searches to write papers, how can we use search engines to do what CLEAR calls "citing differently"?

The question probably falls into the category of “citing differently in tight places”, discussed here. There are a number of tactics mentioned, like interdisciplinarity, or citing social knowledge infrastructures. Another idea for a tactic that our group came up with was to begin literature searches with non-profit publishers rather than search engines like Google Scholar or for-profit publishers. While not addressing some of the bigger landscape issues, such as knowledge holders who are excluded from the western peer review enterprise, the approach could add a tactical tool to the toolbox.

But would it work? Would it diversify citations?

Curiosity has been tugging at me, so I thought I'd explore this proposition. I decided to compare a search in Google Scholar to a search in a non-profit publisher. I chose PeerJ, which, at least as of a few years ago, was the hippest of the open-access non-profit science publishers with peer review (but see addendum; arXiv/bioRxiv is another conversation). And to give some balance, I included the same search in the journal Science, which is published by the non-profit AAAS but is a major journal. I started with a very broad oceanographic search: just the keyword "phytoplankton". I took the top 50 hits from each search on November 16, 2022, limiting my selection to peer-reviewed journal articles (both search engines turn up various other items).

When considering who you’re citing, there’s only so much you can tell from the information that comes up in a search. Of course, you shouldn’t guess things like gender or race based on the names. In any case, science is an international endeavor, and those concepts are not constants around the world. I decided to look at a couple of other metrics to compare.

The first metric is simply where in the world the authors are located. The system of national borders delineates a significant dimension of the world's inequities (e.g., here). I grouped the results into the Anglosphere (incl. USA, UK, Canada, New Zealand, Australia; the language of publication was always English), Europe (minus the UK), Asia, and South America. Here's the breakdown in table form:

|                | Anglosphere | Europe | Asia | South America |
|----------------|-------------|--------|------|---------------|
| PeerJ          | 21          | 14     | 8    | 4             |
| Google Scholar | 30          | 16     | 3    | 1             |
| Science        | 40          | 10     | 0    | 0             |
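The tallying itself is trivial once each hit's country has been coded by hand. Here's a minimal sketch of how it could be done; the region mapping and the list of countries are hypothetical examples, not the actual data:

```python
from collections import Counter

# Hypothetical country-to-region coding (extend as needed).
# In practice, countries were read off each search result by hand.
REGION = {
    "USA": "Anglosphere", "UK": "Anglosphere", "Canada": "Anglosphere",
    "Australia": "Anglosphere", "New Zealand": "Anglosphere",
    "France": "Europe", "Germany": "Europe", "Norway": "Europe",
    "China": "Asia", "Japan": "Asia", "India": "Asia",
    "Brazil": "South America", "Chile": "South America",
}

# Example countries for a handful of hits (not the real results).
countries = ["USA", "France", "China", "USA", "Brazil"]

# Count hits per region.
tally = Counter(REGION[c] for c in countries)
print(tally["Anglosphere"], tally["Europe"])  # 2 1
```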

The take-home message here is that PeerJ has much broader global coverage, and Science has very little. Google Scholar falls somewhere in the middle. This pattern holds if you dig in within each of these categories as well. (For example, of the 40 Science pubs in the Anglosphere, 36 of them are from the US.)

The other metric I thought was worth looking at is whether any people or institutions are overrepresented in these results; in other words, is there an exclusive inner circle, with the same people and places getting the top hits again and again? The answer matches the previous pattern: PeerJ has the fewest repeated authors and institutions, with Google Scholar and Science at the other end. Science in particular was drawing from a small number of institutions, with one institution occurring six times within the top 50 hits:

|                | First authors appearing more than once | Institutions appearing more than once |
|----------------|----------------------------------------|---------------------------------------|
| PeerJ          | 4                                      | 11                                    |
| Google Scholar | 11                                     | 15                                    |
| Science        | 9                                      | 20                                    |
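Counting repeats works the same way. Below is one hedged sketch of the idea, with made-up records standing in for the hand-coded first authors and institutions of the top 50 hits:

```python
from collections import Counter

# Hypothetical hand-coded records; in practice, first author and
# institution were read off each of the top 50 search results.
hits = [
    {"first_author": "Smith", "institution": "WHOI"},
    {"first_author": "Smith", "institution": "WHOI"},
    {"first_author": "Garcia", "institution": "IFREMER"},
    {"first_author": "Chen", "institution": "WHOI"},
]

author_counts = Counter(h["first_author"] for h in hits)
inst_counts = Counter(h["institution"] for h in hits)

# One way to measure the "inner circle": how many hits come from a
# first author (or institution) that appears more than once.
repeat_author_hits = sum(c for c in author_counts.values() if c > 1)
repeat_inst_hits = sum(c for c in inst_counts.values() if c > 1)

print(repeat_author_hits, repeat_inst_hits)  # 2 3
```

On this toy data, Smith accounts for two repeated-author hits and WHOI for three repeated-institution hits.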

There are of course caveats to these comparisons (it's just a blog post!). For example, Science has been around longer than PeerJ and might have changed its practices in recent years. Still, I think this test is pretty representative of what someone might experience when looking for papers to cite in their writing. And while there are lots of dimensions of privilege and access not captured by looking at this one tactic, my compulsive literature searching has at least been in line with my initial hypothesis: searching for literature within non-profit publishers could be a helpful starting point for citing differently, with the caveat that non-profit status alone is no guarantee, as the Science results show.

I just want to emphasize that this is only a tactic, and addressing citational politics requires a lot of reflection. For anyone interested in using non-profit publishers to guide their lit searches, here is a list of journals in the Ocean Science space, marked by whether they are non-profit or for-profit. Edits to the list are welcome.

https://docs.google.com/spreadsheets/d/1m9yyNmdtxS7AbE3AGyoLnLCFxdIeB5jKT5I_xZd8LsQ/edit#gid=0

Addendum: CLEAR’s bibliography on citational politics https://civiclaboratory.nl/2021/02/25/citational-politics-bibliography/

Addendum: PeerJ has been acquired by Taylor & Francis (for-profit)