February/March 2021: What are DOIs?
A new abbreviation, “DOI”, has appeared in journals over the last few years; it is short for Digital Object Identifier. A DOI is a unique tag for a digital document. DOIs are important in academic citation because they are more permanent than URLs: a DOI is linked to a specific document rather than a web address, and it never changes.
The NZPPS is a member of Crossref, a not-for-profit organisation that provides DOIs. As part of our member obligations, the Journal submits a DOI for each paper published and displays it on the Journal’s website as well as on the PDF copy of each paper. The DOI for each cited reference must also be included in the publication, and the reference lists must be deposited with Crossref once a paper is published. Having the DOI for each reference available makes it easier for readers to find supporting literature.
Each DOI begins with a unique organisation code (the prefix), e.g. https://doi.org/10.30843/. The rest of the code (the suffix) is created by the publisher. New Zealand Plant Protection uses the journal abbreviation followed by the year, volume number, and paper code, e.g. https://doi.org/10.30843/nzpp.2021.74.11726, but other journals have adopted different schemes.
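The prefix/suffix split described above can be illustrated with a short Python sketch (the helper name `parse_doi` is my own invention, not an official tool):

```python
def parse_doi(doi):
    """Split a DOI into its registrant prefix and publisher-assigned suffix.

    Accepts either a bare DOI ("10.30843/...") or a full https://doi.org/ URL.
    """
    # Strip the resolver URL if present, then split at the first "/".
    doi = doi.removeprefix("https://doi.org/").removeprefix("http://doi.org/")
    prefix, _, suffix = doi.partition("/")
    return prefix, suffix

print(parse_doi("https://doi.org/10.30843/nzpp.2021.74.11726"))
# → ('10.30843', 'nzpp.2021.74.11726')
```

Here `10.30843` is the organisation prefix assigned by Crossref, and everything after the slash is under the publisher’s control.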
Many older documents have had DOIs assigned to them retrospectively, so in these cases the DOI is not printed on the document itself. To find the DOI for a specific paper, just copy the standard metadata (authors, title, date, journal etc.) into Crossref’s free Simple Text Query tool and it will find the DOI for you. This is the tool I use to find the DOIs for reference lists of accepted papers.
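For automated lookups, Crossref also offers a free REST API at api.crossref.org that matches a free-text reference string to candidate records, each carrying a DOI. A minimal sketch building such a query URL (the helper name is my own; `query.bibliographic` and `rows` are real API parameters):

```python
from urllib.parse import urlencode

def crossref_query_url(citation_text, rows=1):
    """Build a Crossref REST API URL that searches for a reference string.

    The API returns candidate works ranked by relevance; each result
    includes its DOI in the metadata.
    """
    base = "https://api.crossref.org/works"
    params = {"query.bibliographic": citation_text, "rows": rows}
    return f"{base}?{urlencode(params)}"

print(crossref_query_url(
    "Petsko GA 2008 Having an impact (factor) Genome Biology 9 107"))
```

Fetching that URL (with any HTTP client) returns JSON whose top match usually contains the DOI you are after; as with any fuzzy match, the result should be checked against the original reference.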
December 2020/January 2021 (updated January 2024): Reference Lists – what’s acceptable and what’s not.
A common issue with new submissions is authors including unpublished material in their reference lists, despite clear instructions in the New Zealand Plant Protection manuscript templates not to do so. Conversely, many authors are unaware that client reports are treated as prior publications if they are publicly available online, and have been caught out when they try to publish the same work in a scientific journal. Therefore, my blog this month explains what can and can’t be included in the reference list of a scientific paper.
Citing previous work is an important part of science publications. Citations may be used to acknowledge work already published and/or to provide evidence substantiating a statement the authors have made. In each case, the citation must be considered as published (i.e. publicly available) so that the reader can independently verify the information the authors have cited.
The definition of a publication is given in Clause 10 of the New Zealand Copyright Act (1994).
This definition means that web pages and online documents are published if they are available online. There is no requirement that a citation be peer reviewed, but peer-reviewed literature carries more weight because it has been checked for quality. Either way, the reader has the option to read the referenced material and decide for themselves whether it is valid. Client reports are a particular trap: authors frequently prepare detailed confidential reports for clients, then find that the client has published the report on their website despite it being marked confidential. Such a report counts as a prior publication, so a subsequent journal manuscript risks rejection if too many of its details have already appeared in the publicly available report. Editors also particularly dislike authors not citing their own previously published work on the same topic, as this is self-plagiarism.
One exception has always been the citation of theses. These can be cited even though they are considered unpublished; this convention allows the student to publish their work elsewhere after graduation. However, many theses are available online these days, so they can be cited directly with the corresponding URL included.
What is not acceptable in a reference list:
- unpublished material, such as internal organisational reports, as these are not available for the reader to scrutinise. Such reports often contain useful information and indicate that prior work has been undertaken, so New Zealand Plant Protection will allow details to be provided in a footnote in the main text.
- unpublished correspondence. Key points can be cited in the text, followed by the name and affiliation of the correspondent and the term “pers. comm.”
- additional data that are not fully described in the text. Such information can be referred to as “data not shown”.
What is acceptable in a reference list:
- peer-reviewed journal papers
- books or book chapters, which may or may not be peer reviewed.
- web pages, as long as the URL is provided.
- client reports (peer-reviewed or not) that have been published either by the author or the client, as long as the URL is provided.
October/November 2020: More Apostrophe Catastrophes
As I mentioned last month, there are plenty of examples of incorrect apostrophe use. Here's another one I found recently:
September 2020: Apostrophe Catastrophes
Sadly, there are plenty of examples of incorrect apostrophe use – a quick search of the internet will bring up dozens of funny, but incorrectly punctuated, signs. I came across this one last week in my local pharmacy:
There was even a society dedicated to the correct use of apostrophes. The UK Apostrophe Protection Society was started in 2001 by John Richards with the specific aim of “preserving the correct use of this currently much abused punctuation mark in all forms of text written in the English language”. However, he disbanded the Society in 2019 because “fewer organisations and individuals are now caring about the correct use of the apostrophe. We, and our many supporters worldwide, have done our best but the ignorance and laziness present in modern times have won!”
Unfortunately, while many of the examples posted are humorous, the poor understanding and use of punctuation and grammar compromise clear and effective communication.
August 2020: More on impact factors...
I’ll continue the topic of impact factors this month with a brief history.
The popularity of impact factors has stemmed from non-specialists wanting to assess and compare the quality of different scientists and their outputs. Early attempts involved simply counting the number of papers scientists published. Being smart people, scientists realised that they could increase this number by splitting their output into smaller papers telling just part of the story (known as ‘salami slicing’). Bean counters eventually noticed this behaviour and asked for a better ranking method – they wanted a measure of quality as well as quantity. One measure of quality is the relevance of a published paper and this can be inferred by the number of times that other scientists cite the work.
For many years, the Institute for Scientific Information (ISI) published the Science Citation Index, which listed the number of citations to individual papers in specific journals. The founder of the ISI, Eugene Garfield, devised various metrics including the Journal Impact Factor, JIF™. A JIF™ is calculated by dividing the number of current-year citations to items published in that journal during the previous two years by the number of citable items published in those two years. Thomson (later Thomson Reuters) bought ISI in 1992 and sold it in 2016 to a group now operating under the name of Clarivate.
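The two-year calculation can be sketched in a few lines of Python (the figures below are invented purely for illustration):

```python
def journal_impact_factor(citations, citable_items):
    """Two-year impact factor for year Y: citations received in year Y to
    papers published in Y-1 and Y-2, divided by the number of citable
    items the journal published in Y-1 and Y-2."""
    return citations / citable_items

# Hypothetical journal: 150 citations in 2020 to its 2018-2019 papers,
# of which 100 were citable items (research articles and reviews).
print(journal_impact_factor(150, 100))  # → 1.5
```

Note that what counts as a “citable item” is itself a point of contention, since excluding editorials and letters from the denominator (while still counting citations to them) inflates the result.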
Anyone with access to the appropriate information can calculate an impact factor for any journal, but they can’t call it a Journal Impact Factor™. The publisher Elsevier produced an equivalent metric for many years with the accurate, but awkward, name of “cites per doc (2 yr)”. In 2016, Elsevier launched a slightly different metric – a three-year impact factor – with the snappy name of CiteScore™.
One of the problems with the Journal Impact Factor™ is that its owners restrict the number of eligible journals and only paying customers can access the data. The consequence of these restrictions over many years has been the creation of a set of elite journals. In contrast, CiteScore™ covers a much larger number of journals and the metric is freely available online. In 2019, there were over 13,000 titles with a CiteScore but no JIF™.
New Zealand Plant Protection has a 2019 CiteScore™ of 1.2 but no Journal Impact Factor™.
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
July 2020: I’ll begin with impact factors...
In the June 2020 NZPPS newsletter I said I’d be writing a blog from “next month”. Well, it’s now nearly the end of “next month” so I’d better get started!
The advent of the digital age has had a disruptive and transformational effect on scientific publishing internationally so the aim of this short monthly blog is to alert, inform and remind readers about key publishing issues such as: impact factors; plagiarism; predatory publishers; fraud; and open-access publishing.
I’ll begin with impact factors. One of the funniest and most succinct articles I’ve found on the utility and validity (or not) of impact factors was written by Greg Petsko* back in 2008. The article starts with the following scenario:
The time: Some time in the not-too-distant future.
The place: The entrance to The Pearly Gates. There are fluffy clouds everywhere. In the center is a podium with an enormous open book. A tall figure in white robes with white hair and beard stands at the podium. Approaching is a thin, middle-aged man with glasses and a bewildered expression. He is the soul of a recently deceased genome biologist...
Read the rest of this delightful, well-written piece free on-line at: https://genomebiology.biomedcentral.com/articles/10.1186/gb-2008-9-7-107
*Petsko, G. A. (2008). Having an impact (factor). Genome Biology, 9: 107.