December 2020/January 2021: Reference Lists – what’s acceptable and what’s not.
A common issue with new submissions is authors including unpublished material in their reference lists despite clear instructions in the New Zealand Plant Protection manuscript templates not to do so. Therefore, my blog this month explains what can and can’t be included in the reference list of a scientific paper.
Citing previous work is an important part of science publications. Citations may be used to acknowledge work already published and/or provide evidence to substantiate a statement that the authors have made. In each case, the citation must be published (i.e. publicly available) so that the reader can independently verify the information the authors have cited.
The definition of a publication in the New Zealand Copyright Act (1994) is given in Clause 10:
This definition means that web pages and online documents are published if they are available online. There is no requirement that a citation be peer reviewed, but peer-reviewed literature carries more weight because it has been checked for quality. Either way, the reader has the option to read the referenced material and decide for themselves whether it is valid.
One exception has always been the citation of theses, which are considered unpublished, thus allowing the student to publish their work elsewhere after graduation. However, many theses are available online these days so can be cited with the corresponding URL included.
What is not acceptable in a reference list:
- unpublished material such as internal organisational reports, as these are not available for the reader to scrutinise. Often such reports contain useful information and indicate that prior work has been undertaken, so New Zealand Plant Protection will allow details to be provided in a footnote in the main text.
- unpublished correspondence. Key points can be cited in the text, followed by the name and affiliation of the correspondent and the term “pers. comm.”
- additional data that are not fully described in the text. Such information can be referred to as “data not shown”.
October/November 2020: More Apostrophe Catastrophes
As I mentioned last month, there are plenty of examples of incorrect apostrophe use. Here’s another one I found recently:
September 2020: Apostrophe Catastrophes
Sadly, there are plenty of examples of incorrect apostrophe use – a quick search of the internet will bring up dozens of funny, but incorrectly punctuated, signs. I came across this one last week in my local pharmacy:
There has even been a society dedicated to the correct use of apostrophes. The UK Apostrophe Protection Society was started in 2001 by John Richards with the specific aim of “preserving the correct use of this currently much abused punctuation mark in all forms of text written in the English language”. However, he disbanded the Society in 2019 because “fewer organisations and individuals are now caring about the correct use of the apostrophe. We, and our many supporters worldwide, have done our best but the ignorance and laziness present in modern times have won!”
Unfortunately, while many of the examples posted are humorous, the poor understanding and use of punctuation and grammar compromise clear and effective communication.
August 2020: More on impact factors...
I’ll continue the topic of impact factors this month with a brief history.
The popularity of impact factors has stemmed from non-specialists wanting to assess and compare the quality of different scientists and their outputs. Early attempts involved simply counting the number of papers scientists published. Being smart people, scientists realised that they could increase this number by splitting their output into smaller papers telling just part of the story (known as ‘salami slicing’). Bean counters eventually noticed this behaviour and asked for a better ranking method – they wanted a measure of quality as well as quantity. One measure of quality is the relevance of a published paper and this can be inferred by the number of times that other scientists cite the work.
For many years, the Institute for Scientific Information (ISI) published the Science Citation Index, which listed the number of citations to individual papers in specific journals. The founder of the ISI, Eugene Garfield, devised various metrics including the Journal Impact Factor, JIF™. A JIF™ is calculated by dividing the number of current-year citations to items published in that journal during the previous two years by the total number of citable items published in those two years. Thomson (later Thomson Reuters) bought ISI in 1992 and sold it in 2016 to a group now operating under the name of Clarivate.
Anyone with access to the appropriate information can calculate an impact factor for any journal but they can’t call it a Journal Impact Factor™. The publisher Elsevier produced an equivalent metric for many years with the accurate, but awkward, name of “cites per doc (2 yr)”. In 2016, Elsevier launched a slightly different metric – a three-year impact factor – with the snappy name of CiteScore™.
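Both families of metric follow the same basic recipe and differ mainly in the length of the citation window. Here is a minimal sketch in Python of that recipe; the function name and the journal figures are invented for illustration, not taken from any real journal:

```python
def impact_factor(citations_by_year, items_by_year, year, window=2):
    """Citations received in `year` to items published in the previous
    `window` years, divided by the number of citable items published
    in those same years.

    citations_by_year[y]: citations received in `year` to items published in y
    items_by_year[y]:     number of citable items published in y
    """
    prior_years = range(year - window, year)
    cites = sum(citations_by_year.get(y, 0) for y in prior_years)
    items = sum(items_by_year.get(y, 0) for y in prior_years)
    return cites / items if items else 0.0

# Hypothetical journal figures, invented for illustration:
cites = {2017: 20, 2018: 30, 2019: 50}   # 2020 citations to each year's papers
items = {2017: 50, 2018: 40, 2019: 40}   # citable items published each year

print(impact_factor(cites, items, 2020))            # two-year window, JIF-style
print(impact_factor(cites, items, 2020, window=3))  # three-year window, CiteScore-style
```

With these made-up numbers the two-year figure is 80 citations over 80 items (1.0), while widening the window to three years gives 100 over 130 (about 0.77) – a reminder that the same journal can score quite differently under different metrics.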
One of the problems with the Journal Impact Factor™ is that its owners restrict the number of eligible journals and only paying customers can access the data. The consequence of these restrictions over many years has been the creation of a set of elite journals. In contrast, CiteScore™ covers a much larger number of journals and the metric is freely available online. In 2019, there were over 13,000 titles with a CiteScore but no JIF™.
New Zealand Plant Protection has a 2019 CiteScore™ of 1.2 but no Journal Impact Factor™.
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
July 2020: I’ll begin with impact factors...
In the June 2020 NZPPS newsletter I said I’d be writing a blog from “next month”. Well, it’s now nearly the end of “next month” so I’d better get started!
The advent of the digital age has had a disruptive and transformational effect on scientific publishing internationally so the aim of this short monthly blog is to alert, inform and remind readers about key publishing issues such as: impact factors; plagiarism; predatory publishers; fraud; and open-access publishing.
I’ll begin with impact factors. One of the funniest and most succinct articles I’ve found on the utility and validity (or not) of impact factors was written by Greg Petsko* back in 2008. The article starts with the following scenario:
The time: Some time in the not-too-distant future.
The place: The entrance to The Pearly Gates. There are fluffy clouds everywhere. In the center is a podium with an enormous open book. A tall figure in white robes with white hair and beard stands at the podium. Approaching is a thin, middle-aged man with glasses and a bewildered expression. He is the soul of a recently deceased genome biologist...
Read the rest of this delightful, well-written piece free online at: https://genomebiology.biomedcentral.com/articles/10.1186/gb-2008-9-7-107
*Petsko, G. A. (2008). Having an impact (factor). Genome Biology, 9: 107.