
Understanding Scholarly Metrics and Retractions for Law Students

URL: https://libguides.law.ucla.edu/retractions

Introduction

This guide is a short introduction to the basics of scholarly metrics and to searching for retractions, aimed at law students doing research for their own projects or cite checking for law reviews.  Although it was prompted by student concerns about finding reliable information and by the general increase in retractions in recent years, it is not intended to be a full guide to information literacy or to evaluating the literature of any particular field.  Two self-paced courses and an article series are linked below for those who want to go more in-depth on these topics.  For a quick, basic overview of how to think about scholarly articles, see Calling Bullshit, How do you know a paper is legit?

Basics of Metrics

Scholarly metrics are meant to gauge the impact of scholars and journals.  Essentially, the idea is that journals and scholars who are cited more often have more impact, and, since the vast majority of citations are positive, highly cited materials are "better."  (This is less true for "altmetric" measures, which capture the broader impact of articles, such as mentions in the news and on social media; those mentions are much more often critical.)  There are many reasonable criticisms of scholarly metrics.  However, it is impossible for anyone to be well versed in the work of every scholar, or even every journal, in every academic field, so it is unrealistic to expect researchers not to rely on these metrics, despite their flaws.

Metrics may be used at the article, author, or journal level.  The simplest metric for an article or author is citation count.  A few databases, such as Web of Science (VPN required) and Semantic Scholar, allow sorting by number of citations, but most do not.

For authors, the other metric that is most commonly used is the h-index.  The University of Texas MD Anderson Cancer Center Library offers a fairly straightforward explanation of this metric: "The h-index is calculated by counting the number of publications for which an author has been cited by other authors at least that same number of times.  For instance, an h-index of 17 means that the scientist has published at least 17 papers that have each been cited at least 17 times.  If the scientist's 18th most cited publication was cited only 10 times, the h-index would remain at 17.  If the scientist's 18th most cited publication was cited 18 or more times, the h-index would rise to 18." An author's h-index will vary from database to database as each database calculates the index based on its contents.  It is also important to note that a "good" h-index varies from field to field.
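Because the definition above is procedural, it can help to see the calculation written out.  The following is a minimal sketch in Python (not from the guide's sources; the function name and example numbers are illustrative) of how an h-index can be computed from a list of citation counts:

    def h_index(citation_counts):
        """Return the largest h such that at least h papers
        have been cited at least h times each."""
        counts = sorted(citation_counts, reverse=True)
        h = 0
        for rank, cites in enumerate(counts, start=1):
            if cites >= rank:
                h = rank  # this paper still clears the bar
            else:
                break  # every later paper has fewer citations
        return h

    # The example from the quotation: 17 papers cited 17+ times,
    # and an 18th paper cited only 10 times.
    print(h_index([17] * 17 + [10]))  # prints 17
    print(h_index([18] * 18))         # prints 18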

Journal metrics are more complicated, but still largely based on citation counts over time.  The major metrics for most scholarly journals are summarized in the Journal Citation Reports database, linked below.  It is important to note that there are many similarly named journals, some intended to mimic more reputable journals, and that the standards of journals may change over time if a publisher buys rights to a journal and weakens publication standards.  Although this is uncommon, when in doubt about the authenticity of a journal, it is worth checking the Retraction Watch Hijacked Journal Checker linked below.

Retractions and Post-Publication Criticism

The Retraction Watch Database is the largest collection of information about retracted articles in the hard sciences, social sciences, and humanities.  One study found that it identified 95.4% of a set of known retracted articles, far more than other scholarly databases (p. 3).  The Retraction Watch Database is integrated into both the UCLA Library Catalog and Zotero.  The document linked below walks through how to check for retractions in the UCLA Library Catalog.  This LibGuide from the University of Maine provides some additional examples for practicing searches for retracted articles.

For biomedical articles, you may search PubMed for retractions as a publication type or try this search: ("Retraction of Publication"[Publication Type] OR "Retracted Publication"[Publication Type] OR retraction[Title]).  The Texas Medical Center Library suggests this search to look for errata and other corrections in PubMed: ("Published Erratum"[Publication Type] OR errata OR erratum OR corrigenda OR corrigendum OR "expression of concern").
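If you want to run this kind of search programmatically rather than through the PubMed website, NCBI's public E-utilities API accepts the same query syntax.  The sketch below is illustrative only and not part of the guide's sources; the helper name and the example topic are assumptions:

    import json
    import urllib.parse
    import urllib.request

    def pubmed_retraction_ids(topic):
        """Return PubMed IDs of retraction records matching a topic
        (hypothetical helper built on NCBI's public E-utilities API)."""
        term = (f"({topic}) AND "
                '("Retraction of Publication"[Publication Type] '
                'OR "Retracted Publication"[Publication Type])')
        params = urllib.parse.urlencode(
            {"db": "pubmed", "term": term, "retmode": "json", "retmax": 20})
        url = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi?" + params
        with urllib.request.urlopen(url) as resp:
            data = json.load(resp)
        return data["esearchresult"]["idlist"]

    # Example: retraction records mentioning longevity genetics
    print(pubmed_retraction_ids("longevity genetic"))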

Web of Science also allows searching for retractions by Document Type.  As with PubMed, you will need to search for: (Retracted Publication (Document Type) OR Retraction (Document Type) OR Withdrawn Publication (Document Type)).  Errata and expressions of concern are also document types: (Correction (Document Type) OR Correction, Addition (Document Type) OR Expression of Concern (Document Type) OR Publication with Expression of Concern (Document Type)).  Unfortunately, other databases do not treat retractions as a separate publication type.  In some databases, searching for retractions in the title field works moderately well, but it is not a reliable method on its own.
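If you prefer typing queries into Web of Science's advanced search rather than using the filter menus, document types can be targeted with the DT field tag.  A query along these lines should reproduce the retraction filters above, though the exact tag and value names are an assumption based on Web of Science's advanced-search syntax and worth verifying against its help pages:

    DT=("Retracted Publication" OR "Retraction" OR "Withdrawn Publication")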

In addition to checking for retractions, you may want to check for criticism or questioning of articles on PubPeer, an "online platform for post-publication peer review," either on the site itself or through the PubPeer browser extension, which is available for most major browsers.  One study notes that "over two thirds of the comments are posted to report some type of misconduct, mainly concerning image manipulation."  It is worth noting that users may leave comments anonymously on PubPeer, a practice some have criticized but one that allows concerns to be reported without fear of retaliation.

Citing Retracted Literature

There may be times when you want to cite to the fact of an article's retraction.  This footnote provides a good model:

See, e.g., Paola Sebastiani et al., Genetic Signatures of Exceptional Longevity in Humans, Science (July 1, 2010), http://www.sciencemag.org/cgi/content/abstract/science.1190532v2 (finding, through improper methods, inaccurate associations between genetic variants and longevity, which prompted Science to issue a retraction); Tina Hesman Saey, Critics Point to Flaws in Longevity Study, 178 Sci. News 10 (2010), available at http://www.sciencenews.org/view/generic/id/61050/title/Deleted_Scenes_Critics_point_to_flaws_in_longevity_study (commenting on the Sebastiani error).

Michelle D. Irick, Age of an Information Revolution: The Direct-to-Consumer Genetic Testing Industry and the Need for A Holistic Regulatory Approach, 49 San Diego L. Rev. 279, 340 n. 128 (2012).

Additionally, there may be times when so much of an academic's work has been retracted that it is wise to note this, even if the particular work cited has not itself been retracted (it is also worth considering how reliable you find the remaining work to be).  These footnotes are good examples of this kind of note:

See generally Brian Wansink et al., Bottomless Bowls: Why Visual Cues of Portion Size May Influence Intake, 13 Obesity Res. 93 (2005). Note, however, that apparent methodological deficiencies have brought the paper under scrutiny as many other papers to emerge from Professor Wansink's Food and Brand Lab at Cornell University have been retracted. Pete Etchells & Chris Chambers, Mindless Eating: Is There Something Rotten Behind the Research?, Guardian (Feb. 16, 2018, 6:21 AM), https://www.theguardian.com/science/head-quarters/2018/feb/16/mindless-eating-brian-wansink-is-there-something-rotten-behind-the-research [https://perma.cc/RY9V-MTCZ].

Kyle Langvardt, Regulating Habit-Forming Technology, 88 Fordham L. Rev. 129, 185 n. 94 (2019).

See, e.g., Bucher et al., supra note 128, at 2553. The work of a leading proponent of the “libertarian paternalist” approach to influencing dietary choices, Brian Wansink, is being reevaluated in response to revelations that he and his co-authors regularly manipulated data to achieve desired results. See, e.g., Stephanie M. Lee, The Inside Story of How an Ivy League Food Scientist Turned Shoddy Data into Viral Studies, Buzzfeed (Feb. 25, 2018), https://www.buzzfeed.com/stephaniemlee/brian-wansink-cornell-p-hacking?utm_term=.jqmLO8dWm#.ekk8VaEwJ [https://perma.cc/8L5N-ZPWP].

Nathan A. Rosenberg & Nevin Cohen, Let Them Eat Kale: The Misplaced Narrative of Food Access, 45 Fordham Urb. L.J. 1091, 1120 n. 134 (2018).

Scientific Integrity/Data Quality Blogs