Harvey et al. attack article mum on real selection process for polar bear papers used in their analysis

The Harvey et al. Bioscience article that attacks this blog and others that link to it — a veritable tantrum paper that took 14 people to write — included a sciency-looking analysis of peer-reviewed articles said to have been retrieved by the database “Web of Science” using the search terms “polar bear” and “sea ice.”

Temper-Tantrum graphic

“Consensus science pounds the floor and chews the carpet in angry frustration.” [mpainter, 25 December 2017]

Other critics have pointed out that the Harvey paper used 92 such references:

“Of the 92 papers included in the study, 6 are labeled ‘controversial.’ Of the remaining 86, 60 are authored or co-authored by Stirling or Amstrup, or Derocher. That is, close to 70% (69.76%) of the so-called ‘majority-view’ papers are from just three people, 2 of whom wrote the attack paper themselves.” [Shub Niggurath, crossposted at Climate Scepticism, 14 December 2017]

The bias introduced by using the co-authors’ own papers to represent the “expert consensus” on polar bear biology is only one problem with this particular attempt at making the Harvey paper look like science: in fact, the short list of papers used for the analysis is a far cry from the original number returned by Web of Science for the search terms the authors say (in the supplementary information) they used.

How that large original number (almost 500) was whittled down to fewer than 100 is not explained by the authors. As a consequence, I can only conclude that the “methodology” for paper selection was likely defined after the fact. While the stated method of paper selection sounds simple and reasonable, apparently not one of the Harvey et al. paper’s co-authors checked to see whether it was plausible (or didn’t care if it was not).

Selected references were included in the principal component analysis depicted in Fig. 2 of Harvey et al. 2017 (H17):

Harvey et al. 2017 fig 2

Let’s have a look at the details behind the differences between what Web of Science offers for the search terms “polar bear” and “sea ice” and what Harvey and colleagues used for their analysis.

For those unfamiliar with the Web of Science database, here is what the beginning of the search results looks like: each entry has a number (in this case, sorted by date, newest first) and the search key words are highlighted wherever they appear in the title. Clicking on a title brings up the abstract and the list of key words provided by the authors, with the search key words again highlighted wherever they appear.

Web of Science search for polar bear plus sea ice 472

Searching for [polar bear] and [sea ice] (without quotes) generates 472 entries (where any of the four key words appears in the title, abstract, or list of author-generated key words), above. Searching for “polar bear” and “sea ice” (with quotes) filters out the stand-alone “polar” and “ice” papers etc. and generates a total of 201 entries, below.

Web of Science search for polar bear AND sea ice example 201

It appears that the cruder, ‘no quotes’ [polar bear] and [sea ice] strategy must have been used by H17 (if any such search was used at all), because the ‘with quotes’ option does not return some of the papers they cited, such as Tartu et al. 2017. However, a few of the 472 references returned by the cruder method are not relevant, since they match only on stand-alone “ice” or “polar” rather than the full phrases, making some kind of post-search selection necessary. In addition, keep in mind that some papers known to exist on polar bears and sea ice don’t show up in either of these Web of Science searches for various reasons (many early polar bear papers were published as technical government reports, for example), so it’s important to understand that this database is far from a comprehensive record of the polar bear literature.
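To make the difference between the two search strategies concrete, here is a minimal sketch in Python of the matching rules as described above. This is only my approximation of the logic for illustration, not the actual Web of Science query engine, and the sample records are invented:

# A rough approximation of the two matching rules described above -- NOT the
# real Web of Science query engine. The sample records are invented.

def searchable_text(record):
    """Combine the fields searched: title, abstract and author key words."""
    return " ".join([record["title"], record["abstract"]] + record["keywords"]).lower()

def crude_match(record):
    # 'No quotes' search: any of the four individual words is enough,
    # so papers mentioning only "polar" or only "ice" can slip in.
    text = searchable_text(record)
    return any(word in text for word in ("polar", "bear", "sea", "ice"))

def phrase_match(record):
    # 'With quotes' search: both exact phrases must appear.
    text = searchable_text(record)
    return "polar bear" in text and "sea ice" in text

records = [
    {"title": "Invented paper on polar bears and sea ice decline",
     "abstract": "Discusses polar bear use of sea ice habitat.",
     "keywords": ["Arctic", "Ursus maritimus"]},
    {"title": "Invented paper on ice algae beneath polar stratospheric clouds",
     "abstract": "No bears here at all.",
     "keywords": []},
]

print([r["title"] for r in records if crude_match(r)])   # both records match
print([r["title"] for r in records if phrase_match(r)])  # only the first matches

Either way, some filtering after the search is unavoidable; the question is what rules were actually applied.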

But first, before we look in detail…

Harvey et al. (H17) vs. the references Crockford has used

I’ve gone through the H17 list and marked in yellow in their pdf those references I have cited in my papers and/or blog posts, and underlined in green those references I have not yet cited but have read (i.e. they are in my personal library of pdfs). Of the 92 papers included in the H17 analysis, 53 (58%) have been cited by Crockford in her papers and/or blog posts. Of the 39 papers not cited by Crockford, 15 (38%) reside in her personal library (i.e. read but not cited).

Oddities in the choice of papers used in H17 analysis

Why were some Web of Science references included while others were not? The authors of H17 don’t say. While a few are simply not relevant, it’s puzzling why so many were excluded that seem to fit the search parameters.

For example, the most recent entry from the cruder Web of Science search that was also included in H17 is #12, Tartu et al. ("Sea ice-associated decline in body condition leads to increased concentrations of lipophilic pollutants in polar bears (Ursus maritimus) from Svalbard, Norway"), published 1 October 2017. It has both ‘sea ice’ and ‘polar bears’ in the title:

Tartu and Durner papers in 2017_Web of Science entries

Oddly, the very next Web of Science citation, #13, is Durner et al. 2017 ("Increased Arctic sea ice drift alters adult female polar bear movements and energetics"), published September 2017 (see screen cap above). It also lists ‘sea ice’ and ‘polar bear’ in the title, yet only Tartu et al. was included in the H17 analysis. Ironically, this Durner et al. paper was one I critiqued recently (USGS treadmill paper repeats bogus claim that polar bears suffer from ice loss, 11 Jun 2017), citing 12 references from so-called ‘consensus polar bear experts.’

Similarly, of listings 41 through 44 (below), only entry #44, Regehr et al. 2016 ("Conservation status of polar bears (Ursus maritimus) in relation to projected sea-ice decline"), was included in H17: entry #41, Pilfold et al. 2017 ("Migratory response of polar bears to sea ice loss: to swim or not to swim"), was not:

Pilfold 2017 vs Regehr 2016 Web of Science marked

Similarly, Iacozza and Ferguson 2014 ("Spatio-temporal variability of snow over sea ice in western Hudson Bay, with reference to ringed seal pup survival") was included in H17 even though ‘polar bear’ does not show up in the title or abstract, only in the so-called “key words plus” list. It’s mostly about seals.

Iacozza et al 2014 expanded entry in Web of Science

However, Ferguson et al. 2017 (“Demographic, ecological, and physiological responses of ringed seals to an abrupt decline in sea ice availability”) also lists sea ice in the title and polar bear in the author key words but was not included in H17.

So why Iacozza and Ferguson 2014 but not Ferguson et al. 2017? Both are about polar bear prey.

Ferguson et al. 2017 WHB ringed seals Web of Science

Similarly, of the entries #414 to #420 below, only #419, Stirling, Lunn, and Iacozza 1999 ("Long-term trends in the population ecology of polar bears in western Hudson Bay in relation to climatic change"), was included in H17. Both #414, Ferguson et al. 2000 ("Relationships between denning of polar bears and conditions of sea ice"), and #417, Ferguson et al. 2000 ("Influence of sea ice dynamics on habitat selection by polar bears"), were left out:

Stirling Lunn Iacozza 1999 and surrounds_Web of Science marked

Oddly, the following papers came up in the Web of Science search and seem like obvious fits for the purpose, but were not included in H17 for reasons we may never know:

34. Bechshoft et al. 2013 (“Polar bear stress hormone cortisol fluctuates with the North Atlantic Oscillation climate index”)

Bechshoft et al 2013 number 34 Web of Science

270. Meek 2011 (“Putting the US polar bear debate into context: The disconnect between old policy and new problems”)

Meek 2011 US pb debate in Web of Science

403. Mauritzen et al. 2001 ("Space-use strategies of female polar bears in a dynamic sea ice habitat")

Mauritzen et al. 2001 Web of Science

407. Ferguson et al. 2001 ("Activity and movement patterns of polar bears inhabiting consolidated versus active pack ice")

Ferguson et al. 2001 Web of Science

440. Stirling 1997 ("The importance of polynyas, ice edges, and leads to marine mammals and birds")

Stirling 1995 importance of polynyas_conference paper_Web of Science

471. Smith 1980 ("POLAR BEAR PREDATION OF RINGED AND BEARDED SEALS IN THE LAND-FAST SEA ICE HABITAT")

472. Lentfer 1975 ("POLAR BEAR DENNING ON DRIFTING SEA ICE")

Smith and Lentfer 1980 and 1975 Web of Science last two entries

Lastly, and perhaps most concerning: all of the following papers (11 in total) were included in the H17 analysis but do not appear in the Web of Science search results, leaving us to wonder how they came to be selected (a simple sketch of this cross-check follows the list):

Fagre et al. 2015 (“A review of infectious agents in polar bears (Ursus maritimus) and their long-term ecological relevance”)

Hobson et al. 2009 (“Isotopic homogeneity of breath CO2 from fasting and berry-eating polar bears: implications for tracing reliance on terrestrial foods in a changing Arctic”)

Luque et al. 2014 ("Spatial behaviour of a keystone Arctic marine predator and implications of climate warming in Hudson Bay")

Matejova 2015 (“Is Global Environmental Activism Saving the Polar Bear?”)

McKinney et al. 2014 (“Validation of adipose lipid content as a body condition index for polar bears”)

Nuijten et al. 2016 (“Circumpolar contaminant concentrations in polar bears (Ursus maritimus) and potential population-level effects”)

Parsons and Cornick 2013 (“Politics, people and polar bears: A rebuttal of Clark et al. (2013)”)

Stone and Derocher 2007 (“An incident of polar bear infanticide and cannibalism on Phippsøya, Svalbard”)

Styrishave et al. 2017 (“Steroid hormones in multiple tissues of East Greenland polar bears (Ursus maritimus)”)

Tyrrell et al. 2014 ("What happened to climate change? CITES and the reconfiguration of polar bear conservation discourse")

Wiig et al. 2008 (“Effects of climate change on polar bears”)
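For anyone who wants to repeat this comparison, the check boils down to a set difference between the H17 reference list and the Web of Science search export. Below is a minimal sketch in Python; only a handful of records are shown, abbreviated to author and year, rather than the full lists I compared by hand:

# Compare the papers cited by H17 with a Web of Science search export and
# report those that do not appear. Only a few abbreviated entries are shown.

h17_papers = {
    "Fagre et al. 2015",
    "Wiig et al. 2008",
    "Tartu et al. 2017",        # an example that IS in the search result
}

web_of_science_export = {
    "Tartu et al. 2017",
    "Durner et al. 2017",
}

not_in_search = sorted(h17_papers - web_of_science_export)
print(not_in_search)  # -> ['Fagre et al. 2015', 'Wiig et al. 2008']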

Harvey et al. state:

“A broad keyword search on the internet and the ISI Web of Science database yielded 90 blogs (described above) and 92 peer reviewed papers reporting on both Polar bears and arctic ice.”

Although readers might assume that the Web of Science database was searched exclusively for the peer-reviewed papers and the Internet searched only for blogs, it is clear from my analysis above that the short list of peer-reviewed papers used in the H17 ‘analysis’ was specifically selected, using unknown criteria, from both the Internet and the Web of Science.

In other words, the Harvey et al. authors used a subjective and potentially very biased sample of the available polar bear/sea ice literature to “analyze” for consensus perspectives. I can only conclude that the stated “methodology” for paper selection was likely defined after the fact and does not reflect the methodology actually used.
