How to Retard Scientific Progress

I found a great quote and analogy in an essay published in Current Biology by Peter Lawrence titled The Mismeasurement of Science. This essay takes a look at how science is measured and examines the use of impact factors and other metrics used to gauge scientific progress for individual scientists, academic departments and institutions.

The quote is actually from Leo Szilard, the famous Manhattan project physicist. It comes from his short science fiction story The Mark Gable Foundation from The Voice of the Dolphins: And Other Stories (read on Google Books):

“You could set up a foundation with an annual endowment of thirty million dollars. Research workers in need of funds could apply for grants, if they could make a convincing case. Have ten committees, each composed of twelve scientists, appointed to pass on these applications. Take the most active scientists out of the laboratory and make them members of these committees. …First of all, the best scientists would be removed from their laboratories and kept busy on committees passing on applications for funds. Secondly the scientific workers in need of funds would concentrate on problems which were considered promising and were pretty certain to lead to publishable results. …By going after the obvious, pretty soon science would dry out. Science would become something like a parlor game. …There would be fashions. Those who followed the fashions would get grants. Those who wouldn’t would not.”

The analogy is Lawrence’s own and relates to songwriters being assessed in the same way as scientists, an analogy I can relate to, having come to science from the music industry:

“It is fun to imagine song writers being assessed in the way that scientists are today. Bureaucrats employed by DAFTA (Ditty, Aria, Fugue and Toccata Assessment) would count the number of songs produced and rank them by which radio stations they were played on during the first two weeks after release. The song writers would soon find that producing junky Christmas tunes and cosying up to DJs from top radio stations advanced their careers more than composing proper music. It is not so funny that, in the real world of science, dodgy evaluation criteria such as impact factors and citations are dominating minds, distorting behaviour and determining careers.”

A Scientific “Audit Society”

Lawrence suggests that our scientific “audit society” has put meeting the demands of the holy impact factor above understanding nature and disease. He predicts that citation-fishing and citation-bartering will increase. Citation-fishing is putting your name on an author list when you made no intellectual contribution, such as merely providing a reagent. I am not entirely sure what citation-bartering means, but I suspect it has to do with journals “encouraging” submitters to cite more articles from that particular journal to facilitate acceptance of the paper.

One problem with using standardized metrics to evaluate scientific progress is the aggregation of high-citation-potential research in only a handful of the “top” journals. Instead of a more even spread of the literature across journals that are topically appropriate for particular papers, the trendy research of the day is published in high-profile journals, while the more topical journals are left in the dust. This prevents lower-“impact” journals from escaping a certain impact factor range, making them less appealing to new researchers whose papers fit more readily within the scope of these journals and could potentially have been read by more people in their field. Getting your paper read by the right people is the real impact, in my opinion. I recently submitted a species description to a journal in which several descriptions of species in that family of animals have been published, in hopes it would reach the broadest audience of interested people. I hope my next paper will be open access, though.

“Impact” and Taxonomy

Standardized metrics, namely the impact factor, may have a tremendously negative impact on taxonomy and taxonomists. In part, this has to do with the behavior of most scientists with regard to the field. Taxonomic works are virtually never cited in the bibliographies of scientific papers. This is most pervasive in ecology, because ecologists often use detailed keys and species descriptions in their work to confirm identifications. The species is the fundamental unit of biology, especially in ecology. Yet you never see statements in the materials and methods sections of such papers like “polychaetes were identified to genus using Fauchald 1977”. Meigen 1830, who described Drosophila melanogaster, or Maupas 1900, who described Caenorhabditis elegans, or even Linnaeus 1753, who described hundreds of species including Arabidopsis thaliana, should be among the most cited publications that ever existed, based on the amount of research published using these model organisms.

Ecologists often identify specimens on hearsay. By this, I mean someone told them what a species was, so that is what it will be called. This is how I started out. Senior grad students had done most of the identification work for me; I just needed to remember the species list or compare what I found with what was on the shelves. The problem with this was that the preliminary identifications were sometimes wrong, or closer examination revealed that a single “species” was actually more than one. The former resulted in my shrimp paper and the latter in my anemone paper. I can’t overstate the importance of checking the facts for yourself! Christopher Taylor at the Catalogue of Organisms underscores the importance of taxonomy based on his own research experience with harvestmen (Opiliones, arachnids).

The profile of taxonomy would benefit greatly if people included citations to the works they used, whether individual species descriptions, large monographs, revisions or identification keys. Taxonomic works may be the only publications that are at once so heavily used and so systematically ignored by the “big science” community, including medicine, molecular biology, biochemistry and evolutionary biology in addition to ecology and conservation science. This attitude toward taxonomy, combined with the managerial approach of modern science practices, has devalued the stature of the taxonomist to that of a service provider, working essentially for free, for the greater community without due recognition. This is exploitation.

Jobs are becoming scarce and funding even scarcer; demand increases, yet fewer positions open up. The top-heavy age structure of the taxonomic workforce threatens to make this valuable profession even rarer as the old guard retires and their knowledge and skills are buried with them. This is especially true in the U.S., where most biology majors will never see a systematics class in their undergraduate handbook.

*Modified from an August, 2007 post on The Other 95%.

9 Replies to “How to Retard Scientific Progress”

  1. interesting. science is like any other institution; it gets riddled by cultural and institutional sclerosis.

  2. Except science doesn’t seem to want to go to the hospital for surgery when it needs it most…

  3. Christina Pikas also brought this to my attention: http://www3.interscience.wiley.com/journal/122648501/abstract?CRETRY=1&SRETRY=0

    She wrote up a note on it at her blog: http://scienceblogs.com/christinaslisrant/2010/01/very_quick_note_on_things_that.php

    It’s very interesting and true: most data sources are not cited. Sometimes they appear only in the text of the Materials and Methods (i.e. “we used data obtained from OBIS, Fauchald 1977,…”), but they should be properly referenced. Citations are about transparency in your sources. They acknowledge what influenced your research planning and direction, analyses, and writing. They also acknowledge which sources you didn’t use! The bibliography is what gets imported into Web of Science. Listing your data sources, including taxonomic references, there is the only way the use of these sources can be tracked and the authors properly credited for their work.

  4. As someone who started in taxonomy, I admit I never thought about whether we were being neglected due to lack of citations. I guess it depends on whether listing the authority for a species counts as a citation. I am guessing it does not, since it probably doesn’t result in the bibliographic reference going into the Reference list for that paper.

    On the other hand, if you were cited every time a species you described was mentioned, you would end up with a horrible skew based on the popularity of the SPECIES, not the PAPER. Thus, if you were the lucky SoB who described Arabidopsis, Drosophila or Caenorhabditis, you’d be cited a gazillion times, which is no reflection on the quality of the work. Then you’re right back to the jingle-writer scenario.

    It comes down to the fact that any attempt to objectively measure the value of a scientist’s contribution is likely to be difficult if not impossible. Relative value, yes; objective scoring – I think not.

  5. Nice writeup and aggregation of sources! It’s encouraging to see writing about the damage “impact-factor mentality” does, since there is apparently an academic camp that really believes we should

    “…make more use of citation and journal ‘impact factors,’ from Thomson ISI… If we add those scores to a researcher’s publication record, the publications on a CV might look considerably different than a mere list does.”

    Different != better. Grr.
