[Todos] Fwd: Report on corruption at Exactas - Part 26

fabio vicentini fmvicent en gmail.com
Wed Nov 27 06:56:55 ART 2013


---------- Forwarded message ----------
From: fabio vicentini <fmvicent en gmail.com>
Date: Wed, Nov 27, 2013 at 6:55 AM
Subject: Report on corruption at Exactas - Part 26
To: "presidencia en conicet.gov.ar" <presidencia en conicet.gov.ar>,
FulbrightNEXUS en iie.org, info en fundacionsadosky.org.ar, info en mincyt.gob.ar,
prensa en mincyt.gov.ar, prensa en me.gov.ar


This series on academic corruption stresses the mediocrity of what gets
published under the *publish or perish* doctrine. Nobody has responded to
any of my articles. Perhaps that is because I am deranged, as the dean
claims. But I want to show that I am not the only madman in this world
worried about academic *bullshitting*. What follows is a summary of an
article by a physicist.





PUBLISH OR PERISH – AN AILING ENTERPRISE?

Mohamed Gad-el-Hak



Three recent events, taking place in rapid succession, incited me to write
this Opinion.



The first was an annual report from a major school of engineering whose
dean proudly listed 52 papers that he wrote in the course of the previous
year. Such an output is, on average, one idea conceived, executed, written,
and published every week. That is an amazing feat for a busy administrator,
or anybody else for that matter.



The second was a physics professor who was introduced at a meeting as the
author of 80 books. This man was not the superhumanly prolific Isaac
Asimov, but a professor with a publication rate, over a 20-year career, of
one technical book every three months.



The third was a book on flow control I was asked to review for a journal.
The 200-page, camera-ready manuscript was clearly never seen by a
copyeditor and was mostly a shoddy cut-and-paste job from the author's
doctoral dissertation--and worse, from the publications of others. The book
offered little of value, yet it was priced at 50 cents per page.



The three events are symptomatic of what is ailing academic publishing
today.



*Academic institutions in the US have made it imperative for faculty
members to publish in order to survive and prosper*. There is nothing wrong
with that principle if it emphasizes quality rather than quantity. For the
most part, that emphasis on publishing has worked for many decades. The
number of publications was reasonable, and tenure and promotion decisions
in research universities were largely based on the impact of a candidate's
scholarly work, as measured by the number of citations and, less
quantitatively, by expert opinions. The number of journals and consequently
the number of requests for refereeing were both manageable. Overall,
technical books were published when a senior researcher with years of
experience had something significant to write about.



Unfortunately, today we witness a different environment from that of a
generation ago. The publish-or-perish emphasis for some, but not all,
institutions *has deteriorated into bean counting*, and the race is on to
publish en masse. Demand spurs supply. Mostly-for-profit *publishers of
books and journals have mushroomed*, and mediocrity has crept into both
places. *Journal pages have to be filled*, and library shelves have to be
stacked with books. *The number of periodicals worldwide currently stands
at 169,000*, and the number of books published in the US alone in 2001 is
56,364. Of
course, not all of these are academic publications, but the sheer numbers
are frightening enough. Currently, more journals in a particular research
field are published than anyone can reasonably keep up with. *The
publishing craze has now extended to all-electronic journals*. Many
articles, both print and electronic, remain without a single citation five
or more years after publication. Although more difficult to measure, *I
presume even more papers remain unread by anyone other than their
authors*. Given the way some papers list their authors today, some
articles may not even be read by all their respective coauthors.



One measure of journal quality is the *impact factor*, which is defined,
for a specific year, as the total number of citations made in that year for
articles published in the two preceding years divided by the number of
citable articles published in those years (see the article by Henry H.
Barschall, "The Cost-Effectiveness of Physics Journals," Physics Today,
July 1988, page 56).
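
Written as a formula (this simply restates the definition above; the 2013
figures below are hypothetical, chosen only to illustrate the arithmetic):

  \mathrm{IF}_{y} = \frac{C_{y}(y-1) + C_{y}(y-2)}{N_{y-1} + N_{y-2}}

where C_{y}(x) is the number of citations made in year y to articles the
journal published in year x, and N_{x} is the number of citable articles
it published in year x. A journal with 100 citable articles in 2011, 120
in 2012, and 440 citations to those articles during 2013 would thus have
\mathrm{IF}_{2013} = 440 / (100 + 120) = 2.0.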



The peer review system, although criticized by some as somewhat biased
against unorthodox ideas, is essential to weed out the charlatans, the
misguided, and the fools. Peer review must be preserved if not
strengthened. However, more papers published means that, on average, each
researcher receives more requests for refereeing. *The good referees are
inundated with more papers to review than they can possibly handle*. *Other
types of referees typically do not do a thorough job, and mediocre papers
make it through the system*. Of course, shoddy work always existed and
competed with good work for journal space. *But with the deluge of new
journals, enough shoddy work is now being done to fill whole
journals*. *Hopping
from one journal to another until something is eventually accepted for
publication is fast becoming a pastime for some* researchers.



*When did the bug strike?* Although our malaise was slow in the beginning,
it accelerated in a classic chain-reaction fashion. About 15 years ago, the
problem became perceptible (see, for example, two great Reference Frame
columns by David Mermin, "What's Wrong With This Library?" Physics Today,
August 1988, page 9, and "Publishing in Computopia," Physics Today, May
1991, page 9), about the same time that grade inflation took hold (although
some trace the roots of this to the Vietnam era); instant gratification
became a birthright; and, in the film "Wall Street," Gordon Gekko declared
that greed is good. I make no claim of causality.



In an ideal world, *counting the publications of individuals should not be
used to evaluate them.* Instead, the impact of the individual's
publications should be what is important. But measuring impact is neither
easy nor straightforward, despite the availability of the Science Citation
Index and similar measuring tools. For example, particularly for young
researchers, the number of citations per publication is a fairer index of
competence than the total number of citations.
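
A hypothetical comparison (the numbers are invented) shows why the
per-paper figure is fairer to someone early in a career:

  \frac{60}{4} = 15 \;\text{citations/paper} \qquad \text{vs.} \qquad
  \frac{900}{300} = 3 \;\text{citations/paper}

A researcher three years past the PhD with 4 papers and 60 citations
cannot compete on total citations with a veteran holding 300 papers and
900 citations, yet the per-paper measure points to the younger
researcher's work as having had the greater impact.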



Some time during the last 15 years, *bean counting became acceptable to
some universities.* As researchers found they were not getting sufficient
credit for producing high-impact publications, they decided to publish more
papers. *A tendency developed to add undeserving coauthors*. *The
cut-and-paste button on the computer facilitated the exponential growth of
papers*. *More and more journals entered the marketplace* to absorb the
additional demand for pages *and accelerated the need for editors and
referees*. *The competency of both suffered*. Of course, many journals
kept or even elevated their already high standards. *Journals quickly
stratified into elite and second- and third-tier publications.*



-----------------------

 Mohamed Gad-el-Hak is the Inez Caudill Eminent Professor of Biomedical
Engineering and chair of mechanical engineering at Virginia Commonwealth
University in Richmond.