“Our faiths are inextricably linked on any number of things that we must confront and deal with in policy concepts today. Our faiths are inextricably linked on the environment. For many of us, respect for God’s creation also translates into a duty to protect and sustain his first creation, Earth, the planet,”
“Confronting climate change is, in the long run, one of the greatest challenges that we face, and you can see this duty or responsibility laid out in Scriptures clearly, beginning in Genesis. And Muslim-majority countries are among the most vulnerable. Our response to this challenge ought to be rooted in a sense of stewardship of Earth, and for me and for many of us here today, that responsibility comes from God.”
Pathetic handwaving double down from the UN
Eric Worrall writes: A number of MSM outlets are carrying news of a “leaked” UN document, which claims that global warming may be causing irreversible damage.
According to the Bloomberg version of the leak story:
“Global warming already is affecting “all continents and across the oceans,” and further pollution from heat-trapping gases will raise the likelihood of “severe, pervasive and irreversible impacts for people and ecosystems,”
The problem with this vapid handwaving nonsense is that it is so vague. I mean, in the good old days, alarmists made interesting predictions:
Snowfalls are now just a thing of the past
Al Gore’s ice-free Arctic (in 5 years!)
Rain will never fill Australian reservoirs again
The great thing about bold predictions is they are easily falsified – all you have to do is wait a few years, then point and laugh.
The survivors of that golden age of bold stupidity are far too timid – they issue vague predictions of calamity which won’t occur until long after we are all safely dead, and promises that if we wait a few decades we might see something worrying.
I mean, seriously folks, is this the best you can do? Can even the most rabid alarmists get enthused by such a pathetic effort?
“We are made miserable . . . not just by the strength of our beliefs, but by the weight of hard and all-too real situations, as they bear downward, robbing us of control . . . unhappiness treated by clinicians has much more to do with the sufferer’s situation than with anything about themselves, and for those with few privileges, this unhappiness is pretty well beyond the reach of therapeutic or any other conversation.” – Paul Moloney “The Therapy Industry”
Robin Williams’s body was scarcely cold when liberal commentators began using the tragedy of his death as publicity for suicide hotlines and professional mental health intervention in general. He had long-standing depression, we were told, and his “mental illness” was manifest in his decision to take his own life. Depression sufferers were urged to “be honest” and avail themselves of the services of professional therapists and counselors.
Days later Williams’s widow informed the world that her husband had been diagnosed with Parkinson’s disease, a degenerative disorder that even people with no prior history of depression can find impossible to face. Parkinson’s is chronic, and its symptoms worsen over time, leading to body tremors, muscle stiffness, and the loss of coordinated movement. No one knows why the disease develops, and it is incurable.
We do not know what went through Williams’s mind, of course, but it is not difficult to entertain the idea that the lifelong actor made an understandable decision to take an early exit from life’s stage rather than suffer the appalling loss of body control that the disease entails for its sufferers. Surely there is something more than “mental illness” involved in the desire to avoid such a fate.
Even if Williams’s well-known depression, which long predated his Parkinson’s diagnosis, was involved in his decision to end his life, the liberal notion that we can and ought to rely on mental health professionals to guide us to health and sanity is more than a little suspect. There is no evidence that this group suffers lower rates of depression than the rest of the population, nor any evidence that therapy of any kind can cure it. In fact, the evidence suggests that the mental health profession plays a crucial role in perpetuating a status quo within which depression is said to be growing by leaps and bounds.
Psychoanalyst Joel Kovel demonstrated in the early 1980s that psychotherapy and counseling had become indispensable parts of the capitalist economy, especially in the United States, where turning socially induced misery into false questions of self-improvement long ago reached the status of a quasi-religious movement. Subsequent to Kovel’s published insights came the “diseasing” and drugging of hyper-active American schoolchildren due to what eventually came to be known as “ADHD.” In more recent years, we have seen how “happiness psychology,” particularly the work of conservative academic and writer Martin Seligman, a former chairman of the American Psychological Association and adviser to the U.S. military, informed the Bush Administration’s torture program at Guantanamo Bay. All of this should make us quite skeptical about claims that therapy and counseling have the answer to our mental woes.
Having said that, the challenge of effectively treating mental disorders is surely formidable. According to surveys and clinical data, rates of depression in the U.S. have increased ten-fold since the 1950s, although it must be admitted that individuals with quite divergent symptoms are routinely classified under this broad umbrella, calling into question the validity of the category itself. However, even if some of the increase is due to an increased tendency to define common dissatisfaction as illness, it seems likely that at least some of the increase is genuine, given soaring inequality and an attendant increase in chronic illness, social isolation and reported loneliness, and suicide, especially during the periods of economic crisis that have become a nearly constant feature of U.S. capitalism in recent years.
Contrary to therapeutic claims that a “positive” attitude is the key to mental health, a growing body of evidence supports the claim that the principal influence on people’s mental health is their circumstances, both past and present. We can now say with some assurance that the larger and more obvious the gaps between rich and poor in developed societies – and the more exploitive the relations required to maintain and expand them – the greater the likelihood of violent conflict, mutual distrust, and degraded health, both mental and physical. Features of a particular location in the social hierarchy such as prestige, conditions of work, material circumstances, and wealth largely determine one’s likelihood of enjoying mental and physical health or illness. And to the extent that one belongs to a stigmatized, exploited group, and especially if one is poor, the more likely one is to experience life’s hardest blows – more often, more painfully, and with fewer joyful experiences to compensate for them.
Conventional counseling and therapy aren’t even focused on this problem, much less offering a solution to it. Because of its conviction that attitude is everything, the conventional approach puts the onus of responsibility on the poor for their poverty. Thus they are given parenting training and other judgmental interventions when what they really need is decent housing, food, recreation, medical care, and above all, money. The assumption is that the poor deserve to be poor owing to their allegedly deficient character, made manifest in poor impulse control, hypersexuality, and a general lack of integrity. If it weren’t for these defects, the theory goes, the poor would be contented members of the middle class. This is one of the most damaging features of therapy, because it teaches exploited people that they are deficient or substandard instead of abused. Unfortunately, the crude stereotypes blaming the poor for their plight are promoted by a wide spectrum of members of the so-called “helping” profession: community leaders, social work educators, and quite a few academic researchers. If this is “help,” what might hindrance be?
Therapists and counselors with a genuine interest in finding a cure for mental illness would do well to investigate the income inequalities hypothesis of population health. Based on the common sense assumption that high levels of inequality are unhealthy (directly for the poor, indirectly for the rich), the thesis is that for modern industrialized countries, the average health, well-being, and longevity of the population depends not on the level of absolute poverty that exists, but on the spread of wealth, and especially on the gap between rich and poor.
As income differentials widen, the theory goes, people start to feel more competitive, and begin to look on others with increasing suspicion and distrust. Wariness, envy, shame, fear, and anger become more pronounced and take on a self-perpetuating thrust, undermining the basis for affectionate and caring relationships. A life of perpetual insecurity (which former Fed Chairman Alan Greenspan declared in Congressional testimony was the principal reason for the 1990s boom years) and perceived threat triggers the release of cortisol and other “stress” hormones into the bloodstream, lowering our capacity to fight infection and ward off heart disease and other degenerative conditions. It should be emphasized that the theory maintains that this harms even the rich, who, amidst increasingly unjust conditions, have less and less opportunity to enjoy their wealth in ease. The public health implications are substantial: an increase of 7% in the share of income going to the bottom half of the population allegedly yields two additional years of life expectancy. [Note: The U.S. has the most unequal distribution of wealth in the developed world. According to the most recent survey by the Federal Reserve, the top decile owns 71% of the country's wealth, while the bottom half claims just one percent.]
One of the more intriguing mental health research findings undermines the “positive attitude” theorists. It shows that moderately depressed people have a more accurate perception of their abilities and their capacity to control events than do “healthy” people. A 2002 study found that mildly depressed women were more likely to live longer than non-depressed or severely depressed women. A longitudinal study of more than 1000 California schoolchildren concluded that optimism was more likely to lead to premature death – possibly because the optimists took more risks. Another study among pre-teenagers found that kids who were more realistic about their standing among their peers were less likely to get depressed than those who had illusions about their popularity. And a 2001 study co-authored by the guru of happiness psychology himself – Martin Seligman – found that among older people pessimists were less likely to fall into depression following a negative life event such as the death of a family member than were optimists.
These findings should provoke a complete reorientation of, not just the helping professions, but the entire society. After all, psychologists have long convinced us that we are all “CEOs” of self, rationally testing our ideas against reality, and that we become disturbed to the extent that we cannot accept the verdict that reality delivers. In short, to the extent that our ideas are unrealistic we are mentally ill, which should mean that President Obama, the Supreme Court, top executives on Wall Street, and virtually the entire Congress are certifiable lunatics.
But of course it doesn’t mean that. WE who cannot make our peace with a social order dedicated to plunder and destruction are mentally suspect, because responsible adulthood entails setting aside the childish notion that the world can be transformed into something within which a decent person would want to live, in order to concentrate on the supremely important matter of reproducing an increasingly imperiled social order dedicated to getting and spending. This is the reigning definition of sanity in our times. God help anyone who insists that social and political reality, not personal attitudes and reactions, is what needs to be adjusted.
Barbara Ehrenreich, “Bright-Sided – How The Relentless Promotion of Positive Thinking Has Undermined America,” (Metropolitan Books, 2009)
Paul Moloney, “The Therapy Industry – The Irresistible Rise of the Talking Cure, and Why It Doesn’t Work,” (Pluto Press, 2013)
Thomas Piketty, “Capital in the Twenty-First Century,” (Harvard University Press, 2014)
My, the Cook et al. 97% consensus paper is a gift to the climate blogosphere that keeps on giving. Some insightful new posts provide fodder for additional discussion on this.
Ben Pile has a new post Tom Curtis doesn’t understand the 97% paper. Here are a few excerpts that raise some interesting points:
We see now why many environmentalists are so hostile to debate. Permitting debate — even giving the possibility of debate a moment’s thought — shatters the binary opposing categories that have been established in lieu of an actual debate of substance on climate change and what to do about it. The division of the debate into scientists versus deniers is a strategy, but one which has worn thin, as Davey’s performance on The Sunday Politics show revealed, and which Hulme alludes to.
It has been somewhat gratifying that almost all of the criticism of my post I have seen so far is from angry trolls, mostly on Twitter, but one or two popped up to comment on the post. From what I can tell their argument is circular: it is irresponsible to give air/blog time to sceptics because there’s a strong scientific consensus that says they’re wrong.
Tom Curtis (who is, as far as I can tell, a partner in the Skeptical Science blog enterprise) obliges, with archetypal green invective.
There is a large measure of idiocy in Ben Pile’s post, and in Mike Hulme’s endorsement of it.
The architects of the new consensus — Cook et al and their pals — really ought to understand the dynamics of a consensus. If you begin your defence of a consensus by calling those who might belong to it ‘idiots’, the only possible outcome is that the consensus will diminish.
I certainly do know for a fact that some people’s estimates of climate sensitivity are so low as to at least imply, contrary to the IPCC statement, that natural variability might account for more than 50% of the warming in the second half of the 20th century. My argument, however, was that the Consensus Project is too clumsy to capture such a position.
It’s all about endorsing with these guys, isn’t it… Endorsement and rejection. Hulme should have rejected Pile and endorsed Cook et al, because Pile rejects the consensus, whereas Cook et al endorse it, as do most climate scientists. They want agreements and disagreements to be black and white, yes and no, true and false, science and denier.
But as I explain in the post, in the case of Davey, the science is being ignored by a politician, it having been displaced from the debate by the 97% figure.
JC comment: This is an important point. In my No consensus paper, I argue “the consensus building process employed by the IPCC does not lend intellectual substance to their conclusions.”
Moreover, as we have seen in Davey, his predecessors, and his superiors, you can say anything you like about climate change, as long as it doesn’t contradict this view of sides. You could say, for instance, that there will be 10 metres of sea level rise by 2100 and that therefore climate policies are necessary. This claim would exist far away from ‘The Science’. But it would seem to be correct according to the tests applied to it by the Consensus Project. This is disappointing, because Curtis is nearly on to something…
Further, he appears to have picked up that strange censorial attitude noteworthy also in von Storch which presumes that because they do not believe that AGW will lead to catastrophe (which is a respectable position inside the consensus), that therefore scientists who do believe that it will (also a respectable position inside the consensus) must not state that belief in public.
Surely this is a frank admission that there is no consensus on catastrophic climate change? If so, then Curtis is now in a real bind, because this deprives the ‘warmist’ crowd of their moral imperatives. Moreover, most complaints from sceptics are that the catastrophism we are all too familiar with is undue — not that there is no such thing as climate change.
Dan Kahan has chimed in with a post The distracting counterproductive 97% consensus debate drags on. Some points that caught my eye:
But it is demonstrably the case (I’m talking real-world evidence here) that the regular issuance of these studies, and the steady drum beat of “climate skeptics are ignoring scientific consensus!” that accompany them, have had no—zero, zilch—net effect on professions of public “belief” in human-caused climate change in the U.S.
On the contrary, there’s good reason to believe that the self-righteous and contemptuous tone with which the “scientific consensus” point is typically advanced (“assault on reason,” “the debate is over” etc.) deepens polarization. That’s because “scientific consensus,” when used as a rhetorical bludgeon, predictably excites reciprocally contemptuous and recriminatory responses by those who are being beaten about the head and neck with it.
Such a mode of discourse doesn’t help the public to figure out what scientists believe. But it makes it as clear as day to them that climate change is an “us-vs.-them” cultural conflict, in which those who stray from the position that dominates in their group will be stigmatized as traitors within their communities.
Nevertheless, the authors of the most recent study announced (in a press release issued by the lead author’s university) that “when people understand that scientists agree on global warming, they’re more likely to support policies that take action on it,” a conclusion from which the authors inferred that “making the results of our paper widely-known is an important step toward closing the consensus gap and increasing public support for meaningful climate change.”
Unsurprisingly, the study has in the months since its publication supplied a focal target for climate skeptics, who have challenged the methods the authors employ.
The debate over the latest “97%” paper multiplies the stock of cues that climate change is an issue that defines people as members of opposing cultural groups. It thus deepens the wellsprings of motivation that they have to engage evidence in a way that reinforces what they already believe. The recklessness that the authors displayed in fanning the flames of unreason that fuels this dynamic is what motivated me to express dismay over the new study.
Members of the public are not experts on scientific matters. Rather they are experts in figuring out who the experts are, and in discerning what the practical importance of expert opinion is for the decisions they have to make as individuals and citizens.
JC comment: In my post Climategate essay On the credibility of climate research Part II rebuilding trust, I wrote: Credibility is a combination of expertise and trust. While scientists persist in thinking that they should be trusted because of their expertise, climategate has made it clear that expertise itself is not a sufficient basis for public trust. Recent disclosures about the IPCC have brought up a host of concerns about the IPCC that had been festering in the background: involvement of IPCC scientists in explicit climate policy advocacy; tribalism that excluded skeptics; hubris of scientists with regards to a noble (Nobel) cause; alarmism; and inadequate attention to the statistics of uncertainty and the complexity of alternative interpretations. The experts do their science and ’cause’ a disservice by engaging in these behaviors.
Ordinary citizens are amazingly good at this. Their use of this ability, moreover, is not a substitute for rational thought; it is an exercise of rational thought of the most impressive sort. But in a science communication environment polluted with toxic partisan meanings, the faculties they use to discern what most scientists believe are impaired.
JC comment: In my paper No consensus on consensus, I wrote: While the public may not understand the complexity of the science or be predisposed culturally to accept the consensus, they can certainly understand the vociferous debates over the science portrayed by the media. Further, they can judge the social facts surrounding the consensus building process, including those revealed by the so-called “Climategate” episode, and decide whether to trust the experts whose opinion comprises the consensus. Beck argues that “in a public debate, the social practices of knowledge-making matter as much as the substance of the knowledge itself.”
The problem with the suggestion of the authors of the latest “97%” study that the key is to “mak[e] the results of [their] paper widely-known” is that it diverts serious, well-intentioned people from efforts to clear the air of the toxic meanings that impede the processes that usually result in public convergence on the best available (and of course always revisable!) scientific conclusions about how people can protect themselves from serious risks.
Indeed, as I indicated, the particular manner in which the “scientific consensus” trope is used by partisan advocates tends only to deepen the toxic fog of cultural conflict that makes it impossible for ordinary citizens to figure out what the best scientific evidence is.
JC comment: with the ‘pause’ now dominating the public debate on climate change, deepening of the fog may be the objective of the shriller purveyors of consensus.
But from what I see, it is becoming clearer and clearer that those who have dedicated themselves to promoting public engagement with the best available scientific evidence on climate change are not dealing with the admittedly sensitive and challenging task of explaining why it is normal, in this sort of process, to encounter discrepancies between forecasting models and subsequent observations and to adjust the models based on them. And why such adjustment in the context of climate change is cause for concluding neither that “the science was flawed” nor that “there is in fact nothing for anyone to be concerned about.”
JC comment: Too many defenders of the consensus have become either ‘pause’ deniers or ‘pause’ dismissers. A while back, I recommended that they ‘own’ the pause, and work on explaining it. Belatedly, we see a little bit of this happening, but of course it does not lead them to challenge the main IPCC conclusion on 20th century attribution.
JC summary: It is really good to see this discussion about the role of consensus in the public debate on climate change and the problems this has caused for the science, the policy, and increasingly for the proponents of consensus. It is however dismaying to see the continued influence that the existence of a ‘consensus’ has on the politics (especially President Obama’s citing of the Cook et al. study).
There is broad disagreement over the amounts and effects of radiation exposure due to the triple reactor meltdowns after the 2011 Great East-Japan Earthquake and tsunami. The International Physicians for the Prevention of Nuclear War (IPPNW) joined the controversy June 4, with a 27-page “Critical Analysis of the UNSCEAR Report ‘Levels and effects of radiation exposures due to the nuclear accident after the 2011 Great East-Japan Earthquake and tsunami.’”
IPPNW is the Nobel Peace Prize winning global federation of doctors working for “a healthier, safer and more peaceful world.” The group has adopted a highly critical view of nuclear power because as it says, “A world without nuclear weapons will only be possible if we also phase out nuclear energy.”
UNSCEAR, the United Nations Scientific Committee on the Effects of Atomic Radiation, published its deeply flawed report April 2. Its accompanying press release summed up its findings this way: “No discernible changes in future cancer rates and hereditary diseases are expected due to exposure to radiation as a result of the Fukushima nuclear accident.” The word “discernible” is a crucial disclaimer here.
Cancer, and the inexorable increase in cancer cases in Japan and around the world, is mostly caused by toxic pollution, including radiation exposure, according to the National Cancer Institute. But distinguishing a particular cancer case as having been caused by Fukushima rather than by other toxins, or a combination of them, may be impossible, leading to UNSCEAR’s deceptive summation. As the IPPNW report says, “A cancer does not carry a label of origin…”
UNSCEAR’s use of the phrase “are expected” is also heavily nuanced. The increase in childhood leukemia cases near Germany’s operating nuclear reactors, compared to elsewhere, was not “expected,” but was proved in 1997. The findings, along with Chernobyl’s lingering consequences, led to the country’s federally mandated reactor phase-out. The plummeting of official childhood mortality rates around five US nuclear reactors after they were shut down was also “unexpected,” but shown by Joe Mangano and the Project on Radiation and Human Health.
The International Physicians’ analysis is severely critical of UNSCEAR’s current report which echoes its 2013 Fukushima review and press release that said, “It is unlikely to be able to attribute any health effects in the future among the general public and the vast majority of workers.”
“No justification for optimistic presumptions”
The IPPNW’s report says flatly, “Publications and current research give no justification for such apparently optimistic presumptions.” UNSCEAR, the physicians complain, “draws mainly on data from the nuclear industry’s publications rather than from independent sources and omits or misinterprets crucial aspects of radiation exposure”, and “does not reveal the true extent of the consequences” of the disaster. As a result, the doctors say the UN report is “over-optimistic and misleading.” The UN’s “systematic underestimations and questionable interpretations,” the physicians warn, “will be used by the nuclear industry to downplay the expected health effects of the catastrophe” and will likely but mistakenly be considered by public authorities as reliable and scientifically sound. Dozens of independent experts report that radiation attributable health effects are highly likely.
Points of agreement: Fukushima is worse than reported and worsening still
Before detailing the multiple inaccuracies in the UNSCEAR report, the doctors list four major points of agreement. First, UNSCEAR improved on the World Health Organization’s health assessment of the disaster’s on-going radioactive contamination. UNSCEAR also professionally “rejects the use of a threshold for radiation effects of 100 mSv [millisieverts], used by the International Atomic Energy Agency in the past.” Like most health physicists, both groups agree that there is no radiation dose so small that it can’t cause negative health effects. There are exposures allowed by governments, but none of them are safe.
Second, the UN and the physicians agree that areas of Japan that were not evacuated were seriously contaminated with iodine-132, iodine-131 and tellurium-132, the worst reported instance being Iwaki City which had 52 times the annual absorbed dose to infants’ thyroid than from natural background radiation. UNSCEAR also admitted that “people all over Japan” were affected by radioactive fallout (not just in Fukushima Prefecture) through contact with airborne or ingested radioactive materials. And while the UNSCEAR acknowledged that “contaminated rice, beef, seafood, milk, milk powder, green tea, vegetables, fruits and tap water were found all over mainland Japan”, it neglected “estimating doses for Tokyo … which also received a significant fallout both on March 15 and 21, 2011.”
Third, UNSCEAR agrees that the nuclear industry’s and the government’s estimates of the total radioactive contamination of the Pacific Ocean are “far too low.” Still, the IPPNW report shows, UNSCEAR’s use of totally unreliable assumptions results in a grossly understated final estimate. For example, the UN report ignores all radioactive discharges to the ocean after April 30, 2011, even though roughly 300 tons of highly contaminated water has been pouring into the Pacific every day for three and a half years, about 346,500 tons in the first 38 months.
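The cumulative figure cited above is consistent with the daily rate. A minimal sanity check, assuming a constant 300 tons per day and an average month of about 30.4 days:

```python
# Back-of-the-envelope check of the cumulative discharge estimate.
# Assumptions: a constant 300 tons/day (the figure cited in the text)
# and an average Gregorian month length of ~30.4 days.
TONS_PER_DAY = 300
DAYS_PER_MONTH = 30.4
months = 38

days = months * DAYS_PER_MONTH        # ~1,155 days
total_tons = TONS_PER_DAY * days      # ~346,500 tons, matching the article

print(f"{days:.0f} days -> about {total_tons:,.0f} tons")
```

The small residual difference from 346,500 comes from rounding the average month length; the article's figure evidently assumes 1,155 days exactly.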
Fourth, the Fukushima catastrophe is understood by both groups as an ongoing disaster, not the singular event portrayed by industry and commercial media. UNSCEAR even warns that ongoing radioactive pollution of the Pacific “may warrant further follow-up of exposures in the coming years,” and “further releases could not be excluded in the future,” from forests and fields during rainy and typhoon seasons when winds spread long-lived radioactive particles and from waste management plans that now include incineration.
As the global doctors say, in their unhappy agreement with UNSCEAR, “In the long run, this may lead to an increase in internal exposure in the general population through radioactive isotopes from ground water supplies and the food chain.”
Physicians find ten grave failures in UN report
The majority of the IPPNW’s report details 10 major errors, flaws, or discrepancies in the UNSCEAR paper and explains the study’s omissions, underestimates, inept comparisons, misinterpretations and unwarranted conclusions.
1. The total amount of radioactivity released by the disaster was underestimated by UNSCEAR, and its estimate was based on disreputable sources of information. UNSCEAR ignored 3.5 years of nonstop emissions of radioactive materials “that continue unabated,” and only dealt with releases during the first weeks of the disaster. UNSCEAR relied on a study by the Japanese Atomic Energy Agency (JAEA) which, the IPPNW points out, “was severely criticized by the Fukushima Nuclear Accident Independent Investigation Commission … for its collusion with the nuclear industry.” The independent Norwegian Institute for Air Research’s estimate of cesium-137 released (available to UNSCEAR) was four times higher than the JAEA/UNSCEAR figure (37 PBq instead of 9 PBq). Even Tokyo Electric Power Co. itself estimated that iodine-131 releases were over four times higher than what JAEA/UNSCEAR reported (500 PBq vs. 120 PBq). The UNSCEAR inexplicably chose to ignore large releases of strontium isotopes and 24 other radionuclides when estimating radiation doses to the public. (A PBq or petabecquerel is a quadrillion or 10¹⁵ becquerels. Put another way, a PBq equals 27,000 curies, and one curie makes 37 billion atomic disintegrations per second.)
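The unit conversions in the parenthetical note, and the four-fold ratio between the cesium-137 estimates, can be checked directly from the definitions (1 curie = 3.7 × 10¹⁰ Bq by definition):

```python
# Unit conversions behind the parenthetical note above.
PBQ_IN_BQ = 1e15        # 1 petabecquerel = 10^15 disintegrations per second
CI_IN_BQ = 3.7e10       # 1 curie = 3.7 x 10^10 Bq, by definition

curies_per_pbq = PBQ_IN_BQ / CI_IN_BQ   # ~27,000 curies per PBq

# Ratio between the two cesium-137 release estimates cited above
nilu_estimate_pbq = 37   # Norwegian Institute for Air Research
jaea_estimate_pbq = 9    # JAEA figure used by UNSCEAR

print(f"1 PBq = {curies_per_pbq:,.0f} Ci "
      f"(ratio: {nilu_estimate_pbq / jaea_estimate_pbq:.1f}x)")
```

A PBq works out to about 27,027 curies, which the article rounds to 27,000; and 37 PBq is roughly 4.1 times 9 PBq, consistent with the "four times higher" claim.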
2. Internal radiation taken up with food and drink “significantly influences the total radiation dose an individual is exposed to,” the doctors note, and their critique warns pointedly, “UNSCEAR uses as its one and only source, the still unpublished database of the International Atomic Energy Agency and the Food and Agriculture Organization. The IAEA was founded … to ‘accelerate and enlarge the contribution of atomic energy to peace, health and prosperity throughout the world.’ It therefore has a profound conflict of interest.” Food sample data from the IAEA should not be relied on, “as it discredits the assessment of internal radiation doses and makes the findings vulnerable to claims of manipulation.” As with its radiation release estimates, IAEA/UNSCEAR ignored the presence of strontium in food and water. Internal radiation dose estimates made by the Japanese Ministry for Science and Technology were 20, 40 and even 60 times higher than the highest numbers used in the IAEA/UNSCEAR reports.
3. To gauge radiation doses endured by over 24,000 workers on site at Fukushima, UNSCEAR relied solely on figures from Tokyo Electric Power Co., the severely compromised owners of the destroyed reactors. The IPPNW report dismisses all the conclusions drawn from Tepco, saying, “There is no meaningful control or oversight of the nuclear industry in Japan and data from Tepco has in the past frequently been found to be tampered with and falsified.”
4. The UNSCEAR report disregards current scientific fieldwork on actual radiation effects on plant and animal populations. Peer reviewed ecological and genetic studies from Chernobyl and Fukushima find evidence that low dose radiation exposures cause, the doctors point out, “genetic damage such as increased mutation rates, as well as developmental abnormalities, cataracts, tumors, smaller brain sizes in birds and mammals and further injuries to populations, biological communities and ecosystems.” Ignoring these studies, IPPNW says “gives [UNSCEAR] the appearance of bias or lack of rigor.”
5. The special vulnerability of the embryo and fetus to radiation was completely discounted by the UNSCEAR, the physicians note. UNSCEAR shockingly said that doses to the fetus or breast-fed infants “would have been similar to those of other age groups,” a claim that, the IPPNW says, “goes against basic principles of neonatal physiology and radiobiology.” By dismissing the differences between an unborn child and an infant, the UNSCEAR “underestimates the health risks of this particularly vulnerable population.” The doctors quote a 2010 report from American Family Physician that, “in utero exposure can be teratogenic, carcinogenic or mutagenic.”
6. Non-cancerous diseases associated with radiation doses — such as cardiovascular diseases, endocrinological and gastrointestinal disorders, infertility, genetic mutations in offspring and miscarriages — have been documented in medical journals, but are totally dismissed by the UNSCEAR. The physicians remind us that large epidemiological studies have shown undeniable associations of low dose ionizing radiation to non-cancer health effects and “have not been scientifically challenged.”
7. The UNSCEAR report downplays the health impact of low doses of radiation by misleadingly comparing radioactive fallout to “annual background exposure.” The IPPNW scolds the UNSCEAR saying it is, “not scientific to argue that natural background radiation is safe or that excess radiation from nuclear fallout that stays within the dose range of natural background radiation is harmless.” In particular, ingested or inhaled radioactive materials, “deliver their radioactive dose directly and continuously to the surrounding tissue” — in the thyroid, bone or muscles, etc. — “and therefore pose a much larger danger to internal organs than external background radiation.”
8. Although UNSCEAR’s April 2 Press Release and Executive Summary give the direct and mistaken impression that there will be no radiation health effects from Fukushima, the report itself states that the Committee “does not rule out the possibility of future excess cases or disregard the suffering associated…” Indeed, UNSCEAR admits to “incomplete knowledge about the release rates of radionuclides over time and the weather conditions during the releases.” UNSCEAR concedes that “there were insufficient measurements of gamma dose rate…” and that, “relatively few measurements of foodstuff were made in the first months.” IPPNW warns that these glaring uncertainties completely negate the level of certainty implied in UNSCEAR’s Exec. Summary.
9. UNSCEAR often praises the protective measures taken by Japanese authorities, but the IPPNW finds it “odd that a scientific body like UNSCEAR would turn a blind eye to the many grave mistakes of the Japanese disaster management…” The central government was slow to inform local governments and “failed to convey the severity of the accident,” according to the Fukushima Nuclear Accident Independent Investigation Commission. “Crisis management ‘did not function correctly,’ the Commission said, and its failure to distribute stable iodine, “caused thousands of children to become irradiated with iodine-131,” IPPNW reports.
10. The UNSCEAR report lists “collective” radiation doses “but does not explain the expected cancer cases that would result from these doses.” This long chapter of IPPNW’s report can’t be summarized easily. The doctors offer conservative estimates, “keeping in mind that these most probably represent underestimations for the reasons listed above.” The IPPNW estimates 4,300 to 16,800 excess cases of cancer due to the Fukushima catastrophe in Japan in the coming decades. Cancer deaths will range between 2,400 and 9,100. UNSCEAR may call these numbers insignificant, the doctors archly point out, but individual cancers are debilitating and terrifying and they “represent preventable and man-made diseases” and fatalities.
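As a quick sanity check on the unit conversions quoted in point 1 above — a sketch of the arithmetic only, not anything from the report itself:

```python
# Check the becquerel/curie conversions quoted in point 1.
BQ_PER_CURIE = 3.7e10   # one curie = 37 billion disintegrations per second
PBQ_IN_BQ = 1e15        # one petabecquerel = a quadrillion becquerels

curies_per_pbq = PBQ_IN_BQ / BQ_PER_CURIE
print(f"1 PBq ≈ {curies_per_pbq:,.0f} curies")   # ≈ 27,027, i.e. roughly 27,000

# The cesium-137 discrepancy: 37 PBq (Norwegian estimate) vs 9 PBq (JAEA/UNSCEAR)
print(f"cesium-137 ratio: {37 / 9:.1f}x")        # ≈ 4.1, the "four times higher" figure
```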
IPPNW concludes that Fukushima’s radiation disaster is “far from over”: the destroyed reactors are still unstable; radioactive liquids and gases continuously leak from the complex wreckage; melted fuel and used fuel in quake-damaged cooling pools hold enormous quantities of radioactivity “and are highly vulnerable to further earthquakes, tsunamis, typhoons and human error.” Catastrophic releases of radioactivity “could occur at any time and eliminating this risk will take many decades.”
IPPNW finally recommends urgent actions that governments should take, because the UNSCEAR report, “does not adhere to scientific standards of neutrality,” “represents a systematic underestimation,” “conjures up an illusion of scientific certainty that obscures the true impact of the nuclear catastrophe on health and the environment,” and its conclusion is phrased “in such a way that would most likely be misunderstood by most people…”
John LaForge works for Nukewatch, a nuclear watchdog and anti-war group in Wisconsin, and edits its Quarterly.
Nancy Wilson, National Cancer Institute, “The Majority of Cancers Are Linked to the Environment,” NCI Benchmarks, Vol. 4, Issue 3, June 17, 2004
From Rasmussen Reports:
Voters strongly believe the debate about global warming is not over yet and reject the decision by some news organizations to ban comments from those who deny that global warming is a problem.
Only 20% of Likely U.S. Voters believe the scientific debate about global warming is over, according to the latest Rasmussen Reports national telephone survey. Sixty-three percent (63%) disagree and say the debate about global warming is not over. Seventeen percent (17%) are not sure.
Forty-eight percent (48%) of voters think there is still significant disagreement within the scientific community over global warming, while 35% believe scientists generally agree on the subject.
The BBC has announced a new policy banning comments from those who deny global warming, a policy already practiced by the Los Angeles Times and several other media organizations. But 60% of voters oppose the decision by some news organizations to ban global warming skeptics. Only 19% favor such a ban, while slightly more (21%) are undecided.
But then 42% believe the media already makes global warming appear to be worse than it really is. Twenty percent (20%) say the media makes global warming appear better than it really is, while 22% say they present an accurate picture. Sixteen percent (16%) are not sure.
Still, this is an improvement from February 2009 when 54% thought the media makes global warming appear worse than it is. Unchanged, however, are the 21% who say the media presents an accurate picture.
I’ve written a few times on the question of one of my favorite hangouts on the planet, underwater tropical coral reefs. Don’t know if you’ve ever been down to one, but they are a fairyland of delights, full of hosts of strange and mysterious creatures. I’ve seen them far from the usual haunts of humanoids, where they are generally full of vigor and bursting with life.
I’ve also seen them in various stages of ill-health, including the bleaching caused by occasional high temperatures (which a healthy reef recovers from in a few years). In all of my writings on this subject, I’ve said that the health of the reef depends in large part on parrotfish. I’ve proposed that atoll nations declare the parrotfish as their national bird, just to bring attention to the fish that are responsible for the very existence of the atolls themselves.
This is for two reasons. First, parrotfish are herbivores. They graze on the algae that is constantly trying to take over the reef. This keeps the reef clear of algae so that the coral polyps can get the sunlight that they need to survive.
Second, the parrotfish graze by biting off chunks of coral. They crunch these up between specialized bony plates in their throats, digest all of the greenery, and they subsequently excrete nothing but the finest, whitest, softest coral sand … the very sand that makes the romantic tropical beaches. It’s quite funny to see what happens if you disturb a whole school of them—they drop their entire load and disappear in a flash, leaving nothing but a white cloud of sand slowly dropping to the ocean floor, eventually to be swept by the waves up onto the beach.
Unfortunately, although parrotfish are wary during the day, they sleep at night out in the open. As a result, the advent of the waterproof flashlight has led to their local extinction on many reefs.
To bring this story up to the present, over at his excellent NoTricksZone website, Pierre Gosselin points out a press release from the International Union for Conservation of Nature (IUCN) entitled From despair to repair: Dramatic decline of Caribbean corals can be reversed. It discusses a recent report called “Status and Trends of Caribbean Coral Reefs: 1970-2012”, linked to below.
In the press release, they point out that although climate change has been blamed for the decline in Caribbean coral reefs, the major reason for the decline is … drum roll … the loss of the parrotfish and other reef grazers. The press release says:
Climate change has long been thought to be the main culprit in coral degradation. While it does pose a serious threat by making oceans more acidic and causing coral bleaching, the report shows that the loss of parrotfish and sea urchin – the area’s two main grazers – has, in fact, been the key driver of coral decline in the region.
Despite the obligatory nod to climate change, they have finally come to their senses.
Now, the IUCN has been heavily invested in the “climate change” meme, so I find this to be a most welcome sign that perhaps some sanity is returning to the field. Back a decade ago I wrote about the role of parrotfish in reef loss, but at that time everyone from the Sierra Club to the IUCN was blaming climate change.
And this is one of the huge problems with blaming everything and its cousin on climate change—when you wrongly blame climate change, you ignore the real problem. For example, the claimed (but illusory) “sinking” of coral atolls was long blamed on sea level rise from climate change.
But all that did is obscure the real danger to coral atolls, which is the decline of the reefs on which they depend for their continued wellbeing. Regarding the Caribbean reefs, the report itself says:
Outbreaks of Acropora and Diadema diseases in the 1970s and early 1980s, overpopulation in the form of too many tourists, and overfishing are the three best predictors of the decline in Caribbean coral cover over the past 30 or more years based on the data available. Coastal pollution is undoubtedly increasingly significant but there are still too little data to tell. Increasingly warming seas pose an ominous threat but so far extreme heating events have had only localized effects and could not have been responsible for the greatest losses of Caribbean corals that had occurred throughout most of the wider Caribbean region by the early to mid 1990s.
So … will the reefs abide? Fortunately, we now know that waving our hands at CO2 is not the solution to the problems of the reefs—as with far too much of such CO2 hysteria, the underlying problems indeed have human causes, but they have nothing to do with CO2.
And that’s great news, because although we have no hope of changing atmospheric CO2, we can indeed do something about overfishing of parrotfish, and about coastal pollution. Fix those, and we’ll fix the reefs, and they will abide.
Best regards to everyone, and thanks for all the parrotfish, I’m off for Las Vegas.
My previous posts on the subject:
The Irony, It Burns
The Reef Abides
Anthony Watts has posted a story about a laughable analysis of the cost of propping up renewables through subsidies. And long-time WUWT contributor KD helpfully pointed me to the document itself. Now that I have the actual document, here’s what they say about subsidies (all emphasis mine).
First, they point out that the cost of shifting to renewables will be on the order of $800 billion per year. Overall, they say the cost will be $45,000,000,000,000 ($45 trillion) by 2050, and could be as high as $70 trillion.
In other words, a substantial “clean-energy investment gap” of some $800 billion/yr exists – notably on the same order of magnitude as present-day subsidies for fossil energy and electricity worldwide ($523 billion). Unless the gap is filled rather quickly, the 2°C target could potentially become out of reach.
Now, a trillion is an unimaginable amount of money. Here’s a way to grasp it. If I started a business in the year zero AD, and my business was so bad that I lost a million dollars a day, not a million a year but a million dollars a day, how many trillion dollars would I have lost by now?
Well, I wouldn’t have lost even one trillion by now, only about $735 billion dollars … in other words, less than the estimated PER-YEAR cost of switching to renewables.
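Willis’s back-of-envelope figure is easy to verify — a sketch assuming 2014 years at an average of 365.25 days per year:

```python
# Losing $1 million per day from 1 AD to 2014: how much is that in total?
years = 2014
days = years * 365.25                  # ≈ 735,614 days, counting leap years
total_lost = days * 1_000_000          # dollars

print(f"≈ ${total_lost / 1e9:,.1f} billion")   # ≈ $735.6 billion — still under $1 trillion
# Compare: the estimated clean-energy investment gap is $800 billion EVERY YEAR.
```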
Then they go on to claim that hey, $800 billion per year is no big deal, because fossil fuel subsidies are nearly that large.
While the clean-energy investment gaps (globally and by region) may indeed appear quite sizeable at first glance, a comparison to present-day energy subsidy levels helps to put them into context. According to estimates by the International Monetary Fund and International Energy Agency, global “pre-tax” (or direct) subsidies for fossil energy and fossil electricity totaled $480–523 billion/yr in 2011 (IEA 2012b; IMF 2013). This corresponds to an increase of almost 30% from 2010 and was six times more than the total amount of subsidies for renewables at that time. Oil-exporting countries were responsible for approximately two-thirds of total fossil subsidies, while greater than 95% of all direct subsidies occurred in developing countries.
Now, this is a most interesting and revealing paragraph.
First, despite what people have said on the previous thread, they have NOT included taxes in their calculation of subsidies.
Next, to my great surprise an amazing 95% of all subsidies are being paid by developing nations. This underscores the crucial importance of energy for the poor.
In addition, they say that most of the money used to pay the fossil fuel subsidies comes from … wait for it … the sale of fossil fuels.
Next, it means that nothing that the developed world does will free up much money. Only 5% of the subsidies are in developed nations; they could go to zero and it wouldn’t change the big picture.
It also means that since these subsidies are not going to drivers in Iowa and Oslo, but are propping up the poorest of the global poor, we cannot stop paying them without a huge cost in the form of impoverishment, hardship, and deaths.
Finally, unless we shift the fuel subsidy from fossil fuels to renewables, which obviously we cannot do, the comparison is meaningless—we will still need nearly a trillion dollars per year in additional subsidies to get renewables off of the ground, over and above the assistance currently given to the poor … where do the authors think that money would come from?
I fear that like the pathetically bad Stern Report, this analysis is just another batch of bogus claims trying to prop up the war on carbon, which is and always has been a war on development and human progress, and whose “collateral damages” fall almost entirely on the poor.
And at the end of the day, despite their vain efforts to minimize the cost, even these proponents of renewables say it will cost up to $70 trillion dollars to make the switch, with no guarantee that it will work.
I see that in the study they make much of the disparity between fossil fuel subsidies ($523 billion annually) and renewables subsidies, which they proudly state are only about a sixth of that ($88 billion annually).
However, things look very different when we compare the subsidies on the basis of the energy consumed from those sources. To do that, I use the data in the BP 2014 Statistical Review of World Energy spreadsheet in the common unit, which is “TOE”, or “Tonnes of Oil Equivalent”. This expresses everything as the tonnes of oil that are equivalent to that energy. I’ve then converted the results to “Gallons of Oil Equivalent” and “Litres of Oil Equivalent” to put them in prices we can understand. That breakdown looks like this:
Fuel, Subsidy/Gallon, Subsidy/Litre
Fossil fuels – $0.17 per gallon, $0.04 per litre
Renewables – $1.19 per gallon, $0.31 per litre.
So despite the fact that renewable subsidies are only a sixth of the fossil subsidies, per unit of energy they are seven times as large as the fossil subsidies.
This, of course, is extremely bad news for the promoters of the subsidies. It means that to get the amount of energy we currently use, without using fossil fuels and solely from renewables, it would require seven times the current fossil fuel subsidy, or $3.5 TRILLION DOLLARS PER YEAR.
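The scaling behind that trillion-dollar-plus figure can be sketched from the per-gallon rates above. This is a rough check using the post’s rounded subsidy rates, not the BP spreadsheet itself, so the result lands slightly above the post’s $3.5 trillion:

```python
fossil_subsidy_total = 523e9   # $/yr, the IEA/IMF figure quoted in the study
fossil_rate = 0.17             # $/gallon-of-oil-equivalent, from the table above
renewable_rate = 1.19          # $/gallon-of-oil-equivalent, from the table above

# Energy currently supplied by fossil fuels, in gallons-of-oil-equivalent:
fossil_gallons = fossil_subsidy_total / fossil_rate     # ≈ 3.1 trillion GOE

# Subsidy needed to supply that same energy at the renewable rate:
renewable_cost = fossil_gallons * renewable_rate
print(f"≈ ${renewable_cost / 1e12:.2f} trillion per year")  # ≈ $3.66 trillion/yr
print(f"ratio: {renewable_rate / fossil_rate:.1f}x")        # ≈ 7.0
```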
And of course, since there’d be no fossil fuel sales at that point, there’d be little money to pay for the subsidy.
Sometimes, the idiocy of the savants is almost beyond belief.
A video being hyped around the internet – “Witness a polar bear’s heartbreaking swim for ice in the Arctic” said one headline – is simply shameless propaganda, facilitated by the US Geological Survey and its polar bear biologists. USGS scientists involved in this work should be ashamed of themselves.
The caption for the Youtube video (published Jun 21, 2014) says this:
Take a swim with a polar bear family as they traverse the Arctic Ocean in search of sea ice.
This is a load of nonsense and a total misrepresentation of the facts.
In addition, the text added to the video is pure propaganda: it is being used to promote the US government position that sea ice loss due to climate change is a massive threat to polar bears. Unfortunately, recent studies contradict the contention that polar bears have already been harmed by declines in summer sea ice.
Here is some background on the video you should be aware of:
1) The bears were swimming away from the USGS researchers and film crew who had shot them full of sedatives and attached a camera to one of their necks — they were not swimming toward sea ice 100 miles away.
2) The video was shot in the Bering Sea, in April 2014, when sea ice was about its maximum extent of the year — there was lots of ice around when this video was filmed.
3) The company doing the filming is using this video as a fundraiser.
Details below, including a sea ice map for April 2014.
Andy Revkin at the New York Times DotEarth blog, who promoted it as something spectacular (as he did a couple of weeks ago with an earlier offering from the same team, June 10), admitted in the comments section, in response to someone who said it looked like these bears were being harassed by a boat:
“The cameras are on bears that were sedated (which counts as a kind of harassment, yes, but is part of a broader research project; see the earlier post linked from this one).”
Sea ice map for April (average extent for the month, label added), courtesy NSIDC.
The youtube video includes a link to the Arctic Exploration Fund of filmmaker Adam Ravetch’s Arctic Bear Productions company.
“To learn more about the Arctic Exploration Fund visit:
Here is the stated mandate of Ravetch’s NGO:
“Arctic Exploration Fund, AEF, is a non profit 501c 3 whose mission is to arm wild animals around the world, with groundbreaking new cameras, who go out and gather multiple hours of footage of their own lives, which we track using on-board satellite GPS, and together create a brand new natural history archive filmed entirely by the animals themselves.”
Nothing there about deliberately misrepresenting the facts shown in the film footage and using the video for propaganda purposes.
Finally, shame on the USGS: its work in the Bering and Beaufort Seas is being promoted as scientific polar bear research (like collecting blood samples), yet the real products being generated are propaganda videos.
Swimming bear video used for propaganda was not shot with bear-mounted cameras
I just went back to Andy Revkin’s blog post on the swimming polar bear video story that I wrote about on Wednesday to see what kind of feedback it was getting. I found that it pays to check up.
A reader from Oregon questioned the filming techniques used for this video.
Revkin followed up.
And it turned out, the reader from Oregon was correct — the film used in this video was shot with “an assortment of traditional methods,” not with the strapped-on cameras that the USGS was using on the bears.
Revkin assumed from the background provided to him that this was leading-edge technology, bear-generated video. And even though he’d interviewed the filmmaker, the truth hadn’t come out.
Update June 29, 2014 – another damning comment made, added below.
Read the exchange below from Revkin’s blog:
David Cothran Ashland OR Yesterday
I have spent quite a bit of time observing polar bears during the course of 12 summer seasons working as a guide and photographer in Svalbard. In my view, the bears in this footage often appear quite stressed, swimming rapidly away from the camera, turning to look behind and even diving briefly. I think that it is quite unlikely that all, or even most of this video was shot from cameras mounted on bears. As another commenter said, it very much appears that the bears are fleeing from a boat. I would like to hear from Ravetch’s team and other wildlife film makers about this question. It certainly appears to me that the majority of the footage was shot with pole-mounted cameras.
Stressing bears by chasing them, particularly when they are swimming, is illegal in many parts of their range and certainly unethical anywhere. I have great respect for your blog and the careful attention you give to complex issues; I hope that you will look further into this. Film makers working with politically charged species like polar bears must be very careful that their methods are unassailable.
Andy Revkin 20 hours ago
I sent your note to Ravetch and he clarified that this footage was not shot with the new strapped-on cameras, as you suspected.
He included this note: “That footage was not taken with strapped on cameras and was documented with an assortment of traditional techniques. Aerial, pole cam, and more traditional filming techniques…. I operate with the greatest respect for animals when I am around them for brief periods of time.” [SJC bold]
Thanks for offering your valuable insight. (I also fixed the photo caption to avoid confusion.)
Kudos to Revkin for not dismissing the comment and for following through.
[See the video here]
This video was strictly propaganda from the get-go. The new-fangled camera technology Ravetch’s company and USGS polar bear biologists had been experimenting with was not used at all in the filming of this video.
Update June 29 – comment left today by Kelsey Eliasson
Kelsey Eliasson, Churchill 13 hours ago
I work as a polar bear guide in Churchill. These images looks like they came from Hudson Bay, likely near Southampton Island, when Ravetch was filming there in the summer of 2012, I believe. I have seen raw footage of this event and the bear is so stressed that at one point, it actually tries to climb into the boat after being followed for a considerable amount of time. Beautiful shot though.
So, this seems like it is not USGS, not in the Beaufort Sea and filmed in the summer. If I am wrong, I apologize, however, the similarities in the footage are striking.
The Arctic Exploration Fund actually seems like it is just raising money for documentary projects not actual research – wish I had thought of that.
It is very unfortunate that management decisions are being based on fictional portrayals of polar bear habitat and behaviour. The bears are the ones that will suffer in the long run.
* Dr. Susan J. Crockford is a zoologist with more than 35 years experience, including work on the Holocene history of Arctic animals. Like polar bear biologist Ian Stirling, Susan Crockford earned her undergraduate degree in zoology at the University of British Columbia. She is currently an adjunct professor at the University of Victoria, B.C. Polar bear evolution is one of Dr. Crockford’s professional interests, which she discusses in her book, Rhythms of Life: Thyroid Hormone and the Origin of Species.
Skeptics doing what skeptics do best . . . attack skeptics. – Suyts
Last week, the mainstream media was abuzz with claims by skeptical blogger Steve Goddard that NOAA and NASA have dramatically altered the US temperature record. For examples of MSM coverage, see:
Further, this story was carried as the lead story on Drudge for a day.
First off the block to challenge Goddard came Ronald Bailey at reason.com in an article Did NASA/NOAA Dramatically Alter U.S. Temperatures After 2000? that cites communication with Anthony Watts, who is critical of Goddard’s analysis, as well as being critical of NASA/NOAA.
Politifact chimed in with an article that assessed Goddard’s claims, based on Watts’ statements and also an analysis by Zeke Hausfather. Politifact summarized with this statement: We rate the claim Pants on Fire.
I didn’t pay much attention to this, until Politifact asked me for my opinion. I said that I hadn’t looked at it myself, but referred them to Zeke and Watts. I did tweet their Pants on Fire conclusion.
Skepticism in the technical climate blogosphere
Over at the Blackboard, Zeke Hausfather has a three-part series about Goddard’s analysis – How not to calculate temperatures (Part I, Part II, Part III). Without getting into the technical details here, the critiques relate to the topics of data dropout, data infilling/gridding, time of day adjustments, and the use of physical temperatures versus anomalies. The comments thread on Part II is very good, well worth reading.
Anthony Watts has a two-part series On denying hockey sticks, USHCN data and all that (Part 1, Part 2). The posts document Watts’ communications with Goddard, and make mostly the same technical points as Zeke. There are some good technical comments in Part 2, and Watts makes a proposal regarding the use of US reference stations.
Nick Stokes has two technical posts that relate to Goddard’s analysis: USHCN adjustments, averages, getting it right and TOBS nailed.
While I haven’t dug into all this myself, the above analyses seem robust, and it seems that Goddard has made some analysis errors.
OK, acknowledging that Goddard made some analysis errors, I am still left with some uneasiness about the actual data, and why it keeps changing. For example, Jennifer Marohasy has been writing about Corrupting Australia’s temperature record.
In the midst of preparing this blog post, I received an email from Anthony Watts, suggesting that I hold off on my post since there is some breaking news. Watts pointed me to a post by Paul Homewood entitled Massive Temperature Adjustments At Luling, Texas. Excerpt:
So, I thought it might be worth looking in more detail at a few stations, to see what is going on. In Steve’s post, mentioned above, he links to the USHCN Final dataset for monthly temperatures, making the point that approx 40% of these monthly readings are “estimated”, as there is no raw data.
From this dataset, I picked the one at the top of the list, (which appears to be totally random), Station number 415429, which is Luling, Texas.
Taking last year as an example, we can see that ten of the twelve months are tagged as “E”, i.e estimated. It is understandable that a station might be a month, or even two, late in reporting, but it is not conceivable that readings from last year are late. (The other two months, Jan/Feb are marked “a”, indicating missing days).
But, the mystery thickens. Each state produces a monthly and annual State Climatological Report, which among other things includes a list of monthly mean temperatures by station. If we look at the 2013 annual report for Texas, we can see these monthly temperatures for Luling.
Where an “M” appears after the temperature, this indicates some days are missing, i.e Jan, Feb, Oct and Nov. (Detailed daily data shows just one missing day’s minimum temperature for each of these months).
Yet, according to the USHCN dataset, all ten months from March to December are “Estimated”. Why, when there is full data available?
But it gets worse. The table below compares the actual station data with what USHCN describe as “the bias-adjusted temperature”. The results are shocking.
In other words, the adjustments have added an astonishing 1.35C to the annual temperature for 2013. Note also that I have included the same figures for 1934, which show that the adjustment has reduced temperatures that year by 0.91C. So, the net effect of the adjustments between 1934 and 2013 has been to add 2.26C of warming.
Note as well, that the largest adjustments are for the estimated months of March – December. This is something that Steve Goddard has been emphasising.
It is plain that these adjustments made are not justifiable in any way. It is also clear that the number of “Estimated” measurements made are not justified either, as the real data is there, present and correct.
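Homewood’s net-warming arithmetic checks out — a quick sketch using the two adjustment figures he quotes:

```python
# Luling, TX: effect of the USHCN "bias adjustments" per Homewood's comparison
adj_2013 = +1.35    # degrees C added to the 2013 annual mean
adj_1934 = -0.91    # degrees C subtracted from the 1934 annual mean

# Net warming introduced into the 1934-to-2013 trend by the adjustments alone:
net_trend_effect = adj_2013 - adj_1934
print(f"{net_trend_effect:.2f} C")   # 2.26 C of added warming
```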
Watts appears in the comments, stating that he has contacted John Nielsen-Gammon (Texas State Climatologist) about this issue. Nick Stokes also appears in the comments, and one commenter finds a similar problem for another Texas station.
Homewood’s post sheds light on Goddard’s original claim regarding the data dropout (not just stations that are no longer reporting, but reporting stations whose values are ‘estimated’). I infer from this that there seems to be a real problem with the USHCN data set, or at least with some of the stations. Maybe it is a tempest in a teacup, but it looks like something that requires NOAA’s attention. As far as I can tell, NOAA has not responded to Goddard’s allegations. Now, with Homewood’s explanation/clarification, NOAA really needs to respond.
Sociology of the technical skeptical blogosphere
Apart from the astonishing scientific and political implications of what could be a major bug in the USHCN dataset, there are some interesting insights and lessons from this regarding the technical skeptical blogosphere.
Who do I include in the technical skeptical blogosphere? Tamino, Moyhu, Blackboard, Watts, Goddard, ClimateAudit, Jeff Id, Roman M. There are others, but the main discriminating factor is that they do data analysis, and audit the data analysis of others. Are all of these ‘skeptics’ in the political sense? No – Tamino and Moyhu definitely run warm, with Blackboard and a few others running lukewarm. Of these, Goddard is the most skeptical of AGW. There is most definitely no tribalism among this group.
In responding to Goddard’s post, Zeke, Nick Stokes (Moyhu) and Watts may have missed the real story. They focused on their previous criticism of Goddard and missed his main point. Further, I think there was an element of ‘boy who cried wolf’ – Goddard has been wrong before, and the comments at Goddard’s blog can be pretty crackpotty. However, the main point is that this group is rapidly self-correcting – the self-correcting function in the skeptical technical blogosphere seems to be more effective (and certainly faster) than for establishment climate science.
There’s another issue here, and that is one of communication. Why was Goddard’s original post unconvincing to this group, whereas Homewood’s post seems to be convincing? Apart from the ‘crying wolf’ issue, Goddard focused on the message that the real warming was much less than portrayed by the NOAA data set (which caught the attention of the mainstream media), whereas Homewood more carefully documented the actual problem with the data set.
I’ve been in email communication with Watts through much of Friday, and he’s been pursuing the issue directly with NCDC, along with Zeke and with help from Nielsen-Gammon; NCDC is reportedly taking it seriously. Not only does Watts plan to issue a statement on how he missed Goddard’s original issue, he says that additional problems have been discovered and that NOAA/NCDC will be issuing some sort of statement, possibly also a correction, next week. (Watts has approved me making this statement.)
This incident is another one that challenges traditional notions of expertise. From a recent speech by President Obama:
“I mean, I’m not a scientist either, but I’ve got this guy, John Holdren, he’s a scientist,” Obama added to laughter. “I’ve got a bunch of scientists at NASA and I’ve got a bunch of scientists at EPA.”
Who all rely on the data prepared by his bunch of scientists at NOAA.
How to analyze the imperfect and heterogeneous surface temperature data is not straightforward – there are numerous ways to skin this cat, and the cat still seems to have some skin left. I like the Berkeley Earth methods, but I am not convinced that their confidence interval/uncertainty estimates are adequate.
Stay tuned, I think this one bears watching.
We have studied the long-term toxicity of a Roundup-tolerant GM maize (NK603) and a whole Roundup pesticide formulation at environmentally relevant levels from 0.1 ppb. Our study was first published in Food and Chemical Toxicology (FCT) on 19 September, 2012. The first wave of criticisms arrived within a week, mostly from plant biologists without experience in toxicology. We answered all these criticisms. The debate then encompassed scientific arguments, and a wave of ad hominem and potentially libellous comments appeared in different journals by authors having serious yet undisclosed conflicts of interest. At the same time, FCT acquired as its new assistant editor for biotechnology a former employee of Monsanto, after he sent a letter to FCT to complain about our study. This is in particular why FCT asked for a post-hoc analysis of our raw data. On 19 November, 2013, the editor-in-chief requested the retraction of our study while recognizing that the data were not incorrect and that there was no misconduct and no fraud or intentional misinterpretation in our complete raw data – an unusual or even unprecedented action in scientific publishing. The editor argued that no conclusions could be drawn because we studied 10 rats per group over 2 years, because they were Sprague Dawley rats, and because the data were inconclusive on cancer. Yet all of this was known at the time of submission of our study. Moreover, our study was never intended to be a carcinogenicity study; we never used the word ‘cancer’ in our paper. The present opinion is a summary of the debate resulting in this retraction, which is a historic example of conflicts of interest in the scientific assessment of products commercialized worldwide. We also show that the decision to retract cannot be rationalized on any discernible scientific or ethical grounds. Censorship of research into health risks undermines the value and the credibility of science; thus, we republish our paper.
There is an ongoing debate on the potential health risks of the consumption of genetically modified (GM) plants containing high levels of pesticide residues. Currently, no regulatory authority requires mandatory chronic animal feeding studies to be performed for edible GMOs and formulated pesticides. This fact is at the origin of most of the controversy. Only 90-day rat feeding trials have been conducted by manufacturers for GMOs. Statistical differences in the biochemistry of treated rats versus controls may represent the initial signs of long-term pathologies, possibly explained at least in part by pesticide residues in the GM feed. This is why we studied the long-term toxicity of a Roundup-tolerant GM maize (NK603) and a whole Roundup pesticide formulation at environmentally relevant levels from 0.1 ppb.
We first published these results in Food and Chemical Toxicology (FCT) on 19 September, 2012, after a careful and thorough peer review. However, 1 year and 2 months later, in an unusual step, the editor-in-chief requested the retraction of our study, while conceding that the data were not incorrect and that there was no misconduct and no fraud or intentional misinterpretation. According to him, some data were inconclusive, but for reasons already known at the time of submission of the paper. The present paper is a summary of the debate resulting in this retraction, which in our view is a historic example of conflicts of interest in the scientific assessment of products commercialized worldwide.
The long-term toxicity study of the NK603 maize and Roundup
An initial study on NK603 maize was submitted by Monsanto Company in support of commercial authorization of the maize. NK603 maize was fed to 4 groups of 20 Sprague Dawley rats (2 doses of 11% and 33% in the diet of both sexes) for 90 days. The blood analyses were performed on 10 rats per group. The re-analysis of the raw data resulted in a debate on the biological relevance of admitted statistical differences versus controls as the first signs of hepatorenal toxicities. To solve the problem, a 2-year study was carried out using two hundred Sprague Dawley rats, to which the following treatments were administered: NK603 maize treated or not with Roundup, at three different levels in the feed (11%, 22%, and 33% of the total diet), and Roundup alone, administered via drinking water at three different concentrations: from the admitted residual level in regular tap water (0.1 ppb), to the maximum level authorized in GMOs (400 ppm), up to half of the agricultural dose (0.5%). The rats were divided into ten groups, each containing ten males and ten females. No other long-term study has examined the effects of regular consumption of Roundup-tolerant GM maize and of a pesticide formulation, in any dilution, on blood parameters, sexual hormones, and multiple organs.
We found that these products provoked statistically discriminant disturbances in biochemical markers of livers and kidneys in females at the 15th month, when most of the rats were still alive. At the same time, testosterone and estradiol levels were also disturbed. At the end of the experiments, these disrupted biochemical markers corresponded to pathologies evidenced in a blinded manner: notably hepatorenal deficiencies, more severe in males, and female mammary tumors, which led to premature deaths. For instance, after around 700 days, there were up to 3.25 times more mammary tumors (the highest rate was observed in females consuming 0.1 ppb of Roundup in water). This could be associated with a 2.4-fold increase in pituitary dysfunctions noticed by the end of the experiment (2 years).
These findings were immediately dismissed by persons involved in the products’ authorizations, or in collaboration with biotech industries. A number of them wrote to FCT to fuel a controversy, including Richard Goodman, a former Monsanto employee in charge of the immunotoxicity files of GMOs, and Paul Christou, a patent holder of the methods used to create transgenic plants. This was rapidly followed by a coordinated response of national regulatory agencies organized by the European Food Safety Authority (EFSA), released on 4 October, 2012. The EFSA had previously assessed NK603, and glyphosate, the declared active principle of Roundup, as safe on the basis of regulatory data, which it never fully published. The EFSA has since published Monsanto’s safety data on NK603 maize, but not on glyphosate. The NK603 data are in a PDF format that prevents easy statistical re-analysis. However, there was no long-term toxicological assessment for NK603, or for Roundup. Moreover, we have demonstrated in several studies [8-10] that Roundup is far more toxic than glyphosate because of its non-inert adjuvants. On 10 October, 2012, the Monsanto Company also sent its criticisms to FCT but did not release its safety data, claiming commercial confidentiality.
Overall, the first wave of criticisms arrived within a week, mostly from plant biologists. We answered all criticisms in FCT on 9 November, 2012. The debate then encompassed scientific arguments. A second wave of ad hominem and potentially libellous comments appeared in different journals [13-16]. Regrettably, there were no invitations to respond to these escalating attacks, which we discovered only through our own literature survey. Some of the authors of these articles had serious yet undisclosed conflicts of interest. The scientific remarks concentrated on the supposedly inadequate choice of the Sprague Dawley rat strain, which is, however, a classic model for toxicology. The Sprague Dawley strain was also used by Monsanto in its 90-day test on the same maize. In addition, Monsanto performed blood biochemistry on the same number of rats per group as in our experiment. Thus, with regard to blood and urine biochemistry, Monsanto gathered data from the same number of rats as we did.
Unsubstantiated allegations of fraud or errors
Paul Christou, the lead author of Arjo et al., demanded that our paper be retracted and insulted us personally. He claimed first, in a letter addressed to the editor-in-chief, that the publication of our study ‘does not meet minimal acceptable standards of scientific rigor’ and ‘will damage an entire scientific discipline due to flawed conclusion’ (personal communication). Then he attacked us in an article published in the journal Transgenic Research on 20 December 2012. The quantity of insults and defamation in this paper, authorized and co-authored by the editor-in-chief of a supposedly serious journal, is excessive. They include: ‘abject failure to treat the experimental animals in a humane manner’, ‘inability to formulate a valid hypothesis’, ‘media fanfare’, ‘fraudulent or knowingly inaccurate statements’, ‘unethical behavior’, ‘transparent attempt to discredit regulatory agencies’, ‘ammunition for extremists’, ‘flawed science’, ‘disingenuous or inept’, and ‘unjustified waste of animals’ (while at the same time asking for more animals per group). Christou and co-authors suggest that by practising ‘flawed science’ we are working against ‘progress towards a better quality of life’ and in fact are ‘actively working to make life worse’. We were not invited to reply. This behaviour can be explained, though not justified, by undisclosed conflicts of interest.
Christou is not only the editor-in-chief of Transgenic Research, the journal in which he published his article, but is also linked to Monsanto. He is named as the inventor on several patents on GM crop technology, for most of which Monsanto owns the property rights. These include patents on the plant transformation process used to make glyphosate-tolerant transgenic corn plants. He worked as a researcher at Agracetus Inc. (later acquired by Monsanto) for 12 years. Then, from 1994 to 2001, Christou worked at the John Innes Centre in the UK, which is heavily invested in GM crop technology. He has no mammalian toxicology background. Moreover, in his published article, Christou gave as his affiliation only his publicly funded position at a research institute. Christou’s failure to declare his current interests – his inventor status on patents concerning the company that developed the products we tested – could be considered grounds for retraction of a paper in a scientific journal, according to ethical guidelines for scientific publishing.
The Arjo et al. article was co-authored by Wayne Parrott, an active member of the Biotechnology Committee at the International Life Sciences Institute (ILSI). ILSI is funded by multinational food, agribusiness, and biotechnology companies, including Monsanto and Syngenta. ILSI has proved highly controversial in North America and Europe due to its influence on risk assessment methodologies for chemicals, pesticides, and GM foods [25-27]. Wayne Parrott also has inventor status in patents on materials and methods for selecting transgenic organisms and on transformation vector systems.
In addition, Christou and his co-authors made numerous mistakes, false and unsubstantiated assertions, and misrepresentations of our data. The title of Arjo et al.’s paper is defamatory and misrepresents our research, implying that it is ‘pseudoscience’ and alleging that it claimed Roundup Ready maize and Roundup herbicide caused ‘cancer’ in rats – a claim we never made. We did not even use the word ‘cancer’ in our paper, although this argument was reiterated in the final letter of the editor-in-chief of FCT explaining his decision to retract it. Tumors do not always lead to cancer, even though they can be more harmful over a shorter time, depending on their size and location, by impairing internal functions.
Arjo et al.’s paper begins with a false assertion that is not evidenced in the paper or in the cited source: ‘It started with a press conference in which journalists agreed not to engage in fact-checking’. The authors made other false assertions about our study, for example, alleging that ‘the water consumption was not measured’. In fact, we measured both the water and food consumption, and the stability of the Roundup solution over time. This was indicated in the paper, in which we explained that not all the data can be shown in one paper and that we concentrated on the most important data; these parameters were only part of a routine survey. They also falsified the reporting of the data, compiling the mortality data only at the end of the experiment and ignoring the originality and the major findings of the differential chronological effects between treated rats and controls, which we established by measuring tumor size twice a week over 2 years. Moreover, we respected legal requirements and ethical norms relating to animal experiments, and Arjo et al. present no evidence to the contrary, so their allegation of inhumane treatment of the rats is without substance.
Importantly, we had already answered many of the criticisms made by Arjo et al. in a paper that was published before theirs. Their publication was received on 20 December 2012, whereas our paper was published on 9 November 2012. Our published answers were simply ignored.
Christou was not alone in failing to declare conflicts of interest in his criticism of our paper. After we underlined that 75% of the comments addressed to FCT within a week of our study’s publication came from plant biologists, it was discovered that several of them held patents on GMOs. Some authors were employees of Monsanto Company, which owns NK603 GM maize and sells Roundup herbicide [4,11]. Other more recent papers, published by plant biologists and/or affiliates of the industry-funded group ILSI [15,16], repeated the arguments. The author of a separate article criticizing our study expressed concern that our results could damage public opinion about GM crops – a sentiment that gives precedence to economic interests over public health. An article in Forbes magazine even alleged, without presenting any evidence, that we had committed fraud. Surprisingly, even the Monsanto authors declared that they had ‘no conflicts of interest’ in their first draft published online on the FCT website. Investigative reports [32,33] showed that many authors of these opinions had failed to disclose their conflicts of interest, including Henry Miller, Mark Tester, Chris Leaver, Bruce Chassy, Martina Newell-McGloughlin, Andrew Cockburn, L. Val Giddings, Sivramiah Shantharam, Lucia de Souza, Erio Barale-Thomas, and Marc Fellous. The undisclosed conflicts of interest included links with biotechnology companies that develop GMOs and with industry-backed lobbying organizations.
All of this has huge implications for public health. We observed intense lobbying in parliaments, as well as evidence of conflicts of interest among persons involved in the regulatory decisions for the commercialization of these products. A series of high-profile conflict-of-interest revelations (not restricted to GMOs and pesticides) led to the resignations of leading administrators involved in decisions affecting the assessment of these products, including the European Commissioner John Dalli and the former chair of the European Food Safety Authority’s (EFSA) management board, Diana Banati. In February 2013, a strange occurrence following the publication of our paper raised questions about the connections of industry to scientific publishing, described below.
Conflicts of interests in the editorial board
In February 2013, FCT acquired a new assistant editor for biotechnology, Richard E. Goodman. The editor-in-chief has admitted that Goodman was brought onto the editorial board after he sent a letter to FCT to complain about our study. In his letter, Goodman appears worried about economic consequences, but not so much about potential public health consequences (personal communication). He wrote: ‘The implications and the impacts of this uncontrolled study is having HUGE impacts, in international trade, in consumer confidence in all aspects of food safety, and certainly in US state referendums on labelling’. Further in his letter, Goodman asked for ‘an evaluation by an independent set of toxicologists’. This, in particular, is why the Publishing Assistant for FCT asked for our raw data on 15 March 2013.
In fact, we can question the independence of this re-evaluation. After his appointment at FCT, Goodman was a member of the subcommittee that requested our raw data, until we complained to the Elsevier publishing group. Goodman is far from independent: he previously worked for Monsanto for 7 years, and he has a long-standing affiliation with ILSI. Goodman will now deal with all biotechnology papers submitted to FCT. Another scientific paper on GMO risks was withdrawn from FCT without explanation, shortly after it had been accepted and published by the journal. The paper was immediately published by another journal at the authors’ initiative.
On 19 November 2013, more than 1 year after publication, we received a letter from the editor-in-chief of FCT, A. Wallace Hayes, asking us to retract our paper. In his retraction notice, the editor-in-chief certifies that ‘no evidence of fraud or intentional misrepresentation of the data’ was found in the investigation, that the results are ‘not incorrect’, that ‘there was no misconduct’, and that the sole reason for retraction is the ‘inconclusiveness’ of the paper. He argued that no conclusions could be drawn because we studied 10 rats per group over 2 years, because they were Sprague Dawley rats, and because we could not conclude on cancer. In fact, the Sprague Dawley is a standard choice for 2-year studies performed by industry and independent scientists alike [17,41]. We also measured 10 animals per sex per group in accordance with the OECD 452 guideline on chronic toxicity studies, because our study is a chronic toxicity study that was never intended to be a carcinogenicity study. We wish to point out that Dr Hayes’ decision violates the retraction guidelines of the Committee on Publication Ethics (COPE), of which FCT is a member. ‘Inconclusiveness’ is not a valid reason for a journal to retract a paper: lack of conclusiveness (which can be discussed) and error are not synonymous. COPE’s criteria for retraction include scientific misconduct/honest error, prior publication, plagiarism, and unethical research. None of these criteria applies to our study. On the contrary, numerous published scientific papers contain inconclusive findings; it is for further studies to build on the reported findings and arrive at a more conclusive position. In contrast with our study measuring toxicity, the Monsanto study reporting safety with the same number and the same strain of rats, but limited to 90 days, is not subject to the same controversy.
The data in the Monsanto study show statistically significant differences in multiple-organ functions between the GM and non-GM feeding groups, which the authors dismissed as not ‘biologically meaningful’, using a set of questionable criteria. The significant effects observed do not have to be linear to the dose to be taken into consideration; otherwise, endocrine effects will be dismissed. In addition, biochemical disturbances do not have to correlate simultaneously with organ lesions, contrary to the claims of Doull et al. in defence of Monsanto. These outdated concepts come from the toxicology of poisons and are not valid for endocrine disruption [43,45]. If 10 rats/sex/group are too few to demonstrate a toxic effect, then this number of rats is certainly too small to demonstrate safety. Overall, in the current system of assessment, any toxic effect is first suspected of being a false positive arising by chance, while the possibility that no evidence of effect is a false negative goes unquestioned. The Monsanto data as presented are thus inconclusive and should also be retracted.
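The asymmetry noted above – that a group size too small to demonstrate toxicity is also too small to demonstrate safety – is at bottom a statistical power argument. The sketch below illustrates it with a simple Monte Carlo power estimate; the 1-standard-deviation effect size, the normality assumption, and the fixed critical value are illustrative assumptions, not parameters taken from either study:

```python
import math
import random

random.seed(42)

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

def power(n, effect_sd, trials=20_000, t_crit=2.101):
    """Estimate by simulation how often a true shift of `effect_sd`
    standard deviations between two groups of `n` animals is detected.
    t_crit = 2.101 is the two-sided 5% critical value for ~18 df;
    reusing it for larger n is slightly conservative."""
    hits = 0
    for _ in range(trials):
        control = [random.gauss(0.0, 1.0) for _ in range(n)]
        treated = [random.gauss(effect_sd, 1.0) for _ in range(n)]
        if abs(welch_t(control, treated)) > t_crit:
            hits += 1
    return hits / trials

# With 10 animals per group, even a large (1 SD) true effect is
# missed in a substantial fraction of experiments, so a null result
# at n = 10 is weak evidence of safety.
print(f"power at n=10: {power(10, 1.0):.2f}")
print(f"power at n=50: {power(50, 1.0):.2f}")
```

At 10 animals per sex per group, a failure to find a statistically significant difference therefore says little either way: ‘no evidence of effect’ cannot be read as ‘evidence of no effect’.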
Following the retraction of our paper, many letters were sent to the editor-in-chief of FCT. On 10 December 2013, he published a defence of the retraction, which raised many doubts as to his understanding of our data. He claimed that we concluded on cancer, although ours was a long-term toxicity study with a detailed statistical analysis of blood and urine parameters. He also defended the study done by Monsanto, claiming that it used 20 rats/sex/group while we used only 10. In fact, although the Monsanto study used twice our sample size, the Monsanto authors analyzed blood and urine from only half of the animals (10), the same number of sampled animals as in our study.
According to an editorial in Environmental Health Perspectives, ‘the decision to retract a published scientific work by an editor, against the desires of the authors, because it is “inconclusive” based on a post hoc analysis represents a dangerous erosion of the underpinnings of the peer-review process, and Elsevier should carefully reconsider this decision’.
Confidentiality and censorship erode the value of science
Recent reviews of the GM food safety literature have found that research concluding that GM products are safe tends to come from industry, and that research conducted by those with financial or professional conflicts of interest is associated with outcomes favorable to the GM sector. In fact, it appears in our case that the consequences of conflicts of interest in science go beyond divergences in scientific interpretation and extend to unscientific practices: confidentiality and censorship.
Transparency of, and access to, all the raw data obtained by companies and accepted by regulatory agencies (notably the blood analyses of rats) as proof of safety for products is an unavoidable first step to move forward in this debate. It is the only way in which the scientific community can enter the scientific discussion. This is why we republish our paper in open access, together with its raw data, allowing debate about our results. This is not possible for the data used as proof of safety for commercial authorizations. The Monsanto toxicological data on NK603 maize recently made public by EFSA are not in a statistically usable format, and an agreement with Monsanto is required before use. Moreover, the data examined for Roundup authorizations are clearly inadequate. For instance, ANSES (the French Agency for Food, Environmental and Occupational Health & Safety) confirmed to us in writing (January 2013) that there were no 2-year animal studies of Roundup in its whole formulation, adding that there are only a few studies of acute toxicity (a few days up to 3 weeks) without any blood tests. Instead, glyphosate, which is much less toxic than Roundup [10,49], is tested alone by Monsanto in its reports to regulatory authorities. We strongly emphasize that data with implications for public health are not related to manufacturing patents and should not be kept confidential. Removal of confidentiality claims on biosafety data is necessary to adhere to standard scientific procedures of quality assurance, to increase transparency, to minimize the impact of conflicts of interest, and ultimately to improve public confidence in GMOs. Moreover, in the regulatory assessment of GMOs, chemicals, and medicines, confidential tests are conducted by the applicant companies themselves, often in their own laboratories or in those of subcontractors.
The second step must be the commissioning of new experiments on new or the most important products by laboratories independent of the companies. These laboratories would be recruited by public tender, with compulsory transparency of the results. This public research would be funded by companies, at a level corresponding to their previous budget for regulatory testing, but managed independently of them. The protocols and results would be submitted to open and adversarial assessment. Thus, there would be no additional financial cost or time delay compared to the current system. Such reforms would not only radically transform the understanding and knowledge of toxicology and science in general, but would also radically reduce public health costs and promote trust in companies and in science. This would move the world towards a sustainable development of products with low, if any, impacts on health and the environment.
The reason given to retract our paper – ‘inconclusiveness’ – is unprecedented and violates the norms of scientific publishing. The decision to retract cannot be rationalized on any discernible scientific grounds. Censorship on research into the risks of a technology so critically entwined with global food safety undermines the value and the credibility of science.
The authors declare that they have no competing interests.
GES designed and coordinated the commentary. RM participated in the drafting of the manuscript and final version. ND and JsDV helped in the writing, compiling the literature, revising details, and proofreading the manuscript. All authors read and approved the final manuscript.
We acknowledge the Charles Leopold Mayer (FPH) and Denis Guichard Foundations, together with CRIIGEN, for fellowships and structural support. We are equally thankful to Malongo, Lea Nature, and the JMG Foundation for their help.
- Seralini G-E, Mesnage R, Clair E, Gress S, de Vendomois J, Cellier D (2011) Genetically modified crops safety assessments: present limits and possible improvements. Environ Sci Eur 23:10
- Spiroux de Vendômois J, Cellier D, Velot C, Clair E, Mesnage R, Seralini GE (2010) Debate on GMOs health risks after statistical findings in regulatory tests. Int J Biol Sci 6:590-598
- Seralini GE, Clair E, Mesnage R, Gress S, Defarge N, Malatesta M, Hennequin D, de Vendomois JS (2012) RETRACTED: Long term toxicity of a Roundup herbicide and a Roundup-tolerant genetically modified maize. Food Chem Toxicol 50:4221-4231. Retracted in: Food Chem Toxicol 2014, 63:244
- Hammond B, Dudek R, Lemen J, Nemeth M (2004) Results of a 13 week safety assurance study with rats fed grain from glyphosate tolerant corn. Food Chem Toxicol 42:1003-1014
- Spiroux de Vendômois J, Roullier F, Cellier D, Seralini GE (2009) A comparison of the effects of three GM corn varieties on mammalian health. Int J Biol Sci 5:706-726
- EFSA (2012) Review of the Séralini et al. (2012) publication. EFSA J 10(10):2910
- EFSA (2013) EFSA promotes public access to data in transparency initiative. Press release. http://www.efsa.europa.eu/fr/press/news/130114.htm
- Richard S, Moslemi S, Sipahutar H, Benachour N, Seralini GE (2005) Differential effects of glyphosate and roundup on human placental cells and aromatase. Environ Health Perspect 113:716-720
- Benachour N, Seralini GE (2009) Glyphosate formulations induce apoptosis and necrosis in human umbilical, embryonic, and placental cells. Chem Res Toxicol 22:97-105
- Mesnage R, Bernay B, Seralini GE (2013) Ethoxylated adjuvants of glyphosate-based herbicides are active principles of human cell toxicity. Toxicology 313:122-128
- Hammond B, Goldstein DA, Saltmiras D (2013) Letter to the editor. Food Chem Toxicol 53:459-464
- Seralini GE, Mesnage R, Defarge N, Gress S, Hennequin D, Clair E, Malatesta M, de Vendomois JS (2013) Answers to critics: why there is a long term toxicity due to NK603 Roundup-tolerant genetically modified maize and to a Roundup herbicide. Food Chem Toxicol 53:461-468
- Arjo G, Portero M, Pinol C, Vinas J, Matias-Guiu X, Capell T, Bartholomaeus A, Parrott W, Christou P (2013) Plurality of opinion, scientific discourse and pseudoscience: an in depth analysis of the Seralini et al. study claiming that Roundup Ready corn or the herbicide Roundup cause cancer in rats. Transgenic Res 22:255-267
- Houllier F (2012) Biotechnology: bring more rigour to GM research. Nature 491:327
- Martinelli L, Karbarz M, Siipi H (2013) Science, safety, and trust: the case of transgenic food. Croat Med J 54:91-96
- Romeis J, McLean MA, Shelton AM (2013) When bad science makes good headlines: Bt maize and regulatory bans. Nat Biotechnol 31:386-387
- King-Herbert A, Sills R, Bucher J (2010) Commentary: update on animal models for NTP studies. Toxicol Pathol 38:180-181
- Christou P (2013) Full CV.
- US Patent 5015580 (1991) Particle-mediated transformation of soybean plants and lines. http://www.google.com/patents/US5015580
- US Patent 5554798 (1996) Fertile glyphosate-resistant transgenic corn plants. http://www.google.com/patents/US5554798
- John Innes Centre (2001) Laying the foundation for more science at the John Innes Centre, Norwich, UK. http://www.jic.ac.uk/corporate/media-and-public/news-archive/010716.htm
- COPE (2009) Retraction guidelines. http://publicationethics.org/files/retraction%20guidelines.pdf
- ILSI (2013) Biotechnology Update Symposium.
- ILSI (2011) ILSI Annual Report.
- NRDC (2006) Industry association barred from influencing international health standards. http://www.nrdc.org/media/pressreleases/060131.asp
- Robinson C, Holland N, Leloup D, Muilerman H (2013) Conflicts of interest at the European Food Safety Authority erode public confidence. J Epidemiol Community Health 67(9):712-720
- Lougheed T (2006) WHO/ILSI affiliation sustained. Environ Health Perspect 114(9):A521
- US Patent 7005561 (2006) Arabitol or ribitol as positive selectable markers. http://www.google.co.in/patents/US7005561
- US Patent 6096523 (2000) Transformation vector system. http://www.google.com/patents/US6096523
- Hayes AW (2014) Editor in Chief of Food and Chemical Toxicology answers questions on retraction. Food Chem Toxicol 65:394-395
- Worstall T (2012) Proof perfect that the Seralini paper on GM corn and cancer in rats is rubbish. Forbes. http://www.forbes.com/sites/timworstall/2012/09/21/proof-perfect-that-the-seralini-paper-on-gm-corn-and-cancer-in-rats-is-rubbish/
- Sourice B (2012) OGM: la guerre secrète pour décrédibiliser l’étude Séralini [GMOs: the secret war to discredit the Séralini study]. http://blogs.rue89.nouvelobs.com/de-interet-conflit/2012/11/12/ogm-la-guerre-secrete-pour-decredibiliser-letude-seralini-228894
- Matthews J (2012) Smelling a corporate rat. Spinwatch. http://www.spinwatch.org/index.php/issues/science/item/164-smelling-a-corporate-rat
- European Commission (2012) Press statement, 16 October 2012. Brussels. MEMO/12/788
- EFSA (2012) EFSA Management Board Chair resigns.
- ILSI (2012) Symposium on sensitizing properties of protein. http://www.hesiglobal.org/i4a/pages/index.cfm?pageid=3595
- ILSI (2005) ILSI Protein Allergenicity Technical Committee.
- Mezzomo BP, Miranda-Vilela AL, de Souza Freire I, Barbosa LC, Portilho FA, Lacava ZG, Grisolia CK (2012) WITHDRAWN: Effects of oral administration of Bacillus thuringiensis as spore-crystal strains Cry1Aa, Cry1Ab, Cry1Ac or Cry2Aa on hematologic and genotoxic endpoints of Swiss albino mice. Food Chem Toxicol. doi:10.1016/j.fct.2012.10.032
- Mezzomo B, Miranda-Vilela A, Freire I, Barbosa L, Portilho F (2013) Hematotoxicity of Bacillus thuringiensis as spore-crystal strains Cry1Aa, Cry1Ab, Cry1Ac or Cry2Aa in Swiss albino mice. J Hematol Thromb Dis 1:104. doi:10.4172/jhtd.1000104
- Hayes AW (2014) Retraction notice to “Long term toxicity of a Roundup herbicide and a Roundup-tolerant genetically modified maize” [Food Chem. Toxicol. 50 (2012) 4221–4231]. Food Chem Toxicol 63:244
- Meyer H, Hilbeck A (2013) Rat feeding studies with genetically modified maize – a comparative evaluation of applied methods and risk assessment standards. Environ Sci Eur 25:33
- OECD (2012) OECD Guidelines for the Testing of Chemicals, Section 4: Health Effects. Test No. 452: Chronic Toxicity Studies. OECD Publishing, Paris
- Séralini GE, de Vendomois JS, Cellier D, Sultan C, Buiatti M, Gallagher L, Antoniou M, Dronamraju KR (2009) How subchronic and chronic health effects can be neglected for GMOs, pesticides or chemicals. Int J Biol Sci 5:438-443
- Doull J, Gaylor D, Greim HA, Lovell DP, Lynch B, Munro IC (2007) Report of an Expert Panel on the reanalysis by Séralini et al. (2007) of a 90-day study conducted by Monsanto in support of the safety of a genetically modified corn variety (MON 863). Food Chem Toxicol 45:2073-2085
- Vandenberg LN, Colborn T, Hayes TB, Heindel JJ, Jacobs DR Jr, Lee DH, Shioda T, Soto AM, Vom Saal FS, Welshons WV, Zoeller RT, Myers JP (2012) Hormones and endocrine-disrupting chemicals: low-dose effects and nonmonotonic dose responses. Endocr Rev 33:378-455 Publisher Full Text
- Portier C, Goldman L, Goldstein B (2014) Inconclusive findings: now you see them, now you don’t! Environ Health Perspect 122(2):A36doi:10.1289/ehp.1408106Publisher Full Text
- Diels J, Cunha M, Manaia C, Sabugosa-Madeira B, Silva M (2011) Association of financial or professional conflict of interest to research outcomes on health risks or nutritional assessment studies of genetically modified products. Food Policy 2011(36):197-203 Publisher Full Text
- [http://www.criigen.org/SiteFr//images//anses_letter.pdf] webciteMortureux M: ANSES letter regarding the raw data of glyphosate. 2013,.
- Mesnage R, Defarge N, SpirouxDeVendômois J, Séralini GE (2014) Major pesticides are more toxic to human cells than their declared active principles. BioMed Res Int 2014:Article ID 179691 Publisher Full Text
- [http:/ / www.bfr.bund.de/ en/ the_bfr_has_finalised_its_draft_rep ort_for_the_re_evaluation_of_glypho sate-188632.html] webciteGerman Federal Agency BfR: The BfR has finalised its draft report for the re-evaluation of glyphosate..
- Nielsen KM (2013) Biosafety data as confidential business information. PLoS Biol 11(3):e1001499 Publisher Full Text
© 2014 Séralini et al.; licensee Springer
There may be a fatal tumour in your brain. The only way we’ll know is if I cut it open – but there’s a chance that might kill you. Shall I go ahead?
We’ve just been confronted with a question a bit like this by scientists at the University of Wisconsin-Madison. They insist the only way to guard against the outbreak of a deadly flu epidemic like the Spanish flu of 1918 is to create viruses very similar to those responsible. Not to study them in the wild, mind, but to actively engineer from bird flu genes a strain that can pass in airborne droplets from one animal – or perhaps species – to another. Sure, it is dangerous. But what about the risk of doing nothing?
That argument doesn't wash with Lord Robert May, one of the world's most respected epidemiologists. Publicly he has called the work "absolutely crazy", and given May's reputation for directness his private opinion is likely to be less polite. He's not alone. Other researchers have challenged the Wisconsin team's claims that their work is the only way to find out how to combat a lethal flu outbreak effectively, and that the experiments were deemed necessary and safe by experts. May even suggests that the team effectively hoodwinked the US National Institutes of Health into granting approval and funding.
Research on pathogens, particularly viruses, has become increasingly disputatious over the past decade. In 2002 a team at the State University of New York ordered pieces of synthetic DNA through the mail, from which they pasted together the genome of the polio virus. They then "booted it up" to infect mice, explaining that the work had been done to highlight how easily such a feat could now be accomplished, and the danger that implies. Others accused the team of an irresponsible publicity stunt. The Wisconsin team, led by the virologist Yoshihiro Kawaoka, courted controversy in 2012 when it created a mutant strain of H5N1 bird flu that could spread among mammals. Its results, and similar ones from a team in the Netherlands, were deemed too dangerous to publish by a US biosecurity panel that feared what bioterrorists might do with them.
In one sense we have been here before. Research often carries risks, whether of intentional misuse or accidents. The discovery of nuclear energy in the early 20th century, and of how to release it through nuclear fission in 1938, were arguably examples of “pure” research with perilous applications that still loom apocalyptically today. The common response of scientists is that such is the inevitable price of new knowledge.
But the dangers of biotechnology, genetics and synthetic biology are something new. For centuries we struggled to keep nasty microorganisms at bay. Even the discovery of antibiotics gave us no protection from viruses, and the emergence of HIV was a bitter reminder of that. But with the arrival of genetic manipulation in the 1970s, nature was no longer an inscrutable menace warded off with trial-and-error potions: we could fight back at the genetic level.
This new means of intervention brought a new way to foul up. Synthetic biology promises to take the battle to the next level: to move beyond tinkering with this or that resistance gene, say, and to enable full-scale engineering and design of life. We can take our nemeses apart and rebuild them from scratch.
Yet we arrive at this point relatively unprepared to deal with the moral dilemmas. The heated nature of the current debate signifies as much: scientists have never been averse to shouting at each other about the interpretation of their results, but it is rare to see them so passionately opposed on the question of whether a piece of research should be done in the first place. If even top experts can’t agree, what’s to be done?
Physical scientists are often faced with questions that can’t be answered experimentally; not, on the whole, because the experiments are too dangerous – but because they are too hard. Their usual response is to figure out what should happen in theory, and then see if the predictions can be tested in more accessible, simpler ways. But in biology it is much, much harder to make reliable theoretical predictions (or any predictions at all), because living things are so damned complicated.
We’re getting there, however, as witnessed by the development of computer models of human physiology and biochemistry for drug testing. It’s not too much to hope that one day drugs might be designed and safely trialled almost wholly on the computer, without the need for controversial animal tests or expensive human trials.
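To give a flavour of what such in-silico modelling looks like at its very simplest, here is a toy SIR (Susceptible–Infected–Recovered) compartmental model of an epidemic, the classic starting point for flu dynamics: it lets you explore how an outbreak unfolds without a live virus anywhere in sight. This is a minimal sketch with arbitrary, uncalibrated parameters of my own choosing, not any real research model.

```python
# Toy SIR epidemic model, integrated with simple Euler steps.
# All parameter values here are illustrative, not fitted to any real flu strain.

def simulate_sir(s0, i0, r0, beta, gamma, days, dt=0.1):
    """Return final (S, I, R) fractions of the population after `days`.

    beta  -- transmission rate (contacts x infection probability per day)
    gamma -- recovery rate (1 / mean infectious period in days)
    """
    s, i, r = s0, i0, r0
    for _ in range(int(days / dt)):
        new_infections = beta * s * i * dt   # S -> I
        new_recoveries = gamma * i * dt      # I -> R
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
    return s, i, r

if __name__ == "__main__":
    # Start with 1% infected; basic reproduction number R0 = beta/gamma = 2.5
    s, i, r = simulate_sir(0.99, 0.01, 0.0, beta=0.5, gamma=0.2, days=100)
    print(f"S={s:.3f} I={i:.3f} R={r:.3f}")
```

Even this crude sketch reproduces the qualitative behaviour that matters for policy: with R0 above 1 the infection burns through most of the population, and cutting beta (through vaccination or quarantine) can stop it, all explored on a laptop rather than in a containment lab.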
Other models might be adequate for understanding viruses, which are after all among the simplest biological entities known. One reason why some researchers argue that remaining smallpox stocks be destroyed is that the live virus is no longer needed for research – its genome sequence is enough. Looked at this way, making hair-raisingly lethal viruses to understand their behaviour reflects our lamentable ignorance of the theoretical principles involved.
There could be ways to make experiments safer too. Faced with fears about the quasi-artificial life forms they are starting to create, synthetic biologists say that it should be possible to build in safety measures – for example, so that the organisms can only survive on a nutrient unavailable in the wild, or will self-destruct after a few rounds of replication.
These are not fantasies, although they raise questions both about whether such fail-safe strategies simply give natural selection a stronger pressure to evade them – and whether there's a false security in the whole engineering paradigm when applied to biology.
All the same, the questions raised by flu research can't be defused with techno-fixes alone. Forget the new Longitude Prize – here is a place where science really does need to be democratic.
One thing you can say for sure about the question posed at the outset is that the patient should have a say. If scientists are going to take these risks for our sake, as they claim, then we had better be asked for our approval.
It’s in our interests to ensure that our decision is informed and not kneejerk, and the appropriate democratic machinery requires careful construction. But the consent must be ours to give.