The Guardian have dredged up a US meteorologist, Paul Douglas, to come up with a list of “extreme” weather events, which he then claims climate change is making worse.
Whatever happened to normal weather? Earth has always experienced epic storms, debilitating drought, and biblical floods. But lately it seems the treadmill of disruptive weather has been set to fast-forward. God’s grandiose Symphony of the Seasons, the natural ebb and flow of the atmosphere, is playing out of tune, sounding more like a talent-free second grade orchestra, with shrill horns, violins screeching off-key, cymbal crashes coming in at the wrong time. Something has changed.
Let’s start by looking at some of his claims:
A warmer atmosphere is increasing water vapor levels overhead, juicing storms, fueling an increase in flash floods in the summer, and heavier winter snows along the East Coast of the USA. “All storms are 5 to 10 percent stronger in terms of heavy rainfall” explained Dr. Kevin Trenberth, at the National Center for Atmospheric Research in Boulder, Colorado. “It means what was a very rare event is now not quite so rare.”
Yet even the IPCC tell us they can find no evidence that floods are getting bigger or more frequent on a worldwide basis:
And as far as the US is concerned, the USGS say:
Only one of four large regions of the United States showed a significant relationship between carbon dioxide (CO2) in the atmosphere and the size of floods over the last 100 years. This was in the southwestern region, where floods have become smaller as CO2 has increased.
Storms? Surely any US meteorologist worth his salt must know that tornadoes have been getting much less frequent, and, more particularly, less violent since the 1970s.
He goes on to rehash the thoroughly discredited theory of Jennifer Francis that Arctic warming is making the jet stream more sluggish and wavy, bringing weather blocks.
If he had bothered reading HH Lamb, he might have found out that the same sort of weather was occurring when the world was cooling after the Second World War. This was what Lamb had to say in his volume, “Climate, History and The Modern World”:
ANOTHER TURNING POINT
Over the years since the 1940’s, it has become apparent that many of the tendencies in world climate which marked the previous 50 to 80 years or more have either ceased or changed…. It was only after the Second World War that the benign trend of the climate towards general warming over those previous decades really came in for much scientific discussion and began to attract public notice.
Such worldwide surveys as have been attempted seem to confirm the increase of variability of temperature and rainfall [since 1950].
In Europe, there is a curious change in the pattern of variability: from some time between 1940 and 1960 onwards, the occurrence of extreme seasons – both as regards temperature and rainfall – has notably increased.
A worldwide list of the extreme seasons reported since 1960 makes impressive reading. Among the items included:
1960-9 – Driest decade in central Chile since 1770’s and 1790’s.
1962-3 Coldest winter in England since 1740.
1962-5 Driest four-year period in the eastern United States since records began in 1738.
1963-4 Driest winter in England & Wales since 1743; coldest winter over an area from the lower Volga basin and Caspian Sea to the Persian Gulf since 1745.
1965-6 Baltic Sea completely ice covered.
1968 Arctic sea ice half surrounded Iceland for the first time since 1888.
1968-73 Severest phase thus far of the prolonged drought in the Sahel, surpassing all 20thC experience.
1971-2 Coldest winter in more than 200 yrs in parts of European Russia and Turkey: River Tigris frozen over.
1972 Greatest heatwave in the long records for north Finland and northern Russia.
1973-4 Floods beyond all previous recorded experience stretching across the central Australian desert.
1974-5 Mildest winter in England since 1834.
1975-6 Great European drought produced the most severe soil moisture deficit that can be established in the London (Kew) records since 1698.
1975-6 Greatest heatwaves in the records for Denmark, Netherlands and England.
1976-7 Severest winter in the temperature records (which began in 1738) for the eastern United States.
1978-9 Severest winter and lowest temperature recorded in 200 yrs in parts of northern Europe, and perhaps in the Moscow region. Snowfalls also extreme in parts of northern Europe.
This shortened list omits most of the notable events reported in the southern hemisphere and other parts of the world where instrument records do not extend so far back. Cases affecting the intermediate seasons, the springs and autumns, have also been omitted.
These variations, perhaps more than any underlying trend to a warmer or colder climate, create difficulties for the planning age in which we live. They may be associated with the increased meridionality of the general wind circulation, the greater frequency of blocking, of stationary high and low pressure systems, giving prolonged northerly winds in one longitude and southerly winds in another longitude sector in middle latitudes.
Over both hemispheres there has been more blocking in these years… The most remarkable feature seems to be an intensification of the cyclonic activity in high latitudes near 70-90N, all around the northern polar region. And this presumably has to do with the almost equally remarkable cooling of the Arctic since the 1950’s, which has meant an increase in the thermal gradient between high and middle latitudes.
He then goes full Guardian!
Pick up a newspaper or turn on the TV to see signs of climate volatility sparking more weather disruption. From the mega-blaze that swept across Fort McMurray, Alberta to repeated flooding of Houston, scorching heat in India, perpetual drought from California to Australia, and a record year for global hurricanes, typhoons and cyclones in 2015, the symptoms of a warming ecosystem are becoming harder to dismiss or deny.
We already know that the so-called mega-blaze in Alberta is small from a historical perspective, has nothing to do with climate change, and would have made little news had man not built a city in the middle of a wilderness where such fires happen all the time.
And what nonsense is this about drought?
There may have been a drought in California recently, one that is certainly not in any way unprecedented, but for the US as a whole, NOAA’s own figures show that droughts have been much less common, and less severe, in recent decades than they were in the past.
Not only are rainfall totals consistently higher than in the past, but the percentage of land area in decile 1, the driest category, is also sharply down. This indicates that the extra rainfall has been widespread, rather than simply extreme in just a few areas.
And Accumulated Cyclone Energy stats do not support the contention that global warming is making hurricanes worse.
Of course, weather and climate continually change. I have little doubt that in some places and at certain times extreme weather has increased, and no doubt too that in others the reverse is true.
What is sad about these pathetic little attempts to blame everything on global warming is that they stop us having a balanced and objective debate on the subject.
The real reason, however, for this story is revealed when Douglas tells us:
In my upcoming book I interview 11 veteran television meteorologists in the United States. All of them are witnessing symptoms of climate change in their hometowns.
Aitken et al. in Nature newly purports to confirm 2015 fears about instability of the Totten Glacier in East Antarctica. This could ‘suddenly’ raise sea level by as much as 4 meters! (Or, based on the abstract, maybe only 0.9 meters in ‘modern scale configuration’, but over 2 meters [2.9-4] in unspecified other configurations.)
There are two parts to the story of Aitken et al. 2016: the authors’ comments as reported by the MSM, and what the paper actually found.
An example from the Weather Channel:
“An Antarctic glacier three-fourths the size of Texas continues to melt into the sea, and if it disappears completely, sea levels will rise dramatically around the world, a new study says. The Totten Glacier is melting quickly in eastern Antarctica and threatens to become yet another point of concern as global temperatures rise, according to the study published in the journal Nature. It’s getting close to a “tipping point,” the study found, and if the glacier collapses, global sea levels could rise nearly 10 feet…”I predict that before the end of the century the great global cities of our planet near the sea will have two- or three-meter (6.5 to 10 feet) high sea defenses all around them,” study author Martin Siegert told the French Press Agency.” [Bolds mine]
From Science Daily, drawn from the Imperial College London press release:
Current rates of climate change could trigger instability in a major Antarctic glacier, ultimately leading to more than 2m of sea-level rise. By studying the history of Totten’s advances and retreats, researchers have discovered that if climate change continues unabated, the glacier could cross a critical threshold within the next century, entering an irreversible period of very rapid retreat. This would cause it to withdraw up to 300 kilometres inland in the following centuries and release vast quantities of water, contributing up to 2.9 metres to global sea-level rise. [Bolds mine]
Finally, the lurid title of Chris Mooney’s article in the WaPo on May 18: ‘Fundamentally unstable’: Scientists confirm their fears about East Antarctica’s biggest glacier
Most of the paper is a complex analysis of detailed gravimetric and magnetic data captured from low-pass aircraft flights mapping an important ridge component of Totten’s subglacial geology.
It is helpful to understand the context for seeking evidence of alarming sea level rise (SLR) (see my previous CE post Sea Level Rise Tipping Points). SLR is not accelerating, so warmunists have searched for future ice sheet ‘tipping points’ that might cause the abrupt SLR needed to support urgent CO2 mitigation. Greenland was the initial focus; it is not cooperating, because of its bowl-shaped geology. See my previous post for details and references.
The West Antarctic Ice Sheet (WAIS) was the next focal point. The Ronne Ice Shelf proved pinned and stable per the above-linked Tipping Points guest post. ANDRILL showed that the Ross Ice Shelf is also stable; its grounding line hasn’t shifted for about 4 millennia, ditto the Tipping Points sites linked to above. Attention then shifted to the Amundsen Embayment, where much was made in 2014 of the flowing Pine Island Glacier (PIG) – until it was pointed out that PIG sits on an active volcano that has nothing to do with global warming. (There are volcanic ash layers embedded in PIG.) WAIS is not cooperating, either. So attention has now shifted to the East Antarctic Ice Sheet (EAIS), where Totten is the biggest glacier/catchment basin, covering almost half of the above figure’s NASA-defined geological sector (which also contains the Moscow University Ice Shelf and the Frost glacier), just ‘east’ of the Wilkes Land sector in the figure below.
Where Totten enters the Southern Ocean, it is mostly grounded in shallows <500 meters deep. This does not affect its stability (like the Ross Ice Shelf), since the first ~500 meters of Antarctic coastal seawater is basically at the freezing point. But warmer seawater below about 500 meters is melting Totten’s base at a deep trough about 5 km wide and about 800 meters deep, discovered in 2015 [link]. This melting causes a slow retreat of the grounding line behind the trough. The annual basal melting/grounding line retreat rate is presently about 100 meters/year (but as fast as 175 meters per year in some places, according to Aitken per WaPo). It is useful to note that Aitken was an author, but not lead author, on the 2015 trough discovery paper.
This deep ocean melting process could move inland for about 150 km through the Sabrina subglacial basin (deep blue in the following figure from the 2015 paper) over about 1500 years before hitting a sub-ice rock ridge perpendicular to the glacier only about 200 meters below sea level, which would stop the melting (since the warmer melting water lies below ~500 meters). Aitken et al. 2016 estimate that this would raise sea level about 0.9 meters, or ~6 cm/century. No cause for alarm.
What Aitken et al. 2016 reports is another fjord-like deep ‘fault trench’ through this blocking ridge, which would (if the water temperature stratification remained undisturbed) enable basal melting to proceed through the interior Aurora subglacial basin behind the ridge. This process would continue for about another 350 km, or about 40% of the way back into the Totten catchment basin. Aitken et al. also used ice-penetrating radar to probe both the Sabrina and Aurora basin floors, confirming that Totten did in fact melt back through both basins about 3 million years ago in the Pliocene (before the onset of the current ice ages), with CO2 at about 400 ppm. That was spun into the PR alarm: it happened before at 400 ppm!!! At the current melting rates this retreat would take about 3 millennia and could raise sea level about 2.9 meters, an unalarming 10 cm/century. This is probably still far too fast, since all the Aurora warming water would have to enter undisturbed through the newly reported narrow trench through the ridge.
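The timescale and rate arithmetic above is easy to check. The following sketch uses only the figures quoted in the text (a ~100 m/yr retreat rate, basin distances of ~150 km and ~350 km, and SLR totals of 0.9 m and 2.9 m); the function names are mine, not the paper’s.

```python
# Sanity-check the retreat timescales and sea-level-rise (SLR) rates
# quoted above, using only the figures given in the text.

RETREAT_RATE_M_PER_YR = 100  # current grounding-line retreat, ~100 m/yr

def retreat_years(distance_km, rate_m_per_yr=RETREAT_RATE_M_PER_YR):
    """Years to melt back a given distance at a constant retreat rate."""
    return distance_km * 1000 / rate_m_per_yr

def slr_rate_cm_per_century(total_slr_m, years):
    """Average SLR rate (cm/century) implied by a total rise over a period."""
    return (total_slr_m * 100) / (years / 100)

# Sabrina basin: ~150 km back to the blocking ridge, ~0.9 m of SLR
t1 = retreat_years(150)                 # -> 1500 years
r1 = slr_rate_cm_per_century(0.9, t1)   # -> 6 cm/century

# Aurora basin beyond the ridge: ~350 km more, ~2.9 m total
t2 = retreat_years(350)                 # -> 3500 years, i.e. ~3 millennia
r2 = slr_rate_cm_per_century(2.9, 3000) # -> ~9.7 cm/century, the ~10 quoted

print(t1, r1, t2, round(r2, 1))
```

The point of the arithmetic is simply that, at the observed retreat rate, even the full 2.9 m scenario plays out over millennia, not decades.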
This is NOT fundamentally unstable collapse, implying 2-3 meters SLR by the end of this century, as the authors clearly intimated in their press releases.
How to get 3 feet of SLR by melting the Sabrina basin back to the ridge? Simply assume that all the ice in the catchment basin to the ridge disappears, even ice above sea level not subject to seawater melting. To the ridge and ‘trench’, the catchment basin is about 200-250 km wide, the glacier about 100 km wide, and its mouth and protruding ice shelf 145 km wide. The assumption is dubious, but not implausible. It would imply ice flow similar to that of coastal northeast Greenland glaciers today (another overhyped SLR alarm favorite), except that there are no such flowing glaciers there today, and Antarctica never gets above freezing in summer (while most of Greenland does, briefly).
How to get 2.9 meters SLR from the red oval? Easy. Just use the same entire catchment assumption to that deeper recessional melting point.
How to get ~4 meters (WaPo)? Just assume that if the Aurora basin behind the ridge melts via trough/trench intrusion of warmer seawater, the entire catchment will then lose all its ice because it lost its Totten ‘plug’ (up catchment ice is about 2.5 km thick).
This is the same assumption Rignot made in raising PIG alarm about losing all the ice in the Amundsen Embayment catchment, even though his own paper showed that is impossible (as per my previous post at CE).
This is the same assumption made by the Greenbaum et al. 2015 trench paper cited above (on which Aitken was a co-author), upon which Aitken et al. 2016 builds. From its SI:
8. Sea Level Potential for Totten Glacier and the Aurora Subglacial Basin
We estimate the global sea level potential of ice flowing through Totten Glacier using a modified approach applied for Thwaites and Pine Island Glaciers. We find the ice volume within the Totten Glacier Catchment [20], correct for the higher density of seawater, subtract the volume of seawater required to replace the submarine ice, and divide the result by the area of the world oceans (3.6E14 m2). The result, ~3.5 meters, is conservative because it implies vertical catchment boundaries whereas, in reality, ice from neighboring catchments would contribute to the total sea level contribution if the entire catchment was drained of ice.
We follow a similar procedure to compute the total potential global sea level contribution of the Aurora Subglacial Basin (ASB) using catchment 13 defined on NASA Goddard Space Flight Center’s drainage basin website [21]. Using that catchment we find that at least 5.1 m of global sea level potential is grounded below sea level and is therefore more susceptible to retreat. This figure assumes that all remaining ice grounded above sea level remains as it is today with unrealistic vertical cliffs. If all of the ice in the ASB were to melt, the total sea level contribution would be closer to 6.7 meters. The sea level figures here have not been corrected for isostatic rebound associated with the removal of ice loading of the crust.
[Note: the 6.7 meters assumes all the ice in this entire sector of the first figure disappears. It is easy to build scary PR from bad assumptions. Rignot blazed a false trail now relied on (SI fn 17, 18) by others.]
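The SI’s sea-level-potential bookkeeping can be sketched in a few lines: convert the ice volume to an equivalent seawater volume, subtract the seawater that would flood the below-sea-level cavity, and spread the remainder over the world oceans. The densities and ocean area below are standard figures; the catchment volumes are hypothetical round numbers for illustration only, not the paper’s actual data.

```python
# Sketch of the SI's sea-level-potential method: ice volume corrected for
# seawater density, minus the seawater replacing submarine ice, divided
# by the area of the world oceans.

RHO_ICE = 917.0         # kg/m^3, glacial ice
RHO_SEAWATER = 1028.0   # kg/m^3, seawater
OCEAN_AREA_M2 = 3.6e14  # area of the world oceans, per the SI

def sea_level_potential_m(ice_volume_m3, submarine_ice_volume_m3):
    """Global sea-level rise (m) if the catchment drained, per the SI method."""
    seawater_equiv = ice_volume_m3 * RHO_ICE / RHO_SEAWATER
    return (seawater_equiv - submarine_ice_volume_m3) / OCEAN_AREA_M2

# Hypothetical volumes (illustration only, NOT the paper's numbers):
# ~1.9 million km^3 of ice, ~0.4 million km^3 of it grounded below sea level
print(round(sea_level_potential_m(1.9e15, 4.0e14), 2))  # prints 3.6
```

Note that the method’s result depends entirely on the assumed catchment volume, which is exactly why the “entire catchment drains” assumption drives the headline numbers.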
The alarming estimates from this new Nature paper, particularly as represented by the media, are grievously wrong both with respect to the amount and the rate of sea level rise that might be associated with melting of the EAIS Totten glacier.
There is unjustified author spin in the press releases and the authors’ interviews. There are underlying bad assumptions, never mentioned except by reference to a previously refuted [here] bad paper by Rignot. A tangled web of deceit, to paraphrase a famous poem.
The EPA recently posted online reports on two disputed herbicide chemicals, only to pull them offline shortly afterwards. The reports said glyphosate was not a human carcinogen and atrazine caused reproductive harm to mammals.
On April 29, the EPA’s cancer assessment review committee (CARC) posted an 86-page report on the agency’s regulations.gov website that stated glyphosate, the main ingredient in Monsanto’s Roundup weed killer that was deemed a “probable” human carcinogen by the World Health Organization last year, “was not likely to be carcinogenic to humans,” Reuters reported.
On May 2, the EPA pulled the report offline, saying the action was taken “because our assessment is not final,” and that the “preliminary” documents were “inadvertently” published.
“EPA has not completed our cancer review,” the EPA told Reuters. “We will look at the work of other governments as well as work by (the U.S. Department of Health and Human Services’) Agricultural Health Study as we move to make a decision on glyphosate.”
However, the cover page of the documents was titled “final Cancer Assessment Document,” Reuters reported, and the word “FINAL” was printed on each page of the report, dated October 1, 2015. The EPA said the assessment — part of the first comprehensive safety review of the chemical since 1993, which will determine glyphosate use in the US over the next 15 years — will be complete by the end of 2016.
Critics of glyphosate ridiculed the EPA for its short-lived assessment, while the chemical’s supporters, including agribusiness giant Monsanto, hailed the report for endorsing glyphosate’s safety. Monsanto even posted a copy of it on its website.
“Pulling the report indicates lack of confidence in the outcome,” tweeted Nathan Donley, a scientist for the Center for Biological Diversity. “Can’t blame them, the analysis is terrible.”
The glyphosate documents indicated that the EPA was “relying heavily on unpublished, industry funded studies” in its assessment that glyphosate is not a human carcinogen, the Center for Biological Diversity said. In contrast, the World Health Organization’s view that glyphosate is a “likely” human carcinogen included studies that were publicly available and that took into account consumer products.
“All they’re doing is reviewing studies that are funded by the industry,” Jennifer Sass, a senior scientist at Natural Resources Defense Council, told Reuters.
In 1974, Monsanto began selling the chemical in Roundup, which has become a top biocide for farming, especially with genetically engineered crops, as well as for home and garden use.
“No pesticide regulator in the world considers glyphosate to be a carcinogen, and this conclusion by the U.S. EPA once again reinforces this important fact,” said Hugh Grant, Monsanto’s CEO.
The use of glyphosate in herbicides has increased by more than 250 times in the United States over the last 40 years, according to the New England Journal of Medicine. Long-term exposure to glyphosate has been linked to kidney and liver damage, as well as cellular and genetic diseases. Monsanto and defenders of glyphosate use called the World Health Organization’s carcinogen classification too “dramatic” and have pointed to assurances that the chemical is safe.
In April, the European Parliament approved the seven-year reauthorization of glyphosate, though it recommended the chemical should be used only by professionals and not in public places.
Around the same time it pulled the glyphosate assessment off its website, the EPA similarly published and retracted a less-flattering report on the herbicide atrazine, which was banned in Europe in 2004. Atrazine is legal in the US, where it is second only to glyphosate among most-used agricultural herbicides.
Atrazine is manufactured by agrochemical corporation Syngenta. At least 60 million pounds of the chemical is used in the US each year, mainly on corn fields, according to the Natural Resources Defense Council. US agencies and other researchers have found high levels of atrazine in groundwater and drinking water near agricultural and rural areas. Atrazine is known to be an endocrine disruptor and has been linked to hormonal defects and some types of cancer in humans.
On April 29, an EPA assessment on atrazine was posted on the agency’s website but subsequently taken down. The documents are available here. The assessment said atrazine was found to cause reproductive harm to birds and mammals, exceeding by 200 times the EPA’s “levels of concern.” Amphibians were found to be especially at-risk from atrazine exposure, echoing research by scientists at the University of California, Berkeley, who found that about three-quarters of male frogs are castrated by the chemical.
“When the amount of atrazine allowed in our drinking water is high enough to turn a male tadpole into a female frog, then our regulatory system has failed us,” said Donley, the Center for Biological Diversity scientist. “We’ve reached a point with atrazine where more scientific analysis is just unnecessary — atrazine needs to be banned now.”
Like glyphosate, atrazine is undergoing a 15-year safety review by the EPA. The previous such assessment of atrazine occurred in 2003.
Syngenta, atrazine’s maker, touts the chemical’s safety on its website, claiming it is not “physically possible to dissolve enough atrazine in water to have any impact on hormones or human health.”
“No one has, ever will, or ever could be exposed to enough atrazine in the natural environment to affect their reproductive health,” the chemical giant says.
Marc Morano has a new movie, Climate Hustle.
CLIMATE HUSTLE, hosted by award-winning investigative journalist Marc Morano, reveals the history of climate scares including global cooling; debunks outrageous claims about temperatures, extreme weather, and the so-called “consensus;” exposes the increasingly shrill calls to “act immediately before it’s too late,” and in perhaps the film’s most important section, profiles key scientists who used to believe in climate alarm but have since converted to skepticism.
The movie had a red carpet premiere last December in Paris, and was shown last week in a Congressional briefing.
The film will be aired in 500 theaters in the U.S. (and one in Canada) on May 2 in a one night theater event. Locations and showtimes can be found [here].
An interesting interview with Marc Morano about the film is found [here].
Let me start by discussing my take on Marc Morano, and why I agreed to be interviewed for his movie. I first heard of Marc Morano circa 2006, from Joe Romm. Romm’s take on Morano was basically that of the climate ‘anti-Christ.’ I then put ClimateDepot on my list of blogs to monitor, to check up on what the ‘evil’ side in the climate debate was up to. I slowly built up an understanding of what Morano was doing, and I didn’t regard all of it as negative.
At some point (probably around the time of Climategate) I found myself on the same email list as Marc Morano, and we exchanged a few emails on issues of common interest. Circa 2010 (if my memory serves) I referred to Marc Morano as a ‘demagogue’ (I can’t find this anywhere on the internet). Marc was offended, we discussed this on email, and I raised my concern about his attacks on individual climate scientists that included publishing their email addresses, etc. We declared sort of a truce on this, and we agreed to point out to each other if we spotted inappropriate behaviors.
Subsequently, I’ve met Marc several times, and I have to say I like the guy. He’s smart and he’s funny (he pokes fun at both sides), and as far as I can tell he is honest. When he asked to interview me for the movie, I agreed to do it. The interview itself was really fun. I have no complaints about how I was portrayed in the movie.
I saw an earlier version of the film in November, prior to the Paris premiere. I wasn’t quite sure what to expect, but my initial reaction was relief that there were no goofy or incredible statements about the science. I found the movie to be pretty entertaining and even interesting, especially the narratives developed around silly alarmist statements made by scientists and politicians.
I thought the selection of featured scientists was quite good. It included some new faces that were quite effective – Caleb Rossiter, Robert Giegengack, Richard Tol, Daniel Botkin were especially good.
The budget for this was shoestring; I think it was less than $500K (somewhere I recall seeing a $20M budget for the Merchants of Doubt movie, but this may not be correct). Financials for the Merchants of Doubt movie: $192K at the box office, with an additional $114K from home video sales (JC note: the Merchants of Doubt movie was discussed in this previous post). It will be interesting to see how Climate Hustle does at the box office (and in subsequent home video sales).
I’m sure people will criticize me for participating in this, but then these are the people that have pretty much already sent me to Coventry, so . . . so what.
The key issues surrounding the movie are reflected in these quotes from Randy Olson and Bill Nye:
“I also think [Morano]’s a danger to the efforts of the climate movement”
“I think it will expose your point of view as very much in the minority and very much not in our national interest and the world’s interest.”
Chip Knappenberger tweeted re Nye’s ‘national interest’ statement: “Sounds like Nye should work for the State Department.”
Well, I will make no attempt to arbitrate what is in the national interest, but a reminder of minority rights in a constitutional democracy seems in order:
Thomas Jefferson, third President of the United States, expressed this concept of democracy in 1801 in his First Inaugural Address. He said,
All . . . will bear in mind this sacred principle, that though the will of the majority is in all cases to prevail, that will to be rightful must be reasonable; that the minority possess their equal rights, which equal law must protect and to violate would be oppression.
In every genuine democracy today, majority rule is both endorsed and limited by the supreme law of the constitution, which protects the rights of individuals. Tyranny by minority over the majority is barred, but so is tyranny of the majority against minorities.
The perspective in Climate Hustle is arguably a minority perspective, at least in terms of world governments and a select group of scientists. Randy Olson comments on this:
There is a need for opposition voices and questioning. If anyone feels threatened by this movie it would have to mean you’re conceding that the communication skills of the environmental side are really bad — which actually they are, so maybe there should be some cause for concern.
So, I hope some of you will be able to see the movie on May 2, I look forward to your reactions.
Marc Morano posing with his ‘Climate Criminal’ wanted poster in the streets of Paris
The House Energy and Commerce Subcommittee held a hearing on a bill intended to streamline nuclear power regulatory rules, in order to allow safer and more efficient next-generation reactors to replace those being decommissioned.
The Advanced Nuclear Technology Development Act of 2016 (HR 4979), introduced by Representative Bob Latta (R-Ohio), was discussed during a Friday hearing of the House Energy and Commerce Subcommittee on reducing regulatory hurdles for building advanced reactors. “Advanced” is defined as having significant improvements over contemporary nuclear reactors, such as better “inherent safety features, lower waste yields, greater fuel utilization, superior reliability, resistance to proliferation, and increased thermal efficiency.”
Currently, the Nuclear Regulatory Commission (NRC) demands a complete and final design from potential nuclear developers. This, combined with expensive reviews that developers pay for out of pocket, can saddle potential startups with a multimillion-dollar price tag and no assurance of ever being allowed to operate. The bipartisan panel’s tenor was that this needs to change.
“The future of the nuclear industry needs to start now, and the Nuclear Regulatory Commission needs to be able to provide the certainty the private sector needs to invest in innovative technologies,” Latta said at the hearing. “As the United States looks to the future, more energy will be needed, and nuclear power provides a reliable, clean baseload power option.
“Investment in new technology is already happening, with approximately 50 companies in this country working to develop the next generation of nuclear power. It’s time to ensure that the NRC provides a framework so that innovators and investors can prepare to apply to license technologies.”
In order to create an environment conducive to investment in next-generation plants, HR 4979 would require the NRC to implement a new framework to streamline nuclear plant licensing by 2019, making it more efficient and cost-effective for investors. The commission would have to submit an implementation plan for such a framework within 180 days of the law’s enactment.
The US’s 99 operational nuclear energy plants provide nearly 20 percent of the country’s power, but approximately 126,000 megawatts of nuclear power generation is set to be retired over the next 15 years. At the same time, the US Energy Information Administration forecasts a need for 287,000 megawatts of new electric capacity by 2040 – on top of the capacity needed to replace the retiring plants.
This reality, combined with the fact that nuclear power produces no greenhouse gases, has led environmentally conscious lawmakers on the committee to make common cause with their innovation-minded colleagues, who worry about falling behind international competitors.
“Our nation will, by necessity, diminish its dependence on fossil fuels in order to fight climate change. And as we do so, we will need to turn more and more to nuclear power,” said Representative Jerry McNerney (D-California), who co-signed the bill.
The hearing comes at a time of renewed anxiety about aging nuclear power infrastructure. Earlier this month, a Manhattan Project-era nuclear storage facility in Washington state leaked up to 3,500 gallons of waste. However, the Washington Department of Ecology said that there was no risk to the environment or nearby residents.
The world has had 30 years to assess the consequences for life on Earth of the disaster at Chernobyl.
This is about the same period during which I have studied the effects of radioactive pollution on the planet. It was the radioactive rain in the mountains of North Wales, where I lived in 1986, that brought me into this strange Alice in Wonderland area of science, where people and children die, and the global authorities, advised by physicists, deny what would be obvious to a child at school.
Chernobyl was mentioned as the star that fell to earth in the Book of Revelation. You may laugh, and it may be a coincidence, but the impact of the event has certainly been of biblical proportions. It is a story about the imposition by reductionist science on humanity of a version of the truth constructed from mathematics, not the only one, but perhaps the most important, since it involves the systematic destruction of the genetic basis of life. It is a story of lies, secrecy, power, assassination and money: the vast amounts of money that would be lost if the truth came out.
Shortly after the murder in 1992 of the German Green Party leader and anti-nuclear activist Petra Kelly, the late Prof Ernest Sternglass (the first of the radiation scientist/ activists) told me that Kelly had just struck a deal with a German TV company to run a series demonstrating the true awfulness of the immediate effects of radiation. He said: if the truth came out, all the Uranium and the billions of dollars in Uranium shares would turn into sand. So something like a cover-up had to happen, and it did, continuing the process of chicanery and control of information that began with the nuclear weapons tests of the 50s and 60s. In 1959, as the genetic effects of the atmospheric tests became apparent, the control of the understanding of radiation and health was wrested from the World Health Organization (WHO) and passed to the International Atomic Energy Agency (IAEA).
The arguments about the health effects of Chernobyl have mostly centered on cancer. I won’t write much about cancer here. The study of radiation and cancer has many complications, including that the data is often suspect, and that the time lag between the original radiation exposure and the cancer diagnosis can be 20 years, in which time a lot can happen, providing ammunition (and opportunity) for those denying causation. Predictions of the global cancer yield of the Chernobyl contamination have ranged from around a million (as predicted independently by the European Committee on Radiation Risk (ECRR), Rosalie Bertell, John Gofman and me), to about 600,000 (Alexey Yablokov), to less than a few thousand (the International Commission on Radiological Protection (ICRP), whose risk model is the current basis for all legal constraints on radioactive releases in Europe).
Cancer is caused by genetic damage but takes a while to show. More easily studied is the immediate and direct genetic damage, demonstrated in rates of congenital disease, birth defects and fetal abnormalities at birth, data which are easier to locate. The effects of a sudden increase in radioactive contamination are most easily seen in sudden increases in these indicators. You don’t have to wait 20 years. Out they come after nine months, or in aborted fetuses, with their heart and central nervous system defects, their lack of hands and feet, their huge hydrocephalic heads, their inside-out organs, their cleft palates, cyclops eyes and the whole range of dreadful and usually fatal conditions. There is no argument, and the affair is in the hands of doctors, not physicists. The physicists of the ICRP base their risk of genetic effects on experiments with mice.
I was in Kiev in 2000 at the WHO conference on Chernobyl. On the podium, conducting the theatricals, were the top men in the IAEA (Abel Gonzalez) and the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR), represented by Canadian Norman Gentner. No effects can be seen—Abel Gonzalez. Internal radiation is the same as external—Norman Gentner. Happily you can watch this farce as it was videotaped by a Swiss team.
So: cut to the chase, to the fatal assault on the edifice of the current ICRP radiation risk model. In January 2016, Prof. Inge Schmitz-Feuerhake, Dr. Sebastian Pflugbeil and I published a major review paper on the genetic effects of radiation in the prestigious Korean peer-reviewed Journal of Environmental Health and Toxicology.
What the research shows is that in every corner of the ex-Soviet Union and Europe and even further afield where epidemiologists and pediatricians looked, there were large and statistically significant increases in congenital diseases at birth and in babies that were aborted.
The new article recalculates the genetic risk from radiation based upon reports from Germany, Turkey, Greece, Croatia, Egypt, Belarus, Ukraine, Russia, Hungary, Italy, the UK, Scotland, Wales, indeed everywhere where anyone looked. There was a sudden jump in birth defects immediately following the contamination from Chernobyl and in proportion; but only up to the point where the exposure was so great the babies died in the womb or miscarried early in pregnancy. Thus, the relation between exposure level and effect was not a simple one where the birth defects increased with exposure: after a critical level of exposure they leveled off, or indeed fell. Also since contamination is still there, women are still giving birth to genetically damaged children some 30 years later. These results, published by many doctors, epidemiologists and researchers in many different journals, show that the effects occurred at levels of contamination that provided ‘doses’, that yardstick of radiation exposure invented by the ICRP, that were very low, often below the natural background dose.
It is worse: from research on the nuclear test site veterans’ grandchildren (also reviewed in the study) it is clear that these effects continue down the generations and will only disappear when an affected line dies without issue and leaves the genome of the human race. And many will, or already have: what causes genetic malformation in the infant causes, at a larger dose, fetal death and infertility. No one can have failed to notice the increase in human infertility that has occurred since the radioactive contamination of the planet began in the 1950s. As ex-US Atomic Energy Commission scientist John Gofman wrote in 1981, “the nuclear industry is waging a war on humanity.”
How can it be possible that the legislative system has got it so wrong? The answer is also given in the paper. It is that the concept of ‘dose’ which may be convenient for the physicists as it is simple to compute, really does not address the situation where the substances that provide the dose are inside the body, often bound chemically to the DNA, which is the acknowledged target for all these genetic effects. It shows that the human genome (and of course that of all life) is exquisitely sensitive to radiation damage from such internal exposures, to Strontium-90, Plutonium-239, Uranium and particularly to the nano-particles containing these radioactive elements which were produced when the reactor No 4 blew apart.
The paper shows that the studies of the Hiroshima bomb survivors, upon which the current unsafe radiation laws are based, were faulty because the true comparison group, those not in the city at the time of the bombing, was abandoned when it began to look like there was a real effect. Was this stupidity? Was it a trick? Does someone have to go to jail?
Last month, Prof. Alexey Yablokov, Dr. Alex Rosen and I wrote to the editor of The Lancet, in a recorded delivery letter posted by the Independent WHO in Geneva, requesting space in that influential journal to draw attention to these truths and overturn the false and dangerous structures created by the physicists. Let us all hope that some good will finally come of the disaster—that the real legacy of Chernobyl will be the understanding of the true danger to health of radioactive pollution.
Note: The ECRR has focused on Chernobyl as a major data source for establishing the risk posed by radiation. It has concluded that the current ICRP model is in error by upwards of 300-fold; for some types of internal exposures, by upwards of 1,000-fold. This means that over the period of the radiation contamination, more than 60 million people have died from cancer as a result of the releases. This risk model is available on the website http://www.euradcom.org.
Christopher Busby is an expert on the health effects of ionizing radiation. He qualified in Chemical Physics at the Universities of London and Kent, and worked on the molecular physical chemistry of living cells for the Wellcome Foundation. Professor Busby is the Scientific Secretary of the European Committee on Radiation Risk based in Brussels and has edited many of its publications since its founding in 1998. He has held a number of honorary University positions, including Visiting Professor in the Faculty of Health of the University of Ulster. Busby currently lives in Riga, Latvia. See also: http://www.chrisbusbyexposed.org, http://www.greenaudit.org and http://www.llrc.org.
New York Attorney General Eric T. Schneiderman has accused ExxonMobil of lying to the public and investors about the risks of climate change, according to the NY Times. He has launched an investigation and issued a subpoena demanding extensive financial records, emails and other documents.
Massachusetts, the US Virgin Islands, and California are also investigating ExxonMobil. It is interesting that all but one of the attorneys general are Democrats; the remaining attorney general, Claude Walker of the US Virgin Islands, is a Green-leaning Independent. So this is a very partisan investigation, carefully coordinated with anti-fossil fuel activists. How much is there to it?
I’ve reviewed the 22 internal documents from 1977 to 1989 made available by ExxonMobil here. I’ve also reviewed what I could find on 104 publications (most are peer-reviewed) with ExxonMobil personnel as authors or co-authors. For some of the peer-reviewed articles I only had an abstract, and for some I could find the reference but no abstract or text without paying a fee. Below this short essay is an annotated bibliography of all 22 internal documents and 89 of the published papers. The documents are interesting reading; they fill in the history of modern climate science very well. Much of the current debate on climate change was being debated in the same way, and often with the same uncertainties, in 1977.
Between 1977 and the fifth IPCC report in 2013 ExxonMobil Corporate Research in New Jersey investigated the effect of increasing CO2 on climate. If they withheld or suppressed climate research from the public or shareholders, it is not apparent in these documents. Further, if they found any definitive evidence of an impending man-made climate catastrophe, I didn’t see it. The climate researchers at ExxonMobil participated in the second, third, fourth and fifth IPCC assessment reports making major contributions in mapping the carbon cycle and in climate modeling. They calculated the potential impact of man-made CO2 in several publications. They investigated methods of sequestering CO2 and adapting to climate change. They also investigated several potential biofuels.
The internal documents are generally summaries of published work by outside researchers. Some of the documents are notes from climate conferences or meetings with the DOE (Department of Energy). For many of the internal documents one has to read carefully to separate what is being said by the writer and what he is reporting from outside research. Exxon (and later ExxonMobil) did some original research, particularly making ocean and atmospheric measurements of CO2 from their tankers. But, most of what they produced was by funding research at Columbia University or the Lamont-Doherty Earth Observatory. All of their internal research and the work at Columbia was published as far as I can tell, so it is difficult to accuse them of hiding anything from the public or shareholders.
At the heart of Schneiderman’s accusation, according to the NY Times, is a list of statements made by ExxonMobil executives that he believes contradict the internal memos summarized below. The statements are reported here. In fact, the internal memos and documents listed below do not contradict the ExxonMobil executives in any way. The internal documents and publications all clearly describe the considerable uncertainties in climate science and align with the executives’ statements. Go to the link to see all of them; two of the most notable are quoted below:
Mr. Ken Cohen, ExxonMobil Vice President for Public and Government Affairs, 2015 (Blog Post):
“What we have understood from the outset – and something which over-the-top activists fail to acknowledge — is that climate change is an enormously complicated subject.
“The climate and mankind’s connection to it are among the most complex topics scientists have ever studied, with a seemingly endless number of variables to consider over an incredibly long timespan.”
Duane Levine, Exxon’s manager of Science and Strategy Development, 1989 (Internal Document #21 below)
“In spite of the rush by some participants in the greenhouse debate to declare that the science has demonstrated the existence of [man-made global warming] today, I do not believe such is the case. Enhanced greenhouse is still deeply imbedded in scientific uncertainty, and we will require substantial additional investigation to determine the degree to which its effects might be experienced in the future.”
Even if there were a contradiction between the executives and the ExxonMobil climate researchers, who is to say which of them is wrong? Free speech is a fundamental individual right in the USA, and executives are allowed to disagree with their employees. As University of Tennessee Law Professor Glenn Harlan Reynolds has said in USA Today:
Federal law makes it a felony “for two or more persons to agree together to injure, threaten, or intimidate a person in any state, territory or district in the free exercise or enjoyment of any right or privilege secured to him/her by the Constitution or the laws of the United States, (or because of his/her having exercised the same).”
“I wonder if U.S. Virgin Islands Attorney General Claude Walker, or California Attorney General Kamala Harris, or New York Attorney General Eric Schneiderman have read this federal statute. Because what they’re doing looks like a concerted scheme to restrict the First Amendment free speech rights of people they don’t agree with. They should look up 18 U.S.C. Sec. 241.”
ExxonMobil has filed court papers in Texas seeking to block a subpoena issued by the attorney general of the US Virgin Islands Claude Walker. They argue that the subpoena is an unwarranted fishing expedition into ExxonMobil’s internal records.
Environmentalist groups, like the Rockefeller Family Fund and 350.org are trying to organize a legal attack against ExxonMobil patterned on the attack many organizations led against the tobacco companies. They feel that their presumed imminent man-made climate disaster is being ignored and they want to make ExxonMobil a scapegoat. As Lee Wasserman (Rockefeller Family Fund) said recently “It’s not really about Exxon.”
Mr. Schneiderman may have made the “error of assuming facts that are not in evidence.” He assumes that man-made greenhouse gases are a significant factor in climate change and that the resulting enhanced climate change is dangerous. Neither assertion has been proven. He also assumes that Exxon’s early research proved these assertions to be true, with little or no doubt. Therefore, Mr. Schneiderman believes the Exxon executives’ claim that there is significant uncertainty around the idea of dangerous man-made climate change is a lie. I do not see any proof of dangerous climate change, man-made or otherwise, in any of the documents below. In peer-reviewed document #55 below, Flannery, et al. in 1985 suggest that the effect of CO2 on climate, based on geological data from the Cretaceous Period, is 50% or less. Internal document #3 indicates concern that there is a “potential problem amid all the scientific uncertainties.”
Along this line of thought, the ExxonMobil court filing against Mr. Walker and the US Virgin Islands says in part:
“… [ExxonMobil] has “widely and publicly confirmed” that it recognizes “that the risk of climate change and its potential impacts on society and ecosystems may prove to be significant.”
Brian Flannery states in published document #66 below in 2001:
“Although we know the human emissions fairly well, we don’t know the natural emissions well at all. Added to this uncertainty is the fact that natural emissions can change as a result of long-term climate changes.”
The key problem is that ExxonMobil management and most, if not all, of their researchers do not think the idea of dangerous man-made climate change has been proven. Further, one of them said in internal document #3 below: “we have time to evaluate the uncertainties even in a worse-case scenario.” This is still true, especially considering the very slow pace of warming over the last twenty years.
In internal document #3 below, they discuss the potential effect of doubling CO2 in the atmosphere and the discussion is instructive. The CO2 level prior to the industrial revolution (roughly 1840-1850) is unknown. They give two possibilities (260-270 ppm or 290-300 ppm). The temperature increase from 1850 to the end of 2015 is roughly 0.85°C from the HADCRUT 4 dataset and the 5th IPCC Assessment reports 0.85°C from 1880 to 2012. The Exxon researchers did not think a clear anthropogenic signal was detectable in 1979, because at that time the total temperature increase from 1850 had not exceeded 0.5°C, their assumed natural variability. So, they thought man-made warming might be clearly detected by the year 2000.
We are now well past the year 2000 and, according to the data shown in their Table 6 (Internal Document #3), we are on track with their most benign scenario of a temperature increase of 1.3° to 1.7°C per doubling of CO2 (ECS). This assumes an initial concentration of CO2 of 265 to 295 ppm and a natural variability of ±0.5°C. The initial CO2 concentration assumption is reasonable; the assumption of 0.5°C for natural variability may be too low. However, if the assumptions are true, they probably eliminate the possibility of higher climate sensitivity to CO2 (ECS > 2°C). This is also supported by recent empirical estimates of ECS. There are considerable uncertainties in this approach, and they are important to recognize: we don’t know the CO2 level when we started emitting a lot of fossil fuel CO2, we don’t know the net effect on our climate, and we can’t be certain we have seen any impact of man-made CO2 on our climate to date.
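The back-of-envelope calculation behind this can be sketched in a few lines. This is a minimal illustration, assuming the standard logarithmic relationship between CO2 concentration and warming (ΔT = ECS × log2(C/C0)) and attributing all observed warming to CO2; the 400 ppm present-day figure is my own illustrative assumption, not a number from the Exxon memo:

```python
import math

def implied_ecs(delta_t, c_now, c_initial):
    """ECS implied by observed warming, assuming a logarithmic CO2
    response (delta_T = ECS * log2(C / C0)) and that all warming is
    CO2-driven -- a simplifying assumption for illustration only."""
    return delta_t / math.log2(c_now / c_initial)

# ~0.85 C of warming since ~1850 (HADCRUT 4, per the text);
# pre-industrial CO2 of 265-295 ppm (internal document #3);
# ~400 ppm present-day CO2 is an assumed round number.
for c0 in (265, 295):
    ecs = implied_ecs(0.85, 400.0, c0)
    print(f"C0 = {c0} ppm -> implied ECS ~ {ecs:.2f} C per doubling")
```

The resulting range sits roughly around the memo’s benign 1.3°–1.7°C scenario. Note that this conflates transient and equilibrium response, so it is at best a plausibility check, not an ECS estimate.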
Even Brian Flannery, one of the Exxon researchers who has been deeply involved in the IPCC process stated in internal document 22, below: “While uncertainty exists, science supports the basic idea that man’s actions pose a serious potential threat to climate.” This is the most alarmist statement I could find anywhere, but it still says “potential” and notes that uncertainty exists.
In peer-reviewed paper #25 below, Dr. Kheshgi and Dr. White state in 2001:
“Many previous claims that anthropogenically caused climate change has been detected have utilized models in which uncertainties in the values of some parameters have been neglected (Santer et al. 1996b). In section 5 we have incorporated known parameter uncertainties for an illustrative example by using the proposed methodology for distributed parameter hypothesis testing. The results clearly show that incorporation of parameter uncertainty can greatly affect the conclusions of a statistical study. In particular, inclusion of uncertainty in aerosols forcing would likely lead to rejection of the hypothesis of anthropogenically caused climate change for our illustrative model …”
They are concerned, here and in other papers, that the GCMs (general circulation models) have used fixed values for parameters that actually carry a great deal of uncertainty. By fixing these parameters across many models, the modelers produce a narrower range of outcomes, giving a misleading appearance of consistency and accuracy that does not actually exist.
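The narrowing effect can be illustrated with a toy Monte Carlo experiment. Everything here is hypothetical, not taken from any actual GCM: a stand-in "model" whose output depends linearly on an uncertain aerosol-forcing parameter. Fixing the parameter across runs collapses the ensemble spread; sampling it from its assumed uncertainty range restores it:

```python
import random
import statistics

random.seed(42)  # reproducible illustration

def toy_model(aerosol_forcing, noise_sd=0.05):
    """Toy stand-in for a model run: output 'warming' depends on an
    uncertain aerosol-forcing parameter plus internal variability.
    All coefficients are illustrative only."""
    return 1.0 + 0.8 * aerosol_forcing + random.gauss(0.0, noise_sd)

N = 1000
# Ensemble A: parameter fixed at a central estimate across all runs.
fixed = [toy_model(-0.5) for _ in range(N)]
# Ensemble B: parameter sampled from its (assumed) uncertainty range.
sampled = [toy_model(random.uniform(-1.0, 0.0)) for _ in range(N)]

print(f"fixed-parameter spread:   {statistics.stdev(fixed):.3f}")
print(f"sampled-parameter spread: {statistics.stdev(sampled):.3f}")
```

The sampled ensemble's spread is several times larger, which is Kheshgi and White's point: holding uncertain parameters fixed makes an ensemble look far more consistent than the underlying knowledge warrants.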
As Professor Judith Curry has often said, there is an uncertainty monster at the science-policy interface. The ExxonMobil scientists are very good, they write well, and their superiors in ExxonMobil understand what they are saying. Man-made climate change is a potential problem, but it is shrouded in uncertainty because it is an extremely complex research topic with countless variables. The internal and published documents below show that Exxon has worked hard to define the uncertainty, and they have even succeeded in reducing it in some areas, especially in the carbon cycle. But still, the remaining uncertainty is huge, covering the range from zero anthropogenic effect to perhaps 4° or 5°C (see publication #7, Kheshgi and White 1993) to this day. Not much different from 1977 when they got started.
I’ll conclude this post with a quote from internal document #11, the 1982 Exxon Consensus statement. I think it speaks well for ExxonMobil and puts Schneiderman (and many in the media) to shame:
“As we discussed in the August 24 meeting, there is the potential for our research to attract the attention of the popular news media because of the connection between Exxon’s major business and the role of fossil fuel combustion in contributing to the increase of atmospheric CO2. Despite the fact that our results are in accord with most major researchers in the field and are subject to the same uncertainties, it was recognized that it is possible for these results to be distorted or blown out of proportion.
Nevertheless the consensus position was that Exxon should continue to conduct scientific research in this area because of its potential importance in affecting future energy scenarios and to provide Exxon with the credentials required to speak with authority in this area. Furthermore our ethical responsibility is to permit the publication of our research in the scientific literature; indeed to do otherwise would be a breach of Exxon’s public position and ethical credo on honesty and integrity.”
This is the only thing I found in the internal memos that was not published. In 1982 they thought the media might distort their research results or blow them out of proportion (the Uncertainty Monster). Well, that certainly happened. For science to work properly, research outcomes cannot be dictated. All interested parties must be allowed to investigate the problem and publish their results. They must have access to data, computer programs and models that are publicly funded. But, above all, they should not be punished, jailed, intimidated or sued because they are skeptical of a popular scientific thesis. They should be judged only on the quality of their scientific work and not who they work for or who funds them.
This post is excerpted from a longer post The Exxon Climate Papers, that includes links and annotations to 89 documents, including internal documents and published papers.
Bio notes: Andy May worked for Exxon from 1980 to 1985. During part of that time he worked on the Natuna D-Alpha project discussed in some of these documents. He did not work at either the Florham Park, New Jersey Research laboratory or the Linden, New Jersey laboratory where the climate research was done. The views expressed in this essay and bibliography are his own. This was written in his spare time and he received no compensation from anyone for writing and posting it.
WASHINGTON – The US government has sent Special Envoy Amos Hochstein to Kuwait, Qatar, Egypt and Israel to discuss falling oil prices after the failure of the Doha energy talks, the US Department of State announced in a media note on Monday.
“Special Envoy and Coordinator for International Energy Affairs Amos J. Hochstein will be travelling to the region to meet with key interlocutors in Jerusalem, Cairo, Kuwait City and Doha,” the note stated.
As global oil prices remain near record lows, and the United States emerges as a global exporter of liquefied natural gas, Hochstein will be seeking to strengthen US relationships with partners in the region as well as discuss strategies for addressing the market realities of the energy sector, the note explained.
Hochstein will discuss energy security issues in Israel, power generation issues in Egypt, and plans to invest in developing new oil fields and building additional oil refineries in Kuwait, the State Department pointed out.
In Qatar, Hochstein will give a speech emphasizing US support for liquefied natural gas development and its role in reducing global carbon emissions, the note said.
The last few years have seen an alarming increase in claims that tribal peoples have been shown to be more violent than we are. This is supposed to prove that our ancestors were also brutal savages. Such a message has profound implications for how we view human nature – whether or not we see war as innate to the human condition and so, by extension, broadly unavoidable. It also underpins how industrialized society treats those it sees as “backward.” In reality though it’s nothing more than an old colonialist belief, masquerading once again as “science.” There’s no evidence to support it.
The American anthropologist Napoleon Chagnon is invariably cited in support of this brutal savage myth. He studied the Yanomami Indians of Amazonia from the 1960s onwards (he spells the tribe “Yanomamö”), and you’d be hard pressed to find a book or article on tribal violence which doesn’t refer to his work. Popular writers such as Steven Pinker and Jared Diamond frequently make much of Chagnon’s thesis, so it’s worth giving a thumbnail sketch of why in reality it proves little about the Yanomami, and nothing about human evolution.
First, it’s important to dispatch a red herring from the murky cauldron being cooked up by the brutal savage promoters: They often point to Darkness in El Dorado, a book by Patrick Tierney, which attacked Chagnon’s work, but went too far. Tierney raised the possibility that one of Chagnon’s colleagues may have deliberately introduced a deadly measles epidemic to the Indians. That simply wasn’t true: In fact, the epidemic was inadvertently started by American missionaries. That Tierney was wrong on this single point is now used to claim that all his and other writers’ criticisms of Chagnon have been discredited. They haven’t. In any case, were a single error deemed to negate a whole thesis, then pretty much all science, as well as journalism, the law and a lot else, falls apart.
Anyway, let’s set Tierney aside. For decades, Napoleon Chagnon’s findings have been rejected by almost all of the many other anthropologists who have worked with the Yanomami, and in most countries his work simply isn’t taught. He had rather faded from anthropology in the United States too, until his recent resurgence as the darling of establishment attitudes.
According to Chagnon, brutality is a key driver of human evolution. How did he come upon such a disturbing “discovery”? Basically, he counted how many Yanomami men boasted that they were unokai, and he told us this means they’ve killed people. He then crunched the numbers to show that unokai are as successful in love as they are in war, and that by fathering more children than non-killers, they ensure the next generation is as murderous as they are.
As with any sweeping conclusion in human sciences, there are numerous known unknowns. For example, did Yanomami raiding in the 1960s increase through growing pressure from settler or missionary incursions? (After all, Chagnon used the extremist New Tribes Mission to get into the Yanomami.) Did the influx of outside trade goods, including guns, play a role? Such impacts are difficult to analyze, though some believe they were clearly significant.
But the most significant fact, the extraordinary single error that, in this case, does destroy Chagnon’s thesis in one swoop, is something Chagnon doesn’t tell us – unokai does not just mean “killer.” It’s also the status claimed by everyone who’s ever shot an arrow into a dead body during an inter-village raid (most raids stop after one killing). It describes many other individuals as well, including men who’ve killed an animal thought to be a kind of shamanic embodiment of a human, as well as stay-at-homes who try and cast lethal spells. It even includes those who’ve participated in a ritual during their future wife’s puberty (she also becomes unokai). In other words, many unokai haven’t killed anyone. With this simple fact, every one of Chagnon’s conclusions about “killers” falls apart.
But supposing he was right after all, what would his figures show? What percentage of the population are we talking about? Here the brew gets fishier: Chagnon plays fast and loose with his own data. His autobiography, “Noble Savages,” says that “killers” number “approximately 45 percent of all the living adult males.” Yet even according to his own (shaky) data, that is simply not true: Chagnon’s own figures do not show that 45 percent of men are unokai. He has grossly inflated his percentage by ignoring everyone younger than 25, an age group with far fewer claiming unokai status. Were they included, his percentage would plummet.
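The arithmetic of this inflation is easy to sketch. The cohort sizes below are entirely hypothetical, since Chagnon’s age-band counts are not published; only the 45 percent headline figure and the total of 137 unokai mentioned in the text are taken as anchors:

```python
def unokai_share(cohorts):
    """Percentage claiming unokai status across the included age bands.
    cohorts: list of (unokai, total_men) pairs, one per age band."""
    claimants = sum(u for u, _ in cohorts)
    men = sum(t for _, t in cohorts)
    return 100.0 * claimants / men

# Hypothetical illustrative cohorts: (unokai claimants, total men).
under_25 = (5, 120)    # assumed: few young men claim the status
over_25 = (132, 293)   # assumed sizes chosen so 132/293 ~= 45%

print(f"over-25 only: {unokai_share([over_25]):.0f}%")
print(f"all adults:   {unokai_share([under_25, over_25]):.0f}%")
```

With these made-up but plausible cohorts, the same 137 claimants yield 45 percent if the under-25s are excluded, but only about a third of all adult men if they are counted, which is the shape of the criticism being made here.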
Chagnon has been asked about this manipulation for years. When he bothers to reply, he claims he’ll publish new supporting data. We’re still waiting.
So there you have it: That’s the poster boy of the “scientific proof” behind the myth of the brutal savage. The fact that Chagnon’s thesis has been repeatedly demolished in scholarly publications for decades is simply ignored by those who want him to be right. For them to dismiss the many Chagnon critics, to pretend that science is on their side, and to chorus sneeringly “noble savages” whenever Chagnon is criticized, is just facile propaganda.
By the way, if you want to know how many unokai (supposed “killers”) Chagnon managed to winkle out during a quarter century of fieldwork with one of Amazonia’s largest tribes – numbering several thousand – the answer is just 137 men. They could all comfortably fit into a single car on the New York subway. How many of those were actually killers? We’ll never know.
That’s the size of the sample group supposedly proving that tribal peoples live in a state of chronic warfare and, by throwing in more red herrings, that our ancestors did so too. The latter assertion is widely promulgated. It goes like this: The Yanomami are a small-scale tribal (non-state) hunting society, our ancestors were the same, so the Yanomami can teach us about our ancestors because they live in a similar way. And yet the theory fails on several points: For example, no one knows the degree to which our distant ancestors scavenged for meat, rather than actively hunted it. That’s quite a different approach to life, and the Yanomami wouldn’t dream of doing it. In any case, a moment’s informed reflection tells you that no one who inhabited the ice age plains of Eurasia, for example, lived remotely like the tropical rainforest Yanomami of Chagnon’s 1960s.
The real story is more obvious, prosaic and simpler than the Chagnon-created “fierce people” and their supposed “chronic” warfare. The truth is that there are some tribal peoples who have a belligerent reputation, others known for avoiding violence as much as possible, and lots in between. That’s nothing to do with any grasping at mythic noble savages, it’s what anthropologists have actually found.
Despite the growing mythology, the archeological record reveals very little evidence of past violence either (until the growth of big settlements, starting around 10,000 years ago). Researchers Jonathan Haas and Matthew Piscitelli studied descriptions of 2,930 earlier skeletons from 900 different sites worldwide. Apart from a single massacre site of two dozen people in the Sudan, they found “but a tiny number of cases of violence in skeletal remains,” and noted how just four sites in Europe “are mentioned over and over by multiple authors” striving to demonstrate the opposite of what the evidence actually reveals. The archeological record before 10,000 years ago, they conclude, in fact “shows that warfare was the rare exception.”
Much of the other “proof” for the brutal savage advanced by Steven Pinker, Jared Diamond, and other champions of Chagnon, is rife with the selection and manipulation of facts to fit a desired conclusion.
To call this “science” is both laughable and dangerous. These men are desperate to persuade us that they’ve got “proof” for their opinions, which isn’t surprising as they’re nothing more – opinions based on a narrow and essentially self-serving political point of view. They have proved nothing, except to those who want to believe them.
Does it matter? Yes, very much. How we think of tribal peoples dictates how we treat them. Proponents of Chagnon seek to reestablish the myth of the brutal savage which once underpinned colonialism and its land theft. It’s an essentially racist fiction which belongs in the 19th century and, like a flat earth, should have been discarded generations ago. It’s the myth at the heart of the destruction of tribal peoples and it must be challenged.
It’s not just deadly for tribal peoples: It’s dangerous for all of us. False claims that killing is a proven key factor in our evolution are used to justify, even ennoble, the savagery inherent in today’s world. The brutal savage may be a largely invented creature among tribal peoples, but he is certainly dangerously and visibly real much closer to home.
In her oily, cringe-inducing and totally predictable speech to AIPAC on March 21, Hillary Clinton argued that, since (according to her) “anti-Semitism is on the rise across the world… we must repudiate all efforts to malign, isolate and undermine Israel and the Jewish people.” In other words, we must do what we can to shut down any legitimate criticism of Israeli policy. A reliable means of doing so is to conflate said criticism with anti-Semitism and thus vilify the critic in question. This particular strategy has been perfected and institutionalized for decades, and was perhaps best deconstructed by Norman Finkelstein in “The Holocaust Industry.”
By dismissing BDS advocates as irrational, Jew-hating troublemakers, Hillary Clinton, the great bastion of liberalism and progress, makes common cause with the jingoist far right (where she actually belongs). But she also makes common cause with a good chunk of US academia, where criticism of Israel and its atrocities is often met with censorship and intimidation. In a comprehensive report on the subject, Palestine Legal details the extent of the suppression: “From January 2014 through June 2015, Palestine Legal interviewed hundreds of students, academics and community activists who reported being censored, punished, subjected to disciplinary proceedings, questioned, threatened, or falsely accused of anti-Semitism or supporting terrorism for their speech in support of Palestinian rights or criticism of Israeli policies.”
Needless to say, this is a gross violation of First Amendment rights, and it needs to be challenged at every opportunity. The university system is based on the principles of free inquiry and unfettered discourse; absent the open exchange of conflicting ideas and opinions, academia is essentially worthless. When certain viewpoints are institutionally favored, colleges cease to be places of learning and instead become places of indoctrination. Who could desire such a circumstance? Well, apart from authoritarians, fascists, religious fanatics (including Zionists) and Hillary Clinton, it’s becoming more and more apparent that “liberal” student activists do.
On college campuses across the country, students are mobilizing and protesting against institutionalized discrimination. Few on the left would argue that this is a negative development. After all, if nothing else these students are contesting authority—a noble and worthy exercise in itself. However, what do we say when fundamental democratic values like free speech are subordinated to an ideology? This is the precarious situation in which many student activists currently find themselves. It’s bizarre: presumably, the students protesting at places like Yale and the University of Missouri (to take two high-profile examples from last year) would stand with the BDS activists who are targeted and censored by pro-Israel forces. And yet these same students—exhibiting a degree of schizophrenia—would have their own ideological opponents treated in the same fashion.
Take a recent incident. At Emory College in Atlanta, some students used chalk to write “Trump 2016”—and other similarly anodyne messages—throughout the campus. Curiously (or perhaps not at this point), controversy erupted when a number of students declared that they felt physically threatened by the chalk drawings, which were considered by some to be acts of violence. “I thought we were having a KKK rally on campus,” one student reportedly told the Daily Beast. She “legitimately feared for [her] life.” Another student said that “some of us were expecting shootings” and thus “feared walking alone.” They demanded that the Emory administration identify the perpetrators, presumably so some sort of disciplinary action could take place—perhaps a public flogging. When the administration responded with a tepid defense of the anonymous chalkers’ right to free speech, the offended shifted their ire onto the college itself, for failing to provide an adequate safe space. All of which is par for the course by now.
So here we have a conflation of Donald Trump supporters with homicidal white supremacists; of political campaigning with physical violence. This is not dissimilar to the conflation of BDS with anti-Semitism, which plagues Palestinian rights activists everywhere. In fact, it’s closer to the profoundly stupid idea that all Muslims endorse terrorism—a notion that the offended students at Emory surely find abhorrent. There is one obvious distinction that must be made: the censorship of BDS on college campuses comes from the top, while the attempted censorship of Donald Trump supporters comes from the comparatively impotent student body. The former case is a much graver threat to free speech, but that is not an excuse to ignore the latter. Soon enough the student body will hold positions of authority.
ESP seems to be a trait common to advocates of censorship. For example, in a recent pro-Israel memo from the Regents of the University of California, it is contended that “opposition to Zionism often is expressed in ways that are not simply statements of disagreement over politics and policy, but also assertions of prejudice and intolerance toward Jewish people and culture.” Translation: the mind readers at the Regents of the University of California can tell when critics of Israel are actually rabid Jew-haters, and they will adjudicate such cases accordingly. Similarly, the would-be student censors use their clairvoyance to judge when an opinion they don’t like is motivated by race hatred or some other form of bigotry. Support for Donald Trump, as we have already seen, implies a desire to kill minorities. It is therefore no different from real physical violence.
What would happen if an entire college was founded on this line of thinking? A recent petition drawn up by some student activists at Western Washington University spells it out for us. The group calls itself the Student Assembly for Power and Liberation, a name that is more than a little ominous. In their own words: “We are a growing group of students from a multitude of communities and disciplines around campus combatting the systemic oppression embedded within our society that is inevitably upheld through this institution, as it was created to uphold white supremacy at its core.”
Note the aggressively bureaucratic language (the grammar of which unravels throughout the petition). Prolixity of this sort is often employed by postmodernist academics—in whose tradition these students are working—for reasons that aren’t entirely clear. Noam Chomsky once argued that, in general, postmodernism “allows people to take a radical stance—more radical than thou—but to be completely dissociated from anything that’s happening, for many reasons. One reason is nobody can understand a word they’re saying. So they’re already dissociated. It’s kind of like a private lingo.”
Obviously, Michel Foucault these kids are not, but the postmodernist influence is plain to see. It’s like that smug kid in your Creative Writing workshop whose stories are all cheap Bukowski imitations. They don’t really have any idea what they’re doing, but they’re bursting with self-satisfaction nevertheless.
What these students want, and what their petition is meant to facilitate, is the creation of a brand new college: the College of Power and Liberation. The function of this hypothetical college would be the “development of academic programs that are committed to social justice.” The first step in realizing this goal is “a cluster hire of ten tenure-track faculty to teach at the college.” Fair enough. However, there is something of a catch: “the Student Assembly for Power and Liberation will have direct input and decision-making power over the hiring of faculty for the college.”
That’s right—the professors at the College of Power and Liberation are to be hired by the students attending that college. The “power,” then, is to reside entirely in the hands of the student body. Naturally, they also reserve the right to take “disciplinary action” against “everyone in a teaching position within the university.” And it gets weirder. Demanded in part three of the petition is “the creation and implementation of a 15 persxn [sic] paid student committee, The Office for Social Transformation.”
The misspelling of “person” here is deliberate, as is the intermittent misspelling of “history” (hxstory) later on. The implication is that these nouns are gendered (person, history) and thus microaggressive residue of an outmoded patriarchal system of thought. Therefore they have been changed. This, I suppose, is an example of the “de-colonial work” for which the College of Power needs “an annually dedicated revenue of $45,000.”
The Office for Social Transformation doesn’t just sound Orwellian—it quite literally is. Here is its express purpose: “to monitor, document, and archive all racist, anti-black, transphobic, cissexist, misogynistic, ablest, homophobic, islamophobic, xenophobic, anti-semitism [sic], and otherwise oppressive behavior on campus.” This oppressive behavior, the petition continues, is regularly found “in faculty curriculum.” By that I assume they mean curriculum including books with controversial subject matter, for instance the novels of James Baldwin and Mark Twain. So much for the English professors who wish to teach the “Adventures of Huckleberry Finn”—a terribly oppressive book.
The petition does not explicitly propose thought crime legislation, but it doesn’t rule it out either. One inevitably wonders about the criteria by which a person’s behavior is judged oppressive (i.e., punishable). For example, what becomes of the student or faculty member who is caught reading Kipling? Surely owning a copy of The Cantos is grounds for disciplinary action—Ezra Pound was a bona fide fascist. Hemingway was anti-Semitic and homophobic: it follows that The Sun Also Rises is beyond the pale. Tolstoy abused his wife, and so reading War and Peace implies an endorsement of misogyny.
Simone de Beauvoir once appealed to the censors of her time: “Must we burn [the Marquis de] Sade?” Indeed we must—and most others, for that matter.
Never fear, though: the College of Power and Liberation has a “three-strike disciplinary system that corresponds to citations that are processed.” Thank heavens for the three-strike disciplinary system, without which people might be fired and expelled unreasonably.
You get the picture. The mini despots comprising the so-called Student Assembly for Power and Liberation are concerned very much with Power and very little with Liberation. Their ultimate goal is to establish a totalitarian microcosm of a state, very far removed from reality, in which power and wealth are concentrated in the hands of a few self-righteous 20-somethings with delusions of grandeur. Because the First Amendment is overrated anyway.
The Holocaust Industry would be proud. And that’s what makes all of this so distressing. If so-called liberal student activists believe in censorship (and many of them evidently do), who can we rely on to challenge the unconstitutional suppression of BDS activism on college campuses? It necessarily devolves into a battle of hypocrites: the right rationalizes their brand of censorship while condemning the left’s, and vice versa. The reality is that both need to be condemned, because both represent explicit attacks on basic democratic principles. The crucial difference, I suppose, is that the Zionists (who know exactly what they’re doing) must be fought, while the overzealous students (who don’t) need merely to be educated. We can and should do both at once.
Michael Howard is a freelance writer from Buffalo, NY. He can be reached at email@example.com .
“The problem with this response is that man-made climate change is real and happening now. The detonation of a nuclear bomb is a hypothetical.”
From the WaPo:
Those who fear the effects of radiation always focus on cancer. But the most frightening and serious consequences of radiation are genetic.
Cancer is just one small bleak reflection, a flash of cold light from a facet of the iceberg of genetic damage to life on Earth constructed from human folly, power-lust and stupidity.
Cancer is a genetic disease expressed at the cellular level. But genetic effects are transmitted across the generations.
It was Hermann Joseph Muller, an American scientist, who discovered the most serious effects of ionizing radiation – hereditary defects in the descendants of exposed parents – in the 1920s. He exposed fruit flies – drosophila – to X-rays and found malformations and other disorders in the following generations.
He concluded from his investigations that low dose exposure, and therefore even natural background radiation, is mutagenic and there is no harmless dose range for heritable effects or for cancer induction. His work was honoured by the Nobel Prize for medicine in 1946.
In the 1950s Muller warned about the effects on the human genetic pool caused by the production of low level radioactive contamination from atmospheric tests. I have his original 1950 report, which is a rare item now.
Muller, as a famous expert in radiation, was designated as a speaker at the 1955 ‘Atoms for Peace’ conference in Geneva, where the large-scale use of nuclear energy (‘too cheap to meter’) was announced by President Eisenhower. But when the organisers became aware that Muller had warned about the deterioration of the human gene pool by the contamination of the planet from the weapon test fallout, his invitation was cancelled.
The Wonderful Wizard of Oz
The protective legislation of western governments does, of course, concede that radiation has such genetic effects. The laws regulating exposure are based on the risk model of the International Commission on Radiological Protection, the ICRP.
The rules say that no one is allowed to receive more than 1mSv of dose in a year from man-made activities. The ICRP’s scientific model for heritable effects is based on mice; this is because ICRP states that there is no evidence that radiation causes any heritable effects in humans.
The dose required to double the risk of heritable damage according to the ICRP is more than 1000mSv. This reliance on mice has followed from the studies of the offspring of those who were present in Hiroshima and Nagasaki by the Japanese/US Atomic Bomb Casualty Commission (ABCC).
These studies were begun in 1952 and assembled groups of people in the bombed cities to compare cancer rates and also birth outcomes in those exposed at different levels according to their distance from the position of the bomb detonation, the hypocentre. The entire citadel of radiation risk is built upon this ABCC rock.
But the rock was constructed with smoke and mirrors and everything about the epidemiology is false. There have been a number of criticisms of the A-Bomb Lifespan Studies of cancer: it was a survivor population, doses were external, residual contamination was ignored, it began seven years after the event, the original zero dose control group was abandoned as being “too healthy”, and many others.
But we are concerned here with the heritable effects, the birth defects, the congenital malformations, the miscarriages and stillbirths. The problem here is that for heritable damage effects to show up, there have to be births. As you increase the exposures to radiation, you quickly obtain sterility and there are no pregnancies. We found this in the nuclear test veterans.
Then at lower doses, damaged sperm results in damaged foetuses and miscarriages. When both mother and father are exposed, there are miscarriages and stillbirths before you see any birth defects. So the dose-response relation is not linear. At the higher doses there are no observable effects, because there are no births. The effects all appear at the lowest doses.
Bad epidemiology is easily manipulated
As far as the ABCC studies are concerned, there is another serious (and I would say dishonest) error in the epidemiology. The ABCC researchers discarded their control population in favour of using the low dose group as a control.
This is such bad epidemiology that it should leave any honest reviewer breathless. But there were no reviewers. Or at least no-one seemed to care. Perhaps they didn’t dig deeply enough. In passing, the same method is now being used to assess risk in the huge INWORKS nuclear worker studies and no-one has raised this point there either.
Anyway, the ABCC scientists in charge of the genetic studies found the same levels of adverse birth outcomes in their exposed and their control groups, and concluded that there was no effect from the radiation.
Based on this nonsense, ICRP writes in their latest 2007 risk model, ICRP103, Appendix B.2.01, that “Radiation induced heritable disease has not been demonstrated in human populations.”
But it has. If we move away from this A-Bomb study, controlled by the USA and its nuclear-military complex, and look at the real world, we find that Muller was right to be worried. The radioactive contamination of the planet has killed tens of millions of babies, caused a huge increase in infertility, and increased the genetic burden of the human race and life on earth.
And now the truth is out!
In January of this year Prof. Inge Schmitz-Feuerhake, of the University of Bremen, Dr Sebastian Pflugbeil of the German Society for Radioprotection and I published a Special Topic paper in the prestigious peer-reviewed journal Environmental Health and Toxicology. The title is: Genetic Radiation Risks – a neglected topic in the Low Dose debate.
In this paper we collected together all the evidence which has been published outside the single Japanese ABCC study in order to calculate the true genetic effects of radiation exposure. The outcome was sobering, but not unexpected.
Using evidence ranging from Chernobyl to the nuclear Test Veterans to the offspring of radiographers we showed clearly that a dose of 1mSv from internal contamination was able to cause a 50% increase in congenital malformations. This identifies an error in the ICRP model, and in the current legislation, of a factor of 1,000, and we say so explicitly. The conclusion of the paper states:
Genetically induced malformations, cancers, and numerous other health effects in the children of populations who were exposed to low doses of ionizing radiation have been unequivocally demonstrated in scientific investigations.
Using data from Chernobyl effects we find a new Excess Relative Risk (ERR) for congenital malformations of 0.5 per mSv at 1mSv, falling to 0.1 per mSv at 10mSv exposure and thereafter remaining roughly constant. This is for mixed fission products as defined through external exposure to Cs-137.
Results show that current radiation risk models fail to predict or explain the many observations and should be abandoned. Further research and analysis of previous data is suggested, but prior assumptions of linear dose response, assumptions that internal exposures can be modelled using external risk factors, that chronic and acute exposures give comparable risks and finally dependence on interpretations of the high dose ABCC studies are all seen to be unsafe procedures.
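To make the quoted numbers concrete, here is a small sketch of my own (not the paper’s code): it encodes the quoted ERR coefficients as a piecewise function of dose, applies them to a hypothetical baseline malformation rate, and compares the low-dose coefficient against the one implied by the ICRP (a doubling dose above 1000mSv implies an ERR of roughly 0.001 per mSv). The linear interpolation between the two quoted dose points is my assumption, not the paper’s fitted curve.

```python
def err_coefficient(dose_msv):
    """ERR per mSv as quoted in the text: 0.5/mSv at 1 mSv, 0.1/mSv at
    10 mSv and beyond; the interpolation in between is an assumption."""
    if dose_msv <= 1.0:
        return 0.5
    if dose_msv < 10.0:
        # linear fall from 0.5/mSv at 1 mSv to 0.1/mSv at 10 mSv
        return 0.5 + (0.1 - 0.5) * (dose_msv - 1.0) / 9.0
    return 0.1

def malformation_rate(baseline, dose_msv):
    # relative risk = 1 + ERR, where ERR = coefficient * dose
    return baseline * (1.0 + err_coefficient(dose_msv) * dose_msv)

# A hypothetical 3% background rate at 1 mSv internal dose rises to 4.5%:
print(malformation_rate(0.03, 1.0))

# Low-dose discrepancy with the ICRP coefficient (~0.001 per mSv):
print(err_coefficient(1.0) / (1.0 / 1000))   # roughly 500
```

The risk-coefficient ratio comes out at around 500, the same order of magnitude as the factor of 1,000 cited in the text; the exact figure depends on how the ICRP doubling dose is read.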
Radiation causes genomic instability
Our paper is available on the web as a free download, so you can see what we wrote and follow up the 80 or so references we used to construct the case.
Most of the evidence is from effects reported in countries contaminated by the Chernobyl accident, not only in Belarus and Ukraine but in wider Europe where doses were less than 1mSv. Other evidence we referred to was from the offspring of the nuclear test veterans.
In a study I published in 2014 of the offspring of members of the British Nuclear Test Veterans Association (BNTVA) we saw a nine-fold excess of congenital disease in the children but also, and unexpectedly, an eight-fold excess in the grandchildren. This raises a new and frightening spectre not anticipated by Hermann Muller.
In the last 15 years it has become clear that radiation causes genomic instability: experiments in the laboratory and animal studies show that radiation exposure throws some kind of genetic switch which causes a non-specific increase in general mutation rates.
Up until these genomic instability discoveries it was thought that genetic processes followed the laws of Gregor Mendel: there were specific dominant and recessive gene mutations that were passed down the generations and became diluted through a binomial process as offspring married away.
But radiation scientists and cancer researchers could not square the background mutation rate with the increased risks of cancer with age: the numbers didn’t fit. The discovery of the genomic instability process was the answer to the puzzle: it introduces enough random mutations to explain the observations.
It is this that supplies the horrifying explanation for the continuing high risk of birth defects in Fallujah and other areas where the exposures occurred ten to twenty years ago. Similar several generation effects have been seen in animals from Chernobyl.
Neonatal mortality in the nuclear bomb era
So where does that leave us? What can we do with this? What can we conclude? How can this change anything? Let’s start by looking at the effects of the biggest single injection of these radioactive contaminants, the atmospheric weapons tests of the period 1952 to 1963.
If these caused increases in birth defects and genetic damage we should see something in the data. We do. The results are chilling. If babies are damaged they die at or shortly before birth. This will show up in the vital statistics data of any country which collects and publishes it.
In Fig 1 (above right) I show a graph of the first day (neonatal) mortality rates in the USA from 1936 to 1985. You can see that as social conditions improved there was a fall in the rates between the beginning and end of the period, and we can obtain this by calculating what the background should have been using a statistical process called regression.
The expected background is shown as a thin blue line. Also superimposed is the concentration of Strontium-90 in milk (in red) and its concentration in the bones of dead infants (in blue). The graph shows first day neonatal mortality in the USA; it is taken from a paper by the Canadian paediatrician Robin Whyte in the British Medical Journal in 1992. This paper shows the same effect in neonatal (1 month) mortality and stillbirths in the USA and also the United Kingdom. The doses from the Strontium-90 were less than 0.5mSv.
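The regression step described above can be sketched as follows. The series here is synthetic and purely illustrative (not Whyte’s actual data): fit a log-linear trend to the years outside the weapons-test period to estimate the expected background, then measure how far the observed rates depart from it during the test era.

```python
import numpy as np

# Synthetic, purely illustrative series -- not the actual US vital statistics.
years = np.arange(1936, 1986)
background = 20 * np.exp(-0.03 * (years - 1936))   # smooth secular decline
rates = background.copy()
test_era = (years >= 1952) & (years <= 1963)
rates[test_era] *= 1.10                            # hypothetical 10% excess

# Fit the expected background on the unaffected years (log-linear regression),
# then measure the departure from it during the weapons-test era.
coeffs = np.polyfit(years[~test_era], np.log(rates[~test_era]), 1)
expected = np.exp(np.polyval(coeffs, years))
excess = (rates / expected - 1.0)[test_era]
print(f"mean excess during 1952-63: {excess.mean():.1%}")   # ~10%
```

With real data the “bump” over the fitted background, rather than a built-in one, is what the comparison would reveal.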
This is in line with what we found in our paper from Chernobyl and the other examples of human exposures. The issue was first raised by the late Prof Ernest Sternglass, one of the first of the radiation warrior-scientists and a friend of mine. The cover-ups and denials of these effects are part of the biggest public health scandal in human history.
It continues and has come to a venue near you: our study of Hinkley Point showed significant increased infant mortality downwind of the plant at Burnham on Sea as I wrote in The Ecologist.
It’s official – genetic damage in children is an indicator of harmful exposures to the father
As to what we can do with this new peer-reviewed evidence: we can (and we shall) put it before the Nuclear Test Veterans case at the Pensions Appeals hearings in the Royal Courts of Justice, tabled for three weeks from June 14th 2016 before a tribunal headed by high court judge Sir Nicholas Blake.
I represent two of the appellants in this hearing and will bring in the genetic damage in the children and grandchildren as evidence of genetic damage in the father.
We are calling Inge Schmitz-Feuerhake, the author of the genetic paper, as one expert witness; the judge has conceded that genetic damage in the children is an indicator of harmful exposures to the father. He has made a disclosure order to the University of Dundee to release the veteran questionnaires. They have.
Finally, I must share with you a window into the mind-set of the false scientists who work for the military and nuclear operation. As the fallout Strontium-90 built up in milk and in children’s bones and was being measured, they renamed the units of contamination (picoCuries of Sr-90 per gram of Calcium) ‘Sunshine Units’.
Can you imagine? I would ship them all to Nuremberg for that alone.
Dr Chris Busby is the Scientific Secretary of the European Committee on Radiation Risk and the author of Uranium and Health – The Health Effects of Exposure to Uranium and Uranium Weapons Fallout (Documents of the ECRR 2010 No 2, Brussels, 2010). For details and current CV see chrisbusbyexposed.org. For accounts of his work see greenaudit.org, llrc.org and nuclearjustice.org.