Aletho News


Cancer, George Monbiot and Nuclear Weapons Test Fallout

By Chris Busby | CounterPunch | March 20, 2018

George Monbiot, who has now been diagnosed with prostate cancer at the young age of 55, was therefore born in 1963, at the peak of the atmospheric test fallout. He is thus a peak-exposed (at-risk) member of the cohort of those exposed in the womb to the fallout (1959-63), who are currently suffering the consequences of exposure to the Strontium-90 that was in the milk and (as measured) in children's bones.

In his article in the Guardian, he says that he has always done all the healthy things: taken lots of exercise, eaten vegetables, not smoked or drunk, all that stuff. He is clearly puzzled about being singled out by the three ladies. But the cause was something he had no control over, and neither had anyone else born in the fallout period. George writes that he is happy. This insane response to his predicament (which I personally am not happy about, despite his intemperate attacks on me in his Guardian column and blogs) must go alongside his equally insane response to the Fukushima events, where he publicised his road-to-Damascus conversion to nuclear power.

The effect of the genetic damage of the fallout on babies can be seen in the graph below, Fig 1, taken from a recent paper I published (Busby C (2017) Radiochemical Genotoxicity Risk and Absorbed Dose. Res Rep Toxicol 1(1):1). The babies that did not die were simply those with insufficient genetic damage to kill them, but this damage would have affected them in later life in various ways. The most measurable effect (apart from genetic defects and congenital diseases) is higher cancer risk, which presents as earlier cancer onset. The issue of the 1959-63 cancer cohort was discussed in my 1995 book Wings of Death and in a letter I published in 1994 in the British Medical Journal (BMJ). The issue is one of Absorbed Dose. If internally deposited radionuclides like Strontium-90, Uranium-238 and Uranium-235 bind to DNA, which is the target for genetic damage, then Dose, which is an average quantity over kilograms of tissue, is an unsafe way of quantifying genetic damage. The issue of genetic damage from radioactive pollution was first raised in 1950 by Hermann Muller, the Nobel Prize-winning geneticist who discovered the mutagenic effects of radiation, but his warnings were ignored, though they are now found to have been accurate.
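To make the averaging argument concrete, here is a purely illustrative sketch (the numbers are hypothetical, chosen only to show the scale of the effect; this is not a calculation from the paper): absorbed dose is energy deposited per unit mass (1 Gy = 1 J/kg), so the same deposited energy gives enormously different dose figures depending on the mass over which it is averaged.

```python
# Purely illustrative: absorbed dose (gray) = energy deposited (J) / mass (kg).
# The same energy averaged over a kilogram of tissue, or confined to the
# ~nanogram of chromatin near a decay, gives figures 12 orders of magnitude apart.
energy_j = 1e-9                    # hypothetical energy from a few beta decays

dose_over_kg = energy_j / 1.0      # averaged over 1 kg of tissue
dose_local = energy_j / 1e-12      # same energy confined to ~1 nanogram

print(f"{dose_over_kg:.0e} Gy averaged over 1 kg")   # 1e-09 Gy
print(f"{dose_local:.0e} Gy in the local nanogram")  # 1e+03 Gy
```

Whatever the actual energies involved, it is this gap between the kilogram-averaged figure and the local one that the Absorbed Dose critique turns on.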

The serious effects of internal radionuclide exposures on prostate cancer were revealed in a study of UK Atomic Energy Authority workers published in 1993 (Fraser P, Carpenter L, Maconochie N, Higgins C, Booth M and Beral V (1993) Cancer mortality and morbidity in employees of the United Kingdom Atomic Energy Authority 1946-86. Brit. J. Cancer 67 615-624). This paper showed a 2-fold excess cancer risk in workers who had been monitored for internal radionuclides versus those who had not been. Prostate cancer mortality was significantly high. Although later cover-up studies by the nuclear industry, using a larger cohort, reduced this effect for prostate cancer, the internal/external exposure result for all cancers has never been satisfactorily followed up.

Fig 1. First-day neonatal mortality in the USA shows the effects of the fallout. Because of advances in medicine and better social conditions, infant mortality was falling everywhere. But as soon as the atmospheric tests began, rates went up in step with the fallout. First-day neonatal mortality is a measure of congenital damage: the baby survives in the womb on the mother's oxygenation and other support, but because its own organs are damaged it cannot survive after birth. Strontium-90 was measured in bone, where it built up to a peak in 1964. It will also have attached to chromosomes due to its affinity for DNA.

The fallout cohort is now entering the cancer bracket and these people are driving up the cancer rates in the Northern hemisphere, especially for breast cancer and prostate cancer. I have been studying this group since 1995, but now my predictions are appearing in the data.

But the true picture of the fallout effects is even scarier. Not only are the babies born over the peak fallout period, like George, at higher risk of more and earlier cancer, but it is now emerging that their children, born around 1980-1990, are carrying the same genetic (or rather genomic) curse. I am in the process of putting together a scientific paper on this. There is a sudden increase in cancer rates in young people aged 25-35 which began after 2008. This is an extraordinary development. The finding was confirmed for colon cancer in the USA in a paper published recently in the Journal of the National Cancer Institute (Siegel RL, Fedewa SA, Anderson WF, Miller KD, Ma J, Rosenberg PS, Jemal A (2017) Colorectal Cancer Incidence Patterns in the United States, 1974–2013. JNCI J Natl Cancer Inst 109(8): djw322). The authors were unable to explain their finding of increases in colon cancer in young people alongside decreasing colon cancer rates in older people. They were "puzzled". The explanation is simple. These were the children of those who were themselves born during the fallout and genomically damaged at birth. The damage is passed to the children (and will in turn be passed to theirs, and so on). The effect is also clear in the England and Wales data.

So, for the logical positivists, let’s have a look at the prostate cancer data in England and Wales.

In Table 1 below I show some data from the official ONS government annual reports on prostate cancer incidence in some selected years from 1974 to 2015.


No argument there then. The amazing thing is that huge amounts of money are received and spent on cancer research, but no one looks at the cause. Or rather, those who do look at the cause are attacked and marginalised, and their work is not reported.

For example, and relevant here, are the serious genetic effects of small-dose internal exposures in Europe after Chernobyl, reviewed by Prof Inge Schmitz-Feuerhake, Dr Sebastian Pflugbeil and myself in a peer-reviewed publication in 2016 (Schmitz-Feuerhake I, Busby C, Pflugbeil S (2016) Genetic Radiation Risks: A Neglected Topic in the Low Dose Debate. Environmental Health and Toxicology 31: e2016001). You would think that this evidence, reported in the peer-reviewed literature from 20 studies from countries all over Europe, might make it into one of the newspapers. But nothing.

My attempts to draw attention to these internal genetic damage issues have also been ignored or dismissed by the British establishment. In September last year, I was to have presented this evidence to British Government Minister Richard Harrington at a meeting of the NGOs and the government at Church House, Westminster. My flight from Sweden was sabotaged, but I made it to the meeting nevertheless, only to find that the Minister had made some excuse and had not come.

At the meeting, the government radiation expert committee members (COMARE) refused to consider anything I said.

This behaviour by the British can be compared with that of the Swedish Environmental Court in Stockholm, to which I had presented the same findings the previous week. In January 2018, the 8 judges of the Swedish Court told the Swedish government that it must not permit the development of the nuclear waste facility at Forsmark. This landmark decision also went unreported in any UK newspaper. The UK itself is currently busy trying to find a local council it can bribe to allow it to bury nuclear waste somewhere in England or (more probably) Wales.

When I presented the same genetic damage evidence in the nuclear test veteran case in the Royal Courts of Justice in 2016, I submitted reports by 4 eminent radiation experts, including Prof Schmitz-Feuerhake. All gave evidence under cross-examination. We filed the evidence of genetic damage in the Test Veteran children: a 10-fold excess risk of congenital malformations, and an 8-fold excess in the grandchildren. The British judge, Sir Nicholas Blake, refused to listen to any of this evidence and dismissed our experts. Blake found for the Ministry of Defence. I am taking a new Test Veteran case this summer. We shall see what happens.

But no surprise about judge Blake. In a recent survey of judges in Europe, it was found that Britain was exceeded only by Albania in the percentage of judges (45%) who reported that their decisions had been made at the direction of the establishment. The lowest rate of interference with judges (1%) was found in—guess where—Norway, Sweden and Denmark.

It seems that we live in a corrupt society here in Britain and I am ashamed to be part of this State which has poisoned its citizens consistently since 1945 and continues to do so, and to cover it all up, aided by dishonest scientists and celebrity reporters like George Monbiot. Those who have a magical view of events might delight in thinking that George has received his just due; for myself I just hope that this may make him look into the issue more deeply and change his mind about the effects of radioactive contamination.

Dr Chris Busby is the Scientific Secretary of the European Committee on Radiation Risk and the author of Uranium and Health – The Health Effects of Exposure to Uranium and Uranium Weapons Fallout (Documents of the ECRR 2010 No 2, Brussels, 2010).


Will advances in groundwater science force a paradigm shift in sea level rise attribution?

By Jim Steele | Landscapes and Cycles | March 4, 2018

A better accounting of natural groundwater discharge is needed to constrain the range of contributions to sea level rise. The greater the contribution from groundwater discharge, the smaller the adjustments used to amplify contributions from meltwater and thermal expansion.

In a 2002 paper describing what is now frequently referred to as "Munk's enigma", Scripps Institution of Oceanography senior researcher Walter Munk bemoaned the fact that researchers could not fully account for the causes of sea level rise. He lamented that "the historic rise started too early, has too linear a trend, and is too large." Early IPCC analyses noted about 25% of estimated sea level rise was unaccounted for. Accordingly, in 2012, an international team of prominent sea level researchers published Twentieth-Century Global-Mean Sea Level Rise: Is the Whole Greater than the Sum of the Parts? (henceforth Gregory 2012). They hoped to balance struggling sea level budgets by re-analyzing and adjusting estimates of the contributions from melting glaciers and ice caps, thermal expansion, and the effects of dam building and groundwater extraction. However, a natural contribution from any imbalance in groundwater recharge vs discharge was never considered. Yet the volume of freshwater stored as groundwater is second only to Antarctica's frozen supply, and 3 to 8 times greater than Greenland's.

At the risk of oversimplifying, the effects of groundwater storage can be differentiated into shallow-aquifer effects, which modulate global sea level on year-to-year and decade-to-decade timeframes, and deep-aquifer effects, which modulate sea level trends over centuries and millennia.

Researchers are increasingly aware of natural shallow groundwater dynamics. As noted by Reager (2016) in A Decade of Sea Level Rise Slowed by Climate-Driven Hydrology, researchers had determined that the seasonal delay in the return of precipitation to the oceans causes sea levels to oscillate by 17 ± 4 mm [~0.7 inches] each year. Reager (2016) also argued that decadal increases in terrestrial water storage, driven by climate events such as La Nina, had reduced sea level rise by 0.71 mm/year. Likewise, Cazenave (2014) published findings that, according to altimetry data, sea level rise had decelerated from 3.5 mm/yr in the 1990s to 2.5 mm/yr during 2003-2011, and that the deceleration could be explained by increased terrestrial water storage and the pause in ocean warming reported by Argo data.

Improved observational data suggest that during more frequent La Nina years a greater proportion of precipitation falls on land globally, and when that water is routed through more slowly discharging aquifers, sea level rise decelerates. During periods of more frequent El Niños, more rain falls back onto the oceans, and sea level rise accelerates. In contrast to La Nina-induced shallow-aquifer effects, deep aquifers were filled with meltwater from the last Ice Age, and that water is slowly and steadily seeping back into the oceans today.

Munk’s “Too Linear Trend” Enigma and Deep Groundwater Discharge

Hydrologists concerned with sustainable groundwater supplies and drinking water contamination have been at the forefront of analyzing the volume and ages of the world's groundwater, providing greater insight into deep aquifer effects. Gleeson (2015) determined that "total groundwater volume in the upper 2 km of continental crust is approximately 22.6 million cubic kilometers", twice as much as earlier estimates. If all 22.6 million cubic kilometers of freshwater stored underground reached the oceans, sea level would rise 204 feet (62,430 millimeters). Via various isotope analyses and flow models, Jasechko (2017) estimated that between 42-85% of all groundwater stored in the upper 1 kilometer of the earth's crust is water that had infiltrated the ground more than 11,000 years ago, during the last Ice Age.
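As a quick sanity check of that conversion, dividing the stored volume by the ocean's surface area reproduces the quoted figure (the ocean area used below is an assumed round number, not one from Gleeson's paper):

```python
# Sea-level equivalent of Gleeson's (2015) groundwater volume estimate.
groundwater_km3 = 22.6e6       # upper 2 km of continental crust
ocean_area_km2 = 3.62e8        # assumed global ocean surface area

rise_mm = groundwater_km3 / ocean_area_km2 * 1e6   # 1 km = 1e6 mm
rise_ft = rise_mm / 304.8                          # 304.8 mm per foot
print(f"{rise_mm:,.0f} mm (~{rise_ft:.0f} feet)")  # ~62,430 mm, ~205 feet
```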

Clearly the earth’s groundwater has yet to reach an equilibrium with modern sea levels. With deep aquifer discharge primarily regulated by geological pore spaces (in addition to pressure heads), the slow and steady discharge of these older waters affects sea level rise on century and millennial timeframes. And, although freshwater discharge from deep aquifers may be locally insignificant relative to river runoff, deep aquifer discharge when integrated across the globe could account for the missing contribution to the sea level rise budgets.

Unfortunately, quantifying the groundwater discharge contribution to sea level rise is extremely difficult, suffering from a low signal-to-noise problem. That difficulty is why natural groundwater contributions are often ignored or brushed aside as insignificant. Although GRACE satellite monitoring of gravity changes offers great promise for detecting changes in terrestrial groundwater storage, GRACE cannot accurately separate the relatively small discharge of deep aquifers from large annual changes in shallow groundwater. In periods of heavy rains, groundwater increases will mask deep aquifer discharge. And during a drought, any deep groundwater discharge will likely be attributed to the lack of rain.

However, estimates of groundwater recharge via isotope analyses can provide critical information regarding rates of groundwater recharge and discharge.

Using the abnormal levels of tritium released during nuclear testing in the 1950s, plus carbon-14 dating, researchers have categorized the time since groundwater last left the surface into 25-, 50-, 75- and 100-year age classes. As expected, the youngest water is concentrated in the shallowest aquifer layers, and the proportion of young water decreases with depth. The estimated volume of groundwater 25 years old or younger suggests global groundwater is currently recharging at a rate that would reduce sea level by 21 mm/year (0.8 inches/year). Water cycle researchers (e.g. Dai and Trenberth) have made the dubious assumption that the amount of water transported via precipitation from the ocean to the land is balanced each year by river runoff. But if the tritium-derived estimates are valid, balancing water cycle and sea level budgets becomes more enigmatic. Clearly a significant amount of precipitation does not return for decades and centuries.

Intriguingly, comparing the smaller volume of groundwater aged 50 to 100 years with the volume of water 50 years old and younger suggests 2 possible scenarios. Either groundwater recharge has increased in recent decades, or, if recharge rates averaged over 50 years have remained steady, then as groundwater ages a significant portion seeps back to the ocean at rates approaching 1.7 mm/year, a rate very similar to 20th-century IPCC estimates of sea level rise.

Groundwater discharge must balance recharge, or else the imbalance directly alters global sea levels. When less than 21 mm/year seeps back to the ocean, natural groundwater storage lowers sea level. When discharge is greater than 21 mm/year, groundwater discharge raises sea level. Without accounting for recharge vs discharge, the much smaller estimates of all the other factors contributing to sea level rise are simply not well constrained.
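A minimal sketch of that bookkeeping, using the 21 mm/year tritium-based recharge figure (the discharge values passed in below are hypothetical):

```python
RECHARGE_MM_YR = 21.0   # tritium-derived recharge estimate, sea-level equivalent

def groundwater_sea_level_term(discharge_mm_yr):
    """Net groundwater term for global sea level; positive raises it."""
    return discharge_mm_yr - RECHARGE_MM_YR

for discharge in (22.7, 19.0):   # hypothetical discharge rates, mm/yr
    print(f"{groundwater_sea_level_term(discharge):+.1f} mm/yr")
# +1.7 mm/yr: discharge exceeds recharge, raising sea level
# -2.0 mm/yr: recharge exceeds discharge, lowering sea level
```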

Higher rates of discharge could account for the enigmatic missing sea level contributions reported by the IPCC and other researchers (i.e. Gregory 2012). More problematic, if discharge proves to significantly exceed recharge, then estimates of contributions from other sources such as melting ice and thermal expansion may be too high. What is certain is that the current estimates of contributions to sea level from melting ice and thermal expansion total only 1.5 to 2.0 mm/year, and those factors by themselves cannot offset the tritium-estimated 21 mm/year of groundwater recharge. So, what is missing from our current water cycle budgets?

The Importance of Submarine Groundwater Discharge (SGD)

The recharge-discharge imbalance can be reconciled if water cycle budgets include the difficult-to-measure rates of prolific submarine groundwater discharge (SGD). Freshwater springs bubbling up from coastal sea floors have long been observed. To reliably replenish drinking water, Roman fishermen mapped their occurrences throughout the Mediterranean. Moosdorf (2017) has reviewed the locations and many human uses of fresh submarine groundwater discharge around the world.

Recent ecological studies have measured local submarine groundwater seepages to determine contributions of solutes and nutrients to coastal ecosystems. But those sparse SGD measurements cannot yet be reliably integrated into a global estimate. Rodell (2015) notes that most water cycle budgets have ignored SGD due to its uncertainty, so Rodell’s water cycle budget included a rate of SGD equivalent to 6.5 millimeters/year (~0.25 inch/yr) of sea level rise. However, that estimate is still insufficient to balance current recharge estimates.

However, with improving techniques, researchers recently estimated total submarine groundwater discharge (saline and fresh water combined) at a rate 3 to 4 times greater than observed global river runoff, or a volume equivalent to 331 mm/year (13 inches) of sea level rise. Nonetheless, more than 90% of that submarine discharge is saline sea water, most of which is likely recirculated sea water and not likely to affect sea level. Only the fraction of entrained freshwater would raise sea level. To balance the 21 mm/year of groundwater recharge, between 6 and 7% of total SGD must be freshwater, and that fraction is very plausible. Local estimates of the freshwater fraction of submarine discharge range from 1 to 35%, averaging just under 10%. If fresh submarine groundwater discharge approaches just 7% of total SGD, it would not only balance current groundwater recharge but would steadily raise sea level by an additional 2 mm/year, even with no ocean warming and no melting glaciers.
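The arithmetic behind those percentages, using the figures quoted above:

```python
TOTAL_SGD_MM_YR = 331.0   # total SGD (saline + fresh), sea-level equivalent
RECHARGE_MM_YR = 21.0     # tritium-based groundwater recharge

break_even = RECHARGE_MM_YR / TOTAL_SGD_MM_YR
print(f"freshwater fraction needed to balance recharge: {break_even:.1%}")   # 6.3%

net_at_7_percent = 0.07 * TOTAL_SGD_MM_YR - RECHARGE_MM_YR
print(f"net sea level term at 7% freshwater SGD: {net_at_7_percent:+.1f} mm/yr")  # +2.2
```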

A Sea Level Rise “Base-flow” and Paleo-climate Conundrums

Hydrologists seek to quantify the aquifer contributions to river flow, otherwise known as the “base flow”. During the rainy season or the season of melting snow, any groundwater contribution is masked by heavy surface runoff and shallow aquifer effects. However, during extended periods of drought hydrologists assume the low river flow that persists must be largely attributed to supplies from deeper aquifers. Streams that dry up during a drought are usually supported by small shallow aquifers, while reduced but persistent river and stream flows must be maintained by large aquifers. Using a similar conceptual approach, we can estimate a possible “base flow” contribution to sea level.

When the continental ice sheets began to melt as the earth transitioned from its Ice Age maximum to our present warm interglacial, sea level began to rise from depths ~130 meters lower than today (see graph below). Melting continental ice sheets drove much higher rates of sea level rise than seen today, ranging from 10 to 40+ mm/year. A consensus suggests that by approximately 6,000 years ago the last of the continental ice sheets had melted completely, the earth's montane glaciers had disappeared, and the Greenland and Antarctic ice sheets had shrunk to their minimums. The earth then entered a long-term 5,000-year cooling trend dubbed the Neoglaciation. Although sea level models forced only by growing glaciers and cooling ocean temperatures would project falling sea levels, proxy evidence enigmatically suggests global sea level continued to rise, albeit at reduced rates, by another 4 meters (Figure 1 below). Although there is some debate regarding any continued contribution from Antarctica and "ocean siphoning", according to Lambeck (2014) about 3 meters of sea level were added between 6.7 and 4.2 thousand years ago. That continued sea level rise could be explained by aquifer discharge, suggesting a minimal "base flow" of ~1.2 mm/year from groundwater discharge.
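That base-flow figure follows directly from Lambeck's numbers:

```python
# Implied groundwater "base flow": ~3 m of sea level rise (Lambeck 2014)
# spread over the interval from 6,700 to 4,200 years before present.
rise_mm = 3 * 1000
interval_yr = 6_700 - 4_200
print(rise_mm / interval_yr, "mm/yr")   # 1.2 mm/yr
```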

Similarly, during the Little Ice Age between 1300 and 1850 AD, montane glaciers, as well as the Greenland and Antarctic ice sheets, grew and reached their largest extents of the last 7,000 years. Ocean temperatures cooled by about 1 degree. Yet inexplicably, most researchers estimate global sea level never dropped significantly. They report sea levels were "stable" during the Little Ice Age, fluctuating only by tenths of a millimeter. That stability contrasts greatly with the recent rising trend, which has led some to attribute the current rise to increasing CO2 concentrations. However, Little Ice Age stability defies the physics of cooling temperatures and increasing water storage in growing glaciers, which should have caused a significant sea level fall. That seeming paradox is consistent with a scenario in which a "base flow" from groundwater discharge offset the transfer of water to growing Little Ice Age glaciers.

Once the growth of Little Ice Age glaciers stopped, and groundwater base flow was no longer offset, we would expect sea levels to rise as witnessed during the 19th and 20th centuries. Such a scenario would also explain Munk’s enigma that sea level rise had started too early, before temperatures had risen significantly from any CO2-driven warming.

Interestingly, assuming a ballpark figure of a 1.2 mm/year groundwater base flow, unbalanced groundwater discharge could also explain the much higher sea levels estimated for the previous warm interglacial, the Eemian. Researchers estimate sea levels ~115,000 years ago were about 6 to 9 meters higher than today. That interglacial has also been estimated to have spanned 15,000 years before continental glaciation resumed. Compared to our present interglacial span of 11,700 years, an extra 3,300 years of groundwater discharge, before being offset by resumed glacier growth, could account for 4 meters of the Eemian's higher sea level.
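Again, the arithmetic is straightforward:

```python
# Extra Eemian sea level rise from 3,300 additional years of base flow.
base_flow_mm_yr = 1.2
extra_years = 15_000 - 11_700                            # Eemian span minus ours
print(f"{base_flow_mm_yr * extra_years / 1000:.1f} m")   # 4.0 m
```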

Is the Recent Glacier Meltwater Contribution to Sea Level Overestimated?

In addition to a groundwater base flow driving the current steady rise in sea level, meltwater from retreating Little Ice Age glaciers undoubtedly contributed as well. But by how much? Researchers have estimated there was greater glacial retreat (and thus a greater flux of meltwater) in the early 1900s compared to now. So, current glacier retreat is unlikely to cause any acceleration of recent sea level rise. Furthermore, we cannot assume glacier meltwater rapidly enters the oceans. A large proportion of meltwater likely enters the ground, so it may take several hundred years for Little Ice Age glacier meltwater to affect sea level.

How fast can groundwater reach the ocean? Groundwater measured in the Great Plains' Ogallala Aquifer can flow at a higher-than-average seepage rate of ~300 mm (~1 foot) per day, or about the length of a football field in a year. For such "fast"-moving groundwater to travel 1,000 kilometers (620 miles) to the sea would require roughly 10,000 years! Most groundwater travels much slower. The great weight of the continental glaciers during our last Ice Age applied such pressure that it forced meltwater into the ground at much greater rates than currently observed recharge. And that Ice Age meltwater is still slowly moving through aquifers like the Ogallala.
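A quick check of that travel time; the figure depends on whether one uses the ~300 mm/day rate or the football-field-per-year approximation:

```python
distance_m = 1_000_000                    # 1,000 km to the coast

rate_m_yr = 0.300 * 365                   # ~300 mm/day -> ~110 m/yr
print(f"{distance_m / rate_m_yr:,.0f} years")    # ~9,100 years

field_m_yr = 91.4                         # one 100-yard field per year
print(f"{distance_m / field_m_yr:,.0f} years")   # ~10,900 years
```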

(However, its release to the ocean has been sped up by human pumping. Recent estimates suggest that globally, human groundwater extraction currently exceeds rates of water capture from dam building, so that groundwater depletion is now accelerating sea level rise.)

How much of the current meltwater can we expect to transit to the ocean via a slow groundwater route? That's a tough question to answer. However, thirteen percent of the earth's ice-free land surface is covered by endorheic basins, illustrated by the gray areas in the illustration below. Endorheic basins have no direct outlets to the ocean. Water entering an endorheic basin returns to the sea only via evaporation, or by the extremely slow route of groundwater discharge. Any precipitation or glacial meltwater flowing into an endorheic basin could require centuries to thousands of years to flow back to the oceans.

For example, in 2010-2011, researchers reported that a La Nina event had caused global sea level to fall at the equivalent of 7 mm/year (~0.3 inches/year). That dramatic drop happened despite concurrent extensive ice melt in Greenland and despite any base flow contribution. As described by Fasullo (2013), GRACE satellite observations detected increased groundwater storage caused by higher rates of rainwater falling on endorheic basins, primarily in Australia. Although satellite observations suggested much of the rainwater remained in the Australian basins, sea level resumed its unabated rise, as a groundwater base-flow contribution would predict.

To balance their sea level budgets, researchers assert melting glaciers have added ~0.8 mm/year to recent sea level rise. The 20th-century retreat of most glaciers is undeniable, but we cannot simply assume all 20th-century glacier meltwater immediately reached the oceans. The greatest concentration of ice outside Greenland and Antarctica resides in the regions north of India and Pakistan, in the Himalayan and Karakoram glaciers. Most meltwater flowing northward enters the extensive Asian endorheic basins. Likewise, some Sierra Nevada meltwater flows into Nevada's Great Basin, and some Andes meltwater flows into the endorheic basins of the Altiplano and Lake Titicaca, as well as the Atacama Desert. It is very likely much of the current glacial meltwater will take decades to millennia to reach the ocean and has yet to impact modern sea levels. If the glacial meltwater contribution to sea level is overestimated, then the unaccounted-for contribution to sea level rise becomes much larger than initially thought.

Accurate Attribution of Groundwater Discharge and Recharge Will Constrain Sea Level Contributions

Using a combination of GRACE gravity data measuring changes in ocean mass, altimetry data measuring changes in ocean volume, and Argo data measuring heat content, Cazenave (2008) used 2 different methods, both of which estimated the contribution from increased ocean heat at about 0.3 to 0.37 mm/year. Jevrejeva (2008) calculated a similar heat contribution. Other researchers suggest thermal expansion contributes 1.2 to 1.5 mm/year (i.e. Chambers 2016). Such large discrepancies reveal that the contributing factors to sea level rise are not yet reliably constrained.
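To see why such estimates can diverge so widely, consider the standard first-order thermosteric approximation, Δh ≈ α·ΔT·H: the result scales linearly with the assumed expansion coefficient, warming rate, and depth of the warming layer. The values below are illustrative assumptions, not numbers from any of the cited studies:

```python
alpha = 2.0e-4     # seawater thermal expansion coefficient, 1/K (illustrative)
layer_m = 700.0    # depth of the warming layer, m (illustrative)

for dT_per_yr in (0.002, 0.005, 0.010):             # assumed warming rates, K/yr
    dh_mm_yr = alpha * dT_per_yr * layer_m * 1000   # meters -> mm
    print(f"dT = {dT_per_yr} K/yr -> {dh_mm_yr:.2f} mm/yr")
# 0.28, 0.70 and 1.40 mm/yr -- spanning the published range quoted above
```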

One of the great uncertainties in sea level research is the glacial isostatic adjustment.

Researchers have subjectively adopted various glacial isostatic adjustment models, with recommended adjustments ranging from 1 to 2 mm/year. For example, although GRACE gravity estimates had not detected any added water mass in the oceans, Cazenave (2008) added a 2 mm/year adjustment, as illustrated in her Figure 1 below. Other researchers added only a 1 mm/yr adjustment.

In the Gregory (2012) paper Twentieth-Century Global-Mean Sea Level Rise: Is the Whole Greater than the Sum of the Parts?, the researchers suggested the sea level budget could be balanced, and the IPCC's unaccounted-for contribution to sea level rise explained, by making 5 assumptions:

  • Assume the contribution from glacier melting was greater than previously estimated.

But greater melting rates were documented for the 1930s and 40s, and the likelihood that some glacier meltwater is still trapped as groundwater suggests the glacier meltwater contribution has been overestimated.

  • Assume an increased contribution from thermal expansion.

But Argo data suggest the contribution from thermal expansion has been decreasing and plateauing.

  • Assume Greenland positively contributed to sea level throughout the entire 20th century.

Greenland has undoubtedly contributed to episodes of accelerating and decelerating sea level change, but the greatest rate of Greenland warming occurred during the 1920s and 30s. Previous researchers suggested Greenland's glaciers oscillated during the 20th century but were stable from the 1960s to the 1990s. Although there was increased surface melt in the 21st century, culminating in 2012, that melt rate has since declined. And according to the Danish Meteorological Institute, Greenland gained about 50 billion tons of ice in 2017, which should have lowered sea level in 2017. Clearly Greenland cannot explain the enigmatic steady 20th-century sea level rise.

  • Assume reservoir water storage balanced groundwater extraction.

But net contributions from groundwater extraction vs water impoundments and other landscape changes are still being debated. Landscape changes for the period 2002–2014 have been estimated to have reduced sea level by 0.40 mm/year, versus IPCC estimates that they added 0.38 mm/year to sea level rise from 1993–2010.

  • Assume the remaining unaccounted contribution to sea level rise is small enough to be attributed to melting in Antarctica.

Debatably, Antarctic melting is too often used as the catch-all fudge factor to explain the unexplainable. Furthermore, there is no consensus within the Antarctic research community on whether there have been any human effects on Antarctica's ice balance. Regions that are losing ice are balanced by regions that are gaining ice. Claims of net ice loss have been countered by claims of net ice gain, such as NASA (2015). Additionally, unadjusted GRACE gravity data suggest no lost ice mass, and all estimates of ice gain or loss depend on which glacial isostatic adjustments modelers choose to use. We cannot dismiss the possibility that unaccounted-for groundwater discharge has been mistakenly attributed to hypothetical Antarctic melting.

A better accounting of natural groundwater discharge is needed to constrain the range of contributions to sea level rise suggested by researchers such as Gregory 2012. The greater the contribution from groundwater discharge, the smaller the adjustments used to amplify contributions from meltwater and thermal expansion. Until a more complete accounting is determined, we can only appreciate Munk’s earnest concern. How can we predict future sea level rise if we don’t fully understand the present or the past?


Rebuttals to Ten Typical False Claims by Climate Alarmists

Ice Cap | March 2, 2018

Below is a series of rebuttals of typical climate alarmist claims, such as those made in the recently released Fourth National Climate Assessment Report. The authors of these rebuttals are all recognized experts in the relevant scientific fields. The rebuttals demonstrate the falsity of EPA's claims merely by citing the most credible empirical data on the topic. For each alarmist claim, a summary of the relevant rebuttal is provided along with a link to the full text of the rebuttal, which includes the names and credentials of its authors.

Claim #1: Heat waves are increasing at an alarming rate and heat kills
Claim #2: Global warming is causing more hurricanes and stronger hurricanes
Claim #3: Global warming is causing more and stronger tornadoes
Claim #4: Global warming is increasing the magnitude and frequency of droughts and floods
Claim #5: Global warming has increased U.S. wildfires
Claim #6: Global warming is causing snow to disappear
Claim #7: Global warming is resulting in rising sea levels as seen in both tide gauge and satellite technology
Claim #8: Arctic, Antarctic and Greenland ice loss is accelerating due to global warming
Claim #9: Rising atmospheric CO2 concentrations are causing ocean acidification, which is catastrophically harming marine life
Claim #10: Carbon pollution is a health hazard

Claim #1: Heat Waves are Increasing at an Alarming Rate and Heat Kills

Summary of Rebuttal

There has been no detectable long-term increase in heat waves in the United States or elsewhere in the world. Most all-time record highs here in the U.S. happened many years ago, long before mankind was using much fossil fuel. Thirty-eight states set their all-time record highs before 1960 (23 in the 1930s!). Here in the United States, the number of 100F, 95F and 90F days per year has been steadily declining since the 1930s. The Environmental Protection Agency Heat Wave Index confirms the 1930s as the hottest decade.

James Hansen, while at NASA in 1999, said of the U.S. temperature record: "In the U.S. the warmest decade was the 1930s and the warmest year was 1934." When NASA was challenged on the declining heat records in the U.S., the reply was that the U.S. is just 2% of the world. However, every continent recorded its all-time record high before 1980. Interestingly, while the media give a great deal of coverage to even minor heat waves to support the case that man-made global warming is occurring, they tend to ignore deadly cold waves. In actual fact, worldwide cold kills 20 times as many people as heat. This is documented in the "Excess Winter Mortality" figures, which show that the number of deaths in the 4 coldest winter months is much higher than in the other 8 months of the year. The USA death rate in January and February is more than 1,000 deaths per day greater than it is in July and August.

Clearly, there is no problem with increased heat waves due to climate change.

Detailed Rebuttal and Authors: EF_RRT_AC – Heat Waves

Claim #2: Global Warming Is Causing More Hurricanes and Stronger Hurricanes

Summary of Rebuttal

There has been no detectable long-term trend in the number and intensity of hurricane activity globally. The activity does vary year to year and over multidecadal periods, as ocean cycles—El Nino/La Nina and the multidecadal cycles in the Pacific (PDO) and Atlantic (AMO)—favor some basins over others.

The trend in landfalling storms in the United States has been flat to down since the 1850s. Before the active hurricane season in the United States in 2017, there had been a lull of 4,324 days (almost 12 years) in major hurricane landfalls, the longest lull since the 1860s. Harvey was the first hurricane to make landfall in Texas since Ike in 2008 and the first Category 4 hurricane in Texas since Hurricane Carla in 1961. There has been a downtrend in Texas of both hurricanes and major hurricanes. Texas is an area where Gulf tropical storms and hurricanes often stall for days, and 6 of the heaviest tropical rainfall events for the U.S. have occurred in Texas. Harvey's rains were comparable to many of these events. Claudette in 1979 had an unofficial rainfall total greater than Harvey's.

In Florida, where Irma hit the Keys as a Category 4, it came after a record 4,339 days (just short of 12 years) without a landfalling hurricane. The previous record lull was in the 1860s (8 years). There has been no trend in hurricane intensity or landfalling frequency since at least 1900.

Detailed Rebuttal and Authors: EF_RRT_AC – Hurricanes

Claim #3: Global Warming Is Causing More and Stronger Tornadoes

Summary of Rebuttal

Tornadoes are failing to follow "global warming" predictions. Big tornadoes have declined in frequency since the 1950s. The years 2012, 2013, 2014, 2015 and 2016 all saw below-average to near-record-low tornado counts in the U.S. since records began in 1954. 2017 to date has rebounded only to the long-term mean.

This lull followed the very active and deadly strong La Nina of 2010/11, which, like the strong La Nina of 1973/74, produced record-setting and very deadly outbreaks of tornadoes. Population growth and expansion outside urban areas have exposed more people to the tornadoes that once roamed through open fields. Tornado detection has improved with the addition of NEXRAD, the growth of trained spotter networks, storm chasers armed with cellular data and imagery, and the proliferation of cell phone cameras and social media. This shows up most in the weak EF0 tornado count, but for storms from moderate EF1 to strong EF3+ intensity, the trend has been flat to down despite improved detection.

For Rebuttal and Author Credentials See: EF_RRT_AC – Tornadoes

Claim #4: Global Warming Is Increasing the Magnitude and Frequency of Droughts and Floods

Summary of Rebuttal

Our use of fossil fuels to power our civilization is not causing droughts or floods. NOAA found there is no evidence that floods and droughts are increasing because of climate change. The number, extent or severity of these events does increase dramatically for a brief period of years at some locations from time to time, but conditions then return to normal. This is simply the long-established natural variation of weather resulting from a confluence of natural factors. In testimony before Congress, Professor Roger Pielke, Jr. said: "It is misleading, and just plain incorrect, to claim that disasters associated with hurricanes, tornadoes, floods, or droughts have increased on climate timescales either in the United States or globally. Droughts have, for the most part, become shorter, less frequent, and cover a smaller portion of the U.S. over the last century."

"The good news is U.S. flood damage is sharply down over 70 years," Roger Pielke Jr. said. "Remember, disasters can happen any time… But it is also good to understand long-term trends based on data, not hype."

Detailed Rebuttal and Authors: EF_RRT_AC – Droughts and Floods

Claim #5: Global Warming Has Increased U.S. Wildfires

Summary of Rebuttal

Wildfires are in the news almost every late summer and fall. The National Interagency Fire Center has recorded the number of fires and acreage affected since 1985. These data show the number of fires trending down slightly, though the acreage burned had increased before leveling off over the last 20 years. The NWS tracks the number of days when conditions are conducive to wildfires by issuing red-flag warnings; that count is little changed. 2017 was an active fire year in the U.S., but by no means a record. The U.S. had 64,610 fires, the 7th most in 11 years and the most since 2012. The 9,574,533 acres burned were the 4th most in 11 years and the most since 2015. The fires burned in the Northwest, including Montana, which had a very dry summer; the action then shifted south with the seasonal start of wind events like the Diablo in northern California and the Santa Ana further south.

Fires spread to northern California in October with an episode of the dry Diablo wind that blows from the east, and then in December strong and persistent Santa Ana winds and dry air triggered a round of large fires in Ventura County. According to the California Department of Forestry and Fire Protection, the 2017 California wildfire season was the most destructive on record, with a total of 8,987 fires that burned 1,241,158 acres. It included five of the 20 most destructive wildland-urban interface fires in the state's history.

When considering the number of deaths and structures destroyed, the seven-fold increase in California's population from 1930 to 2017 must be noted. Not only does this increase mean more people and homes in the path of fires, it also means more fires. Lightning and campfires caused most historic fires; today most are the result of power lines igniting trees. Power lines have increased proportionately with the population, so it can be reasoned that most of the damage from wildfires in California is a result of increased population, not global warming. The danger is also greatly aggravated by poor government forest management choices.

Detailed Rebuttal and Authors: EF_RRT_AC – Wildfires

Claim #6: Global Warming Is Causing Snow to Disappear

Summary of Rebuttal

This is one claim that has been repeated for decades, even as nature showed very much the opposite trend, with unprecedented snows even in the big coastal cities. Every time the claim was repeated, it seems, nature upped the ante.

Alarmists have eventually evolved to crediting warming with producing greater snowfall because of increased moisture, but the snow events of recent years have usually occurred in colder winters, with high snow-water-equivalent ratios in frigid arctic air.

Snowcover in the Northern Hemisphere, North America, and Eurasia has been increasing since the 1960s in the fall and winter, but declining in the spring and summer. However, as NOAA has advised may be the case, changes in snowcover measurement methodology at the turn of this century may be responsible for part of the warm-season differences.

Detailed Rebuttal and Authors: EF_RRT_CA – Snow

Claim #7: Global Warming Is Resulting in Rising Sea Levels as Seen in Both Tide Gauge and Satellite Technology

Summary of Rebuttal

This claim is demonstrably false. It really hinges on this statement: “Tide gauges and satellites agree with the model projections.” The models project a rapid acceleration of sea level rise over the next 30 to 70 years. However, while the models may project acceleration, the tide gauges clearly do not.

All data from tide gauges in areas where land is not rising or sinking show a steady, linear, unchanging rate of sea level rise of 4 to 6 inches/century, with variations due to gravitational factors. It is true that where land is sinking, as in the Tidewater area of Virginia and the Mississippi Delta region, sea levels will appear to rise faster, but no change in CO2 emissions would alter that.

The implication that measured, validated, and verified tide gauge data support this conclusion remains simply false. All such references rely on "semi-empirical" information, which merges, concatenates, combines, and joins actual tide gauge data with various models of the reference author's choosing. Nowhere on this planet can a tide gauge be found that shows even half of the claimed 3.3 mm/yr sea level rise rate in "tectonically inert" coastal zones. These are areas that lie between regions of geological uplift and subsidence. They are essentially neutral with respect to vertical land motion, and tide gauges located there show between 1 mm/yr (3.9 inches/century) and 1.5 mm/yr (6 inches/century) of rise. The great Swedish oceanographer Nils-Axel Mörner has commented on this extensively, and his latest papers confirm this "inconvenient truth."
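For reference, the mm/yr and inches/century figures quoted here interconvert as follows:

```python
def mm_per_yr_to_in_per_century(rate_mm_yr):
    """A century at rate_mm_yr, converted to inches (25.4 mm per inch)."""
    return rate_mm_yr * 100 / 25.4

for rate in (1.0, 1.5, 3.3):
    print(f"{rate} mm/yr = {mm_per_yr_to_in_per_century(rate):.1f} inches/century")
# 1.0 -> 3.9, 1.5 -> 5.9, 3.3 -> 13.0 inches/century
```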

Further, alarmist claims that "satellites agree with the model projection" are false. Satellite technology was introduced to provide a more objective measurement of sea level rise, because properly adjusted tide gauge data were not fitting alarmists' claims. However, the new satellite and radar altimeter data lacked the resolution to measure sea levels accurately down to the mm level. Moreover, the raw data from this technology also conflicted with alarmists' claims. As a result, adjustments to these data were also made – most notably a glacial isostatic adjustment (GIA). GIA assumes that basically all land is rebounding from long-ago glaciations and that oceanic basins are deepening, and that this rebounding masks the true sea level rise. Alarmists continue to proclaim that their models project a rapid acceleration of sea level rise over the next 30 to 70 years, when those same models have failed to come even close to reproducing the past 25 years.

Detailed Rebuttal and Authors: EF_RRT_CA – Sea Level

Claim #8: Arctic, Antarctic and Greenland Ice Loss Is Accelerating Due to Global Warming

Summary of Rebuttal

Satellite and surface temperature records and sea surface temperatures show that both the East Antarctic Ice Sheet and the West Antarctic Ice Sheet are cooling, not warming, and glacial ice is increasing, not melting. Satellite and surface temperature measurements of the southern polar area show no warming over the past 37 years. Growth of the Antarctic ice sheets means sea level rise is not being caused by melting of polar ice; in fact, it is slightly lowering the rate of rise. Satellite Antarctic temperature records show 0.02C/decade of cooling since 1979. The Southern Ocean around Antarctica has been getting sharply colder since 2006. Antarctic sea ice is increasing, reaching all-time highs. Surface temperatures at 13 stations show the Antarctic Peninsula has been sharply cooling since 2000.

The Arctic includes the Arctic Ocean, Greenland, Iceland, and parts of Siberia and northern Alaska. Because of the absence of any land mass in the Arctic Ocean, most of the area lacks glaciers, which require a land mass; most of the Arctic contains only floating sea ice. Greenland, Iceland, northern Alaska, and northern Siberia contain the only glaciers in the general Arctic region. Arctic temperature records show that the 1920s and 1930s were warmer than 2000. Records of historic fluctuations of Arctic sea ice go back only to the first satellite images in 1979, which happens to coincide with the end of the 1945–1977 global cold period and the maximum extent of Arctic sea ice. During the warm period from 1978 until recently, the extent of sea ice diminished, but it has increased in the past several years. The Greenland ice sheet has also grown recently.

Detailed Rebuttal and Authors: EF_RRT_AC – Arctic, Antarctic, Greenland 123117

Claim #9: Rising Atmospheric CO2 Concentrations Are Causing Ocean Acidification, which Is Catastrophically Harming Marine Life

Summary of Rebuttal

As the air's CO2 content rises in response to ever-increasing anthropogenic CO2 emissions, more and more carbon dioxide is expected to dissolve into the surface waters of the world's oceans, a dissolution projected to cause a 0.3 to 0.7 pH unit decline in the planet's oceanic waters by the year 2300. A potential pH reduction of this magnitude has provoked concern and led to predictions that, if it occurs, marine life will be severely harmed—with some species potentially driven to extinction—as they experience negative impacts on growth, development, fertility and survival. This ocean acidification hypothesis, as it has come to be known, has gained great momentum in recent years, because it offers a second, independent reason to regulate fossil fuel emissions beyond the concerns over traditional global warming. Even if the models are proven wrong with respect to their predictions of atmospheric warming, extreme weather, glacial melt, sea level rise, or any other attendant catastrophe, those who seek to regulate and reduce CO2 emissions have a fall-back position: no matter what happens to the climate, the nations of the Earth must reduce their greenhouse gas emissions because of projected direct negative impacts on marine organisms via ocean acidification.
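Because pH is logarithmic (pH = −log10[H+]), a projected decline translates into a multiplicative increase in hydrogen ion concentration; what the quoted range implies:

```python
# pH = -log10([H+]), so a decline of d pH units multiplies [H+] by 10**d.
for drop in (0.3, 0.7):
    print(f"pH decline of {drop} units -> [H+] rises {10 ** drop:.1f}x")
# 0.3 -> 2.0x, 0.7 -> 5.0x
```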

The ocean chemistry aspect of the ocean acidification hypothesis is rather straightforward, but it is not as solid as it is often claimed to be. For one thing, the work of a number of respected scientists suggests that the drop in oceanic pH will not be nearly as great as the IPCC and others predict. And, as with all phenomena involving living organisms, the introduction of life into the analysis greatly complicates things. When a number of interrelated biological phenomena are considered, it becomes much more difficult, if not impossible, to draw such sweeping negative conclusions about the reaction of marine organisms to ocean acidification. Quite to the contrary, when life is considered, ocean acidification is often found to be a non-problem, or even a benefit. And in this regard, numerous scientific studies have demonstrated the robustness of multiple marine plant and animal species to ocean acidification—when they are properly performed under realistic experimental conditions.

Detailed Rebuttal and Author: EF_RRT_CA – Ocean pH

Claim #10: Carbon Pollution Is a Health Hazard

Summary of Rebuttal

The term "carbon pollution" is a deliberately ambiguous, disingenuous term designed to mislead people into thinking carbon dioxide is pollution. It is used by environmentalists to conflate the environmental impacts of CO2 emissions with the impacts of emitting the unwanted waste products of combustion. The burning of carbon-based fuels (fossil fuels – coal, oil, natural gas – and biofuels and biomass) converts the carbon in the fuels to carbon dioxide (CO2), an odorless, invisible gas that is plant food and essential to life on the planet. Because the burning of fuel is never 100% efficient, trace amounts of pollutants, including unburnt carbon, are produced in the form of fine particulates (soot), hydrocarbon gases and carbon monoxide. In addition, trace amounts of sulfur oxides, nitrogen oxides and other pollutant constituents can be produced. In the US, all mobile and industrial stationary combustion sources must have emission control systems that remove the particulates and gaseous pollutants so that emissions comply with EPA's emission standards. Ambient air pollutant concentrations have been decreasing for decades and will keep decreasing for the foreseeable future because of existing non-GHG-related regulations.

Detailed Rebuttal and Authors: EF_RRT_AC – Health

This material is derived from the Fifth Supplement to Petition for Reconsideration of the Climate Endangerment Finding filed with the USEPA on February 9 by the Concerned Household Electricity Consumers Council (CHECC).

February 20, 2018 CHECC Press Release

Electricity Consumers File New Study in Their Call for EPA to Reopen its Endangerment Finding

Key Points:

o Just released: new research findings demonstrate that ten frequent climate alarmist claims have each been rebutted by true experts in each field, simply by citing the most relevant and credible empirical data.

o The new results invalidate ten very frequent alarmist claims of recent years, and thereby also invalidate the so-called "lines of evidence" on which EPA claimed to base its 2009 CO2 Endangerment Finding.

o If the Endangerment Finding is not vacated, whether the current administration likes it or not, it is certain that electric utility, automotive and many other industries will face ongoing EPA CO2 regulation.

This scientifically illiterate basis for regulation will raise U.S. energy prices, thereby reducing economic growth and jobs and weakening national security.

February 20, 2018

On February 9, 2018, the Concerned Household Electricity Consumers Council (CHECC) submitted a fifth Supplement to its Petition to provide additional new, highly relevant and credible information relating to variables other than temperature that describe the Earth's climate system. With each of EPA's three Lines of Evidence purporting to support its 2009 Endangerment Finding already shown, in the CHECC petition and its first 2 Supplements, to be invalid, EPA has no proof whatsoever that CO2 has had a statistically significant impact on global temperatures.

The Council's original Petition and First Supplement to Petition demonstrated that the Endangerment Finding is nothing more than a set of assumptions that have each been disproved by the most relevant empirical evidence from the real world. The original Petition was substantially based on a major peer-reviewed 2016 scientific paper by James Wallace, John Christy and Joseph D'Aleo (Wallace 2016) that analyzed the best available temperature data sets and "failed to find that the steadily rising atmospheric CO2 concentrations have had a statistically significant impact on any of the 13 critically important tropical and global temperature time series data sets analyzed." The full text of Wallace 2016 may be found here.

First Supplement to Petition was substantially based on a new April 2017 peer reviewed scientific paper, also from the same authors (Wallace 2017A). Wallace 2017A can be found here. Wallace 2017A concluded that once impacts of natural factors such as solar, volcanic and ENSO activity are accounted for, there is no “natural factor adjusted” warming remaining to be attributed to rising atmospheric CO2 levels.

The Second Supplement to the Petition relied on a third new major peer-reviewed scientific paper from James Wallace, Joseph D'Aleo and Craig Idso, published in June 2017 (Wallace 2017B). Wallace 2017B analyzes the GAST data issued by the U.S. agencies NASA and NOAA, as well as by the British group Hadley CRU. (Wallace 2017B can be found here.) In this research report, past changes in the previously reported historical data are quantified. It was found that each new version of GAST has nearly always exhibited a steeper warming linear trend over its entire history, and this result was nearly always accomplished by each entity systematically removing the previously existing cyclical temperature pattern. This was true for all three entities providing GAST data: NOAA, NASA and Hadley CRU.

The Second Supplement to Petition states: Adjustments that impart an ever-steeper upward trend in the data by removing the natural cyclical temperature patterns present in the data deprive the GAST products from NOAA, NASA and Hadley CRU of the credibility required for policymaking or climate modeling, particularly when they are relied on to drive trillions of dollars in expenditures.

The invalidation of the adjusted GAST data knocked yet another essential pillar out from under the lines of evidence that are the claimed foundation of the Endangerment Finding. As the Second Supplement to Petition stated: It is therefore inescapable that if the official GAST data from NOAA, NASA and Hadley CRU are invalid, then both the "basic physical understanding" of climate and the climate models will also be invalid. The scientific invalidity of the Endangerment Finding becomes more blindingly obvious and undeniable with each day's accumulation of reliable empirical data, and with the willingness of more scientists to come forward with such new evidence. Perhaps recognizing this fact, climate alarmists have over time shifted their focus from global warming, to climate change, to simply fear of carbon. Thus, this research sought to determine the credibility of ten very frequently cited climate alarmist claims. Above are rebuttals to each of these ten typical claims. The rebuttal authors are all recognized experts on their topics, and each rebuttal demonstrates the claim's fallacy by merely citing the most credible empirical data.

The Conclusion of the Fifth Supplement

The invalidation of the three lines of evidence upon which EPA attributes global warming to human GHG emissions breaks the causal link between human GHG emissions and global warming. This in turn necessarily breaks the causal chain between human GHG emissions and the alleged knock-on effects of global warming, such as loss of Arctic ice, increased sea level, and increased heat waves, floods, droughts, hurricanes, tornadoes, etc.

Nevertheless, these alleged downstream effects are constantly cited to whip up alarm and create demands for ever tighter regulation of GHG emissions involving all fossil fuels, not just coal. EPA explicitly relied on predicted increases in such events to justify the Endangerment Finding. But there is no evidence to support such Alarmist Claims, and copious empirical evidence that refutes them. The enormous cost and essentially limitless scope of the government’s regulatory authority over GHG emissions cannot lawfully rest upon a collection of scary stories that are conclusively disproven by readily available empirical data.

The scientific invalidity of the Endangerment Finding becomes more blindingly obvious and undeniable with each day’s accumulation of reliable empirical data. It is time for an honest and rigorous scientific re-evaluation of the 2009 CO2 Endangerment Finding. The nation has been taken down a tragically foolish path of pointless GHG/CO2 regulations and wasteful mal-investments to “solve” a problem which does not actually exist. Our leaders must summon the courage to acknowledge the truth and act accordingly.

The legal criteria for reconsidering the Endangerment Finding are clearly present in this case. The scientific foundation of the Endangerment Finding has been invalidated. The parade of horrible calamities that the Endangerment Finding predicts, and that a vast program of regulation seeks to prevent, has been comprehensively and conclusively refuted by empirical data. The Petition for Reconsideration should be granted.

The Council brought its Petition because the Obama-era greenhouse gas regulations threaten, as President Obama himself conceded, to make the price of electricity “skyrocket.” But clearly CO2 regulation does not just raise electricity prices; it raises all fossil fuel prices. America can have, and must have, the lowest possible energy costs in order to attain and maintain its energy, economic and national security.

Media Contacts:

Harry W. MacDougald
Caldwell Propst & DeLoach LLP
Two Ravinia Drive, Suite 1600
Atlanta, Georgia 30346
(404) 843-1956

Francis Menton
Law Office of Francis Menton
85 Broad Street, 18th floor
New York, New York 10004
(212) 627-1796

March 4, 2018 Posted by | Deception, Economics, Science and Pseudo-Science | | 1 Comment

FOX News Cuts Off Reporter When She Links Psychotropic Drugs to Florida Shooter

By Matt Agorist | Free Thought Project | February 17, 2018

Stephen Paddock, Omar Mateen, Gavin Long, Eric Harris, Dylan Klebold, James Holmes, and now Nikolas Cruz all have one thing in common other than the mass murders they carried out. They were all reportedly taking prescription drugs that alter the state of mind and carry a host of negative side effects, ranging from aggression and suicide to homicidal ideation.

Suicide, birth defects, heart problems, hostility, violence, aggression, hallucinations, self-harm, delusional thinking, homicidal ideation, and death are just a few of the side effects caused by the medication taken by the monsters named above, some of which are known as SSRIs (selective serotonin reuptake inhibitors), or antidepressants.

There have been 150 studies in 17 countries on antidepressant-induced side effects. There have been 134 drug regulatory agency warnings from 11 countries and the EU warning about the dangerous side effects of antidepressants.

Despite this deadly laundry list of potential reactions to these medications, their use has skyrocketed by 400% since 1988. Coincidentally, as antidepressant use went up, so did mass shootings.

The website has been documenting the link between selective serotonin reuptake inhibitors (SSRIs) and violence. On the website is a collection of over 6,000 stories that have appeared in local media (newspapers, TV, scientific journals) in which prescription drugs were mentioned and in which the drugs may be linked to a variety of adverse outcomes including most of the mass shootings which have taken place on US soil.

As the Citizens Commission on Human Rights notes, before the late nineteen-eighties, mass shootings and acts of senseless violence were relatively unheard of. Prozac, the most well-known SSRI (selective serotonin reuptake inhibitor) antidepressant, was not yet on the market. When Prozac did arrive, it was marketed as a panacea for depression, which resulted in huge profits for its manufacturer, Eli Lilly. Of course, other drug companies had to create their own cash cows and followed suit by marketing their own SSRI antidepressants.

Subsequently, mass shootings and other violent incidents started to be reported.  More often than not, the common denominator was that the shooters were on an antidepressant, or withdrawing from one.  This is not about an isolated incident or two but numerous shootings.

The issue of psychotropic medication playing a role in mass shootings is not some conspiracy theory. It is very real, and the drug manufacturers list these potentially deadly side effects on the package inserts of every one of these drugs. But the mainstream media and the government continue to ignore or suppress this information. Why is that?

In a clear example of how beholden the mainstream media are to the pharmaceutical companies that manufacture and market these drugs, FOX News’ Sean Hannity was recorded this week blatantly cutting off a reporter who dared to mention Nikolas Cruz’s reported association with antidepressants.

In a news segment this week, Hannity was interviewing radio talk show host Gina Loudon, who tried to bring up Cruz’s association with SSRIs.

“I think we have to take a hard look at one thing we’re not talking about yet too, Sean, and that is psychotropic drugs,” Loudon says.

“My guess is, we’ll find out like most of these shooters…,” she says, just before Hannity jumps in to silence her.

Hannity then shuts Loudon down and moves to the doctor next to her. Just like that, all talk implicating big pharma’s role in mass shootings was effectively silenced.

It is no secret that the pharmaceutical industry wields immense control over the government and the media. It is their control which keeps any negative press about their dangerous products from airing. However, most people likely do not know the scope of this control.

As Mike Papantonio, attorney and host of the international television show America’s Lawyer, explains, with the exception of CBS, every major media outlet in the United States shares at least one board member with at least one pharmaceutical company. To put that into perspective: These board members wake up, go to a meeting at Merck or Pfizer, then they have their driver take them over to a meeting with NBC to decide what kind of programming that network is going to air.

In the report below, Papantonio explains how the billions of dollars big pharma gives to mainstream media outlets every year is used to keep them subservient and complicit in covering up the slew of deadly side effects from their products.

How much longer will we allow these billion-dollar drug companies to control the narrative and not let this conversation take place? How many more mass shootings will take place before Americans wake up to this reality?

February 18, 2018 Posted by | Corruption, Full Spectrum Dominance, Science and Pseudo-Science, Timeless or most popular, Video | , | 1 Comment

Big Pharma Still Tries to Push Dangerous Drug Class

By Martha Rosenberg | CounterPunch | February 16, 2018

Bisphosphonate bone drugs are among the most harmful and misrepresented drug classes still on the market. But that has not stopped Pharma-funded medical associations like the American Society of Bone and Mineral Research, the National Osteoporosis Foundation and the National Bone Health Alliance from periodically wringing their hands over low sales. [1]

This week the New York Times repeats the industry lament. “Currently, many people at risk of a fracture — and often their doctors — are failing to properly weigh the benefits of treating fragile bones against the very rare but widely publicized hazards of bone-preserving drugs, experts say,” it writes. Hip fractures among women 65 and older on Medicare are rising, says the piece, and Medicare reimbursements for bone density tests are falling. “Doctors who did them in private offices could no longer afford to [do them] which limited patient access and diagnosis and treatment of serious bone loss,” says a doctor quoted in the article, which sounds like a Pharma plea for taxpayer funding.

But here is the back story.

The first bisphosphonate bone drug approved for osteoporosis, Merck’s Fosamax, received only a six-month review before FDA approval. When its esophageal side effects were revealed, the FDA tried to unapprove it, but Merck got the FDA to settle for a warning label that told patients to sit upright for an hour after taking the drug. Six months after Fosamax was approved, there were 1,213 reports of adverse effects, including 32 patients hospitalized for esophageal harm. One woman who took Fosamax but remained upright for only thirty minutes was admitted to the hospital with “severe ulcerative esophagitis affecting the entire length of the esophagus” and had to be fed intravenously, according to the New England Journal of Medicine (NEJM).

Soon bisphosphonates (which include Boniva, Actonel and Zometa) were shown to weaken, not strengthen, bones by suppressing the body’s bone-remodeling action. Yes, bone loss is stopped, but because the bone is not renewed it becomes brittle, ossified and prone to fracture. More than a decade ago, articles in the NEJM, the Annals of Internal Medicine, the Journal of Clinical Endocrinology & Metabolism, the Journal of Orthopaedic Trauma and Injury warned of the paradoxical drug results. One-half of the doctors at a 2010 American Academy of Orthopaedic Surgeons annual meeting presentation said they’d personally seen patients with bisphosphonate-compromised bone. “There is actually bone death occurring,” said Phuli Cohan, MD, on CBS about a woman who’d been on Fosamax for years.

By 2003, dentists and oral surgeons had found that after simple office dental work, the jawbone tissue of patients taking bisphosphonates would sometimes not heal but instead become necrotic and die. Patients had received no warnings, though Merck had known about the jawbone effects from animal studies since 1977.

“Up to this point, this rare clinical scenario was seen only at our centers in patients who had received radiation therapy and accounted for 1 or 2 cases per year,” said the authors of an article titled “Osteonecrosis of the Jaws Associated with the Use of Bisphosphonates: A Review of 63 Cases,” published in the Journal of Oral and Maxillofacial Surgery.

Despite reports of ulcerative esophagitis, bone degradation, fractures and jawbone death, Merck aggressively promoted Fosamax. It hired researcher Jeremy Allen to plant bone scan machines in medical offices across the country to drive sales and to push through the Bone Mass Measurement Act, which made bone scans Medicare-reimbursable, paid for by you and me. Hopefully that is changing.

Blaming hip fractures on not enough people taking bisphosphonates is not a new tactic for Pharma. It blamed increasing suicides on not enough people taking antidepressants (even when as much as a fourth of the population takes antidepressants). Get ready for Pharma to blame obesity on not enough people taking prescription obesity drugs. The ruse is even more dishonest because many popular drugs people are taking, like GERD medications, really do thin bones. First, do no harm.


[1] According to the British Medical Journal, the National Osteoporosis Foundation is funded by Bayer Healthcare, Lane Laboratories, Mission Pharmacal, Novartis, Pharmavite, Pfizer, Roche, Warner Chilcott and Eli Lilly. The American Society for Bone and Mineral Research is funded by Pfizer and Eli Lilly. The National Bone Health Alliance is a public-private partnership that is an offshoot of the National Osteoporosis Foundation.

Martha Rosenberg is an investigative health reporter. She is the author of  Born With A Junk Food Deficiency: How Flaks, Quacks and Hacks Pimp The Public Health (Prometheus).

February 16, 2018 Posted by | Deception, Science and Pseudo-Science | , , , , | Leave a comment

Canadian judge dismisses all charges in lawsuit brought against Dr. Tim Ball by BC Green Party leader Andrew Weaver

Dr. Ball note to Climate Depot – February 13, 2018:

There are no media reports and my guess is there won’t be any.

At 0930 on the day the trial started, we were told there was no judge or courtroom assigned. Amazingly and incorrectly, that information was reported almost immediately in the media, which claimed the trial was postponed. It wasn’t: by 1100 a judge and courtroom were assigned, and the trial began at 1130. The postponement story likely explains why no media attended a single day of the three-week trial. The nature of the case, which involves a so-called climate change denier, will likely also be ignored.

The trial was the only one adjudicated so far of the three lawsuits I received from the same lawyer, Roger McConchie, on behalf of three individuals, all members of the Intergovernmental Panel on Climate Change (IPCC).

The first was filed on behalf of Gordon McBean, a former Assistant Deputy Minister at Environment Canada. He chaired the founding meeting of the IPCC in 1985.

The second was from Professor Andrew Weaver, computer modeller and author on four of the IPCC Reports (1995, 2001, 2007 and 2013).

The third, filed nine days after the Weaver trial, was on behalf of Michael Mann, whose “hockey stick” graph dominated the 2001 IPCC Report and became what Professor Ross McKitrick called the “poster child of global warming.”

McConchie also filed lawsuits against the publication in each case, which created confusion and conflict as they wanted to settle.

In the McBean case my wife and I decided not to fight because of the legal cost involved. We simply withdrew the article.

When we received the Weaver lawsuit, we decided we would not be bullied into silence by what we considered to be a SLAPP (Strategic Lawsuit Against Public Participation), and we spent all our savings on legal fees before John O’Sullivan helped us set up a website and a PayPal donation tab.

We later learned that the publication, Canada Free Press (CFP), had accepted and published an apology written by McConchie. I was not consulted or even informed that this was happening. Meanwhile we had hired Michael Scherr, a defamation lawyer with Pearlman Lindholm in Victoria BC.

The Mann trial was scheduled for February 20, 2017. About a month before the trial, Mann requested an adjournment. Apparently Canadian courts always grant an adjournment before a trial begins in the hope of an out-of-court settlement. I was opposed but had little choice.

The Mann case is interesting because it was filed in the Supreme Court of British Columbia (BC) by an American citizen from Pennsylvania over something I said after a public presentation in Winnipeg, Manitoba, about the deception of manmade global warming. BC had anti-SLAPP legislation but for some reason cancelled it. Now only two of the ten Canadian provinces do not have anti-SLAPP legislation; the other is Ontario.

By the summer of 2017 a date for the Weaver trial was set, and it was held over three weeks in November in Vancouver, Canada. Between the filing of the lawsuit and the commencement of the trial, Weaver was elected as a Green Party member of the BC Legislature; by the time of the trial he was the Green Party leader, in his second term. The theme of the article over which he sued me for defamation was the claim that the political hijacking of climatology by the IPCC set back climate research and understanding by 30 years. In the article I made comments about an interview and experience I had with Weaver that I did not fully substantiate. I wrote a letter of apology for those unsubstantiated comments, but not for the overall claims of the article. Weaver posted my letter of apology on what he labelled a “wall of hate” in his university office. It appears just under his left arm in the photo at the link below.

Here is a newspaper article that shows Weaver in front of his wall of hate, apparently designed to show who attacked him and how nasty the attacks were that he sustained because of his views on global warming and his attempts to save the planet.

The judge ruled that Weaver was not defamed by me and dismissed the claim completely. This was after almost seven years and thousands of dollars in legal costs.
Now we prepare to bring the Mann case back to the court.

February 14, 2018 Posted by | Civil Liberties, Science and Pseudo-Science | , | 1 Comment

Searching for the Catastrophe Signal: The origins of the Intergovernmental Panel on Climate Change

Review by Martin Kokus | February 7, 2018

Searching for the Catastrophe Signal: The Origins of the Intergovernmental Panel on Climate Change by Bernie Lewin.
Published by the Global Warming Policy Foundation. Paperback $16.00, Kindle $7.00. Available from Amazon.

This book is a must-read for those interested in the current climate debate and its origins. The book does not argue the science so much as it challenges the narrative of the “consensus.” It challenges the popular notion that the primary drivers of climate change are greenhouse gases and that the theory originated in climate and environmental science departments. One cannot read the book without concluding that the theory originated not in those departments but in the national nuclear labs of the United States government. Lewin’s is the first book I have read on the subject that is compatible with the history of the modern theory of catastrophic anthropogenic global warming that I lived through.

In 1973 I hoped to dedicate my life to studying human impacts on climate and weather. I went to the University of Virginia, which had perhaps the only department in the US actively studying the subject. My research concerned the lower atmosphere and the effect that changes in its heat capacity and albedo had on atmospheric circulation. I took what I believe was the first course offered on human impacts on climate, titled Urban Meteorology, which was taught by Roger Pielke and Mike Garstang. We spent many hours discussing the effects of deforestation, desertification, aerosols and urbanization on climate. We did not spend much time on the greenhouse effect: estimates of the effect were small compared to the other effects, and the planet was not warming.

There are many things which could cause the climate to change. There is the natural variation of the sun and a periodic variation of volcanic dust. Human industry can throw smoke into the atmosphere which clouds out the sun’s energy. Cutting, draining, plowing, and paving can change the amount of energy the earth absorbs and how fast it heats up and cools off. This was the subject of decades of research, strong correlations, and reasonable models, most of which are now ignored.

The first time I heard a positive discussion of the theory that CO2 could catastrophically change earth climate, it was from speakers sponsored by the Nuclear Engineering department.  Their motivation was obvious.

Lewin describes how the funding for the study of non-greenhouse gas mechanisms of climate change was cut while funding for the study of greenhouse gas effects was increased. I lived through this and I appreciate that someone finally wrote it down.

So I thank Bernie Lewin for assembling an accurate history of the climate debate.

February 7, 2018 Posted by | Book Review, Environmentalism, Nuclear Power, Science and Pseudo-Science, Timeless or most popular | Leave a comment

Cancer: Monsanto knew glyphosate could cause it

RT America | February 2, 2018

Mike Papantonio and author Carey Gillam discuss her new book, which reveals how Monsanto viciously worked to cover up the fact that its weed-killer could cause cancer.

February 5, 2018 Posted by | Book Review, Corruption, Deception, Environmentalism, Science and Pseudo-Science, Timeless or most popular, Video | | 3 Comments

Drinking the Self-driving Car Kool-aid

By Othello | Dissident Voice | January 27, 2018

Recently, a Tesla on autopilot slammed into a parked fire engine at 65 mph. It turns out that there was no malfunction. According to Tesla’s manual:

Traffic-Aware Cruise Control cannot detect all objects and may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object is in front of you instead.

So whereas any halfway decent human driver would have braked and/or swerved to avoid the collision, Tesla’s “smart” car proceeded full speed ahead.

Even if you choose not to buy a self-driving car, you or your loved ones could have been in that parked vehicle struck by a stupid “smart” car. This is not just about technophiles who want to be able to play World of Warcraft while speeding down the highway… this technology is potentially dangerous to all road users and any deaths, injuries or property damage caused by this flawed technology should see the drivers, manufacturers and approving authorities prosecuted or sued… no high-tech exemption!

It is not only Tesla; according to the Wired article referred to at the start of this article:

Volvo’s semi-autonomous system, Pilot Assist, has the same shortcoming. Say the car in front of the Volvo changes lanes or turns off the road, leaving nothing between the Volvo and a stopped car. Pilot Assist will ignore the stationary vehicle and instead accelerate to the stored speed.

The article explains why these self-driving systems are engineered that way but blithely promises that in the future LIDAR (light detection and ranging, which uses lasers) will replace and/or augment radar and cameras to solve this problem. However, one can discern the real agenda when it informs us that:

Lidar’s price and reliability problems are less of an issue when it comes to a taxi-like service, where a provider can amortize the cost over time and perform regular maintenance. But in today’s cars, meant for average or modestly wealthy consumers, it’s a no-go.

Self-driving cars are a promising new profit center for auto and technology companies. They want to own personal and commercial road transportation, which they will provide as a service (at a tidy profit, of course). Using flawed statistics, they repeatedly argue that the technology is safer than human drivers, even as self-driving cars cause fatal accidents because the car’s cameras failed to distinguish the white side of a turning tractor-trailer from a brightly lit sky, or knock over motorcyclists.

There is a general love-fest for things regarded as cool technology. However, unlike the great innovations that have made driving safer, such as ABS, ESP, collision-avoidance systems and air bags, the real intent of self-driving cars seems to be creating a new industry that will be dominated by auto and tech giants who would ultimately control all road traffic… a truly huge potential market.

You probably didn’t hear about the conclusions of Germany’s Highway Research Institute (BASt) that:

After many thousands of kilometers of testing, BASt reportedly concluded that Autopilot represents a significant traffic hazard. Judging that it was not designed for complex urban traffic situations, the report declared that the car’s sensors are too short-sighted to cope with the reality of German motorways.

Or that:

American research conducted by John F. Lenkeit of Dynamic Research concludes that forward collision warning systems for automobiles fail dramatically to detect motorcycles.

Before concluding that self-driving cars are an inevitable part of a rosy future one should read an article like The “Self-Driving” Car is only an Oxymoron. In it you might learn that:

… in the first week of March, Uber’s 43 test cars in three states logged some 20,000 miles on public roads. Their drivers had to intervene and take control away from the software, an average of once every mile. Critical interventions, required to save lives and property, were counted separately; they occurred every 200 miles.
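Taken at face value, those figures imply that in that single week Uber’s drivers took over on the order of 20,000 times (20,000 miles at roughly one intervention per mile), of which about 100 were critical (20,000 miles divided by one critical intervention every 200 miles).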

In a world where millions would love to have the job of driver and where training and technology geared towards supporting safe driving provide accessible solutions to improving road safety, self-driving cars seem to be of dubious value and downright dangerous as well.

January 27, 2018 Posted by | Deception, Economics, Science and Pseudo-Science, Timeless or most popular | | 1 Comment

Bad Weather Is No Reason For Climate Alarm

GWPF | Jan 24, 2018

2017 seemed like a year full of bad weather news. But a deeper look at the global data suggests that attempts to link the last year’s extreme weather to climate change are misleading.

January 25, 2018 Posted by | Deception, Science and Pseudo-Science, Timeless or most popular, Video | 1 Comment

Hottest Week Of The Year – All Of Antarctica Below Freezing

By Tony Heller | Real Climate Science | January 24, 2018

This week is the hottest week of the year in Antarctica, and the entire continent is below freezing. In the map below, I have masked out all above-freezing temperatures.

(Map: Climate Reanalyzer)

Meanwhile, our fake news and fake science organizations tell us Antarctica is melting down, and it is bad news.

A huge part of Antarctica is melting and scientists say that’s bad news – CNN

Experts also say refugees will be forced to flee to Antarctica before 2030.

Climate change study predicts refugees fleeing into Antarctica – Telegraph

January 25, 2018 Posted by | Deception, Fake News, Mainstream Media, Warmongering, Science and Pseudo-Science | | 2 Comments

The Lancet Accused Of Sacrificing The Poor On Pollution

By Paul Homewood | Not A Lot Of People Know That | January 18, 2018

I have highlighted much of the mendacious nonsense coming out of the Lancet concerning climate change and pollution issues.

Now a hard-hitting report by Mikko Paunio, a specialist in public health matters, has destroyed both the credibility and the integrity of two of the Lancet’s recent papers on pollution.

The GWPF, which commissioned the paper, reports:

London, 18 January: A pair of influential reports published by the medical journal, The Lancet, are a “gross distortion” of public health science and threaten to devastate public health in the developing world. That is the warning by eminent epidemiologist Mikko Paunio.

The Lancet Commissions on Pollution and Health have claimed that the third world is suffering appalling health effects from industrial pollution. But as Professor Paunio explains, this is far from the truth:

“Most of the deaths that they say are caused by industrial air pollution are actually caused by domestic heating and cooking with renewable energy such as wood and dung, and most of the deaths from diarrhea that they say are caused by polluted water are actually caused by poor hygiene because the poor do not have enough water for washing.”

Professor Paunio also says that the Lancet Commissions’ proposal for a ban on new fossil-fueled power stations will be devastating for human health:

“To prevent most of the deaths from diarrhea, you need abundant water supplies, and that depends on having a reliable electricity grid, which can only come from fossil fuels. Clean air depends on centralised power generation in large power stations.”

Dr Paunio has set out his position in a hard-hitting report published by the Global Warming Policy Foundation (GWPF) this week, just ahead of an important meeting of the World Health Organization Executive, which is expected to consider the Lancet Commission’s proposals.

“Professor Paunio writes clinically and factually to demonstrate the errors, exaggerations, distortions, misquotations and suppressions of established evidence which pervade The Lancet reports. His facts and arguments are vitally important and should be widely read,” writes former Labour minister Lord Donoughue in his foreword.

Dr Paunio is a former government scientist at the European Commission and the World Bank. He works at the health ministry in Finland and is an adjunct professor at the University of Helsinki. He is best known as one of the first scientists to speak out against Andrew Wakefield’s claims, also published in The Lancet, about the MMR vaccine and autism.

Full paper (pdf)

Lord Donoughue has written a very pertinent foreword to the paper:

Professor Paunio has enjoyed a distinguished career in global public health, both in Europe and the USA. He has a proven record of countering medical falsehoods, based more on environmental propaganda than on scientific evidence. He certainly adds to that reputation in this hard-hitting and evidence-based paper. It focusses on two recent reports published (to its discredit) in the medical journal The Lancet. They have been widely quoted in the British Parliament and in the popular media. They were predictably trumpeted by climate alarmists at the 23rd UN Convention on Climate Change, clearly their target political audience.

The reports’ conclusions are supportive of the familiar climate-campaign claims that industrial development, and especially pollution derived from coal-fired power generation, are the main cause of much ill health and mortality in the world. Their political purpose is to convince global policy makers to take radical environmental action, for example by regulating and restructuring our energy economy, however inefficiently and expensively, in order to serve the noble cause of saving lives and improving health. There may be a case for that, if based on scientific facts, but Professor Paunio shows that The Lancet does not respectably advance that cause.

The Lancet’s political activism is apparently part of a wider political environmental campaign to blame almost any issue of current public and media concern on climate change (which is happening and always has): mass migration, floods, droughts, storms (now conveniently named to make a greater impact on public memory), and (allegedly) disappearing animal species such as Al Gore’s polar bears – now interestingly at a near peak of population. Professor Paunio writes clinically and factually to demonstrate the errors, exaggerations, distortions, misquotations and suppressions of established evidence which pervade The Lancet reports. Focussing on their misrepresentation of the latest factual evidence relating to the health factors involving air pollution and water supplies, he demonstrates how the main cause of global pollution deaths is open-fire cooking and heating in the less-developed world, which causes ten times as much health damage in China and India as do their coal-fired power plants, which the climate alarmists so hate.

He also points out that global health has in fact dramatically improved during the past nearly two centuries of modest global warming. This is mainly due to economic development, and especially to improvements in institutional health provision in the developed world, something which the climate alarmists choose to ignore since it does not fit their ideological position.

Interestingly in this debate, it should be noted that modest global warming of the degree we have enjoyed is actually less health-threatening than global cooling. Warming does not significantly increase mortality; it does reduce temperature-related deaths. It is officially estimated that in the UK only 3 deaths per 100,000 of the population are heat-related. However, 61 deaths per 100,000, twenty times as many, are cold-related. So a cooling cycle, should it reappear, would be intrinsically more threatening to health than a warming one. This is not just in the UK: Stanford University research estimates that a warming of 2.5°C would reduce mortality in the USA by 40,000 deaths a year and so greatly reduce medical costs.

Most global ill health and mortality derives not from industrial development and related climate matters, but from underdevelopment, especially domestic pollution and the malnutrition that can render it fatal. This does not mean that there are not serious concerns over climate change, where properly evidenced. But they should be addressed rationally, not dogmatically.

Professor Paunio’s well researched paper shows that The Lancet’s concerns are not properly evidenced. His facts and arguments are vitally important and should be widely read, especially by policy makers and media commentators, not just for exposing the particular falsehoods in the reports, but also for demonstrating the dangers lying in the wider climate change debate of political groupthink.

Bernard Donoughue MA, D.Phil (Oxon)
Senior Policy Adviser to the Prime Minister 1974–79
Minister for Farming and Food 1997–99

January 18, 2018 Posted by | Science and Pseudo-Science, Timeless or most popular | | Leave a comment