
Statin Drugs – The Real Reason Official Guidelines Still Demonize Fats Despite the Evidence?

By Gabriela Segura, M.D. | Sott.net | January 15, 2018

Nina Teicholz, investigative journalist and author of the international bestseller The Big Fat Surprise, wrote an article for the BMJ (formerly the British Medical Journal) in September 2015 making the case that the scientific advice underpinning the Dietary Guidelines is inadequate (Teicholz, 2015). The title of the article was “The scientific report guiding the U.S. dietary guidelines: is it scientific?” Ian Leslie, writing for The Guardian, reports that the response of the nutrition establishment was ferocious: 173 scientists – some of whom were on the advisory panel, and many of whose work had been critiqued in Teicholz’s book – signed a letter to the BMJ demanding that it retract the piece (Leslie, 2016). Prominent cardiovascular and nutrition scientists from 19 countries called for the retraction. However, to this day, the article remains published. The BMJ has officially announced that it will not retract the peer-reviewed investigation, stating that two independent experts conducted formal post-publication reviews of the article and found no grounds for retraction (Sboros, 2016).

Yet behind every mainstream medical practice, questionable guidelines are still followed faithfully every day. Doctors are still chasing cholesterol targets that are often unattainable without cholesterol-lowering drugs, though many do try to achieve them first with the extremely low-fat diets irresponsibly recommended in dietary guidelines.

Unfortunately, the rest of the world has followed suit on these dietary changes. Traditional high-fat foods have been given up for the low-fat scam. Promoters of the highly touted Mediterranean diet, with its olive oil and ‘low animal fat’, fail to mention that fat-laden recipes are still passed from generation to generation among the Mediterranean peoples. Lardo di Colonnata, with its cured strips of fatback, herbs and spices; Greek barbecue, which often involves an entire lamb roasted on a spit; or kokoretsi, made from the internal organs of the lamb – liver, spleen, heart, glands – threaded onto skewers along with the fatty membrane from the lamb’s intestines: all of these are foods of the long-lived Mediterranean people. Yet the ‘American-style Mediterranean diet’ selectively picks foods from the diet of the Mediterranean people to paint the picture its promoters desire. Ironically, many Mediterranean people have now adopted this Americanized version of the ‘Mediterranean diet’.

The truth is that cholesterol is a substance our bodies make naturally, and it’s absolutely essential to our health. Cholesterol is so crucial that the body produces some 1000-1400 milligrams of it each day, mainly in the liver. Cholesterol is also synthesized to a smaller extent in the adrenal glands, intestines, reproductive organs, etc.

We are told by the “Official Thought-Control Institutions” to limit consumption to less than 300 milligrams of cholesterol per day, but our liver’s production of cholesterol is controlled by a feedback mechanism based on how much we eat. If we eat a lot of cholesterol, we produce less, leaving much-needed liver energy for other important tasks. If we eat little cholesterol, replacing it with carbohydrates and vegetable oils, then the body will produce cholesterol from these dietary raw materials. However, a diet high in carbohydrates and vegetable oils yields a very poor cholesterol profile even when total cholesterol is in the normal range. And if we hardly eat any cholesterol and also block its production with cholesterol-lowering drugs, then we are limiting the supply of something the body desperately needs for its proper function. Yet statins, the cholesterol-lowering drugs, are among the most profitable drugs in the history of the world.

Restricting or eliminating cholesterol in the diet overburdens the liver, which now has to overproduce it, via the enzyme HMG-CoA reductase, from other dietary raw materials. This enzyme is the one blocked by statin drugs in order to lower the amount of cholesterol the body produces. But, as with all pharmaceuticals, that comes with a price. HMG-CoA reductase is also needed for the creation of coenzyme Q10 (CoQ10), a key nutrient for energy production in our cells and a major antioxidant. People commonly complain of muscle cramps or aches while on statin drugs. Keep in mind that your heart is a muscle as well. Coincidence or not, the incidence of congestive heart failure has spiked during the time statins have been a top-selling drug. Even if statin drugs are not at fault for the increased prevalence of congestive heart failure over recent decades, we certainly don’t want to decrease CoQ10 levels in a failing heart.

Coenzyme Q10 – also called ubiquinone, which means ‘occurring everywhere’ – plays an important role in the manufacture of ATP, the fuel of our cells. It is present in every cell of our bodies, especially in the very active cells of our hearts. Depriving the heart of CoQ10 is like removing the spark plug from an engine: it just won’t work. Low levels of CoQ10 are involved in practically all cardiovascular diseases, including angina, hypertension, cardiomyopathy and congestive heart failure (Sarter, 2002). It is ironic that statins, prescribed for “heart health”, block the production of coenzyme Q10.

Statins’ many potential side effects include depression, confusion, memory problems and an inability to concentrate. They hinder the body’s ability to fight microbes, and they are linked to liver damage, increased risk of cancer, fatigue, impotence, kidney failure, rhabdomyolysis (destruction of muscle cells) and shortness of breath, among other things (for a database on statin adverse effects, see here). Cholesterol levels below 150 mg/dL may increase the risk of cancer, hormonal imbalances, depression, sexual dysfunction, memory loss, Parkinson’s disease, type 2 diabetes, stroke, suicide, and violent behavior.

As scientists are beginning to understand the intricacies of cholesterol’s role in the function of our trillions of cell membranes, including the details of nutrient transport across membranes, they are starting to realize what a bad idea this whole statin business is. Well, some of them are, anyway. According to some researchers:

“Current guidelines encourage ambitious long term cholesterol lowering with statins, in order to decrease cardiovascular disease events. However, by regulating the biosynthesis of cholesterol we potentially change the form and function of every cell membrane from the head to the toe. As research into cell morphology and membrane function realises more dependencies upon cholesterol rich lipid membranes, our clinical understanding of long term inhibition of cholesterol biosynthesis is also changing.” (Wainwright, Mascitelli, & Goldstein, 2009, p. 289)

We make highly unstable and dysfunctional cell membranes when we restrict organic animal fats. This harmful effect has far reaching consequences. And doctors, unfortunately, don’t seem to be receiving this information.

The past decade of research has shown the importance of cholesterol-rich membranes and their fundamental implications for our brain and nervous tissue, the immune system and all areas where lipoproteins are created, secreted, delivered and utilized. Cholesterol is so vital to the formation and correct operation of the brain that neurons require additional cholesterol, secreted for them by supporting brain cells. No wonder some people lose their memories on statin therapy!

Statin drugs also impair the secretion of new myelin, the fatty coating that covers the nerve cells and facilitates their firing. The connection between cholesterol and its fundamental role in the immune system and in the cell membrane should also be kept in mind when it comes to autoimmune diseases.

Modern guidelines say that it is desirable to have a total cholesterol level of less than 200 mg/dL. When I was in medical school, which was not that long ago, the upper limit was 240 mg/dL. Once upon a time, it was 280 mg/dL. Apparently, in 1970, the rule of thumb for a healthy serum cholesterol was in the 200-plus range. Now most doctors try to keep cholesterol below 200, which most people find very difficult (if not impossible) to achieve through diet and lifestyle changes alone. Statin drugs like Lipitor have since become among the all-time top-selling drugs in history (Angell, 2005).

The European guidelines on cardiovascular disease prevention in clinical practice (Piepoli et al., 2016) recommend that very high-risk patients lower their LDL cholesterol to less than 70 mg/dL (<1.8 mmol/L) or achieve “a reduction of at least 50% if the baseline is between 70 and 135 mg/dL (1.8 and 3.5 mmol/L).” (Ibid., p. 2331) Conveniently, pharmaceutical companies have a drug just for such a drastic reduction. For example, Orvatez by Merck, which combines ezetimibe (which blocks the absorption of cholesterol) and atorvastatin (a statin), can bring LDL cholesterol down to 50 mg/dL. Merck highlights in a chart made for doctors that if a patient has a baseline LDL cholesterol of 70 mg/dL, the target LDL should be 35 mg/dL! And I’m not the only one who sees a problem with this. As the Mayo Clinic shyly puts it:

“There is no consensus on how to define very low LDL cholesterol, but LDL would be considered very low if it is less than 40 milligrams per deciliter of blood… very low levels of LDL cholesterol may be associated with an increased risk of cancer, hemorrhagic stroke, depression, anxiety, preterm birth and low birth weight if your cholesterol is low while you’re pregnant.” (Lopez-Jimenez, 2015, para. 2-3)
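For readers juggling the two unit systems quoted above, the sketch below shows the approximate conversion used for cholesterol (1 mmol/L ≈ 38.67 mg/dL) and the very-high-risk LDL target rule as described in the European guidelines. This is an illustrative sketch only, written under those stated assumptions; the constant and function names are mine, not from any guideline or official tool.

```python
# Minimal sketch: cholesterol unit conversion and the ESC-style LDL target
# rule quoted above. Names and structure are illustrative assumptions.

MGDL_PER_MMOLL = 38.67  # approximate conversion factor for cholesterol

def mmoll_to_mgdl(mmol_l: float) -> float:
    """Convert a cholesterol value from mmol/L to mg/dL."""
    return mmol_l * MGDL_PER_MMOLL

def ldl_target_mgdl(baseline_mgdl: float) -> float:
    """Very-high-risk LDL target as described above: below 70 mg/dL,
    or a 50% reduction when the baseline is between 70 and 135 mg/dL."""
    if 70 <= baseline_mgdl <= 135:
        return baseline_mgdl * 0.5
    return 70.0

print(round(mmoll_to_mgdl(1.8)))  # ~70 mg/dL, matching the guideline figure
print(ldl_target_mgdl(70))        # 35.0 mg/dL, the figure criticized above
```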

The above-mentioned European guidelines include a disclaimer where we read the following:

“[the] Guidelines do not override, in any way whatsoever, the individual responsibility of health professionals to make appropriate and accurate decisions in consideration of each patient’s health condition and in consultation with that patient and, where appropriate and/or necessary, the patient’s caregiver. Nor do the ESC Guidelines exempt health professionals from taking into full and careful consideration the relevant official updated recommendations or guidelines issued by the competent public health authorities, in order to manage each patient’s case in light of the scientifically accepted data pursuant to their respective ethical and professional obligations. It is also the health professional’s responsibility to verify the applicable rules and regulations relating to drugs and medical devices at the time of prescription.” (Piepoli et al., 2016, p. 2315)

Since I have first-hand experience of the way research is done in Europe, and in Italy specifically, I decided to have a look at the disclosure forms of the experts involved in developing these guidelines. As it happens, there is no direct hyperlink to the disclosures from the electronic version. I found them hyperlinked, in a smaller font, as the last item of a menu on a separate page of the escardio.org website. After a while you get good at digging for these details, which very few people are trained to look for or are interested in. The declaration of interest is a 35-page PDF file which specifies that “the report below lists declarations of interest as reported to the ESC by the experts covering the period of the Guidelines production, from Task Force creation to publication.” (Available at https://www.escardio.org/static_file/Escardio/Guidelines/DOI_CVDPrevention.pdf)

That is, the declaration of interest only covers 2014 and 2015, and it is not compiled by a third party. Most of the authors have so many links to Big Pharma that their declarations can fill an entire page. The reader can have fun searching for Big Pharma sponsorship in the years not covered, both for the sponsored authors and for the few who had nothing to declare in 2014 and 2015. I challenge anyone to find even one author who, as a general rule throughout his career, chose to attend only conferences that were not financed by Big Pharma.

As Marcia Angell, Senior Lecturer in Social Medicine at Harvard Medical School and former Editor of the New England Journal of Medicine states:

If drug companies and medical educators were really providing education, doctors and academic institutions would pay them for their services. When you take piano lessons, you pay the teacher, not the other way around. But in this case, industry pays the academic institutions and faculty, and even the doctors who take the courses. The companies are simply buying access to medical school faculty and to doctors in training and practice.

This is marketing masquerading as education. It is self-evidently absurd to look to companies for critical, unbiased education about products they sell. It’s like asking a brewery to teach you about alcoholism, or a Honda dealer for a recommendation about what car to buy. Doctors recognize this in other parts of their lives, but they’ve convinced themselves that drug companies are different. That industry-sponsored education is a masquerade is underscored by the fact that some of the biggest Madison Avenue ad agencies, hired by drug companies to promote their products, also own their own medical-education companies. It’s one-stop shopping for the industry.[…]

It’s easy to fault drug companies for much of what I’ve described, and they certainly deserve a great deal of blame. Most of the big drug companies have paid huge fines to settle charges of illegal activities. Last year Pfizer pleaded guilty and agreed to pay $2.3 billion to settle criminal and civil charges of marketing drugs for off-label uses – the largest criminal fine in history. The fines, while enormous, are still dwarfed by the profits generated by these activities, and are therefore not much of a deterrent. Still, apologists might argue that, despite its legal transgressions, the pharmaceutical industry is merely trying to do its primary job – furthering the interests of its investors – and sometimes it simply goes a little too far.

Doctors, medical schools, and professional organizations have no such excuse; the medical profession’s only fiduciary responsibility is to patients and the public. (emphasis added) (Angell, 2010, para. 35-36, 39-40)

If only health care professionals at large would take a stand against the massive conflicts of interest of the pharmaceutical and food industries and their role in the corruption of medical science, it would never have come to the point where guidelines advise reducing cholesterol to levels never before seen in medical records. Another line of research would have been followed, one in which dietary and environmental factors and their role in inflammation and our health would play a greater part. Hopefully we will wake up soon; otherwise we risk a guideline recommending zero LDL cholesterol. It sounds absurd, but then, I thought an LDL target of 35 mg/dL would shock conventional practitioners into realizing the absurdity of these recommendations, and that doesn’t seem to have happened.

Statin drugs are among the most profitable drugs in the history of the world. Those profits buy a lot of propaganda: lobbyists, advertising and marketing to doctors, and free continuing medical education. Think of what even a small percentage of their massive profits could do for prevention if it were invested in public education towards a truly health promoting diet. Think of all the diseases that would essentially disappear from the face of the planet. But expecting a corporation to willingly cut off its main source of profit is a pipe dream. Even if they knew the truth about diet, it would be kept as the most tightly guarded secret in history.

It’s really not in the drug-makers’ best interest to have people making healthy dietary choices. So instead of promoting prevention strategies, cholesterol drugs continue to post record-breaking profits while creating poor health and side effects in the people taking them. Those people in poor health can then be treated with more drugs. How many people do you know on multiple medications for various ailments? Whether the cause is malfeasance or ignorance is largely irrelevant, because the result is the same.

It is only your own awareness that can turn things around. The public is gradually awakening to the fact that statins are virtually useless for the vast majority of people who take them, and yet they carry significant risks.

A group of eminent doctors including the President of the Royal College of Physicians, Sir Richard Thompson, argue in a declaration letter that a doctor making a case for these drugs can quite easily look ill-informed, biased or just plain stupid in the eyes of their patients. According to one of the letter’s signatories, Dr David Newman, Assistant Professor of Emergency Medicine and Director of clinical research at Mount Sinai School of Medicine:

I am always embarrassed when I have to tell patients that our treatment guidelines were written by a panel filled with people who stood to gain financially from their decisions. The UK certainly appears to be no different to that of the United States. The truth is, for most people at low risk of cardiovascular disease, a statin will give them diabetes as often as it will prevent a non fatal heart attack – and they won’t live any longer taking the pill. That’s not what patients are looking for. (Briffa, 2014, para. 20)

The letter was addressed to the chair of NICE, the National Institute for Health and Care Excellence in the United Kingdom. In the letter, the proposition to reduce the threshold for prescribing statins to those with a 10% risk of cardiovascular disease is rejected by addressing six major concerns (letter available at www.nice.org.uk/Media/Default/News/NICE-statin-letter.pdf):

  1. The medicalization of millions of healthy individuals
  2. Conflicting levels of adverse events
  3. Hidden data
  4. Industry bias
  5. Loss of professional confidence
  6. Conflicts of interest

So again we see guidelines being written to favor the industry and the over-medicalization of millions of people.

Ironically, the very same experts for some of these guidelines disagree, calling for expert groups such as the Adult Treatment Panel (ATP) IV to “abandon the paradigm of treating patients to LDL targets” (Hayward & Krumholz, 2012).

Blinded by the numbers, doctors will see LDL levels at 70 and say their patients are doing well. They could fail to see what might actually be in front of their eyes – an ill-looking and nutritionally deficient person. Cracking skin, plunging libido, muscle wasting, memory problems, blood sugar imbalances, premature aging – but hey, cholesterol numbers are right on the money! It is astounding to see how we as doctors do so little critical thinking, focusing only on arbitrary guidelines dictated by the same companies selling the drugs that are the only things that make the numbers possible. Talk about a collective blind spot facilitated by decades of programmed schooling. Even when a patient points out, ‘but I eat no fats and no salt and I’m getting worse!’, we might fail to connect the dots.

As if this weren’t enough, here’s another bit of irony. In one study, the use of statin drugs was associated with microalbuminuria (Van Der Tol et al., 2012). Microalbuminuria is a marker of poor endothelial function and it’s endothelial function which determines cardiovascular disease risk. Microalbuminuria is also a marker of kidney problems.

Similarly, in a study of nearly 26,000 beneficiaries of Tricare – the military health system in the United States – those taking statin drugs to control their cholesterol were 87 percent more likely to develop diabetes. The research confirmed past findings on the link between statins and diabetes risk, but it is among the first to show the connection in a relatively healthy group of people. The study included only people who at baseline were free of heart disease, diabetes, and other severe chronic disease (Veterans Affairs Research Communications, 2015).

In this same study, statin use was also associated with a very high risk of diabetes complications. Among 3351 pairs of similar patients – part of the overall study group – those patients on statins were 250 percent more likely than their non-statin-using counterparts to develop diabetes with complications (Mansi, Frei, Wang, & Mortensen, 2015). Statin users were also 14 percent more likely to become overweight or obese after being on the drugs. The study also found that the higher the dose of any of the statins, the greater the risk of diabetes, diabetes complications, and obesity. Ironically, it is those who have had a cardiovascular disease event who are prescribed higher doses of statin drugs.

Moreover, more frequent statin drug use is associated with accelerated coronary artery and aortic calcification, both of which greatly contribute to cardiovascular and all-cause mortality (Saremi, Bahn, & Reaven, 2012). An evaluation of thousands of individuals with no known cardiovascular disease who underwent coronary CT angiography, which visualizes atherosclerosis, concluded that statin use is associated with an increased prevalence and extent of calcified coronary plaques (Nakazato et al., 2012). So doctors might be prescribing a medicine that contributes to the onset of the very thing they are trying to prevent.

In the meantime, people’s hearts are becoming increasingly calcified. During heart surgery, the surgical instrument known as the ‘bone eater’ ends up being used to replace valves that should have remained silky and smooth. I know whereof I speak, having witnessed and performed thousands of open heart surgeries in three different countries.

Two top vascular surgeons have summarized statins in a damning report called “The Ugly Side of Statins. Systemic Appraisal of the Contemporary Un-Known Unknowns”. In the report they state that “the statin industry is the utmost medical tragedy of all times” and that “statins are associated with triple the risk of coronary artery and aortic calcification.” (Sultan & Hynes, 2013, pp. 180, 183)

The picture isn’t pretty. Decades of massive anti-fat propaganda have brainwashed all of us without exception. Upon being questioned about their dietary habits, patients might guiltily recall all the fats they ate and think that those are to blame for their health woes. Never mind that they eat mostly carbs, or that most of the fats they do eat are of the processed, plastic, vegetable-oil variety. On doctors’ orders, they remove the animal fat from their diets, thereby increasing the carbs and vegetable oils – the very two steps that will deteriorate their health. When and if cholesterol targets are not reached by these measures, then the doctor has ‘no choice’ but to put them on a statin drug.

There is, however, a small percentage of people who genuinely have a genetic predisposition to high blood cholesterol called familial hypercholesterolemia, a condition characterized by an impaired ability, or even an inability, to metabolize cholesterol. This condition can have serious health consequences, and sufferers may need medical interventions to bring their cholesterol levels down. But that doesn’t mean the same approach can be extrapolated to everyone who doesn’t have this genetic problem.

Medical research has not proven that lowering (or low) cholesterol in and of itself reduces risk of death from heart disease across a population (Siri-Tarino, Sun, Hu, & Krauss, 2010; Chowdhury et al., 2014). Men with very low cholesterol levels seem prone to premature death. Below 160 milligrams per deciliter (mg/dl), the lower the cholesterol level, the shorter the lifespan. These men die of cancer, respiratory and digestive diseases, and trauma (Smith, 1997). As for women, if anything, the higher their cholesterol, the longer they seem to live (Teicholz, 2014).

Despite these facts, it is estimated that by 2020, revenues from statin drug sales will reach 1 trillion dollars. Never mind that most people taking these drugs are not at risk for heart disease.

References

Angell, M. (2005). The truth about the drug companies: How they deceive us and what to do about it. New York: Random House Trade Paperbacks.

Angell, M. (2010, May 1). Big Pharma, Bad Medicine. Retrieved from http://bostonreview.net/angell-big-pharma-bad-medicine

Briffa, J. (2014, June 18). Prominent doctors declare their opposition to the planned expansion of statin prescribing. Retrieved from http://www.drbriffa.com/2014/06/18/prominent-doctors-declare-their-opposition-to-the-planned-expansion-of-statin-prescribing/

Chowdhury, R., Warnakula, S., Kunutsor, S., Crowe, F., Ward, H. A., Johnson, L., . . . Angelantonio, E. D. (2014). Association of dietary, circulating, and supplement fatty acids with coronary risk. Annals of Internal Medicine, 160(6), 398-406. doi:10.7326/m13-1788

Hayward, R. A., & Krumholz, H. M. (2012). Three reasons to abandon low-density lipoprotein targets: An open letter to the adult treatment panel IV of the National Institutes of Health. Circulation: Cardiovascular Quality and Outcomes, 5(1), 2-5. doi:10.1161/circoutcomes.111.964676

Leslie, I. (2016, April 07). The sugar conspiracy | Ian Leslie. Retrieved from https://www.theguardian.com/society/2016/apr/07/the-sugar-conspiracy-robert-lustig-john-yudkin

Lopez-Jimenez, F. (2015, October 30). Cholesterol level: Can it be too low? Retrieved from http://www.mayoclinic.org/diseases-conditions/high-blood-cholesterol/expert-answers/cholesterol-level/faq-20057952

Mansi, I., Frei, C. R., Wang, C., & Mortensen, E. M. (2015). Statins and new-onset diabetes mellitus and diabetic complications: A retrospective cohort study of US healthy adults. Journal of General Internal Medicine, 30(11), 1599-1610. doi:10.1007/s11606-015-3335-1

Nakazato, R., Gransar, H., Berman, D. S., Cheng, V. Y., Lin, F. Y., Achenbach, S., . . . Min, J. K. (2012). Statins use and coronary artery plaque composition: Results from the International Multicenter CONFIRM Registry. Atherosclerosis, 225(1), 148-153. doi:10.1016/j.atherosclerosis.2012.08.002

Piepoli, M. F., Hoes, A. W., Agewall, S., Albus, C., Brotons, C., Catapano, A. L., . . . Verschuren, W. M. (2016). 2016 European Guidelines on cardiovascular disease prevention in clinical practice. European Heart Journal, 37(29), 2315-2381. doi:10.1093/eurheartj/ehw106

Saremi, A., Bahn, G., & Reaven, P. D. (2012). Progression of vascular calcification is increased with statin use in the Veterans Affairs Diabetes Trial (VADT). Diabetes Care, 35(11), 2390-2392. doi:10.2337/dc12-0464

Sarter, B. (2002). Coenzyme Q10 and cardiovascular disease: A review. The Journal of Cardiovascular Nursing, 16(4), 9-20. doi:10.1097/00005082-200207000-00003

Sboros, M. (2016, December 10). Victory for Teicholz in battle of butter. Retrieved from http://foodmed.net/2016/12/04/victory-teicholz-battle-of-butter-bmj/

Siri-Tarino, P. W., Sun, Q., Hu, F. B., & Krauss, R. M. (2010). Meta-analysis of prospective cohort studies evaluating the association of saturated fat with cardiovascular disease. American Journal of Clinical Nutrition, 91(3), 535-546. doi:10.3945/ajcn.2009.27725

Sultan, S., & Hynes, N. (2013). The ugly side of statins. Systemic appraisal of the contemporary un-known unknowns. Open Journal of Endocrine and Metabolic Diseases, 3(3), 179-185. doi:10.4236/ojemd.2013.33025

Teicholz, N. (2014). The big fat surprise: Why butter, meat, and cheese belong in a healthy diet. New York: Simon and Schuster.

Van Der Tol, A., Van Biesen, W., Van Laecke, S., Bogaerts, K., De Lombaert, K., Warrinnier, H., & Vanholder, R. (2012). Statin use and the presence of microalbuminuria. Results from the ERICABEL trial: A non-interventional epidemiological cohort study. PLoS ONE, 7(2). doi:10.1371/journal.pone.0031639

Veterans Affairs Research Communications. (2015, May 07). Strong statin-diabetes link seen in large study. Retrieved from https://www.sciencedaily.com/releases/2015/05/150507145328.htm

Wainwright, G., Mascitelli, L., & Goldstein, M. R. (2009). Cholesterol lowering therapies and membrane cholesterol. Stable plaque at the expense of unstable membranes? Archives of Medical Science, (5), 289-295.


Gabriela Segura, M.D.

Dr. Gaby was born into a mixed Eastern-Western family in Costa Rica and she is a countryside family medicine doctor and former heart surgeon. Her research in the medical field, the true nature of our world and all things related to healing have taken her to Italy, Canada, France and Spain. Gaby is co-host of the ‘Health and Wellness’ show on the SOTT Radio Network and her writings can be found at The Health Matrix.


Scientific American, Global Warming & Iran

Penny For Your Thoughts | January 8, 2018

Today there are claims being made that the Iran protests are due to “global warming”. Seriously, this is the absolute baloney, garbage and nonsense that is being put forth as the reason for these protests.

Scientific American, no less. Wild speculation at its most crazeeeee….

Barbara Slavin, director of the Future of Iran Initiative at the Atlantic Council, said the role of climate change in the protests is “massive” and under-reported by the media.

This exact same claim was made regarding Syria by self-proclaimed authority figures and regurgitated by the agenda-pushing Five Eyes MSM and alt media. Though it was later clarified that the scientific evidence for the Syrian claim was so thin as to be considered tenuous. Tenuous: lacking a sound basis, as in reasoning; unsubstantiated; weak. Yah, tenuous sounds exactly right!

Another AGW Lie Bites the Dust: “Climate Change Fuelled Syrian War”

“There is no sound evidence that global climate change was a factor in sparking the Syrian civil war,” said University of Sussex Professor Jan Selby, one of the study’s co-authors, in a statement.

“It is extraordinary that this claim has been so widely accepted when the scientific evidence is so thin.”

A lack of evidence never stops liars from lying. Same spin, different destabilization campaign.


Manufacturing consensus: the early history of the IPCC

By Judith Curry | Climate Etc. | January 3, 2018

Short summary: scientists sought political relevance and allowed policy makers to put a big thumb on the scale of the scientific assessment of the attribution of climate change.

Bernie Lewin has written an important new book:

SEARCHING FOR THE CATASTROPHE SIGNAL: The Origins of the Intergovernmental Panel on Climate Change

The importance of this book is reflected in its acknowledgements, in context of assistance and contributions from early leaders and participants in the IPCC:

This book would not have been possible without the documents obtained via Mike MacCracken and John Zillman. Their abiding interest in a true and accurate presentation of the facts prevented my research from being led astray. Many of those who participated in the events here described gave generously of their time in responding to my enquiries, they include Ben Santer, Tim Barnett, Tom Wigley, John Houghton, Fred Singer, John Mitchell, Pat Michaels . . . and many more.

You may recall a previous Climate Etc. post Consensus by Exhaustion, on Lewin’s 5 part series on Madrid 1995: The last day of climate science.

Read the whole book, it is well worth reading. The focus of my summary of the book is on Chapters 8-16 in context of the theme of ‘detection and attribution’, ‘policy cart in front of the scientific horse’ and ‘manufacturing consensus’. Annotated excerpts from the book are provided below.

The 1970s energy crisis

In a connection that I hadn’t previously made, Lewin provides historical context for the focus on CO2 research in the 1970s, motivated by the ‘oil crisis’ and concerns about energy security. There was an important debate surrounding whether coal or nuclear power should be the replacement for oil. From Chapter 8:

But in the struggle between nuclear and coal, the proponents of the nuclear alternative had one significant advantage, which emerged as a result of the repositioning of the vast network of government-funded R&D laboratories within the bureaucratic machine. It would be in these ‘National Laboratories’ at this time that the Carbon Dioxide Program was born. This surge of new funding meant that research into one specific human influence on climate would become a major branch of climatic research generally. Today we might pass this over for the simple reason that the ‘carbon dioxide question’ has long since come to dominate the entire field of climatic research—with the very meaning of the term ‘climate change’ contracted accordingly.

This focus was NOT driven by atmospheric scientists:

The peak of interest in climate among atmospheric scientists was an international climate conference held in Stockholm in 1974 and a publication by the ‘US Committee for GARP’ [GARP is Global Atmospheric Research Programme] the following year. The US GARP report was called ‘Understanding climate change: a program for action’, where the ‘climate change’ refers to natural climatic change, and the ‘action’ is an ambitious program of research.

[There was] a coordinated, well-funded program of research into potentially catastrophic effects before there was any particular concern within the meteorological community about these effects, and before there was any significant public or political anxiety to drive it. It began in the midst of a debate over the relative merits of coal and nuclear energy production [following the oil crisis of the 1970’s]. It was coordinated by scientists and managers with interests on the nuclear side of this debate, where funding due to energy security anxieties was channelled towards investigation of a potential problem with coal in order to win back support for the nuclear option.

The emergence of ‘global warming’

In February 1979, at the first ever World Climate Conference, meteorologists would for the first time raise a chorus of warming concern. The World Climate Conference may have drowned out the cooling alarm, but it did not exactly set the warming scare on fire.

While the leadership of UNEP (UN Environmental Programme) became bullish on the issue of global warming, the bear prevailed at the WMO (World Meteorological Organization). When UNEP’s request for climate scenario modelling duly arrived with the WCRP (World Climate Research Programme) committee, they balked at the idea: computer modelling remained too primitive and, especially at the regional level, no meaningful results could be obtained. Proceeding with the development of climate scenarios would only risk the development of misleading impact assessments.

It wasn’t long before we see scientific research on climate change becoming marginalized in the policy process, in context of the precautionary principle:

At Villach in 1985, at the beginning of the climate treaty movement, the rhetoric of the policy movement was already breaking away from its moorings in the science. Doubts raised over the wildest speculation were turned around, in a rhetoric of precautionary action: we should act anyway, just in case. With the onus of proof reversed, the research can continue while the question remains (ever so slightly) open.

Origins of the IPCC

With regards to the origins of the IPCC:

Jill Jäger gave her view that one reason the USA came out in active support for an intergovernmental panel on climate change was that the US Department of State thought the situation was ‘getting out of hand’, with ‘loose cannons’ out ‘potentially setting the agenda’, when governments should be doing so. An intergovernmental panel, so this thinking goes, would bring the policy discussion back under the control of governments. It would also bring the science closer to the policymakers, unmediated by policy entrepreneurs. After an intergovernmental panel agreed on the science, so this thinking goes, they could proceed to a discussion of any policy implications.

While the politics were already making the science increasingly irrelevant, Bert Bolin and John Houghton brought a focus back to the science:

Within one year of the first IPCC session, its assessment process would transform from one that would produce a pamphlet sized country representatives’ report into one that would produce three large volumes written by independent scientists and experts at the end of the most complex and expensive process ever undertaken by a UN body on a single meteorological issue. The expansion of the assessment, and the shift of power back towards scientists, came about at the very same time that a tide of political enthusiasm was being successfully channelled towards investment in the UN process, with this intergovernmental panel at its core.

John Houghton (Chair of Working Group I) moved the IPCC towards a model more along the lines of an expert-driven review: he nominated one or two scientific experts—‘lead authors’—to draft individual chapters and he established a process through which these would be reviewed at lead-author meetings.

The main change was that it shifted responsibility away from government delegates and towards practising scientists. The decision to recruit assessors who were leaders in the science being assessed also opened up another problem, namely the tendency for them to cite their own current work, even where unpublished.

However, the problem of marginalization of the science wasn’t going away:

With the treaty process now run by career diplomats, and likely to be dominated by unfriendly southern political agitators, the scientists were looking at the very real prospect that their climate panel would be disbanded and replaced when the Framework Convention on Climate Change came into force.

And many scientists were skeptical:

With the realisation that there was an inexorable movement towards a treaty, there was an outpouring of scepticism from the scientific community. This chorus of concern was barely audible above the clamour of the rush to a treaty and it is now largely forgotten.

At the time, John Zillman presented a paper to a policy forum that tried to provide those engaged with the policy debate some insight into just how different was the view from inside the research community.  Zillman stated that:

. . . that the greenhouse debate has now become decoupled from the scientific considerations that had triggered it; that there are many agendas but that they do not include, except peripherally, finding out whether and how climate might change as a result of enhanced greenhouse forcing and whether such changes will be good or bad for the world.

To give some measure of the frustration rife among climate researchers at the time, Zillman quoted the director of WCRP. It was Pierre Morel, he explained, who had ‘driven the international climate research effort over the past decade’. A few months before Zillman’s presentation, Morel had submitted a report to the WCRP committee in which he assessed the situation thus:

The increasing direct involvement of the United Nations. . . in the issues of global climate change, environment and development bears witness to the success of those scientists who have vied for ‘political visibility’ and ‘public recognition’ of the problems associated with the earth’s climate. The consideration of climate change has now reached the level where it is the concern of professional foreign-affairs negotiators and has therefore escaped the bounds of scientific knowledge (and uncertainty).

The negotiators, said Morel, had little use for further input from scientific agencies including the IPCC ‘and even less use for the complicated statements put forth by the scientific community’.

There was a growing gap between the politics/policies and the science:

The general feeling in the research community that the policy process had surged ahead of the science often had a different effect on those scientists engaged with the global warming issue through its expanded funding. For them, the situation was more as President Bush had intimated when promising more funding: the fact that ‘politics and opinion have outpaced the science’ brought the scientists under pressure ‘to bridge the gap’.

In fact, there was much scepticism of the modelling freely expressed in and around the Carbon Dioxide Program in these days before the climate treaty process began. Those who persisted with the search for validation got stuck on the problem of better identifying background natural variability.

The challenge of ‘detection and attribution’

Regarding Jim Hansen’s 1988 Congressional testimony:

An article in Science the following spring gives some insight into the furore. In ‘Hansen vs. the world on greenhouse threat’, the science journalist Richard Kerr explained that while ‘scientists like the attention the greenhouse effect is getting on Capitol Hill’, nonetheless they ‘shun the reputedly unscientific way their colleague James Hansen went about getting that attention’.

Clearly, the scientific opposition to any detection claims was strong in 1989 when the IPCC assessment got underway.

Detection and attribution of the anthropogenic climate signal was the key issue:

During the IPCC review process (for the First Assessment Report), Wigley was asked to answer the question: When is detection likely to be achieved? He responded with an addition to the IPCC chapter that explains that we would have to wait until the half-degree of warming that had occurred already during the 20th century is repeated. Only then are we likely to determine just how much of it is human-induced. If the carbon dioxide driven warming is at the high end of the predictions, then this would be early in the 21st century, but if the warming was slow then we may not know until 2050.

The IPCC First Assessment Report didn’t help the policy makers’ ‘cause.’ In the buildup to the Rio Earth Summit:

To support the discussions of the Framework Convention at the Rio Earth Summit, it was agreed that the IPCC would provide a supplementary assessment. This ‘Rio supplement’ explains:

. . . the climate system can respond to many forcings and it remains to be proven that the greenhouse signal is sufficiently distinguishable from other signals to be detected except as a gross increase in tropospheric temperature that is so large that other explanations are not likely.

Well, this supplementary assessment didn’t help either. The scientists, under the leadership of Bolin and Houghton, are to be commended for not bowing to pressure. But the IPCC was risking marginalization in the treaty process.

In the lead up to CoP1 in Berlin, the IPCC itself was badgering the negotiating committee to keep it involved in the political process, but tensions arose when it refused to compromise its own processes to meet the political need.

However, the momentum for action in the lead up to Rio remained sufficiently strong that these difficulties with the scientific justification could be ignored.  

Second Assessment Report

In context of the treaty activities, the second assessment report of the IPCC was regarded as very important for justifying implementation for the Kyoto Protocol.

In 1995, the IPCC was stuck between its science and its politics. The only way it could save itself from the real danger of political oblivion would be if its scientific diagnosis could shift in a positive direction and bring it into alignment with policy action.  

The key scientific issue at the time was detection and attribution:

The writing of Chapter 8 (the chapter concerned with detection and attribution) got off to a delayed start due to the late assignment of its coordinating lead author. It was not until April that someone agreed to take on the role. This was Ben Santer, a young climate modeller at Lawrence Livermore Laboratory.

The chapter that Santer began to draft was greatly influenced by a paper principally written by Tim Barnett, but it also listed Santer as an author. It was this paper that held, in a nutshell, all the troubles for the ‘detection’ quest. It was a new attempt to get beyond the old stumbling block of ‘first detection’ research: to properly establish the ‘yardstick’ of natural climate variability. The paper describes how this project failed to do so, and fabulously so.

The detection chapter that Santer drafted for the IPCC makes many references to this study. More than anything else cited in Chapter 8, it is the spoiler of all attribution claims, whether from pattern studies, or from the analysis of the global mean. It is the principal basis for  the Chapter 8 conclusion that. . .

. . .no study to date has both detected a significant climate change and positively attributed all or part of that change to anthropogenic causes.

For the second assessment, the final meeting of the 70-odd Working Group 1 lead authors . . . was set to finalise the draft Summary for Policymakers, ready for intergovernmental review. The draft Houghton had prepared for the meeting was not so sceptical on the detection science as the main text of the detection chapter drafted by Santer; indeed it contained a weak detection claim.

This detection claim appeared incongruous with the scepticism throughout the main text of the chapter and was in direct contradiction with its Concluding Summary. It represented a change of view that Santer had only arrived at recently due to a breakthrough in his own ‘fingerprinting’ investigations. These findings were so new that they were not yet published or otherwise available, and, indeed, Santer’s first opportunity to present them for broader scientific scrutiny was when Houghton asked him to give a special presentation to the meeting of lead authors.

However, the results were also challenged at this meeting: Santer’s fingerprint finding and the new detection claim were vigorously opposed by several experts in the field.

On the first day of the Madrid session of Working Group 1 in November 1995, Santer again gave an extended presentation of his new findings, this time to mostly non-expert delegates. When he finished, he explained that because of what he had found, the chapter was out of date and needed changing. After some debate John Houghton called for an ad-hoc side group to come to agreement on the detection issue in the light of these important new findings and to redraft the detection passage of the Summary for Policymakers so that it could be brought back to the full meeting for agreement. While this course of action met with general approval, it was vigorously opposed by a few delegations, especially when it became clear that Chapter 8 would require changing, and resistance to the changes went on to dominate the three-day meeting. After further debate, a final version of a ‘bottom line’ detection claim was decided:

The balance of evidence suggests a discernible human influence on global climate.

All of this triggered accusations of ‘deception’:

An opinion editorial written by Frederick Seitz, ‘Major deception on “global warming”’, appeared in the Wall Street Journal on 12 June 1996.

This IPCC report, like all others, is held in such high regard largely because it has been peer-reviewed. That is, it has been read, discussed, modified and approved by an international body of experts. These scientists have laid their reputations on the line. But this report is not what it appears to be—it is not the version that was approved by the contributing scientists listed on the title page. In my more than 60 years as a member of the American scientific community, including service as president of both the NAS and the American Physical Society, I have never witnessed a more disturbing corruption of the peer-review process than the events that led to this IPCC report.

When comparing the final draft of Chapter 8 with the version just published, he found that key statements sceptical of any human attribution finding had been changed or deleted. His examples of the deleted passages include:

  • ‘None of the studies cited above has shown clear evidence that we can attribute the observed [climate] changes to the specific cause of increases in greenhouse gases.’
  • ‘No study to date has positively attributed all or part [of the climate change observed to date] to anthropogenic [manmade] causes.’
  • ‘Any claims of positive detection of significant climate change are likely to remain controversial until uncertainties in the total natural variability of the climate system are reduced.’

On 4 July, Nature finally published Santer’s human fingerprint paper. In Science, Richard Kerr quoted Barnett saying that he is not entirely convinced that the greenhouse signal had been detected and that there remain ‘a number of nagging questions’. Later in the year a critique striking at the heart of Santer’s detection claim would be published in reply.

The IPCC’s manufactured consensus

What we can see from all this activity by scientists in the close vicinity of the second and third IPCC assessments is the existence of a significant body of opinion that is difficult to square with the IPCC’s message that the detection of the catastrophe signal provides the scientific basis for policy action.

The scientific debate on detection and attribution was effectively quelled by the IPCC Second Assessment Report:

Criticism would continue to be summarily dismissed as the politicisation of science by vested interests, while the panel’s powerful political supporters would ensure that its role as the scientific authority in the on-going climate treaty talks was never again seriously threatened.

And of course the ‘death knell’ to scientific arguments concerned about detection was dealt by the Third Assessment Report, in which the MBH Hockey Stick analysis of Northern Hemisphere paleoclimates effectively eliminated the existence of a hemispheric medieval warm period and Little Ice Age, ‘solving’ the detection conundrum.

JC reflections

Bernie Lewin’s book provides a really important and well-documented history of the context and early history of the IPCC.

I was discussing Lewin’s book with Garth Paltridge, who was involved in the IPCC during the early years; he emailed this comment:

I am a bit upset because I was in the game all through the seventies to early nineties, was at a fair number of the meetings Lewin talked about, spent a year in Geneva as one of the “staff” of the early WCRP, another year (1990) as one of the staff of the US National Program Office in the Washington DC, met most of the characters he (Lewin) talked about…… and I simply don’t remember understanding what was going on as far as the politics was concerned.  How naive can one be??  Partly I suspect it was because lots of people in my era were trained(??) to deliberately ignore, and/or laugh at, all the garbage that was tied to the political shenanigans of international politics in the scientific world. Obviously the arrogance of scientists can be quite extraordinary!

Scientific scepticism about AGW was alive and well prior to 1995; it took a nose-dive following publication of the Second Assessment Report, and then was dealt what was hoped to be a fatal blow by the Third Assessment Report and the promotion of the Hockey Stick.

A rather flimsy edifice for a convincing, highly-confident attribution of recent warming to humans.

I think Bernie Lewin is correct in identifying the 1995 meeting in Madrid as the turning point. It was John Houghton who inserted the attribution claim into the draft Summary for Policy Makers, contrary to the findings in Chapter 8.  Ben Santer typically gets ‘blamed’ for this, but it is clearly Houghton who wanted this and enabled this, so that he and the IPCC could maintain a seat at the big policy table involved in the Treaty.

One might forgive the IPCC leaders for dealing with new science and a very challenging political situation in 1995 during which they overplayed their hand.  However, it is the 3rd Assessment Report where Houghton’s shenanigans with the Hockey Stick really reveal what was going on (including selection of recent Ph.D. recipient Michael Mann as lead author when he was not nominated by the U.S. delegation). The Hockey Stick got rid of that ‘pesky’ detection problem.

I assume that the rebuttal of the AGW  ‘true believers’ to all this is that politics are messy, but look, the climate scientists were right all along, and the temperatures keep increasing. Recent research increases confidence in attribution, that we have ‘known’ for decades.

Well, increasing temperatures say nothing about the causes of climate change.  Scientists are still debating the tropical upper troposphere ‘hot spot’, which was the ‘smoking gun’ identified by Santer in 1995 [link]. And there is growing evidence that natural variability on decadal to millennial time scales is much larger than previous thought (and larger than climate model simulations) [link].

I really need to do more blog posts on detection and attribution; I will do my best to carve out some time.

And finally, this whole history seems to violate the Mertonian norm of universalism:

universalism: scientific validity is independent of the sociopolitical status/personal attributes of its participants

Imagine how all this would have played out if Pierre Morel or John Zillman had been Chair of WG1, or if Tom Wigley or Tim Barnett or John Christy had been Coordinating Lead Author of Chapter 8. And what climate science would look like today.

I hope this history of manufacturing consensus gives rational people reason to pause before accepting arguments from consensus about climate change.


Storm Eleanor’s “100 MPH Winds” – Fake News From The Telegraph

By Paul Homewood | Not A Lot Of People Know That | January 3, 2018

Storm Eleanor has lashed the UK with violent storm-force winds of up to 100mph, leaving thousands of homes without power and hitting transport links.

Gusts of 100mph were recorded at Great Dun Fell in Cumbria at 1am.

Wow! Hurricane force winds, as has been reported elsewhere.

Only one slight problem though. Great Dun Fell is the second highest mountain in England’s Pennines, and the weather station sits at the very top, at an altitude of 847m.

Even then, mean wind speeds only reached 75 mph.

At nearby Warcop, just seven miles away and at an altitude of 224m, wind speed never got above 29 mph, a “strong breeze” on the Beaufort Scale.

This all comes from a Press Association report, which in turn appears to have been fed by the Met Office.

Why the Met Office should decide to deliberately mislead the public is anybody’s guess.

The Telegraph goes on to mention that 77mph gusts were recorded in High Bradfield, South Yorkshire.

I live 5 miles away from High Bradfield, and it was no more than a bit windy. So it won’t come as any surprise that High Bradfield is also a high-altitude site, up in the Peak District at 395m.

The nearest site with up to date data, according to the Met Office, is Watnall, 32 miles away in Nottinghamshire.

There wind speeds only reached 24 mph, a “Fresh Breeze” on the Beaufort Scale.

Even in Southern Scotland, the area worst affected in Britain, where the Met Office reported gusts of 72 mph high up on exposed cliffs above the Solway near Dundrennan, the mean wind speed peaked at 54 mph, still only a “Strong Gale”.
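For reference, here is a minimal sketch of how readings like those above map onto the Beaufort scale, using commonly quoted mph thresholds (exact boundaries vary slightly between sources). It is illustrative only, not an official Met Office tool; the thresholds and names below are my assumptions.

```python
# Rough sketch: map a mean wind speed in mph to a Beaufort description,
# using commonly quoted thresholds (assumed, not from the Met Office).

BEAUFORT_MPH = [
    (24, "Fresh breeze"),    # force 5
    (31, "Strong breeze"),   # force 6
    (38, "Near gale"),       # force 7
    (46, "Gale"),            # force 8
    (54, "Strong gale"),     # force 9
    (63, "Storm"),           # force 10
    (72, "Violent storm"),   # force 11
]

def beaufort(mph: float) -> str:
    """Return the Beaufort description for a mean wind speed in mph."""
    if mph <= 18:
        return "Moderate breeze or less"
    for upper, name in BEAUFORT_MPH:
        if mph <= upper:
            return name
    return "Hurricane force"  # force 12, roughly 73 mph and above

for speed in (24, 29, 54, 75):
    print(speed, beaufort(speed))
# 24 Fresh breeze, 29 Strong breeze, 54 Strong gale, 75 Hurricane force
```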

The headline claim that Storm Eleanor has lashed the UK with violent storm-force winds of up to 100mph is quite fraudulent.


Botched Reporting – A Reply to The NYT on “How Climate Change Deniers Rise to the Top in Google Searches”

By Leo Goldstein | Watts Up With That? | January 1, 2018

The New York Times remains a slave to climate alarmism even after its miserable failure in Paris on December 12, and continues to push the fossil fuels conspiracy theory. It’s regularly publishing fake news. A NYT piece that appeared on December 29, How Climate Change Deniers Rise to the Top in Google Searches, mentions me, my website DefyCCC, and WUWT, and I take this opportunity to reply. In November and December 2017, I experimented with distributing the climate realism message using advertising options on Google and some other platforms. I will report on the results of this experiment in a separate article.  Apparently, some of my Google ads caught the attention of the NYT. On December 4th, a NYT reporter named Hiroko Tabuchi interviewed me for 45 minutes in preparing for the above NYT piece.

In the interview, I attempted to convince the reporter that the NYT got science wrong, that real scientists are against climate alarmism, and that other countries build coal power plants and more. The reporter was honest in telling me that the NYT piece would be about the ads, not about the climate debate (I hope NYT does not fire her for this act of honesty, unfit for its organizational culture), so I already knew what to expect. However, the piece weaves lies, half-truths, and trivial facts so seamlessly that it elevates fake news into an art form.  I will comment only on some falsehoods related to me.

The only thing that surprised me in the NYT piece was how it used me to link Trump to Russia:

“Of course, people click,” said Mr. Goldstein, who said he had emigrated from Russia two decades ago and had worked in the software and power industries. “Google is the No. 1 advertising choice.”

The proliferation of climate disinformation, both online and off, has coincided with an effort to undermine measures to combat climate change. Republican leaders regularly question climate science and President Trump has called climate change a hoax.

I emigrated from the Soviet Union (not from Russia) before it dissolved in 1991, twenty-six years ago. I was born and grew up in the Ukraine, then a part of the Soviet Union. This information is present on the About page of my site. I did not tell the reporter that I "emigrated from Russia two decades ago." Here, the New York Times has "slightly" changed times and names in order to evoke another conspiracy theory, that of a Trump-Russia collusion. The rest could have been expected. This is how the NYT linked me to the Koch brothers:

DefyCCC, the site that recently bought the “climate change” search term on Google, devotes an entire section of its site to content from WattsUpWithThat, a well-known climate denial site by the blogger Anthony Watts. Mr. Watts has received funding from the Heartland Institute, backed by the billionaire Koch brothers.

Beyond that, little is known about DefyCCC. …

The reporter ran this line (except for the last quoted sentence) by me in the interview. In fact, DefyCCC has no sections at all. It does have a menu, and links to my articles in WUWT are collected under top menu items In WattsUpWithThat and WUWT 2016. I explained that to the reporter. But the NYT still published this line, falsely insinuating that I am connected to the Koch brothers. The next sentence was supposed to cement this lie as truth:

“Beyond that, little is known about DefyCCC.”

This is also a typical line in hatchet job pieces, used when the writer cannot find dirt on somebody. For the record, I also told her I have no information about the other allegations in that paragraph. Further in the piece, the NYT made another wild insinuation about me:

He received help with his site but would not say who his backers were to protect their privacy.

In the interview, I said I have colleagues and refused to name them. Then I told the reporter about the shooting at the UAH building as a reason to withhold personal information. This topic was blacked out by the media, so the NYT didn't mention it, but made up its own explanation. This is where fake news becomes an art form. In the sentence, the word help (from colleagues or coworkers) is followed by the word backers, subtly turning it into financial support. And then a quote, taken out of context, cements this impression.

Having written about my imaginary backers, the NYT failed to disclose its own. Its largest shareholder is Mexican multi-billionaire Carlos Slim, who was the world's richest man a few years ago. Mr. Slim has significant investments in oil and natural gas in Latin America, which compete against the U.S. oil, gas, and coal industries. The NYT's attempts to damage the U.S. fossil fuels industry thus promote the financial interests of its largest shareholder.

I took note of the insults that the NYT hurled at me, but I will not dignify them with a response.

The NYT piece mentions WUWT and DefyCCC, but it links to neither of our sites. I understand that it doesn't want to transfer "link equity" or encourage readers to visit them. But when the NYT wrote about white supremacists, it linked to Stormfront with a perfect, link-equity-carrying link (3), although it didn't have to, or could have used a nofollow tag that prevents the transfer of link equity. When I checked in September 2017, I found that the top neo-nazi websites received most of their link equity from the leftstream media. Just a note.

I don’t want to finish this article on the NYT links to neo-nazi websites. Sorry, I mean, the links from the NYT site to neo-nazi sites. Reading the NYT is not only misinforming, but also morally degrading. The NYT published two pieces about UFOs in December 2017:  2 Navy Airmen and an Object That ‘Accelerated Like Nothing I’ve Ever Seen’ (in the section Politics) and Dad Believed in U.F.O.s. Turns Out He Wasn’t Alone (in the section News Analysis). Seems to me that the NYT is looking for its niche among tabloids.

Notes

Carlos Slim owned ~17% of the Class A shares of the NYT until a few months ago. But Class A shares of the NYT elect only about one third of the board. Class B shares are thought to be held by the Ochs-Sulzberger family. Father and son Ochs-Sulzberger have been the NYT publishers since 1963, so the NYT was considered independent from external financial influences. But in the precarious financial situation into which the NYT has painted itself by serving as a propaganda accessory and by false reporting, money ends up mattering more than formal voting rights. Thus, Carlos Slim probably wields or wielded much more power in the editorial room of the NYT than previously thought. To his credit, he is not a liberal. Mr. Slim also owns a substantial interest in the tobacco industry around the world, which makes the NYT a sister company of Big Tobacco.

Posts about the New York Times take up a good part of the fakestream media category on DefyCCC. Besides printing fake news, it was caught doing near-Orwellian re-writing of its articles to toe the party line. I have even proposed a new logo and byline for it that better reflect its new nature. It can use them free of charge under a Creative Commons license, just like other content of my website.


Addendum by Anthony:

The way the NYT article is written, it implies that WUWT has an ad campaign running in Google Adwords to attract readers. It does not, and never has. We have no advertising budget. The article also implies that WUWT is funded by the Heartland Institute. It is not and never has been. Neither WUWT nor the owner Anthony receives any payroll or regular funding from Heartland. We rely entirely upon advertisements (managed by WordPress.com and a sharing agreement) and donations from readers. In the past, Heartland helped locate a donor for a project, and Anthony has been given a $1000 honorarium and travel expenses to speak at some Heartland conferences on climate change, just like any other speaker, including pro-warming/pro-climate change scientist, Dr. Scott Denning.

Tabuchi also insinuated that WUWT and/or I am funded by the Koch Brothers; this is a laughable falsehood. They have never sent me a dime, either directly or indirectly. They don't even know who I am and I've never had any contact with them or their charitable organization; it's just a weak conspiracy theory pushed by the weak-minded who would rather take talking points from others than do their own homework.

But the writer, one previously unheard-of Ms. Hiroko Tabuchi, never bothered to ask any questions of me. So as a journalist, she fails miserably by relying on, and writing about, her own assumptions.

Is this the best the New York Times can do? Apparently so.

January 1, 2018 Posted by | Deception, Fake News, Mainstream Media, Warmongering, Science and Pseudo-Science, Timeless or most popular | , | 1 Comment

Earth is ‘Pretty Prepared’ for Upcoming Mini Ice Age – Researcher

Sputnik – December 29, 2017

Fluctuations in solar activity could mean a mini ice age on Earth in the 2030s, according to a mathematical model of the sun’s magnetic field produced by scientists in the UK and Russia. Professor Valentina Zharkova of Northumbria University, who led the research, told Radio Sputnik that the Sun regularly goes through a “hibernation stage.”

Sputnik: Are there any historical records of similar ice ages and what impact did they have on the environment and people?

Professor Valentina Zharkova: Yes, of course there are. The closest was the Maunder Minimum in the 17th century, which lasted from 1655 to 1715, about 70 years, and we will have a slightly shorter ice age than there was 300-370 years ago.

Prior to that was the Wolf Minimum, which was something similar, during Roman times. They keep repeating every 350-400 years because the Sun goes through this [period of] minimum activity.

Sputnik: You've previously said that the magnetic waves from the Sun could become more active again in the 2050s and that we have to be ready for the next big solar activity. What measures can be taken to prepare our planet for this?

Professor Valentina Zharkova: The planet is pretty prepared, it’s obviously been doing this for billions of years and survived. It has natural mechanisms which respond to solar activity in different ways. The problem will be for us to pass through the minimum of current magnetic field activity, which will come in the next 30 years, because I can only guess that the vegetation period will start reducing.

If you have less [solar] emissions, less radiation and a dropping temperature, it means that vegetables won't be able to grow properly, wheat can't grow properly, so we might have a problem with some sorts of food which we will probably need to think through. When the Sun comes back, it will be like the previous 100-200 years; there will be some fluctuating activity, but mostly a standard which people at the moment are very used to and have very nice models describing it.

Sputnik: Have you had any communication or reaction from other researchers regarding your study?

Professor Valentina Zharkova: I have strong support from a few hundred [researchers], maybe more. They invite me to international conferences for talks. I also have some people who criticize [my work] very strongly, these are mostly people who rely on the single [solar] dynamo model.

We just invited a Russian dynamo expert Elena Popova to help us to understand what we found, and this is how we got involved with the model. In our case, we assume there are two layers where the dynamo layers are generated and this is how our observations can be explained.

Those people who have worked all their lives producing waves from a single layer, they strongly resist [a different theory] for the reason that they invested so much time and effort in papers and other work. To some extent, it takes a while to accept new ideas, but things are moving on because seismic observations show that the Sun has two cells with different regional circulations in the solar interior.

So, this idea of two dynamos is not alien; the Sun is trying to tell us, "Guys, look at me more carefully!" And if they look more carefully, they will discover that indeed it has these two cells which have been discovered.

We found two waves from magnetic field observations, not from theory. We got two of them [waves] with similar values. So, the Sun is telling us, “look, we have two waves, not one!” This is how we came up with the two dynamo waves, because the Sun has shown us this during observations.

December 30, 2017 Posted by | Environmentalism, Science and Pseudo-Science, Timeless or most popular | , | Leave a comment

Butterflies And Junk Science

By Paul Homewood | Not A Lot Of People Know That | December 16, 2017

 Camille Parmesan

Climate scientist Camille Parmesan is one of the recipients of President Macron's largesse in awarding $70 million to US scientists.

I wonder whether the French public realise how much junk science they will be paying for.

Parmesan is famous for her studies on butterflies, which she argues are being forced polewards, and even being extinguished, because of climate change.

However, Jim Steele, Director emeritus Sierra Nevada Field Campus, San Francisco State University, was not convinced by her work, and decided to take a closer look.

Here is his account, as published at WUWT in 2013:

Guest essay by Jim Steele, Director emeritus Sierra Nevada Field Campus, San Francisco State University

The pioneers of chaos theory coined the term "butterfly effect" to suggest that a hurricane's formation could be affected by such unpredictable influences as the flap of a distant butterfly's wings that changed the winds' direction weeks before. Ironically, it was Dr. Camille Parmesan's seminal 1996 Edith's checkerspot butterfly paper, "Climate and Species Range",1 that became the model for future peer-reviewed papers that blamed climate change for driving species northward and upward and causing species extinctions. Featured on the Union of Concerned Scientists' website, Parmesan echoed Dr. Jim Hansen's catastrophic predictions that global warming was already forcing global ecological collapse: "The latest research shows clearly that we face the threat of mass extinctions in coming years," she says.

"My hope is that we will be able to reduce emissions enough so that assisted colonization efforts can be successful, because at the higher ranges of scientists' projections of warming trends, frankly, we're sunk." For promoting global warming theory, she subsequently earned an invitation to speak at the White House and became one of just four biologists to partake in the third global climate assessment by the United Nations' Nobel-Prize-winning Intergovernmental Panel on Climate Change (IPCC). By 2009, Parmesan ranked as the second-most cited author of papers devoted expressly to global warming and climate change.2

Euphydryas editha in Olympic National Park (Image: Wikipedia)

Einstein said, "A question that sometimes drives me hazy: am I or are the others crazy?" and the fanfare given Parmesan drove me hazy. Detailed studies by butterfly experts and conservationists dedicated to saving the butterfly from extinction had all blamed habitat destruction and sought habitat restoration. In contrast, Parmesan blamed global warming and argued for reduced carbon emissions. She had blamed "global" warming even though most maximum temperatures in California had not risen significantly.3 More disconcerting, the butterflies never migrated northward or upward, as claimed. Yet she now seeks funding to support an ecologist's worst nightmare, assisted colonization. Parmesan wants to create her own Noah's ark, shuttling animals northwards and upwards so they can escape the supposed rising tide of warmth predicted by models, despite the fact that introducing species into new habitat brings disease and disrupts the established ecological balance.


To her credit, Parmesan had diligently spent four years of extensive and laborious fieldwork revisiting locations where the butterfly had been observed earlier in the century. However, after verifying that more populations had gone extinct in the southern extremes and at the lowest elevations of the butterfly's range, Parmesan enthusiastically claimed her results were consistent with global warming theory. In 2010 she summarized her work: "it was a bloody obvious change. These butterflies were shifting their entire range over the past century northward and upward, which is the simplest possible link you could have with warming. I was expecting some incredibly subtle, sophisticated response to warming, if at all. What I got was 80% of the populations in Mexico and the Southern California populations were extinct, even though their habitats still looked perfectly fine."2 But as I discovered later, Parmesan always knew the butterflies had never migrated further north or to higher elevations.

Hansen's global warming theory had predicted that increasing maximum temperatures would push animals northward and upward; however, Parmesan failed to mention that most of California's maximum temperatures had never exceeded the highs of the 1940s, as seen in Yosemite National Park. In fact, her paper never analyzed local temperatures at all.


Parmesan relied on the political global warming bias. Parmesan was speaking globally, but butterflies always act locally. Ask any university ecology professor. They would not hesitate to harshly criticize an undergraduate term paper that used a “global average” to explain a local event; yet that was her only climate “evidence”.

Furthermore Parmesan failed to address the fact that higher temperatures enhanced the butterfly’s survival. Warm microclimates are critical for its survival. Caterpillars living in cooler microclimates develop more slowly, while those actively basking in the direct sunlight digest their food more quickly and grow more robustly. Cool rainy years often extirpated local populations.

Since the 1950s, Stanford University’s Paul Ehrlich and his colleagues had made detailed observations throughout the checkerspot’s habitat on the Jasper Ridge Preserve. They determined that the caterpillars must raise their body temperature an additional 18-21°F above ambient air temperatures. To raise their body temperature, caterpillars shuffled across the hillsides seeking life‑giving hotspots.4,5,6 Any global warming, natural or anthropogenic, should have been a benefactor, not an executioner.

Parmesan’s observations of extirpated populations were not new. Conservationists had sounded the extinction alarm years before her “global warming study”. Butterfly populations had diminished so quickly that the checkerspot’s apparent fate was compared to the rapid ruination of the extinct passenger pigeon. Scientists working to prevent extinction had always warned that the suburban sprawl from Los Angeles to San Diego had devoured the butterfly’s critical habitat and extirpated most populations.7,8 When the checkerspot’s southern California Quino subspecies was finally listed as endangered, conservation scientists wrote, “The basis for the listing was habitat loss, degradation, and fragmentation, recognizing additional negative effects from fire management practice. All factors are the results of intensive human economic development of ever diminishing resources.”60

The conservationists' detailed studies also reported that most extinctions observed in southern California had already transpired by the 1970s, before any purported CO2 warming had significantly developed, and furthermore that populations were now recovering. In 2003 researchers wrote, "although we now know that the butterfly likely disappeared from Orange County thirty years ago, it was rediscovered in Riverside County in the early 1990s, and in San Diego County at several formerly occupied sites soon after."8

Nor were extinctions limited to the southern end of the butterfly’s range. Rapid urban development entirely extirpated the Canadian subspecies (the Taylor checkerspot) from the coldest northern end of the butterfly’s range. But because there was a greater preponderance of extinctions in southern California, the “average statistical center” for the species migrated northward. There was never any evidence of any real migration due to warming. There was never an apocalyptic flight to cooler lands. Parmesan’s climate claim was solely a statistical fairy tale. Still Parmesan’s unscientific climate claim was published in one of the most prestigious scientific journals with one of the highest rejection rates, Nature.

How did Parmesan deal with the multitude of contradictory factors? Instead of a more detailed study, she simply argued, “the predicted effects of climate change will come, not from attempts to analyze all possible confounding variables in single studies such as this one, but from replication of this type of study.”1 In essence, by arguing that confounding factors were no longer important, she suggested we throw out the foundation of good scientific analyses. To demonstrate the negative impacts of climate change, all anyone needed to do was demonstrate that populations were dwindling in the south more than in the north, or dwindling more at lower elevations than at higher elevations. Implausibly, the prestigious journal Nature supported this “new climate science.”

Defying the Experts

The evidence against any CO2 connection was overwhelming, but I was no butterfly expert. Needing a reality check, I talked with my friend Dr. Paul Opler, one of North America’s top butterfly experts. If you have ever spent any time with Paul, you quickly realize that no one has a greater love for butterflies. If there was the smallest threat, he would be the first to speak out. In 1974, he was hired as the first invertebrate specialist for the United States Federal Endangered Species program. Virtually every butterfly species now listed as endangered was listed under his watch. To my great good fortune, he agreed to teach a course, “Butterflies of the Sierra Nevada” (which he still teaches), for my environmental education program each year. When he visited, I expressed my doubts about the legitimacy of Parmesan’s claims and my bewilderment at all the media hype, and I asked if he had seen any supporting evidence.

He carefully stated that from all the data he had perused, he had seen absolutely no evidence that any butterflies had ever moved northwards, nor had they been pushed to higher elevations. He added the checkerspot has now been discovered further south in Baja, Mexico. He too couldn’t understand the public fanfare and echoed my thoughts that “only her statistical averages moved, not the butterflies”. Due to his expertise, Opler had been invited by the Fish and Wildlife Service to comment on the proposed recovery plans for the subspecies in southern California and wrote:

The lengthy space given to Camille Parmesan’s study and the suggestion that newly found colonies are the result of global warming is highly speculative. Her study did not find new northern, or higher populations of the species. Her results were a statistical artifact of the purported loss of low-lying southern populations (emphasis added). Her surveys that showed the absence of butterflies in some population areas could have been carried out in relatively bad years when the species was present only as diapausing larval clusters. (Diapause is a period of dormancy similar to hibernation)

Opler was not the only expert to dissent. Other scientists, armed with detailed studies aimed at ensuring the butterfly's recovery and survival, also disagreed: "Our observation that human impacts were almost always involved in local extirpations in southern California (even for those areas that may seem to still have "suitable habitat"), the role of global warming as the proximate cause of extinction must be carefully evaluated. We suspect that warming is perhaps an exacerbating factor, but that increased extinction rates in southern California are primarily caused by more direct anthropogenic forces."7

So I decided Parmesan’s landmark climate study needed to be replicated with a more critical eye on the contributing land use factors. However, when I looked for her methods section there was none. Her study had been published as a correspondence, and in Nature, a correspondence doesn’t require a methods section that allows for independent verification. That also explained how her paper survived a gauntlet of disagreement by leading experts. A correspondence is not typically peer reviewed. It is published simply based on the advocacy of Nature’s editors.

Withholding the Evidence

“We are trying to prove ourselves wrong as quickly as possible, because only in that way can we find progress.” -Dr. Richard Feynman, Nobel Prize in Physics

I emailed Dr. Parmesan and asked for the locations of the extinct populations. After months without reply, I called. Caught off guard, she initially refused to share any data, but after more discussion offered the possibility of collaboration. She said she needed to hang up but promised to send some data. More than three years later, I am still waiting. So much for Feynman’s good scientist “trying to prove ourselves wrong as quickly as possible.”

Her husband eventually responded to a follow-up email I sent a year later in which I expressed my frustration with their failure to allow independent verification. Her husband, Dr. Michael Singer, is a checkerspot expert who had shared in her research. Singer unintentionally confirmed Opler’s criticisms, “Her study did not find new northern, or higher populations of the species…There are no ‘new’ northern populations in Parmesan’s study. The study consisted entirely of re-examining populations known from past records and assessing which of them was currently extant or extinct. No ‘new’ populations were sought or found (emphasis added).” Trying to discourage my replication efforts Dr. Singer wrote, “But I do remember writing to you to say that E. editha has been increasing through the 2000s and that many of the populations that Camille and I recorded as extinct in the 1990s have been recolonized….So, any new census of Sierra Nevada populations would show a reduced correlation between elevation and population status, perhaps no longer a significant correlation.” Singer and Parmesan illustrate a glaring problem when limiting debate to peer-reviewed journals. Contradictory evidence is simply never published.

So why haven’t they published this good news of the butterfly’s recovery? Why did only her erroneous climate gloom and doom bring worldwide acclaim? Despite a wealth of evidence that contradicted global warming predictions, her faulty “Climate and Species Range” went viral and is now cited by over 580 articles. In contrast just 17 have cited the paper detailing conservationists’ efforts that actually saved the butterfly, “The Endangered Quino Checkerspot Butterfly”. Parmesan wrote subsequent papers blaming extreme weather and climate change for population extinctions and again withheld evidence of the species’ success. Likewise her half-truths were immediately embraced and published by our leading climate scientists and then cited by more than a thousand articles. That deception however requires a future essay.

https://wattsupwiththat.com/2013/07/14/fabricating-climate-doom-part-1-parmesans-butterfly-effect/

This really is a stunning account of malpractice, but it is not the only example that Jim Steele found.

He also identified serious problems with another Parmesan paper in 2000:

How the American Meteorological Society Justified Publishing Half-Truths

Background: In 2000, the Bulletin of the American Meteorological Society published "Impacts of Extreme Weather and Climate on Terrestrial Biota" by Camille Parmesan, Terry Root, and Michael Willig. The paper introduced to the peer-reviewed literature Parmesan's analyses claiming that extreme weather events had caused an extinction event in California's Sierra Nevada, and argued that extreme weather was the mechanism by which global warming was driving animals northward and upward, as Parmesan claimed in her first controversial paper discussed here. According to Google Scholar, the BAMS paper has been cited by 324 consensus articles. Thomson Reuters' Essential Science Indicators reported that by December 2009, Parmesan ranked #2 among highly cited authors for papers devoted expressly to global warming and climate change.

Below is a map of Parmesan's study site, first published in Singer, M., and C. D. Thomas (1996) Evolutionary responses of a butterfly metapopulation to human and climate-caused environmental variation. American Naturalist, vol. 148, p. S9–S39. I have added call-out boxes. Notice how surgically "climate change" supposedly killed individuals on the annual plant Collinsia (Xs) in the logged clearing, while just a few feet away the same species was originally reported to be thriving on its normal host plant in undisturbed habitat. The observations of those thriving populations were later "amputated" from Parmesan's extinction story that she spun in "Impacts of Extreme Weather and Climate on Terrestrial Biota".

Parmesan et al. biased their conclusion by omitting observations that all other individuals in the surrounding natural habitat had survived better than had ever been observed during the same weather events. Only the butterflies that had recently colonized a novel plant species in a highly disturbed logged area had been extirpated. If all observations had been honestly presented, it would have been both an example of nature's resilience and an example of the effect of landscape changes on microclimates. By omitting half of the data, their paper manufactured an illusion of extreme climate catastrophe, as discussed here. So I requested an official retraction. It was no more honest than Enron officials leaving half the data off their books.

http://landscapesandcycles.net/American_Meterological_Society_half-truth.html

Needless to say, the AMS refused to retract.

With a track record like this, it is little wonder Parmesan has to go abroad for funding.

December 16, 2017 Posted by | Science and Pseudo-Science, Timeless or most popular | , , | 3 Comments

The Polar-Bear-Gate Saga: How a picture is worth a thousand lies – Paul Nicklen and Michael Mann vs Susan Crockford

By Jim Steele | Watts Up With That? | December 16, 2017

What oddly seems to surprise so many people is that reality can quickly disagree with the hypotheses and speculative models of scientists. The polar bear is a rich case in point. In 2008, the polar bear was listed as threatened under the Endangered Species Act as a result of the Center for Biological Diversity's (CBD) petition. Citing hypotheses regarding future effects of increasing CO2 on sea ice and polar bear health, the CBD argued polar bears were endangered. However, then-Interior Secretary Kempthorne made it clear that "the ESA will not be used as a tool for trying to regulate the greenhouse gas emissions blamed for creating climate change." But as seen in other memos and petitions, such as for the bearded seals, the CBD ultimately wants to use the ESA as a tool to regulate CO2.

So the CBD stepped up their demands and petitioned the Obama administration to list the bears as endangered. Climate scientists Ken Caldeira and Michael Mann co-authored a 2010 letter to Interior Secretary Salazar supporting CBD efforts. They warned "sea ice has been projected to disappear in the 2030s or before" and lost sea ice was both a future and "current threat to this important habitat of the polar bear." The Polar Bear Specialist Group (PBSG), led by researchers like Andrew Derocher, Steve Amstrup and Ian Stirling, warned the world that "two thirds of the world's bears will be lost by mid-century due to climate change". The PBSG published a status table for all the polar bear sub-populations showing that, of the best studied populations, 8 were declining.

However, since 2010 those predictions have been unraveling. All the evidence now reveals polar bears are thriving and increasing, and the PBSG’s recent status tables show just that. Research by Chambellant and Stirling determined it was heavy springtime ice that was most detrimental to bears and their main prey, the ringed seal. The loss of Arctic summer sea ice was happening faster than CO2 driven models had predicted, suggesting flawed models. Research revealed that in response to the natural Arctic Oscillation, thick sea ice had been blown into the warmer Atlantic due to a directional shift in freezing winds. Further loss of Arctic sea ice has recently been shown to be caused by cycles of intruding waters from the Pacific and the Atlantic resulting in heat that gets stored in the subsurface of the Arctic Ocean, dynamics that have not been accurately incorporated into global climate models. Accordingly, the loss of sea ice has not accelerated. Instead the loss has slowed considerably.

Skeptics argued that such evidence challenges prevailing hypotheses about the polar bears' demise and questioned the contention that greenhouse gases are the primary cause of sea ice fluctuations. Driven by the hubris of scientists like Michael Mann, whose careers are totally invested in the "dire predictions" of rising CO2, the normal scientific process of challenging a hypothesis was framed as an "attack on science".

Again in 2010, in the paper Climate Change and the Integrity of Science, Peter Gleick wrote, "We are deeply disturbed by the recent escalation of political assaults on scientists in general and on climate scientists in particular." Accompanying his paper (below) was a photo-shopped picture of a polar bear stranded on a shrinking piece of ice, a deception that skeptics quickly pointed out.


So the following correction was placed in the paper’s online version.

“Due to an editorial error, the original image associated with this Letter was not a photograph, but a collage. The image was selected by the editors [of Science, the journal of the American Association for the Advancement of Science], and it was a mistake to have used it. The original image has been replaced in the online HTML and PDF versions of the article with an unaltered photograph from National Geographic.”

That replacement picture (below) was from National Geographic photographer Paul Nicklen, who would become infamous for specializing in dead and skinny polar bear photos. If Gleick or his editors were pulling photos from an archive (National Geographic?) of photographs, then the question arises whether the fake collage was also the work of the same photographer. And if so, for what purpose were they creating such a dishonest photo? The timing of the article and fake photo also raised suspicions from skeptics, as it coincided with the Center for Biological Diversity's campaign to up-list the polar bear from threatened to endangered.


Despite having "carelessly" used a fake photo, Gleick was anointed the Chairman of the new task force on "scientific ethics and integrity" for the American Geophysical Union in 2011. Leading by example, in 2012 Gleick was outed in a flagrant attempt to anonymously smear the Heartland Institute's climate skepticism by disseminating dishonestly obtained documents, including a damning but forged memo. Quickly identified by internet skeptics, Gleick finally confessed. Although the forged document was only being disseminated by Gleick, he denied any hand in the forgery, and there was not enough evidence to convict him of it. In a KQED interview, Michael Mann, likely motivated by self-protection, downplayed Gleick's underhanded actions as "poor judgement". Mann then argued the release of the climategate emails, emails that had exposed Mann's own underhanded methods, was a more dastardly deed. To this day, it is still unknown whether the release of the climategate emails was the work of a whistle-blower or a hacker.

However, consistent with his efforts to promote polar bears as an icon of catastrophic global warming, Mann expressed no concern about Gleick's fake polar bear picture. Indeed, Mann was actively trying to pull on heartstrings by mewing in the CBD release, "When I ventured up to Hudson Bay in mid-November and saw the undernourished polar bears with their cubs, sitting around at the shore of the Hudson Bay, waiting for the then month-overdue sea ice to arrive so they could begin hunting for food, it suddenly came home for me. For the first time in my life, I actually saw climate change unfolding before my eyes. It was a sobering moment, and one I'll never forget." In contrast to such storytelling, the unpublished research data from Stirling and Lunn determined that the Body Condition Index for Hudson Bay bears had been improving since 1998 (in Landscapes and Cycles, p. 217). Improving body condition was also consistent with the increasing number of Hudson Bay bears estimated in subsequent surveys.

Susan Crockford runs the website polarbearscience.com, which aggregates the most up-to-date, peer-reviewed science and media releases by polar bear researchers. For example, Crockford reported the latest survey showing a healthy rebounding Western Hudson Bay population, months before the Polar Bear Specialist Group (PBSG) researchers publicized the increase. The PBSG had incorrectly predicted a dramatic decline in Hudson Bay bears, so their tardiness to expose their own shortcomings is understandable. Crockford also reported the lack of consensus among polar bear researchers. While Environment Canada agreed with the latest survey that estimated a healthy 1,030 Western Hudson Bay bears, PBSG alarmist Andrew Derocher was actively pushing a much lower estimate of 800 bears to the media and suggesting the bears were doomed. This too is understandable, as Derocher was invested in his earlier predictions that "by the middle of this century, two-thirds of the polar bears will be gone from their current populations".

Nonetheless, despite multiple surveys suggesting polar bear abundance was and is increasing, others tried to deny the evidence and suggest bears were starving and still on the brink of extinction. In 2015, photos by Kerstin Langenberger and once again by Paul Nicklen were pumped on social media, suggesting bears were suffering from a climate catastrophe. Who were these photographers?


The dying bear above was put on Facebook by Kerstin Langenberger whom internet articles referred to as just a German photographer. But a little digging revealed she is a Greenpeace activist, which is consistent with her catastrophic narratives that accompanied her photo and contradicted our best science. She stated, “With the pack ice retreating further and further north every year, they tend to be stuck on land where there’s not much food,” and “many times I have seen horribly thin bears, and those were exclusively females – like this one here” and “Only once I have seen a bear getting a big fat ‘5,’ but several times I have seen dead bears and bears like this one: a mere ‘1’ on the scale, doomed to death.” [polar bears’ body condition is often rated from 1(dangerously thin) to 5 (fat)].

However, contradicting Langenberger's narrative, Norwegian Polar Institute researcher Kit Kovacs stated there's reason to question claims that the number of animals experiencing such hardships is increasing: "Our monitoring work indicates that (on-average) bears in the Svalbard population have NOT declined in condition over the last two decades – based on male body masses and fat levels". Similarly, in the South Beaufort Sea population, female body condition had improved despite reduced summer ice.


Also in 2015, Nicklen posted his photo of a dead bear that went viral. Journalist Andrew Freedman promoted the picture in Mashable, writing, "Global warming may have led to the death of this polar bear." Presenting a thin veneer of objectivity, he quotes polar bear researcher Ian Stirling, who suggested that Nicklen's photo shows a bear that most likely, but not certainly, died as a result of starvation related to sea ice melt. But Stirling's remarks must be taken with a grain of salt, as there is absolutely no evidence to show why the bear died. Furthermore, Stirling has appeared slightly schizophrenic lately, as has been detailed. For example, despite his research showing cycles of heavy spring ice had been most detrimental to seals and bears, Stirling and Derocher's review of polar bear "science" used the very same research to falsely imply that less summer ice was the problem.

In contrast to those 2015 photos, Crockford's website was one of the few places where scientific reports of a healthy bear population could be found. Contradicting Langenberger and Nicklen's storytelling of dead bears strewn across Svalbard due to climate change, Crockford posted links to actual researchers from the Norwegian Polar Institute who reported fat bears in Svalbard.

Researchers were reporting:

"The polar bears on Svalbard is round and full, thanks to a good [ice year] and good hunting opportunities." And "… Polar bears were fat, many looked like pigs", says polar researcher at the Norwegian Polar Institute, Jon Aars, to the High North News. Furthermore, the Svalbard bears are part of the Barents Sea population, and in 2017 Crockford relayed the most recent survey data showing Barents Sea bears have been increasing. But such facts don't have the same emotional appeal as Nicklen's fanciful pictorial storytelling.

The Polar Bear Specialist Group (PBSG) had created a status table in 2009 to illustrate the trends of each polar bear population. Above is their 2010 version. The trends are boldly shown in red for declining and green for stable or increasing populations. Eight populations were believed to be declining, of which 6 were considered very likely to decline further. Only 3 populations were considered stable and only 1 was increasing. These declining PBSG estimates also went viral, and websites such as the one run by psychologist John Cook, who is now part of the well-funded Center for Climate Change Communication, posted an article concluding, "Current analysis of subpopulations where data is sufficient clearly shows that those subpopulations are mainly in decline", thus supporting the ESA listing of polar bears as threatened. In contrast, in Landscapes and Cycles I documented, based on the latest research, how bear populations since 2010 were definitely increasing. That analysis has been confirmed, while the earlier PBSG hype of declining populations and speculation of coming extirpations have not survived the test of time.

Fortunately, Susan Crockford's Polar Bear Science blog has continuously discussed population trends as reported by bear experts, plus PBSG's status updates. While the PBSG removes their old tables, Crockford's website serves as an archive that allows the public to readily witness how the bears have been increasing. For example, the 2014 table (below) revealed the good news that only 3 of the past 8 populations were still declining, one was still increasing, and the stable populations had doubled to 6.

Oddly, in 2017 the PBSG eliminated the trends from their population table. The most likely reason for this omission would be that none of the bear populations are currently declining. Every population would be green or data deficient. Despite rising CO2 and reduced summer sea ice, polar bears are doing quite well, and that contradicts their predictions.

Of the 3 previously declining populations listed in their 2014 status report, the Baffin Bay population has now increased from 1,546 in 2004 to 2,826 in the most recent survey. The Kane Basin bears, which suffer from heavy ice, were estimated at 167 in 1997 but rose to 357 in 2014. The South Beaufort Sea population estimate remained unchanged, but that estimate has been heavily criticized for poor analyses of mark and recapture data.
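For scale, here is a quick calculation of the percentage increases implied by the survey figures just quoted (a minimal sketch in Python; the numbers are simply those given above):

# Percentage increases implied by the survey figures quoted above.
surveys = {"Baffin Bay": (1546, 2826), "Kane Basin": (167, 357)}
for population, (earlier, later) in surveys.items():
    growth = (later - earlier) / earlier
    print(f"{population}: {earlier} -> {later} (+{growth:.0%})")
# Baffin Bay works out to roughly +83%, Kane Basin to roughly +114%.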



In the face of rapid increases in the Baffin Bay bear population, a social media splash of Nicklen's starving bear on Baffin Island appears to be another orchestrated attempt to resuscitate the failing claim that climate change is killing bears. National Geographic, which sponsored Nicklen, reports that by "telling the story of one polar bear, Nicklen hopes to convey a larger message about how a warming climate has deadly consequences." The NY Times pushed the video with a similar headline: Video of Starving Polar Bear 'Rips Your Heart Out of Your Chest'. The Washington Post hyped the bear as evidence of an environmental disaster with the headline, 'We stood there crying': Emaciated polar bear seen in 'gut-wrenching' video and photos. If you search the internet for an objective scientific examination, oddly, no matter how many variations of "starving polar bears" are queried, Google's first link brings up the WWF's plea for money to save the bears, perhaps a violation of net neutrality.

Snopes, which advertises itself as a fact-checker of truth, rated Nicklen's starving bear video as "TRUE". But Snopes' bias is revealed by its discussion of the photo's relevance, which pushes catastrophic climate change speculation. Snopes quotes polar bear researcher Steve Amstrup, who has flip-flopped on several bear issues over his career and whose "expertise model" has been severely criticized by colleagues in released emails. Amstrup promotes the starving bear photo on his website, again with the obligatory thin veneer of objectivity, stating, "we cannot say, from the footage captured here, that this bear's malnutrition was caused by global warming and its associated sea ice loss". He then launches his speculative catastrophic message, "The problem is that an ever-warmer future means polar bears will have less and less access to their seal prey, so the rate at which bears die from malnutrition/starvation will increase. So, regardless of the proximate cause of this bear's condition, this heart-wrenching footage provides us with a warning about the future." Yet not a word about the survey of Baffin Bay bears robustly increasing from 1,546 in 2004 to 2,826 today.


Amstrup and Mann are facing an embarrassing professional dilemma. With all the polar bear populations increasing or stable, their predictions that two-thirds of the polar bears will be gone by the middle of this century appear destined for utter failure. They had to do something. Otherwise, who would trust a doctor whose past diagnoses were absolutely wrong? So Harvey, Stirling, Amstrup, Mann and a professor of psychobabble, Stephan Lewandowsky, banded together as coauthors of the paper Internet Blogs, Polar Bears, and Climate-Change Denial by Proxy, which fortuitously gets publicized alongside Nicklen's starving bear hype.

Their paper acknowledges observations that polar bears have yet to be harmed, writing, "Although the effects of warming on some polar-bear subpopulations are not yet documented and other subpopulations are apparently still faring well." But they then confuse speculation with proven facts by suggesting "the fundamental relationship between polar-bear welfare and sea-ice availability is well established." Clearly the growing bear populations present an undeniable challenge to any belief in the "requirement" of summer ice.

Their paper argued that "a growing body of scientific research reports the wide array of negative effects of AGW on biodiversity" by citing Parmesan, whose bogus claims about the negative effects of climate change on wildlife are well documented. Harvey, Stirling, Amstrup and Mann confuse speculative hypotheses with a "fundamental relationship". Published observations have shown heavy springtime ice is more harmful for seals and bears. Observations by Arrigo determined that reduced ice, whether natural or anthropogenic, has increased phytoplankton productivity and bolstered the Arctic food web, while fishery researchers find that less ice and warmer temperatures increase the abundance of Arctic cod, which is required to sustain the seals that sustain the bears.

Because skeptic websites like Crockford's polarbearscience.com, Anthony Watts' WUWT, and many others are the best source for alternative explanations that challenge catastrophic hypotheses, they are denigrated by these supposedly objective scientists. As mounting evidence continues to turn against their prior polar bear predictions, Harvey, Stirling, Amstrup, Mann and Lewandowsky were running low on scientific ammunition. So they chose to publish a paper aimed solely at shooting the messengers. They offered no scientific facts about polar bears that contradicted anything Crockford had published. Their arguments were based solely on the fallacy of authority, authorities whose predictions are failing. Their paper is nothing more than a smear campaign hoping to suppress the upwelling call for more debate. Such tactics, tactics that try to obscure any evidence that challenges a failing hypothesis, are the real attacks on the scientific process. That is why Mann has been labeled by some as a disgrace to the profession. And whether or not Nicklen's latest wretched polar bear photo is part of an orchestrated attempt to resuscitate their failed predictions, the media hype reveals that such photos, taken out of context, are worth a thousand lies.

Jim Steele is Director emeritus Sierra Nevada Field Campus, San Francisco State University and author of Landscapes & Cycles: An Environmentalist’s Journey to Climate Skepticism

December 16, 2017 Posted by | Science and Pseudo-Science, Timeless or most popular | Leave a comment

The Environment: A True Story

John Robson | October 2, 2017

A comparison of alarmist claims about man-made global warming with widely accepted scientific facts about the past history and current condition of the Earth.

December 1, 2017 Posted by | Deception, Science and Pseudo-Science, Timeless or most popular, Video | Leave a comment

A veneer of certainty stoking climate alarm

In private, climate scientists are much less certain than they tell the public. – Rupert Darwall

Rupert Darwall has written a tour-de-force essay “A Veneer of Certainty Stoking Climate Alarm“, which has been published by CEI [link to full essay].

Foreword

I was invited to write a Foreword, which provides context for the essay:

While the nations of the world met in Bonn to discuss implementation of the Paris Climate Agreement, the Trump administration was working to dismantle President Obama’s Clean Power Plan and to establish a climate “red team” to critically evaluate the scientific basis for dangerous human-caused climate change and the policy responses.

The mantra of “settled science” is belied by the inherent complexity of climate change as a scientific problem, the plethora of agents and processes that influence the global climate, and disagreements among scientists. Manufacture and enforcement of a “consensus” on the topic of human-caused climate change acts to the detriment of the scientific process, our understanding of climate change, and the policy responses. Indeed, it becomes a fundamentally anti-scientific process when debate, disagreement, and uncertainty are suppressed.

This essay by Rupert Darwall explores the expressions of public certainty by climate scientists versus the private expressions of uncertainty, in context of a small Workshop on Climate organized by the American Physical Society (APS). I was privileged to participate in this workshop, which included three climate scientists who support the climate change consensus and three climate scientists who do not—all of whom were questioned by a panel of distinguished physicists.

The transcript of the workshop is a remarkable document. It provides, in my opinion, the most accurate portrayal of the scientific debates surrounding climate change. While each of the six scientists agreed on the primary scientific evidence, we each had a unique perspective on how to reason about the evidence, what conclusions could be drawn and with what level of certainty.

Rupert Darwall’s essay provides a timely and cogent argument for a red/blue team assessment of climate change that provides both sides with an impartial forum to ask questions and probe the other side’s case. Such an assessment would both advance the science and open up the policy deliberations to a much broader range of options.

Excerpts

Here are some highlights from the full essay (but you’ll want to read the whole thing!):

Introduction. How dependable is climate science? Global warming mitigation policies depend on the credibility and integrity of climate science. In turn, that depends on a deterministic model of the climate system in which it is possible to quantify the role of carbon dioxide (CO2) with a high degree of confidence. This essay explores the contrast between scientists’ expressions of public confidence and private admissions of uncertainty on critical aspects of the science that undergird the scientific consensus.

Instead of debating, highlighting and, where possible, resolving disagreement, many mainstream climate scientists work in a symbiotic relationship with environmental activists and the news media to stoke fear about allegedly catastrophic climate change, providing a scientific imprimatur for an aggressive policy response while declining to air private doubts and the systematic uncertainties.

Two Statements, Two Perspectives. Two statements by two players in the climate debate illustrate the gap between the certainty that we are asked to believe and a branch of science shot through with uncertainty. “Basic physics explains it. If global warming isn’t happening, then virtually everything we know about physics is wrong,” states Jerry Taylor, president of a group that advocates for imposing a carbon tax on the United States. In so many words, Taylor says that the case for cutting carbon dioxide emissions is incontrovertible: Science demands conservatives support a carbon tax.

The second statement was made by an actual climate scientist, Dr. William Collins of the Lawrence Berkeley National Laboratory. Speaking in 2014 at an American Physical Society climate workshop, Collins, who was a lead author of the chapter evaluating climate models in the 2013 Intergovernmental Panel on Climate Change's (IPCC) Fifth Assessment Report, talked of the challenges of dealing with several sources of uncertainty. "One of them is the huge uncertainties even in the historical forcings," he said. Commenting on the "structural uncertainty" of climate models, he observed that there were "a number of processes in the climate system we just do not understand from basic physical principles. … We understand a lot of the physics in its basic form. We don't understand the emergent behavior that results from it."

The 2014 APS Climate Workshop: A Perfect Venue for Open Debate. Things are different when climate scientists are on the stand alongside their peers who know the science as well as they do, but disagree with the conclusions they draw from the same body of knowledge. Such open debate was on display at the 2014 American Physical Society climate workshop, which took place in Brooklyn and lasted just over seven hours. A unique event in the annals of the climate debate, it featured three climate scientists who support the climate change consensus and three climate scientists who do not. That format required an unusual degree of honesty about the limitations of the current understanding of the climate system. For the most part, circumspection, qualification, and candid admissions of lack of knowledge were the order of the day.

The IPCC’s Use and Abuse of Climate Models. The discussion in Brooklyn shows that putting the words “gold standard” and “IPCC” in the same sentence demonstrates a serious misunderstanding of the reliability of IPCC-sanctioned climate science.

“It’s clouds that prevent us from fundamentally in some reductive fashion understanding the climate system,” Princeton Atmospheric and Oceanic Sciences Professor Isaac Held, senior research scientist at the National Oceanic and Atmospheric Administration’s (NOAA) Geophysical Fluid Dynamics Laboratory, declared from the IPCC climate consensus bench. Collins made a similar point toward the end of the session. “My sense, to be honest with you, is that, and I think this all makes us a little bit nervous,” he said; “climate is not a problem that is amenable necessarily to reductionist treatment.”

Yet the IPCC’s top-line judgment in its Fifth Assessment Report—that it is “extremely likely” that the human emissions of greenhouse gases are the dominant cause of the warming since the mid-20th century—was described by Dr. Ben Santer of the Lawrence Livermore National Laboratory as likely to be conservative. The basis for this claim? General circulation models.

Santer's claim would have sounded impressive if earlier in the day Collins had not presented charts showing GCMs performing poorly in reproducing temperature trends in the first half of the 20th century. Lindzen asked, what in the models causes the 1919-1940 warming? "Well, they miss the peak of the warming," Held replied. While the IPCC is extremely certain that the late 20th century warming is mostly man-made, to this day it cannot collectively decide whether the earlier warming, which is of similar magnitude to the one that started in the mid-1970s, is predominantly man-made or natural. "It actually turns out to be very hard to use the past as prologue," Collins conceded before explaining: "We do not have a first principles theory that tells us what we have to get right in order to have an accurate projection." And, as Held noted, over the satellite era from 1979, GCMs overestimated warming in the tropics and the Arctic.

Steven Koonin, chairing the APS workshop, read an extract from chapter 10 of the IPCC’s Fifth Assessment Report. Model-simulated responses to forcings—including greenhouse gas forcings—“can be scaled up or down.” To match observations, some of the forcings in some of the models had to be scaled down. But when it came to making the centennial projections, the scaling factors were removed, probably resulting in a 25 to 30 percent over-projection of the 2100 warming, Koonin said. Only the transcript does full justice to the exchange that followed.

Dr. Koonin: But if the model tells you that you got the response to the forcing wrong by 30 percent, you should use that same 30 percent factor when you project out a century.
Dr. Collins: Yes. And one of the reasons we are not doing that is we are not using the models as [a] statistical projection tool.

Dr. Koonin: What are you using them as?
Dr. Collins: Well, we took exactly the same models that got the forcing wrong and which got sort of the projections wrong up to 2100.
Dr. Koonin: So, why do we even show centennial-scale projections?
Dr. Collins: Well, I mean, it is part of the [IPCC] assessment process.

“It is part of the assessment process” is not a scientific justification for using assumptions that are known to be empirically wrong to produce projections that help drive the political narrative of a planet spinning toward a climate catastrophe.
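To see why the removed scaling factors matter, here is a minimal sketch of the arithmetic behind Koonin’s point (an illustration only; the helper name and the specific factors are my own, chosen to match the 25 to 30 percent figure):

```python
# Illustration only, not taken from the APS transcript: if a model's simulated
# response to historical forcing must be multiplied by a scaling factor s < 1
# to match observations, but the centennial projection is made with the
# unscaled response, the projection overshoots the scaled one by 1/s - 1.

def projection_overshoot(scaling_factor: float) -> float:
    """Fractional overshoot of a projection made without the scaling factor."""
    return 1.0 / scaling_factor - 1.0

# Scaling factors of roughly 0.77 to 0.80 correspond to the 25 to 30 percent
# over-projection of 2100 warming that Koonin describes.
for s in (0.77, 0.80):
    print(f"scaling factor {s:.2f} -> projection overshoot {projection_overshoot(s):.0%}")
```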

Climate Science and Falsifiability. A lively exchange developed between Christy and Santer. Georgia Tech’s Dr. Judith Curry, the third member of the critics’ bench, had crossed swords with Santer on whether the IPCC’s statement that more than half the observed warming was anthropogenic was more than expert judgment. In subsequent testimony to the House Science, Space, and Technology Committee, Curry explained:

Science is often mischaracterized as the assembly and organization of data and as a collection of facts on which scientists agree. Science is correctly characterized as a process in which we keep exploring new ideas and changing our understanding of the world, to find new representations of the world that better explain what is observed. … Science is driven by uncertainty, disagreement, and ignorance—the best scientists cultivate doubt.

Curry’s approach to science stands firmly on the methods and philosophical standards of the scientific revolution—mankind’s single greatest intellectual achievement.

Politicized Science vs. Red/Blue Team Appraisals. The APS workshop provides the strongest corrective to date to the politicized IPCC process. It revealed the IPCC’s unscientific practice of using different assumptions for projecting future temperature increases from those used to get models to reproduce past temperature. One need not be a climate expert to see that something is seriously amiss with the near certainties promulgated by the IPCC. “I have got to say,” Koonin remarked to climate modeler William Collins, “that this business is even more uncertain than I thought, uncertainties in the forcing, uncertainties in the modelling, uncertainties in historical data. Boy, this is a tough business to navigate.”

Koonin came away championing Christy’s idea of a red/blue team appraisal, a term drawn from war-gaming assessments performed by the military rather than from politics, which EPA Administrator Scott Pruitt has since adopted.

A revealing indicator of its potential value is the response to it. A June 2017 Washington Post op-ed condemned calls for red/blue team appraisals as “dangerous attempts to elevate the status of minority opinions.”

Conclusion: Climate Policy’s Democratic Deficit. Open debate is as crucial in science as it is in a democracy. It would be contrary to democratic principles to dispense with debate and rely on the consensus of experts. The latter mode of inquiry inevitably produces prepackaged answers. But, as we have seen, relying on “consensus” buttresses erroneous science rather than allowing it to be falsified.

The IPCC was created to persuade, not to provide objectivity or to air disagreement. By contrast, the APS workshop gave both sides an impartial forum in which they could ask questions and probe the other side’s case. In doing so, it did more to expose the uncertainty, disagreement, and ignorance—to borrow Judith Curry’s words—around climate science than thousands of pages of IPCC assessment reports.

EPA Administrator Scott Pruitt’s proposal for red/blue team assessment is a logical progression from the workshop. The hostile reaction it elicited from leading consensus advocates strongly suggests that they fear debate. Climate scientists whose mission is to advance scientific understanding have nothing to fear and much to gain. Those who seek to use climate science as a policy battering ram have good reason to feel uncomfortable at the prospect. The biggest winner from a red/blue team assessment will be the public. If people are to buy into policies that will drastically alter their way of life, they should be fully informed of the consequences and justifications. To do otherwise would represent a subversion of democracy.

JC reflections

I’m very pleased to have this opportunity to revisit the APS Workshop on Climate Change, and am delighted to see a journalist of Darwall’s caliber interpreting it. Darwall’s essay provides an eloquent argument in support of a climate red team, perhaps more so than what I, Steve Koonin or John Christy have provided.

The thing that really clicked in my brain was this statement by Bill Collins:

We understand a lot of the physics in its basic form. We don’t understand the emergent behavior that results from it.

Trying to leverage our understanding of the infrared emission spectra from CO2 and other ‘greenhouse’ gases into a ‘consensus’ on what has caused the recent warming in a complex chaotic climate system is totally unjustified — this is eloquently stated by Bill Collins.

My key takeaway from the APS Workshop is that the scientists on both sides are considering the same datasets and generally agree on their utility (the exception being the two-decade debate between Santer and Christy on the uncertainties/utility of the satellite-derived tropospheric temperature data). There is some disagreement regarding climate models, although I can’t say that I disagreed with much if anything said by Held and Collins in this regard.

The real issue is the logic used in linking the varied and sundry lines of evidence, in drawing conclusions, and in assessing the uncertainties and areas of ambiguity and ignorance. I don’t think that any of the six scientists were using the same chain of reasoning about all this, even among scientists on the same ‘sides’.

Darwall’s essay deserves to be widely read. Here’s hoping that it will reignite the discussion surrounding the climate red team.

November 29, 2017 Posted by | Science and Pseudo-Science, Timeless or most popular

Global Temperature Trends Based On Non-Existent Data

By Paul Homewood | Not A Lot Of People Know That | November 28, 2017

[Graph: HadCRUT4 annual global mean temperature anomaly, from https://www.metoffice.gov.uk/hadobs/hadcrut4/diagnostics.html]

We are all too familiar with graphs showing how much global temperatures have risen since the 19thC.

The HADCRUT version above is typical, and also very precise, with fairly tight error bars even in the early part of the record.

One wonders where they got the data to work all this out, because it certainly could not have come from thermometers.

All of the major global temperature datasets rely heavily on the Global Historical Climatology Network (GHCN). Yet as the “Overview of the Global Historical Climatology Network-Daily Database”, published by Matthew Menne et al. in 2012, rather inconveniently showed, most of the world had little or no temperature data in the 19thC, and even up to 1950.

[Figure: Density of GHCN-Daily stations with daily maximum and minimum temperature, from http://journals.ametsoc.org/doi/full/10.1175/JTECH-D-11-00103.1]

Prior to 1950, there were no more than a couple of hundred or so GHCN stations outside of North America:

[Figure 3 from Menne et al. (2012)]

There are many competent scientists and statisticians who believe that even now it is not possible to measure the Earth’s average temperature, indeed that it is a meaningless concept.

Whether they are right or not, no serious scientist would claim to know the global temperature a century or more ago.

November 29, 2017 Posted by | Deception, Science and Pseudo-Science, Timeless or most popular

The big slide in renewable energy tells the real story

No, renewables are not taking over the world anytime soon.

By Bjørn Lomborg | Watts Up With That? | November 26, 2017

We have spent the last two centuries getting off renewables because they were mostly weak, costly and unreliable. Half a century ago, in 1966, the world got 15.6% of its energy from renewables. Today (2016) the share is lower still, at 13.8%.

With our concern for global warming, we are ramping up the use of renewables. Mainstream reporting would have you believe that renewables are just about to power the entire world. But this is flatly wrong.

The new World Energy Outlook report from the International Energy Agency shows how much renewables will increase over the next quarter century, to 2040. In its New Policies Scenario, which rather optimistically expects all nations to live up to their Paris climate promises, the renewables share increases by less than 6 percentage points, from 13.8% to 19.4%. More realistically, the increase will be about 2 percentage points, to 15.8%.

Most of the renewables are not solar PV and wind. Today, almost 10 percentage points come from the world’s oldest fuel: wood. Hydropower provides another 2.5 percentage points and all other renewables provide just 1.6 percentage points, of which solar PV and wind provide 0.8 percentage points.

Nor will most renewables in 2040 come from solar PV and wind, as breathless reporting tends to make you believe. Ten percentage points will come from wood. Hydropower will provide another 3 percentage points and all other renewables 6 percentage points, of which solar PV and wind will (very optimistically) provide 3.7 percentage points.
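As a rough sanity check, the percentage-point figures quoted above can be tallied directly. A minimal sketch (the shares are rounded, so the totals only approximately match the 13.8% and 19.4% headline figures):

```python
# Rounded renewable shares of total world energy, in percentage points,
# as quoted above from the IEA's World Energy Outlook 2017.
shares_2016 = {"wood/biomass": 10.0, "hydro": 2.5, "other renewables": 1.6}
shares_2040 = {"wood/biomass": 10.0, "hydro": 3.0, "other renewables": 6.0}  # New Policies Scenario

for year, shares in (("2016", shares_2016), ("2040", shares_2040)):
    total = sum(shares.values())
    print(f"{year}: ~{total:.1f} percentage points of world energy from renewables")

# Within "other renewables", solar PV and wind account for roughly 0.8 points
# today and, very optimistically, about 3.7 points in 2040.
```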

Oh, and to achieve this 3.7% of energy from solar PV and wind, you and I and the rest of the world will pay – according to the IEA – a total of $3.6 trillion in subsidies from 2017-2040 to support these uncompetitive energy sources. (Of course, if they were competitive, they wouldn’t need subsidies, and then they would be most welcome.)

Most people think of renewables in terms of electricity, but the world uses plenty of energy that is not electricity (heat, transport, manufacturing and industrial processes).

Actually, if the world miraculously could make the *entire* global electricity sector 100% green without emitting a single ton of greenhouse gases, we would have solved just a third of the total global greenhouse gas problem.
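The arithmetic behind “just a third” is simple; a minimal sketch using the CAIT figures cited in the references below:

```python
# Figures from the CAIT reference below: total global emissions of about
# 49 Gt CO2e in 2014, of which electricity and heat accounted for roughly 15 Gt.
electricity_and_heat_gt = 15.0
total_emissions_gt = 49.0

share = electricity_and_heat_gt / total_emissions_gt
print(f"Electricity and heat: about {share:.0%} of global greenhouse gas emissions")
# Roughly 31 percent, so fully decarbonising electricity would still leave
# more than two thirds of the greenhouse gas problem unsolved.
```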

As Al Gore’s climate adviser, Jim Hansen, put it bluntly:

“Suggesting that renewables will let us phase rapidly off fossil fuels in the United States, China, India, or the world as a whole is almost the equivalent of believing in the Easter Bunny and [the] Tooth Fairy.”

We need to get real on renewables. Only if green energy becomes much cheaper – and that requires lots of green R&D – will a renewables transition be possible.


References

Data for graph: “A brief history of energy” by Roger Fouquet, International Handbook of the Economics of Energy 2009; IEA data DOI: 10.1787/enestats-data-en, and World Energy Outlook 2017, unfortunately not free, https://www.iea.org/weo2017/

Hansen quote: http://www.columbia.edu/…/mail…/2011/20110729_BabyLauren.pdf

The world emitted 49 Gt CO₂e in 2014, and all electricity/heat came to 15 Gt, or less than a third: http://cait.wri.org/profile/World.

November 28, 2017 Posted by | Deception, Economics, Fake News, Mainstream Media, Warmongering, Science and Pseudo-Science, Timeless or most popular