Once you tug on the thread of undisclosed financial interests in climate science, you’ll find it more a norm than exception. – Roger Pielke Jr (tweet)
I started working on this post last week, in response to the Willie Soon imbroglio. This whole issue has now become personal.
In case you haven’t been following this, Justin Gillis broke the story on Willie Soon with this article Deeper Ties to Corporate Cash for Doubtful Climate Researcher. The Smithsonian issued the following statement on the issue of Soon’s funding and apparent failure to disclose this funding in journal publications. Science Magazine has a summary [here] and Nature has a summary [here].
The ‘plot’ thickened yesterday, as Arizona Congressman Raul Grijalva (Democrat) Asks for Conflict-of-Interest Disclosures from GOP’s Go-To Climate Witnesses [link]. Excerpts:
The conflict-of-interest scandal involving a climate denier secretly funded by the fossil-fuel industry is spreading to other academics who oppose regulation of climate pollution. A top House Democrat has issued letters asking several researchers who have appeared as Republican witnesses before Congress questioning climate science to disclose their funding sources.
“I am hopeful that disclosure of a few key pieces of information will establish the impartiality of climate research and policy recommendations published in your institution’s name and assist me and my colleagues in making better law,” Grijalva wrote. “Companies with a direct financial interest in climate and air quality standards are funding environmental research that influences state and federal regulations and shapes public understanding of climate science. These conflicts should be clear to stakeholders, including policymakers who use scientific information to make decisions. My colleagues and I cannot perform our duties if research or testimony provided to us is influenced by undisclosed financial relationships.”
The letters request the institutions’ disclosure policies, drafts and communications relating to Congressional testimony, and sources of external funding for the academics in question.
The disclosure requests are needed because Congressional “truth in testimony” rules require witnesses to disclose government funding sources, but not private or corporate funding. Under Republican control, the rules are unevenly implemented, with not-for-profit witnesses required to submit pages of additional disclosures, while corporate-sector witnesses are not.
The seven academics who dispute the scientific consensus on anthropogenic global warming and who have been asked to disclose their funding are:
David Legates, John Christy, Judith Curry, Richard Lindzen, Robert Balling, Roger Pielke Jr., Steven Hayward.
A copy of the letter from Grijalva that was sent to President Peterson of Georgia Tech is [here].
An article in ClimateWire provides additional context [link].
Skip to JC reflections for my punch line.
Conflict in scientific publication
Conflict of interest related to industry funding is a very big issue in biomedical research (related to drug and food safety) and in research on environmental contaminants. It is not a big issue in other scientific fields: apart from expecting scientists to describe funding sources in the Acknowledgements, many journals do not have any conflict-of-interest disclosure requirements at all.
For those journals that do have such requirements, the requirements for disclosure are vastly different. As examples:
Nature : In the interests of transparency and to help readers to form their own judgements of potential bias, Nature journals require authors to declare to the editors any competing financial interests in relation to the work described. The corresponding author is responsible for submitting a competing financial interests statement on behalf of all authors of the paper. Authors submitting their manuscripts using the journal’s online manuscript tracking system are required to make their declaration as part of this process and to specify the competing interests in cases where they exist. The definition of conflict of interest relates to funding sources, employment, and personal financial interests.
Science : Science goes further with this statement: Management/Advisory affiliations: Within the last 3 years, status as an officer, a member of the Board, or a member of an Advisory Committee of any entity engaged in activity related to the subject matter of this contribution. Please disclose the nature of these relationships and the financial arrangements. Within the last 3 years, receipt of consulting fees, honoraria, speaking fees, or expert testimony fees from entities that have a financial interest in the results and materials of this study.
Wow. I haven’t published anything in Science in recent years (and never as a first author). So, all those scientists serving on Boards of green advocacy groups [Climate Scientists Joining Green Advocacy Groups] who publish in Science on any environmental or climate change topic should be declaring a conflict of interest.
So, once an author of a climate change paper declares a conflict of interest, what is that supposed to mean? An article in Science Magazine addresses this issue:
Conflict-of-interest controversies are rare in her field, she notes, and “they can be tricky.” Conflict is often in the eye of the beholder, she says, and researchers often accept all kinds of funding that doesn’t necessarily skew their peer-reviewed publications. “I’m for full disclosure,” she says, “but I’m not sure how we’re going to address this.” The journal, published by Elsevier, asks authors to fill out a conflict-of-interest disclosure. But Strangeway admits he’s never carefully examined one—and isn’t sure what he’s supposed to do if he sees a red flag. “We wouldn’t be raising the journal issue if [Soon] had simply disclosed Southern’s support,” he says.
Scientific journals are being alerted by watchdog groups to fossil fuel funding of contrarian climate studies [link]. Are we not to be concerned by fossil fuel funding of consensus climate science (there is plenty of that, see below)? Are we not to be concerned by funding from green advocacy groups and scientists serving on the Boards of green advocacy groups?
DeSmog surprised me with this article: How Often Were Willie Soon's Industry-funded Deliverables Referenced by the IPCC? I was surprised to find that published journal papers with ties to industry made it into the IPCC, to counter all those gray-literature articles by Greenpeace et al.
So, in climate science, what is the point of conflict of interest disclosure? Bishop Hill sums it up this way:
As far as I can see, the story is that Soon and three co-authors published a paper on climate sensitivity. At the same time (or perhaps in the past – this being a smear-job it’s hard to get at the facts) he was being funded to do work on things like the solar influence on climate by people that greens feel are the baddies. They and the greens feel he should have disclosed that baddies were paying him to do stuff on a paper that was not funded by the baddies.
The issue is this. The intense politicization of climate science makes bias more likely to come from political and ideological perspectives than from funding sources. Unlike research related to food and drug safety and environmental contaminants, most climate science is easily replicable using publicly available data sets and models. So all this, IMO, is frankly a red herring in the field of climate science research.
Bottom line: Scientists, pay attention to conflict of interest guidelines for journals to which you are submitting papers. Select journals that have COI disclosure requirements that are consistent with your comfort level.
Conflict in Testimony
The HillHeat article provides links to the relevant testimony by the 7 individuals (see original article for actual links):
- David Legates, Department of Agricultural Economics & Statistics, University of Delaware climatologist (6/3/14, 7/29/03, 3/13/02)
- John Christy, University of Alabama atmospheric scientist (12/11/13, 9/20/12, 8/1/12, 3/31/11, 3/8/11, 2/25/09, 7/27/06 (video), 5/13/03, 5/2/01, 5/17/00, 7/10/97)
- Judith Curry, Georgia Institute of Technology climatologist (1/16/14, 4/25/13, 11/17/10)
- Richard Lindzen, Massachusetts Institute of Technology atmospheric physicist (11/17/10, 5/2/01, 7/10/97, 1991 (Senate), 10/8/91)
- Robert C Balling Jr, Arizona State University geographer (3/6/96; North Carolina Legislature 3/20/06)
- Roger Pielke Jr, University of Colorado political scientist (12/11/13, 7/18/13, 3/8/11, 5/16/07, 1/30/07 (video), 7/20/06, 3/13/02)
- Steven Hayward, School of Public Policy, Pepperdine University historian (5/25/11, 10/7/09, 4/22/09, 3/12/09, 3/17/99)
HOLD ON. The article ‘forgot’ to reference my earlier testimony for the Democrats in 2006 and 2007:
- House Committee on Govt Reform, “Hurricanes and Global Warming,” 7/20/06 [link]
- House Select Committee on Energy Independence and Global Warming, “Dangerous Climate Change,” 4/26/07 [link]
I can see that this earlier testimony is ‘inconvenient’ to their argument.
When you testify, you are required to include a financial disclosure of your government funding. Presumably this is relevant if you are testifying in relation to the performance of a government agency. There is no disclosure requirement relevant to individuals from industry or advocacy groups, or to scientists receiving funding from industry or advocacy groups.
To clarify my own funding, I have included the following statement of financial interests at the end of my testimony:
Funding sources for Curry’s research have included NSF, NASA, NOAA, DOD and DOE. Recent contracts for CFAN include a DOE contract to develop extended range regional wind power forecasts and a DOD contract to predict extreme events associated with climate variability/change having implications for regional stability. CFAN contracts with private sector and other non-governmental organizations include energy and power companies, reinsurance companies, other weather service providers, NGOs and development banks. Specifically with regards to the energy and power companies, these contracts are for medium-range (days to weeks) forecasts of hurricane activity and landfall impacts. CFAN has one contract with an energy company that also includes medium-range forecasts of energy demand (temperature), hydropower generation, and wind power generation. CFAN has not received any funds from energy companies related to climate change or any topic related to this testimony.
I note that during congressional questioning, I was never asked anything about my funding sources.
Again, I think that biases in testimony related to climate change are more likely to be ideological and political than related to funding.
So what is the point of asking for detailed financial information (including travel) from these academic researchers?
Intimidation and harassment are certainly among the reasons that come to mind. Roger Pielke Jr seems to think this is the case, as described in his blog post I am Under Investigation:
I have no funding, declared or undeclared, with any fossil fuel company or interest. I never have. Representative Grijalva knows this too, because when I have testified before the US Congress, I have disclosed my funding and possible conflicts of interest. So I know with complete certainty that this investigation is a politically-motivated “witch hunt” designed to intimidate me (and others) and to smear my name.
The relevant issue, to my mind, is to expect non-normative testimony from academic researchers. I discussed this issue in a previous blog post, Congressional testimony and normative science. Consensus climate scientists routinely present normative testimony, along the lines of ‘urgent mitigation action needed’. On the other hand, I personally work to make my testimony non-normative, and I would judge Christy’s and Pielke Jr’s testimony to be generally non-normative as well (note: Christy and Pielke Jr are the two on the list of 7 that I know best).
The issue of concern of Congressman Grijalva is funding from the Koch brothers and fossil fuel companies somehow contaminating Congressional testimony from scientists invited by Republicans to testify.
The reality is that fossil fuel money is all over climate research, whether pro or con AGW. Gifts of $100M+ have been made by oil companies to Stanford and Princeton. Anthony Watts notes the prominence of oil companies in funding the American Geophysical Union [link]. The Sierra Club and the Nature Conservancy take fossil fuel money [link]. The UKMetOffice has stated that energy companies are major customers.
NRO has an article Follow the Money, excerpt:
In truth, the overwhelming majority of climate-research funding comes from the federal government and left-wing foundations. And while the energy industry funds both sides of the climate debate, the government/foundation monies go only toward research that advances the warming regulatory agenda. With a clear public-policy outcome in mind, the government/foundation gravy train is a much greater threat to scientific integrity.
With federal research funding declining in many areas, academics at universities are being encouraged to obtain funding from industry.
I have to say I was pretty intrigued by Soon’s funding from the Southern Company. Southern Company (SoCo) provides power to Georgia. Georgia Power (a SoCo subsidiary) has provided considerable funding to Georgia Tech (although I have never received any). For most of the time that I was Chair, the School of Earth and Atmospheric Sciences had an endowed Chair funded by Georgia Power. When the faculty member left Georgia Tech, I chose not to hire a replacement, since I felt that my faculty hiring funds would be more productively used on younger faculty members in different research areas. I also note that one of my faculty members received funds from Georgia Power that were a ‘charitable donation’, without overhead and without deliverables. I also ‘heard’ that Southern Company/Georgia Power was very unhappy with the Webster et al. 2005 paper on hurricanes [link]. Note, I have received no funding from SoCo/Georgia Power.
My first reaction to this was to tweet: Looks like I am next up in this ‘witch hunt’. My subsequent reactions have been slowed by a massive headache (literally; cause and effect?)
It looks like it is ‘open season’ on anyone who deviates even slightly from the consensus. The political motivations of all this are apparent from barackobama.com: Call Out The Climate Deniers.
It is much easier for a scientist just to ‘go along’ with the consensus. In a recent interview, as yet unpublished, I was asked: I’ve seen some instances where you have been called a “denier” when it comes to climate change, I am just curious as to your opinion on that? My reply:
As a scientist, I am an independent thinker, and I draw my own conclusions about the evidence regarding climate change. My conclusions, particularly my assessments of high levels of uncertainty, differ from the ‘consensus’ of the Intergovernmental Panel on Climate Change (IPCC). Why does this difference in my own assessment relative to the IPCC result in my being labeled a ‘denier’? Well, the political approach to motivate action on climate change has been to ‘speak consensus to power’, which seems to require marginalizing and denigrating anyone who disagrees. The collapse of the consensus regarding cholesterol and heart disease reminds us that for scientific progress to occur, scientists need to continually challenge and reassess the evidence and the conclusions drawn from the evidence.
Well, the burden is on Georgia Tech to come up with all of the requested info. Georgia Tech has a very stringent conflict of interest policy, and I have worked closely in the past with the COI office to manage any conflicts related to my company. Apart from using up valuable resources at Georgia Tech to respond to this, there is no burden on me.
Other than an emotional burden. This is the first time I have been ‘attacked’ in a substantive way for doing my science honestly and speaking up about it. Sure, anonymous bloggers go after me, but I have received no death threats via email, no dead rats delivered to my doorstep, etc.
I think Grijalva has made a really big mistake in doing this. I am wondering on what authority Grijalva is demanding this information? He is ranking minority member of a committee before which I have never testified. Do his colleagues in the Democratic Party support his actions? Are they worried about backlash from the Republicans, in going after Democrat witnesses?
I don’t think anything good will come of this. I anticipate that Grijalva will not find any kind of an undisclosed fossil fuel smoking gun from any of the 7 individuals under investigation. There is already one really bad thing that has come of this – Roger Pielke Jr has stated:
The incessant attacks and smears are effective, no doubt, I have already shifted all of my academic work away from climate issues. I am simply not initiating any new research or papers on the topic and I have ring-fenced my slowly diminishing blogging on the subject. I am a full professor with tenure, so no one need worry about me — I’ll be just fine as there are plenty of interesting, research-able policy issues to occupy my time. But I can’t imagine the message being sent to younger scientists. Actually, I can: “when people are producing work in line with the scientific consensus there’s no reason to go on a witch hunt.”
“More than 50 conditions can cause or mimic the symptoms of dementia.” and “Alzheimer’s (can only be) distinguished from other dementias at autopsy.” – from a Harvard University Health Publication entitled What’s Causing Your Memory Loss? It Isn’t Necessarily Alzheimer’s
“Medications have now emerged as a major cause of mitochondrial damage, which may explain many adverse effects. All classes of psychotropic drugs have been documented to damage mitochondria, as have statin medications, analgesics such as acetaminophen, and many others.” – Neustadt and Pieczenik, authors of Medication-induced Mitochondrial Damage and Disease
“Establishing mitochondrial toxicity is not an FDA requirement for drug approval, so there is no real way of knowing which agents are truly toxic.” – Dr. Katherine Sims, Mass General Hospital – http://www.mitoaction.org
“It is difficult to get a man to understand something, when his salary depends upon his not understanding it!” – Upton Sinclair, anti-fascist, anti-imperialist American author who wrote in the early 20th century
“No vaccine manufacturer shall be liable… for damages arising from a vaccine-related injury or death.” – President Ronald Reagan, as he signed The National Childhood Vaccine Injury Act (NCVIA) of 1986, absolving drug companies from all medico-legal liability when children die or are disabled from vaccine injuries.
Over the past several decades there have been a number of well-financed campaigns, promoted by well-meaning laypersons, to raise public awareness of the plight of patients with dementia. Suspiciously, most of these campaigns that come from “patient support” groups lead the public to believe that every dementia patient has Alzheimer’s dementia (AD).
Not so curiously, it turns out that many – perhaps all – of these campaigns have been funded – usually secretly – by the very pharmaceutical companies that benefit economically by indirectly promoting the sale of so-called Alzheimer’s drugs. Such corporate-generated public relations “campaigns” are standard operating procedure for all BigPharma drugs, especially its psychopharmaceutical drugs. BigPharma has found that the promotion and de-stigmatization of so-called “mental illnesses” (for which there are FDA-approved drugs) is a great tool for marketing their drugs.
Recently, Alzheimer’s support groups around the nation have been sponsoring the documentary about country singer Glen Campbell, who was recently diagnosed by his physicians with Alzheimer’s disease (of unknown etiology) despite the obvious fact that Campbell was infamous for his chronic heavy use of brain-damaging, dementia-inducing, addicting, and very neurotoxic drugs like cocaine and alcohol. And, just like so many other hard-living celebrities, such as the recently suicidal Robin Williams, Campbell was known to have received prescriptions of legal drugs from his boutique psychiatrists and physicians, just adding to the burden that his failing liver, brain and psyche had to endure.
Since it is known that Alzheimer’s disease can only be truly diagnosed by a microscopic examination of the cerebral cortex (at autopsy), we have to question the very alive Glen Campbell’s diagnosis. And we also have to question the veracity and motivations of the sponsoring patient support groups and their BigPharma sponsors.
Is the Alzheimer’s Epidemic Actually a Drug-Induced Dementia Epidemic?
Synchronous with the huge increases (over the past generation or so) in
1) the incidence of childhood and adult vaccinations,
2) the widespread use of psychotropic and statin (cholesterol-lowering) drug use, and
3) the increased ingestion of a variety of neurotoxic substances, including food additives,
there has been a large parallel increase in the incidence of
a) chronic illnesses of childhood, including autistic spectrum disorders,
b) “mental illnesses of unknown origin”, and also
c) dementia, a multifactorial reality that, via clever marketing and the studied ignorance of what is scientifically known about the actual causes – and diagnosis – of dementia, has been primarily – and mistakenly – referred to as Alzheimer’s disease (of unknown etiology).
It is important to ask, and then demand an honest answer to, the question: could there be a connection between America’s increasingly common over-prescribing of immunotoxic, neurotoxic, synthetic prescription drugs and vaccines and some of the neurodegenerative disorders that supposedly “have no known cause”?
Could the economically disabling American epidemic of autoimmune disorders, psychiatric disorders, autism spectrum disorders, etc (all supposedly of unknown origin) that have erupted over the past several decades be found to have recognizable root causes and therefore be treatable and, most importantly, preventable?
These are extremely important questions, especially in the case of the current dementia epidemic, because the so-called Alzheimer’s patient support groups seem to be totally unaware of the powerful evidence that prescription drugs known to damage brain cells (especially by poisoning their mitochondria) would be expected to cause a variety of neurological and psychological disorders. Brain cell death eventually happens when enough of the mitochondria (the microscopic hearts and lungs of every cell) have been irretrievably wounded or killed off. (See more info on drugs and mitochondria below.)
One of the big problems in America’s corporate-controlled culture, corporate-controlled media and corporate-controlled medical industries is that the giant pharmaceutical corporations, who are in the business of developing, marketing and selling known mitochondrial toxins (in the form of their drugs and vaccine ingredients) have a special interest in pretending that there is no known cause for the disorders that their synthetic chemicals are causing (or they use the unprovable “it’s probably genetic” subterfuge).
It should be a concern of everybody who knows a demented patient that some AD patient support groups are known to be front groups for the pharmaceutical companies that profit from marketing to patients and their doctors the disappointingly ineffective Alzheimer’s drugs Aricept, Exelon, Namenda, Hexalon, and Razadyne.
Prescription Drug-Induced – and Vaccine-Induced – Mitochondrial Disorders
Acquired mitochondrial disorders (as opposed to the relatively rare primary mitochondrial disorders like muscular dystrophy) that can be caused by commonly prescribed drugs are difficult to diagnose and are generally poorly understood by most practitioners. When I went to med school, nobody knew anything about what synthetic drugs or vaccines did to the mitochondria.
A lot of mitochondrial research, especially since the 1990s, has proven the connections between a variety of commonly prescribed medications and mitochondrial disorders. That evidence seems to have been cunningly covered up by the for-profit pharma groups (who control medical education and much of the media) and various other powers-that-be, because of the serious economic consequences if the information were allowed into the popular press. The stakeholders in the pharmaceutical and medical industries, most of whom profit mightily from the routine and increasing usage of neurotoxic drugs and vaccines, supposedly operating in the name of Hippocrates, would be very displeased if this information got out. I submit that BigPharma’s cover-up of the connections is totally unethical and, in the opinion of many other whistleblowers, criminal.
An Honest Patient Guide for Dementia Patients from Harvard!
So I was pleasantly surprised to find a reasonably honest guide for dementia patients on a Harvard University website.
(The entire guide can be accessed at http://www.helpguide.org/harvard/whats-causing-your-memory-loss.htm#top.)
The information at that website stated that there were over 50 conditions that could cause or mimic early dementia symptoms. I hadn’t been taught anything about that reality when I went to med school, and I doubt that many of my physician colleagues were either. And besides, what medical practitioner in our double-booked clinic environment, even if he or she was aware, has the time to thoroughly rule out the 50 conditions when confronted with a patient with memory loss?
I have often said to my patients and my seminar participants: “it takes only 2 minutes to write a prescription, but it takes 20 minutes to not write a prescription”. And in the current for-profit clinic culture, time is money and few physicians are given the “luxury” of spending adequate time with their patients. (In defense of the physicians that I know, they are not happy about that reality but don’t know what to do about it.)
It is so tempting to use the popularized, but rather squishy, label of AD (of unknown etiology) rather than to educate ourselves about the possibility of drug- or vaccine-induced dementia. But what is so important is that many of the 50+ conditions are preventable or reversible – though only if they are identified before permanent brain damage occurs.
The Harvard guide actually said that “medications are common culprits in mental decline. With aging, the liver becomes less efficient at metabolizing drugs, and the kidneys eliminate them from the body more slowly. As a result, drugs tend to accumulate in the body. Elderly people in poor health and those taking several different medications are especially vulnerable.”
The guide continued with a list of the classes of prescription drugs (comprising hundreds of individual drugs) that can cause such symptoms:
“The list of drugs that can cause dementia-like symptoms is long. It includes antidepressants, antihistamines, anti-Parkinson drugs, anti-anxiety medications, cardiovascular drugs, anticonvulsants, corticosteroids, narcotics, sedatives.”
The Harvard guide went on to emphasize that Alzheimer’s can only be accurately diagnosed on a post-mortem examination. The guide states that “Alzheimer’s is distinguished from other dementias at autopsy by the presence of sticky beta-amyloid plaques outside brain cells (neurons) and fibrillary tangles within neurons (all indicative of cellular death). Although such lesions may be present in any aging brain, in people with Alzheimer’s these lesions tend to be more numerous and accumulate in areas of the brain involved in learning and memory.”
“The leading theory is that the damage to the brain results from inflammation and other biological changes that cause synaptic loss and malfunction, disrupting communication between brain cells. Eventually the brain cells die, causing tissue loss. In imaging scans, brain shrinkage is usually first noticeable in the hippocampus, which plays a central role in memory function.”
But even the Harvard guide inexplicably failed to mention known mitochondrial toxins such as statin drugs, metformin, Depakote, general anesthetics, fluoroquinolone antibiotics, fluorinated psychotropic drugs, NutraSweet (every molecule of aspartame, when it reaches 86 degrees F, releases one molecule of the excitotoxin aspartic acid and one molecule of methanol [wood alcohol] which metabolizes into the known mitochondrial poison formaldehyde [embalming fluid]), pesticides (including the chlorinated artificial sweetener Splenda, which was initially developed as a pesticide) or the mercury (thimerosal), aluminum and formaldehyde which are common ingredients in vaccines. These are only some of the synthetic drugs that are capable of causing mitochondrial damage in brain cells – with memory loss, confusion and cognitive dysfunction, all early symptoms of dementia.
It is tragic, but all-too-common, for reversible and preventable drug-induced dementias (which are of known cause and thus not Alzheimer’s) to be misdiagnosed as Alzheimer’s disease “of unknown etiology”, and for the patient then to be prescribed costly, essentially ineffective and potentially toxic drugs – whose mitochondrial toxicities have not been tested for.
(The pharmaceutical industry, it should be noted, is not required by the FDA to test its drugs for mitochondrial toxicity when it is doing its studies for marketing approval, again exhibiting the total disdain for the Precautionary Principle by both industry and the regulatory agencies such as the FDA, the CDC and WHO.)
There is much more in the basic neuroscience literature proving the connections, at least from authors who do not have conflicts of interest with BigPharma and BigMedicine. The authors of these articles have raised the questions and have published the proof that concerned families of patients and their physicians desperately need to know.
Don’t expect BigPharma to respond or to offer apologies or mea culpas. Do expect denials, dismissals, distractions, discrediting and then the delaying of real legitimate explorations of the real scientific evidence that exposes its subterfuge in the name of maintaining large profits for their stakeholders.
Here are the abstracts from just two of the many peer-reviewed articles from various science journals that support the thesis of this column.
Medication-induced mitochondrial damage and disease
Published in Molecular Nutrition and Food Research, 2008 Jul;52(7):780-8.
Authors: Neustadt J, Pieczenik SR.
Mitochondrial Dysfunction and Psychiatric Disorders
From: Neurochemical Research, 2009 Jun;34(6):1021-9.
Mitochondrial oxidative phosphorylation is the major ATP-producing pathway, which supplies more than 95% of the total energy requirement in the cells. Damage to the mitochondrial electron transport chain has been suggested to be an important factor in the pathogenesis of a range of psychiatric disorders. Tissues with high energy demands, such as the brain, contain a large number of mitochondria, being therefore more susceptible to reduction of the aerobic metabolism. Mitochondrial dysfunction results from alterations in biochemical cascade and the damage to the mitochondrial electron transport chain has been suggested to be an important factor in the pathogenesis of a range of (so-called) neuropsychiatric disorders, such as (psychotropic drug-treated) bipolar disorder, depression and schizophrenia….Alterations of mitochondrial oxidative phosphorylation in (anti-psychotic drug-treated) schizophrenia have been reported in several brain regions and also in platelets. Abnormal mitochondrial morphology, size and density have all been reported in the brains of (anti-psychotic drug-treated) schizophrenic individuals. Considering that several studies link energy impairment to neuronal death, neurodegeneration and disease, this review article discusses energy impairment as a mechanism underlying the pathophysiology of some psychiatric disorders, like (psychotropic drug-treated) bipolar disorder, depression and schizophrenia.
Dr Kohls is a retired physician who practiced holistic mental health care for the last decade of his career, and took seriously the Hippocratic Oath that he swore when he received his medical degree. He is also a peace and justice advocate and writes a weekly column for the Reader Weekly, an alternative newsweekly published in Duluth, Minnesota, USA. The last three years of Dr Kohls’ columns are archived at http://duluthreader.com/articles/categories/200_Duty_to_Warn.
The United Nations Intergovernmental Panel on Climate Change (IPCC) was set up by the United Nations in 1988 and has been trying very hard to demonstrate the threat of a dangerous human influence on climate due to the emission of greenhouse gases. This is in line with their Charter, which directs the IPCC to assemble reports in support of the Global Climate Treaty — the 1992 Framework Convention on Climate Change (FCCC) of Rio de Janeiro.
It is interesting that the IPCC’s “evidence” was based on peer-reviewed publications – which were (reluctantly) abandoned only after protracted critiques from outside scientists. E-mails among members of the IPCC team, revealed in the 2009 ‘Climategate’ leak, describe their strenuous efforts to silence such critiques, often using unethical methods.
I will show here that the first three IPCC assessment reports contain erroneous scientific arguments, which have never been retracted or formally corrected, but at least have now been abandoned by the IPCC — while the last two reports, AR4 and AR5, use an argument that seems to be circular and does not support their conclusion. Australian Prof. “Bob” Carter, marine geologist and paleo-climatologist, refers to the IPCC as using “hocus-pocus” science. He is a co-author of the latest (2013) NIPCC (Non-governmental International Panel on Climate Change) report “Climate Change Reconsidered-II” (www.climatechangereconsidered.org). We also co-authored a critique of the 2013 IPCC-AR5 Summary.
1. IPCC-AR1 (1990)
This first report of the IPCC bases its entire claim for AGW on the fact that both CO2 and surface temperatures increased during the 20th century — although not in lock-step. They assign the major warming of 1910 to 1940 to a human influence — based on a peer-reviewed paper by BD Santer and TML Wigley, which uses a very strange statistical argument. But the basis of their statistics has been critiqued (by Tsonis and Swanson) — and I have demonstrated empirically elsewhere that their conclusion does not hold.
While this faulty paper has never been retracted, it is now no longer quoted as evidence by the IPCC — nor accepted by the overwhelming majority of IPCC scientists: Most if not all warming of the early 20th century is due to natural, not human causes.
2. IPCC AR2 (1996)
This report devotes a whole chapter, #8, to “Attribution and Detection.” Its main feature is what one might call the “invention” of the “Hotspot,” i.e. an enhanced warming trend in the tropical troposphere — never actually observed.
Unfortunately, the “evidence,” as presented by BD Santer, was published only after the IPCC report itself appeared; it contains two fundamental errors. The first error was to argue that the Hotspot is a “fingerprint” of human influence — and specifically, related to an increase in greenhouse gases. This is not true. The Hotspot, according to all model calculations, is simply an atmospheric amplification of a surface trend, a consequence of the physics of the tropical atmosphere.
[Technically speaking, it is caused by increased convective activity whereby cumulus clouds carry latent heat from the surface of the tropical ocean into the upper troposphere. In other words, the Hotspot is not human-caused, but arises from a “moist-adiabatic lapse rate” of the atmosphere. This effect is discussed in most meteorological textbooks and is widely accepted.]
How then did AR2 conclude that a Hotspot exists observationally? This is the second issue: The IPCC selected a short interval in the atmospheric temperature record that showed an increase — while the general trend was one of cooling. In other words, they cherry-picked their data to invent a Hotspot — as pointed out in a subsequent publication by PJ Michaels and PC Knappenberger [see graph below].
The matter of the existence of a Hotspot in the actual tropical troposphere has been the topic of lively debate ever since. On the one hand, DH Douglass, JR Christy, BD Pearson and SF Singer, demonstrated absence of a Hotspot empirically while Santer (and 17[!] IPCC coauthors), publishing in the same journal, argued the opposite. This issue now seems to have been finally settled, as discussed by Singer in two papers in Energy & Environment [2011 and 2013].
It is worth noting that a US government report [CCSP-SAP-1.1 (2006)] showed absence of a Hotspot in the tropics (Chapter 5, BD Santer, lead author). But the report’s Executive Summary managed to obfuscate this result by referring to global atmosphere rather than tropical.
It is also worth noting that while IPCC-AR2 used the Hotspot invention to argue that the “balance of evidence suggests a human influence,” later IPCC reports no longer use the Hotspot argument.
Nevertheless, one consequence of this unfortunate phrase in AR2 has been the adoption of the Kyoto Protocol, an international treaty to limit emissions of greenhouse gases. Even though Kyoto expired in 2012, it has managed to waste hundreds of billions of dollars so far — and continues to distort energy policies with uneconomic schemes in most industrialized nations.
3. IPCC AR3 (2001)
AR3 attributes global warming to human influences based on the “Hockey-Stick” graph, using published papers by Michael Mann, derived from his analysis of multi-proxy data. The Hockey-Stick graph [bottom graph below] claims that the 20th century showed unusually rapid warming — and thus suggests a strong human influence. The graph also does away with the well-established Medieval Warm Period and Little Ice Age, which were shown in earlier IPCC reports [see top graph below].
It was soon found that the Hockey-Stick graph was in error and did not deserve continued reliance. Canadian statisticians Steven McIntyre and Ross McKitrick demonstrated errors in Mann’s statistical analysis and in the use of certain tree-ring data for calibration. In fact, they showed that Mann’s algorithm would generate a Hockey-Stick graph — even if the input data were pure noise. [I served as a reviewer for M&M’s initial paper in Energy & Environment 2003.]
It is worth noting that the IPCC no longer uses the Hockey-Stick to support human-caused warming, even though AR3 still claims to be at least 66% certain that greenhouse-gas emissions are responsible for 20th century warming.
4. IPCC-AR4 (2007) and AR5 (2013)
Both reports use essentially the same faulty argument in their attempt to support their conclusion of human-caused global warming. Their first step is to construct a model that tries to match the reported 20th-century surface warming. This is not very difficult; it is essentially a ‘curve-fitting’ exercise: By selecting the right level of climate sensitivity and the right amount of aerosol forcing, they can match the reported temperature rise of the final decades of the 20th century, but not the initial decades — as becomes evident from a detailed graph in their Attribution chapter. This lack of agreement is due to the fact that their models ignore major forcings — both from variations of solar activity and from changes in ocean circulation.

They then use the following trick. They re-plot their model graph, but without an increase in greenhouse gases; this absence of forcing now generates a gap between the reported warming and the unforced model. Then they turn around and argue that this gap must be due to an increase in greenhouse gases. It appears to me that this argument may be circular. Even if the reported late-20th-century surface warming really exists (it is absent from the satellite and radiosonde records), the IPCC argument is not convincing.
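The attribution-by-residual logic described above can be sketched numerically. The following toy Python example is my own illustration, not the IPCC’s actual models or data: every series and coefficient in it is invented. It fits a two-term “model” to a synthetic temperature record, removes the greenhouse term, and attributes the resulting gap to greenhouse gases:

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1900, 2001)

# Invented forcing series (arbitrary units, purely illustrative)
ghg = 0.01 * (years - 1900)                                # rising greenhouse term
natural = 0.2 * np.sin(2 * np.pi * (years - 1900) / 60.0)  # a 60-year "ocean cycle"
observed = ghg + natural + rng.normal(0.0, 0.05, years.size)

# Step 1: tune a two-parameter model to match the "observed" record
A = np.column_stack([ghg, natural])
coef, *_ = np.linalg.lstsq(A, observed, rcond=None)

# Step 2: re-run the fitted model with the greenhouse term switched off
with_ghg = A @ coef
without_ghg = coef[1] * natural

# Step 3: attribute the gap between the two runs to greenhouse gases
gap = with_ghg - without_ghg
print(f"end-of-century gap attributed to GHG: {gap[-1]:.2f} (arbitrary units)")
```

Note that the gap is, by construction, just the fitted greenhouse term: whatever trend the tuned model assigns to greenhouse forcing comes back out as the “attributed” warming, which is the circularity the paragraph above complains about.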
It is ironic, however, that IPCC claims increasing certainty (at 90% in AR4 and at least 95% in AR5) for an attribution to human causes, which appears to be contrived. Additionally, while AR4 calculates a Climate Sensitivity (for a doubling of CO2) of 2.0 – 4.5 degC, AR5 expands the uncertainty interval to 1.5 – 4.5 degC. So much for the claim of increased certainty in the IPCC-AR5 Summary.
Yet, while claiming increased certainty about manmade global warming, both reports essentially ignore the absence of any surface warming trend since about 1998. Of course, they also ignore absence of any significant warming in the troposphere, ocean record, and proxy data during the crucial preceding (1979-1997) interval.
In spite of much effort, the IPCC has never succeeded in demonstrating that climate change is significantly affected by human activities — and in particular, by the emission of greenhouse gases. Over the last 25 years, their supporting arguments have shifted drastically — and are shown to be worthless. It appears more than likely that climate change is controlled by variations in solar magnetic activity and by periodic changes in ocean circulation.
[Full disclosure: I have a very small dog in this fight, having demonstrated some time ago that solar-emitted magnetic fields, projected into interplanetary space by the solar wind, modulate the intensity of cosmic rays striking the Earth’s atmosphere; this is no longer a contentious, hot-topic issue. The exact mechanism by which cosmic rays then influence the climate is not known in detail, but current efforts, mainly by a Danish research group, focus on changes in Earth’s cloudiness.]
However, there is no doubt about the existence of such a solar influence on climate. As shown in the graph below, cosmic-ray intensity (as measured by the radioactive carbon isotope C-14) and terrestrial climate (as measured by the oxygen isotope O-18) correlate in amazing detail over an interval of at least 3,000 years (the bottom graph is the central section, blown up to reveal detail).
S. Fred Singer is professor emeritus at the University of Virginia and director of the Science & Environmental Policy Project. His specialty is atmospheric and space physics.
ABC Australia Investigative Report on Statin Scam Pulled from YouTube
Dr. MaryAnne Demasi’s documentary on the criminal activity of the pharmaceutical industry regarding cholesterol-lowering statin drugs sent shock waves through the mainstream media in Australia at the end of 2013. It was broadcast in two parts on the popular news show Catalyst; after the first part aired, the pharmaceutical industry complained loudly and requested that the network not air the second episode, “Heart of the Matter Part 2 – Cholesterol Drug War.”
ABC Australia aired it anyway, but the pharmaceutical influence was apparently too strong, as they later announced that the network would remove the videos from their website because “they breached its impartiality standards.” All copies found on YouTube were also removed.
Dr. Michael Eades has published them on his Vimeo channel, however, and you can watch them below.
Heart of the Matter – Part 1
Heart of the Matter Part 2 – Cholesterol Drug War
A voluminous scientific study on climate change and the possibilities of deliberately altering it was funded by several federal agencies, including the Central Intelligence Agency (CIA).
The CIA’s decision to partially fund the research left at least one expert who participated in the study a little uneasy.
Scientist Alan Robock at Rutgers University told The Guardian the CIA’s investment in the $630,000 study “makes me really worried who is going to be in control” of efforts to stem the impact of climate change.
In addition to the CIA, the National Aeronautics and Space Administration, the Department of Energy, and the National Oceanic and Atmospheric Administration funded the National Academy of Sciences research that produced two reports within the study.
One report addressed ways to remove carbon dioxide from the atmosphere, and the other looked at ways to alter cloud cover or change the planet’s surface to make it reflect more sunlight back into space.
The CIA never explained to the academy why it was funding the project.
But Robock became suspicious after two CIA consultants contacted him inquiring about the possibility of another country gaining control of the weather.
“They said: ‘We are working for the CIA and we’d like to know if some other country was controlling our climate, would we be able to detect it?’ I think they were also thinking in the back of their minds: ‘If we wanted to control somebody else’s climate could they detect it?’” he told The Guardian.
He said that he told the consultants that any attempt to generate large, climate-changing clouds would be noticed by weather satellites or other equipment used to monitor the atmosphere.
The CIA didn’t respond to a press inquiry about its involvement and has previously refused to confirm its role in the study. In 2013, CIA spokesman Edward Price told Mother Jones: “It’s natural that on a subject like climate change the Agency would work with scientists to better understand the phenomenon and its implications on national security.”
Using the weather as a weapon is forbidden under international law, per the Environmental Modification Convention of 1978.
The agency’s inquiry left Robock concerned. “I’d learned of lots of other things the CIA had done that didn’t follow the rules,” he said. “I thought that wasn’t how my tax money was spent.”
The CIA opened its own climate change office in 2009 but shut it down three years later after criticism from some Republicans who called it a distraction from the agency’s focus on combatting terrorism.
To Learn More:
Spy Agencies Fund Climate Research in Hunt for Weather Weapon, Scientist Fears (by Ian Sample, The Guardian)
CIA Backs $630,000 Scientific Study on Controlling Global Climate (by Dana Liebelson and Chris Mooney, Mother Jones)
C.I.A. Closes Its Climate Change Office (by John Broder, New York Times)
I started a PhD program in Environmental Engineering because I worried about climate change. It didn’t take long for me to become a skeptic.
My first paper, a study about precipitation intensity over the U.S., was rejected by reviewers because it contradicted the climate model projections. Though they could find nothing wrong with the methodology, they decided observational data must be flawed because climate models couldn’t possibly be wrong and wrote that the paper could not be published.
I then started reading the atmospheric science literature about precipitation trends. It was clear to me that the theories about changes in precipitation intensity were designed to explain climate model results that didn’t mesh with observations. When I found that changes in observed precipitation were largest in autumn, and did not find the same patterns of precipitation in climate model outputs, I really became skeptical about the use of climate models. When I started working with climate models and saw how poorly they reproduce precipitation patterns, I was forced into the realization that the “science” was being fit to the models and that the models were not very realistic. From my perspective, this runs contrary to the scientific method.
After finishing my PhD in Environmental Engineering, I earned a M.S. in Atmospheric Science and started working on a PhD. As I learned more about meteorology and atmospheric dynamics, I started to see the contradictions in the climate change discussion.
I had another paper refused by a high profile journal because it showed that cold air is required to produce the conditions that cause storm surges in the western Canadian arctic. That suggestion really seemed to upset the editor (an engineer) who wouldn’t even send it out for review. My later research has shown the importance of strong jets and cold air in building the blocking ridges that cause the extreme weather we’ve seen over the last two autumns/winters. The claims that are being made that a warming of the arctic will lead to warmer conditions in the mid-latitudes because it will cause more blocking are preposterous because strong jets are needed to support the blocking ridges. I received dozens of letters saying my published paper must be wrong because I suggest that strong jets, not weak jets, cause blocking. Most of the claims being made by climate change advocates appear to run contrary to basic meteorology.
As I’ve been attacked personally and professionally for offering contrary views, I have decided to leave the field. I will defend my Atmospheric Science PhD thesis and walk away. It has become clear to me that it is not possible to undertake independent research in any area that touches upon climate change if you have to make your living as a professional scientist on government grant money or have to rely on getting tenure at a university. The massive groupthink that I have encountered on this topic has cost me my career and many colleagues, and has damaged my reputation among the few people I know in the field.
I’m leaving to work in the financial industry. It’s a sad day when you feel that you have to leave a field that you are passionately interested in because you fear that you won’t be able to find a job once your views become widely known. Until free thought is allowed in the climate sciences, I will consider myself a skeptic of catastrophic human induced global warming.
Science has been misused for political purposes many times in history. However, the most glaring example of politically motivated pseudoscience—that employed by U.S. government scientists to explain the destruction of the World Trade Center (WTC)—continues to be ignored by many scientists. As we pass the 10th anniversary of the introduction of that account, it is useful to review historic examples of fake science used for political purposes and the pattern that defines that abuse.
An early example of pseudoscience used to promote a political agenda was the concerted Soviet effort to contradict evolutionary theory and Mendelian inheritance. For nearly 45 years, the Soviet government used propaganda to foster unproven theories of agriculture promoted by its minister of agriculture, Trofim Lysenko. Scientists seeking favor with the Soviet hierarchy produced fake experimental data in support of Lysenko’s false claims. Scientific evidence from the fields of biology and genetics was banned in favor of educational programs that taught only Lysenkoism, and many biologists and geneticists were executed or sent to labor camps. This propaganda-fueled program of anti-science continued until 1964 and spread to other countries, including China.
In the 2010 book Merchants of Doubt, authors Naomi Oreskes and Erik Conway describe several other examples of the misuse of science, spanning from the 1950s to the present. They show how widely respected scientists participated in clearly non-scientific efforts to promote the agendas of big business and big government. Examples include the tobacco industry’s misuse of science to obfuscate the links between smoking and cancer, the military industrial complex’s use of scientists to support the scientifically indefensible Strategic Defense Initiative (SDI), and several abuses of environmental science.
As Oreskes and Conway made clear, science is about evidence. “It is about claims that can be, and have been, tested through scientific research—experiment, experience, and observation—research that is then subject to critical review by a jury of scientific peers.” In science, if experiments performed do not support a hypothesis, that hypothesis must be rejected. If conclusions fail to pass peer-review due to a lack of supportive evidence or the discovery of evidence that directly contradicts them, those conclusions must be rejected.
From Lysenkoism through the examples given by Oreskes and Conway, politically motivated pseudoscience demonstrates a pattern of characteristics as follows.
- There is a lack of experiments.
- The results of experiments are ignored or contradicted in the conclusions.
- There is either no peer-review or peer-reviewer concerns are ignored.
- The findings cannot be replicated or falsified due to the withholding of data.
- False conclusions are supported by marketing or media propaganda.
- Hypotheses that are supported by the evidence are ignored.
All six of these characteristics of pseudoscience are exhibited by the U.S. government investigation into what happened at the WTC on September 11th, 2001. That investigation was conducted by the National Institute of Standards and Technology (NIST), and it had much in common with the examples given by Oreskes and Conway. As with the false science that supported tobacco use, millions of lives were lost as a result—in this case through the “War on Terror.” Like support for the Strategic Defense Initiative, the abuses were focused on supporting the military-industrial complex. And as with the environmental examples, NIST’s manipulations affect everyone on the planet because they prop up a never-ending war.
In terms of historical experience, the destruction of the three WTC skyscrapers was unprecedented. No tall building had ever experienced global collapse for any reason other than explosive demolition and none ever has since that time. In terms of observation, nearly everyone who examines the videos from the day recognizes the many similarities to explosive demolition. Perhaps the most compelling evidence in favor of the demolition theory is that the NIST WTC Reports, which took up to seven years to produce, exhibit all six of the characteristics of politically motivated pseudoscience.
The lack of experiment:
NIST performed no physical experiments to support its conclusions on WTC Building 7. Its primary conclusion, that a few steel floor beams experienced linear thermal expansion thereby shearing many structural connections, could have easily been confirmed through physical testing but no such testing was performed. Moreover, other scientists had performed such tests in the past but since the results did not support NIST’s conclusions, those results were ignored (see peer-review comments below).
The results of experiments were ignored or contradicted in the conclusions:
- For the Twin Towers, steel temperature tests performed on the few steel samples saved suggested that the steel reached only about 500 degrees Fahrenheit. This is more than one thousand degrees below the temperature needed to soften steel and make it malleable—a key requirement of NIST’s hypothesis. NIST responded by exaggerating temperatures in its computer model.
- Another key requirement of NIST’s explanation for the Twin Towers was that floor assemblies had sagged severely under thermal stress. Floor model tests conducted by my former company, Underwriters Laboratories, showed that the floor assemblies would sag only 3 to 4 inches, even after removal of all fireproofing and exposure to much higher temperatures than existed in the buildings. NIST responded by exaggerating the results—claiming up to 42 inches of floor-assembly sagging in its computer model.
- After criticism of its draft report in April 2005, NIST quietly inserted a short description of shotgun tests conducted to evaluate fireproofing loss in the towers. These results also failed to support NIST’s conclusions because the shotgun blasts were not reflective of the distribution or trajectories of the aircraft debris. Additionally, the tests suggested that the energy required to “widely dislodge” fireproofing over five acre-wide floors—required by NIST’s findings—was simply not available.
There was no peer review and public comments from peers were ignored:
NIST published its own WTC reports, and therefore its work was not subject to the peer review that is expected of all legitimate science. The people and companies involved in the NIST investigation were either government employees or contractors dependent on government work and were therefore not objective participants.
In terms of indirect peer-review, the international building construction community has made no changes to building construction standards in response to NIST’s officially cited root causes for the WTC destruction. Furthermore, no existing buildings have been retrofitted to ensure that they do not fail from those alleged causes.
NIST provided a period for public comment on its draft reports but the comments provided by those not beholden to government were not supportive of NIST’s findings. In some cases, as with NIST’s linear expansion claim for WTC 7, independent scientists submitted comments about physical tests they had performed (which NIST had not) that directly contradicted NIST’s findings.
There was one important exception to NIST’s ignoring of public comments. After a physics teacher’s well-publicized comments, NIST was forced to admit that WTC 7 was in free-fall for a vertical distance equivalent to at least eight stories of the building. Structural engineers have since noted that many hundreds of high-strength steel bolts and steel welds would have had to vanish instantaneously for an 8-story section of the building to fall without any resistance.
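For readers who want to check the arithmetic, free fall over eight stories is a one-line kinematics calculation. The sketch below assumes a typical story height of about 3.9 m; that figure is an assumption for illustration, not a value taken from the NIST report:

```python
import math

g = 9.81               # m/s^2, standard gravitational acceleration
story_height = 3.9     # m per story (assumed typical value, for illustration)
h = 8 * story_height   # approximate 8-story free-fall distance

# Starting from rest with zero structural resistance: h = (1/2) * g * t^2
t = math.sqrt(2 * h / g)
print(f"free fall over {h:.1f} m takes {t:.2f} s")
```

A measured descent matching this time over that distance would imply essentially zero resistance from the structure below, which is the point the structural engineers quoted above are making.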
The findings cannot be replicated or falsified due to the withholding of data:
NIST will not share its computer models with the public. A NIST spokesman declared, in response to a Freedom of Information Act request, that revealing the computer models would “jeopardize public safety.” Because NIST’s conclusions depend entirely on those computer models, they cannot be verified or falsified by independent scientists.
False conclusions are supported by media or marketing propaganda:
As with the Soviet propaganda machine that supported Lysenkoism and the tobacco industry’s marketing propaganda, NIST’s pseudoscience was fully and uncritically supported by the mainstream media. Hearst Publications, the British Broadcasting Corporation (BBC), and Skeptic magazine are examples of media that went to great lengths to stifle any questioning of the official account and divert attention from the glaring discrepancies.
NIST depended on that media support as indicated by the timing of its release of reports. NIST’s final report appeared to be scheduled for dual political purposes, to coincide with the seventh anniversary of 9/11 and to give the appearance of finished business at the end of the Bush Administration. The timing of NIST’s other reports coincided with political events as well. These included the draft report on the towers in October 2004—just before the election, the final report on the towers—just before the fourth anniversary of 9/11, and NIST’s first “responses to FAQs”—just before the fifth anniversary. All of them appeared to involve politically motivated release dates.
The report release dates allowed time for the media to quickly present the official story while public interest was high, but did not allow time for critical review. With the report on WTC 7, the public was given just three weeks prior to September 11th, 2008 to comment on a report that was nearly seven years in the making.
Hypotheses that are supported by the evidence were ignored:
Throughout its seven-year investigation, NIST ignored the obvious hypothesis for the destruction of the WTC buildings—demolition. That evidence includes:
- Free-fall or near-free fall acceleration of all three buildings (now acknowledged by NIST for WTC 7)
- Photographic and video evidence demonstrating the characteristics of demolition for both the Twin Towers and WTC 7
- Eyewitness testimony from many people at the scene who witnessed explosions or were warned that a demolition was proceeding
- The expert testimony of thousands of licensed engineers and architects who are calling for a new investigation
- The peer-reviewed science that supports the demolition theory including fourteen points of agreement between NIST and independent researchers, environmental anomalies that indicate the use of thermitic materials, and analytical results confirming the presence of nanothermite in the WTC dust
The WTC reports produced by NIST represent the most obvious example of politically motivated pseudoscience in history. The physical experiments NIST performed did not support its conclusions. The reports were not peer-reviewed and public comments that challenged the findings were ignored. NIST will not share its computer models—the last supposed evidence that supports its conclusions—with the public and therefore its conclusions are not verifiable.
These glaring facts should be readily recognizable by any scientist and, given the unprecedented impact of the resulting War on Terror, this abuse of science should be the basis for a global outcry from the scientific community. The fact that it is not—with even Oreskes and Conway ignoring this most obvious example—indicates that many scientists today still cannot recognize false science or cannot speak out about it for fear of social stigma. It’s possible that our society has not suffered enough to compel scientists to move out of their comfort zones and challenge such exploitation of their profession. If so, the abuse of science for political and commercial purposes will only get worse.
A new website spoofs preposterous claims regarding climate change:
“I’ve started a website with the idea of making it entertaining as well as informative. The website presents global warming predictions that have been made over the past 40 or so years, especially predictions that are either contradictory or alternatively plainly ridiculous and thus amusing.”
Climate change can also affect the Earth’s spin. Previously, Felix Landerer of NASA’s Jet Propulsion Laboratory in Pasadena, California, and colleagues showed that global warming would cause Earth’s mass to be redistributed towards higher latitudes. Since that pulls mass closer to the planet’s spin axis, it causes the planet to rotate faster – just as an ice skater spins faster when she pulls her arms towards her body. – New Scientist 20 Aug 2009
Belgian scientists have identified a hitherto unsuspected benefit of global warming – more time for all of us. They say increasing levels of carbon dioxide (CO2) in the atmosphere will slow the Earth’s rotation. They used computer models to analyse the effect of adding 1% more CO2 to the atmosphere annually. – BBC News, 12 Feb 2002
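Both quoted items rest on the same piece of physics: with no external torque, Earth's angular momentum L = I*omega is conserved, so a fractional change in the moment of inertia I produces an equal fractional change in the length of day (which is proportional to 1/omega). A minimal sketch follows; the fractional change used here is an arbitrary illustrative number, not a result from either study:

```python
# Conservation of angular momentum: L = I * omega = const, so
# d(LOD)/LOD = dI/I, where LOD (length of day) is proportional to 1/omega.
LOD_ms = 86_400 * 1000   # one day, in milliseconds
dI_over_I = 1e-10        # hypothetical fractional change in moment of inertia

# Positive dI (mass moved toward the equator) lengthens the day;
# negative dI (mass moved toward the poles/axis) shortens it.
dLOD_ms = LOD_ms * dI_over_I
print(f"change in length of day: {dLOD_ms:.5f} ms")
```

Whether the day gets longer or shorter therefore depends entirely on whether the predicted mass redistribution moves mass toward the spin axis (faster rotation) or away from it (slower rotation), hence the two opposite predictions quoted above.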
The Food and Drug Administration (FDA) routinely fails to report evidence of fraud or misconduct when it inspects the way researchers conduct clinical trials, leaving the public unaware of which research is credible and which isn’t.
Researchers at New York University found that in dozens of published papers where the FDA had uncovered faults in clinical trials, only three ever indicated that violations occurred. In a stem cell trial, for example, all patients were said to have experienced improvement – despite one having a foot amputated.
The New York University study examined 57 clinical trials that received a notice of violation from the FDA for poor record keeping, false information, and poor patient study. Researchers found that findings from those clinical trials were used in 78 published papers – but only in three instances were the faults in the clinical trials mentioned in the papers.
In the other cases, none of the published papers containing data from faulty trials were corrected or retracted.
“These are major things,” Professor Charles Seife, the study’s author, told Reuters. “No one really knows, unless you go through these documents, that anyone is questioning the integrity of the trials.”
In one case, an entire clinical trial was considered unreliable by the FDA, but the published paper didn’t mention the violation at all. In another trial, researchers covered up a patient’s death.
Of the 57 published clinical trials, 39 percent had evidence of false information, 25 percent had problems with adverse-event reporting, 61 percent had record-keeping problems, and 35 percent failed to protect the safety of the patient or had issues with oversight or informed consent.
“The FDA has repeatedly hidden evidence of scientific fraud not just from the public, but also from its most trusted scientific advisers, even as they were deciding whether or not a new drug should be allowed on the market,” Seife wrote at Slate. “For an agency devoted to protecting the public from bogus medical science, the FDA seems to be spending an awful lot of effort protecting the perpetrators of bogus science from the public.”
Seife said his team could have uncovered even more instances from the 600 clinical trials mentioned in the documents, but most of the documents obtained from the FDA were heavily redacted. “In some cases, you can’t even tell which drug is being tested,” he said.
Every year, the FDA inspects several hundred clinical sites performing biomedical research on human participants and occasionally finds evidence of violations of good clinical practices and misconduct. The study said, however, that the FDA has no systematic method for communicating these findings to the scientific community, and its findings go unremarked in peer-reviewed literature.
In a statement to Reuters, the FDA said it is “committed to increasing the transparency of compliance and enforcement activities with the goal of enhancing the public’s understanding of the FDA’s decisions, promoting the accountability of the FDA, and fostering an understanding among regulated industry about the need for consistently safe and high-quality products.”
In an article in the Grand Forks Herald, Minnesota Public Radio goes all in on climate hysteria – and the result is simply terrible science journalism by a public broadcaster.
The fact-checking can start with the opening sentences:
St. Patrick’s Day 2012 was the crowning moment of one of Minnesota’s mildest winters: Jubilant parade spectators wore flip flops, Miss Shamrock beamed in sleeveless, emerald satin, and the beer never tasted so refreshing as temperatures hit 80 degrees.
Three months later, the dazzling sunlight was nowhere to be found when sheets of rain pummeled the Duluth area. Muddy, churning torrents of chocolate floodwaters tore through town, leaving shock and devastation.
Both extremes happened in a Minnesota our ancestors never knew. It’s warmer, especially in the winter, and rising global temperatures have stacked the deck in favor of heavier rains.
The hottest temperature during that March 2012 in Duluth was 75 degrees. Not even close to the record of 81 degrees set in 1946.
But, the alarmists may say, St. Patrick’s Day 2012 was on March 17, and we’ve never seen temperatures this high on that date before. Perhaps, but one day in one month in one year doesn’t make a trend. Over the past century, and also since 1970 and during the past three decades, there has not been any sign of a significant trend in maximum temperatures on March 17 for the Duluth area. The same goes for absolute maximum temperatures during March. No significant trends over any of these time frames, and during the last 30 years, the correlation is negative – toward lower extreme maximum temperatures in March.
In the Minneapolis-St. Paul region, the 80 degrees in March 2012 was tied with 1967 as only the fifth highest March extreme maximum temperature on record, behind 1986, 1968, 1910, and 2007. No significant trends in maximum temperatures for March 17, either, and the last three decades have a negative correlation toward lower – not higher – extreme maximum temperatures in this month.
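For readers who want to check claims like these themselves, the “no significant trend” statements above come down to ordinary least-squares regression of annual values against year. Here is a minimal sketch in Python; the temperatures are made up for illustration, not the actual Duluth or Twin Cities record:

```python
import math

def trend_stats(years, values):
    """OLS slope, Pearson correlation r, and t-statistic for a linear trend.
    Note: t is undefined for a perfect fit (|r| == 1) or constant values."""
    n = len(years)
    my, mv = sum(years) / n, sum(values) / n
    sxx = sum((y - my) ** 2 for y in years)
    syy = sum((v - mv) ** 2 for v in values)
    sxy = sum((y - my) * (v - mv) for y, v in zip(years, values))
    slope = sxy / sxx
    r = sxy / math.sqrt(sxx * syy)
    t = r * math.sqrt((n - 2) / (1 - r ** 2))
    return slope, r, t

# Hypothetical March 17 max temperatures (deg F) -- illustrative only
years = list(range(1985, 1995))
temps = [38, 51, 33, 60, 41, 47, 35, 52, 44, 39]
slope, r, t = trend_stats(years, temps)
print(f"slope = {slope:+.2f} deg/yr, r = {r:+.2f}, t = {t:+.2f}")
```

As a rough rule of thumb, with a few decades of data a |t| below about 2 means the slope is statistically indistinguishable from zero at the 5% level – which is the sense in which “no significant trend” is used throughout this post.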
Thus, the problems in this article start early. And they continue.
The growing season in the Twin Cities is several weeks longer than it was even in the 1970s.
This is cherry-picking 101, and it is egregious science journalism. Has there been a statistically significant increase in the growing season for the Twin Cities since the 1970s? Yes. But here is the growing season length dating back to when records began in 1873.
Since records began in the 1870s, there is an overall negative correlation toward a shorter – not longer – growing season in the Twin Cities region. Even with the increase in growing season length since the 1970s, the area is only back up to where it historically was before the 1970s.
Between 1873 and 1969, the area averaged a growing season length of 165 days. The average since 1970 has been 164 days. Some climate change.
Then there are the extreme rains:
In Minnesota and the Midwest generally, 37 percent more rain falls in these big 2.5-inch-plus storms than did 50 years ago, said researcher Ken Kunkel of the National Climatic Data Center in North Carolina. ‘We’ve found that the last decade actually has the largest number of these events since the network began in the late 19th Century.’
There are no significant trends in the number of days per year with 2.5+ inches of precipitation for any of the state’s climate subregions in the National Weather Service database. The Twin Cities and Duluth climate areas have the longest records for this metric, and here are the non-existent trends since the early 1870s.
See a climate crisis? No, because there isn’t one. Next issue.
The 2-inch rains historically have come about every five years in a given place. And then there are the really big storms that bring at least 6 or 7 inches of rain over a huge geographic area, with powerful enough spots within the storm dumping 8 inches or more. These types of storms are occurring more frequently, at least partly because warmer air can hold more water.
Two-inch rains come about every five years in the historic record? No chance. Between 1872 and 1970 for the Duluth region, they came about every 1.5 years (i.e., 0.65 per year on average). Overall from 1872 to 2014, they came about every 1.3 years. For the Twin Cities, the average is 0.93 per year since the 1870s, or roughly one per year. All a far cry from “about every five years.”
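The recurrence intervals above are nothing more than the reciprocal of the average event rate. A quick sketch, with the Duluth event count back-calculated from the quoted rate rather than pulled from the NWS database:

```python
def recurrence_interval_years(n_events, n_years):
    """Average number of years between events: reciprocal of the event rate."""
    return n_years / n_events

# Roughly 64 two-inch rain days over the 99 years from 1872 to 1970 in the
# Duluth region (back-calculated from the 0.65/year rate; illustrative only)
print(round(recurrence_interval_years(64, 99), 2))  # about 1.5 years, not 5

# Twin Cities: 0.93 events per year since the 1870s
print(round(1 / 0.93, 2))  # roughly one two-inch rain per year
```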
As for the 6- to 7-inch megastorms, the Duluth region has never (at least during recorded history) received 6 inches of precipitation in a day. The record is 5.20 inches, set in 1909, followed by 4.14 inches in 2012 and 4.00 inches in 1876, all of which seems to contradict this claim:
The 2012 storm in Duluth was considered a 500-year event. It overwhelmed culverts and took out streets.
In June 2012, Duluth received 4.14 inches over one day, and 7.25 inches over two consecutive days, with no rain on the third day. But back in July 1909, the city received 5.20 inches in one day, 6.68 inches over two consecutive days, and 7.83 inches over three consecutive days. Ergo, storms of this magnitude have happened before since records began in the late 1800s, raising the question of whether the 2012 event was really a 500-year event, and whether such events are really becoming more common.
The Saint Cloud area’s top four record daily rainfalls all came before 1957, and none was more than 5 inches. The Twin Cities received 9.15 inches in a single day during 1987, and the next three daily rainfall maxima occurred in 1977, 1892, and 1903. The International Falls region’s record daily rainfall is only 4.82 inches, set back in 1942. The next highest 24-hour totals are from 1966 and 1898.
It is certainly debatable whether extreme rain events are on the rise.
Then come the omnipresent concerns over unpredictability, as if weather or climate were ever predictable:
A third facet of the change in Minnesota’s climate, in addition to more heat and bigger storms, is murkier because it involves scientists asking whether things are in fact getting more variable and unpredictable.
For example, because big rainstorms account for a bigger portion of total rainfall, the state can dry out for weeks without reducing annual precipitation.
Some meteorologists call it ‘flash drought.’ Suddenly, after a wet spring, the spigot turns off. The big May 2012 storm in Duluth gave the St. Louis River its highest-ever discharge crest. But six months later, the river was at drought levels.
Actually, both the Twin Cities and Duluth regions have positive correlations since records began in the 1870s – and statistically significant trends over the past century – toward more days per year with precipitation, not fewer.
Finally, we have the 2012 storm (which was in June, not May) in Duluth that “gave the St. Louis River its highest-ever discharge crest.” Here is the USGS peak streamflow record for the St. Louis River at Scanlon, just upstream from Duluth:
Yes, 2012 set a record, but look at the peak flow trend since the 1970s: declining with no unusual variability aside from the single data point in 2012. One data point does not make climate change.
And about those “drought levels” in the river six months after the flood – which would mean December 2012 – the flow in the river during December was only the 18th lowest on record (i.e., hardly unusual) and almost threefold higher than the record low December flow set back in 1910. By the way, the trend since records began on the river in 1908 is toward more December flow – not less – so climate change isn’t leading to wintertime “drought” flows on the river, either.
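Rankings like “18th lowest on record” are straightforward to verify given the data: count the years with lower December flows, and compare the current value against the record low. A minimal sketch with invented flow values; the actual USGS series for Scanlon is not reproduced here:

```python
def low_rank(value, record):
    """1-based rank of `value` counting from the lowest (1 = record low)."""
    return 1 + sum(1 for v in record if v < value)

# Invented December mean flows (cfs) -- illustrative only, NOT the USGS record
december_flows = [310, 520, 95, 800, 430, 610, 270, 150, 990, 380]
this_december = 430

print(low_rank(this_december, december_flows))        # rank from the lowest
print(round(this_december / min(december_flows), 1))  # multiple of record low
```

A flow that ranks well up from the record low, and sits several times above it, is hardly a “drought level” – which is the point being made about December 2012.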
So ends the examination of but one climate change story in a single relatively small newspaper from the American Midwest. There is climate reality, but science journalism by the mainstream media is getting farther away from it.
. . . They usually mop the floor with the climatistas. That’s one reason why the climate campaign has resorted to rank conformism and outright bullying.
Matt Ridley offered his observations about the state of things in an article in the London Times a few days ago entitled “My Life as A Climate Lukewarmer.”
I am a climate lukewarmer. That means I think recent global warming is real, mostly man-made and will continue but I no longer think it is likely to be dangerous and I think its slow and erratic progress so far is what we should expect in the future. That last year was the warmest yet, in some data sets, but only by a smidgen more than 2005, is precisely in line with such lukewarm thinking.
This view annoys some sceptics who think all climate change is natural or imaginary, but it is even more infuriating to most publicly funded scientists and politicians, who insist climate change is a big risk. My middle-of-the-road position is considered not just wrong, but disgraceful, shameful, verging on scandalous. I am subjected to torrents of online abuse for holding it, very little of it from sceptics.
I was even kept off the shortlist for a part-time, unpaid public-sector appointment in a field unrelated to climate because of having this view, or so the headhunter thought. In the climate debate, paying obeisance to climate scaremongering is about as mandatory for a public appointment, or public funding, as being a Protestant was in 18th-century England.
Kind friends send me news almost weekly of whole blog posts devoted to nothing but analysing my intellectual and personal inadequacies, always in relation to my views on climate. Writing about climate change is a small part of my life but, to judge by some of the stuff that gets written about me, writing about me is a large part of the life of some of the more obsessive climate commentators. It’s all a bit strange.
There’s more; definitely worth reading the whole thing.
Mr Dunlop, who’s now with the Association for the Study of Peak Oil and Gas, says Australia will be one of the hardest hit by a rise in global temperatures. “We’re one of the driest continents on the earth and the effects on Australia will be more severe than elsewhere.” – ABC News, May 2013
Australia’s top intelligence agency believes south-east Asia will be the region worst affected by climate change by 2030, with decreased water flows from the Himalayan glaciers triggering a ‘cascade of economic, social and political consequences’. The dire outlook was provided by the deputy director of the Office of National Assessments, Heather Smith, in a confidential discussion on the national security implications of climate change with US embassy officials. — Sydney Morning Herald, Dec 2010
The effects of climate change will impact more severely on the economy of Papua New Guinea than on any other in the Pacific, according to a new report by the Asian Development Bank. – ABC News, Nov 2013
Research reports that Bangladesh is one of the hardest hit nations by the impacts of climate change. — UK climate4classrooms.org
There seems to be consensus in the developed world that Africa will be the hardest hit or most affected region, due to anthropogenic climate change. – YouLead Collective, a young generation of climate leaders, Nov 2014
Vietnam is likely to be among the countries hardest hit by climate change, mainly through rising sea levels and changes in rainfall and temperatures. – International Food Policy Research Institute, 2010
Norway’s Minister of the Environment and International Development Erik Solheim stated today that “The Small Island Developing States are among the hardest hit by climate change.” — as reported by the Norwegian media, Nov 2011
Maldives’ economy hardest hit by climate change: Asian Development Bank. The Maldives is the most at-risk country in South Asia from climate change impacts, said the report titled ‘Assessing the costs of climate change and adaptation in South Asia.’ – Minivan News, Aug 2014
According to the latest data modelling, climate change is likely to have the strongest impact on Scandinavian countries such as Denmark, Norway and Sweden – planetearthherald.com
Bulgaria, Spain, Portugal, Italy and Greece are the countries that would be worst affected by global warming, according to a European Union report. The EC Joint Research Commission (JRC) report, released on Wednesday, takes into account four significantly sensitive factors: agriculture, river flooding, coastal systems and tourism. — Sofia News Agency, Nov 2009
The economies of southern Europe and the Mediterranean, including Malta, are forecast to suffer the most adverse effects of climate change, according to a new report drawn up by the European Environment Agency. — Primo-europe.eu, July 2010
Climate change is faster and more severe in the Arctic than in most of the rest of the world. The Arctic is warming at a rate of almost twice the global average — panda.org
China’s Poor Farmers Hit Hardest by Climate Change. Declan Conway, a University of East Anglia researcher who has studied climate change’s effect on China’s farmers, told Reuters that people in remote communities in China’s poorer regions are particularly exposed to climate hazards. — Circle Of Blue, Dec 2012
Report: Middle East, African Countries to Be Hardest Hit by Climate Change — CommonDreams, Dec 2012