
Open Letter to Samantha Power

teleSUR | October 25, 2014

Dear Ambassador Power:

I recently read your statement decrying the UN General Assembly’s election of Venezuela to the UN Security Council. This statement, so obviously laden with hypocrisy, necessitated this response.

You premise your opposition to Venezuela’s ascension to the Security Council on your claim that “From ISIL and Ebola to Mali and the Central African Republic, the Security Council must meet its responsibilities by uniting to meet common threats.” If these are the prerequisites for sitting on the Security Council, Venezuela has a much greater claim to this seat than the U.S., and this is so obvious that it hardly warrants pointing out. Let’s take the Ebola issue first. As even The New York Times agrees, it is little Cuba (another country you decry) which is leading the fight against Ebola in Africa. Indeed, The New York Times describes Cuba as the “boldest contributor” to this effort and criticizes the U.S. for its diplomatic estrangement from Cuba.

Venezuela is decidedly not estranged from Cuba, and indeed is providing it with critical support to aid Cuba in its medical internationalism, including in the fight against Ebola in Africa and cholera in Haiti. And, accordingly, the UN has commended both Cuba and Venezuela for their role in the fight against Ebola. Indeed, the UN Secretary-General’s Special Envoy on Ebola recently stated:

I urge countries in the region and around the world to follow the lead of Cuba and Venezuela, who have set a commendable example with their rapid response in support of efforts to contain Ebola.

By this measure, then, Venezuela should be quite welcome on the Security Council.

In terms of ISIL, or ISIS as some call it, Venezuela bears no blame for that problem. Of course, the same cannot be said of the U.S., which has been aiding Islamic extremists in the region for decades, from the Mujahideen in Afghanistan (which gave rise to Bin Laden and Al Qaida) to the very radical elements in Syria who have morphed into ISIL. And, of course, the U.S.’s multiple military forays into Iraq — none of which you ever opposed, Ms. Power — have also helped bring ISIS to prominence there. So again, on that score, Venezuela has a much greater claim to a Security Council seat than the U.S.

And what about Mali? Again, it is the U.S. which has helped destabilize Mali through the aerial bombardment of Libya, which brought chaos to both countries in the process. Of course, you personally supported the U.S.-led destruction of Libya so you should be painfully aware of the U.S.’s role in unleashing the anarchy which now haunts Libya and Mali. Venezuela, on the other hand, opposed the U.S.’s lawless assault on Libya, thereby showing again its right to be on the Security Council.

Indeed, while you state quite correctly that “[t]he UN Charter makes clear that candidates for membership on the Security Council should be contributors to the maintenance of international peace and security and support the other purposes of the UN, including promoting universal respect for human rights,” the U.S. is unique in its undermining of all of these goals. It is the U.S. — through its ceaseless wars in countries such as Iraq, Afghanistan, Libya, Yugoslavia, El Salvador, Guatemala, Nicaragua and Vietnam, to name but a few — which has been the greatest force for unleashing chaos and undermining peace, security and human rights across the globe for the past six decades or so. As Noam Chomsky has recently opined — citing an international poll in which the U.S. was ranked by far “the biggest threat to world peace today” — the U.S. is indeed “a leading terrorist state.”

Meanwhile, Venezuela has played a key role in brokering peace in Colombia, and has been a leader in uniting the countries of Latin America and the Caribbean into new and innovative economic and political formations (such as ALBA) which allow these countries to settle their disputes peacefully, and to confront mutual challenges, such as Ebola. It is indeed because of such productive leadership that, as you note in your statement, Venezuela ran unopposed by any of its Latin American neighbors for the Security Council seat.

What’s more, as Chomsky again points out, Venezuela’s Hugo Chavez led “the historic liberation of Latin America” from centuries-long subjugation by Spain and then the U.S. I would submit that it is Venezuela’s leadership in that regard which in fact motivates your opposition to Venezuela’s seat on the Security Council, and not any feigned concern about world peace or human rights.


US diplomat tells Hungary to back EU, criticizes PM Orban over Russia stance

RT | October 24, 2014

A US diplomat visiting Hungary has criticized its PM’s policies towards Russia and stated that he believes Budapest should back the EU in its policy of imposing sanctions on Russia.

On Friday, US Chargé d’Affaires André Goodfriend criticized Hungarian Prime Minister Viktor Orban’s policies, particularly Hungary’s decision to grant Russia a contract to expand the Paks nuclear plant and its support for the South Stream gas pipeline.

Meanwhile, the US denied entry to six Hungarian public officials on Monday in light of corruption allegations. According to Goodfriend, however, the bans related to actions specific to each individual rather than to Hungarian politics as a whole.

Goodfriend criticized Hungary for veering away from the rule of law consolidated after its switch to democracy in 1989, and said it was not a good time to be debating the protection and autonomy of Hungarians in Ukraine.

Orban has been calling for the autonomy of some 200,000 Hungarians who currently reside within Ukrainian borders.

“Particularly with calls for autonomy among Hungarian ethnic nationals in Ukraine… this is not the time to have that discussion,” Goodfriend said.

Hungary should “stand firm with the EU, with EU sanctions,” he added, and should “understand the sensitivities on the ethnic nationalism question.”

The country has been critical of EU sanctions on Russia. Goodfriend stated that it was not the time for Hungary to “break with its EU partners to criticize so publicly the approach that the partners have taken”.

Hungary, however, is very much dependent on Russian gas supplies and says that the South Stream pipeline would actively aid its energy security.

In August, Orban condemned the EU sanctions against Russia, likening them to “shooting oneself in the foot.”

Russia is Hungary’s largest trade partner outside of the EU, with exports worth $3.4 billion in 2013.


Russia accuses Sweden of escalating tension in Baltic Sea

RT | October 24, 2014

The Russian Defense Ministry believes the military operation in the Baltic conducted by Sweden in search of possible “foreign underwater activity” can only undermine stability and escalate tension in the region.

“Such unfounded actions of the Swedish Defense Department, fuelled by the Cold War-style rhetoric, are only leading today to escalation of tension in the region,” Ministry spokesman Igor Konashenkov told journalists on Friday.

“It might result not in strengthening of a particular country’s security, but in undermining the principles of the naval economic activity in the Baltic Sea,” he added.

Konashenkov said Russian military officials were anticipating “the culmination of the exciting operation” accompanied by “never-ceasing speculations by the Swedish over detecting a ‘Russian submarine’ in the region of the Stockholm archipelago.”

Sweden started its largest military operation in the Baltic since the Cold War a week ago, explaining that the troops were engaged in the search for possible “foreign underwater activity.”

The Swedish media alleged the operation could be the hunt for a “damaged Russian submarine” in the area.

Moscow has repeatedly denied that any of its vessels have been damaged. Konashenkov on Friday once again ruled out any possibility of the Swedish military ever finding a Russian submarine in the Stockholm archipelago.

The Swedish military announced on Friday it is curtailing the search operation.

“This means the bulk of ships and amphibious forces have returned to port,” the armed forces said in a statement, cited by Reuters. The military said, however, that the area would still be monitored by smaller forces.

That’s a U-turn from Thursday’s statement by Swedish Armed Forces spokesman Erik Lagersten, who said that the operation was not scaling down, but was entering a “new phase.”

“The intelligence-gathering operation is continuing just as before,” Lagersten said, according to the Local. “We still believe there is underwater activity.”

On Tuesday, Sweden announced it was ready to use force if it detects any foreign submarine in the waters of the Stockholm Archipelago.

Stockholm has chosen not to prolong the program for military exchange with Moscow, citing Russia’s alleged “challenging” activity in the Baltic Sea, according to Sweden’s draft budget, made public on Thursday.

“This means that Defense Forces’ cooperation with Russia is suspended until further notice,” the text of the budget says.

The draft budget says Sweden has to boost its security. According to the document, Stockholm plans to increase its military spending for 2015 by 680 million kronor (US$93.7 million).

Background: Sweden ready to use force to surface foreign sub as search continues


Middle East borders bound to change: Israel minister

Press TV – October 24, 2014

Israeli Minister for Military Affairs Moshe Ya’alon says the borders of many Middle Eastern countries are bound to change in the future as a result of recent developments in the region.

The Israeli minister said in a recent interview with the US-based National Public Radio (NPR) that the current borders would change in the coming years, as some have “been changed already.”

He added that the borders of some countries in the region were artificially drawn by the West.

“Libya was a new creation, a Western creation as a result of World War I. Syria, Iraq, the same — artificial nation-states — and what we see now is a collapse of this Western idea,” he stated.

However, Ya’alon said the borders of some nations, including the Egyptian border with Israel, would remain unchanged.

“We have to distinguish between countries like Egypt, with their history. Egypt will stay Egypt,” said Ya’alon.

The minister did not say whether the borders of Israel, also drawn by Western powers after World War I, would change or not.

Regarding the right to return for Palestinian refugees, Ya’alon said Tel Aviv could not allow such a move, as it would keep the Israeli-Palestinian conflict alive “forever.”

He also said that insisting on the removal of Israeli settlers from the West Bank would amount to ethnic cleansing.

The Israeli regime expelled more than 700,000 people from their homeland after it occupied Palestine in 1948.

Israeli forces have wiped nearly 500 Palestinian villages and towns off the map, leaving an estimated total of 4.7 million Palestinian refugees hoping for an eventual return to their homeland more than six decades later.

Since 1948, the Israeli regime has denied Palestinian refugees the right of return, despite United Nations’ resolutions and international laws that uphold the people’s right to return to their homeland.

Tel Aviv has built over 120 illegal settlements since the occupation of the Palestinian territories of the West Bank and East al-Quds.


US opposes post-Fukushima nuclear safety proposal

RT | October 24, 2014

The United States is reportedly trying to fend off a Swiss attempt to change a multinational nuclear safety agreement in the wake of the 2011 Fukushima disaster in Japan.

Reuters and Bloomberg News both reported this week that Swiss officials are seeking amendments to the 77-nation Convention on Nuclear Safety, or CNS, so that countries around the globe are compelled to upgrade energy facilities in hopes of preventing a disaster like the one spawned by the Fukushima meltdown more than three years ago.

But while Reuters says the Swiss-led initiative is tentatively being backed by other European countries, the newswire alleges that energy officials in the US, Russia and Canada are all opposed to the measure, which would likely increase industry costs.

Although details of the proposed pact have not been made public, Bloomberg reported that it would involve rewriting “international standards to ensure nuclear operators not only prevent accidents but mitigate consequences if they occur, by installing costly new structures built to survive natural disasters.” In a report published on Thursday this week by Reuters, the newswire said that the proposed changes would not only apply up-to-date safety standards for new reactors, but also carry out back-fitting measures on sites that are already in operation.

According to this week’s reports, however, some of the world’s top energy powers are opposed because, as Reuters’ Fredrik Dahl wrote, any changes to the CNS could take years to come into force, and then only if they are ratified by the dozens of nations involved.

“You are trying to drop a Ferrari engine into a Volkswagen. If you want a new car, let’s go to the show room” and buy one, a senior but unnamed Department of State official said to Dahl.

But experts have previously said American facilities, in particular, are in need of upgrades. A July 2014 report published by the National Academy of Sciences said the US “should assess their preparedness for severe nuclear accidents associated with offsite-scale disasters.” Additionally, the authors of that study wrote that America’s current approach to nuclear safety is “clearly inadequate for preventing core-melt accidents and mitigating their consequences.” Yet newly initiated upgrades in the US are being conducted on a scale hardly comparable to what’s occurring overseas: according to Bloomberg, Electricite de France SA is spending around $13 billion on implementing safety measures on its 59 reactors, whereas American utilities will spend only $3 billion on portable generators and cooling reserves for roughly 100 reactors.

Nevertheless, officials in Berne remain optimistic that the countries currently opposed to the proposed changes will come to an agreement that makes facilities around the world more secure.

“Switzerland, as the initiator of the proposal, will continue to collaborate with all delegations and do everything to find a solution that is acceptable to all of us,” Georg Schwarz, deputy director general of the Swiss nuclear-safety regulator, ENSI, wrote to Bloomberg Businessweek.

Russian officials did not immediately respond to Bloomberg’s requests for comment, and neither Businessweek nor Reuters included remarks from Canada in their reports.


Canadian authorities ran war game drills depicting ISIS attack scenarios

By Brandon Martinez | Non-Aligned Media | October 23, 2014

Joshua Blakeney has pointed out that Adrienne Arsenault of CBC reported last night that, in the weeks leading up to the two so-called ‘terror’ incidents that took place this week in Quebec and Ottawa, Canadian authorities had been running war games exercises depicting such attacks.

The relevant commentary starts at 1:52 of the video below:

According to Arsenault,

They [Canadian authorities] may have been surprised by the actual incidents but not by the concepts of them. Within the last month we know that the CSIS, the RCMP and the National Security Task Force … ran a scenario that’s akin to a war games exercise if you will where they actually imagined literally an attack in Quebec, followed by an attack in another city, followed by a tip that that ‘hey some foreign fighters are coming back from Syria.’ So they were imagining a worst case scenario. We’re seeing elements of that happening right now. … [Canadian authorities] may talk today in terms of being surprised but we know that this precise scenario has been keeping them up at night for awhile.

What an amazing coincidence that Canadian intelligence ran a drill envisioning an attack first in Quebec, then another city. On Monday, October 20, a man identified as Martin Rouleau supposedly ran over two Canadian soldiers with his car in a mall parking lot in the city of Saint-Jean-sur-Richelieu, Quebec. And yesterday, as we know, one soldier was gunned down in Ottawa followed by a siege on the parliament itself. Authorities and media are claiming that both suspects were converts to Islam who had become “radicalized.”

What are the chances that these mock terror drills are just a coincidence? In nearly every instance of a major terrorist occurrence in the West, it has been revealed that intelligence services were conducting war games exercises mimicking the very events that later come to pass. On the day of the London subway bombings in 2005 British authorities ran drills depicting the exact attack scenario that transpired later in the day. On 9/11 multiple US agencies were running drills simulating jet hijackings. And now we have confirmation that Canada’s intelligence services were doing the same thing.

It has also been revealed that both suspects in the two incidents this week were being monitored by both US and Canadian intelligence for some time prior to their alleged attacks.


Root Cause Analysis of the Modern Warming

By Matt Skaggs | Climate Etc. | October 23, 2014

For years, climate scientists have followed reasoning that goes from climate model simulations to expert opinion, declaring that to be sufficient. But that is not how attribution works.

The concept of attribution is important in descriptive science, and is a key part of engineering. Engineers typically use the term “root cause analysis” rather than attribution. There is nothing particularly clever about root cause methodology, and once someone is introduced to the basics, it all seems fairly obvious. It is really just a system for keeping track of what you know and what you still need to figure out.

I have been performing root cause analysis throughout my entire, long career, generally in an engineering setting. The effort consists of applying well established tools to new problems. This means that in many cases, I am not providing subject matter expertise on the problem itself, although it is always useful to understand the basics. Earlier in my career I also performed laboratory forensic work, but these days I am usually merely a facilitator. I will refer to those that are most knowledgeable about a particular problem as the “subject matter experts” (SMEs).

This essay consists of three basic sections. First I will briefly touch on root cause methodology. Next I will step through how a fault tree would be conducted for a topic such as the recent warming, including showing what the top tiers of the tree might look like. I will conclude with some remarks about the current status of the attribution effort in global warming. As is typical for a technical blog post, I will be covering a lot of ground while barely touching on most topics, but I promise that I will do my best to explain the concepts as clearly and concisely as I can.

Part 1: Established Root Cause Methodology

Definitions and Scope

Formal root cause analysis requires very clear definitions and scope to avoid chaos. It is a tool specifically for situations in which we have detected an effect with no obvious cause, but discerning the cause is valuable in some way. This means that we can only apply our methodology to events that have already occurred, since predicting the future exploits different tools. We will define an effect subject to attribution as a significant excursion from stable output in an otherwise stable system. One reason this is important is that a significant excursion from stable behavior in an otherwise stable system can be assumed to have a single root cause. Full justification of this is beyond the scope of this essay, but consider that if your car suddenly stops progressing forward while you are driving, the failure has a single root cause. After having no trouble for a year, the wheel does not fall off at the exact same instant that the fuel pump seizes. I will define a “stable” system as one in which significant excursions are so rare in time that they can safely be assumed to have a single root cause.

Climate science is currently engaged in an attribution effort pertaining to a recent temperature excursion, which I will refer to as the “modern warming.” For purposes of defining the scope of our attribution effort, we will consider the term “modern warming” to represent the rise in global temperature since 1980. This is sufficiently precise to prevent confusion; we can always go back and tweak this date if the evidence warrants.

Choosing a Tool from the Toolbox 

There are two basic methods to conclusively attribute an effect to a cause. The short route to attribution is to recognize a unique signature in the evidence that can only be explained by a single root cause. This is familiar from daily life; the transformer in front of your house shorted and there is a dead black squirrel hanging up there. The need for a systematic approach such as a fault tree only arises when there is no black squirrel. We will return to the question of a unique signature later, after discussing what an exhaustive effort would look like.

Once we have determined that we cannot simply look at the outcome of an event and see the obvious cause, and we find no unique signature in the data, we must take a more systematic approach. The primary tools in engineering root cause analysis are the fault tree and the cause map. The fault tree is the tool of choice for when things fail (or more generally, execute an excursion), while the cause map is a better tool for when a process breaks down. The fault tree asks “how?,” while the cause map asks “why?” Both tools are forms of logic trees with all logical bifurcations mapped out. Fault trees can be quite complex with various types of logic gates. The key attributes of a fault tree are accuracy, clarity, and comprehensiveness. What does it mean to be comprehensive? The tree must address all plausible root causes, even ones considered highly unlikely, but there is a limit. The limit concept here is euphemistically referred to as “comet strike” by engineers. If you are trying to figure out why a boiler blew up, you are not obligated to put “comet strike” on your fault tree unless there is some evidence of an actual comet.

Since we are looking at an excursion in a data set, we choose the fault tree as our basic tool. The fault tree approach looks like this:

  1. Verify that a significant excursion has occurred.
  2. Collect sufficient data to characterize the excursion.
  3. Assemble the SMEs and brainstorm possible root causes for the excursion.
  4. Build a formal fault tree showing all the plausible causes. If there is any dispute about plausibility, put the prospective cause on the tree anyway.
  5. Apply documented evidence to each cause. This generally consists of direct observations and experimental results. Parse the data as either supporting or refuting a root cause, and modify the fault tree accordingly.
  6. Determine where evidence is lacking, develop a plan to generate the missing evidence. Consider synthetically modeling the behavior when no better evidence is available.
  7. Execute plan to fill all evidence blocks. Continue until all plausible root causes are refuted except one, and verify that the surviving root cause is supported by robust evidence.
  8. Produce report showing all of the above, and concluding that the root cause of the excursion was the surviving cause on the fault tree.

I will be discussing these steps in more detail below.
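Before doing so, it may help to see the bookkeeping of steps 4 through 7 in miniature. The Python sketch below is my own illustration, not part of any established toolkit; the node names and the finding number are invented. It treats the fault tree as nested blocks, maps numbered findings as supporting or refuting evidence, and queries for the surviving root causes:

    class FaultTreeNode:
        """One block on the fault tree."""
        def __init__(self, name, children=None):
            self.name = name
            self.children = children or []
            self.supports = set()  # numbers of findings supporting this cause
            self.refutes = set()   # numbers of findings refuting this cause

        def surviving_causes(self):
            """Leaf blocks (prospective root causes) not yet refuted."""
            if not self.children:
                return [] if self.refutes else [self.name]
            return [leaf for child in self.children
                    for leaf in child.surviving_causes()]

    # Top tiers only; an invented structure for illustration.
    tree = FaultTreeNode("modern warming", [
        FaultTreeNode("heat gained", [FaultTreeNode("solar irradiance increase")]),
        FaultTreeNode("heat lost", [FaultTreeNode("albedo decrease")]),
        FaultTreeNode("capacitance discharge", [FaultTreeNode("heat reservoir release")]),
    ])

    # Step 5 in miniature: hypothetical finding #3 refutes one leaf.
    tree.children[0].children[0].refutes.add(3)
    print(tree.surviving_causes())  # ['albedo decrease', 'heat reservoir release']

The point of the structure is only that every finding must be mapped across the entire tree, and the analysis ends when exactly one leaf survives.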

The Epistemology of Attribution Evidence

As we work through a fault tree, we inevitably must weigh the value of various forms of evidence. Remaining objective here can be a challenge, but we do have some basic guidelines to help us.

The types of evidence used to support or refute a root cause are not all equal. The differences can be expressed in terms of “fidelity.” When we examine a failed part or an excursion in a data set, our direct observations are based upon evidence that has perfect fidelity. The physical evidence corresponds exactly to the effect of the true root cause upon the system of interest. We may misinterpret the evidence, but the evidence is nevertheless a direct result of the true root cause that we seek. That is not true when we devise experiments to simulate the excursion, nor is it true when we create synthetic models.

When we cannot obtain conclusive root cause evidence by direct observation of the characteristics of the excursion, or direct analysis of performance data, the next best approach is to simulate the excursion by performing input/output (I/O) experimentation on the same or an equivalent system. This requires that we make assumptions about the input parameters, and we cannot assume that our assumptions have perfect fidelity to the excursion we are trying to simulate. Once we can analyze the results of the experiment, we find that it either reproduced our excursion of interest, or it did not. Either way, the outcome of the experiment has high fidelity with respect to the input as long as the system used in test has high fidelity to the system of interest. If the experiment based upon our best guess of the pertinent input parameters does not reproduce the directly-observed characteristics of the excursion, we do not discard the direct observations in favor of the experiment results. We may need to go back and double check our interpretation, but if the experiment does not create the same outcome as the actual event, it means we chose the wrong input parameters. The experiment serves to refute our best guess. The outcomes from experimentation obviously sit lower on an evidence hierarchy than direct observations.

The fidelity of synthetic models is limited in exactly the same way with respect to the input parameters that we plug into the model. But models have other fidelity issues as well. When we perform our experiments on the same system that had the excursion (which is ideal if it is available), or on an equivalent system, we take great care to assure that our test system responds the same way to input as the original system that had the excursion of interest. We can sometimes verify this directly. In a synthetic model, however, an algorithm is substituted for the actual system, and there will always be assumptions that go into the algorithm. This adds up to a situation in which we are unsure of the fidelity of our input parameters, and unsure of the fidelity of our algorithm. The compounded effect of this uncertainty is that we do not apply the same level of confidence to model results that we do to observations or experiment results. So in summary, and with everything else being equal, direct observation will always trump experimental results, and experimental results will always trump model output. Of course, there is no way to conduct meaningful experiments on analogous climates, so one of the best tools is not of any use to us.

Similar objective value judgments can be made about the comparison of two data sets. When we look at two curves and they both seem to show an excursion that matches in onset, duration and amplitude, we consider that to be evidence of correlation. If the wiggles also closely match, that is stronger evidence. Two curves that obviously exhibit the same onset, magnitude, and duration prior to statistical analysis will always be considered better evidence than two curves that can be shown to be similar after sophisticated statistical analysis. The less explanation needed to correlate two curves, the stronger the evidence of correlation.
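As a rough illustration of that hierarchy, here is a small sketch of mine (the 10% onset threshold is an arbitrary assumption) that scores two equal-length series on onset, amplitude, and wiggle agreement. The less massaging required before the numbers line up, the stronger the evidence:

    import numpy as np

    def curve_correlation_evidence(a, b):
        """Compare two equal-length series on onset, amplitude, and wiggle match."""
        a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
        amp_a, amp_b = a.max() - a.min(), b.max() - b.min()
        # onset: first index where a series departs 10% of its range from its start
        onset_a = int(np.argmax(np.abs(a - a[0]) > 0.1 * amp_a))
        onset_b = int(np.argmax(np.abs(b - b[0]) > 0.1 * amp_b))
        # wiggle agreement: correlation of the step-to-step changes
        wiggle = float(np.corrcoef(np.diff(a), np.diff(b))[0, 1])
        return {"onset_gap": abs(onset_a - onset_b),
                "amplitude_ratio": amp_a / amp_b,
                "wiggle_corr": wiggle}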

Sometimes we need to resolve plausible root causes but lack direct evidence and cannot simulate the excursion of interest by I/O testing. Under these circumstances, model output might be considered if it meets certain objective criteria. When attribution of a past event is the goal, engineers shun innovation. In order for model output to be considered in a fault tree effort, the model requires extensive validation, which means the algorithm must be well established. There must be a historical record of input parameters and how changes in those parameters affected the output. Ideally, the model will have already been used successfully to make predictions about system behavior under specific circumstances. Models can be both sophisticated and quite trustworthy, as we see with the model of planetary motion in the solar system. Also, some very clever methods have been developed to substitute for prior knowledge. An example is the Monte Carlo method, which can sometimes tightly constrain an estimation of output without robust data on input. Similarly, if you have good input and output data, we can sometimes develop a useful empirical relationship of the system behavior without really knowing much about how the system works. A simple way to think of this is to consider three types of information, input data, system behavior, and output data. If you know two of the three, you have some options for approximating the third. But if you only have adequate information on one or less of the types of information, your model approach is underspecified. Underspecified model simulations are on the frontier of knowledge and we shun their use on fault trees. To be more precise, simulations from underspecified models are insufficiently trustworthy to adequately refute root causes that are otherwise plausible.
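To make the Monte Carlo point concrete, here is a toy example; the response function and the input ranges are pure invention, standing in for a well-validated system algorithm. Even with only loose bounds on the inputs, sampling constrains the output distribution:

    import random

    def system_response(forcing, feedback):
        # stands in for a validated algorithm relating input to output
        return forcing * (1.0 + feedback)

    samples = sorted(system_response(random.uniform(1.0, 2.0),  # loose forcing bounds
                                     random.gauss(0.5, 0.1))    # rough feedback estimate
                     for _ in range(100_000))
    low, high = samples[2_500], samples[97_500]  # central 95% of the output
    print(f"output 95% range: {low:.2f} to {high:.2f}")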

Part 2: Established Attribution Methodology Applied to the Modern Warming

Now that we have briefly covered the basics of objective attribution and how we look at evidence, let’s apply the tools to the modern warming. Recall that attribution can only be applied to events in the past or present, so we are looking at only the modern warming, not the physics of AGW. A hockey stick shape in a data set provides a perfect opportunity, since the blade of the stick represents a significant excursion from the shaft of the stick, while the shaft represents the stable system that we need to start with.

I mentioned at the beginning that it is useful for an attribution facilitator to be familiar with the basics of the science. While I am not a climate scientist, I have put plenty of hours into keeping up with climate science, and I am capable of reading the primary literature as long as it is not theoretical physics or advanced statistics. I am familiar with the IPCC Assessment Report (AR) sections on attribution, and I have read all the posts at RealClimate.org for a number of years. I also keep up with some of the skeptical blogs, including Climate Etc., although I rarely enter the comment fray. I did a little extra reading for this essay, with some help from Dr. Curry. This is plenty of familiarity to act as a facilitator for attribution on a climate topic. Onward to the root cause analysis.

Step 1: Verify that a significant excursion has occurred.

Here we want to evaluate the evidence that the excursion of interest is truly beyond the bounds of the stability region for the system. When we look at mechanical failures, Step 1 is almost never a problem; there is typically indisputable visual evidence that something broke. In electronics, a part will sometimes seem to fail in a circuit but meet all of the manufacturer’s specifications after it is removed. When that happens we shift our analysis to the circuit, and the component originally suspected of causing the failure becomes a refuted root cause.

In looking at the modern warming, we first ask whether there are similar multi-decadal excursions in the past millennium of unknown cause. We also need to consider the entire Holocene. While most of the available literature states that the modern excursion is indeed unprecedented, this part of the attribution analysis is not a democratic process. We find that there is at least one entirely plausible temperature reconstruction for the last millennium that shows comparable excursions. Holocene reconstructions suggest that the modern warming is not particularly significant. We find no consensus as to the cause of the Younger Dryas, the Minoan, Roman, and Medieval warmings, or the Little Ice Age, all of which may constitute excursions of at least similar magnitude. I am not comfortable with this because we need to understand the mechanisms that made the system stable in the first place before we can meaningfully attribute a single excursion.

When confronted with a situation like this in my role as facilitator, I would have a discussion with my customer as to whether they want to expend the funds to continue the root cause effort, given the magnitude of uncertainty regarding the question of whether we even have a legitimate attribution target. I have grave doubts that we have survived Step 1 in this process, but let’s assume that the customer wants us to continue.

Step 2. Collect sufficient data to characterize the excursion.

The methodology can get a little messy here. Before we can meaningfully construct a fault tree, we need to carefully define the excursion of interest, which usually means studying both the input and output data. However, we are not really sure of what input data we need since some may be pertinent to the excursion while other data might not. We tend to rely upon common sense and prior knowledge as to what we should gather at this stage, but any omissions will be caught during the brainstorming so we need not get too worried.

The excursion of interest is in temperature data. We find that there is a general consensus that a warming excursion has occurred. The broad general agreement about trends in surface temperature indices is sufficient for our purposes.

The modern warming temperature excursion exists in the output side of the complex process known as “climate.” A fully characterized excursion would also include robust empirical input data, which for climate change would be tracking data for the climate drivers. When we look for input data at this stage, we are looking for empirical records of the climate both prior to and during the modern warming. We do not have a full list yet, but we know that greenhouse gases, aerosols, volcanoes, water vapor, and clouds are all important. Rather than continue on this topic here, I will discuss it in more detail after we construct the fault tree below. That way we can be specific about what input data we need.

Looking for a Unique Signature

Now that we have chosen to consider the excursion as anomalous and sufficiently characterized, this is a good time to look for a unique signature. Has the modern warming created a signature that is so unique that it can only be associated with a single root cause? If so, we want to know now so that we can save our customer the expense of the full fault tree that we would build in Steps 3 and 4.

Do any SMEs interpret some aspect of the temperature data as a unique signature that could not possibly be associated with more than one root cause? It turns out that some interpret the specific spatio-temporal heterogeneity pattern as being evidence that the warming was driven by the radiation absorbed by increased greenhouse gas (GHG) content in the atmosphere. Based upon what I have read, I don’t think there is anyone arguing for a different root cause creating a unique signature in the modern warming. The skeptic arguments seem to all reside under a claim that the signature is not unique, not that it is unique to something other than GHG warming. So let’s see whether we can take our shortcut to a conclusion that an increase in GHG concentration is the sole plausible root cause due to a unique data signature.

Spatial heterogeneity would be occurring up to the present day, and so can be directly measured. I have seen two spatial pattern claims about GHG warming: 1) the troposphere should warm more quickly, and 2) the poles should warm more quickly. Because this is important, I have attempted to track these claims back through time. The references mostly go back to climate modeling papers from the 1970s and 1980s. In the papers, I was unable to find a single instance where any of the feedbacks thought to enhance warming in specific locations were associated solely with CO2. Instead, some are associated with any GHG, while others such as arctic sea ice decrease occur due to any persistent warming. Nevertheless, the attribution chapter in IPCC AR5 contains a paragraph that seems to imply that enhanced tropospheric warming supports attribution of the modern warming to anthropogenic CO2. I cannot make the dots connect. But here is one point that cannot be overemphasized: the search for a unique signature in the modern warming is the best hope we have for resolving the attribution question.

Step 3. Assemble the SMEs and brainstorm plausible root causes for the excursion.

Without an overwhelmingly strong argument that we have a unique signature situation, we must do the heavy lifting involved with the exhaustive approach. Of course, I am not going to be offered the luxury of a room full of climate SMEs, so I will have to attempt this myself for the purposes of this essay.

Step 4. Build a Formal Fault Tree

An attribution analysis is a form of communication, and the effort is purpose-driven in that we plan to execute a corrective action if that is feasible. As a communication tool, we want our fault tree to be in a form that makes sense to those that will be the most difficult to convince, the SMEs themselves. And when we are done, we want the results to clearly point to actions we may take. With these thoughts in mind, I try to find a format that is consistent with what the SMEs already do. Also, we need to emphasize anthropogenic aspects of causality because those are the only ones we can change. So we will base our fault tree on an energy budget approach similar to a General Circulation Model (GCM), and we will take care to ensure that we separate anthropogenic effects from other effects.

GCMs universally, at least as far as I know, use what engineers call a “control volume” approach to track an energy budget. In a control volume, you can imagine an infinitely thin and weightless membrane surrounding the globe at the top of the atmosphere. Climate scientists even have an acronym for the location “top of the atmosphere,” TOA. Energy that migrates inside the membrane must equal energy that migrates outside the membrane over very long time intervals, otherwise the temperature would ramp until all the rocks melted or everything froze. In the rather unusual situation of a planet in space, the control volume is equivalent to a “control mass” equation in which we would track the energy budget based upon a fixed mass. Our imaginary membrane defines a volume but it also contains all of the earth/atmosphere mass. For simplicity, I will continue with the term “control volume.”

The control volume equation in GCMs is roughly equivalent to:

[heat gained] – [heat lost] = [temperature change]

This is just a conceptual equation because the terms on the left are in units of energy, while the units on the right are in degrees of temperature. The complex function between the two makes temperature an emergent property of the climate system, but we needn’t get too wrapped up in this. Regardless of the complexity hidden behind this simple equation, it is useful to keep in mind that each equation term (and later, each fault tree box) represents a single number that we would like to know.
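As a purely numerical illustration of that last point, the joule figures and the conversion factor below are placeholders, not measurements; each term is just one number we would like to know:

    # Placeholder energy terms for a multi-decade interval, in joules (invented).
    heat_gained = 1.00e22
    heat_lost   = 0.85e22
    net_energy  = heat_gained - heat_lost  # left side of the conceptual equation

    # The true mapping from net energy to temperature is the complex, emergent
    # function discussed above; a fixed factor crudely stands in for it here.
    degrees_per_joule = 4.0e-22
    print(f"implied temperature change: {net_energy * degrees_per_joule:+.2f} K")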

There is a bit of housekeeping we need to do at this point. Recall that we are only considering the modern warming, but we can only be confident about the fidelity of our control volume equation when we consider very long time intervals. To account for the disparity in duration, we need to consider the concept of “capacitance.” A capacitor is a device that will store energy under certain conditions, but then discharge that energy under a different set of conditions. As an instructive example, the argument that the current hiatus in surface temperature rise is being caused by energy storage in the ocean is an invocation of capacitance. So to fit our approach to a discrete time interval, we need the following modification:

[heat gained] + [capacitance discharge] – [heat lost] – [capacitance recharge] = [modern warming]

Note that now we are no longer considering the entire history of the earth, we are only considering the changes in magnitude during the modern warming interval. Our excursion direction is up, so we discard the terms for a downward excursion. Based upon the remaining terms in our control volume equation, the top tier of the tree is this:

[Figure: Slide 1, the top tier of the fault tree]

From the control volume standpoint, we have covered heat that enters our imaginary membrane, heat that exits the membrane, and heat that may have been stashed inside the membrane and is only being released now. I should emphasize that this capacitance in the top tier refers to heat stored inside the membrane prior to the modern warming that is subsequently released to create the modern warming.

This top tier contains our first logical bifurcation. The two terms on the left, heat input and heat loss, are based upon a supposition that annual changes in forcing will manifest soon enough that the change in temperature can be considered a direct response. This can involve a lag as long as the lag does not approach the duration of the excursion. The third term, capacitance, accounts for the possibility that the modern warming was not a direct response to a forcing with an onset near the onset of our excursion. An alternative fault tree can be envisioned here with something else in the top tier, but the question of lags must be dealt with near the top of the tree because it constitutes a basic division of what type of data we need.

The next tier could be based upon basic mechanisms rooted in physics, increasing the granularity:

[Figure: Slide 2, the second tier of the fault tree]

The heat input leg represents heat entering the control volume, plus the heat generated inside. We have a few oddball prospective causes here that rarely see the light of day. The heat generated by anthropogenic combustion and geothermal heat are a couple of them. In this case, it is my understanding that there is no dispute that any increases above prior natural background combustion (forest fires, etc.) and geothermal releases are trivial. We put these on the tree to show that we have considered them, but we need not waste time here. Under heat loss, we cover all the possibilities with the two basic mechanisms of heat transfer, radiation and conduction. Conduction is another oddball. The conduction of heat to the vacuum of space is relatively low and would be expected to change only slightly in rough accordance with the temperature at TOA. With conduction changes crossed off, a decrease in outward radiation would be due to a decreased albedo, where albedo represents reflection across the entire electromagnetic spectrum. A control volume approach allows us to lump convection in with conduction. The last branch in our third tier is the physical mechanism by which a temperature excursion occurs due to heat being released from a reservoir, which is a form of capacitance discharge.

I normally do not start crossing off boxes until the full tree is built. However, if we cross off the oddballs here, we see that the second tier of the tree decomposes to just three mechanisms, solar irradiance increase, albedo decrease, and heat reservoir release. This comes as no revelation to climate scientists.

This is as far as I am going in terms of building the full tree, because the next tier gets big and I probably would not get it right on my own. Finishing it is an exercise left to the reader! But I will continue down the “albedo decrease” leg until we reach anthropogenic CO2-induced warming, the topic du jour. A disclaimer: I suspect that this tier could be improved by the scrutiny of actual SMEs.

[Figure: Slide 3, the albedo decrease leg expanded down to anthropogenic CO2-induced warming]

The only leg shown fully expanded is the one related to CO2; the reader is left to envision the entire tree if each leg were to be expanded in a similar manner. The bottom left corner of this tree fragment shows anthropogenic CO2-induced warming in proper context. Note that we could have separated anthropogenic effects at the first tier of the tree, but then we would have two almost identical trees.

Once every leg is completed in this manner, the next phase of adding evidence begins.

Step 5. Apply documented evidence to each cause.

Here we assess the available evidence and decide whether it supports or refutes a root cause. The actual method used is often dictated by how much evidence we are dealing with. One simple way is to make a numbered list of evidence findings. Then when a finding supports a root cause, we can add that number to the fault tree block in green. When the same finding refutes a different root cause, we can add the number to the block in red. All findings must be mapped across the entire tree.

The established approach to attribution looks at the evidence based upon the evidence hierarchy and exploits any reasonable manner of simplification. The entire purpose of a control volume approach is to avoid having to understand the complex relationship that exists between variables within the control volume. For example, if you treat an engine as a control volume, you can put flow meters on the fuel and air intakes, a pressure gauge on the exhaust, and an rpm measurement on the output shaft. With those parameters monitored, and a bit of historical data on them, you can make very good predictions about the trend in rpm of the engine based upon changes in inputs without knowing very much about how the engine translates fuel into motion. This approach does not involve any form of modeling and is, as I mentioned, the rationale for using control volume in the first place.
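Here is the engine example in sketch form, with invented readings: a first-order empirical fit to historical input/output data predicts the output trend with no model of the engine internals at all.

    import numpy as np

    fuel_flow = np.array([2.0, 2.5, 3.0, 3.5, 4.0])       # intake readings (invented)
    rpm       = np.array([1480, 1900, 2310, 2700, 3120])  # output shaft rpm (invented)

    slope, intercept = np.polyfit(fuel_flow, rpm, 1)  # purely empirical relationship
    print(f"predicted rpm at a fuel flow of 4.5: {slope * 4.5 + intercept:.0f}")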

The first question the fault tree asks of us is captured in the first tier. Was the modern warming caused by a direct response to higher energy input, by a direct response to lower energy loss, or by the release of heat stored during an earlier interval? If we consider this question in light of our control volume approach (we don’t really care how energy gets converted to surface temperature), we see that we can answer the question with simple data in units of energy, watts or joules. Envision data from, say, 1950 to 1980, in terms of energy. We might find that for the 30-year interval, heat input was x joules, heat loss was y joules, and capacitance release was z joules. Now we compare that to the same data for the modern warming interval. If any one of the latter numbers is substantially more than the corresponding earlier numbers x, y, or z, we have come a long way already in simplifying our fault tree. A big difference would mean that we can lop off the other legs. If we see big changes in more than one of our energy quantities, we might have to reconsider our assumption that the system is stable.

In order to resolve the lower tiers, we need to take our basic energy change data and break it down by year, so joules/year. If we had reasonably accurate delta joules/year data relating to the various forcings, we could wiggle match between the data and the global temperature curve. If we found a close match, we would have strong evidence that forcings have an important near-term effect, and that (presumably) only one root cause matches the trend. If no forcing has an energy curve that matches the modern warming, we must assume capacitance complicates the picture.

Let’s consider how this would work. Each group of SMEs would produce a simple empirical chart for their fault tree block estimating how much energy was added or lost during a specific year within the modern warming, ideally based upon direct measurement and historical observation. These graphs would then be the primary evidence blocks for the tree. Some curves would presumably vary around zero with no real trend, others might decline, while others might increase. The sums roll up the tree. If the difference between the “heat gained” and “heat lost” legs shows a net positive upward trend in energy gained, we consider that as direct evidence that the modern warming was driven by heat gained rather than capacitance discharge. If those two legs sum to near zero, we can assume that the warming was caused by capacitance discharge. If the capacitance SMEs (those that study El Nino, etc.) estimate that a large discharge likely occurred during the modern warming, we have robust evidence that the warming was a natural cycle.
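A sketch of that roll-up logic follows; the function, the 5% tolerance, and the data are all invented for illustration:

    def first_tier_verdict(gained, lost, capacitance, tolerance=0.05):
        """Each argument is a list of joules/year estimates over the interval."""
        net = sum(gained) - sum(lost)
        scale = max(abs(sum(gained)), abs(sum(lost)))
        if net > tolerance * scale:
            return "driven by heat gained"
        if abs(net) <= tolerance * scale and sum(capacitance) > 0:
            return "consistent with capacitance discharge (natural cycle)"
        return "unresolved: fill the empty evidence blocks first"

    # Invented annual estimates for a five-year slice of the interval:
    print(first_tier_verdict(gained=[5, 6, 6, 7, 7],
                             lost=[5, 5, 6, 6, 6],
                             capacitance=[0, 0, 0, 0, 0]))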

Step 6. Determine where evidence is lacking, develop a plan to generate the missing evidence.

Once all the known evidence has been mapped, we look for empty blocks. We then develop a plan to fill those blocks as our top priority.

I cannot find the numbers to fill in the blocks in the AR documents. I suspect that the data does not exist for the earlier interval, and perhaps cannot even be well estimated for the modern warming interval.

Step 7. Execute plan to fill all evidence blocks.

Here we collect evidence specifically intended to address the fault tree logic. That consists of energy quantities from both before and during the modern warming. Has every effort been made to collect empirical data about planetary albedo prior to the modern warming? I suspect that this is a hopeless situation, but clever SMEs continually surprise me.

In a typical root cause analysis, we continue until we hopefully have just one unrefuted cause left. The final step is to exhaustively document the entire process. In the case of the modern warming, the final report would carefully lay out the necessary data, the missing data, and the conclusion that until and unless we can obtain the missing data, the root cause analysis will remain unresolved.

Part 3: The AGW Fault Tree, Climate Scientists, and the IPCC: A Sober Assessment of Progress to Date

I will begin this section by stating that I am unable to assess how much progress has been made towards resolving the basic fault tree shown above. That is not for lack of trying; I have read all the pertinent material in the IPCC Assessment Reports (ARs) on a few occasions. When I read these reports, I am bombarded with information concerning the CO2 box buried deep in the middle of the fault tree. But even for that box, I am not seeing a number that I could plug into the equations above. For other legs of the tree, the ARs are even more bewildering. If climate scientists are making steady progress towards being able to estimate the numbers to go in the control volume equations, I cannot see it in the AR documents.

How much evidence is required to produce a robust conclusion about attribution when the answer is not obvious? For years, climate scientists have followed reasoning that goes from climate model simulations to expert opinion, declaring that to be sufficient. But that is not how attribution works. Decomposition of a fault tree requires either a unique signature, or sufficient data to support or refute every leg of the tree (not every box on the tree, but every leg). At one end of the spectrum, we would not claim resolution if we had zero information, while at the other end, we would be very comfortable with a conclusion if we knew everything about the variables. The fault tree provides guidance on the sufficiency of the evidence when we are somewhere in between. My customers pay me to reach a conclusion, not muck about with a logic tree. But when we lack the basic data to decompose the fault tree, maintaining my credibility (and that of the SMEs as well) demands that we tell the customer that the fault tree cannot be resolved because we lack sufficient information.

The curve showing CO2 rise and the curve showing the modern global temperature rise do not look the same, and signal processing won’t help with the correlation. Instead, there is hypothesized to be a complex function involving capacitance that explains the primary discrepancy, the recent hiatus. But we still have essentially no idea how much capacitance has contributed to historical excursions. We do not know whether there is a single mode of capacitance that swamps all others, or whether there are multiple capacitance modes that go in and out of phase. Ocean capacitance has recently been invoked as perhaps the most widely endorsed explanation for the recent hiatus in global warming, and there is empirical evidence of warming in the ocean. But invoking capacitance to explain a data wiggle down on the fifth tier of a fault tree, when the general topic of capacitance remains unresolved in the first tier, suggests that climate scientists have simply lost the thread of what they were trying to prove. The sword swung in favor of invoking capacitance to explain the hiatus turns out to have two edges. If the system is capable of exhibiting sufficient capacitance to produce the recent hiatus, there is no valid argument against why it could not also have produced the entire modern warming, unless that can be disproven with empirical data or I/O test results.

Closing Comments

Most of the time when corporations experience a catastrophe such as a chemical plant explosion resulting in fatalities, they look to outside entities to conduct the attribution analysis. This may come as a surprise given the large sums of money at stake and the desire to influence the outcome, but consider the value of a report produced internally by the corporation. If the report exonerates the corporation of all culpability, it will have zero credibility. Sure, they can blame themselves to preserve their credibility, but their only hope of a credible exoneration is if it comes from an independent entity. In the real world, the objectivity of an independent study may still leave something to be desired, given the fact that the contracted investigators get their paycheck from the corporation, but the principle still holds. I can only assume when I read the AR documents that this never occurred to climate scientists.

The science of AGW will not be settled until the fault tree is resolved to the point that we can at least estimate a number for each leg in our fault tree based upon objective evidence. The tools available have thus far not been up to the task. With so much effort put into modelling CO2 warming while other fault tree boxes are nearly devoid of evidence, it is not even clear that the available tools are being applied efficiently.

The terms of reference for the IPCC are murky, but it is clear that it was never set up to address attribution in any established manner. There was no valid reason to not use an established method, facilitated by an entity with expertise in the process, if attribution was the true goal. The AR documents are position papers, not attribution studies, as exemplified by the fact that supporting and refuting arguments cannot be followed in any logical manner and the arguments do not roll up into any logical framework. If AGW is really the most important issue that we face, and the science is so robust, why would climate scientists not seek the added credibility that could be gained from an independent and established attribution effort?


The Long Battle Over Pesticides, Birth Defects and Mental Impairment

By Dr. JANETTE D. SHERMAN, MD | CounterPunch | October 24, 2014

The recent spate of articles in the popular press concerning loss of intellect among children exposed to chlorpyrifos is important in the case of this pesticide. Although in-home use of chlorpyrifos was restricted in the U.S. in 2000, it is widely used in agriculture, and it poses a serious risk to health and intellect for people working and living in proximity to fields. Detectable levels of chlorpyrifos in New York City children raise the question of exposure via food.

Across the U.S. we learn that students are doing poorly in school, with blame often placed on the teachers and their unions. Are teachers no longer competent to teach, or have children been “dumbed-down” by exposure to this neurotoxin?

The State of California is considering restrictions on use, but can expect strong opposition from the pesticide and big-agriculture industries.

Back in the “Dark Ages” – a mere 50 years ago – when I was a medical student and intern at Wayne State University, I rotated through Children’s Hospital in Detroit. It was staffed by some of the most thoughtful and kind physician-professors I have ever met. I attended a clinic named “FLK,” otherwise known as the Funny Looking Kid clinic. There we saw children who had abnormal-looking faces, abnormal body parts and, often, impaired intelligence. Many of the children required complicated medical care, but I don’t recall much discussion as to why they had these abnormalities that had dramatically cut short their futures and altered the lives of their families.

Realizing you have given birth to a child with birth defects is devastating – not only for the child, but for the family and for society in general. If the child survives infancy, it means being “different,” having to cope with disability and having to learn alternative ways to function. For many families, it means 24/7 care of a child who can never live independently. For society the costs can be enormous – surgery (often multiple), medications, social services, special education, special equipment, and then alternative living arrangements if and when the family can no longer care for their child, now grown into a non-functional adult.

Although the neurotoxicity of pesticides has been known for decades, several national magazines have recently named the pesticide chlorpyrifos (Dursban/Lorsban) as an agent causing loss of intelligence, as well as birth defects and structural brain damage.

Dr. James Hamblin’s article in the March 2014 issue of The Atlantic, titled “The Toxins That Threaten Our Brains,” listed 12 commonly used chemicals, including chlorpyrifos, which is marketed as Dursban and Lorsban. The exposures described in The Atlantic article were urban, so we do not know exactly how widespread this epidemic is, especially if we do not include agricultural areas such as California, Hawaii and the Midwest.

That same month, The Nation published articles by Susan Freinkel (“Poisoned Politics”) and Lee Fang (“Warning Signs”) reporting adverse effects from exposure to Dursban and Lorsban.

Dr. Hamblin’s article generously cites Drs. Philip Landrigan of Mt. Sinai in New York City and Philippe Grandjean of Harvard, who warn that a “silent pandemic” of toxins has been damaging the brains of unborn children.

Dr. Landrigan chaired a 1998 meeting of the Collegium Ramazzini International Scientific Conference, held in Carpi, Italy. In attendance was Dr. Grandjean, whose research found “Methylmercury as a hazard to brain development.” Dr. Richard Jackson of the U.S. CDC was also in attendance, as were other U.S. governmental and university members.

At that Collegium Ramazzini International Scientific Conference, on October 25, 1998, I presented definitive data in my paper “Chlorpyrifos (Dursban) exposure and birth defects: report of 15 incidents, evaluation of 8 cases, theory of action, and medical and social aspects.” This presentation followed my earlier publications, beginning in 1994, in which I reported damage to the unborn from the same pesticide.

The Ramazzini organization sent my paper to the European Journal of Oncology for publication. Since my paper reported birth defects, not cancer, the paper has received little notice, but the attendees, including the EPA, have known of the findings for 16 years.

Currently a new battle is occurring in Hawaii over the use of pesticides, especially by Dow AgroSciences, DuPont Pioneer, BASF Plant Science and Syngenta on the island of Kauai, where giant seed companies develop Genetically Modified Organisms (GMOs) and other specialized seeds. The pesticides used there include alachlor, atrazine, chlorpyrifos, methomyl, metolachlor, permethrin and paraquat. Paul Koberstein of Cascadia Times estimates that more than 2,000 pounds of chlorpyrifos are used on Kauai per acre per year, compared with an average of less than 0.025 for the U.S. mainland.

In addition to Hawaii, affected areas in California include workers and families in the Imperial Valley and other regions where intensive agriculture makes pesticide use extensive. Using the Koberstein data, annual use of chlorpyrifos in California is approximately 1,500 pounds per acre.
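For readers who want to sanity-check such figures, the underlying arithmetic is simple; here is a minimal sketch with hypothetical inputs rather than the actual Koberstein data.

```python
# A minimal sketch (hypothetical inputs, not the Koberstein data) of the
# application-rate arithmetic behind the figures quoted above: total pounds
# applied divided by treated acres gives pounds per acre per year.
total_pounds_applied = 18_000.0   # hypothetical annual chlorpyrifos use, lb
treated_acres = 12_000.0          # hypothetical treated acreage

rate_lb_per_acre = total_pounds_applied / treated_acres
print(f"application rate: {rate_lb_per_acre:.3f} lb/acre/year")  # -> 1.500
```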

Neurological Damage: Before and After Birth

Birth defects arise as a result of two mechanisms – damage to a gene prior to fertilization, or damage to the growing cells of the fetus after life in the womb has begun. Differing from genetic damage, such as occurs in Down syndrome (Trisomy-21), the latter damage results from exposure of the developing fetus to agents called teratogens. For many years “Mongolism” was the name applied to children with growth delays, similar facial and hand features and intellectual deficits.

Chlorpyrifos is a unique pesticide. It is a combination of an organophosphate and a trichlorinated pyridinol (TCP). TCP is not only the feedstock used in the manufacture of chlorpyrifos, but also a contaminant in the product and a metabolic breakdown product, one known to cause central nervous system abnormalities (hydrocephaly and dilated brain ventricles) and other abnormalities (cleft palate, skull and vertebral abnormalities) in fetuses, as reported by Dow Chemical Co.

In March 1995, I was asked to fly to Arkansas to see a child whose mother had been exposed to the pesticide Dursban (chlorpyrifos) early in the pregnancy of her daughter.

Mrs. S. had been working in a bank when, in mid-March 1991, she noticed a man spraying the baseboards behind the teller station where she worked. She said she asked the man if it was okay to be in the area since she was pregnant, and that he told her it was “perfectly safe.” She said the spraying had occurred around 4 PM, that she worked at the bank until 6:30 PM, and that when she went home that evening she had nausea and a “bit of headache.” She said she returned to work the next day, felt nauseated, but worked most of the day. An electrical fire at the drive-in window followed the pesticide event, and a technician used a fogger that sprayed a “citrus-like” chemical intended to deodorize the smoke odor. Mrs. S. said she worked at the bank until about April of that year, and then worked at a credit union until her daughter was born in September.

When Mrs. S. was about five months pregnant, she had an ultrasound, which showed that her baby had enlarged ventricles in her brain. Further examination revealed absence of the septum pellucidum, a central portion of the brain. Mrs. S. had additional follow-up at a university center, as well as with her own physician, which showed a normal amniocentesis and normal chromosomes.

Both Mr. and Mrs. S. said that caring for their daughter A. has been a severe financial and emotional drain, sometimes requiring them to be up for 72 hours trying to soothe A.’s crying. A. had surgery to repair her cleft lip when she was six months old, and repair of her cleft palate and left eyelid when she was a year old.

Both cleft lip and palate can now be repaired (in areas with skilled surgeons, and insurance or other funds), but until they are, the child has difficulty feeding and risks poor nutrition and upper respiratory and lung problems as a result of aspiration of food.

Additional diagnostic procedures indicated that A. has a cleft left eye (a failure of the eye to fuse during development), and she cannot blink that eye or move the left side of her face.

A. was unable to sit up on her own by the time she was a year old and had to have her food pureed until she was two. As she neared her fourth birthday, her parents realized that she could not hear, and they began a program of sign language with the aid of a speech therapist.

A.’s brother B. was born two years later and is well; he was sleeping through the night by the time he was two weeks old.

I was given a tour of the bank where Mrs. S. worked by its Senior Vice-President, and, to minimize stress to A., I examined her in the office and presence of her pediatrician. I also accompanied her parents to their home, where I could observe A. in her own surroundings.

A. was a small-boned child who walked with a wide-based, unsteady gait and made audible sounds with no language content. Her head was enlarged with hydrocephaly and bore a small bruise from a recent fall, a common occurrence for her.

Her abnormalities included low-set, tapering ears, wide-spaced nipples and frequent infections, and were characteristic of findings in the other children. This litany is not meant to horrify readers, but to bring to attention the burdens imposed upon this child, her parents and society as a whole. I evaluated seven more children, including two families that each had two children with similar, but more severe, medical conditions.

With the exception of child #1, the seven children were profoundly retarded, were diapered, could not speak, and required feeding.

I first met C. and D. in 1996, along with their parents and their handsome, healthy older brother, at their attractive home on the West Coast. Both D. (a girl) and C. (a boy) were lying flat, diapered, mouths open, fists clenched, staring into space, and being fed by bottle. Even today, looking at the photographs reminds me what an enormous burden was dealt to that family.

Ultimately I evaluated eight children, and identified seven more who had been reported by Dow Chemical Co., the manufacturer, to EPA on November 2, 1994, with reporting delays of as long as seven years from when the corporation first learned of them. I obtained the reports via a Freedom of Information (FOI) request to EPA. The reports were labeled with the revealing name “DERBI” – for “Dow Elanco Research Business Index.”

When I saw the seven additional children, all of whom looked like siblings (much as Trisomy-21, or Down syndrome, children do), it became clear to me that the cause was linked to Dursban, the prenatal exposure common to each.

Among the Dursban-exposed children, all 8 had both brain and palate abnormalities; seven had wide-spaced nipples and growth retardation; six had low vision or blindness; six had genital abnormalities; five had brain atrophy and external ear abnormalities; and four had absence of the corpus callosum, the critical connection between the two hemispheres of the brain. Chromosomal studies were normal in all 8 families. All families reported stress and the enormous financial burden of caring for their children.

In addition to the children with birth defects, I also evaluated a number of families and a group of adults who had been exposed at their work site. Of the workers, all 12 complained of headache, and three of dizziness. Eight had findings of central nervous system damage, and six had peripheral nervous system damage. The patients reported upper respiratory and chest symptoms, as well as nausea, vomiting, diarrhea, and four had incontinence. The families also reported abnormalities and deaths in their household pets.

In February 1996, my deposition in the first case was taken by three groups of attorneys representing the defendants, two principally defending DowElanco. I was questioned for three 8-hour days. Ultimately a list of 565 exhibits was accumulated that included over 10,000 pages of materials that I supplied and relied upon for my opinion. These materials included Dow documents and correspondence, EPA documents, legal depositions, basic embryology, biochemistry and toxicology of chlorpyrifos, medical records of other exposed children, patents, books, articles, etc.

Chlorpyrifos was designed to be neurotoxic in action. It is an interesting pesticide in that it has not only an organophosphate portion, but also three chlorine atoms attached to a pyridinol ring. This ring, trichloropyridinol (TCP), is a significant hazard because it is fat-soluble and persistent – for up to 18 years, as claimed by Dow Chemical Co. TCP also forms the body of trichlorophenoxyacetic acid, part of Agent Orange, which is likewise linked to birth defects and cancer. Though the war ended in 1975, Agent Orange continues as a risk to the Vietnamese and to military troops that were stationed there.

According to multiple Dow documents, TCP is the feedstock for production of chlorpyrifos, a contaminant in the product, and a metabolic breakdown product. TCP has been demonstrated to cause central nervous system anomalies (hydrocephaly and dilated brain ventricles), as well as cleft palate and skull and vertebral abnormalities, in the fetus at doses nontoxic to the mother – similar to the defects seen in the affected children.

That TCP caused birth defects was known by Dow in 1987, but not reported to EPA until five years later, in 1992. TCP is used to manufacture chlorpyrifos, and as such comes under Section 8(e) of the Toxic Substances Control Act (TSCA), rather than the Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA). Though the regulatory venue differs, TSCA states very clearly that “any person who manufactures, processes or distributes in commerce a chemical substance or mixture, or who obtains information which reasonably supports the conclusion that such substance or mixture presents a substantial risk of injury to health or the environment, shall immediately inform the Administrator of such information.” From 1976 to 1982, I was a member of a 16-person Advisory Committee to the EPA for TSCA, Chairman of the Risk-Benefit Assessment Group from 1977 to 1979, and a member of the Carcinogen Policy Sub-group from 1977 to 1981. It was clear that risks and benefits do not accrue to the same party. In the case of chlorpyrifos, the risks are to the unaware public, and the benefits to the corporation.

The Legal System is Not the Same as the Justice System

Bernard P. Whetstone was a well-established attorney who handled the initial birth-defects case in Little Rock, Arkansas, and was aware of another case in that state. Mr. Whetstone was a “Southern Gentleman” with a soft drawl who had earned both a bachelor’s and a doctorate of jurisprudence and had started practice in 1934. From 1995 he worked with Davidson and Associates until he retired in 1999 at age 86. Mr. Whetstone died in 2001.

I was required to appear in court in Little Rock, where Judge Eisley ruled that I was not qualified. It is hard to believe that 10,000 pages of documents was not adequate, but that opinion was softened by the fact that he ruled that none of the plaintiff’s experts were qualified. Another physician/toxicology expert and I evaluated additional patients (adults) who had developed multiple adverse effects, including central nervous system damage, and Dow, employing the Eisley decision, argued successfully in other court jurisdictions that we were not qualified to give an opinion.

The main Dow law firm was Barnes and Thornburg of Indianapolis, where Eli Lilly, Dow’s partner in the DowElanco joint venture, is located. Eli Lilly is a manufacturer of both pharmaceuticals and pesticides. Barnes & Thornburg has over 500 attorneys in 12 cities and appeared to be very well staffed and funded.

A recent news release noted that William W. Wales, who spent more than 30 years in the legal department of The Dow Chemical Company and Dow AgroSciences LLC, had joined Barnes & Thornburg LLP’s Indianapolis office as a partner in the firm’s litigation and corporate departments. “Bill’s depth and breadth of experience in a variety of matters will be a tremendous asset to many of our clients who are dealing with similar issues,” said Joseph G. Eaton, Vice Chair of the firm’s Litigation Department and Co-Chair of the Toxic Tort Practice Group. Joseph Eaton is one of the attorneys who took my extensive deposition. They were the most aggressive law firm I had ever encountered, and I have testified in more than 700 depositions and/or court appearances.

In defense of their product, the Dow attorneys argued that there were no reports of the levels of pesticides used or of existing levels – a questionable tactic, since the corporation had never suggested or requested that such records be obtained.

Although the EPA stopped home use of Dursban in 2000, Lorsban is widely used in agriculture, on ornamentals, and in places where women, the unborn and children are exposed. For many, this exposure occurs without their knowledge or consent. How is this allowed to happen?

Is it successful advertising, recommendations from county and state agricultural agents, or an inept or politically adept EPA? Recall that after September 11, 2001, the then administrator of the U.S. Environmental Protection Agency and former governor of New Jersey, Christie Whitman, said on September 13, 2001: “EPA is greatly relieved to have learned that there appears to be no significant levels of asbestos dust in the air in New York City.”

A week later, Whitman said: “Given the scope of the tragedy from last week, I am glad to reassure the people of New York and Washington, DC that their air is safe to breathe and their water is safe to drink.”

In 2008, the U. S. EPA named Dow as an Energy Star Partner of the Year for excellence in energy management and reductions in greenhouse gas emissions.

Dow’s fleet of skilled lawyers has managed to save Dow from liability, as when it achieved a reversal of a $925 million judgment for the contamination of the area around Rocky Flats, the Colorado facility that produced plutonium triggers for hydrogen bombs. And a lawsuit filed against Dow and Monsanto by Vietnamese plaintiffs damaged by Agent Orange was dismissed.

Dow is a multinational corporation and the third-largest chemical manufacturer in the world, with earnings of more than $57 billion in 2013. In addition to insecticides, herbicides, fungicides and genetically modified seeds, Dow also manufactures multiple plastics, polystyrene, polyurethane, synthetic rubber and bisphenol-A, as well as many other chemicals.

What are the chances that the use of Lorsban will be curtailed in the agricultural areas of Hawaii, California and elsewhere? Given what we know of the financial strength of the Dow Corporation, the weakness of the EPA, and our paid-for Congress, it does not look promising.

The Burden of Brain Damage 

If the top corporate officials were required to care for one of these severely brain-damaged children for a week, would it change their minds about the ethics of manufacturing chlorpyrifos and corporate profits?

There is not a teacher who can teach brain-damaged children to read and do math, which raises the larger question posed above: is children’s failure to learn due to poor teachers, or to subtle brain damage? If children are being damaged to various degrees – profoundly, as in the 15 children cited in my research, or with “mild” learning and/or behavioral problems ranging from decreased IQ to Asperger’s, hyperactivity and autism – how much is attributable to exposure to pesticides such as Dursban/Lorsban? If we blame poor teaching and teachers’ unions, but don’t stop the use of brain-damaging pesticides, where does that leave our U.S. society as a source of creativity and intellect in this world?

Note: All of my chlorpyrifos/ Dursban documents have been accepted and will be archived at the National Library of Medicine, along with my other scientific, medical and legal research.

Janette D. Sherman, M.D. is the author of Life’s Delicate Balance: Causes and Prevention of Breast Cancer and Chemical Exposure and Disease, and is a specialist in internal medicine and toxicology. She edited the book Chernobyl: Consequences of the Catastrophe for People and Nature, written by A. V. Yablokov, V. B. Nesterenko and A. V. Nesterenko, published by the New York Academy of Sciences in 2009. Her primary interest is the prevention of illness through public education. She can be reached at toxdoc.js@verizon.net and www.janettesherman.com

October 24, 2014 Posted by | Deception, Economics, Environmentalism, Science and Pseudo-Science, Timeless or most popular | , , , , , , , | Leave a comment

MH-17: The Untold Story

RT | October 22, 2014

Three months after Malaysia Airlines Flight MH17 was violently brought down from the skies over Ukraine, there are still no definitive answers to what caused the tragedy.

Civil conflict in the area prevented international experts from conducting a full and thorough investigation.

The wreckage should have been collected and scrupulously re-assembled to identify all the damage, but this standard investigative procedure was never carried out. Until that is done, evidence can only be gleaned from pictures of the debris, the flight recorders (black boxes) and eyewitness testimonies. This may be enough to help build a picture of what really happened to the aircraft – whether it was brought down by a rocket fired from the ground or by gunfire from a military jet.

October 23, 2014 Posted by | Deception, False Flag Terrorism, Mainstream Media, Warmongering, Timeless or most popular, Video, War Crimes | | Leave a comment

US Court Rejects Argentina’s Appeal in Vulture Funds Case

teleSUR | October 23, 2014

The ongoing saga between Argentina and the vulture funds continues after a U.S. court rejected Argentina’s appeal to allow the country to pay its creditors.

A United States appeals court has dismissed the Argentine appeal of an order directing Bank of New York Mellon to hold on to US$539 million that Argentina deposited to pay its bondholders.

The appeals court said that it lacked jurisdiction over the appeal, as an earlier ruling by U.S. District Judge Thomas Griesa was a clarification rather than a modification of his earlier rulings on the matter.

In his original ruling, Judge Griesa held that Argentina’s deposit with Bank of New York Mellon to pay bondholders who had renegotiated their debt with Argentina was “illegal,” and ordered the bank to hold on to the funds.

No progress has been made in talks between the country and hedge-fund holdouts, led by Elliott Management and Aurelius Capital Management.

Griesa has also scheduled another hearing on December 2 to weigh arguments over whether Citigroup Inc (C.N) should be allowed to process an expected interest payment by Argentina on bonds issued under its local laws following its 2002 default.

The hearing comes less than a month before an interest payment by Argentina on the bonds is due on December 31.

The holdouts, commonly referred to as vulture funds, had previously rejected all of Argentina’s past restructuring offers on the country’s debt, most of which was incurred under Argentina’s military dictatorships and neoliberal governments. Ninety-two percent of creditors accepted the offers, and Argentina has been taking steps to continue paying them back in spite of Judge Griesa’s ruling.

October 23, 2014 Posted by | Economics | , , , , , | Leave a comment

Boeing reneges on Iran business pledge

Press TV – October 23, 2014

American aircraft-manufacturing giant Boeing has ended a 35-year break in business with Iran, supplying the country’s national flag carrier with a cargo of aircraft-related items.

But the sale did not include spare parts for Iranian aircraft as promised by Washington following last year’s nuclear deal between Iran and six world powers.

“During the third quarter of 2014, we sold aircraft manuals, drawings, and navigation charts and data to Iran Air,” Boeing said in its quarterly report on Wednesday.

This is the first time that the American company has sold safety items to Iran Air since the 1979 Islamic Revolution.

The business deal brought Boeing USD 120,000 in revenue, the report added.

The sales came after the US Treasury Department issued a license in April that allowed Boeing to provide “spare parts that are for safety purposes” to Iran for a “limited period of time.”

Boeing said the plane parts were purchased “consistent with guidance from the US government in connection with ongoing negotiations.”

Boeing, which is still banned from selling new aircraft to the Islamic Republic, said that it could sell more plane parts to Iran Air in the future.

“We may engage in additional sales pursuant to this license,” it added.

In February, two major US aerospace manufacturers, Boeing and General Electric, applied for export licenses in order to sell airliner parts to Iran following an interim nuclear agreement between Tehran and the P5+1 group of world powers in November 2013.

Under the deal dubbed the Geneva Joint Plan of Action, the six countries – the US, France, Britain, Russia, China and Germany – undertook to provide Iran with some sanctions relief in exchange for Tehran agreeing to limit certain aspects of its nuclear activities.

In the past decade, Iran has witnessed several major air accidents blamed on its aging aircraft, a consequence of US sanctions that prevent Iran from buying aircraft spare parts.

October 23, 2014 Posted by | Deception | , , , | Leave a comment

Secret Project Created Weaponized Ebola in South Africa in the 1980s

By Daniel Taylor | Old-Thinker News | October 20, 2014

“No records are available to confirm that the biological agents were destroyed.”

Operating out of South Africa during the Apartheid era in the early 1980s, Dr. Wouter Basson launched a secret bioweapons project called Project Coast. The goal of the project was to develop biological and chemical agents that would either kill or sterilize the black population and assassinate political enemies. Among the agents developed were the Marburg and Ebola viruses.

Basson is surrounded by cloak-and-dagger intrigue: he told the Pretoria High Court in South Africa that “The local CIA agent in Pretoria threatened me with death on the sidewalk of the American Embassy in Schoeman Street.” According to a 2001 article in The New Yorker magazine, the American Embassy in Pretoria was “terribly concerned” that Basson would reveal deep connections between Project Coast and the United States.

In 2013, Basson was found guilty of “unprofessional conduct” by the South African health council.

Bioweapons expert Jeanne Guillemin writes in her book Biological Weapons: From the Invention of State-Sponsored Programs to Contemporary Bioterrorism, “The project’s growth years were from 1982 to 1987, when it developed a range of biological agents (such as those for anthrax, cholera, and the Marburg and Ebola viruses and for botulinum toxin)…”

Basson’s bioweapons program officially ended in 1994, but there has been no independent verification that the pathogens created were ever destroyed. The order to destroy them went directly to Dr. Basson. According to the Wall Street Journal, “The integrity of the process rested solely on Dr. Basson’s honesty.”

Basson claims to have had contact with western agencies that provided “ideological assistance” to Project Coast. Basson stated in an interview shot for the documentary Anthrax War that he met several times with Dr. David Kelly, the infamous UN weapons inspector in Iraq. Kelly was a top bioweapons expert in the United Kingdom. He was found dead near his home in Oxfordshire in 2003. While the official story claims he committed suicide, medical experts highly doubt this story.

In a 2007 article from the Mail Online, it was reported that a week prior to his death, Dr. Kelly was to be interviewed by MI5 about his ties to Dr. Basson.

Dr. Timothy Stamps, Minister of Health of Zimbabwe, suspected that his country was under biological attack during the time that Basson was operating. Stamps told PBS Frontline in 1998 that “The evidence is very clear that these were not natural events. Whether they were caused by some direct or deliberate inoculation or not, is the question we have to answer.”

Stamps specifically named the Ebola and Marburg viruses as suspect. Stamps thinks that his country was being used as a testing ground for weaponized Ebola.

“I’m talking about anthrax and cholera in particular, but also a couple of viruses that are not endemic to Zimbabwe [such as] the Ebola type virus and, we think also, the Marburg virus. We wonder whether in fact these are not associated with biological warfare against this country during the hostilities… Ebola was along the line of the Zambezi [River], and I suspect that this may have been an experiment to see if a new virus could be used to directly infect people.”

The Ghanaian Times reported in early September on the recent Ebola outbreak, noting connections between Basson and bioweapons research. The article points out that, “… there are two types of scientists in the world: those who are so concerned about the pain and death caused to humans by illness that they will even sacrifice their own lives to try and cure deadly diseases, and those who will use their scientific skill to kill humans on the orders of… government…”

Indeed, these ideas are not new. Plato wrote over 2,000 years ago in his work The Republic that a ruling elite should guide society, “… whose aim will be to preserve the average of population.” He further stated, “There are many other things which they will have to consider, such as the effects of wars and diseases and any similar agencies, in order as far as this is possible to prevent the State from becoming either too large or too small.”

As revealed by The Age, Nobel Prize-winning Australian microbiologist Sir Macfarlane Burnet secretly urged the Australian government in 1947 to develop bioweapons for use against the “overpopulated countries of South-East Asia.” In a 1947 meeting with the New Weapons and Equipment Development Committee, the group recommended that “the possibilities of an attack on the food supplies of S-E Asia and Indonesia using B.W. agents should be considered by a small study group.”

This information gives us an interesting perspective on the recent unprecedented Ebola outbreak. Is it an organic natural phenomenon? Did this strain of Ebola accidentally escape from a bioweapons lab? Or, was it deliberately released?

October 23, 2014 Posted by | Ethnic Cleansing, Racism, Zionism, Timeless or most popular, Video | , , | 1 Comment
