Aletho News

ΑΛΗΘΩΣ

The Problem with Transparency International’s Corruption Perceptions Index


By Joseph Thomas – New Eastern Outlook – 13.02.2017

Transparency International puts out what it calls the “Corruption Perceptions Index.” It is an annual index it claims “has been widely credited with putting the issue of corruption on the international policy agenda.”

These carefully selected words, taken at face value, appear benign, even progressive. But upon digging deeper into this organisation’s background it becomes clear that these “perceptions” are politically motivated, and the “international policy agenda” clearly favours a very specific region of the globe, particularly that occupied by Washington, London and Brussels.

Transparency International claims on the “Who We Are” page of its website that (our emphasis):

From villages in rural India to the corridors of power in Brussels, Transparency International gives voice to the victims and witnesses of corruption. We work together with governments, businesses and citizens to stop the abuse of power, bribery and secret deals. As a global movement with one vision, we want a world free of corruption. Through chapters in more than 100 countries and an international secretariat in Berlin, we are leading the fight against corruption to turn this vision into reality.

Before moving on to the organisation’s funding and financials, one would assume that, above and beyond any other organisation in the world, Transparency International would carefully and diligently avoid any perception of a conflict of interest on its own part. Yet, not surprisingly, that isn’t the case.

An Anti-Corruption Org Swimming in Conflicts of Interest

On its page, “Who Supports Us,” Transparency International admits that it receives funding from government agencies including:

  • The United Kingdom’s Department for International Development (DFID);
  • The Federal Foreign Office, Germany; and
  • The US State Department.

Transparency International not only receives funding from the very governments it is tasked to investigate, hold accountable and “index” annually, a major conflict of interest in itself, but also receives money from the following:

  • The National Endowment for Democracy;
  • Open Society Institute Foundation; and
  • Shell Oil.

Other questionable sponsors dot Transparency International’s funding disclosure, but the inclusion of immense corporate interests like energy giant Shell is particularly troubling.

So is the inclusion of the National Endowment for Democracy, whose board of directors is filled with representatives of large corporations and financial institutions, as well as partisan political figures heavily involved not only in influencing politics in their own respective nations but in using the National Endowment for Democracy itself as a means to influence other nations. While these interests are transparently self-serving, the National Endowment for Democracy allows them to predicate their involvement in the political affairs and elections of foreign nations upon “democracy promotion.” This seems to be the very essence of corruption, “abuse of power” and “secret deals,” yet these interests fund Transparency International’s very existence.

Open Society in turn, is the sociopolitical fund employed by convicted financial criminal George Soros. The New York Times in its article, “French court upholds Soros conviction,” reported that:

The conviction of George Soros, the billionaire investor and former fund manager, on insider trading charges was upheld on Thursday by a French appeals court, which rejected his argument that his investment in a French bank in 1988 was not based on confidential information.

Soros, 74, now retired from money management but active as a philanthropist and author, was ordered to pay a fine of €2.2 million, or $2.9 million, representing the money made by funds he managed from an investment in Société Générale. He said the purchase had been part of a strategy to invest in a group of companies that had been privatized by the French government.

Were it not for the very serious impact that Transparency International’s false global reputation as a credible corruption watchdog has on nations targeted by its CPI reports, it would be almost comical that this so-called anti-corruption organisation is funded not only by the very governments it is supposed to be objectively detached from, but also by convicted criminals like Soros and by organisations like the National Endowment for Democracy, well known for using “democracy promotion” as cover in pursuit of their own self-serving interests.

Thus it is clear that, even at face value, Transparency International likewise serves as just such cover; instead of hiding behind “democracy promotion” to advance a very specific political agenda, it hides behind “fighting corruption.”

And even if the impropriety weren’t so blatant, Transparency International’s lack of better judgement regarding its funding and conflicts of interest discredits it as a legitimate corruption watchdog.

For nations around the world pressured by Transparency International and its CPI reports, dismissing those reports with this evidence in hand, and devising credible domestic anti-corruption watchdogs as alternatives, would be particularly useful.

Special interests using Transparency International to target and undermine nations and governments they seek to influence or coerce is not a pattern limited to this organisation; it is repeated over and over again. From the National Endowment for Democracy’s Freedom House and its “Freedom in the World” index to reports published by Human Rights Watch and Amnesty International, US-European special interests have honed the craft of using just causes as cover for corruption and coercion into a fine art.


Rothschild reveals crucial role his ancestors played in the Balfour Declaration and creation of Israel

 If Americans Knew | February 9, 2017

The Times of Israel reports that Lord Jacob Rothschild recently revealed new details about the crucial role his ancestors played in obtaining the Balfour Declaration, which “helped pave the way for the creation of Israel.”

The 80-year-old Rothschild is the current head of the banking family and a strong supporter of Israel.

The Balfour Declaration (text below) was an official 1917 letter from the British Foreign Secretary, Lord Balfour, addressed to Lord Rothschild, a Zionist leader in Britain at the time and the current Lord Rothschild’s uncle.

The Times of Israel reports that, during a television interview, Rothschild revealed for the first time the role of his cousin Dorothy de Rothschild.

Rothschild described Dorothy, who was in her teens at the time, as “devoted to Israel,” and said that what she did “was crucially important.”

Rothschild said that Dorothy connected Zionist leader Chaim Weizmann to the British establishment. Dorothy “told Weizmann how to integrate, how to insert himself into British establishment life, which he learned very quickly.”

Rothschild said that the way the declaration was procured was extraordinary. “It was the most incredible piece of opportunism.”

“[Weizmann] gets to Balfour,” Rothschild described, “and unbelievably, he persuades Lord Balfour, and Lloyd George, the prime minister, and most of the ministers, that this idea of a national home for Jews should be allowed to take place. I mean it’s so, so unlikely.”

 

The interview was conducted by former Israeli ambassador Daniel Taub as part of the Balfour 100 project. Taub interviewed Rothschild at Waddesdon Manor in Buckinghamshire, a manor bequeathed to the nation by the Rothschild family in 1957, where the Declaration is kept.

According to Ambassador Taub, the declaration “changed the course of history for the Middle East.”

The Times reports that Rothschild said his family at the time was divided on the idea of Israel, noting that some members “didn’t think it was a good thing that this national home be established there”.

Dorothy’s letters are also stored at Waddesdon. They describe her later dealings with diverse Zionist leaders and her advice on the organization of the Zionist Conference, according to the Times.

Rothschild said that the Declaration went through five drafts before finally being issued on November 2, 1917.

Alison Weir reports in her book, Against Our Better Judgment: The Hidden History of How the U.S. Was Used to Create Israel, that drafts of the declaration went back and forth to Zionists in the United States before the document was finalized. The main writer was secret Zionist Leopold Amery.

Balfour Declaration Text:

Foreign Office
November 2nd, 1917

Dear Lord Rothschild,

I have much pleasure in conveying to you, on behalf of His Majesty’s Government, the following declaration of sympathy with Jewish Zionist aspirations which has been submitted to, and approved by, the Cabinet.

His Majesty’s Government view with favour the establishment in Palestine of a national home for the Jewish people, and will use their best endeavours to facilitate the achievement of this object, it being clearly understood that nothing shall be done which may prejudice the civil and religious rights of existing non-Jewish communities in Palestine or the rights and political status enjoyed by Jews in any other country.

I should be grateful if you would bring this declaration to the knowledge of the Zionist Federation.

Yours,

Arthur James Balfour


Police video editing and mass media lies

MassPrivateI | February 9, 2017

Last year, the Feds were accused of editing video evidence to protect the Bureau of Land Management. They were also accused of editing press briefings about Iran’s nuclear technology.

Feds admit they edited videos

In 2016, the Obama Administration was forced to admit that video of press briefings about the government’s secret discussions with Iran had been deliberately edited.

Police caught editing videos

Police in New Mexico, Colorado, Chicago and North Carolina have been caught deleting and editing videos. A 2015 article in the Huffington Post warned everyone about the dangers of police releasing copies of dashcam/bodycam footage.

“If courts and news outlets can’t access the original recording and digital record, there’s no way to check that what you’re seeing is unaltered video.”

The Huffington Post warns that none of the so-called high-tech security protocols can prevent law enforcement from editing video footage.

“There are no national regulations that force departments to release the raw footage — or any trail of data — to the public or press. Neither body cam nor dashcam footage is accessible by the Freedom of Information Act, so the policies are left up to individual police departments.”
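
If raw footage and a digital record were available, the check itself would be mechanically simple: publish a cryptographic digest of the recording at capture time, and anyone can later verify that a released copy is bit-for-bit identical. The following minimal Python sketch illustrates the principle only; the filenames are hypothetical, and this is not any department’s actual procedure.

    import hashlib

    def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
        """Stream a file through SHA-256 and return its hex digest."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                h.update(chunk)
        return h.hexdigest()

    # At capture time: record the digest in a tamper-evident log (hypothetical name).
    # original_digest = sha256_of("dashcam_2017-02-09_unit42.mp4")

    # On release: the copy verifies only if nothing has been altered or deleted.
    # assert sha256_of("released_copy.mp4") == original_digest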

This is American policing in a nutshell: police and prosecutor immunity, secret Stingray cellphone surveillance agreements and so on.

Police secrecy is more important than our Bill of Rights.

Future of policing: Video manipulation

‘Face2Face’ video manipulation technology will make you doubt everything you see on TV and every video you watch. Users of the technology can make anyone appear to say pretty much anything they want.

Imagine a future where police and the Feds use this technology to make anyone appear guilty.

If a technology can be abused, it will be abused; soon we won’t be able to trust anything we see or hear.


Good News and Bad News at Hanford, America’s Most Polluted Site

By Joshua Frank | The Investigative Fund | February 7, 2017

It’s a new year and new administration, but the strong radioactive stench is the same out at Hanford in eastern Washington, home of the world’s costliest environmental cleanup. In January, a dozen workers reported smelling a toxic odor outside the site’s tank farms, where nuclear waste is stored underground. From April to December 2016, 70 people were exposed to chemical vapors emanating from the facility — and 2017 is off to a similar start.

Toxic odors at an old nuclear depot? This would be startling news anywhere else. But this is Hanford after all, where taxpayer money freely flows to contractors despite the snail-paced half-life of their work. Twenty years and $19 billion later, Hanford is still a nightmare — likely the most toxic site in the Western Hemisphere. Not one ounce of nuclear waste has ever been treated, and there are no indications Hanford will be nuke free anytime soon. To date, at least 1 million gallons of radioactive waste has leaked and is making its way to the Columbia River. It’s an environmental disaster of epic proportions — a disaster created by our government’s atomic obsession during the Cold War era.

No doubt, Hanford is a wreck in search of a remedy, yet the costs covered by American taxpayers appear to be growing exponentially. At the tail end of 2016, the estimated cost of turning the radioactive gunk into glass rods bumped up by a cool $4.5 billion (adding to the ultimate price tag for the remaining Hanford cleanup, which had already reached a whopping $107.7 billion). These sorts of increases are so common they hardly make news anymore.

Donald Trump’s pick for Energy Secretary, Rick Perry, who infamously stated he’d like to do away with the DoE altogether [without a DoE there would be no nuclear weapons programs or US agency promoting nuclear energy], now admits that Hanford’s one of the most dangerous facilities in the nation. But his commitment to cleaning up the fiscal and nuclear boondoggle remains to be seen. The plant that is to turn the waste into glass rods is set to open in 2023, but it’s a safe bet that won’t happen; it’s already two decades behind schedule.

Meanwhile, workers on the front lines of the cleanup are often put in situations that are poorly monitored and exceedingly unsafe. Over the past three years KING 5 News in Seattle has tracked dozens of employees who were exposed to chemical vapors at Hanford and found their illnesses to include “toxic encephalopathy (dementia), reactive airway disease, COPD, and painful nerve damage.”

“The people running Hanford need to have a moral compass that directs them in the right way, as human beings, to do the right thing to protect these people,” retired Hanford employee Mike Geffre, who worked at Hanford for 26 years, told KING 5. “They’re trying to save money and save face. They’re standing behind their old position that there’s no problem. That’s absurd. They need to accept the fact that they made mistakes and get over it.”

Fortunately, there is a bit of good news in this heap of radioactivity. Last November, a settlement was reached between the US Department of Justice, Bechtel Corp. and AECOM (formerly URS) for a whopping $125 million. The civil lawsuit alleged taxpayer funds were mismanaged and that both companies performed shoddy work. The lawsuit also claimed that government funds were illegally used to lobby members of Congress. Brought on by whistleblowers Gary Brunson, Donna Busche, and Walter Tamosaitis (Busche and Tamosaitis’s sagas were highlighted in two Investigative Fund reports I authored for Seattle Weekly in 2011 and 2012), the settlement was one of the largest in DoE history.

No doubt it was a substantial victory for whistleblowers and government accountability, despite the fact that the defendants did not admit guilt. Now, Washington State legislators are pushing HB 1723, a bill that would protect and treat Hanford workers for certain health problems that are a result of the work they’ve done at the facility, such as respiratory problems, heart issues, certain cancers like bone, breast, lung and thyroid, as well as neurological issues.

“Currently, many Hanford workers are not receiving necessary medical care because they are put in the impossible situation of being unable to specify the chemicals to which they have been exposed, and in what concentrations, making it difficult for their doctors to connect their disease with their exposures,” Randy Walli, Business Manager for the pipefitters union, Local 598, told KING 5.

Compensating whistleblowers and employees whose health is impacted by their work is a step in the right direction. But Hanford’s contractors and the DoE that oversees them still have much to do to make the increasingly expensive nuclear cleanup at Hanford safe, effective and transparent.


Saudi Aramco picks Israel-linked banker

Press TV – February 8, 2017

Saudi Arabian Oil Co. (Aramco) has chosen the New York-based boutique investment bank Moelis & Co to advise on its initial public offering, reports say.

The sale of the world’s biggest oil company is the latest of several moves by the Saudi government to generate revenues in the face of a gaping budget deficit.

Aramco had invited banks in January to pitch for an advisory position on what is expected to be the world’s biggest initial public offering.

JPMorgan, which has been Aramco’s commercial banker for years, and Michael Klein, a former star Citigroup banker, had been advising Saudi authorities on the IPO.

However, the kingdom’s decision to pick a small bank has surprised many observers. International business outlets such as Bloomberg and the Financial Times said the choice represents a coup for Moelis, which was founded only in 2007.

The IPO, which is predicted to raise about $100 billion, is set to yield millions of dollars in fees and push Moelis up in global investment bank rankings.

Last year, Moelis hired Shlomo Yanai, a retired Israeli military officer, to join the firm as a senior adviser. Yanai had earlier been offered the directorship of the Israeli spy agency Mossad by Prime Minister Benjamin Netanyahu but he turned it down.

The oil giant, which holds an estimated $2 trillion in assets, is expected to hold its initial public offering in 2018 with an initial sale of a five-percent share.

According to Bloomberg, Aramco expects Moelis to help it select underwriters for the sale, make decisions on potential listing venues and ensure the IPO goes smoothly.

Saudi Arabia is currently dealing with a budget deficit of nearly $100 billion caused by a sharp slump in oil prices as well as Riyadh’s rising military expenditure. The kingdom emerged as the world’s third largest military spender in 2015 when it began its military campaign against Yemen.

The Saudis have also been forced to introduce a series of austerity measures that include cancelling some bonuses offered to state employees and increasing entry visa fees for residents and foreigners.

The ruling Saudi family will transfer the revenue from the sale of Aramco to the country’s public investment fund (PIF), which will then be tapped to purchase strategic financial and industrial assets abroad.


Exposed: How world leaders were duped into investing billions over manipulated global warming data

By David Rose | The Mail on Sunday | February 4, 2017

The Mail on Sunday today reveals astonishing evidence that the organisation that is the world’s leading source of climate data rushed to publish a landmark paper that exaggerated global warming and was timed to influence the historic Paris Agreement on climate change.

A high-level whistleblower has told this newspaper that America’s National Oceanic and Atmospheric Administration (NOAA) breached its own rules on scientific integrity when it published the sensational but flawed report, aimed at making the maximum possible impact on world leaders including Barack Obama and David Cameron at the UN climate conference in Paris in 2015.

The report claimed that the ‘pause’ or ‘slowdown’ in global warming in the period since 1998 – revealed by UN scientists in 2013 – never existed, and that world temperatures had been rising faster than scientists expected. Launched by NOAA with a public relations fanfare, it was splashed across the world’s media, and cited repeatedly by politicians and policy makers.

But the whistleblower, Dr John Bates, a top NOAA scientist with an impeccable reputation, has shown The Mail on Sunday irrefutable evidence that the paper was based on misleading, ‘unverified’ data.

It was never subjected to NOAA’s rigorous internal evaluation process – which Dr Bates devised.

His vehement objections to the publication of the faulty data were overridden by his NOAA superiors in what he describes as a ‘blatant attempt to intensify the impact’ of what became known as the Pausebuster paper.

His disclosures are likely to stiffen President Trump’s determination to enact his pledges to reverse his predecessor’s ‘green’ policies, and to withdraw from the Paris deal – so triggering an intense political row.

In an exclusive interview, Dr Bates accused the lead author of the paper, Thomas Karl, who was until last year director of the NOAA section that produces climate data – the National Centers for Environmental Information (NCEI) – of ‘insisting on decisions and scientific choices that maximised warming and minimised documentation… in an effort to discredit the notion of a global warming pause, rushed so that he could time publication to influence national and international deliberations on climate policy’.

Dr Bates was one of two Principal Scientists at NCEI, based in Asheville, North Carolina.

Official delegations from America, Britain and the EU were strongly influenced by the flawed NOAA study as they hammered out the Paris Agreement – and committed advanced nations to sweeping reductions in their use of fossil fuel and to spending £80 billion every year on new, climate-related aid projects.

The scandal has disturbing echoes of the ‘Climategate’ affair which broke shortly before the UN climate summit in 2009, when the leak of thousands of emails between climate scientists suggested they had manipulated and hidden data. Some were British experts at the influential Climatic Research Unit at the University of East Anglia.

NOAA’s 2015 ‘Pausebuster’ paper was based on two new temperature datasets – one containing measurements of temperatures at the planet’s surface on land, the other at the surface of the seas.

Both datasets were flawed. This newspaper has learnt that NOAA has now decided that the sea dataset will have to be replaced and substantially revised just 18 months after it was issued, because it used unreliable methods which overstated the speed of warming. The revised data will show both lower temperatures and a slower rate in the recent warming trend.

The land temperature dataset used by the study was afflicted by devastating bugs in its software that rendered its findings ‘unstable’.

The paper relied on a preliminary, ‘alpha’ version of the data which was never approved or verified.

A final, approved version has still not been issued. None of the data on which the paper was based was properly ‘archived’ – a mandatory requirement meant to ensure that raw data and the software used to process it is accessible to other scientists, so they can verify NOAA results.

Dr Bates retired from NOAA at the end of last year after a 40-year career in meteorology and climate science. As recently as 2014, the Obama administration awarded him a special gold medal for his work in setting new, supposedly binding standards ‘to produce and preserve climate data records’.

Yet when it came to the paper timed to influence the Paris conference, Dr Bates said, these standards were flagrantly ignored.

The paper was published in June 2015 by the journal Science. Entitled ‘Possible artifacts of data biases in the recent global surface warming hiatus’, the document said the widely reported ‘pause’ or ‘slowdown’ was a myth.

Less than two years earlier, a blockbuster report from the UN Intergovernmental Panel on Climate Change (IPCC), which drew on the work of hundreds of scientists around the world, had found ‘a much smaller increasing trend over the past 15 years 1998-2012 than over the past 30 to 60 years’. Explaining the pause became a key issue for climate science. It was seized on by global warming sceptics, because the level of CO2 in the atmosphere had continued to rise.

Some scientists argued that the existence of the pause meant the world’s climate is less sensitive to greenhouse gases than previously thought, so that future warming would be slower. One of them, Professor Judith Curry, then head of climate science at the Georgia Institute of Technology, said it suggested that computer models used to project future warming were ‘running too hot’.

However, the Pausebuster paper said that, while the rate of global warming from 1950 to 1999 was 0.113C per decade, the rate from 2000 to 2014 was actually higher, at 0.116C per decade. The IPCC’s claim about the pause, it concluded, ‘was no longer valid’.

The impact was huge and lasting. On publication day, the BBC said the pause in global warming was ‘an illusion caused by inaccurate data’.

One American magazine described the paper as a ‘science bomb’ dropped on sceptics.

Its impact could be seen in this newspaper last month when, writing to launch his Ladybird book about climate change, Prince Charles stated baldly: ‘There isn’t a pause… it is hard to reject the facts on the basis of the evidence.’

Data changed to make the sea appear warmer

The sea dataset used by Thomas Karl and his colleagues – known as Extended Reconstructed Sea Surface Temperatures version 4, or ERSSTv4 – tripled the warming trend over the sea during the years 2000 to 2014, from just 0.036C per decade as stated in version 3 to 0.099C per decade. Individual measurements in some parts of the globe had increased by about 0.1C, and this resulted in the dramatic increase of the overall global trend published by the Pausebuster paper.

But Dr Bates said this increase in temperatures was achieved by dubious means. Its key error was an upwards ‘adjustment’ of readings from fixed and floating buoys, which are generally reliable, to bring them into line with readings from a much more doubtful source – water taken in by ships. This, Dr Bates explained, has long been known to be questionable: ships are themselves sources of heat, readings will vary from ship to ship, and the depth of water intake will vary according to how heavily a ship is laden – so affecting temperature readings.
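
To see why aligning buoy readings with ship readings raises a computed trend at all, consider a toy model: if the share of cooler-reading buoys in the observation mix grows over time, a naive blend of raw readings acquires a spurious cooling that an upward buoy adjustment removes, at the cost of anchoring the record to the warm-biased ship data. The Python sketch below uses invented numbers purely to illustrate this mechanism; it is not NOAA’s procedure.

    import numpy as np

    rng = np.random.default_rng(42)
    years = np.arange(2000, 2015)

    # Invented "true" anomalies carrying a 0.04C/decade trend
    true_anom = 0.004 * (years - 2000) + rng.normal(0, 0.01, years.size)

    buoy_frac = np.linspace(0.1, 0.8, years.size)  # buoy share of observations grows
    SHIP_BIAS = 0.12                               # ships read ~0.12C warmer than buoys

    # Naive blend of raw readings: as buoys take over, artificial cooling creeps in.
    raw_blend = buoy_frac * true_anom + (1 - buoy_frac) * (true_anom + SHIP_BIAS)

    # ERSSTv4-style choice: raise buoys to the ship baseline before blending.
    adj_blend = buoy_frac * (true_anom + SHIP_BIAS) \
        + (1 - buoy_frac) * (true_anom + SHIP_BIAS)

    def trend_per_decade(t, x):
        """Least-squares slope, converted from per-year to per-decade."""
        return 10 * np.polyfit(t, x, 1)[0]

    print(f"raw blend:      {trend_per_decade(years, raw_blend):+.3f} C/decade")
    print(f"buoys adjusted: {trend_per_decade(years, adj_blend):+.3f} C/decade")

In this toy setup the naive blend shows spurious cooling while the adjusted series recovers the underlying trend; the direction of the adjustment, buoys toward ships rather than ships toward buoys, is precisely the point Dr Bates contests.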

Dr Bates said: ‘They had good data from buoys. And they threw it out and “corrected” it by using the bad data from ships. You never change good data to agree with bad, but that’s what they did – so as to make it look as if the sea was warmer.’

ERSSTv4 ‘adjusted’ buoy readings up by 0.12C. It also ignored data from satellites that measure the temperature of the lower atmosphere, which are also considered reliable. Dr Bates said he gave the paper’s co-authors ‘a hard time’ about this, ‘and they never really justified what they were doing.’

Now, some of those same authors have produced the pending, revised new version of the sea dataset – ERSSTv5. A draft of a document that explains the methods used to generate version 5, and which has been seen by this newspaper, indicates the new version will reverse the flaws in version 4, changing the buoy adjustments and including some satellite data and measurements from a special high-tech floating buoy network known as Argo. As a result, it is certain to show reductions in both absolute temperatures and recent global warming.

The second dataset used by the Pausebuster paper was a new version of NOAA’s land records, known as the Global Historical Climatology Network (GHCN), an analysis over time of temperature readings from about 4,000 weather stations spread across the globe.

This new version found past temperatures had been cooler than previously thought, and recent ones higher – so that the warming trend looked steeper. For the period 2000 to 2014, the paper increased the rate of warming on land from 0.15C to 0.164C per decade.

In the weeks after the Pausebuster paper was published, Dr Bates conducted a one-man investigation into this. His findings were extraordinary. Not only had Mr Karl and his colleagues failed to follow any of the formal procedures required to approve and archive their data, they had used a ‘highly experimental early run’ of a programme that tried to combine two previously separate sets of records.

This had undergone the critical process known as ‘pairwise homogeneity adjustment’, a method of spotting ‘rogue’ readings from individual weather stations by comparing them with others nearby.
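
As a rough intuition for what such a neighbour comparison does, a station’s series can be checked against the median of nearby stations, flagging months where the difference is anomalously large. The Python sketch below is a deliberately crude stand-in; NOAA’s actual pairwise algorithm is far more elaborate.

    import numpy as np

    def flag_rogue_months(station, neighbours, threshold=3.0):
        """Flag readings deviating from the neighbourhood median by more than
        `threshold` robust standard deviations."""
        diff = station - np.median(neighbours, axis=0)
        centred = diff - np.median(diff)
        mad = np.median(np.abs(centred))   # median absolute deviation
        robust_sd = 1.4826 * mad           # MAD-to-sigma factor for normal data
        return np.abs(centred) > threshold * robust_sd

    rng = np.random.default_rng(1)
    neighbours = rng.normal(15.0, 0.5, size=(5, 120))  # 5 nearby stations, 120 months
    station = rng.normal(15.0, 0.5, size=120)
    station[60] += 4.0                                 # inject one rogue reading
    print(np.where(flag_rogue_months(station, neighbours))[0])  # flags month 60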

However, this process requires extensive, careful checking which was only just beginning, so that the data was not ready for operational use. Now, more than two years after the Pausebuster paper was submitted to Science, the new version of GHCN is still undergoing testing.

Moreover, the GHCN software was afflicted by serious bugs. They caused it to become so ‘unstable’ that every time the raw temperature readings were run through the computer, it gave different results. The new, bug-free version of GHCN has still not been approved and issued. It is, Dr Bates said, ‘significantly different’ from that used by Mr Karl and his co-authors.

Dr Bates revealed that the failure to archive and make available fully documented data not only violated NOAA rules, but also those set down by Science. Before he retired last year, he continued to raise the issue internally. Then came the final bombshell. Dr Bates said: ‘I learned that the computer used to process the software had suffered a complete failure.’

The reason for the failure is unknown, but it means the Pausebuster paper can never be replicated or verified by other scientists.

The flawed conclusions of the Pausebuster paper were widely discussed by delegates at the Paris climate change conference. Mr Karl had a longstanding relationship with President Obama’s chief science adviser, John Holdren, giving him a hotline to the White House.

Mr Holdren was also a strong advocate of robust measures to curb emissions. Britain’s then Prime Minister David Cameron claimed at the conference that ‘97 per cent of scientists say climate change is urgent and man-made and must be addressed’ and called for ‘a binding legal mechanism’ to ensure the world got no more than 2C warmer than in pre-industrial times.

President Obama stressed at the conference his Clean Power Plan, which mandates big emissions cuts by American power stations.

President Trump has since pledged to scrap it, and to withdraw from the Paris Agreement.

Whatever takes its place, said Dr Bates, ‘there needs to be a fundamental change to the way NOAA deals with data so that people can check and validate scientific results. I’m hoping that this will be a wake-up call to the climate science community – a signal that we have to put in place processes to make sure this kind of crap doesn’t happen again.

‘I want to address the systemic problems. I don’t care whether modifications to the datasets make temperatures go up or down. But I want the observations to speak for themselves, and for that, there needs to be a new emphasis that ethical standards must be maintained.’

He said he decided to speak out after seeing reports in papers including the Washington Post and Forbes magazine claiming that scientists feared the Trump administration would fail to maintain and preserve NOAA’s climate records.

Dr Bates said: ‘How ironic it is that there is now this idea that Trump is going to trash climate data, when key decisions were earlier taken by someone whose responsibility it was to maintain its integrity – and failed.’

NOAA not only failed, but it effectively mounted a cover-up when challenged over its data. After the paper was published, the US House of Representatives Science Committee launched an inquiry into its Pausebuster claims. NOAA refused to comply with subpoenas demanding internal emails from the committee chairman, the Texas Republican Lamar Smith, and falsely claimed that no one had raised concerns about the paper internally.

Last night Mr Smith thanked Dr Bates ‘for courageously stepping forward to tell the truth about NOAA’s senior officials playing fast and loose with the data in order to meet a politically predetermined conclusion’. He added: ‘The Karl study used flawed data, was rushed to publication in an effort to support the President’s climate change agenda, and ignored NOAA’s own standards for scientific study.’

Professor Curry, now the president of the Climate Forecast Applications Network, said last night: ‘Large adjustments to the raw data, and substantial changes in successive dataset versions, imply substantial uncertainties.’

It was time, she said, that politicians and policymakers took these uncertainties on board.

Last night Mr Karl admitted the data had not been archived when the paper was published. Asked why he had not waited, he said: ‘John Bates is talking about a formal process that takes a long time.’ He denied he was rushing to get the paper out in time for Paris, saying: ‘There was no discussion about Paris.’

They played fast and loose with the figures

He also admitted that the final, approved and ‘operational’ edition of the GHCN land data would be ‘different’ from that used in the paper.

As for the ERSSTv4 sea dataset, he claimed it was other records – such as the UK Met Office’s – which were wrong, because they understated global warming and were ‘biased too low’. Jeremy Berg, Science’s editor-in-chief, said: ‘Dr Bates raises some serious concerns. After the results of any appropriate investigations… we will consider our options.’ He said that ‘could include retracting that paper’. NOAA declined to comment.

It’s not the first time we’ve exposed dodgy climate data, which is why we’ve dubbed it: Climate Gate 2

Dr John Bates’s disclosures about the manipulation of data behind the ‘Pausebuster’ paper are the biggest scientific scandal since ‘Climategate’ in 2009 when, as this paper reported, thousands of leaked emails revealed scientists were trying to block access to data, and using a ‘trick’ to conceal embarrassing flaws in their claims about global warming.

Both scandals suggest a lack of transparency and, according to Dr Bates, a failure to observe proper ethical standards.

Because of NOAA’s failure to ‘archive’ data used in the paper, its results can never be verified.

Like Climategate, this scandal is likely to reverberate around the world, and reignite some of science’s most hotly contested debates.

Has there been an unexpected pause in global warming? If so, is the world less sensitive to carbon dioxide than climate computer models suggest?

And does this mean that truly dangerous global warming is less imminent, and that politicians’ repeated calls for immediate ‘urgent action’ to curb emissions are exaggerated?


Judith Curry has also blogged on the same story.


Climate scientists versus climate data

By John Bates | Climate Etc. | February 4, 2017

A look behind the curtain at NOAA’s climate data center.

I read with great irony recently that scientists are “frantically copying U.S. Climate data, fearing it might vanish under Trump” (e.g., Washington Post 13 December 2016). As a climate scientist formerly responsible for NOAA’s climate archive, the most critical issue in archival of climate data is actually scientists who are unwilling to formally archive and document their data. I spent the last decade cajoling climate scientists to archive their data and fully document the datasets. I established a climate data records program that was awarded a U.S. Department of Commerce Gold Medal in 2014 for visionary work in the acquisition, production, and preservation of climate data records (CDRs), which accurately describe the Earth’s changing environment.

The most serious example of a climate scientist not archiving or documenting a critical climate dataset was the study of Tom Karl et al. 2015 (hereafter referred to as the Karl study or K15), purporting to show no ‘hiatus’ in global warming in the 2000s (Federal scientists say there never was any global warming “pause”). The study drew criticism from other climate scientists, who disagreed with K15’s conclusion about the ‘hiatus.’ (Making sense of the early-2000s warming slowdown). The paper also drew the attention of the Chairman of the House Science Committee, Representative Lamar Smith, who questioned the timing of the report, which was issued just prior to the Obama Administration’s Clean Power Plan submission to the Paris Climate Conference in 2015.

In the following sections, I provide the details of how Mr. Karl failed to disclose critical information to NOAA, Science Magazine, and Chairman Smith regarding the datasets used in K15. I have extensive documentation that provides independent verification of the story below. I also provide my suggestions for how we might keep such a flagrant manipulation of scientific integrity guidelines and scientific publication standards from happening in the future. Finally, I provide some links to examples of what well documented CDRs look like that readers might contrast and compare with what Mr. Karl has provided.

Background

In 2013, prior to the Karl study, the National Climatic Data Center [NCDC, now the NOAA National Centers for Environmental Information (NCEI)] had just adopted much improved processes for formal review of Climate Data Records, a process I formulated [link]. The land temperature dataset used in the Karl study had never been processed through the station adjustment software before, which led me to believe something was amiss. When I pressed the co-authors, they said they had decided not to archive the dataset, but did not defend the decision. One of the co-authors said there were ‘some decisions [he was] not happy with’. The data used in the K15 paper were only made available through a web site, not in digital form, and lacking proper versioning and any notice that they were research and not operational data. I was dumbstruck that Tom Karl, the NCEI Director in charge of NOAA’s climate data archive, would not follow the policy of his own Agency nor the guidelines in Science magazine for dataset archival and documentation.

I questioned another co-author about why they chose to use a 90% confidence threshold for evaluating the statistical significance of surface temperature trends, instead of the standard for significance of 95% — he also expressed reluctance and did not defend the decision. A NOAA NCEI supervisor remarked how it was eye-opening to watch Karl work the co-authors, mostly subtly but sometimes not, pushing choices to emphasize warming. Gradually, in the months after K15 came out, the evidence kept mounting that Tom Karl constantly had his ‘thumb on the scale’—in the documentation, scientific choices, and release of datasets—in an effort to discredit the notion of a global warming hiatus and rush to time the publication of the paper to influence national and international deliberations on climate policy.
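
The practical effect of that 90% versus 95% choice is easy to demonstrate: a trend whose t-statistic falls between the two critical values counts as significant under the looser threshold but not under the standard one. The Python sketch below uses an invented t-statistic purely for illustration.

    from scipy import stats

    t_stat, dof = 1.80, 60  # hypothetical trend t-statistic and degrees of freedom
    for conf in (0.90, 0.95):
        crit = stats.t.ppf(1 - (1 - conf) / 2, dof)  # two-sided critical value
        print(f"{conf:.0%}: critical t = {crit:.2f}, "
              f"significant = {abs(t_stat) > crit}")
    # 90%: critical t = 1.67, significant = True
    # 95%: critical t = 2.00, significant = False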

Defining an Operational Climate Data Record

For nearly two decades, I’ve advocated that if climate datasets are to be used in important policy decisions, they must be fully documented, subject to software engineering management and improvement processes, and be discoverable and accessible to the public with rigorous information preservation standards. I was able to implement such policies, with the help of many colleagues, through the NOAA Climate Data Record policies (CDR) [link].

Once the CDR program was funded, beginning in 2007, I was able to put together a team and pursue my goals of operational processing of important climate data records emphasizing the processes required to transition research datasets into operations (known as R2O). Figure 1 summarizes the steps required to accomplish this transition in the key elements of software code, documentation, and data.

Figure 1. Research to operations transition process methodology from Bates et al. 2016.

Unfortunately, the NCDC/NCEI surface temperature processing group was split on whether to adopt this process, with scientist Dr. Thomas C. Peterson (a co-author on K15, now retired from NOAA) vigorously opposing it. Tom Karl never required the surface temperature group to use the rigor of the CDR methodology, although a document was prepared identifying what parts of the surface temperature processing had to be improved to qualify as an operational CDR.

Tom Karl liked the maturity matrix so much that he modified the matrix categories so he could claim a number of NCEI products were “Examples of ‘Gold’ standard NCEI Products (Data Set Maturity Matrix Model Level 6).” See his NCEI overview presentation [ncei-overview-2015nov-2], which all NCEI employees were told to use, even though there had never been any maturity assessment of any of the products.

NCDC/NCEI surface temperature processing and archival

In the fall of 2012, the monthly temperature products issued by NCDC were incorrect for 3 months in a row [link]. As a result, the press releases and datasets had to be withdrawn and reissued. Dr. Mary Kicza, then the NESDIS Associate Administrator (the parent organization of NCDC/NCEI in NOAA), noted that these repeated errors reflected poorly on NOAA and required NCDC/NCEI to improve its software management processes so that such mistakes would be minimized in the future. Over the next several years, NCDC/NCEI had an incident report conducted to trace these errors and recommend corrective actions.

Following those and other recommendations, NCDC/NCEI began to implement new software management and process management procedures, adopting some of the elements of the CDR R2O process. In 2014 a NCDC/NCEI Science Council was formed to review new science activities and to review and approve new science products for operational release. A draft operational readiness review (ORR) was prepared and used for approval of all operational product releases; it was finalized and formally adopted in January 2015. Along with this process, a contractor who had worked at the CMMI Institute (CMMI, Capability Maturity Model Integration, is a software engineering process improvement training and appraisal program) was hired to improve software processes, with a focus on improvement and code rejuvenation of the surface temperature processing code, in particular the GHCN-M dataset.

The first NCDC/NCEI surface temperature software to be put through this rejuvenation was the pairwise homogeneity adjustment portion of processing for the GHCN-Mv4 beta release of October 2015. The incident report had found that there were unidentified coding errors in the GHCN-M processing that caused unpredictable results and different results every time code was run.

The generic flow of data used in processing of the NCDC/NCEI global temperature product suite is shown schematically in Figure 2. There are three steps to the processing, and two of the three steps are done separately for the ocean versus land data. Step 1 is the compilation of observations either from ocean sources or land stations. Step 2 involves applying various adjustments to the data, including bias adjustments, and provides as output the adjusted and unadjusted data on a standard grid. Step 3 involves application of a spatial analysis technique (empirical orthogonal teleconnections, EOTs) to merge and smooth the ocean and land surface temperature fields and provide these merged fields as anomaly fields for ocean, land and global temperatures. This is the product used in K15. Rigorous ORR for each of these steps in the global temperature processing began at NCDC in early 2014.

Figure 2. Generic data flow for NCDC/NCEI surface temperature products.
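
Read as software, Figure 2 describes a three-stage pipeline whose first two stages run separately for land and ocean. The skeleton below makes that structure explicit; every function body is a trivial placeholder standing in for a substantial algorithm, and none of this is NOAA’s code.

    import numpy as np

    def compile_obs(rng, n_months=24, grid=(36, 72)):
        """Step 1: compile observations (placeholder: random monthly anomaly grids)."""
        return rng.normal(0.0, 0.5, size=(n_months, *grid))

    def adjust(obs, bias=0.0):
        """Step 2: apply bias adjustments; output adjusted and unadjusted grids."""
        return {"unadjusted": obs, "adjusted": obs - bias}

    def merge(land, ocean, land_mask):
        """Step 3: merge land and ocean fields (real pipeline: EOT analysis;
        placeholder: select by surface type)."""
        return np.where(land_mask, land, ocean)

    rng = np.random.default_rng(0)
    land = adjust(compile_obs(rng), bias=0.02)["adjusted"]    # land stations
    ocean = adjust(compile_obs(rng), bias=-0.12)["adjusted"]  # ocean sources
    land_mask = rng.random((36, 72)) < 0.3                    # invented 30% land fraction
    global_anom = merge(land, ocean, land_mask)
    print(global_anom.shape)  # (24, 36, 72): months x lat x lon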

In K15, the authors describe that the land surface air temperature dataset included the GHCN-M station data and also the new ISTI (Integrated Surface Temperature Initiative) data that was run through the then operational GHCN-M bias correction and gridding program (i.e., Step 2 of land air temperature processing in Figure 2). They further indicated that this processing and subsequent corrections were ‘essentially the same as those used in GHCN-Monthly version 3’. This may have been the case; however, doing so failed to follow the process that had been initiated to ensure the quality and integrity of datasets at NCDC/NCEI.

The GHCN-M V4 beta was put through an ORR in October 2015; the presentation made it clear that any GHCN-M version using the ISTI dataset should, and would, be called version 4. This is confirmed by parsing the file name actually used on the FTP site for the K15 dataset [link] (NOTE: placing a non-machine-readable copy of a dataset on an FTP site does not constitute archiving a dataset). One file is named ‘box.12.adj.4.a.1.20150119’, where ‘adj’ indicates adjusted (passed through step 2 of the land processing) and ‘4.a.1’ means version 4 alpha run 1; the entire name indicates GHCN-M version 4a run 1. That is, the folks who did the processing for K15 and saved the file actually used the correct naming and versioning, but K15 did not disclose this. Clearly labeling the dataset would have indicated this was a highly experimental early GHCN-M version 4 run rather than a routine, operational update. As such, according to NOAA scientific integrity guidelines, it would have required a disclaimer not to use the dataset for routine monitoring.
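
Readers who want to decode such names themselves can do it mechanically. The Python sketch below implements my reading of the convention, inferred entirely from the single example above; the ‘raw’ alternative is a guessed counterpart to ‘adj’.

    import re

    def parse_ghcn_name(filename: str) -> dict:
        """Decode 'box.<grid box>.<adj|raw>.<major>.<stage>.<run>.<yyyymmdd>'."""
        m = re.match(
            r"box\.(?P<box>\d+)\.(?P<state>adj|raw)\."
            r"(?P<major>\d+)\.(?P<stage>[a-z])\.(?P<run>\d+)\.(?P<date>\d{8})",
            filename,
        )
        if not m:
            raise ValueError(f"unrecognised name: {filename}")
        stage = {"a": "alpha", "b": "beta"}.get(m["stage"], m["stage"])
        return {
            "box": int(m["box"]),
            "adjusted": m["state"] == "adj",  # passed step 2 of land processing
            "version": f'{m["major"]} {stage} run {m["run"]}',
            "date": m["date"],
        }

    print(parse_ghcn_name("box.12.adj.4.a.1.20150119"))
    # {'box': 12, 'adjusted': True, 'version': '4 alpha run 1', 'date': '20150119'}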

In August 2014, in response to the continuing software problems with GHCNMv3.2.2 (version of August 2013), the NCDC Science Council was briefed about a proposal to subject the GHCNMv3 software, and particularly the pairwise homogeneity analysis portion, to a rigorous software rejuvenation effort to bring it up to CMMI level 2 standards and resolve the lingering software errors. All software has errors and it is not surprising there were some, but the magnitude of the problem was significant and a rigorous process of software improvement like the one proposed was needed. However, this effort was just beginning when the K15 paper was submitted, and so K15 must have used data with some experimental processing that combined aspects of V3 and V4 with known flaws. The GHCNMv3.X used in K15 did not go through any ORR process, and so what precisely was done is not documented. The ORR package for GHCNMv4 beta (in October 2015) uses the rejuvenated software and also includes two additional quality checks versus version 3.

Which version of the GHCN-M software K15 used is further confounded by the fact that GHCNMv3.3.0, the upgrade from version 3.2.2, only went through an ORR in April 2015 (i.e., after the K15 paper was submitted and revised). The GHCN-Mv3.3.0 ORR presentation demonstrated that the GHCN-M version changes between V3.2.2 and V3.3.0 had impacts on rankings of warmest years and trends. The data flow that was operational in June 2015 is shown in figure 3.

Figure 3. Data flow for surface temperature products described in K15 Science paper. Green indicates operational datasets having passed ORR and archived at time of publication. Red indicates experimental datasets never subject to ORR and never archived.

It is clear that the actual nearly-operational release of GHCN-Mv4 beta is significantly different from the version GHCNM3.X used in K15. Since the version GHCNM3.X never went through any ORR, the resulting dataset was also never archived, and it is virtually impossible to replicate the result in K15.

At the time of the publication of the K15, the final step in processing the NOAAGlobalTempV4 had been approved through an ORR, but not in the K15 configuration. It is significant that the current operational version of NOAAGlobalTempV4 uses GHCN-M V3.3.0 and does not include the ISTI dataset used in the Science paper. The K15 global merged dataset is also not archived nor is it available in machine-readable form. This is why the two boxes in figure 3 are colored red.

The lack of archival of the GHCN-M V3.X and the global merged product is also in violation of Science policy on making data available [link]. This policy states: “Climate data. Data should be archived in the NOAA climate repository or other public databases”. Did Karl et al. disclose to Science Magazine that they would not be following the NOAA archive policy, would not archive the data, and would provide access only to a non-machine-readable version on an FTP server?

For ocean temperatures, the ERSST version 4 is used in the K15 paper and represents a major update from the previous version. The bias correction procedure was changed and this resulted in different SST anomalies and different trends during the last 15+ years relative to ERSST version 3. ERSSTV4 beta, a pre-operational release, was briefed to the NCDC Science Council and approved on 30 September 2014.

The ORR for ERSSTV4, the operational release, took place in the NCDC Science Council on 15 January 2015. The ORR focused on process and questions about some of the controversial scientific choices made in the production of that dataset will be discussed in a separate post. The review went well and there was only one point of discussion on process. One slide in the presentation indicated that operational release was to be delayed to coincide with Karl et al. 2015 Science paper release. Several Science Council members objected to this, noting the K15 paper did not contain any further methodological information—all of that had already been published and thus there was no rationale to delay the dataset release. After discussion, the Science Council voted to approve the ERSSTv4 ORR and recommend immediate release.

The Science Council reported this recommendation to the NCDC Executive Council, the highest NCDC management board. In the NCDC Executive Council meeting, Tom Karl did not approve the release of ERSSTv4, noting that he wanted its release to coincide with the release of the next version of GHCNM (GHCNMv3.3.0) and NOAAGlobalTemp. Those products each went through an ORR at NCDC Science Council on 9 April 2015, and were used in operations in May. The ERSSTv4 dataset, however, was still not released. NCEI used these new analyses, including ERSSTv4, in its operational global analysis even though it was not being operationally archived. The operational version of ERSSTv4 was only released to the public following publication of the K15 paper. The withholding of the operational version of this important update came in the middle of a major ENSO event, thereby depriving the public of an important source of updated information, apparently for the sole purpose of Mr. Karl using the data in his paper before making the data available to the public.

So, in every aspect of the preparation and release of the datasets leading into K15, we find Tom Karl’s thumb on the scale pushing for, and often insisting on, decisions that maximize warming and minimize documentation. I finally decided to document what I had found using the climate data record maturity matrix approach. I did this and sent my concerns to the NCEI Science Council in early February 2016 and asked to be added to the agenda of an upcoming meeting. I was asked to turn my concerns into a more general presentation on requirements for publishing and archiving. Some on the Science Council, particularly the younger scientists, indicated they had not known of the Science requirement to archive data and were not aware of the open data movement. They promised to begin an archive request for the K15 datasets that were not archived; however, I have not been able to confirm they have been archived. I later learned that the computer used to process the software had suffered a complete failure, leading to a tongue-in-cheek joke by some who had worked on it that the failure was deliberate to ensure the result could never be replicated.

Where do we go from here?

I have wrestled for a long time about what to do about this incident. I finally decided that there needs to be systemic change both in the operation of government data centers and in scientific publishing, and I have decided to become an advocate for such change. First, Congress should re-introduce and pass the OPEN Government Data Act. The Act states that federal datasets must be archived and made available in machine readable form, neither of which was done by K15. The Act was introduced in the last Congress and the Senate passed it unanimously in the lame duck session, but the House did not. This bodes well for re-introduction and passage in the new Congress.

However, the Act will be toothless without an enforcement mechanism. For that, there should be mandatory, independent certification of federal data centers. As I noted, the scientists working in the trenches would actually welcome this, as the problem has been one of upper management taking advantage of their position to thwart the existing executive orders and a lack of process adopted within Agencies at the upper levels. Only an independent, outside body can provide the needed oversight to ensure Agencies comply with the OPEN Government Data Act.

Similarly, scientific publishers have formed the Coalition on Publishing Data in the Earth and Space Sciences (COPDESS) with a signed statement of commitment to ensure open and documented datasets are part of the publication process. Unfortunately, they, too, lack any standard checklist that peer reviewers and editors can use to ensure the statement of commitment is actually enforced. In this case, and for assessing archives, I would advocate a metric such as the data maturity model that I and colleagues have developed. This model has now been adopted and adapted by several different groups, applied to hundreds of datasets across the geophysical sciences, and has been found useful for ensuring information preservation, discovery, and accessibility.

Finally, there needs to be a renewed effort by scientists and scientific societies to provide training and conduct more meetings on ethics. Ethics needs to be a regular topic at major scientific meetings, in graduate classrooms, and in continuing professional education. Respectful discussion of different points of view should be encouraged. Fortunately, there is initial progress to report here, as scientific societies are now coming to grips with the need for discussion of and guidelines for scientific ethics.

There is much to do in each of these areas. Although I have retired from the federal government, I have not retired from being a scientist. I now have the luxury of spending more time on these things that I am most passionate about. I also appreciate the opportunity to contribute to Climate Etc. and work with my colleague and friend Judy on these important issues.

Postlude

A couple of examples of how the public can find and use CDR operational products, and what is lacking in a non-operational and non-archived product:

  1. NOAA CDR of total solar irradiance – this is the highest quality level. Start at the web site – https://data.nodc.noaa.gov/cgi-bin/iso?id=gov.noaa.ncdc:C00828

Here you will see a fully documented CDR. At the top, we have the general description and how to cite the data. Then below, you have a set of tabs with extensive information. Click each tab to see how it’s done. Note, for example, that in ‘documentation’ you have choices to get the general documentation, processing documents including source code, the data flow diagram, and the algorithm theoretical basis document (ATBD), which includes all the info about how the product is generated, and then associated resources. This also includes a permanent digital object identifier (DOI) that points uniquely to this dataset.

  2. NOAA CDR of mean layer temperature – RSS – one generation behind in documentation but still quite good – https://www.ncdc.noaa.gov/cdr/fundamental/mean-layer-temperature-rss

Here on the left you will again find the documents that are required to pass CDR operations and archival. Even though it’s a slight cut below the TSI example in item 1, a user has all they need to use and understand this dataset.

  3. The Karl hiatus paper can be found on NCEI here – https://www.ncdc.noaa.gov/news/recent-global-surface-warming-hiatus

If you follow the quick link ‘Download the Data via FTP’ you go here – ftp://ftp.ncdc.noaa.gov/pub/data/scpub201506/

The contents of this FTP site were entered into the NCEI archive following my complaint to the NCEI Science Council. However, the artifacts for full archival of an operational CDR are not included, so this is not compliant with archival standards.
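For completeness, here is a minimal sketch of listing that FTP deposit programmatically, assuming the host still accepts anonymous logins and the path from the quick link above is unchanged. Listing the directory lets a reader check for themselves which of the artifacts required of an operational CDR (source code, data flow diagram, ATBD) are present.

    # Minimal sketch: list the K15 FTP deposit referenced above, assuming
    # anonymous FTP access to the host is still available.
    from ftplib import FTP

    ftp = FTP("ftp.ncdc.noaa.gov")
    ftp.login()                       # anonymous login
    ftp.cwd("/pub/data/scpub201506/")
    for name in ftp.nlst():           # one entry per deposited file
        print(name)
    ftp.quit()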

Biosketch:  

John Bates received his Ph.D. in Meteorology from the University of Wisconsin-Madison in 1986. Post Ph.D., he spent his entire career at NOAA, until his retirement in 2016.  He spent the last 14 years of his career at NOAA’s National Climatic Data Center (now NCEI) as a Principal Scientist, where he served as a Supervisory Meteorologist until 2012.

Dr. Bates’ technical expertise lies in atmospheric sciences, and his interests include satellite observations of the global water and energy cycle, air-sea interactions, and climate variability. His most highly cited papers are in observational studies of long term variability and trends in atmospheric water vapor and clouds.

He received the NOAA Administrator’s Award in 2004 for “outstanding administration and leadership in developing a new division to meet the challenges to NOAA in the area of climate applications related to remotely sensed data”. He was awarded a U.S. Department of Commerce Gold Medal in 2014 for visionary work in the acquisition, production, and preservation of climate data records (CDRs). He has held elected positions at the American Geophysical Union (AGU), including Member of the AGU Council and Member of the AGU Board. He has played a leadership role in data management for the AGU.

He is currently President of John Bates Consulting Inc., which puts his experience and leadership in data management to use in helping clients improve the preservation, discovery, and exploitation of their own and others’ data. He has developed and applied techniques for assessing both organizational and individual data management; these techniques help identify how data can be managed more cost-effectively, and discovered and applied by more users.

David Rose in the Mail on Sunday

David Rose of the UK Mail on Sunday is working on a comprehensive exposé of this issue [link].

Here are the comments that I provided to David Rose, some of which were included in his article:

Here is what I think the broader implications are.  Following ClimateGate, I made a public plea for greater transparency in climate data sets, including documentation. In the U.S., John Bates has led the charge in developing these data standards and implementing them.  So it is very disturbing to see the institution that is the main U.S. custodian of climate data treat this issue so cavalierly, violating its own policy. The other concern that I raised following ClimateGate was overconfidence and inadequate assessments of uncertainty.  Large adjustments to the raw data, and substantial changes in successive data set versions, imply substantial uncertainties. The magnitude of these uncertainties influences how we interpret observed temperature trends, ‘warmest year’ claims, and how we interpret differences between observations and climate model simulations. I also raised concerns about bias; here we apparently see Tom Karl’s thumb on the scale in terms of the methodologies and procedures used in this publication.

Apart from the above issues, how much difference do these issues make to our overall understanding of global temperature change? All of the global surface temperature data sets employ NOAA’s GHCN land surface temperatures. The NASA GISS data set also employs the ERSST datasets for ocean surface temperatures. There are global surface temperature datasets, such as Berkeley Earth and HadCRUT, that are relatively independent of the NOAA data sets and that agree qualitatively with the new NOAA data set. However, there remain large, unexplained regional discrepancies between the NOAA land surface temperatures and the raw data. Further, there are some very large uncertainties in ocean sea surface temperatures, even in recent decades. Efforts by the global numerical weather prediction centers to produce global reanalyses, such as the European Copernicus effort, are probably the best way forward for the most recent decades.

Regarding uncertainty, ‘warmest year’, etc., there is a good article in the WSJ: “Change would be healthy at U.S. climate agencies” (hockeyshtick has reproduced the full article).

I also found this recent essay in phys.org to be very germane: “Certainty in complex scientific research an unachievable goal”. Researchers do a good job of estimating the size of errors in measurements but underestimate the chance of large errors.

Backstory

I have known John Bates for about 25 years, and he served on the Ph.D. committees of two of my graduate students. There is no one, anywhere, who is a greater champion for data integrity and transparency.

When I started Climate Etc., John was one of the few climate scientists that contacted me, sharing concerns about various ethical issues in our field.

Shortly after publication of K15, John and I began discussing our concerns about the paper. I encouraged him to come forward publicly with his concerns. Instead, he opted to try to work within the NOAA system to address the issues – to little effect. Upon his retirement from NOAA in November 2016, he decided to go public with his concerns.

He submitted an earlier, shorter version of this essay to the Washington Post, in response to the 13 December article (climate scientists frantically copying data). The WaPo rejected his op-ed, so he decided to publish at Climate Etc.

In the meantime, David Rose contacted me about a month ago, saying he would be in Atlanta covering a story about a person unjustly imprisoned [link]. He had an extra day in Atlanta, and wanted to get together. I told him I wasn’t in Atlanta, but put him in contact with John Bates. David Rose and his editor were excited about what John had to say.

I have to wonder how this would have played out if we had issued a press release in the U.S., or if this story was given to pretty much any U.S. journalist working for the mainstream media. Under the Obama administration, I suspect that it would have been very difficult for this story to get any traction. Under the Trump administration, I have every confidence that this will be investigated (but still not sure how the MSM will react).

Well, it will be interesting to see how this story evolves, and most importantly, what policies can be put in place to prevent something like this from happening again.

I will have another post on this topic in a few days.

Being retired sure is liberating . . .

February 5, 2017 | Corruption, Deception, Science and Pseudo-Science, Timeless or most popular

New Book Details US Attempts to Topple Correa

Cover of Ecuador in the Sights: The Wikileaks Revelations and the Conspiracy Against the Government of Rafael Correa | Photo: El Telegrafo
teleSUR | February 2017

In his new book, “Ecuador In the Sights: The WikiLeaks Revelations and the Conspiracy Against the Government of Rafael Correa,” released this week in Quito, Norwegian journalist Eirik Vold details attempts by the U.S. government to topple Ecuadorean President Rafael Correa and derail his Citizens’ Revolution.

“Correa was not about to let Washington maintain its dominance through financial institutions like the World Bank and the International Monetary Fund,” Vold told the Andes press agency in explaining the motivation behind years of U.S. efforts to undermine the Ecuadorean president.

The book is largely based on the “Cablegate” documents released by WikiLeaks in 2010, including thousands of secret documents sent from the U.S. Embassy in Quito and the U.S. consulate in Guayaquil.

“There is direct U.S. interference in Ecuador,” Vold told El Telegrafo, adding that “documents show a close relationship between several figures of Ecuadorean political life, the financial sector, and the United States Embassy.”

In the book, Vold outlines how the U.S. looked to thwart Correa from the very beginning, trying to directly prevent his election out of fear of losing the U.S. military base in Manta, the base of CIA operations in the region, as well as control over the U.S. oil company Occidental Petroleum Corp.

After his 2006 election, Correa nationalized the oil company and closed the U.S. base in Manta.

Vold says his book documents multiple attempts by the U.S. to sabotage UNASUR — the regional cooperation body founded in 2007 by progressive governments in Latin America — as well as extensive contacts between the U.S. Embassy and members of the national police force before an attempted 2010 coup, known as 30S.

In 2015, 22 police officers were found guilty of insubordination for their role in the failed coup.

Vold also claims the secret cables identify multiple NGO, media, finance, and political contacts which the U.S. embassy used to attempt to destabilize Correa’s government.

One of those Vold names is current vice presidential candidate Andres Paez. Paez, formerly the president of the left-wing Left Democracy Party, is now running on the right-wing CREO ticket along with former banker Guillermo Lasso.

“The U.S. says in a document he is one of our most trusted contacts. In other documents, it is pointed out that he was considered an ally for imposing free trade agreements, and it is evident that he had meetings at the United States Embassy.”

The Norwegian journalist, who has written extensively about U.S. involvement in Latin America, including a book about Hugo Chavez’s Bolivarian Revolution, said that Ecuador is of particular importance due to its efforts to protect WikiLeaks founder Julian Assange from U.S. persecution, ensuring its role as a “protector of the right to information for the whole planet.”

“We’re talking about a region with the world’s greatest concentration of natural resources, and obviously a region which has been known as the U.S.’s ‘backyard’,” he told Andes. “So U.S. activities are very intense in the region, but they have been maintained, for the last decade, with a more discreet, more covert strategy.”

“The revelations are many, the purpose is one,” he said at a book launch in Quito on Thursday. “That the Ecuadorean public, regardless of their political inclination, has access to truthful information about the activities of U.S. officials, and about local informants in the country who had previously been concealed from them.”

February 4, 2017 | Book Review, Corruption, Deception, Timeless or most popular

Booming Black Market in Bundeswehr Rifles Funds Iraqi Refugees’ Flight to Europe

Sputnik – 04.02.2017

Last month German media revealed that German weapons supplied to Kurdish Peshmerga in northern Iraq are being sold on the black market, where they may end up in the hands of terrorist groups.

NDR and WDR reporters in the cities of Erbil and Sulaymaniyah in Iraqi Kurdistan discovered German weapons being sold on the black market there, engraved with the initials “Bw,” meaning Bundeswehr.

Weapons on sale included Heckler & Koch G3 rifles priced at $1,450 and $1,800, and Walther P1 pistols with an asking price of $1,200.

One arms dealer told a reporter that he could procure the Heckler & Koch G36 assault rifle for $5,000.

These weapons sales have also financed the flight of refugees from Iraq to Germany, an Iraqi Kurdish refugee living in Germany told the news program.

Former Peshmerga Mustafa S said that he is one of hundreds of fighters who have sold their weapons to finance their escape from Iraq.

“Mustafa S said that he knows around 100 Peshmerga who have sold their weapons in recent months in order to flee. The situation has become unbearable for many. The low oil price, lack of payment from the Iraqi central government and the battle against Daesh, which guzzles about five million dollars daily, have brought the Kurdish regional government to the brink of bankruptcy. (Mustafa) himself had not been paid for five months and did not know how he was going to pay rent, food, and medicine for his disabled daughter. Now, he lives with his wife and their six children in a home for asylum seekers in East Germany,” Tagesschau reported.

Tobias Pfluger, deputy chairman of the opposition Die Linke party in the German Bundestag, called on the government to stop supplying arms to the Peshmerga. He told Sputnik that the deliveries run counter to the German constitution.

“The interesting thing is that the training missions that are connected with these weapons deliveries break several domestic federal laws. German and EU legislation, the War Weapons Control Act and the Foreign Trade and Payments Act, prohibit direct deliveries to war zones,” Pfluger explained.

The German Defense Ministry has delivered an estimated 2,400 tons of arms and munitions to Kurdish Peshmerga fighters since it began to supply the militia in summer 2014. A government spokesman told NDR and WDR that the government of Iraqi Kurdistan is responsible for the weapons’ misuse.

The German Defense Ministry says it is committed to the “proper verification of supplied weapons” and their use in accordance with international law.

However, since the Ministry is unable to trace individual arms, “the sale of individual weapons cannot be excluded with absolute certainty.”

Pfluger said that assurances from local forces that the arms will remain in their possession are “worthless.”

“It is completely perverse that they have to sign a so-called confirmation of retention. That is nothing other than a completely worthless piece of paper because we know that the weapons show up on the markets in Iraq and Syria. In this respect, we say that this commitment must be ended, it is an intensification of the war and is in no way something that creates peace there.”

Jurgen Grasslin, peace activist and spokesman for Aktion Aufschrei — Stop the arms trade!, told Sputnik Deutschland that German guns have ended up far removed from their intended destination.

“The federal government usually has no idea where their weapons are actually delivered to, when they are exported. My research, based on (studies of) numerous countries and trips to crisis regions and war zones over the past 30 years, shows clearly that weapons roam. Weapons do not stay in the place where they are delivered.”

Grasslin, author of a book entitled “The Black Book of Arms Trading: How Germany Profits from War” (Schwarzbuch Waffenhandel: Wie Deutschland am Krieg verdient), alleges that German arms deliveries constitute “complicity in murder.”

“If Daesh is firing German weapons, and of course weapons from other countries, that is more than a scandal, it is a breach of the law. It is complicity in murder. You are delivering to a war zone. You know that these weapons don’t stay in the hands of the recipients, and that they land in the hands of the worst terrorist groups, for example Daesh. The people who authorize these arms exports must be named, that is namely members of the Bundessicherheitsrat (Federal Security Council) or Federal Government. The first to be named should be Chancellor Angela Merkel and her deputy Sigmar Gabriel, who lead the Bundessicherheitsrat and are thus responsible for these armed forces.”

February 4, 2017 | Corruption, Deception, Economics, War Crimes

Democrats, Including Bernie Sanders, Won’t Fight for Medicare for All

A Black Agenda Radio commentary by Glen Ford | February 1, 2017

Democrats will tell you that the Republicans are the reason the U.S. is the only industrial country in the world that does not have universal, government-provided health care. But, that’s not true. Despite their legislative majorities, the Republicans are not strong enough on their own to defeat a concerted campaign for a Medicare for All program, which is supported by overwhelming majorities of the public, including a very large percentage of Republicans. The U.S. does not have a single payer health care system because corporate Democratic leadership, most shamefully under Presidents Bill Clinton and Barack Obama, have confused the public about what is, and what is not, “universal” health care. They have offered counterfeit, private industry-based schemes – “Hillary-care” in 1993, Obamacare in 2009 — and fraudulently called them universal health care, when in fact these were bait-and-switch schemes designed to prevent the successful passage of a genuine single payer health program.

The numbers tell the tale. When Bill Clinton first ran for president in 1992, two-thirds of the public – 66 percent – told pollsters they supported a “national health insurance plan financed by tax money.” The Clintons instead responded with a secretly-hatched “managed competition” plan that relied on private insurers and took more than 1,300 pages to explain. “Hillary-care” died ignominiously in Congress.

By late 2006, an amazing 69 percent of Americans were telling pollsters they agreed with the statement that “it is the responsibility of the federal government to make sure that all Americans have healthcare coverage.” Barack Obama was getting ready to make his run for the White House. He claimed to want something he called “universal health care.” Most people assumed he meant a single payer, Medicare for All-type program, but instead, Obama dusted off an old Republican private insurance-based plan devised by the far-right Heritage Foundation and later adopted by Republican Massachusetts governor Mitt Romney.

Not only was Obamacare unpopular on its own merits, but because Obama had used the trick terminology “universal health care” to describe his program, the public became confused and demoralized about the whole subject of government health care. By 2016, only 44 percent of the people had a positive reaction to the term “single payer health coverage.” However, nearly two-thirds of those polled — 64 percent — liked the idea of Medicare-for-All, which is a form of single payer. Medicare for the elderly is probably the nation’s most popular social safety net program, and younger folks would like to be part of it, too, including lots of low-income Republicans. Even Donald Trump used to be for it.

Medicare-for-All is an idea whose time has come — again! And, with Obamacare being dismantled and a health care emergency looming, one would think the Democrats would seize the time to push for a truly universal health care plan with such broad support. But, corporate Democratic leadership — just like Bill Clinton and Barack Obama — do not really want single payer health care, because their fat cat contributors oppose it. Therefore, they will urge the people to waste their energy on trying to salvage some aspects of Obamacare.

Bernie Sanders will help them do it, because he’s still sheep-dogging for corporate Democrats.

BAR executive editor Glen Ford can be contacted at Glen.Ford@BlackAgendaReport.com.

February 1, 2017 | Corruption, Deception, Progressive Hypocrite, Timeless or most popular

Nancy Pelosi says she’s for single payer, refuses to co-sponsor single payer legislation

An Open Letter to Nancy Pelosi from Ralph Nader

Representative Nancy Pelosi

House Minority Leader

United States House of Representatives

233 Cannon House Office Building

Washington, DC 20515

January 17, 2017

Dear Representative Pelosi:

I see you were quoted in The Hill newspaper recently (“Pelosi Rips GOP for Cut and Run Strategy on Obamacare,” by Mike Lillis, January 12, 2017) saying that you are for single payer health insurance. You had this preference before Presidents Clinton and Obama, who ideally agree with you, dismissed single payer as “impractical” given the entrenched and powerful healthcare industry.

A couple of years ago, I wrote an article titled “21 Ways Canada’s Single Payer System Beats Obamacare”.

Within a week or so, your colleague, Congressman John Conyers (D-Michigan), will re-introduce HR 676, the single payer bill in the House.

Will you actively support this much more efficient and comprehensive legislation, with its many advantages proven in other countries, and persuade other House Democrats to also co-sponsor?

Last year, only 63 Democrats co-sponsored.

Obamacare, without a public option, has been a complex patchwork in so many ways — including forcing individuals to purchase inadequate insurance from private health insurance companies — insurance that carries with it high premiums, deductibles, and co-pays, and forces patients into narrow networks.

For many, Obamacare is quasi-catastrophic insurance with limited choice of doctor and hospital.

If the Republicans repeal Obamacare, Democrats need to be ready and offer to replace it with something that can attract left/right support — single payer, Medicare for All — everyone in, nobody out, free choice of doctor and hospital, no medical bankruptcies, no coercive co-pays or deductibles, with all their accompanying fears and anxieties, and no more deaths due to lack of health insurance.

A December 2015 national Kaiser public opinion poll found that 58 percent of adults in the U.S. supported single payer (Medicare for All), including 81 percent of Democrats, 60 percent of Independents, and 30 percent of Republicans. Imagine the poll numbers when Full Medicare for All starts to be explained, in its clear simplicity, and promoted by a major political party.

Let’s work together to present the American people with something more efficient and responsive that they want and need — Medicare for All and the freedom to choose their doctor, clinic and hospital.

Sincerely,

Ralph Nader

PO Box 19312

Washington, DC 20036

February 1, 2017 | Corruption, Deception, Economics, Progressive Hypocrite

Russia Calls the West’s Bluff over Real Elections, At Long Last

By Seth Ferris – New Eastern Outlook – 01.02.2017

The world continues to turn upside down. Think the Western democracies are still the authorities on the one thing they are supposed to know about? Think again.

For generations the OSCE, the Organisation for Security and Cooperation in Europe, has been monitoring elections in various countries to ensure they meet Western democratic standards. After each one it publishes a report, which sometimes bears little relation to what people on the ground have seen, and each one of these mysteriously reflects the current political position of the Western powers – if they like a country and its government, they have conducted free and fair elections, if not, the elections are declared wholly or partially invalid.

This practice has often raised criticism, but still the OSCE is called in to monitor elections all over the globe. Why? Because it is the institution representing countries with a long democratic tradition, and those they have since chosen as their friends. That’s it. It does not have to do anything to justify the vast sums given to it to be the authority on elections; it simply has to be there.

Theoretically the OSCE is the product of a partnership between East and West. But in effect the OSCE is run by Western democracies and those countries it now believes have adopted Western standards since the Cold War. These nations, we are told, understand democracy and can therefore recognise it when they see it.

On 19 January the Head of Russia’s Central Elections Commission, Ella Pamfilova, recommended to the OSCE’s Michael Link that it should adopt a common set of standards for election monitoring. This would enable it to compare one country’s performance with another’s and see whether countries are improving or regressing compared to previous elections. “I consider it very important that the standards of elections monitoring in all OSCE member countries be unified,” she reportedly said.

This statement left mouths agape all over the world. So let’s get this straight – the OSCE has been monitoring all these elections without any set standards of what democracy is, what is free and fair, what the acceptable and unacceptable variables are, what the irreducible minimums are, or what the rights and responsibilities of governments, election commissions and political parties are? It has continued being regarded as the authority on these questions in spite of this? And now the Russians – the RUSSIANS – have to call for a common set of standards to give the monitors some idea of what they are supposed to be doing?

You think, therefore we are

This isn’t about elections. It is about how long you can get away with a con. Since the end of World War Two the West’s policy has been based on lies – it is supposed to believe in certain values, such as democracy and human rights, but goes around depriving the rest of the world of the same values it says are paramount.

Everyone has seen this happen, but the West is still supposed to know more about these values than anyone else. So if other countries want democracy and human rights they automatically turn to the West. If they end up with governments which claim to respect democracy and human rights but do exactly the opposite, and Western monitors telling them that rigged elections are substantially free and fair, is that their own fault or that of the Westerners they asked to prevent these things happening? But still they feel they have nowhere else to turn, because Western democracies must somehow know best, and they won’t be better off any other way.

Russia’s request makes one simple point. If you believe in democracy, you will have developed a sophisticated definition of what it is and why it is so important after such long experience of it. If you have such a definition, every observer who monitors elections will know it and be able to assess the elections against it. Of course there will be some local variations in practice between democratic countries, and some of these might raise the concern of some countries. But there will be a common set of standards already in place against which these too can be judged, and thus everyone sent as a monitor will be able to cite these to acquire credentials as a democrat, even if they’ve played little or no part in actual elections.

This is indeed perfectly logical. If it hasn’t happened, this cannot be because the conditions don’t exist. It is because no one wants to be bound by any definition of what democracy is. The West wants to use the term how it wants, when it wants, and make everything up to fit whatever broader political goals it has. The West pronouncing on whether elections are free and fair is about as credible as Australia saying it is in the northern hemisphere because the cover of the North-South report said so and that was written by Western European politicians.

Power to some other people

Does this process actually achieve anything? It enables selected people to enjoy all-expenses-paid junkets to different countries, where they go around with badges which technically say “Election Observer” but in reality say “You can’t say anything about me, I’m the expert”. How many of these junkets they get depends on how well they fit the evidence to the pre-existing script about that election. Many observers arrive before polling day and stay afterwards: maybe this is to do with monitoring the pre- and post-election situations, or maybe it is to give them time in various places of ill repute as a payoff for going along with the official script.

Lots of historic examples of election monitoring fraud are available. When India decided it wanted to annex the independent Kingdom of Sikkim in 1975, it persuaded Sikkim’s parliament to abolish the monarchy and then hold a referendum on joining India. Obviously, as Sikkim was already an Indian protectorate, India monitored that referendum to ensure it was free and fair. Very few others were able to find out anything about the referendum until the results came in, which showed 97% support for joining India, despite the fact that the alleged number of people could not physically have voted on the day due to the terrain, that most of the voters had been imported from India, and that the observers were in many cases armed troops.

Similarly, when Viktor Yanukovych was up against Yulia Tymoshenko in the 2010 presidential runoff in Ukraine, Mikheil Saakashvili’s Georgia sent hundreds of observers, as a neighbouring, friendly country. The trouble was, most of these were actually martial artists, or simply thugs, with no experience of organising elections. Saakashvili openly supported Tymoshenko in these elections and had already been happy to use force on his own citizens in Georgia. It would have been interesting to quiz these observers about what a “free and fair” election is supposed to mean.

Indeed, the May 2008 Georgian Parliamentary Election had already provided a classic example of vote rigging and fraud which was obvious to anyone from a democratic country. The international observers looked on and saw nothing, and the OSCE rubber stamped the results, with the existence of various spying platforms in Georgia at stake.

What OSCE monitors do has nothing to do with the welfare of the people whose country they are pronouncing upon. It is about exerting control. If the outcome of an election is what the West desires, it is free and fair, and no one can complain because they have no other set of standards to refer to. They can’t call on some other organisation to review the OSCE’s judgment because, although such organisations exist and can act independently, their credibility can be easily exploded.

If someone disagrees with the conclusions of the mighty OSCE, however farcical those conclusions may be, they must have some political motive or be unaware of the full facts. This sort of common thinking, however baseless, is what has enabled the West to get away with this for so long. How it responds to Russia’s request for it to adopt standards it can be held to, which it should have done itself long ago, remains to be seen.

Velvet fist in an iron glove

This sort of control is familiar to anyone who has worked with aid agencies, which, like democratic systems, are designed to help people. Whether these are international or internal to a specific country, the principle is the same: we know everything; you know nothing, so you have to accept whatever we say so we can prevent you ever achieving what you want to achieve.

Eastern Europe is full of aid agencies from Western countries, regardless of the political orientation of that country. Each one brings money to conduct programmes which are supposed to bring greater democracy, rule of law, industrial or agricultural efficiency, human rights etcetera. The process is supposedly simple: benchmarks are set, and if prospective beneficiaries achieve these benchmarks they get the funding to take part in the programme, which involves meeting further benchmarks as they go along.

This results in situations such as the National Democratic Institute in Georgia insisting that the principles of democracy and fairness are “very clear” because it says so, without explaining what these principles actually are, why it therefore produces wildly inaccurate opinion polls at each election for pay and why it never says a word about a president who was democratically elected with 87% of the vote being overthrown in a coup and the state being built ever since on supporting that coup. It results in situations where people who’ve never set foot in a country before try to tell local farmers, with all their accumulated experience, that they have to do things differently, rather than better, to enter shiny Western markets whilst also supporting the rigging of those markets against them to suit other clients elsewhere, who pay better or are more politically reliable.

But the worst aspect is that the pump soon runs dry. The further people get involved in these programmes, the more paperwork they have to do. That in itself is onerous, but it comes with strings attached. To keep receiving support they have to become increasingly politically acceptable to the donor, as the aid is not designed to improve the situation on the ground but to serve the broader political objectives of the donor governments. Georgia provides another disgusting example of this: during Saakashvili’s time even staff of the International Red Cross, most of whom didn’t support him, had to be seen canvassing for him and his party, flags waving, trumpeting Western progress, when that same government wouldn’t let them rescue people stranded in South Ossetia during the 2008 war.

Internal aid organisations are no different. They also tell prospective clients, which are usually local welfare organisations with their own remit, that they have to adopt all kinds of quality standards to be eligible for any funding, because everyone else has trustworthy quality standards and they don’t. These standards are usually drawn up by people who have never worked in a similar organisation, and the standards themselves are often irrelevant to the organisations which are told to adopt them.

But the more money they get as a result, the more games they have to play to retain those funds and keep providing services, even though what they do has less and less to do with the welfare of their clients. Who is creating the problems their clients face? The same government whose various arms are telling them they have to adopt these systems to function. It is therefore rather obvious why such systems are invented, by whom, and what they are ultimately designed to achieve.

It’s not going to go away

It would be a positive thing if a country like Russia, which has always been told it has to learn from the West because it is deficient, were able to make Western countries adopt better standards. People in Eastern Europe know perfectly well what democracy actually means, which is why they cry out for it and object when they don’t get it. At every election in every Western country some offences are committed, and no one has ever been able to demonstrate that people who were originally from “young democracies”, or from no democracy at all, commit more of these than anyone else.

However it is likely that “Missionary Syndrome” will still hold sway. Whatever fine words the OSCE might come out with about listening to experts, it all depends on where those experts come from. In the 1980s there was a craze for Protestant countries which had formerly been British or German colonies to send missionaries to “the Old Country” to try and get local people going to church again. The common response was, “we sent our missionaries to you, what do you have to teach us?” Even those who agreed with every point being made wouldn’t accept it coming from the mouth of an ex-colonial, because natives of the former imperial power must automatically know more.

Nevertheless, this latest move is yet another example of Russia taking on the mantle the US used to have – Russia is increasingly the power of legality and international agreements, the US increasingly the rogue operator. Everything Russia does which the West objects to was done by the West long before, in defiance of its professed principles, and that is exactly why Russia is doing it. The way to change the game is for all sides to behave legally and properly, but it is Russia, not the US, which is seeking to bring that about.

All this is very alarming to the millions of people brought up with the opposite assumption, which at one time really was justified. Realising this is what is happening is like suddenly discovering you’re the opposite of what you thought you were.

Now that Donald Trump, allegedly a Russian stooge, has taken power in the US, there is much outcry over the threat Russia poses. That “threat” exists because Western hypocrisy and criminality put it there – and only by the West doing what it was always supposed to do, with or without Russian prompting, is that “threat” ever going to go away.

Seth Ferris, investigative journalist and political scientist, expert on Middle Eastern affairs.

February 1, 2017 | Civil Liberties, Corruption, Deception