A top commander for the Alaska National Guard was recommended for “other than honorable” discharge earlier this year following an investigation that found he allowed recruiters to sexually assault and harass women.
Lt. Col. Joseph Lawendowski failed to act on multiple complaints of serious misconduct, including rape, against four noncommissioned officers under his command, according to an Army investigation.
The Anchorage Press obtained a copy of the Army Regulation 15-6 report that found Lawendowski, a former pornography company owner and co-founder of an “end times” fundamentalist group, violated the National Guard code of conduct on multiple occasions.
The 46-year-old Lawendowski, who joined the Alaska National Guard in 2003, promoted steroid use by recruiters, used government vehicles for strip club outings, showed up drunk to a sled dog race sponsored by the service, and possibly used government-issued credit cards for improper purchases.
The March 3 report recommended a separate investigation into his possible misuse of funds, including purchases of plane tickets to Dubai and Sweden, items at a luxury children’s store in Paris, and $1,500 and $2,000 bar tabs in Anchorage and Juneau, the newspaper reported.
The report found Lawendowski created a workplace climate of fear and intimidation by allowing four NCOs – identified as Command Sgt. Maj. Clinton Brown, Master Sgt. Jarrett Carson, Master Sgt. John Nieves, and Sgt. 1st Class Shannon Tallant – to retaliate against soldiers who filed complaints and to feel above the law.
Three of those NCOs – Carson, Nieves, and Tallant – were known to National Guard members as the “Three-Headed Monster” due to their widely known misconduct and abnormally large size, and sworn statements show they had a motto: “What happens in recruiting, stays in recruiting.”
The report found Tallant fraternized with his direct supervisor, Brown, who was a passenger when the recruiter was arrested for drunken driving in March 2011, and investigators found their relationship allowed Tallant to “continue to use his rank and position to abuse junior soldiers as well as prey on young women.”
The recruiters boasted of their close relationships with superior officers, including Maj. Gen. Thomas Katkus, who led the Alaska National Guard until he was forced to resign last month by Republican Gov. Sean Parnell, who took over when former Gov. Sarah Palin stepped down in 2009.
Brig. Gen. Catherine Jorgensen, the Chief of Staff for the Alaska Army National Guard, was fired earlier this month by the acting commander of the Alaska National Guard but rehired the next day at Parnell’s insistence.
The National Guard Bureau’s Office of Complex Investigations strongly criticized the Alaska National Guard leadership, saying the service mishandled sexual assault cases and tolerated widespread unethical behavior by officers.
The OCI report showed Lawendowski had been the subject of multiple criminal investigations for weapons smuggling, rape, and drug trafficking – but none of those criminal investigations resulted in prosecution “due to jurisdictional issues or lack of evidence,” the newspaper reported.
Lawendowski, who formed a corporation – Kodiak Entertainment Group, Inc. – that operated at least seven pornographic websites and then the Christian fundamentalist Berean Watchmen organization, reported directly to Katkus, the OCI report found.
This deviated from the normal chain of command, the newspaper reported, and investigators found that other National Guard members knew Lawendowski was friends and neighbors with Katkus.
Lawendowski was named Deputy Chief of Staff for Operations and Training in June 2012, but head chaplain Lt. Col. Rick Koch and at least five other active or retired officers had spent about a year and a half begging Parnell’s office to investigate him and his associates.
They provided the governor’s office with detailed allegations about fraud and the cover-up of sexual assaults, the newspaper reported.
The chaplain told Parnell’s chief of staff that Lawendowski had improperly spent more than $200,000 and led a command with known ties to illegal drug sales and many sexual assaults – and he said Katkus knew about the allegations.
“As one officer put it, ‘We are now putting criminals in our senior positions,’” Koch said in an email.
High-Level NSA Official Tied To Husband’s Private Signals Intelligence Business, Has A Second Business That Owns A Plane
Buzzfeed’s Aram Roston has uncovered more evidence linking the NSA’s SIGINT (signals intelligence) director to a number of private contractors known to do business with the US government — perhaps even the agency itself.
Roston previously exposed the close ties between Teresa Shea’s position and her husband James’ employer, DRS Signal Solutions, a company focused on “SIGINT systems.” Not only that, but business records indicated that James Shea apparently runs Telic Networks, another SIGINT-focused business operating out of their hometown (Ellicott City, Maryland).
Needless to say, neither Teresa Shea, her husband, her husband’s employer, nor the NSA itself has offered any comment on this suspicious-looking arrangement. The NSA did offer some boilerplate about “robust internal controls,” but simultaneously stiff-armed Buzzfeed’s request for Teresa Shea’s financial disclosure statements, citing the National Security Agency Act of 1959. (This citation is also agency boilerplate, or at least was until Jason Leopold challenged it with a lawsuit, which forced the agency to hand over former NSA head Keith Alexander’s financial disclosure statements. In light of that decision, it appears Shea’s statements will be released as well.)
This all looked conflicted enough, but Roston has uncovered more suspicious-looking information.
Yet another company, apparently focused on the office and electronics business, is based at the Shea residence on that well-tended lot.
This company is called Oplnet LLC.
Teresa Shea, who has been at the NSA since 1984, is the company’s resident agent.
The company’s articles of organization, signed by Teresa Shea, show that the firm was established in 1999 primarily “to buy, sell, rent and lease office and electronic equipment and related goods and services.” An attorney who also signed the document, Alan Engel, said he couldn’t comment on client matters.
Roston and Buzzfeed were unable to come up with any hard evidence linking Teresa Shea’s home business with federal contracts, but they did uncover a very interesting purchase.
Records show Oplnet owns a six-seat airplane, as well as a condominium with an assessed value of $275,000 in the resort town of Hilton Head, South Carolina.
Flight records for this aircraft show it has made a majority of its landings at three airports — one of them being Ft. Meade, Maryland, home of the NSA. It is not uncommon for people who own their own planes to actually set up a company to own that plane for a variety of legal and tax reasons — and it’s possible that’s what’s happened here — though it is notable that James Shea has a pilot’s license, while Teresa does not.
Perhaps it’s indicative of nothing at all, other than the overwhelming gravitational pull of the Beltway. But then, there’s this timeline.
1984 – Teresa Shea joins the NSA as an engineer working in SIGINT issues.
1990 – James Shea sets up Sigtek, Inc., which goes on to receive “hundreds of thousands of dollars in contracts with the federal government, according to a federal contracting database.”
1999 – Teresa Shea registers Oplnet, using their home address.
2000 – James Shea sells Sigtek, Inc. for $20 million to a British firm, while remaining listed as President of the company.
2007 – James Shea sets up Telic Networks, his newest SIGINT-focused company. This too is “based” at the Sheas’ shared home address.
2010 – Teresa Shea is promoted to Director of SIGINT. Nearly simultaneously, James Shea is named vice president of major SIGINT contractor DRS Signal Solutions.
Much of the Sheas’ shared success hinges on SIGINT — both the government’s expansion of dragnet surveillance and simultaneous growth of SIGINT-focused contractors. Maybe there’s nothing to this, but the silence from everyone involved seems to indicate there’s at least the “appearance of impropriety,” if not flat-out misconduct and abuse of power.
More will be known when (and, as always with the NSA, if) Shea’s financial disclosure documents are released. At the very least, they’ll confirm the information Buzzfeed has dug up and prevent the NSA from boilerplating this whole situation into non-existence. The NSA is already taking a second look at Keith Alexander’s post-NSA activities. If it’s willing to go that far, it’s willing to dig up dirt on lower-level officials. You can’t be too careful in the intelligence business these days, not with the eyes of legislators, activists and a whole bunch of pissed-off Americans watching your every move.
More Police Departments than Previously Thought Use Portable Surveillance Systems to Spy on almost Everyone
More U.S. police departments are employing electronic surveillance technology that can collect information from cell phones and laptop computers belonging not just to criminal suspects but also to law-abiding citizens.
The Charlotte Observer found the Charlotte-Mecklenburg police have for eight years used such equipment, which goes by many names: Stingray, Hailstorm, AmberJack and TriggerFish.
But the technology, which mimics cell towers, is also used by other law enforcement agencies around the country. It’s just not clear which departments, the newspaper says, because the federal government has helped shield police from disclosing that they own and operate the spy hardware. In fact, the Obama administration “has ordered cities not to disclose information about the equipment,” the Observer’s Fred Classen-Kelly reported.
However, members of the administration might also be among those spied upon. Through an open records request, VICE News has learned that Washington, D.C., is another city whose police department is using the technology. The Metropolitan Police Department (MPD) there purchased the Stingray system in 2003, purportedly to use for anti-terrorism efforts.
In 2008, however, the system was brought out of storage and is now used in regular criminal cases. But the system doesn’t discriminate between calls made by those suspected of wrongdoing and those of ordinary citizens, which means anyone’s whereabouts can be tracked.
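The indiscriminate nature of this collection follows directly from how cellular networks work: handsets attach to whichever base station advertises the strongest signal, so a device impersonating a tower necessarily sweeps up every phone in range. The toy model below illustrates that logic; it is a hypothetical sketch of the general principle, not based on any actual Stingray internals.

```python
# Toy model of why cell-site simulators collect indiscriminately.
# Phones attach to the strongest signal in range; the device has no
# way to know in advance which handsets belong to a suspect.

def strongest_tower(towers):
    """A handset attaches to whichever 'tower' advertises the strongest signal."""
    return max(towers, key=lambda t: t["signal_dbm"])

legit = {"name": "carrier-tower", "signal_dbm": -95, "log": []}
simulator = {"name": "cell-site-simulator", "signal_dbm": -60, "log": []}

phones = ["suspect-IMSI", "bystander-IMSI-1", "bystander-IMSI-2"]

for imsi in phones:
    tower = strongest_tower([legit, simulator])
    tower["log"].append(imsi)  # the winning device records every identifier that attaches

# Every phone in range, suspect or not, ends up in the simulator's log.
print(simulator["log"])  # ['suspect-IMSI', 'bystander-IMSI-1', 'bystander-IMSI-2']
```

Because the simulator out-broadcasts the legitimate tower, its log captures all three identifiers while the carrier tower records none; filtering out bystanders can only happen after their data has already been collected.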
Nathan Wessler, an attorney with the ACLU’s Speech, Privacy & Technology Project, told VICE News “If the MPD is driving around D.C. with Stingray devices, it is likely capturing information about the locations and movements of members of Congress, cabinet members, federal law enforcement agents, and Homeland Security personnel, consular staff, and foreign dignitaries, and all of the other people who congregate in the District…. If cell phone calls of congressional staff, White House aides, or even members of Congress are being disconnected, dropped, or blocked by MPD Stingrays, that’s a particularly sensitive and troublesome problem.”
Some in Charlotte have those concerns as well. “The thought of police or another agency collecting data on communications devices is troubling,” Charlotte City Councilman John Autry told the Observer. “I understand the balance between security and privacy, but I think we should honor the privacy protection in the Constitution. … What happens to the data? Who sees it? Who has access to it?”
The ACLU estimates that at least 46 local law enforcement agencies nationwide have cell phone tracking systems.
To Learn More:
Charlotte Police Investigators Secretly Track Cellphones (by Fred Classen-Kelly, Charlotte Observer)
Police in Washington, D.C. Are Using the Secretive ‘Stingray’ Cell Phone Tracking Tool (by Jason Leopold, VICE News)
After Months of Denial, Sacramento Sheriff Admits Using Stingray Cellphone Surveillance (by Ken Broder, AllGov California)
Local Police Departments Use Non-Disclosure Agreements to Hide Cellphone Tracking (by Noel Brinkerhoff, AllGov)
Around this time last year, parliamentary records show, the retired property developer and hugely generous Labour party donor, Sir David Garrard, had given a modest £60,000 towards the party’s election campaign for 2015. It came in addition to around half a million he had already given since 2003.
Fast forward to 16 June of this year: Garrard hosts a Labour Friends of Israel event at which Labour leader Ed Miliband is the main speaker. The prime ministerial hopeful had, the year before, proclaimed that he was a Zionist. The lobbying group he addressed boasts dozens of Labour peers and MPs amongst its membership, including the Shadow Chancellor Ed Balls.
Despite the atrocities being committed a few thousand miles away in Gaza during “Operation Protective Edge” as Miliband spoke, he made not one mention of the Palestinian casualties in his speech, though he did take time to note Israel’s own losses. By that point, 172 Palestinian lives had been taken and over 1,200 people wounded. The newspapers were in outcry, but from Miliband – performing before his party donors – silence.
That same day, the silence was rewarded. Garrard transferred a whopping £630,000 to the Labour party accounts, over ten times his donation from the previous year.
It was a near identical episode to David Cameron speaking in 2009, back when he too was hoping to take office as prime minister.
At a well-attended Conservative Friends of Israel annual fundraising lunch held in London, he again made no mention of the Palestinian lives that had been lost, this time as part of “Operation Cast Lead”. Not one mention. In that war, 1,370 Palestinians had died. At the time, a leading British journalist wrote: “I found it impossible to reconcile the remarks made by the young Conservative leader with the numerous reports of human rights abuses in Gaza. Afterwards I said as much to some Tory MPs. They looked at me as if I was distressingly naive, drawing my attention to the very large number of Tory donors in the audience.”
No other foreign nation is as well represented in the campaign finances of British elections as Israel. In fact, no other nation comes close – and money linked to pro-Israel donors is a single interest influence akin to that of the trade unions (the largest democratic organisations in the country) or indeed the megabucks flowing in from City financiers.
And with that money, war crimes are being glossed over, rules bent, and our hard-won democracy warped by foreign interests.
The money is already pouring in.
In April, the Conservative Branch for Brigg & Goole, the constituency of Andrew Percy MP, received £6,000 from a notable pro-Israel supporter, Lord Stanley Fink. During the recent conflict, Percy attended an Israeli military briefing about the Iron Dome missile defence system – later glibly observing that “Israel acts as we would” in response to the mass civilian casualties being inflicted by the IDF.
Percy is, like 80 per cent of his colleagues, a member of Conservative Friends of Israel.
On the same day, £3,000 dropped into the bank account of the Conservative party in Harrow East. Their MP, Bob Blackman, also visited Israel during “Operation Protective Edge”. The money also came from Lord Fink.
And the pro-Israel peer pulled off a democracy-warping hat-trick that day – £3,000 for the Conservatives in Brighton & Kemptown, home to Conservative Friends of Israel linked Simon Kirby MP.
Over and above his backing of individual MPs, Lord Fink has also contributed over £60,000 to the Conservative Central Party accounts since July last year, and his total donations to the Conservatives over the years are now nearing £3 million.
Lord Fink is a staunch supporter of Israel – telling the Jewish Chronicle in 2009 that he shared similar views to Lord Michael Levy, Tony Blair’s aide who had close ties with Israeli political leaders. Levy’s son, Daniel, served as an assistant to the former Israeli Prime Minister Ehud Barak and to Knesset member Yossi Beilin.
Elsewhere, Lord Fink has been a “loyal donor” to Just Journalism, a now defunct group organised by the pro-Israel Westminster think tank the Henry Jackson Society. Just Journalism claimed to be correcting “media bias” against Israel but instead acted as a pro-Israel “flak” group, aggressively criticising any British publication that queried Israel’s human rights record, including the Guardian and the London Review of Books. The group folded in 2011.
Lord Fink is also a member of the Jewish Leadership Council (more on their influence later).
In March, the Conservative Branch in Poplar & Limehouse received £3,000 from another pro-Israel funder – Sir Michael Hintze. Hintze was ranked by Forbes in 2014 as the 1,016th richest person in the world, with a net worth of approximately $1.8 billion.
The constituency he has plugged money into is a swing seat; a six per cent change would depose incumbent Labour MP Jim Fitzpatrick (a member of both Labour Friends of Israel and Labour Friends of Palestine).
The Conservatives have their own reasons for targeting the seat, using the youthful ex-banker and Tower Hamlets councillor Tim Archer. The Respect party are running George Galloway, and he could split the Labour vote, opening the way for a Conservative win. George Galloway also happens to be the most outspoken critic of Israel in British politics.
British-Australian Hintze is not a man the Conservatives would want to annoy. Since July of last year, he has donated just over £1.5 million to the party (the figure is doubled if you look back to 2002).
Current Chancellor of the Exchequer George Osborne MP received nearly £40,000 in 2008 and 2009 directly from Hintze. Mayor of London Boris Johnson, Home Secretary Theresa May MP, David Davis MP and David Willets MP have also been subject to his financial largesse.
But the first politician Hintze backed in the Conservatives was Dr Liam Fox MP, with a £10,000 gift back in January 2007.
Fox then rose to become Secretary of State for Defence, before being disgraced when it was revealed he had allowed his close friend Adam Werrity access to the Ministry of Defence and to travel on official visits (despite not being a government employee).
Hintze was implicated because he had allowed Fox a desk in his London office as part of a £29,000 donation to Fox’s controversial charity – Atlantic Bridge – another pro-Israel lobbying organisation. Hintze served on its Executive Council.
Adam Werrity, who had been best man at Fox’s wedding in 2005, was later appointed UK Executive Director of Atlantic Bridge and played a key role in its operations.
In late 2011, “multiple sources” told the Independent on Sunday that Werrity had used contacts developed through Atlantic Bridge to arrange visits to Iran, meeting with opposition groups in both Washington and London, and had even been debriefed by MI6 about his travels.
The newspaper described the activities as “a freelance foreign policy” with Werrity seemingly “acting as a rogue operator”.
It was also revealed that Werrity was capable of arranging meetings “at the highest levels of the Israeli government”, and that Mossad had, bizarrely, believed Werrity to be Fox’s chief of staff.
The Guardian also raised the possibility that Werrity and Fox could have been operating a “shadow foreign policy,” using Atlantic Bridge as a cover organisation. The charity was investigated by the Charities Commission in 2011 and shut down.
Another patron of Atlantic Bridge, alongside Hintze, was Michael Lewis, ex-chairman of the Britain Israel Communications and Research Centre (BICOM).
That lobbying group describes itself as a “British organisation dedicated to creating a more supportive environment for Israel in Britain”. It was reported that Michael Lewis had paid for some of Werrity’s trips to Israel, charges he later denied.
Fox was forced to resign over the scandal – although, true to Westminster form, no scandal is ever too much: he is already back, having politely refused a role as foreign secretary in July, and is now planning a new career as a backbencher.
Reviewing the Electoral Commission records for 2014, the pro-Israel donor Michael Lewis has popped up again. In March, he wrote another cheque for £10,000, to none other than Liam Fox.
In the past, Lewis has also backed William Hague – to the tune of £5,000. Hague later became foreign secretary.
According to Peter Oborne, now chief political commentator for the Telegraph, Michael Lewis’s baby BICOM is “Britain’s major pro-Israel lobby”.
In a searing exposé for Channel 4 in 2009, and later a pamphlet calling for transparency from the Israel lobby, Oborne showed how BICOM was funded by a Finnish billionaire whose father made a fortune selling Israeli arms.
Chaim “Poju” Zabludowicz, who the Sunday Times ranked as the 57th richest individual in Britain with a net worth of over £1.5 billion, founded BICOM in 2001 and is its chairman.
Zabludowicz is also a member of the United Jewish Israel Appeal, a charity whose website claims it has three strands of work – “Supporting Israel”, “Connecting with Israel” and “Engaging with Israel”.
Since 2009, Zabludowicz has given approximately £125,000 to the Conservative party, either directly to party central, or to the party operating in Finchley and Golders Green, Harlow, Watford or Burton.
Zabludowicz is also a member of the Jewish Leadership Council – primarily concerned with philanthropic and educational matters within the British Jewish community, but who in June 2011 also met with the government to discuss the Middle East (BICOM attended the meeting too), and again in January 2012.
The Jewish Leadership Council, whose members also include pro-Israel Tory funders such as Lord Stanley Fink, and Tony Blair’s controversial man in Israel, Lord Michael Levy, have taken it upon themselves to vigorously defend Israeli leaders from the principles of universal jurisdiction – a great example of how influential the lobby is, how intent it is on insulating Israel from legal redress, and exactly why British voters should be wary of how much money it is pumping into our elections.
In a celebratory post in 2011, on their own website, the Jewish Leadership Council (JLC) explained that two years ago, they had “commissioned a legal opinion from Lord Pannick QC which recommended a change in the law. We wanted to protect universal jurisdiction itself, a vital innovation that grew out of the Holocaust, while preventing it from being abused.” (“Preventing it from being abused” roughly translates to “being applied to Israel”).
Following an arrest warrant being issued for Israeli opposition leader Tzipi Livni, the group said: “We immediately sent our legal opinion to the government and opposition and worked with Conservative Friends of Israel, Labour Friends of Israel and Liberal Democratic Friends of Israel to begin generating support for this law change.”
“Within a few days, Gordon Brown had publicly promised to change the law as soon as possible,” the JLC bragged.
The Conservative party had already placed an advert in the Jewish Chronicle promising to change the law if they were elected. In 2011, the universal jurisdiction laws of the United Kingdom were changed, with arrest warrants now requiring the assent of the Attorney-General before they could be issued for alleged war criminals.
This was just as the pro-Israel lobby wanted. Rather than facing arrest when visiting the UK, Israeli politicians, generals and other war criminals can now feel assured that warrants would first have to pass through the Attorney-General, who is none other than Jeremy Wright MP, who is of course, another member of Conservative Friends of Israel.
The New York Supreme Court dismissed a lawsuit against the NYPD challenging its refusal to confirm or deny the existence of records related to its surveillance of a New York City mosque. The case appears to be the first time that a court has affirmed a “Glomar doctrine” below the federal level. Adam Marshall from the Reporters Committee for Freedom of the Press has more:
The case, Abdur-Rashid v. New York City Police Department, involved a request by Imam Talib Abdur-Rashid for records regarding NYPD surveillance of himself and his mosque in New York City. The city refused to disclose to Mr. Abdur-Rashid whether any such records existed, and told him that even if they did exist, such records would be exempt under the New York Freedom of Information Law (“FOIL”).
In its decision, the court somewhat perplexingly acknowledged that according to federal and state case law, “[i]t should follow that when a local agency such as the NYPD is replying to a FOIL request, the Glomar doctrine is similarly inapplicable.” However, it then went on to state that as this was a case of first impression, the NYPD’s use of a Glomar response “is in keeping with the spirit of similar appellate court cases.” The court determined that “disclosing the existence of responsive records would reveal information concerning operations, methodologies, and sources of information of the NYPD, the resulting harm of which would allow individuals or groups to take counter-measures to avoid detection of illegal activity, undermining current and future NYPD investigations.” Therefore, it granted the NYPD’s motion to dismiss the case.
Elizabeth Kimundi, a lawyer for the firm of Omar T. Mohameddi, which is representing Abdur-Rashid, said over the phone that her firm is drafting an appeal.
That appeal will be one to watch, because this is a “case of first impression,” meaning that, if the ruling is upheld, it will set precedent in the state of New York. And it would be a bad precedent.
The Glomar doctrine gives agencies the obvious power to hide the existence of records, but it also allows agencies to short-circuit the appeal process, since requestors can’t file an appeal for records they don’t know exist. The NYPD consistently flouts both the spirit and the letter of New York’s Freedom of Information Law. There is no expectation that it would use Glomar powers in good faith. A Glomar doctrine would just become another tool in One Police Plaza’s aggressive strategy to block and discourage FOIL requestors.
- CIA says it didn’t know it had a copy of the Senate torture report.
- ACLU and EFF file appeal in suit for LAPD license plate reader tech.
- Obama admin asks judge to dismiss civil lawsuit against United Against Nuclear Iran, attempting to invoke state secrets without public explanation. “After everything – the torture, the rendition, the eavesdropping…This is the case that stands for the proposition that privilege can be asserted in the dark?”
- In FOIA lawsuit, EPA says it may have lost text messages it was required to archive under federal record law.
- Judicial Watch sues DOJ for Operation Choke Point records.
- Pebble Project files lawsuit against EPA, alleging FOIA violation.
Diosdado Cabello speaks at a press conference in this archive photo. (Photo: AVN)
The president of the Venezuelan National Assembly, Diosdado Cabello, called Friday on intelligence agencies to investigate non-governmental organizations (NGOs) in the country that are funded by the United States Agency for International Development (USAID).
Cabello’s call comes on the heels of the arrival of a representative of the U.S. government in Venezuela to meet with representatives of NGOs at the U.S. Embassy in Caracas.
According to its website, USAID’s mission is “furthering America’s interests, while improving lives in the developing world.”
In practice, much of the work of USAID has been to support the activities of groups that are opposed to democratically elected governments. Cabello pointed to an NGO that has links to “Operation Liberty”, the group led by Lorent Saleh, who is currently in custody on accusations of intent to commit terrorist acts in Venezuela.
Cabello has previously warned of the attempts by the U.S. and its allies to interfere in the internal affairs of Venezuela, saying, “This is one way for imperialism to finance conspiracy [against the government].”
Bolivia expelled USAID and its representatives from the country in 2013 due to their support of groups opposed to the government of Evo Morales.
It’s part of the public record that the NSA has engaged in an industry-wide campaign to weaken cryptographic protocols and insert back doors into hi-tech products sold by U.S. companies. We also know that NSA officials have privately congratulated each other in successfully undermining privacy and security across the Internet. Hence it’s only logical to assume that the NSA’s numerous subversion programs extend into foreign “commercial entities”. Thanks to documents recently disclosed by the Intercept we have unambiguous confirmation.
Hi-tech subversion underscores the fact that the whole tired debate regarding cryptographic keys held in escrow for so-called lawful interception (what the Washington Post called “secret golden keys”) only serves to distract the public from programs aimed at wielding covert back doors. In other words, by reviving the zombie idea of an explicit back door the editorial board at the Washington Post is conveniently ignoring all of the clandestine techniques that already exist to sidestep encryption. In a nutshell: zero-day bugs and malware often trump strong crypto.
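The point about zero-days and malware trumping strong crypto can be made concrete: if an implant reads data before the cryptographic layer runs, the strength of the cipher is irrelevant. The sketch below is purely illustrative; the "cipher" is a trivial stand-in, and the function names are invented for this example. What matters is where the tap sits, not the math.

```python
# Sketch: strong encryption on the wire is moot if the endpoint is compromised.
# The XOR "cipher" is a stand-in for real cryptography; the point is that the
# implant observes plaintext before any encryption happens.

captured = []  # what a hypothetical implant on the endpoint sees

def implant_tap(plaintext: str) -> str:
    """An endpoint implant observes data before encryption is ever applied."""
    captured.append(plaintext)
    return plaintext

def encrypt(plaintext: str, key: int) -> bytes:
    """Stand-in cipher: an eavesdropper on the wire sees only opaque bytes."""
    return bytes(b ^ key for b in plaintext.encode())

message = "meet at noon"
ciphertext = encrypt(implant_tap(message), key=0x5A)

# A wiretap sees only ciphertext; the implant already has the plaintext.
print(captured)                         # ['meet at noon']
print(ciphertext == message.encode())   # False: the wire is protected, the endpoint is not
```

However sound the cipher, the implant's copy of the plaintext was taken upstream of it, which is exactly why covert endpoint access makes the "golden key" debate something of a distraction.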
As an aside, it’s interesting to observe the citadel of free thinkers at the Electronic Frontier Foundation continue to promote cryptographic tools as a privacy tonic with an almost religious faith, while conspicuously neglecting other important aspects of operational security. The EFF cheerfully provides a litany of alleged success stories. Never mind all of the instances in which the users of said cryptographic tools were compromised, even users who specialized in computer security.
Infiltrating the Media
The NSA’s campaign to undermine software and hardware is mirrored by parallel efforts in other domains. Specifically, the Church Committee and Pike Committee investigations of the 1970s unearthed secret programs like Operation Mockingbird which were conducted to infiltrate the media and develop an apparatus, a Mighty Wurlitzer of sorts, that allowed government spies to quietly influence public perception. The findings of congressional investigators have been substantiated by writers like Deborah Davis and Carl Bernstein.
Though much of the documented evidence is decades old the CIA continues to maintain its long-standing relationship with the press. For example in March of 2010 WikiLeaks published a classified CIA analysis which described a propaganda recipe for the “targeted manipulation of public opinion” in Germany and France to bolster support for NATO military action in Afghanistan. Also, here in the United States New York Times editor Bill Keller admitted to delaying the story on Bush-era warrantless wiretapping in direct service to the powers that be.
So don’t think for a minute that the CIA didn’t have a hand in the media’s assault on journalist Gary Webb after Webb exposed the CIA’s connections to the international drug trade. Gary caught U.S. intelligence with its pants down and spymasters had their operatives in the press destroy him.
More recently, the former editor of Frankfurter Allgemeine Zeitung revealed that he worked for the CIA. In a televised interview Udo Ulfkotte described Germany as an American client state, noting the role of the CIA in the origins of German intelligence. He warned that powerful interests in the United States were pushing for war with Russia and that American spies have widespread links to foreign news outlets:
“Is this only the case with German journalists? No, I think it is especially the case with British journalists, because they have a much closer relationship. It is especially the case with Israeli journalists. Of course with French journalists. … It is the case for Australians, [with] journalists from New Zealand, from Taiwan, well, there is many countries, … like Jordan for example. …”
A Question for Ed Snowden
While media subversion enables political manipulation through indirect means, U.S. intelligence has been known to employ more direct means to impose its agenda in places like Angola, Chile, Guatemala, Iran, Nicaragua, and Ukraine. In fact, stepping back to view the big picture, one might be tempted to posit that U.S. intelligence has established clandestine footholds globally in any institution seen as vital to the interests of the corporate factions that drive the American Deep State.
All of this subversion raises a question: are covert programs compatible with democracy? Can the public allow secrecy, propaganda, and infiltration to blossom while simultaneously expecting to be immune from their effects? Former CIA officers who went public, intrepid whistleblowers like Philip Agee and John Stockwell, answered this question with a resounding “no.” As would millions of people in third-world countries who suffered through the bloody proxy battles of the Cold War. For instance, Philip Agee stated in his book CIA Diary:
“When the Watergate trials end and the whole episode begins to fade, there will be a movement for national renewal, for reform of electoral practices, and perhaps even for reform of the FBI and the CIA. But the return to our cozy self-righteous traditions should lure no one into believing that the problem has been removed. Reforms attack symptoms rather than the disease.”
Hence it’s unsettling to hear Edward Snowden, despite his commendable admonishments for an open debate on mass surveillance, maintain the underlying legitimacy of government subterfuge:
“We can have secret programs. You know, the American people don’t have to know the name of every individual that’s under investigation. We don’t need to know the technical details of absolutely every program in the intelligence community. But we do have to know the bare and broad outlines of the powers our government is claiming … and how they affect us and how they affect our relationships overseas.”
You’re witnessing the power of framing the narrative. Society has been encouraged to discuss the legitimacy of what spies do and how they do it. But the problem with this well-intentioned dialogue is that “we the people” are led away from the more fundamental question of whether society needs spies and their covert ops to begin with.
Author’s Note: In the past I posed a question to Glenn Greenwald and was met with silence, which is exceptional behavior for someone famous for responding vocally. Now we’ll see how Mr. Snowden replies.
Bill Blunden is an independent investigator whose current areas of inquiry include information security, anti-forensics, and institutional analysis. He is the author of several books, including The Rootkit Arsenal and Behold a Pale Farce: Cyberwar, Threat Inflation, and the Malware-Industrial Complex. Bill is the lead investigator at Below Gotham Labs.
As we reported a few weeks ago, Australia has passed a dreadful “anti-terror” law that not only allows the authorities to monitor the entire Internet in that country with a single warrant, but also threatens 10 years of jail time for anyone who “recklessly” discloses information that relates to a “special intelligence operation.” But what exactly will that mean in practice? Elizabeth O’Shea, writing in the Overland journal, has put together a great article fleshing things out. Here’s her introduction:
The parliament has passed legislation that permits the Attorney General to authorise certain activities of ASIO and affiliates as ‘special intelligence operations’. We can only assume that ASIO will seek such authorisation when its operatives plan to break the criminal or civil law — the whole point of authorising an operation as a special intelligence operation is that participants will be immune from the consequences of their unlawfulness. It will also be a criminal act to disclose information about these operations.
So the Australian government can designate activities of its spy services as “special intelligence operations,” which may well be illegal, and then it becomes a criminal act to disclose anything about those operations, however bad they are. Indeed, that even seems to include operations that result in death, as O’Shea explains in one of her examples of what could happen under the new law:
A botched operation is conducted that results in the death of an innocent bystander (credit this suggestion to the former Independent National Security Legislation Monitor). Note that if a person with three children dies as a result of a failure to take reasonable care, her family will be unable to make a claim for the cost of raising her dependents. If she is maimed but not killed, she will be unable to make a claim for the cost of her medical care, lost earnings, pain and suffering, and the cost of raising her dependents.
That’s a hypothetical case, but O’Shea also lists a number of incidents that have already occurred, but which are likely to be covered by the new law — and would thus become impossible to write about. Here are a couple of them, with links to the real-life cases:
Agents and officers raid a couple in their home and hold them captive at gunpoint for an hour, only leaving when they discover they are at the wrong address. The couple will have no entitlement to compensation for any property or personal damage arising from imprisonment, trespass and assault.
Agents kidnap and falsely imprison a young medical student. They attempt to coerce answers from him, making threats that go beyond what is permitted by the relevant search warrant.
There’s more of the same, listing previously-reported incidents that would probably be censored in future. The post also explores legislative proposals that are equally disturbing:
The parliament is considering laws that will punish people with life imprisonment for a range of new offences associated with ‘subverting society’ (which is a component of the new definition of ‘engaging in hostile activities’). The law contains a defence of advocacy, protest, dissent or industrial action, but it is very unclear how these would be applied.
Here’s the kind of thing that might get you life imprisonment in Australia in the future:
Leaking materials taken from government information systems that demonstrate serious wrongdoing (as per Manning or Snowden).
Organising and engaging in denial of service attacks – the online equivalent of a sit in – against government websites, such as that of the President, Prime Minister, the Ministry of Industry, the Ministry of Foreign Affairs, and the Stock Exchange.
There’s also an explanation of what data retention might mean for the public. All in all, it’s a valuable guide to some of the seriously bad stuff that Australia is doing. Let’s just hope that other countries don’t take it as a blueprint.
The mainstream news media’s reaction to the new movie, “Kill the Messenger,” has been tepid. That is perhaps not surprising, given that the MSM comes across as the film’s most unsympathetic villain, crushing journalist Gary Webb for digging up the Contra-cocaine scandal in the mid-1990s after the major newspapers thought they had buried it in the 1980s.
Not that the movie is without other villains, including drug traffickers and “men in black” government agents. But the drug lords show some humanity and even honesty as they describe how they smuggled drugs and shared the proceeds with the Nicaraguan Contra rebels, President Ronald Reagan’s beloved “freedom fighters.”
By contrast, the news executives for the big newspapers, such as the Washington Post and the Los Angeles Times, come across as soulless careerists determined to maintain their cozy relations with the CIA’s press office and set on shielding their failure to take on this shocking scandal when it was playing out in the 1980s.
So, in the 1990s, they concentrated their fire on Webb for alleged imperfections in his investigative reporting rather than on U.S. government officials who condoned and protected the Contra drug trafficking as part of Reagan’s Cold War crusade.
Webb’s cowardly editors at the San Jose Mercury News also come across badly as frightened bureaucrats, cringing before the collective misjudgment of the MSM and crucifying their own journalist for the sin of challenging the media’s wrongheaded conventional wisdom.
That the MSM’s “group think” was upside-down should no longer be in doubt. In fact, the Contra-cocaine case was conclusively established as early as 1985 when Brian Barger and I wrote the first story on the scandal for the Associated Press. Our sourcing included some two dozen knowledgeable people, including Contras, Contra supporters, and U.S. government sources from the Drug Enforcement Administration and even Reagan’s National Security Council staff.
But the Reagan administration didn’t want to acknowledge this inconvenient truth, knowing it would sink the Contra war against Nicaragua’s leftist Sandinista government. So, after the AP story was published, President Reagan’s skillful propagandists mounted a counteroffensive that elicited help from editors and reporters at the New York Times, the Washington Post and other major news outlets.
Thus, in the 1980s, the MSM treated the Contra-cocaine scandal as a “conspiracy theory” when it actually was a very real conspiracy. The MSM’s smug and derisive attitude continued despite a courageous investigation headed by Sen. John Kerry which, in 1989, confirmed the AP reporting and took the story even further. For his efforts, Newsweek dubbed Kerry “a randy conspiracy buff.”
This dismissive treatment of the scandal even survived the narcotics trafficking trial of Panama’s Manuel Noriega in 1991 when the U.S. government called witnesses who implicated both Noriega and the Contras in the cocaine trade.
The Power of ‘Group Think’
What we were seeing was the emerging power of the MSM’s “group think,” driven by conformity and careerism and resistant to both facts and logic. Once all the “smart people” of Official Washington reached a conclusion – no matter how misguided – that judgment would be defended at nearly all costs, since none of these influential folks wanted to admit error.
That’s what Gary Webb ran into in 1996 when he revived the Contra-cocaine scandal by focusing on the devastation that one Contra drug pipeline caused by feeding into the production of crack cocaine. However, for the big newspapers to admit they had ducked such an important story – and indeed had aided in the government’s cover-up – would be devastating to their standing.
So, the obvious play was to nitpick Webb’s reporting and to destroy him personally, which is what the big newspapers did and what “Kill the Messenger” depicts. The question today is: how will the MSM react to this second revival of the Contra-cocaine scandal?
Of the movie reviews that I read, a few were respectful, including the one in the Los Angeles Times where Kenneth Turan wrote: “The story Webb related in a series of articles … told a still-controversial tale that many people did not want to hear: that elements in the CIA made common cause with Central American drug dealers and that money that resulted from cocaine sales in the U.S. was used to arm the anti-communist Contras in Nicaragua.
“Although the CIA itself confirmed, albeit years later, that this connection did in fact exist, journalists continue to argue about whether aspects of Webb’s stories overreached.”
A normal person might wonder why – if the CIA itself admitted (as it did) that it was collaborating with drug dealers – journalists would still be debating whether Webb may have “overreached” (although in reality he actually understated the problem). Talk about missing “the lede” or the forest for the trees.
What kind of “journalist” obsesses over dissecting the work of another journalist while the U.S. government gets away with aiding and abetting drug traffickers?
Turan went on to note the same strange pattern in 1996 after Webb’s series appeared: “what no one counted on was that the journalistic establishment — including elite newspapers such as the Los Angeles Times — would attempt to discredit Webb’s reporting. The other newspapers questioned the shakier parts of his story, proving the truth of what one of Webb’s sources tells him: ‘You get the most flak when you’re right above the target.’”
However, other reviews, including those in the New York Times and the Washington Post, continued the snarky tone that pervaded the sneering treatment of Webb that hounded him out of journalism in 1997 and ultimately drove him to suicide in 2004. For instance, the headline in the Post’s weekend section was “Sticking with Webb’s Story,” as in the phrase “That’s my story and I’m sticking to it.”
The review by Michael O’Sullivan stated: “Inspired by the true story of Gary Webb — the San Jose Mercury News reporter known for a controversial series of articles suggesting a link between the CIA, the California crack epidemic and the Nicaraguan Contras — this slightly overheated drama begins and ends with innuendo. In between is a generous schmear of insinuation.”
You get the point. The allegations, which have now been so well-established that even the CIA admits to them, are “controversial” and amount to “innuendo” and “insinuation.”
Similarly, the New York Times review by Manohla Dargis disparaged Webb’s “Dark Alliance” series as “much-contested,” which may be technically accurate but fails to recognize that the core allegations of Contra-cocaine trafficking and U.S. government complicity were true – something an earlier article by Times’ media writer David Carr at least had the decency to acknowledge. [See Consortiumnews.com’s “NYT’s Belated Admission on Contra-Cocaine.”]
In a different world, the major newspapers would have taken the opening created by “Kill the Messenger” to make amends for their egregious behavior in the 1980s – in discrediting the scandal when the criminality could have been stopped – and for their outrageous actions in the 1990s in destroying the life and career of Gary Webb. But it appears the big papers mostly plan to hunker down and pretend they did nothing wrong.
For those interested in the hard evidence proving the reality of the Contra-cocaine scandal, I posted a Special Report on Friday detailing much of what we know and how we know it. [See Consortiumnews.com’s “The Sordid Contra-Cocaine Saga.”]
As for “Kill the Messenger,” I had the pleasure of watching it on Friday night with my old Associated Press colleague Brian Barger – and we both were impressed by how effectively the movie-makers explained a fairly complicated tale about drugs and politics. The personal story was told with integrity, aided immensely by Jeremy Renner’s convincing portrayal of Webb.
There were, of course, some Hollywood fictional flourishes for dramatic purposes. And it was a little weird hearing my cautionary advice to Webb – delivered when we talked before his “Dark Alliance” series was published in 1996 – being put into the mouth of a fictional Kerry staffer.
But those are minor points. What was truly remarkable about this movie was that it was made at all. Over the past three decades, many directors and screenwriters have contemplated telling the sordid story of Contra-cocaine trafficking but all have failed to get the projects “green-lighted.”
The conventional wisdom in Hollywood has been that such a movie would be torn apart by the major media just as Webb’s series (and before that the AP articles and Kerry’s report) were. But so far the MSM has largely held its fire against “Kill the Messenger,” relying on a few snide asides and knowing smirks.
Perhaps the MSM simply assumes that the old conventional wisdom will hold and that the movie will soon be forgotten. Or maybe there’s been a paradigm shift – and the MSM realizes that its credibility is shot (especially after its catastrophic performance regarding Iraq’s WMD) and it is losing its power to dictate false narratives to the American people.
[To learn how you can hear a December 1996 joint appearance at which Robert Parry and Gary Webb discuss their reporting, click here.]
Investigative reporter Robert Parry broke many of the Iran-Contra stories for The Associated Press and Newsweek in the 1980s. You can buy his new book, America’s Stolen Narrative, either in print here or as an e-book (from Amazon and barnesandnoble.com).
A US watchdog is asking why 16 planes bought for the Afghan Air Force at a cost of almost $500 million were turned into scrap metal valued at just $32,000, and why hundreds of millions of taxpayer dollars were wasted on the project.
The military transport planes had been sitting at Kabul International Airport for years, before they were sent for scrap. John Sopko, the Special Inspector General for Afghanistan Reconstruction (SIGAR), wants to know why the money was wasted. According to Reuters, he had asked Air Force Secretary Deborah James to keep a record of all decisions concerning the destruction of the 16 C-27J planes.
Sopko also wants to know what will happen to another four transport planes currently stored at the US Air Force base in Ramstein, Germany.
“I am concerned that the officials responsible for planning and executing the scrapping of the planes may not have considered other possible alternatives in order to salvage taxpayer dollars,” Sopko said.
The 20 planes were bought from Alenia, which is part of the Italian aircraft company Finmeccanica SpA. However, according to a SIGAR letter sent to US Defense Secretary Chuck Hagel, the program was ended in March 2013, “after sustained, serious performance, maintenance, and spare parts problems and the planes were grounded,” ABC reported.
By January 2013, according to Sopko’s office, the aircraft were not airworthy, having flown a total of only 234 of the 4,500 hours required during the nine months from January through September 2012. Sopko’s office also said that a further $200 million was needed to buy spare parts.
The Defense Logistics Agency was responsible for destroying the planes and Sopko now wants to know if any of the parts of the planes were sold before they were sent for scrap metal.
Major Bradlee Avots, a Pentagon spokesman, said that the 16 aircraft at Kabul International Airport had been destroyed “to minimize impact on drawdown of US Forces in Afghanistan,” and added that more information would be released after a review. The US government is currently scaling its military personnel in Afghanistan down from around 26,000 to a force of just under 10,000, who will stay in a mainly advisory role.
Avots also said that the US Department of Defense and the US Air Force were still deciding what to do with the four aircraft in Germany.
SIGAR has been investigating possible wasteful spending on warplanes since the end of 2013, following questions raised by military officials and non-profit organizations.
Sopko has said that he does not know if wasteful plane procurement was due to any criminal malice or was just mismanagement, but that the “scrap metal” incident in Afghanistan was not an isolated case.
In June, despite Afghanistan being a landlocked country, a US government watchdog found that the Pentagon spent more than $3 million obtaining eight patrol boats that were never used. Additionally, the cost of each boat turned out to be about $375,000 – far more than the $50,000 they usually sell for in the US.
During his investigation, Sopko said that records related to the purchase and cancelation of the patrol boats were severely lacking, and his questions to the military have not resulted in adequate answers.
“The military has been unable to provide records that would answer the most basic questions surrounding this $3 million purchase,” his office told the Washington Post in a statement in June.
Software giant Adobe has been accused of spying on individuals who use its Digital Editions e-book and PDF reader. The practice allegedly includes mining data on users’ PCs, yet Adobe has denied acting beyond the user license agreement.
On Tuesday, the allegation that the Digital Editions (DE) software logs and uploads user data to Adobe’s servers was verified by Ars Technica and a competing software developer at Safari Books. The process is also notable because the data is sent unencrypted over the internet, meaning individuals, internet corporations, and government agencies like the National Security Agency can easily intercept the information.
Whether or not the company also monitors user hard drives in general has yet to be confirmed.
“It’s not clear how the data collected by Adobe is stored, but it is associated with a unique identifier for each Digital Editions installation that can be associated with an Internet Protocol address when logged,” Sean Gallagher wrote at Ars Technica. “And the fact that the data is broadcast in the clear by Digital Editions is directly in conflict with the privacy guidelines of many library systems, which closely guard readers’ book loan data.”
Originally, Adobe was flagged by the Digital Reader for tracking and uploading data related to various books opened in DE, such as how long a book has been activated or opened, or what pages have been read.
“Adobe is gathering data on the eBooks that have been opened, which pages were read, and in what order,” Nate Hoffelder wrote at the website. “All of this data, including the title, publisher, and other metadata for the book is being sent to Adobe’s server in clear text.”
“Adobe is not only logging what users are doing,” he continued, “they’re also sending those logs to their servers in such a way that anyone running one of the servers in between can listen in and know everything.”
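Hoffelder’s point about cleartext logging can be made concrete. The sketch below is illustrative only, with invented field names, not Adobe’s actual payload format; it shows why a telemetry body sent without encryption is readable by every router, proxy, or eavesdropper along the network path:

```python
import json

# A hypothetical reading-log record of the kind described in the article.
# The field names here are invented for illustration.
record = {
    "book_title": "Example Title",
    "publisher": "Example House",
    "pages_read": [1, 2, 3],
    "seconds_open": 740,
}

# Sent over plain HTTP, the request body travels as readable bytes.
# Any intermediary on the path sees exactly this:
wire_bytes = json.dumps(record).encode("utf-8")
print(wire_bytes)
```

Over HTTPS the same payload would be wrapped in TLS before leaving the machine, so intermediaries would see only ciphertext. That is the kind of fix Adobe said it was working on when it promised an update for the unsecure transmission.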
If that wasn’t enough, Hoffelder claimed that Adobe’s tracking systems are exploring data even beyond the DE reader, scanning users’ computer hard drives and collecting and uploading metadata related to every e-book in the system – whether they were opened in DE or not.
As previously mentioned, this last accusation has not been verified.
“Adobe Digital Editions does not scan your entire computer looking for files that it knows how to open, it needs to be explicitly told about EPUB or PDF files that you would like it to know about,” an Adobe tech support employee wrote earlier this year in response to a question on the community forum.
Utilized by thousands of libraries to lend out books digitally, DE’s tracking of activation times allows libraries to know when a particular lending period has run its course. However, DE is not just tracking borrowed books; it is keeping tabs on purchased titles as well.
“We are looking at this, and very concerned about this,” Deborah Caldwell-Stone, deputy director of the American Library Association’s Office for Intellectual Freedom, told Ars Technica. If the data being uploaded over the internet is related to library lending, “we would want this information encrypted and private,” she added.
Meanwhile, Adobe said that “all information collected from the user is collected solely for purposes such as license validation and to facilitate the implementation of different licensing models by publishers.”
“In terms of the unsecure transmission of the collected data, Adobe is in the process of working on an update to address this issue,” the spokesperson said in an email to Ars Technica. “We will notify you when a date for this update has been determined.”
Program in Atmospheres, Oceans and Climate, Massachusetts Institute of Technology (MIT), and Global Research, 30 November 2009
For a variety of inter-related cultural, organizational, and political reasons, progress in climate science and the actual solution of scientific problems in this field have moved at a much slower rate than would normally be possible.
Not all these factors are unique to climate science, but the heavy influence of politics has served to amplify the role of the other factors. By cultural factors, I primarily refer to the change in the scientific paradigm from a dialectic opposition between theory and observation to an emphasis on simulation and observational programs. The latter serves to almost eliminate the dialectical focus of the former.
Whereas the former had the potential for convergence, the latter is much less effective. The institutional factor has many components. One is the inordinate growth of administration in universities and the consequent increase in importance of grant overhead. This leads to an emphasis on large programs that never end. Another is the hierarchical nature of formal scientific organizations whereby a small executive council can speak on behalf of thousands of scientists as well as govern the distribution of ‘carrots and sticks’ whereby reputations are made and broken. The above factors are all amplified by the need for government funding.
When an issue becomes a vital part of a political agenda, as is the case with climate, then the politically desired position becomes a goal rather than a consequence of scientific research. This paper will deal with the origin of the cultural changes and with specific examples of the operation and interaction of these factors. In particular, we will show how political bodies act to control scientific institutions, how scientists adjust both data and even theory to accommodate politically correct positions, and how opposition to these positions is disposed of.
Although the focus of this paper is on climate science, some of the problems pertain to science more generally. Science has traditionally been held to involve the creative opposition of theory and observation wherein each tests the other in such a manner as to converge on a better understanding of the natural world. Success was rewarded by recognition, though the degree of recognition was weighted according to both the practical consequences of the success and the philosophical and aesthetic power of the success. As science undertook more ambitious problems, and the cost and scale of operations increased, the need for funds undoubtedly shifted emphasis to practical relevance though numerous examples from the past assured a strong base level of confidence in the utility of science. Moreover, the many success stories established ‘science’ as a source of authority and integrity. Thus, almost all modern movements claimed scientific foundations for their aims. Early on, this fostered a profound misuse of science, since science is primarily a successful mode of inquiry rather than a source of authority.
Until the post World War II period, little in the way of structure existed for the formal support of science by government (at least in the US which is where my own observations are most relevant). In the aftermath of the Second World War, the major contributions of science to the war effort (radar, the A-bomb), to health (penicillin), etc. were evident. Vannevar Bush (in his report, Science: The Endless Frontier, 1945) noted the many practical roles that validated the importance of science to the nation, and argued that the government need only adequately support basic science in order for further benefits to emerge. The scientific community felt this paradigm to be an entirely appropriate response by a grateful nation. The next 20 years witnessed truly impressive scientific productivity which firmly established the United States as the creative center of the scientific world.
The Bush paradigm seemed amply justified. (This period and its follow-up are also discussed by Miller, 2007, with special but not total emphasis on the NIH (National Institutes of Health).) However, something changed in the late 60’s. In a variety of fields it has been suggested that the rate of new discoveries and achievements slowed appreciably (despite increasing publications), and it is being suggested that either the Bush paradigm ceased to be valid or that it may never have been valid in the first place. I believe that the former is correct. What then happened in the 1960’s to produce this change?
It is my impression that by the end of the 60’s scientists, themselves, came to feel that the real basis for support was not gratitude (and the associated trust that support would bring further benefit) but fear: fear of the Soviet Union, fear of cancer, etc. Many will conclude that this was merely an awakening of a naive scientific community to reality, and they may well be right. However, between the perceptions of gratitude and fear as the basis for support lies a world of difference in incentive structure. If one thinks the basis is gratitude, then one obviously will respond by contributions that will elicit more gratitude. The perpetuation of fear, on the other hand, militates against solving problems. This change in perception proceeded largely without comment. However, the end of the Cold War, by eliminating a large part of the fear base, forced a reassessment of the situation. Most thinking has been devoted to the emphasis of other sources of fear: competitiveness, health, resource depletion and the environment.
What may have caused this change in perception is unclear, because so many separate but potentially relevant things occurred almost simultaneously. For one, the space race reinstituted the model of large scale focused efforts such as the moon landing program. For another, the 60’s saw the first major postwar funding cuts for science in the US. The budgetary pressures of the Vietnam War may have demanded savings someplace, but the fact that science was regarded as, to some extent, dispensable, came as a shock to many scientists. So did the massive increase in management structures and bureaucracy which took control of science out of the hands of working scientists. All of this may be related to the demographic pressures resulting from the baby boomers entering the workforce and the post-Sputnik emphasis on science. Sorting this out goes well beyond my present aim, which is merely to consider the consequences of fear as a perceived basis of support.
Fear has several advantages over gratitude. Gratitude is intrinsically limited, if only by the finite creative capacity of the scientific community. Moreover, as pointed out by a colleague at MIT, appealing to people’s gratitude and trust is usually less effective than pulling a gun. In other words, fear can motivate greater generosity. Sputnik provided a notable example in this regard; though it did not immediately alter the perceptions of most scientists, it did lead to a great increase in the number of scientists, which contributed to the previously mentioned demographic pressure. Science since the sixties has been characterized by the large programs that this generosity encourages.
Moreover, the fact that fear provides little incentive for scientists to do anything more than perpetuate problems, significantly reduces the dependence of the scientific enterprise on unique skills and talents. The combination of increased scale and diminished emphasis on unique talent is, from a certain point of view, a devastating combination which greatly increases the potential for the political direction of science, and the creation of dependent constituencies. With these new constituencies, such obvious controls as peer review and detailed accountability, begin to fail and even serve to perpetuate the defects of the system. Miller (2007) specifically addresses how the system especially favors dogmatism and conformity.
The creation of the government bureaucracy, and the increasing body of regulations accompanying government funding, called, in turn, for a massive increase in the administrative staff at universities and research centers. The support for this staff comes from the overhead on government grants, and, in turn, produces an active pressure for the solicitation of more and larger grants.
One result of the above appears to have been the deemphasis of theory because of its intrinsic difficulty and small scale, the encouragement of simulation instead (with its call for large capital investment in computation), and the encouragement of large programs unconstrained by specific goals. In brief, we have the new paradigm where simulation and programs have replaced theory and observation, where government largely determines the nature of scientific activity, and where the primary role of professional societies is the lobbying of the government for special advantage.
This new paradigm for science and its dependence on fear-based support may not constitute corruption per se, but it does serve to make the system particularly vulnerable to corruption. Much of the remainder of this paper will illustrate the exploitation of this vulnerability in the area of climate research. The situation is particularly acute for a small, weak field like climatology. As a field, it has traditionally been a subfield within such disciplines as meteorology, oceanography, geography, geochemistry, etc. These fields themselves are small and immature. At the same time, these fields can be trivially associated with natural disasters. Finally, climate science has been targeted by a major political movement, environmentalism, as the focus of its efforts, wherein the natural disasters of the earth system have come to be identified with man’s activities – engendering fear as well as an agenda for societal reform and control. The remainder of this paper will briefly describe how this has been playing out with respect to the climate issue.
2. Conscious Efforts to Politicize Climate Science
The above described changes in scientific culture were both the cause and effect of the growth of ‘big science,’ and the concomitant rise in importance of large organizations. However, all such organizations – whether professional societies, research laboratories, advisory bodies (such as the national academies), government departments and agencies (including NASA, NOAA, EPA, NSF, etc.), or even universities – are hierarchical structures where positions and policies are determined by small executive councils or even single individuals. This greatly facilitates any conscious effort to politicize science via influence in such bodies, where a handful of individuals (often not even scientists) speak on behalf of organizations that include thousands of scientists, and even enforce specific scientific positions and agendas. The temptation to politicize science is overwhelming and longstanding. Public trust in science has always been high, and political organizations have long sought to improve their own credibility by associating their goals with ‘science’ – even if this involves misrepresenting the science.
Professional societies represent a somewhat special case. Originally created to provide a means for communication within professions – organizing meetings and publishing journals – they also provided, in some instances, professional certification, and public outreach. The central offices of such societies were scattered throughout the US, and rarely located in Washington. Increasingly, however, such societies require impressive presences in Washington where they engage in interactions with the federal government. Of course, the nominal interaction involves lobbying for special advantage, but increasingly, the interaction consists in issuing policy and scientific statements on behalf of the society. Such statements, however, hardly represent independent representation of membership positions. For example, the primary spokesman for the American Meteorological Society in Washington is Anthony Socci who is neither an elected official of the AMS nor a contributor to climate science. Rather, he is a former staffer for Al Gore.
Returning to the matter of scientific organizations, we find a variety of patterns of influence. The most obvious to recognize (though frequently kept from public view) consists in prominent individuals within the environmental movement simultaneously holding and using influential positions within the scientific organization. Thus, John Firor long served as administrative director of the National Center for Atmospheric Research in Boulder, Colorado. This position was purely administrative, and Firor did not claim any scientific credentials in the atmospheric sciences at the time I was on the staff of NCAR. However, I noticed that beginning in the 1980s, Firor was frequently speaking on the dangers of global warming as an expert from NCAR. When Firor died last November, his obituary noted that he had also been Board Chairman at Environmental Defense – a major environmental advocacy group – from 1975 to 1980.
The UK Meteorological Office also has a board, and its chairman, Robert Napier, was previously the Chief Executive for World Wildlife Fund – UK. Bill Hare, a lawyer and Campaign Director for Greenpeace, frequently speaks as a ‘scientist’ representing the Potsdam Institute, Germany’s main global warming research center. John Holdren, who currently directs the Woods Hole Research Center (an environmental advocacy center not to be confused with the far better known Woods Hole Oceanographic Institution, a research center), is also a professor in Harvard’s Kennedy School of Government, and has served as president of the American Association for the Advancement of Science among numerous other positions including serving on the board of the MacArthur Foundation from 1991 until 2005. He was also a Clinton-Gore Administration spokesman on global warming.
The granting of academic appointments to global warming alarmists is hardly a unique occurrence. The case of Michael Oppenheimer is noteworthy in this regard. With few contributions to climate science (his postdoctoral research was in astro-chemistry), and none to the physics of climate, Oppenheimer became the Barbra Streisand Scientist at Environmental Defense. He was subsequently appointed to a professorship at Princeton University, and is now regularly referred to as a prominent climate scientist by Oprah (a popular television hostess), NPR (National Public Radio), etc. To be sure, Oppenheimer did coauthor an early, absurdly alarmist volume (Oppenheimer and Robert Boyle, 1990: Dead Heat, The Race Against the Greenhouse Effect), and he has served as a lead author with the IPCC (Intergovernmental Panel on Climate Change).
One could go on at some length with such examples, but a more common form of infiltration consists in simply getting a couple of seats on the council of an organization (or on the advisory panels of government agencies). This is sufficient to veto any statements or decisions that the infiltrators are opposed to. Eventually, this enables the production of statements supporting their position – if only as a quid pro quo for permitting other business to get done. Sometimes, as in the production of the 1993 report of the NAS, Policy Implications of Global Warming, the environmental activists, having largely gotten their way in the preparation of the report where they were strongly represented as ‘stakeholders,’ decided, nonetheless, to issue a minority statement suggesting that the NAS report had not gone ‘far enough.’ The influence of the environmental movement has effectively made support for global warming not only a core element of political correctness, but also a requirement for the numerous prizes and awards given to scientists. That said, when it comes to professional societies, there is often no need at all for overt infiltration since issues like global warming have become a part of both political correctness and (in the US) partisan politics, and there will usually be council members who are committed in this manner.
The situation with America’s National Academy of Science is somewhat more complicated. The Academy is divided into many disciplinary sections whose primary task is the nomination of candidates for membership in the Academy. Typically, support by more than 85% of the membership of any section is needed for nomination. However, once a candidate is elected, the candidate is free to affiliate with any section. The vetting procedure is generally rigorous, but for over 20 years, there was a Temporary Nominating Group for the Global Environment to provide a back door for the election of candidates who were environmental activists, bypassing the conventional vetting procedure. Members, so elected, proceeded to join existing sections where they hold a veto power over the election of any scientists unsympathetic to their position. Moreover, they are almost immediately appointed to positions on the executive council, and other influential bodies within the Academy. One of the members elected via the Temporary Nominating Group, Ralph Cicerone, is now president of the National Academy. Prior to that, he was on the nominating committee for the presidency. It should be added that there is generally only a single candidate for president. Others elected to the NAS via this route include Paul Ehrlich, James Hansen, Steven Schneider, John Holdren and Susan Solomon.
It is, of course, possible to corrupt science without specifically corrupting institutions. For example, the environmental movement often cloaks its propaganda in scientific garb without the aid of any existing scientific body. One technique is simply to give a name to an environmental advocacy group that will suggest to the public that the group is a scientific rather than an environmental group. Two obvious examples are the Union of Concerned Scientists and the Woods Hole Research Center [9,10]. The former conducted an intensive advertising campaign about ten years ago in which they urged people to look to them for authoritative information on global warming.
This campaign did not get very far – if only because the Union of Concerned Scientists had little or no scientific expertise in climate. A possibly more effective attempt along these lines occurred in the wake of Michael Crichton’s best-selling adventure, State of Fear, which pointed out the questionable nature of the global warming issue, as well as the dangers to society arising from the exploitation of this issue. Environmental Media Services (a project of Fenton Communications, a large public relations firm serving left-wing and environmental causes; they were responsible for the Alar scare as well as Cindy Sheehan’s anti-war campaign) created a website, realclimate.org, as an ‘authoritative’ source for the ‘truth’ about climate. This time, real scientists who were also environmental activists were recruited to organize this web site and ‘discredit’ any science or scientist that questioned catastrophic anthropogenic global warming.
The web site serves primarily as a support group for believers in catastrophe, constantly reassuring them that there is no reason to reduce their worrying. Of course, even the above represent potentially unnecessary complexity compared to the longstanding technique of simply publicly claiming that all scientists agree with whatever catastrophe is being promoted. Newsweek already made such a claim in 1988. Such a claim serves at least two purposes. First, the bulk of the educated public is unable to follow scientific arguments; ‘knowing’ that all scientists agree relieves them of any need to do so. Second, such a claim serves as a warning to scientists that the topic at issue is a bit of a minefield that they would do well to avoid.
The myth of scientific consensus is also perpetuated on Wikipedia, where climate articles are vetted by William Connolley, who regularly runs for office in England as a Green Party candidate. No deviation from the politically correct line is permitted.
Perhaps the most impressive exploitation of climate science for political purposes has been the creation of the Intergovernmental Panel on Climate Change (IPCC) by two UN agencies, UNEP (United Nations Environmental Program) and WMO (World Meteorological Organization), and the agreement of all major countries at the 1992 Rio Conference to accept the IPCC as authoritative. Formally, the IPCC summarizes the peer-reviewed literature on climate every five years. On the face of it, this is an innocent and straightforward task. One might reasonably wonder why it takes hundreds of scientists five years of constant travelling throughout the world in order to perform this task. The charge to the IPCC is not simply to summarize, but rather to provide the science with which to support the negotiating process whose aim is to control greenhouse gas levels. This is a political rather than a scientific charge. That said, the participating scientists have some leeway in which to reasonably describe matters, since the primary document that the public associates with the IPCC is not the extensive report prepared by the scientists, but rather the Summary for Policymakers, which is written by an assemblage of representatives from governments and NGOs, with only a small scientific representation [11, 12].
3. Science in the service of politics
Given the above, it would not be surprising if working scientists made special efforts to support the global warming hypothesis. There is ample evidence that this is happening on a large scale. A few examples will illustrate this situation. Data that challenge the hypothesis are simply changed. In some instances, data that were thought to support the hypothesis are found not to, and are then changed. The changes are sometimes quite blatant, but more often are somewhat more subtle. The crucial point is that geophysical data are almost always at least somewhat uncertain, and methodological errors are constantly being discovered. Bias can be introduced by simply considering only those errors that change answers in the desired direction. The desired direction in the case of climate is to bring the data into agreement with models, even though the models have displayed minimal skill in explaining or predicting climate. Model projections, it should be recalled, are the basis for our greenhouse concerns. That corrections to climate data should be called for is not at all surprising, but that such corrections should always be in the ‘needed’ direction is exceedingly unlikely. Although the situation suggests overt dishonesty, it is entirely possible, in today’s scientific environment, that many scientists feel that it is the role of science to vindicate the greenhouse paradigm for climate change as well as the credibility of models. Comparisons of models with data are, for example, referred to as model validation studies rather than model tests.
The first two examples involve paleoclimate simulations and reconstructions. Here, the purpose has been to show that both the models and the greenhouse paradigm can explain past climate regimes, thus lending confidence to the use of both to anticipate future changes. In both cases (the Eocene about 50 million years ago, and the Last Glacial Maximum about 18 thousand years ago), the original data were in conflict with the greenhouse paradigm as implemented in current models, and in both cases, lengthy efforts were made to bring the data into agreement with the models.
In the first example, the original data analysis for the Eocene (Shackleton and Boersma, 1981) showed the polar regions to have been so much warmer than the present that a type of alligator existed on Spitzbergen, as did flora and fauna in Minnesota that could not have survived frosts. At the same time, however, equatorial temperatures were found to be about 4K colder than at present. Barron (1987), in the first attempts to simulate the Eocene, assumed that the warming would be due to high levels of CO2, and, using a climate GCM (General Circulation Model), obtained relatively uniform warming at all latitudes, with the meridional gradients remaining much as they are today. This behavior continues to be the case with current GCMs (Huber, 2008). As a result, paleoclimatologists have devoted much effort to ‘correcting’ their data, but, until very recently, they were unable to bring temperatures at the equator higher than today’s (Schrag, 1999; Pearson et al, 2000). However, the latest paper (Huber, 2008) suggests that the data no longer constrain equatorial temperatures at all, and any values may have existed. All of this is quite remarkable since there is now evidence that current meridional distributions of temperature depend critically on the presence of ice, and that the model behavior results from improper tuning wherein present distributions remain even when ice is absent.
The second example begins with the results of a major attempt to observationally reconstruct the global climate of the last glacial maximum (CLIMAP, 1976). Here it was found that although extratropical temperatures were much colder, equatorial temperatures were little different from today’s. There were immediate attempts to simulate this climate with GCMs and reduced levels of CO2. Once again the result was lower temperatures at all latitudes (Bush and Philander, 1998a,b), and once again, numerous efforts were made to ‘correct’ the data. After much argument, the current position appears to be that tropical temperatures may have been a couple of degrees cooler than today’s. However, papers appeared claiming far lower temperatures (Crowley, 2000). We will deal further with this issue in the next section where we describe papers that show that the climate associated with ice ages is well described by the Milankovich Hypothesis that does not call for any role for CO2.
Both of the above examples probably involved legitimate corrections, but only corrections that sought to bring observations into agreement with models were initially considered, thus avoiding the creative conflict between theory and data that has characterized the past successes of science. To be sure, however, the case of the Last Glacial Maximum shows that climate science still retains a capacity for self-correction.
The next example has achieved a much higher degree of notoriety than the previous two. In the first IPCC assessment (IPCC, 1990), the traditional picture of the climate of the past 1100 years was presented. In this picture, there was a medieval warm period that was somewhat warmer than the present, as well as a little ice age that was cooler. The presence of a period warmer than the present in the absence of any anthropogenic greenhouse gases was deemed an embarrassment for those holding that present warming could only be accounted for by the activities of man. Not surprisingly, efforts were made to get rid of the medieval warm period. (According to Deming, 2005, Jonathan Overpeck remarked in an email that one had to get rid of the medieval warm period. Overpeck is one of the organizers listed in Appendix 1.)
The most infamous effort was that due to Mann et al (1998, 1999), which used primarily a few handfuls of tree-ring records to obtain a reconstruction of Northern Hemisphere temperature going back eventually a thousand years that no longer showed a medieval warm period. Indeed, it showed a slight cooling for almost a thousand years, culminating in a sharp warming beginning in the nineteenth century. The curve came to be known as the hockey stick, and featured prominently in the next IPCC report, where it was then suggested that the present warming was unprecedented in the past 1000 years. The study immediately encountered severe questions concerning both the proxy data and its statistical analysis (interestingly, the most penetrating critiques came from outside the field: McIntyre and McKitrick, 2003, 2005a,b). This led to two independent assessments of the hockey stick (Wegman, 2006; North, 2006), both of which found the statistics inadequate for the claims. The story is given in detail in Holland (2007).
Since the existence of a medieval warm period is amply documented in historical accounts for the North Atlantic region (Soon et al, 2003), Mann et al countered that the warming had to be regional but not characteristic of the whole northern hemisphere. Given that an underlying assumption of their analysis was that the geographic pattern of warming had to have remained constant, this would have invalidated the analysis ab initio without reference to the specifics of the statistics. Indeed, the 4th IPCC assessment (IPCC, 2007) no longer featured the hockey stick, but the claim that current warming is unprecedented remains, and Mann et al’s reconstruction is still shown in Chapter 6 of that assessment, buried among other reconstructions. Here too, we will return to this matter briefly in the next section.
The fourth example is perhaps the strangest. For many years, the global mean temperature record showed cooling from about 1940 until the early 1970s. This, in fact, led to the concern for global cooling during the 1970s. The IPCC regularly, through the 4th assessment, boasted of the ability of models to simulate this cooling (while failing to emphasize that each model required a different specification of completely undetermined aerosol cooling in order to achieve this simulation (Kiehl, 2007)). Improvements in our understanding of aerosols are increasingly making such arbitrary tuning somewhat embarrassing, and, no longer surprisingly, the data have been ‘corrected’ to get rid of the mid-20th century cooling (Thompson et al, 2008). This may, in fact, be a legitimate correction (http://www.climateaudit.org/?p=3114). The embarrassment may lie in the continuous claims of modelers to have simulated the allegedly incorrect data.
The fifth example deals with the fingerprint of warming. It has long been noted that greenhouse warming is primarily centered in the upper troposphere (Lindzen, 1999) and, indeed, models show that the maximum rate of warming is found in the upper tropical troposphere (Lee et al, 2007). Lindzen (2007) noted that temperature data from both satellites and balloons failed to show such a maximum. This, in turn, permitted one to bound the greenhouse contribution to surface warming, and led to an estimate of climate sensitivity that was appreciably less than found in current models. Once the implications of the observations were clearly identified, it was only a matter of time before the data were ‘corrected.’ The first attempt came quickly (Vinnikov et al, 2006), wherein the satellite data were reworked to show large warming in the upper troposphere, but the methodology was too blatant for the paper to be commonly cited. There followed an attempt wherein the temperature data were rejected, and where temperature trends were inferred from wind data (Allen and Sherwood, 2008).
Over sufficiently long periods, there is a balance between vertical wind shear and meridional temperature gradients (the thermal wind balance), and, with various assumptions concerning boundary conditions, one can, indeed, infer temperature trends, but the process involves a more complex, indirect, and uncertain procedure than is involved in directly measuring temperature. Moreover, as Pielke et al (2008) have noted, the results display a variety of inconsistencies. They are nonetheless held to resolve the discrepancy with models. More recently, Solomon et al (2009) have claimed further “corrections” to the data.
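The balance invoked here can be stated compactly. In pressure coordinates, with standard textbook notation (geostrophic zonal wind u_g, gas constant R, Coriolis parameter f) – this is the generic textbook relation, not a formula taken from the papers cited – the zonal component of the thermal wind balance reads:

```latex
% Thermal wind balance (zonal component, pressure coordinates):
% the vertical shear of the geostrophic wind is proportional to the
% meridional temperature gradient on a pressure surface.
\frac{\partial u_g}{\partial \ln p} \;=\; \frac{R}{f}
\left(\frac{\partial T}{\partial y}\right)_{p}
```

Inferring temperature trends from wind trends thus requires integrating this relation in the vertical, which is why boundary conditions and accumulated uncertainties enter in a way they do not when temperature is measured directly.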
The sixth example takes us into astrophysics. Since the 1970s, considerable attention has been given to something known as the Early Faint Sun Paradox. This paradox was first publicized by Sagan and Mullen (1972). They noted that the standard model for the sun robustly required that the sun brighten with time so that 2-3 billion years ago, it was about 30% dimmer than it is today (recall that a doubling of CO2 corresponds to only a 2% change in the radiative budget). One would have expected that the earth would have been frozen over, but the geological evidence suggested that the ocean was unfrozen. Attempts were made to account for this by an enhanced greenhouse effect. Sagan and Mullen (1972) suggested an ammonia rich atmosphere might work. Others suggested an atmosphere with as much as several bars of CO2 (recall that currently CO2 is about 380 parts per million of a 1 bar atmosphere).
Finally, Kasting and colleagues tried to resolve the paradox with large amounts of methane. For a variety of reasons, all these efforts were deemed inadequate (Haqqmisra et al, 2008). There followed a remarkable attempt to get rid of the standard model of the sun (Sackmann and Boothroyd, 2003). This is not exactly the same as altering the data, but the spirit is the same. The paper claimed to have gotten rid of the paradox. However, in fact, the altered model still calls for substantial brightening, and, moreover, does not seem to have gotten much acceptance among solar modelers.
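The magnitudes involved in the paradox can be checked with round numbers. The figures below (solar constant, planetary albedo, canonical doubled-CO2 forcing) are standard textbook values assumed for illustration; they are not taken from the papers cited:

```python
# Back-of-envelope magnitudes behind the Faint Sun Paradox comparison.
# Assumed round numbers: solar constant ~1361 W/m^2, planetary albedo
# ~0.30, canonical doubled-CO2 radiative forcing ~3.7 W/m^2.
S0 = 1361.0          # W/m^2, present-day solar constant
ALBEDO = 0.30        # planetary albedo
F_2XCO2 = 3.7        # W/m^2, forcing from a doubling of CO2

absorbed = S0 / 4.0 * (1.0 - ALBEDO)   # ~238 W/m^2 mean absorbed solar
co2_fraction = F_2XCO2 / absorbed      # ~0.016, i.e. roughly the "2%"
                                       # of the budget cited in the text

faint_sun_deficit = 0.30 * absorbed    # ~71 W/m^2 shortfall for a 30% dimmer sun
doublings_equivalent = faint_sun_deficit / F_2XCO2   # ~19 CO2 doublings
```

The deficit from a 30% dimmer sun is thus of order seventy watts per square meter – naively equivalent to some nineteen CO2 doublings if the per-doubling forcing held (it does not at such concentrations) – which conveys why the proposed greenhouse resolutions invoked such enormous amounts of ammonia, CO2, or methane.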
My last specific example involves the social sciences. Given that it has been maintained since at least 1988 that all scientists agree about alarming global warming, it is embarrassing to have scientists objecting to the alarm. To ‘settle’ the matter, a certain Naomi Oreskes published a paper in Science (Oreskes, 2004) purporting to have surveyed the literature and not to have found a single paper questioning the alarm (Al Gore offers this study as proof of his own correctness in An Inconvenient Truth). Both Benny Peiser (a British sociologist) and Dennis Bray (an historian of science) noted obvious methodological errors, but Science refused to publish these rebuttals, with no regard for the technical merits of the criticisms presented. Moreover, Oreskes was a featured speaker at the celebration of Spencer Weart’s thirty years as head of the American Institute of Physics’ Center for History of Physics. Weart, himself, had written a history of the global warming issue (Weart, 2003) where he repeated, without checking, the slander taken from a screed by Ross Gelbspan (The Heat is On) in which I was accused of being a tool of the fossil fuel industry. Weart also writes with glowing approval of Gore’s An Inconvenient Truth. As far as Oreskes’ claim goes, it is clearly absurd. A more carefully done study revealed a very different picture (Schulte, 2007).
The above examples do not include the most convenient means whereby nominal scientists can support global warming alarm: namely, the matter of impacts. Here, scientists who generally have no knowledge of climate physics at all are funded to assume the worst projections of global warming and imaginatively suggest the implications of such warming for whatever field they happen to be working in. This has led to bizarre claims that global warming will contribute to kidney stones, obesity, cockroaches, noxious weeds, sexual imbalance in fish, etc. The scientists who participate in such exercises quite naturally are supportive of the catastrophic global warming hypothesis despite their ignorance of the underlying science.
4. Pressures to inhibit inquiry and problem solving
It is often argued that in science the truth must eventually emerge. This may well be true, but, so far, attempts to deal with the science of climate change objectively have been largely forced to conceal such truths as may call into question global warming alarmism (even if only implicitly). The usual vehicle is peer review, and the changes imposed were often made in order to get a given paper published. Publication is, of course, essential for funding, promotion, etc. The following examples are but a few from cases that I am personally familiar with. These, almost certainly, barely scratch the surface. What is generally involved is simply the inclusion of an irrelevant comment supporting global warming accepted wisdom. When the substance of the paper is described, it is generally claimed that the added comment represents the ‘true’ intent of the paper. In addition to the following examples, Appendix 2 offers excellent examples of ‘spin control.’
As mentioned in the previous section, one of the reports assessing the Mann et al Hockey Stick was prepared by a committee of the US National Research Council (a branch of the National Academy) chaired by Gerald North (North, 2006). The report concluded that the analysis used was totally unreliable for periods more than about 400 years ago. In point of fact, the only basis for the 400-year choice was that this brought one to the midst of the Little Ice Age, and there is essentially nothing surprising about a conclusion that we are now warmer. Still, without any basis at all, the report also concluded that despite the inadequacy of the Mann et al analysis, the conclusion might still be correct. It was this baseless conjecture that received most of the publicity surrounding the report.
In a recent paper, Roe (2006) showed that the orbital variations in high latitude summer insolation correlate excellently with changes in glaciation – once one relates the insolation properly to the rate of change of glaciation rather than to the glaciation itself. This provided excellent support for the Milankovich hypothesis. Nothing in the brief paper suggested the need for any other mechanism. Nonetheless, Roe apparently felt compelled to include an irrelevant caveat stating that the paper had no intention of ruling out a role for CO2.
Choi and Ho (2006, 2008) published interesting papers on the optical properties of high tropical cirrus that largely confirmed earlier results by Lindzen, Chou and Hou (2001) on an important negative feedback (the iris effect – something that we will describe later in this section) that would greatly reduce the sensitivity of climate to increasing greenhouse gases. A proper comparison required that the results be normalized by a measure of total convective activity, and, indeed, such a comparison was made in the original version of Choi and Ho’s paper. However, reviewers insisted that the normalization be removed from the final version of the paper which left the relationship to the earlier paper unclear.
Horvath and Soden (2008) found observational confirmation of many aspects of the iris effect, but accompanied these results with a repetition of criticisms of the iris effect that were irrelevant and even contradictory to their own paper. The point, apparently, was to suggest that despite their findings, there might be other reasons to discard the iris effect. Later in this section, I will return to these criticisms. However, the situation is far from unique. I have received preprints of papers wherein support for the iris was found, but where this was omitted in the published version of the papers.
In another example, I had originally submitted a paper mentioned in the previous section (Lindzen, 2007) to American Scientist, the periodical of the scientific honorary society in the US, Sigma Xi, at the recommendation of a former officer of that society. There followed a year of discussions, with an editor, David Schneider, insisting that I find a coauthor who would illustrate why my paper was wrong. He argued that publishing something that contradicted the IPCC was equivalent to publishing a paper that claimed that ‘Einstein’s general theory of relativity is bunk.’ I suggested that it would be more appropriate for American Scientist to solicit a separate paper taking a view opposed to mine. This was unacceptable to Schneider, so I ended up publishing the paper elsewhere. Needless to add, Schneider has no background in climate physics. At the same time, a committee consisting almost entirely of environmental activists – led by Peter Raven, and including the ubiquitous John Holdren, Richard Moss, Michael MacCracken, and Rosina Bierbaum – was issuing a joint Sigma Xi – United Nations Foundation report (the latter organization headed by former Senator and former Undersecretary of State Tim Wirth and founded by Ted Turner) endorsing global warming alarm to a degree going far beyond the latest IPCC report. I should add that simple disagreement with conclusions of the IPCC has become a common basis for rejecting papers for publication in professional journals – as long as the disagreement suggests reduced alarm. An example will be presented later in this section.
Despite all the posturing about global warming, more and more people are becoming aware of the fact that global mean temperatures have not increased statistically significantly since 1995. One need only look at the temperature records posted on the web by the Hadley Centre. The way this is acknowledged in the literature forms a good example of the spin that is currently required to maintain global warming alarm. Recall that the major claim of the IPCC 4th Assessment was that there was a 90% certainty that most of the warming of the preceding 50 years was due to man (whatever that might mean). This required the assumption that what is known as natural internal variability (i.e., the variability that exists without any external forcing and represents the fact that the climate system is never in equilibrium) is adequately handled by the existing climate models.
The absence of any net global warming over the last dozen years or so suggests that this assumption may be wrong. Smith et al (2007) (Smith is with the Hadley Centre in the UK) acknowledged this by pointing out that initial conditions had to reflect the disequilibrium at some starting date, and that when these conditions were ‘correctly’ chosen, it was possible to better replicate the period without warming. This acknowledgment of error was accompanied by the totally unjustified assertion that global warming would resume with a vengeance in 2009. As 2009 approached and the vengeful warming seemed less likely to occur, a new paper came out (this time from the Max Planck Institute: Keenlyside et al, 2008) moving the date for the anticipated resumption of warming to 2015. It is indeed a remarkable step backwards for science to consider models that have failed to predict the observed behavior of the climate to nonetheless have the same validity as the data.
Tim Palmer, a prominent atmospheric scientist at the European Centre for Medium-Range Weather Forecasts, is quoted by Fred Pearce (Pearce, 2008) in the New Scientist as follows: “Politicians seem to think that the science is a done deal,” says Tim Palmer. “I don’t want to undermine the IPCC, but the forecasts, especially for regional climate change, are immensely uncertain.” Pearce, however, continues: “Palmer ... does not doubt that the Intergovernmental Panel on Climate Change (IPCC) has done a good job alerting the world to the problem of global climate change. But he and his fellow climate scientists are acutely aware that the IPCC’s predictions of how the global change will affect local climates are little more than guesswork. They fear that if the IPCC’s predictions turn out to be wrong, it will provoke a crisis in confidence that undermines the whole climate change debate. On top of this, some climate scientists believe that even the IPCC’s global forecasts leave much to be desired. …” Normally, one would think that undermining the credibility of something that is wrong is appropriate.
Even in the present unhealthy state of science, papers that are overtly contradictory to the catastrophic warming scenario do get published (though not without generally being substantially watered down during the review process). They are then often subject to the remarkable process of ‘discreditation.’ This process consists of immediately soliciting attack papers that are published quickly as independent articles rather than comments. The importance of this procedure is as follows. Normally such criticisms are published as comments, and the original authors are able to respond immediately following the comment; both the comment and the reply are published together. When the criticism is published as an independent article, however, the original authors’ reply appears as a correspondence, which is usually delayed by several months, and the critics are then permitted an immediate counter-reply. As a rule, the reply of the original authors is ignored in subsequent references.
In 2001, I published a paper (Lindzen, Chou and Hou) that used geostationary satellite data to suggest the existence of a strong negative feedback that we referred to as the Iris Effect. The gist of the feedback is that upper level stratiform clouds in the tropics arise by detrainment from cumulonimbus towers, that the radiative impact of the stratiform clouds is primarily in the infrared where they serve as powerful greenhouse components, and that the extent of the detrainment decreases markedly with increased surface temperature. The negative feedback resulted from the fact that the greenhouse warming due to the stratiform clouds diminished as the surface temperature increased, and increased as the surface temperature decreased – thus resisting the changes in surface temperature. The impact of the observed effect was sufficient to greatly reduce the model sensitivities to increasing CO2, and it was, moreover, shown that models failed to display the observed cloud behavior. The paper received an unusually intense review from four reviewers.
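The sign of such a feedback can be illustrated with a toy energy-balance calculation (a purely illustrative sketch, not the analysis of the paper; the function name and both coefficients are invented for the example):

```python
# Toy energy balance: an imposed forcing F is opposed by a linear
# restoring term lam*T, plus a cloud greenhouse term -g*T that
# weakens as the surface temperature anomaly T rises (the
# hypothesized iris behavior).  Both coefficients are invented.
def equilibrium_anomaly(forcing, lam=1.0, g=0.0):
    # Balance: forcing - g*T = lam*T  =>  T = forcing / (lam + g)
    return forcing / (lam + g)

no_iris = equilibrium_anomaly(1.0)           # no cloud term
with_iris = equilibrium_anomaly(1.0, g=0.5)  # greenhouse term shrinking with T

print(no_iris, with_iris)
```

Because the cloud greenhouse term shrinks as the surface warms, the equilibrium response to the same forcing is smaller; a cloud term of the opposite sign (negative g) would instead amplify the response.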
Once the paper appeared, the peer review editor of the Bulletin of the American Meteorological Society, Irwin Abrams, was replaced by a new editor, Jeffrey Rosenfeld (holding the newly created position of Editor in Chief). The new editor almost immediately accepted a paper criticizing ours (Hartmann and Michelsen, 2002), publishing it as a separate paper rather than as a response to our paper, which would have been the usual and appropriate procedure. In the usual procedure, the original authors are permitted to respond in the same issue; in the present case, the response was delayed by several months. Moreover, the new editor chose to label the criticism as follows: “Careful analysis of data reveals no shrinkage of tropical cloud anvil area with increasing SST.”
In fact, this criticism was easily dismissed. The claim of Hartmann and Michelsen was that the effect we observed was due to the intrusion of midlatitude non-convective clouds into the tropics. If this were true, then the effect should have diminished as one restricted observations more closely to the equator, but as we showed (Lindzen, Chou and Hou, 2002), exactly the opposite was found. There were also separately published papers (again violating the normal protocols allowing for immediate response) by Lin et al, 2002 and Fu, Baker and Hartmann, 2002, which criticized our paper by claiming that, since the instruments on the geostationary satellite could not see the thin stratiform clouds that formed the tails of the clouds we could see, we were not entitled to assume that the tails existed. Without the tails, the radiative impact of the clouds would be primarily in the visible, where the behavior we observed would lead to a positive feedback; with the tails, the effect is a negative feedback. The tails had long been observed, and the notion that they abruptly disappeared when not observed by an insufficiently sensitive sensor was absurd on the face of it; the use of better instruments by Choi and Ho (2006, 2008) confirmed the robustness of the tails and the strong dominance of the infrared impact. However, as we have already seen, virtually any mention of the iris effect tends to be accompanied by a reference to the criticisms, a claim that the theory is ‘discredited,’ and absolutely no mention of the responses. This is even required of papers that actually support the iris effect.
Vincent Courtillot et al (2007) encountered a similar problem. (Courtillot, it should be noted, is the director of the Institute for the Study of the Globe at the University of Paris.) They found that time series for magnetic field variations appeared to correlate well with temperature measurements – suggesting a possible non-anthropogenic source of forcing. This was immediately criticized by Bard and Delaygue (2008), and Courtillot et al were given the conventional right to reply, which they exercised in a reasonably convincing manner. What followed, however, was highly unusual. Raymond Pierrehumbert (a professor of meteorology at the University of Chicago and a fanatical environmentalist) posted a blog entry supporting Bard and Delaygue and accusing Courtillot et al of fraud, and worse. Alan Robock (a coauthor of Vinnikov et al mentioned in the preceding section) perpetuated the slander in a letter circulated to all officers of the American Geophysical Union. The matter was then taken up (in December of 2007) by major French newspapers (Le Monde, Libération, and Le Figaro) that treated Pierrehumbert’s defamation as fact. As in the previous case, all references to the work of Courtillot et al refer to it as ‘discredited’ and no mention is made of their response. Moreover, a major argument against the position of Courtillot et al is that it contradicted the claim of the IPCC.
In 2005, I was invited by Ernesto Zedillo to give a paper at a symposium he was organizing at his Center for Sustainability Studies at Yale. The stated topic of the symposium was Global Warming Policy After 2012, and the proceedings were to appear in a book to be entitled Global Warming: Looking Beyond Kyoto. Only two papers dealing with global warming science were presented: mine and one by Stefan Rahmstorf of the Potsdam Institute. The remaining papers all essentially assumed an alarming scenario and proceeded to discuss economics, impacts, and policy. Rahmstorf and I took opposing positions, but there was no exchange at the meeting, and Rahmstorf had to run off to another meeting. As agreed, I submitted the manuscript of my talk, but publication was interminably delayed, perhaps because of the presence of my paper. In any event, the Brookings Institution (a centrist Democratic Party think tank) agreed to publish the volume. When the volume finally appeared (Zedillo, 2008), I was somewhat shocked to see that Rahmstorf’s paper had been modified from what he presented, and had been turned into an attack not only on my paper but on me personally. I had received no warning of this; nor was I given any opportunity to reply. Inquiries to the editor and the publisher went unanswered. Moreover, the Rahmstorf paper was moved so that it immediately followed my paper. The reader is welcome to get a copy of the exchange, including my response, on my web site (Lindzen-Rahmstorf Exchange, 2008), and judge the exchange for himself.
One of the more bizarre tools of global warming revisionism is the posthumous alteration of skeptical positions.
Thus, the recent deaths of two active and professionally prominent skeptics, Robert Jastrow (the founding director of NASA’s Goddard Institute for Space Studies, now headed by James Hansen), and Reid Bryson (a well known climatologist at the University of Wisconsin) were accompanied by obituaries suggesting deathbed conversions to global warming alarm.
The death of another active and prominent skeptic, William Nierenberg (former director of the Scripps Institution of Oceanography), led to the creation of a Nierenberg Prize that is annually awarded to an environmental activist. The most recent recipient was James Hansen, whom Nierenberg detested.
Perhaps the most extraordinary example of this phenomenon involves a paper by Singer, Starr, and Revelle (1991). In this paper, it was concluded that we knew too little about climate to implement any drastic measures. Revelle, it may be recalled, was the professor whom Gore credits with introducing him to the horrors of CO2 induced warming. There followed an intense effort, led by a research associate at Harvard, Justin Lancaster, in coordination with Gore staffers, to have Revelle’s name posthumously removed from the published paper. It was claimed that Singer had pressured an old and incompetent man to allow his name to be used. To be sure, everyone who knew Revelle felt that he had been alert until his death. There followed a lawsuit by Singer, in which the court found in Singer’s favor. The matter is described in detail in Singer (2003).
Occasionally, prominent individual scientists do publicly express skepticism. The means for silencing them are fairly straightforward.
Will Happer, director of research at the Department of Energy (and a professor of physics at Princeton University), was simply fired from his government position after expressing doubts about environmental issues in general. His case is described in Happer (2003).
Michael Griffin, NASA’s administrator, publicly expressed reservations concerning global warming alarm in 2007. This was followed by a barrage of ad hominem attacks from individuals including James Hansen and Michael Oppenheimer. Griffin has since stopped making any public statements on this matter.
Freeman Dyson, an acknowledged great in theoretical physics, managed to publish a piece in the New York Review of Books (Dyson, 2008), where, in the course of reviewing books by Nordhaus and Zedillo (the latter having been referred to earlier), he expressed cautious support for the existence of substantial doubt concerning global warming. This was followed by a series of angry letters as well as condemnation on the realclimate.org web site, including ad hominem attacks. Given that Dyson is retired, however, there seems little more that global warming enthusiasts can do. However, we may hear of a deathbed conversion in the future.
5. Dangers for science and society
This paper has attempted to show how changes in the structure of scientific activity over the past half century have led to extreme vulnerability to political manipulation. In the case of climate change, these vulnerabilities have been exploited to a remarkable extent. The dangers that the above situation poses for both science and society are too numerous to be discussed in any sort of adequate way in this paper. It should be stressed that the climate change issue, itself, constitutes a major example of the dangers intrinsic to the structural changes in science.
As concerns the specific dangers pertaining to the climate change issue, we are already seeing that the tentative policy moves associated with ‘climate mitigation’ are contributing to deforestation, food riots, potential trade wars, inflation, energy speculation and overt corruption as in the case of ENRON (one of the leading lobbyists for Kyoto prior to its collapse). There is little question that global warming has been exploited by many governments and corporations (and not just by ENRON; Lehman Brothers, for example, was also heavily promoting global warming alarm, and relying on the advice of James Hansen, etc.) for their own purposes, but it is unclear to what extent such exploitation has played an initiating role in the issue. The developing world has come to realize that the proposed measures endanger their legitimate hopes to escape poverty, and, in the case of India, they have, encouragingly, led to an assessment of climate issues independent of the ‘official’ wisdom (Government of India, 2008 ).
For purposes of this paper, however, I simply want to briefly note the specific implications for science and its interaction with society. Although society is undoubtedly aware of the imperfections of science, it has rarely encountered a situation such as the current global warming hysteria, where institutional science has so thoroughly committed itself to policies which call for massive sacrifices in well-being worldwide. Past scientific errors did not lead the public to discard the view that science on the whole was a valuable effort. However, the extraordinarily shallow basis for the commitment to climate catastrophe, and the widespread tendency of scientists to use unscientific means to arouse the public’s concerns, are becoming increasingly evident, and the result could be a reversal of the trust that arose from the triumphs of science and technology during the World War II period.
Further, the reliance by the scientific community on fear as a basis for support, may, indeed, have severely degraded the ability of science to usefully address problems that need addressing. It should also be noted that not all the lessons of the World War II period have been positive. Massive crash programs such as the Manhattan Project are not appropriate to all scientific problems. In particular, such programs are unlikely to be effective in fields where the basic science is not yet in place. Rather, they are best suited to problems where the needs are primarily in the realm of engineering.
Although the change in scientific culture has played an important role in making science more vulnerable to exploitation by politics, the resolution of specific issues may be possible without explicitly addressing the structural problems in science. In the US, where global warming has become enmeshed in partisan politics, there is a natural opposition to exploitation which is not specifically based on science itself. However, the restoration of the traditional scientific paradigm will call for more serious efforts. Such changes are unlikely to come from any fiat; nor are they likely to be implemented by the large science bureaucracies that have helped create the problem in the first place. A potentially effective approach would be to change the incentive structure of science. The current support mechanism for science is one in which the solution of a scientific problem is rewarded by ending support.
This hardly encourages the solution of problems or the search for actual answers; nor does it encourage the meaningful testing of hypotheses. The alternative calls for a measure of societal trust, patience, and commitment to elitism that hardly seems consonant with contemporary attitudes. It may, however, be possible to make a significant beginning by carefully reducing the funding for science. Many scientists would be willing to accept a lower level of funding in return for greater freedom and stability. Other scientists may find the trade-off unacceptable and drop out of the enterprise. The result, over a period of time, could be a gradual restoration of a better incentive structure.
One ought not underestimate the institutional resistance to such changes, but the alternatives are proving to be much worse. Some years ago, I described some of what I have discussed here at a meeting in Erice (Lindzen, 2005). Richard Garwin (whom some regard as the inventor of the H-bomb) rose indignantly to state that he did not want to hear such things. Quite frankly, I also don’t want to hear such things. However, I fear that ignoring such things will hardly constitute a solution, and a solution may be necessary for the sake of the scientific enterprise.
Acknowledgments. The author wishes to thank Dennis Ambler, Willie Soon, Lubos Motl and Nigel Lawson for useful comments and assistance.
1. This paper was prepared for a meeting sponsored by Euresis (Associazione per la promozione e la diffusione della cultura e del lavoro scientifico) and the Templeton Foundation on Creativity and Creative Inspiration in Mathematics, Science, and Engineering: Developing a Vision for the Future. The meeting was held in San Marino from 29 to 31 August 2008. Its Proceedings are expected to be published in 2009.
2. At some level, this is obvious. Theoretical physics is still dealing with the standard model though there is an active search for something better. Molecular biology is still working off of the discovery of DNA. Many of the basic laws of physics resulted from individual efforts in the 17th-19th Centuries. The profound advances in technology should not disguise the fact that the bulk of the underlying science is more than 40 years old. This is certainly the case in the atmospheric and oceanic sciences. That said, it should not be forgotten that sometimes progress slows because the problem is difficult. Sometimes, it slows because the existing results are simply correct as is the case with DNA. Structural problems are not always the only factor involved.
3. It is sometimes thought that government involvement automatically implies large bureaucracies and lengthy regulations. This was not exactly the case in the 20 years following the Second World War. Much of the support in the physical sciences came from the armed forces, for which science support remained a relatively negligible portion of their budgets. For example, meteorology at MIT was supported by the Air Force. Group grants were made for five-year periods and renewed on the basis of a site visit. When the National Science Foundation was created, it functioned with a small permanent staff supplemented by ‘rotators’ who served on leave from universities for a few years. Unfortunately, during the Vietnam War, the US Senate banned the military from supporting non-military research (the Mansfield Amendment). This shifted support to agencies whose sole function was to support science. That said, today all agencies supporting science have large ‘supporting’ bureaucracies.
4. In fairness, such programs should be distinguished from team efforts, which are sometimes appropriate and successful: for example, the classification of groups in mathematics, or the human genome project.
5. A personal memoir from Al Grable sent to Sherwood Idso in 1993 is interesting in this regard. Grable served as a Department of Agriculture observer to the National Research Council’s National Climate Board. Such observers are generally posted by agencies to boards that they are funding. In any event, Grable describes a motion presented at a Board meeting in 1980 by Walter Orr Roberts, the director of the National Center for Atmospheric Research, and by Joseph Smagorinsky, director of NOAA’s Geophysical Fluid Dynamics Laboratory at Princeton, to censure Sherwood Idso for criticizing climate models with high sensitivities due to water vapor feedbacks (in the models), because of their inadequate handling of cooling due to surface evaporation. A member of that board, Sylvan Wittwer, noted that it was not the role of such boards to censure specific scientific positions since the appropriate procedure would be to let science decide in the fullness of time, and the matter was dropped. In point of fact, there is evidence that models do significantly understate the increase of evaporative cooling with temperature (Held and Soden, 2006). Moreover, this memoir makes clear that the water vapor feedback was considered central to the whole global warming issue from the very beginning.
6. It should be acknowledged that Oppenheimer has quite a few papers with climate in the title – especially in the last two years. However, these are largely papers concerned with policy and advocacy, assuming significant warming. Such articles probably constitute the bulk of articles on climate. It is probably also fair to say that such articles contribute little if anything to understanding the phenomenon.
7. Certain names and organizations come up repeatedly in this paper. This is hardly an accident. In 1989, following the public debut of the issue in the US in Tim Wirth’s and Al Gore’s famous Senate hearing featuring Jim Hansen associating the warm summer of 1988 with global warming, the Climate Action Network was created. This organization of over 280 ENGOs has been at the center of the climate debates since then. The Climate Action Network is an umbrella NGO that coordinates the advocacy efforts of its members, particularly in relation to the UN negotiations. Organized around seven regional nodes in North and Latin America, Western and Eastern Europe, South and Southeast Asia, and Africa, CAN represents the majority of environmental groups advocating on climate change, and it has embodied the voice of the environmental community in the climate negotiations since it was established.
The founding of the Climate Action Network can be traced back to the early involvement of scientists from the research ENGO community. These individuals, including Michael Oppenheimer from Environmental Defense, Gordon Goodman of the Stockholm Environmental Institute (formerly the Beijer Institute), and George Woodwell of the Woods Hole Research Center, were instrumental in organizing the scientific workshops in Villach and Bellagio on ‘Developing Policy Responses to Climate Change’ in 1987 as well as the Toronto Conference on the Changing Atmosphere in June 1988. It should be noted that the current director of the Woods Hole Research Center is John Holdren. In 1989, several months after the Toronto Conference, the emerging group of climate scientists and activists from the US, Europe, and developing countries were brought together at a meeting in Germany, with funding from Environmental Defense and the German Marshall Fund. The German Marshall Fund is still funding NGO activity in Europe: http://www.gmfus.org/event/detail.cfm?id=453&parent_type=E (Pulver, 2004).
8. The reports attributed to the National Academy are not, to any major extent, the work of Academy Members. Rather, they are the product of the National Research Council, which consists of a staff of over 1000 who are paid largely by the organizations soliciting the reports. The committees that prepare the reports are mostly scientists who are not Academy Members, and who serve without pay.
9. One might reasonably add the Pew Charitable Trust to this list. Although they advertise themselves as a neutral body, they have merged with the National Environmental Trust, whose director, Philip Clapp, became deputy managing director of the combined body. Clapp (the head of the legislative practice of a large Washington law firm, and a consultant on mergers and acquisitions to investment banking firms), according to his recent obituary, was ‘an early and vocal advocate on climate change issues and a promoter of the international agreement concluded in 1997 in Kyoto, Japan. Mr. Clapp continued to attend subsequent global warming talks even after the US Congress did not ratify the Kyoto accord.’
10. John Holdren has defended the use of the phrase ‘Research Center’ since research is carried out with sponsorship by the National Science Foundation, the National Oceanic and Atmospheric Administration, and NASA. However, it is hardly uncommon to find sponsorship of the activities of environmental NGOs by federal funding agencies.
11. Appendix 1 is the invitation to the planning session for the 5th assessment. It clearly emphasizes strengthening rather than checking the IPCC position. Appendix 2 reproduces a commentary by Stephen McIntyre on the recent Ofcom findings concerning a British TV program opposing global warming alarmism. The response of the IPCC officials makes it eminently clear that the IPCC is fundamentally a political body. If further evidence were needed, one simply has to observe the fact that the IPCC Summary for Policymakers will selectively cite results to emphasize negative consequences. Thus the summary for Working Group II observes that global warming will result in “Hundreds of millions of people exposed to increased water stress.” This, however, is based on work (Arnell, 2004) which actually shows that by the 2080s the net global population at risk declines by up to 2.1 billion people (depending on which scenario one wants to emphasize)! The IPCC further ignores the capacity to build reservoirs in those areas it projects as subject to drought. (I am indebted to Indur Goklany for noting this example.)
12. Appendix 3 is a recent op-ed from the Boston Globe, written by the aforementioned John Holdren. What is interesting about this piece is that what little science it invokes is overtly incorrect. Rather, it points to the success of the above process of taking over scientific institutions as evidence of the correctness of global warming alarmism. The three atmospheric scientists who are explicitly mentioned are chemists with no particular expertise in climate itself. While Holdren makes much of the importance of expertise, he fails to note that he, himself, is hardly a contributor to the science of climate. Holdren and Paul Ehrlich (of Population Bomb fame; in that work he predicted famine and food riots for the US in the 1980s) are responsible for the I=PAT formula. Holdren somewhat disingenuously claims that this is merely a mathematical identity, where I is environmental impact, P is population, A is GDP/P, and T is I/GDP. However, in popular usage, A has become affluence and T has become technology (viz Schneider, 1997; see also Wikipedia).
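That I=PAT is an identity under these definitions is easily checked: the product P·(GDP/P)·(I/GDP) telescopes back to I for any figures whatever. A minimal check (all numbers invented purely for illustration):

```python
# Invented illustrative figures; the identity holds for any values.
P = 300e6    # population
GDP = 15e12  # gross domestic product
I = 6e9      # environmental impact, in arbitrary units

A = GDP / P  # "affluence" as defined: GDP per capita
T = I / GDP  # "technology" as defined: impact per unit of GDP

# P * (GDP/P) * (I/GDP) cancels to I, up to floating-point rounding.
print(abs(P * A * T - I) < 1e-6 * I)
```

The substantive content thus enters only through the popular reinterpretation of A and T, not through the formula itself.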
13. The 1998 paper actually only goes back to 1400 CE, and acknowledges that there is no useful resolution of spatial patterns of variability going further back. It is the 1999 paper that then goes back 1000 years.
14. Of course, Vinnikov et al did mention it. When I gave a lecture at Rutgers University in October 2007, Alan Robock, a professor at Rutgers and a coauthor of Vinnikov et al declared that the ‘latest data’ resolved the discrepancy wherein the model fingerprint could not be found in the data.
15. Haqqmisra, a graduate student at the Pennsylvania State University, is apparently still seeking greenhouse solutions to the paradox.
16. The refusal was not altogether surprising. The editor of Science at the time was Donald Kennedy, a biologist (and colleague of Paul Ehrlich and Stephen Schneider, both also members of Stanford’s biology department), who had served as president of Stanford University. His term as president ended with his involvement in fiscal irregularities, such as charging to research overhead expenses like the maintenance of the presidential yacht and the provision of flowers for his daughter’s wedding – offering peculiar evidence for the importance of grant overhead to administrators. Kennedy had editorially declared that the debate concerning global warming was over and that skeptical articles would not be considered. More recently, he has published a relatively pure example of Orwellian doublespeak (Kennedy, 2008) wherein he called for better media coverage of global warming, where by ‘better’ he meant more carefully ignoring any questions about global warming alarm. As one might expect, Kennedy made extensive use of Oreskes’ paper. He also made the remarkably dishonest claim that the IPCC Summary for Policymakers was much more conservative than the scientific text.
17. Oreskes, apart from overt errors, merely considered support to consist in agreement that there had been some warming, and that anthropogenic CO2 contributed part of the warming. Such innocent conclusions have essentially nothing to do with catastrophic projections. Moreover, most of the papers she looked at didn’t even address these issues; they simply didn’t question these conclusions.
18. Perhaps unsurprisingly, The Potsdam Institute, home of Greenpeace’s Bill Hare, now has a Potsdam Institute for Climate Impact Research.
19. Tim Wirth chaired the hearing where Jim Hansen rolled out the alleged global warming relation to the hot summer of 1988 (viz Section 2). He is noted for having arranged for the hearing room to have open windows to let in the heat so that Hansen would be seen to be sweating for the television cameras. Wirth is also frequently quoted as having said “We’ve got to ride the global warming issue. Even if the theory of global warming is wrong, we will be doing the right thing — in terms of economic policy and environmental policy.”
20. When I referred to the Smith et al paper at a hearing of the European Parliament, Professor Schellnhuber of the Potsdam Institute (which I mentioned in the previous section with respect to its connection to Greenpeace) loudly protested that I was being ‘dishonest’ by not emphasizing what he referred to as the main point in Smith et al: namely that global warming would return with a vengeance.
21. The matter of ‘spin control’ warrants a paper by itself. In connection with the absence of warming over the past 13 years, the common response is that 7 of the last 10 warmest years in the record occurred during the past decade. This is actually to be expected, given that we are in a warm period, and the temperature is always fluctuating. However, it has nothing to do with trends.
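The arithmetic behind this footnote can be checked with a simple simulation (my own illustrative sketch; the trend rate, plateau length, and noise level are all invented): generate a series that warms for several decades and then goes flat, add random year-to-year fluctuations, and count how many of the ten warmest years nonetheless fall in the final, trendless decade.

```python
import random

random.seed(1)

# 40 "years" warming at 0.01 per year, then 13 flat years at the final
# level, with Gaussian year-to-year noise (all magnitudes invented).
warming = [0.01 * t + random.gauss(0, 0.05) for t in range(40)]
plateau = [0.01 * 39 + random.gauss(0, 0.05) for _ in range(13)]
series = warming + plateau

# Indices of the 10 warmest years, and how many lie in the last decade.
warmest = sorted(range(len(series)), key=series.__getitem__, reverse=True)[:10]
in_last_decade = sum(1 for y in warmest if y >= len(series) - 10)
print(in_last_decade, "of the 10 warmest years fall in the trendless final decade")
```

Even though the final decade contains no trend at all, a large share of the record’s warmest years tends to land there, simply because the series plateaus at its highest level; the count says nothing about a continuing trend.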
22. The strange identification of the CO2 caused global warming paradigm with general relativity theory, mentioned earlier in this section, is repeated by Rahmstorf. This repetition of odd claims may be a consequence of the networking described in footnote 7.
23. A curious aspect of the profoundly unalarming Indian report is the prominent involvement in the preparation of the report by Dr. Rajendra Pachauri (an economist and long term UN bureaucrat) who heads the IPCC. Dr. Pachauri has recently been urging westerners to reduce meat consumption in order to save the earth from destruction by global warming.
References

Allen, R.J. and S.C. Sherwood (2008) Warming maximum in the tropical upper troposphere deduced from thermal winds, Nature Geoscience, doi:10.1038/ngeo208, 1-5.
Arnell, N.W. (2004) Climate change and global water resources: SRES emissions and socio-economic scenarios, Global Environmental Change, 14, 31-52.
Bard, E. and G. Delaygue (2008) Comment on “Are there connections between the Earth’s magnetic field and climate?” Earth and Planetary Science Letters, 265, 302-307.
Barron, E.J. (1987) Eocene Equator-to-Pole Surface Ocean Temperatures: A Significant Climate Problem? Paleoceanography, 2, 729-739.
Bush, A.B.G. and S.G.H. Philander (1998a) The late Cretaceous: simulation with a coupled atmosphere-ocean general circulation model. Paleoceanography, 12, 495-516.
Bush, A.B.G. and S.G.H. Philander (1998b) The role of ocean-atmosphere interactions in tropical cooling during the last glacial maximum. Science, 279, 1341-1344.
Bush, V. (1945) Science: the Endless Frontier. http://www.nsf.gov/about/history/vbush1945.htm
Choi, Y.-S., and C.-H. Ho (2006), Radiative effect of cirrus with different optical properties over the tropics in MODIS and CERES observations, Geophysical Research Letters, 33, L21811, doi:10.1029/2006GL027403
Choi, Y.-S., and C.-H. Ho (2008), Validation of the cloud property retrievals from the MTSAT-1R imagery using MODIS observations, International Journal of Remote Sensing, accepted.
Chou, M.-D., R.S. Lindzen, and A.Y. Hou (2002b) Comments on “The Iris hypothesis: A negative or positive cloud feedback?” J. Climate, 15, 2713-2715.
CLIMAP Project (1976) The surface of the ice-age Earth. Science, 191, 1131-1136.
Courtillot, V., Y. Gallet, J.-L. Le Mouël, F. Fluteau, and A. Genevey (2007) Are there connections between the Earth’s magnetic field and climate? Earth and Planetary Science Letters, 253, 328-339.
Crichton, M. (2004) State of Fear, Harper Collins, 624 pp.
Crowley, T.J. (2000) CLIMAP SSTs re-revisited. Climate Dynamics, 16, 241-255.
Deming, D. (2005) Global warming, the politicization of science, and Michael Crichton’s State of Fear, Journal of Scientific Exploration, 19, 247-256.
Dyson, F. (2008) The Question of Global Warming, New York Review of Books, 55, No. 10, June 12, 2008.
Fu, Q., M. Baker, and D.L. Hartmann (2002) Tropical cirrus and water vapor: an effective Earth infrared iris feedback? Atmos. Chem. Phys., 2, 31-37.
Gelbspan, R. (1998) The Heat is On, Basic Books, 288 pp.
Government of India (2008) National Action Plan on Climate Change, 56pp.
Happer, W. (2003) Harmful Politicization of Science, in Politicizing Science: The Alchemy of Policymaking, edited by Michael Gough, Hoover Institution, 313 pp. (pp. 27-48).
Haqq-Misra, J.D., S.D. Domagal-Goldman, P.J. Kasting, and J.F. Kasting (2008) A revised, hazy methane greenhouse for the Archean Earth. Astrobiology, in press.
Hartmann, D. L., and M. L. Michelsen (2002) No evidence for iris. Bull. Amer. Meteor. Soc., 83, 249–254.
Held, I.M. and B.J. Soden (2006) Robust responses of the hydrological cycle to global warming, Journal of Climate, 19, 5686-5699.
Holland, D. (2007) Bias And Concealment in the IPCC Process: The “Hockey-Stick” Affair and its Implications, Energy & Environment, 18, 951-983.
Horvath, A., and B. Soden (2008) Lagrangian Diagnostics of Tropical Deep Convection and Its Effect upon Upper-Tropospheric Humidity, Journal of Climate, 21(5), 1013-1028.
Huber, M. (2008) A Hotter Greenhouse? Science, 321, 353-354.
IPCC, 1990: Climate Change: The IPCC Scientific Assessment [Houghton, J.T., et al. (eds.)]. Cambridge University Press, Cambridge, United Kingdom and New York, NY, USA, 362 pp.
IPCC, 1996: Climate Change 1995: The Science of Climate Change. Contribution of Working Group I to the Second Assessment Report of the Intergovernmental Panel on Climate Change [Houghton, J.T., et al. (eds.)]. Cambridge University Press, Cambridge, United Kingdom and New York, NY, USA, 572 pp.
IPCC, 2001: Climate Change 2001: The Scientific Basis. Contribution of Working Group I to the Third Assessment Report of the Intergovernmental Panel on Climate Change [Houghton, J.T., et al. (eds.)]. Cambridge University Press, Cambridge, United Kingdom and New York, NY, USA, 881 pp.
IPCC, 2007: Climate Change 2007: The Physical Science Basis. Contribution of Working Group I to the Fourth Assessment Report of the Intergovernmental Panel on Climate Change [Solomon, S., et al. (eds.)]. Cambridge University Press, Cambridge, United Kingdom and New York, NY, USA. (Available at http://www.ipcc.ch/)
Keenlyside, N.S., M. Latif, J. Jungclaus, L. Kornblueh and E. Roeckner (2008) Advancing decadal-scale climate prediction in the North Atlantic sector. Nature, 453, 84-88.
Kennedy, D., 2008: Science, Policy, and the Media, Bulletin of the American Academy of Arts & Sciences, 61, 18-22.
Kiehl, J.T. (2007) Twentieth century climate model response and climate sensitivity. Geophys. Res. Lttrs., 34, L22710, doi:10.1029/2007GL031383
Lee, M.I., M.J. Suarez, I.S. Kang, I. M. Held, and D. Kim (2008) A Moist Benchmark Calculation for the Atmospheric General Circulation Models, J.Clim., in press.
Lin, B., B. Wielicki, L. Chambers, Y. Hu, and K.-M. Xu, (2002) The iris hypothesis: A negative or positive cloud feedback? J. Climate, 15, 3–7.
Lindzen, R.S. (1999) The Greenhouse Effect and its problems. Chapter 8 in Climate Policy After Kyoto (T.R. Gerholm, editor), Multi-Science Publishing Co., Brentwood, UK, 170pp.
Lindzen, R.S. (2005) Understanding common climate claims, in Proceedings of the 34th International Seminar on Nuclear War and Planetary Emergencies, R. Ragaini, editor, World Scientific Publishing Co., Singapore, 472 pp. (pp. 189-210).
Lindzen, R.S. (2007) Taking greenhouse warming seriously. Energy & Environment, 18, 937-950.
Lindzen, R.S., M.-D. Chou, and A.Y. Hou (2001) Does the Earth have an adaptive infrared iris? Bull. Amer. Met. Soc., 82, 417-432.
Lindzen, R.S., M.-D. Chou, and A.Y. Hou (2002) Comments on “No evidence for iris.” Bull. Amer. Met. Soc., 83, 1345–1348
Lindzen-Rahmstorf Exchange (2008) http://www-eaps.mit.edu/faculty/lindzen/L_R-Exchange.pdf
Mann, M.E., R.S. Bradley, and M.K. Hughes (1998) Global-scale temperature patterns and climate forcing over the past six centuries, Nature, 392, 779-787.
Mann, M.E., R.S. Bradley, and M.K. Hughes (1999) Northern Hemisphere Temperatures During the Past Millennium: Inferences, Uncertainties, and Limitations, Geophysical Research Letters.
McIntyre, S. and R. McKitrick (2003) Corrections to the Mann et al. (1998) proxy data base and Northern hemispheric average temperature series, Energy and Environment, 14, 751-771.
McIntyre, S. and R. McKitrick (2005a) The M&M critique of the MBH98 Northern hemisphere climate index: Update and implications, Energy and Environment, 16, 69-100.
McIntyre, S. and R. McKitrick (2005b) Hockey sticks, principal components, and spurious significance, Geophysical Research Letters, 32, L03710, doi:10.1029/2004GL021750.
Miller, D.W. (2007) The Government Grant System: Inhibitor of Truth and Innovation? J. of Information Ethics, 16, 59-69.
National Academy of Sciences (1992) Policy Implications of Greenhouse Warming: Mitigation, Adaptation, and the Science Base, National Academy Press, 944 pp.
North, G.R. (chair) (2006) Surface Temperature Reconstructions for the Last 2,000 Years, Committee on Surface Temperature Reconstructions for the Last 2,000 Years, National Research Council, National Academies Press.
Oppenheimer, M. and R. Boyle (1990) Dead Heat: The Race Against the Greenhouse Effect, Basic Books, 288 pp.
Oreskes, N. (2004) The scientific consensus on climate change. Science, 306, 1686.
Pearce, F. (2008) Poor forecasting undermines climate debate. New Scientist, 01 May 2008, 8-9.
Pearson, P.N., P.W. Ditchfield, J. Singano, K.G. Harcourt-Brown, C.J. Nicholas, R.K. Olsson, N.J. Shackleton and M.A. Hall (2000) Warm tropical sea surface temperatures in the Late Cretaceous and Eocene epochs. Nature, 413, 481-487.
Pielke Sr., R.A., T.N. Chase, J.R. Christy and B. Herman (2008) Assessment of temperature trends in the troposphere deduced from thermal winds. Nature (submitted)
Pulver, Simone (2004). Power in the Public Sphere: The battles between Oil Companies and Environmental Groups in the UN Climate Change Negotiations, 1991-2003. Doctoral dissertation, Department of Sociology, University of California, Berkeley
Roe, G. (2006) In defense of Milankovitch. Geophys. Res. Ltrs., 33, L24703, doi:10.1029/2006GL027817
Sackmann, J. and A.I. Boothroyd (2003) Our sun. V. A bright young sun consistent with helioseismology and warm temperatures on ancient earth and mars. The Astrophysical Journal, 583, 1024-1039.
Sagan, C. and G. Mullen (1972) Earth and Mars: evolution of atmospheres and surface temperatures. Science, 177, 52-56.
Schneider, S.H. (1997) Laboratory Earth, Basic Books, 174 pp.
Schrag, D.P. (1999) Effects of diagenesis on isotopic record of late Paleogene equatorial sea surface temperatures. Chem. Geol., 161, 215-224
Schulte, K.-M. (2008) Scientific consensus on climate? Energy and Environment, 19, 281-286.
Shackleton, N., and A. Boersma, (1981) The climate of the Eocene ocean, J. Geol. Soc., London, 138, 153-157.
Singer, S.F. (2003) The Revelle-Gore Story: Attempted Political Suppression of Science, in Politicizing Science: The Alchemy of Policymaking, edited by Michael Gough, Hoover Institution, 313 pp. (pp. 283-297).
Singer, S.F., C. Starr, and R. Revelle (1991) What To Do About Greenhouse Warming: Look Before You Leap, Cosmos, 1, 28-33.
Smith, D.M., S. Cusack, A.W. Colman, C.K. Folland, G.R. Harris, J.M. Murphy (2007) Improved Surface Temperature Prediction for the Coming Decade from a Global Climate Model, Science, 317, 796-799.
Soon, W., S. Baliunas, C. Idso, S. Idso, and D. Legates (2003) Reconstructing climatic and environmental changes of the past 1000 years: a reappraisal. Energy and Environment, 14, 233-296.
Thompson, D.W.J., J.J. Kennedy, J.M. Wallace and P.D. Jones (2008) A large discontinuity in the mid-twentieth century in observed global-mean surface temperature. Nature, 453, 646-649.
Vinnikov, K.Y., N.C. Grody, A. Robock, R.J. Stouffer, P.D. Jones, and M.D. Goldberg (2006) Temperature trends at the surface and in the troposphere. J. Geophys. Res., 111, D03106, doi:10.1029/2005JD006392.
Weart, S. (2003) The Discovery of Global Warming, Harvard University Press, 228 pp.
Wegman, E.J. et al., (2006): Ad Hoc Committee report on the “Hockey Stick” global climate reconstruction, commissioned by the US Congress House Committee on Energy and Commerce, http://republicans.energycommerce.house.gov/108/home/07142006_Wegman_Report.pdf
Zedillo, E., editor (2007) Global Warming: Looking Beyond Kyoto. Brookings Institution Press, 237 pp.