Ryze - Business Networking
Innovation Network [This Network is not currently active and cannot accept new posts] | Topics
35 Inconvenient Truths (Views: 113)
Nov 25, 2009 11:00 pm re: 35 Inconvenient Truths > Gore Lawsuits + Climategate Uncovered

Ron Sam
Al Gore sued by over 30,000 Scientists for Global Warming fraud / John Coleman

Al Gore's Lies Exposed By Congress

The video I want to see is when Al Gore gets his Nobel Prize taken away.

ron

Nov 25, 2009
CRU’s Source Code: Climategate Uncovered

By Marc Sheppard, The American Thinker

As the evidence of climate fraud at the University of East Anglia's prestigious Climatic Research Unit (CRU) continues to mount, those who’ve been caught green-handed continue to parry their due opprobrium and comeuppance, thanks primarily to a dead-silent mainstream media.  But should the hubris and duplicity evident in the emails of those whose millennial temperature charts literally fuel the warming alarmism movement somehow fail to convince the world of the scam that’s been perpetrated upon it, certainly these revelations of the fraud cooked into the computer programs that create such charts will.

First—Let’s briefly review a few pertinent details. 

We reported on Saturday that among the most revealing of the “hacked” emails released last week was one dated November 1999, in which CRU chief PD Jones wrote these words to Hockey-Stick-Team leaders Michael Mann, Raymond Bradley and Malcolm Hughes: “I’ve just completed Mike’s Nature trick of adding in the real temps to each series for the last 20 years (ie from 1981 onwards) and (sic) from 1961 for Keith’s to hide the decline.”

Predictably, the suggestion of a climate-related data-adjusting “trick” being employed by such alarmist bellwethers 10 years ago instantly raised more than a few eyebrows.  And with similar alacrity, the Big Green Scare Machine shifted into CYA gear. 

Almost immediately after the news hit on Friday, Jones told Investigative Magazine’s TGIF Edition [PDF] that he “had no idea” what he might have meant by the words “hide the decline” a decade prior: “They’re talking about the instrumental data which is unaltered - but they’re talking about proxy data going further back in time, a thousand years, and it’s just about how you add on the last few years, because when you get proxy data you sample things like tree rings and ice cores, and they don’t always have the last few years. So one way is to add on the instrumental data for the last few years.”

Baloney.

Mere hours later, Jones’s warmist soul mates at RealClimate offered an entirely different explanation: “The paper in question is the Mann, Bradley and Hughes (1998) Nature paper on the original multiproxy temperature reconstruction, and the ‘trick’ is just to plot the instrumental records along with reconstruction so that the context of the recent warming is clear. Scientists often use the term “trick” to refer to “a good way to deal with a problem”, rather than something that is “secret”, and so there is nothing problematic in this at all. As for the ‘decline’, it is well known that Keith Briffa’s maximum latewood tree ring density proxy diverges from the temperature records after 1960 (this is more commonly known as the “divergence problem” - see e.g. the recent discussion in this paper) and has been discussed in the literature since Briffa et al in Nature in 1998 (Nature, 391, 678-682). Those authors have always recommend not using the post 1960 part of their reconstruction, and so while ‘hiding’ is probably a poor choice of words (since it is ‘hidden’ in plain sight), not using the data in the plot is completely appropriate, as is further research to understand why this happens.”

And later that day, Jean S at Climate Audit explained the reality of the quandary. In order to smooth a time series, it’s necessary to pad it beyond its end point. But however hard they tried, when MBH plotted instrumental data against their tree ring reconstructions, no smoothing method would undo the fact that after 1960 the tree ring series pointed downward while the instrumental series pointed upward - hence the divergence: “So Mann’s solution [Mike’s Nature Trick] was to use the instrumental record for padding [both], which changes the smoothed series to point upwards.”
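The padding effect Jean S describes is easy to demonstrate. Here is a minimal Python sketch - not CRU’s actual IDL routine; the data, pad values, and window size are invented purely for illustration - showing how the choice of end-padding changes where a smoothed series points at its boundary:

```python
import numpy as np

def smooth(series, pad, window=5):
    """Centered moving average; the series is extended ('padded')
    past its end so the filter can be evaluated near the boundary."""
    padded = np.concatenate([series, pad])
    kernel = np.ones(window) / window
    smoothed = np.convolve(padded, kernel, mode="same")
    return smoothed[: len(series)]

# Hypothetical proxy series that declines toward its end
proxy = np.array([0.0, 0.1, 0.2, 0.1, -0.1, -0.3, -0.5])

# Option 1: pad with the proxy's own terminal value
own_pad = np.full(4, proxy[-1])
# Option 2: pad with rising 'instrumental' values instead
instr_pad = np.array([0.2, 0.4, 0.6, 0.8])

# The instrumental pad lifts the smoothed endpoint
print(smooth(proxy, own_pad)[-1], smooth(proxy, instr_pad)[-1])
```

Padding with the series’ own terminal value leaves the smoothed endpoint declining; padding with rising instrumental values turns it upward - which is precisely the effect at issue.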

So the author of the email claimed the “trick” was adding instrumental measurements for years beyond available proxy data, his co-conspirators at Real Climate admitted it was actually a replacement of proxy data due to a known yet inexplicable post-1960 “divergence” anomaly, and CA called it what it was - a cheat.

The next day, the UEA spoke out for the first time on the subject when its first related press-release was posted to its homepage.  And Jones demonstrated to the world the benefits a good night’s sleep imparts to one’s memory, though not one’s integrity: “The word ‘trick’ was used here colloquially as in a clever thing to do. It is ludicrous to suggest that it refers to anything untoward.”

Tick Tock.

Of course, RealClimate also avowed there was “no evidence of the falsifying of data” in the emails.  But as Jones chose not to walk back his statement that the “tricks” were rarely exercised, and even assured us that he was “refer[ring] to one diagram - not a scientific paper,” his explanation remained at-odds with that of his virtual-confederates at RC.

And as Jones must have known at the time—such would prove to be the very least of CRU’s problems.

Getting with the Green Program(s)

One can only imagine the angst suffered daily by the co-conspirators, who knew full well that the “Documents” sub-folder of the CRU FOI2009 file contained more than enough probative program source code to unmask CRU’s phantom methodology. 

In fact, there are hundreds of IDL and FORTRAN source files buried in dozens of subordinate sub-folders.  And many do properly analyze and chart maximum latewood density (MXD), the growth parameter commonly utilized by CRU scientists as a temperature proxy, from raw or legitimately normalized data.  Ah, but many do so much more. 

Skimming through the often spaghetti-like code, the number of programs which subject the data to a mixed-bag of transformative and filtering routines is simply staggering.  Granted, many of these “alterations” run from benign smoothing algorithms (e.g. omitting rogue outliers) to moderate infilling mechanisms (e.g. estimating missing station data from that of those closely surrounding).  But many others fall into the precarious range between highly questionable (removing MXD data which demonstrate poor correlations with local temperature) to downright fraudulent (replacing MXD data entirely with measured data to reverse a disorderly trend-line).
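To make that spectrum concrete, here is a hedged Python sketch - toy data, not CRU code - contrasting a benign outlier mask with the wholesale post-1960 splice described above:

```python
import numpy as np

years = np.arange(1950, 1971)
# Hypothetical proxy that declines after 1960, and instrumental temps that rise
mxd = np.where(years <= 1960, 0.1, 0.1 - 0.05 * (years - 1960))
temps = 0.1 + 0.03 * (years - 1950)

# Benign end of the spectrum: mask rogue outliers beyond 3 standard deviations
clean = np.where(np.abs(mxd - mxd.mean()) <= 3 * mxd.std(), mxd, np.nan)

# Fraudulent end of the spectrum: overwrite the proxy with measured
# temperatures wherever its trend misbehaves
spliced = np.where(years > 1960, temps, mxd)

print(mxd[-1], spliced[-1])  # the post-1960 decline vanishes from the spliced series
```

The first operation leaves the data’s trend intact; the second manufactures one.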

In fact, workarounds for the post-1960 “divergence problem”, as described by both RealClimate and Climate Audit, can be found throughout the source code.  So much so that perhaps the most ubiquitous programmer’s comment (REM) I ran across warns that the particular module “Uses ‘corrected’ MXD - but shouldn’t usually plot past 1960 because these will be artificially adjusted to look closer to the real temperatures.”

What exactly is meant by “corrected” MXD, you ask?  Outstanding question—and the answer appears amorphous from program to program.  Indeed, while some employ one or two of the aforementioned “corrections,” others throw everything but the kitchen sink at the raw data prior to output.

For instance, in subfolder ”” there’s a program that calibrates the MXD data against available local instrumental summer (growing season) temperatures between 1911-1990, then merges that data into a new file.  That file is then digested and further modified by another program which creates calibration statistics for the MXD against the stored temperature and “estimates” (infills) figures where such temperature readings were not available.  The file created by that program is modified once again, by a program which “corrects it” - as described by the author—by “identifying” and “artificially” removing “the decline.”
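The calibration step described above is, in essence, a regression of the proxy against overlapping instrumental temperatures. A minimal Python sketch of that idea - with entirely hypothetical toy data standing in for the CRU files - might look like:

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1911, 1991)  # the 1911-1990 calibration window

# Hypothetical toy data standing in for the CRU station and proxy files
summer_temp = 0.01 * (years - 1911) + rng.normal(0.0, 0.1, years.size)
mxd = 0.8 * summer_temp + rng.normal(0.0, 0.1, years.size)

# Calibrate: least-squares fit of temperature on the MXD proxy,
# so the proxy can be read off in temperature units
slope, intercept = np.polyfit(mxd, summer_temp, 1)
mxd_as_temp = slope * mxd + intercept
```

Nothing in calibration itself is improper; the controversy concerns what is done to the proxy data before and after such a step.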

But oddly enough - the series doesn’t begin its “decline adjustment” in 1960—the supposed year of the enigmatic “divergence.” In fact, all data between 1930 and 1994 are subject to “correction.” And such games are by no means unique to the folder attributed to Michael Mann.

A Clear and Present Rearranger

In 2 other programs, briffa_Sep98_d.pro and briffa_Sep98_e.pro, the “correction” is bolder by far. The programmer (Keith Briffa?) entitled the “adjustment” routine “Apply a VERY ARTIFICAL correction for decline!!” And he/she wasn’t kidding. Now, IDL is not a native language of mine, but its syntax is similar enough to others I’m familiar with, so please bear with me while I get a tad techie on you.

Here’s the “fudge factor” (notice the brash SOB actually called it that in his REM statement):

    yrloc = [1400, findgen(19)*5.+1904]
    valadj = [0., 0., 0., 0., 0., -0.1, -0.25, -0.3, 0., -0.1, 0.3, 0.8, 1.2, 1.7, 2.5, 2.6, 2.6, 2.6, 2.6, 2.6] * 0.75 ; fudge factor

These 2 lines of code establish a 20-element array (yrloc) comprising the year 1400 (a base year, though it’s not clear why it’s needed here) and 19 years between 1904 and 1994 in half-decade increments.  The corresponding “fudge factor” (from the valadj array) is then applied to each interval.  As you can see, not only are temperatures biased to the upside later in the century (beginning well before 1964), but a few mid-century intervals are biased slightly lower.  That, coupled with the post-1930 restatement we encountered earlier, would imply that in addition to an embarrassing false decline experienced with their MXD after 1960 (or earlier), CRU’s “divergence problem” also includes a minor false incline after 1930.
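Since the two IDL lines are quoted verbatim, the adjustment schedule they encode can be reproduced exactly. The Python sketch below uses np.interp as a stand-in for IDL’s interpol (an assumption about how the surrounding code spread the adjustment to yearly resolution):

```python
import numpy as np

# Verbatim values from the quoted IDL: findgen(19)*5.+1904 -> 1904..1994 by 5
yrloc = np.concatenate(([1400.0], np.arange(19) * 5.0 + 1904))
valadj = np.array([0., 0., 0., 0., 0., -0.1, -0.25, -0.3, 0., -0.1,
                   0.3, 0.8, 1.2, 1.7, 2.5, 2.6, 2.6, 2.6, 2.6, 2.6]) * 0.75

# Interpolate the adjustment to yearly resolution (np.interp standing in
# for IDL's interpol - an assumption, not the quoted code itself)
years = np.arange(1904, 1995)
yearlyadj = np.interp(years, yrloc, valadj)

# Sample adjustments: slightly negative mid-century, strongly positive late
print(round(float(yearlyadj[1934 - 1904]), 3))  # -0.225
print(round(float(yearlyadj[1954 - 1904]), 3))  # 0.6
print(round(float(yearlyadj[1994 - 1904]), 3))  # 1.95
```

The schedule dips to -0.225 around 1934 and climbs to a flat +1.95 from 1974 through 1994 - the mid-century depression and late-century boost described above.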

And the former apparently wasn’t a particularly well-guarded secret, although the actual adjustment period remained buried beneath the surface.

Plotting programs such as data4alps.pro print this reminder to the user prior to rendering the chart: “IMPORTANT NOTE: The data after 1960 should not be used.  The tree-ring density records tend to show a decline after 1960 relative to the summer temperature in many high-latitude locations.  In this data set this ‘decline’ has been artificially removed in an ad-hoc way, and this means that data after 1960 no longer represent tree-ring density variations, but have been modified to look more like the observed temperatures.”

Others, such as mxdgrid2ascii.pro, issue this warning: “NOTE: recent decline in tree-ring density has been ARTIFICIALLY REMOVED to facilitate calibration.  THEREFORE, post-1960 values will be much closer to observed temperatures then (sic) they should be which will incorrectly imply the reconstruction is more skilful than it actually is.  See Osborn et al. (2004).”

Care to offer another explanation, Dr. Jones?

Gotcha

Clamoring alarmists can and will spin this until they’re dizzy.  The ever-clueless mainstream media can and will ignore this until it’s forced upon them as front-page news, and then most will join the alarmists on the denial merry-go-round.

But here’s what’s undeniable:  If a divergence exists between measured temperatures and those derived from dendrochronological data after (circa) 1960, then discarding only the post-1960 figures is disingenuous, to say the least. The very existence of a divergence betrays a potentially serious flaw in the process by which temperatures are reconstructed from tree-ring density.  If the proxy is bogus beyond a set threshold, then any honest man of science would instinctively question its integrity prior to that boundary.  And only the lowliest would apply a hack in order to produce a desired result.

And to do so without declaring as such in a footnote on every chart in every report in every study in every book in every classroom on every website that such a corrupt process is relied upon is not just a crime against science, it’s a crime against mankind.

Indeed, miners of the CRU folder have unearthed dozens of email threads and supporting documents revealing much to loathe about this cadre of hucksters and their vile intentions.  This veritable goldmine has given us tales ranging from evidence destruction to spitting on the Freedom of Information Act on both sides of the Atlantic. But the now irrefutable evidence that alarmists have indeed been cooking the data for at least a decade may just be the most important strike in human history.

Advocates of the global governance or financial redistribution sought by the United Nations at Copenhagen in two weeks, and the expanded domestic governance and financial redistribution sought by Liberal politicians, both substantiate their drastic proposals with the pending climate emergency predicted in the reports of the Intergovernmental Panel on Climate Change (IPCC).  Kyoto, Waxman-Markey, Kerry-Boxer, EPA regulation of the very substances of life - all bad policy concepts enabled solely by IPCC reports.  And the IPCC, in turn, bases those reports largely on the data and charts provided by the research scientists at CRU - largely from tree ring data - who just happen to be editors and lead authors of that same U.N. panel.

Bottom line:  CRU’s evidence is now irrevocably tainted.  As such—all assumptions based on that evidence must now be reevaluated and readjudicated. And all policy based on those counterfeit assumptions must also be re-examined.

Gotcha.  We’ve known they’ve been lying all along, and now we can prove it.  It’s time to bring sanity back to this debate.  It’s time for the First IPCC Reassessment Report.  See post here.

