
I’ve moaned about global warming, aka anthropogenic climate change, aka man-made global warming, off and on over the past couple of years, so this week let’s get down to business.

I’m all in favor of cost-effective (as in benefit/cost ratio), reasonable environmental policy, resource preservation, and the minimization of irreversible damage.  I wouldn’t be passionate about what I do, or spend Saturday mornings doing this rant every week, if I weren’t.  I am also a professional engineer, which carries an ethical duty to protect the public health and well-being.  Well-being includes financial soundness, which is where the cost-effectiveness thing comes in.

The Berkeley Earth Surface Temperature (BEST – cute) team recently released its analysis of readings from 40,000 land-based thermometers spanning two centuries – 1.6 billion data points.  This analysis/report has captured much hype of late.  They cleaned the data for things like readings of 5000 degrees C, Fahrenheit/Celsius mix-ups, and that sort of thing.  These are the easy fixes, but then it gets much more difficult.
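A first pass at that kind of cleaning can be sketched as follows.  To be clear, the thresholds, heuristics, and station data here are my hypothetical illustrations, not the BEST team’s actual pipeline:

```python
# Minimal sketch of first-pass data cleaning: drop physically impossible
# readings and flag stations that may be logging Fahrenheit as Celsius.
# Thresholds and data are invented for illustration.

def clean_readings(readings_c):
    """Return readings with physically impossible values dropped."""
    cleaned = []
    for t in readings_c:
        if t < -90 or t > 60:   # outside any plausible surface temperature
            continue            # e.g. the 5000 C entries get dropped here
        cleaned.append(t)
    return cleaned

def looks_like_fahrenheit(readings_c, threshold_c=35.0):
    """Crude heuristic: a station logging F as C runs implausibly hot."""
    mean = sum(readings_c) / len(readings_c)
    return mean > threshold_c   # arbitrary illustrative cutoff

raw = [12.5, 14.1, 5000.0, 13.8, 72.0]   # 5000.0 is bogus; 72.0 is suspicious
print(clean_readings(raw))               # -> [12.5, 14.1, 13.8]
```

The hard part, as noted above, isn’t this kind of filtering – it’s everything that comes after.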

If you have ever surveyed for anything with widely varying results (not yes/no or who-will-you-vote-for questions), you know there will be all kinds of gaps that need filling.  And there are bad eggs you need to discard, like, say, a sensor sitting on a parking lot as shown in the article.  There are heat island effects in urban areas, where sensors are more densely packed, and this needs to be accounted for.  Good luck with that over 200 years of societal development.  From horse and buggy, dirt streets, and no air conditioning to what we have today – various roofing materials, road and building materials, air conditioning, and you name it.  How, for the love of Pete, does one account for this?

To digress for a moment: the proponents tend to view one metric at a time in isolation rather than the whole picture, and they make points that are likely true but not relevant.  For example, Google the BEST stuff for reviews by various Johnny-come-latelies and you will see that the BEST team has proven that urban heat island effects have a negligible effect on climate temperature, and therefore that isn’t the cause of the warming planet (it’s CO2, stupid).  This misses the point.  The point is, the data set includes an enormous volume of data from these urban areas, and as I mention above, how in the heck do you account for that over decades and centuries of human development?  Ride around any city with an old guy who has lived there his entire life.  “This used to be the edge of town” (it’s now home to a shopping mall built in 1970).  “This used to be all cornfield” (now it’s a watering hole – gasoline stations, fast food restaurants, and strip malls – for the interstate that was built in 1965, and it is filled with homes as far as the eye can see).  The temperature readings in precisely the same location will be much different when that cornfield becomes a shopping cart corral on a parking lot.  What do you do with that?  Even if the thermometer has sat downtown for the past 100 years, the city’s boundary growth can have a very significant impact on its readings.

Anyway, you can talk till you’re blue in the face about calibrating the statistical model for progressive urbanization, and I’m here to tell you, it will be wrong.  It’s a guess.  Period.  One could waste his time modeling these effects for every single affected temperature station, but it would be full of assumptions tuned until the model results look right.  More on this below.

The data do not include temperatures over the oceans that cover 70% of the earth’s surface.  Further discussion is not required.  It speaks for itself.

A dirty secret: people are prejudiced toward their beliefs and desires and will tweak data until they get what they think is right or what they want.  For example, entities like utilities (sometimes, for SarbOx compliance) and government agencies (almost always) that need to show rigorous and fair evaluation of bidder proposals develop scoring breakdowns for various selection criteria.  E.g., experience is worth 20% of the total score, technical approach 30%, price 30%, and so on.  BTW, if it’s a government project you are bidding for, it doesn’t matter what they say – price is 100% of the score.
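The weighted-score arithmetic behind those breakdowns is trivial, which is part of the point – the rigor is in the formula, not the inputs.  Here is a sketch using the example weights above; the remaining 20% criterion and all the raw scores are invented:

```python
# Weighted proposal scoring, using the example breakdown from the text
# (experience 20%, technical approach 30%, price 30%).  The "other" 20%
# and the bidder's raw scores are hypothetical.

WEIGHTS = {"experience": 0.20, "technical": 0.30, "price": 0.30, "other": 0.20}

def weighted_score(scores):
    """scores: criterion -> raw score on a 0-100 scale."""
    return sum(WEIGHTS[c] * scores[c] for c in WEIGHTS)

bidder_a = {"experience": 90, "technical": 80, "price": 70, "other": 85}
print(weighted_score(bidder_a))  # 0.2*90 + 0.3*80 + 0.3*70 + 0.2*85 = 80.0
```

The formula is objective; the raw scores fed into it are anything but, which is exactly the complaint that follows.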

The proposal review team reads the proposals and may interview the finalists.  In the end, I’m here to tell you, they select the bidder they are most comfortable with, or the team that will tell them what they want to hear, or the team with the lowest price even though per the math it makes negligible difference, or the team they used last time (you won’t see “risk of switching horses” as a criterion).  When it’s time to document the scores, magically the scores work out to support the selection.  There really isn’t anything wrong with this.  It’s a combination of subjective, gut-feel, emotional, and rational factors used by humans to decide something.  Do guys use scorecards to decide whether to marry their girlfriends?  Don’t answer that.  Some probably do.  Case closed.  Anyway, the law says they need to develop scoring criteria, but the scoring is totally subjective – even the pricing part is.  Why bother?

Back to the BEST study.  The results indicate the surface temperature increased 0.9C over the past century.  Congratulations.  This is entirely believable, but it’s also far down in the grass known as uncertainty.  This is my educated guess, but arriving at a result this believable, given the unavoidable unknowns and uncertainties above, would require a lot of data scrubbing, tweaking, and model massaging until “reasonable” results pop out.  Ask any engineer who models anything.  They have an idea of what the outcomes should look like.  If they don’t, they are fired.  Assumptions, factors, inputs, boundary conditions, and equations are massaged.  Victory is declared once the results look “reasonable.”  Well, jeez, man – if you have a massive gob of data with 5000C readings and God knows what other unfounded anomalies buried in there, and mathematical/statistical models are used to bridge gaps, where does one declare victory?  0.9C sounds good to me!  Really, I can believe that.

Look, I’m not in any way belittling, or doubting for a second, the expertise of the team that did this analysis.  I have no reason to believe these guys aren’t geniuses, that they didn’t use the best methods, or even that their work isn’t technically error-free.

What about temperature-taking technology, accuracy and precision over the past 200 years?  Change a little?

But what about the oceans?

I’ve already used up my space for the week so this will have to be a series of rants because I’m not done.

The good news, in my opinion, is that judging from objective commentary and assessments of this study, there still appears to be serious and objective analysis happening.  They’re not all packing Kool-Aid for the Algore train to Guyana.

I will continue discussing anthropogenic climate change next week, unless I change my mind.  Forget the anecdotal rebuttals.  I’ll address them later.

Tidbits

Last week I beat up the Occupy Wall Street movement for not focusing on the right problem – crony capitalism: essentially, rich donors give to politicians, who then return the “investment” many times over.  It’s money laundering for campaign money.  It’s actually far worse.  George Kaiser bundled $50–$100k for the President’s campaign prior to the half-billion-dollar loan to build the Solyndra plant.  Picking a number near the middle of the reported donation range, that’s roughly a 7,000-fold return on investment – and consider he was a bundler, which means he was essentially panhandling from all his friends, so it wasn’t even all his money.  If I’m reading it correctly, Kaiser provided his own venture capital to Solyndra but put his money in front of taxpayer money, to pull it out first if the Hindenburg went down – and it did.
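The back-of-envelope math behind that figure, for anyone checking: the Solyndra loan guarantee was reported at $535 million, and the midpoint of the $50–$100k bundling range is $75k.

```python
# Back-of-envelope check of the "7,000-fold" figure in the text.
bundled_low, bundled_high = 50_000, 100_000   # reported bundling range
loan = 535_000_000                            # reported Solyndra loan guarantee
midpoint = (bundled_low + bundled_high) / 2   # 75,000
print(loan / midpoint)                        # ~7,133 -- roughly 7,000-fold
```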

Jeff Ihnen
