Bruce Williams

Variable Parameter Overload



On April 1, 2014, Scientific American published an article by Michael E. Mann entitled “Why Global Warming Will Cross a Dangerous Threshold in 2036”. Using that article, I would like to point out the biggest problem with the science of Global Warming.

My first blog entry was about Fermi’s Elephant and the danger of having too many adjustable parameters in your work. That is not to say you can’t have more than three parameters in a model; it just means that only about three of them can be freely adjusted if the result is to be valid. The unfortunate thing about Global Warming research is that some, if not most, of it suffers from the problem of too many adjustable variables.

Below I have quoted the majority of the article’s text. The parts that have no bearing on the research methods and materials were omitted simply to limit the length of this blog entry. You can read the entire article at Scientific American.

All text from the article is in italics, and all my writing is in standard font.

In the passages quoted below, take note of the words indicating that a parameter has been adjusted (estimated, approximated, assumed, scaled, prescribed) and of the parameters being adjusted. I will start with the statement that points out that the science is not “settled”.

“If the world continues to burn fossil fuels at the current rate, global warming will rise 2 degrees Celsius by 2036, crossing a threshold that many scientists think will hurt all aspects of human civilization: food, water, health, energy, economy and national security. The . . .”


The biggest reason this shows the science is not settled is that Mann says many scientists, not most scientists, let alone all of them. And as he states, those scientists only think this will have a bad effect. You will notice that he in no way states there is a consensus that there will be a bad effect.

“We employed a simple zero-dimensional Energy Balance Model (“EBM”—see references 1 through 5 below) of the form C dT/dt = S(1-a)/4 + F_GHG - A - B*T + w(t) to model the forced response of the climate to estimate natural and anthropogenic radiative forcing.”


An estimate is a guess at a value, and as such it is a somewhat inaccurate value with unknown errors, assumed by the author to be “close enough”. Other authors may or may not agree with these estimates since they are not well-established values.
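For readers who want to see just how simple this model is, here is a minimal sketch of what integrating that equation looks like, using the constants quoted in the next two passages and an invented forcing ramp. This is my own illustration, not Mann’s actual code:

```python
# A minimal sketch (not the article's code) of stepping the quoted EBM,
# C dT/dt = S(1-a)/4 + F_GHG - (A + B*T) + w(t), forward with Euler's method.
# Constants are the quoted values; the forcing ramp is a made-up example.

C = 2.08e8        # J K^-1 m^-2, effective heat capacity (quoted)
S = 1370.0        # W m^-2, solar constant (quoted)
a = 0.3           # effective surface albedo (quoted)
A = 221.3         # W m^-2, gray-body intercept (quoted)
B = 1.25          # W K^-1 m^-2, gray-body slope (quoted)

SECONDS_PER_YEAR = 3.15e7
step_years = 0.1
dt = step_years * SECONDS_PER_YEAR
years = 200

T = (S * (1 - a) / 4 - A) / B          # start at preindustrial equilibrium (~14.8 C)
for i in range(int(years / step_years)):
    t_yr = i * step_years
    F_ghg = 0.02 * t_yr                # hypothetical forcing ramp, W m^-2 per year
    dTdt = (S * (1 - a) / 4 + F_ghg - (A + B * T)) / C   # w(t) = 0, as in the article
    T += dTdt * dt

print(f"Temperature after {years} years: {T:.2f} C")
```

Everything about the planet is collapsed into five numbers; that is what “zero-dimensional” means here.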

“T is the temperature of Earth’s surface (approximated as the surface of a 70-meter-depth, mixed-layer ocean covering 70 percent of Earth’s surface area). C = 2.08 x 10^8 J K^-1 m^-2 and is an effective heat capacity that accounts for the thermal inertia of the mixed-layer ocean, but does not allow for heat exchange with the deep ocean as in more elaborate “upwelling-diffusion models” (ref. 6). S ≈ 1370 W m^-2 is the solar constant, and a ≈ 0.3 is the effective surface albedo. F_GHG is the radiative forcing by greenhouse gases, and w(t) represents random weather effects, which was set to zero to analyze the pure radiative forced response.”

“Effective heat capacity” indicates that their variable C is set to a constant, assuming some arbitrary thermal inertia of the mixed-layer ocean regardless of the actual ocean’s characteristics over the 1,162 years of the simulation. It is a very simplistic model, used apparently throughout this work, even though the earth was in the Medieval Warm Period for 300 of those years.

“Does not allow for heat exchange with the deep ocean” is a simple statement warning that the model omits a function the reader would expect, and it is good that they inform the reader of it. Unfortunately, it also means the results will not be as accurate as they could be. Another introduced error.

“The linear “gray body” approximation (ref. 3) LW = A + B*T was used to model outgoing longwave radiation in a way that accounts for the greenhouse effect. The choice A = 221.3 W m^-2 and B = 1.25 W K^-1 m^-2 yields a realistic preindustrial global mean temperature T = 14.8 °C and an equilibrium climate sensitivity (ECS) of ΔT_2xCO2 = 3.0 °C, consistent with midrange estimates by the Intergovernmental Panel on Climate Change (ref. 7). B can be varied to change the ECS of the EBM. For example, the higher value B = 1.5 W K^-1 m^-2 yields a more conservative ECS of ΔT_2xCO2 = 2.5 °C.”

“Approximation” and “estimate” are essentially the same in effect. They are somewhat inaccurate values with unknown errors, assumed by the author to be “close enough”. Other authors may or may not agree with these estimates and approximations since they are not well-established values.
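To be fair, the quoted numbers are at least internally consistent, and the reader can check them. Here is that quick arithmetic in Python (my own check, not the article’s code): at equilibrium S(1-a)/4 = A + B*T, and a CO2 doubling adds F_2x = 5.35*ln(2) W m^-2, so the ECS is F_2x/B.

```python
# Arithmetic check (mine) of the quoted gray-body numbers.
import math

S, a, A = 1370.0, 0.3, 221.3
T_pre = (S * (1 - a) / 4 - A) / 1.25       # preindustrial equilibrium with B = 1.25
F_2x = 5.35 * math.log(2)                  # ~3.71 W m^-2 per CO2 doubling

print(f"Preindustrial T with B = 1.25: {T_pre:.1f} C")   # 14.8 C, as quoted
for B in (1.25, 1.5):
    print(f"B = {B}: ECS = {F_2x / B:.1f} C")            # 3.0 C and 2.5 C, as quoted
```

Consistency is not the same as accuracy, of course; B is still a knob that can be turned.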

“Energy Balance Model Simulations. Historical Simulations: The model was driven with estimated annual natural and anthropogenic forcing over the years A.D. 850 to 2012. Greenhouse radiative forcing was calculated using the approximation (ref. 8) F_GHG = 5.35 log(CO2e/280), where 280 parts per million (ppm) is the preindustrial CO2 level and CO2e is the “equivalent” anthropogenic CO2. We used the CO2 data from ref. 9, scaled to give CO2e values 20 percent larger than CO2 alone (for example, in 2009 CO2 was 380 ppm whereas CO2e was estimated at 455 ppm). Northern Hemisphere anthropogenic tropospheric aerosol forcing was not available for ref. 9 so was taken instead from ref. 2, with an increase in amplitude by 5 percent to accommodate a slightly larger indirect effect than in ref. 2, and a linear extrapolation of the original series (which ends in 1999) to extend through 2012.”


“Approximation” and “estimated” are essentially the same in effect. They are somewhat inaccurate values with unknown errors, assumed by the author to be “close enough”. Other authors may or may not agree with these estimates and approximations since they are not well-established values.


“Scaled to give CO2e values 20 percent larger than CO2 alone” indicates that the raw data was changed to an arbitrary value the author assumed would be proper, even though no justification is given for that adjustment.
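The forcing formula itself is easy to check; a few lines reproduce the quoted 2009 numbers (reading “log” as the natural logarithm used in the standard ref. 8 expression; this is my check, not their code):

```python
# Checking the quoted 20 percent CO2e scaling and the forcing formula
# F_GHG = 5.35 * ln(CO2e / 280). My own check, not the article's code.
import math

co2_2009 = 380.0                 # ppm, quoted
co2e_2009 = 1.2 * co2_2009       # 20 percent larger -> 456 ppm, ~ the quoted 455
f_ghg = 5.35 * math.log(co2e_2009 / 280.0)
print(f"CO2e = {co2e_2009:.0f} ppm, F_GHG = {f_ghg:.2f} W m^-2")   # ~2.61 W m^-2
```

Note that the 20 percent factor goes straight into the forcing, so whatever error it carries is carried through the whole simulation.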


“Extrapolation of the original series (which ends in 1999) to extend through 2012” indicates that the data set did not cover the entire time period of the article, so they literally had to guess the missing values. Unfortunately, when extrapolating you have to assume the trend in the data will continue as it did in the past, which is a circular argument: my results are true because I chose the specific data to use, and my research shows that my choice of data was proper.
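To make concrete what that extrapolation amounts to, here is a sketch of the standard procedure with invented numbers (not their actual aerosol series):

```python
# A sketch of the kind of extrapolation being criticized: fit a straight
# line to a series that ends in 1999 and extend it through 2012.
# The aerosol forcing values here are invented purely for illustration.
import numpy as np

years = np.arange(1980, 2000)                      # series ends in 1999
forcing = -0.8 - 0.01 * (years - 1980)             # hypothetical aerosol forcing, W m^-2

slope, intercept = np.polyfit(years, forcing, 1)   # least-squares straight line
future = np.arange(2000, 2013)
extended = slope * future + intercept              # assumes the 1980-1999 trend continues

print(extended[-1])   # the "data" for 2012 is just the old trend carried forward
```

The fitted line cannot contain any information about what actually happened between 2000 and 2012; it only repeats the past.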

“Estimated past changes in solar irradiance were prescribed as a change in the solar constant S whereas forcing by volcanic aerosols was prescribed as a change in the surface albedo a. Solar and volcanic forcing were taken from the General Circulation Model (GCM) simulation of ref. 3 described in the section above, with the following modifications: (1) solar forcing was rescaled under the assumption of a 0.1 percent change from Maunder Minimum to present, more consistent with recent estimates (ref. 9); (2) volcanic forcing was applied as the mean of the latitudinally varying volcanic forcing of ref. 9; (3) values for both series were updated through 2012.”


“Approximation” and “estimated” are essentially the same in effect. They are somewhat, or possibly completely, inaccurate values with unknown errors, assumed by the author to be “close enough”. Other authors may or may not agree with these estimates and approximations since they are not well-established values.


“Simulation” in this case implies that one must accept that the GCMs put forward by the IPCC can actually predict accurately in and of themselves. For a gross-accuracy case they do predict something, but nothing that is within what I would consider an accurate estimate. And herein lies the problem: what is accurate enough? I can calculate a value with an error range 2 to 3 times the value itself, which essentially makes it useless. I can also have a calculated value with an error of 0.1 percent of the calculated value, and that would be acceptable. The IPCC’s GCMs are in the first category, not the accurate 0.1 percent range.

“Future Projections: For the purpose of the “business as usual” future projections, we have linearly extrapolated the CO2 radiative forcing forward to 2100, based on the trend over the past decade (which is roughly equivalent, from a radiative forcing standpoint, to a forward projection of the exponential historical trajectory of CO2 emissions). We assume constant solar output, and assume no climatically significant future volcanic eruptions.”


“Linearly extrapolated the CO2 radiative forcing forward to 2100” - the problem is that linearly extrapolating the CO2 forcing, when they are fully aware that CO2 forcing is a logarithmic function and not a linear one, is going to give you an inaccurate result. In this range the difference is relatively small, but if you accumulate enough small errors you wind up with a totally inaccurate result.
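A rough way to put a size on that difference, using assumed numbers of my own (a 400 ppm starting point and a 2.1 ppm-per-year growth rate, neither from the article):

```python
# Rough comparison (my own, hypothetical numbers): carry the current forcing
# slope forward linearly vs. recompute forcing from the log formula, assuming
# CO2 itself keeps rising at a steady 2.1 ppm per year.
import math

c0, year0 = 400.0, 2015          # assumed starting CO2 (ppm) and year
growth = 2.1                     # assumed ppm per year

def forcing(c):                  # the quoted log formula
    return 5.35 * math.log(c / 280.0)

# Linear extrapolation: carry forward today's slope, dF/dt = 5.35 * c'/c.
f_lin = forcing(c0) + (2100 - year0) * 5.35 * growth / c0
# Recomputed: apply the log formula to the extrapolated concentration.
f_log = forcing(c0 + (2100 - year0) * growth)

print(f"linear: {f_lin:.2f} W m^-2, log: {f_log:.2f} W m^-2, "
      f"difference: {f_lin - f_log:.2f} W m^-2")   # ~0.4 W m^-2 by 2100
```

Under these assumptions the gap is around 0.4 W m^-2 by 2100, modest on its own, which is exactly the point: it is one more small error stacked on the others.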


“Roughly equivalent” is self-explanatory as far as accuracy goes.


“Assume” in these two cases is just an indication that they are going to ignore any possible changes in these two parameters. Of course, reality is quite different from assuming that something is not going to change. Quite honestly, even the smoothed solar output varies considerably during each sunspot cycle, and volcanic eruptions are unpredictable, so it is strictly a guess as to whether they will or will not influence the result.

“We have assumed that tropospheric aerosols decrease exponentially from their current values with a time constant of 60 years. This gives a net anthropogenic forcing change from 2000 to 2100 of 3.5 W m^-2, roughly equivalent to the Intergovernmental Panel on Climate Change’s 5th assessment report “RCP6” scenario, a future emissions scenario that assumes only modest efforts at mitigation.”


“Assumed” in this case appears to be a valid assumption, but it is still just an assumption, which is one more variable that in reality can move their final value.
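The quoted aerosol assumption is at least easy to state exactly. Here is what a 60-year time constant does, using an assumed present-day value of -1.0 W m^-2 (my number, not theirs):

```python
# The quoted aerosol assumption in one line: exponential decay with a
# 60-year time constant. The starting value is hypothetical.
import math

a0 = -1.0                                   # assumed aerosol forcing today, W m^-2
for t in (0, 30, 60, 100):                  # years from now
    print(t, round(a0 * math.exp(-t / 60.0), 3))
# After one time constant (60 years) only ~37 percent remains;
# after 100 years, ~19 percent.
```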

“Stabilization Scenarios: For the stabilization scenarios, we relax (with a 20-year time constant) the CO2 concentration to a maximum specified value at 2100. We considered cases both where the anthropogenic tropospheric aerosol burden is assumed to (a) decrease exponentially from their current values with a time constant of 60 years, as in the future projections discussed in the previous section, and alternatively (b) remain constant at its current value.”


“Assumed . . . assumed” in these two cases indicates that they are yet again going to ignore any possible changes in these parameters other than the two cases they have chosen. Of course, reality is quite different from assuming that something is going to be exactly what you chose.

“Additional Details. Sensitivity analyses of the historical simulations were performed by Mann et al 2012 (ref. 4) with respect to (i) the equilibrium climate sensitivity assumed (varied from ΔT_2xCO2 = 2-4 °C); (ii) the solar scaling (0.25 percent in place of 0.1 percent Maunder Minimum to present change); (iii) the volcanic aerosol loading estimates used; and (iv) the scaling of volcanic radiative forcing with respect to aerosol loading to account for possible size distribution effects. All of the alternative choices described above were found to yield qualitatively similar results.”


During their sensitivity analysis, does “qualitatively similar” equate to an error range of 1 percent or an error range of 1,000 percent? That depends entirely on the judgment of the people who performed the analysis, and there is no stated limit on what they can accept. Since they give no specific value or values, there is no reason to believe the results were similar enough to qualify as what the majority of scientists would consider a good result.
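To put a scale on that question, here is a back-of-the-envelope sweep over the ECS range from item (i), using the equilibrium relation from earlier. This is my own illustration, not their actual analysis:

```python
# A sketch of what the ECS range in item (i) implies in equilibrium terms:
# the same forcing produces noticeably different warming across 2-4 C.
import math

F = 3.5                                    # W m^-2, the quoted 2000-2100 net forcing
F_2x = 5.35 * math.log(2)                  # forcing per CO2 doubling

for ecs in (2.0, 3.0, 4.0):                # the quoted sensitivity range, in C
    B = F_2x / ecs                         # gray-body slope implied by that ECS
    print(f"ECS = {ecs} C -> equilibrium warming = {F / B:.1f} C")
# The spread runs from roughly 1.9 C to 3.8 C, a factor of two, which is
# why "qualitatively similar" needs a stated tolerance to mean anything.
```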


“Why Global Warming Will Cross a Dangerous Threshold in 2036” is a completely improper title for this work. It implies that the results have a high degree of scientific certainty, which they most assuredly do not. A proper title would have been something on the order of “Why Global Warming May Cross a Possibly Dangerous Threshold in 2036”. With that title, the reader and the media would be properly informed that the results of this work are very tentative and are not to be taken as some kind of verified truth, as is implied by the word “Will” in the original title.


From a personal perspective, I have some doubts about Mr. Mann’s ability to take a neutral stance on anything. I base this on two specific items: his trouble with the “hockey stick” incident and his disregard for the truth when dealing with the courts.


When Mr. Mann produced his famous “hockey stick” graph, he received a lot of pushback from various sources, including critics. The majority of the complaints arose from the fact that in the article he published he used “adjusted” temperature data and eliminated the Medieval Warm Period. His reaction to the criticisms was not to challenge his detractors and let the scientific community determine the credibility of his work; instead, he decided to sue people in civil court. Whether those criticisms were true or not, this creates the appearance of using money and the courts to decide the science.


The latest IPCC reports have since reinstated the temperatures of the Medieval Warm Period, which, as it turns out, was a global phenomenon consisting of varying degrees of warming in different places, with the overall temperature of the earth actually higher than it is today. As it turned out, Mr. Mann had made errors in his “assumptions” and choice of data. So now the scientific community is stuck with the fact that one of the earliest prominent supporters of global warming decided to use the courts to settle a scientific question that turned out to be an error in Mr. Mann’s own adjustments. This is a black eye both for the scientific community, for not challenging the results, and for Mr. Mann, for egregious data modification and for trying to use the courts to settle the science.


Even beyond this, Mr. Mann showed that he either has no idea how the Nobel Prize works, or he knows how the award works and tried to fool the court, or he is so naïve as to believe that when the Nobel Prize is awarded to an organization, all the people in or associated with that organization also receive the Nobel Prize, which is ridiculous.


Mr. Mann, in his lawsuit, submitted that: “2. Dr. Mann is a climate scientist whose research has focused on global warming. Along with other researchers, he was one of the first to document the steady rise in surface temperature during the 20th century and the steep increase in measured temperatures since the 1950s. As a result of this research, Dr. Mann and his colleagues were awarded the Nobel Peace Prize.”


The truth of the matter is that the wording from the Nobel committee for the 2007 Nobel Peace Prize is: Intergovernmental Panel on Climate Change (IPCC) . . . “for their efforts to build up and disseminate greater knowledge about man-made climate change, and to lay the foundations for the measures that are needed to counteract such change.”


And the issued diploma picture from the Nobel Prize website is: [image: the 2007 Nobel Peace Prize diploma issued to the IPCC]

But what Mr. Mann submitted to the court was: [image: the IPCC certificate of contribution bearing Mr. Mann’s name]

That is not a Nobel Peace Prize. It is a recognition from the IPCC to Mr. Mann for contributing to the work that allowed the IPCC to receive the Nobel Peace Prize, similar to a thank-you from any large corporation. The thank-you from the IPCC to Mr. Mann had the Nobel Peace Prize diploma copied at the top, in the same manner any corporation would include a copy of an award the company had received. And as with most large corporate accomplishments, there were thousands of people who contributed to the IPCC’s effort in getting the Nobel Peace Prize, and not one of them individually received the Nobel Peace Prize.


It must be noted that the 2007 Nobel Peace Prize is listed as having been given to two recipients, the IPCC and Al Gore, both of whom received separate diplomas. If Mr. Mann had actually received the Nobel Peace Prize, he would have received a separate diploma from the Nobel committee with his name on it, which he did not.


It is quite disturbing that a magazine like Scientific American, knowing how little regard Mr. Mann has for the truth-seeking of the courts, would continue to support his efforts. But then Scientific American is a for-profit enterprise, and like all for-profit enterprises it must decide whether to make more money through an article’s headline or to disregard the article based on the trustworthiness of the author. And therein lies the problem: pure capitalism does not require truth, and most papers are published through profit-making companies and presented to the public by profit-making media.

#Global Warming, #Climate Change, #Environmental Issues, #CO2, #Carbon Footprint, #Renewable Energy Sources, #Weather
