Dog

Climate news


23 hours ago, duncan (the other one) said:

Then the answer to your question is yes, and you will not find many arguing against that.

A more important question for useful idiots such as yourself to answer is by how much?

I can see how much, it is being measured very accurately.

2016-12_p14.png

13 hours ago, random said:

Been sailing.  Your points are weasel words. 

$5bn subsidy a year to fossil fuels in Australia, what don't you get about that?

How many subsidies to brewers, growers, builders?

2 hours ago, warbird said:

How many subsidies to brewers, growers, builders?

You don't know?

16 hours ago, random said:

I can see how much, it is being measured very accurately.

2016-12_p14.png

I do not see a +/- (X × 10^21) error range. What is the accuracy of the measuring devices involved?

1 hour ago, warbird said:

I do not see a +/- (X × 10^21) error range. What is the accuracy of the measuring devices involved?

They have the accuracy required by the specialists responsible. Please contact them directly to ask.

More to the point, over time even less accurate devices would still detect the trend shown.

You don't know jack about this stuff, WarpedBird

3 hours ago, warbird said:

I do not see a +/- (X × 10^21) error range. What is the accuracy of the measuring devices involved?

Over what range?  When you know the difference between a thermometer and a bolometer,  come back and we can talk.

1 hour ago, Laker said:

Over what range?  When you know the difference between a thermometer and a bolometer,  come back and we can talk.

I do. NOAA does not have bolometers in the sea surface fleet. The sea surface fleet is the bulk of global surface temp sensors. Those sensors have an accuracy of +/-1 °C (a 2 °C swing).

16 minutes ago, Laker said:

They don't?  I don't want to blow my cover, but I can assure you that triple-point referenced bolometers are used.

Good, what percent of the fleet? NOAA specs page states +/-1 C.


This would not be "fleet based" as you say, but part of the instrumentation packages they deploy. Every ship at some point has deployed and retrieved instruments. One of the accurate and reliable instruments in general use is based on the same principle as the depth meter, usually in the same package, that can measure ripples on the surface of the water from hundreds of feet down.

Quartz is a very stable material, to one part in 10^14, and we can measure frequency very accurately with digital counters. A column of quartz is set to resonate at a fixed frequency, usually in the megahertz range. As the column is loaded, this frequency changes. In depth meters, this is done by exposing one end of the column to ambient pressure. In the thermometer, a stable material with a high coefficient of thermal expansion is used, usually amber. The high-expansion material strains the low-expansion quartz and causes a change in frequency.

The result is a thermometer that is accurate to within at least one part in 10^6, and over the -20 to +40 C range being observed, an accuracy of 10^-5 degrees is entirely possible. Of course you run into problems with the definition of temperature at such accuracy, and of course there is the correction for pressure. The thought of your $1M AUV running around with a thermometer accurate to only +/-1 C is disturbing at best.
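A back-of-envelope sketch of how counter resolution turns into temperature resolution for this kind of frequency-output sensor. Every number below is an illustrative assumption, not the spec of any real instrument:

```python
# Resolution of a quartz-column thermometer read out by a digital
# frequency counter. All numbers are illustrative assumptions.

f0 = 5.0e6            # resonant frequency of the quartz column, Hz (assumed)
sensitivity = 1000.0  # frequency shift per degree C, Hz/degC (assumed)
gate_time = 10.0      # counter gate time, s (assumed)

# A simple counter resolves about +/-1 count over the gate,
# i.e. a frequency resolution of 1/gate_time.
df = 1.0 / gate_time       # Hz -> 0.1 Hz
dT = df / sensitivity      # degC -> 1e-4 degC
frac = df / f0             # fractional frequency resolution -> 2e-8

print(f"frequency resolution:   {df:.2f} Hz ({frac:.0e} fractional)")
print(f"temperature resolution: {dT:.0e} degC")
```

A longer gate time or a reciprocal counter improves the frequency resolution further, which is how figures of order 10^-5 degC become plausible; absolute accuracy is a separate calibration question.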

58 minutes ago, Laker said:

This would not be "fleet based" as you say, but part of the instrumentation packages they deploy. Every ship at some point has deployed and retrieved instruments. One of the accurate and reliable instruments in general use is based on the same principle as the depth meter, usually in the same package, that can measure ripples on the surface of the water from hundreds of feet down.

Quartz is a very stable material, to one part in 10^14, and we can measure frequency very accurately with digital counters. A column of quartz is set to resonate at a fixed frequency, usually in the megahertz range. As the column is loaded, this frequency changes. In depth meters, this is done by exposing one end of the column to ambient pressure. In the thermometer, a stable material with a high coefficient of thermal expansion is used, usually amber. The high-expansion material strains the low-expansion quartz and causes a change in frequency.

The result is a thermometer that is accurate to within at least one part in 10^6, and over the -20 to +40 C range being observed, an accuracy of 10^-5 degrees is entirely possible. Of course you run into problems with the definition of temperature at such accuracy, and of course there is the correction for pressure. The thought of your $1M AUV running around with a thermometer accurate to only +/-1 C is disturbing at best.

Glad you are here Laker.  WarpedBird has been banging on about this shot for years, showing his ignorance.

Carry on, I can't wait for him to converse with you.  Gentleman to gentleman.


This is where Warbird, as the true scientist, reconsiders. He withdraws his objections to the radical new theory since exciting new research methodology has convinced him his previous few years were an academic dead end. In fact, he abruptly withdraws his poster exploring the effect of water evaporation from slime mold on mercury thermometer readings and devotes the rest of his career to better understanding the consequences of his lifestyle on his planet. Exciting times are afoot; I eagerly await his next paper.

14 hours ago, random said:

Glad you are here Laker.  WarpedBird has been banging on about this shot for years, showing his ignorance.

Carry on, I can't wait for him to converse with you.  Gentleman to gentleman.


Still a dickhead I see! Didn't look at my NDBC link when I posted it years ago, probably still won't.

http://www.ndbc.noaa.gov/rsa.shtml

9 minutes ago, Lark said:

This is where Warbird, as the true scientist, reconsiders. He withdraws his objections to the radical new theory since exciting new research methodology has convinced him his previous few years were an academic dead end. In fact, he abruptly withdraws his poster exploring the effect of water evaporation from slime mold on mercury thermometer readings and devotes the rest of his career to better understanding the consequences of his lifestyle on his planet. Exciting times are afoot; I eagerly await his next paper.

http://www.ndbc.noaa.gov/rsa.shtml

 

15 hours ago, Laker said:

This would not be "fleet based" as you say, but part of the instrumentation packages they deploy. Every ship at some point has deployed and retrieved instruments. One of the accurate and reliable instruments in general use is based on the same principle as the depth meter, usually in the same package, that can measure ripples on the surface of the water from hundreds of feet down.

Quartz is a very stable material, to one part in 10^14, and we can measure frequency very accurately with digital counters. A column of quartz is set to resonate at a fixed frequency, usually in the megahertz range. As the column is loaded, this frequency changes. In depth meters, this is done by exposing one end of the column to ambient pressure. In the thermometer, a stable material with a high coefficient of thermal expansion is used, usually amber. The high-expansion material strains the low-expansion quartz and causes a change in frequency.

The result is a thermometer that is accurate to within at least one part in 10^6, and over the -20 to +40 C range being observed, an accuracy of 10^-5 degrees is entirely possible. Of course you run into problems with the definition of temperature at such accuracy, and of course there is the correction for pressure. The thought of your $1M AUV running around with a thermometer accurate to only +/-1 C is disturbing at best.

http://www.ndbc.noaa.gov/rsa.shtml

 

1 minute ago, warbird said:

Wave period is sensitive to one second, plus or minus one second. How can they possibly figure out Great Lakes waves if they are off by that much? I wouldn't even bother checking the buoy data. Except, as I pointed out a while back and Raz'r explained in some detail, statistics are like magic (my words): uncertainty from a field of data points is smaller than the measurement precision. See the examples on pp. 21 and 22. Even I can follow these, and statistics seems like magic to me. As we tried to explain a while back, your argument is wrong even without Laker's professed knowledge of what sensors are being used.
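The "uncertainty from a field of data points" claim is just the standard error of the mean. A toy simulation (all numbers assumed, no real buoy data) shows a fleet of +/-1 °C sensors pinning down the average far more tightly than any single sensor:

```python
import numpy as np

# Standard-error-of-the-mean sketch: each simulated sensor is only good
# to about +/-1 degC of random error, yet the average of many readings
# tightens as 1/sqrt(n). Toy numbers throughout.
rng = np.random.default_rng(0)
true_temp = 15.0   # assumed true value, degC
sigma = 1.0        # per-sensor random error, degC

for n in (1, 100, 10_000):
    readings = true_temp + rng.normal(0.0, sigma, size=n)
    print(f"n={n:6d}  mean={readings.mean():7.3f}  "
          f"expected s.e. ~ {sigma / np.sqrt(n):.3f}")
```

With 10,000 independent readings the expected standard error is 0.01 °C, a hundred times better than one sensor, which is the whole point being argued.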

25 minutes ago, Lark said:

Wave … Except, as I pointed out a while back and Raz'r explained in some detail, statistics are like magic (my words): uncertainty from a field of data points is smaller than the measurement precision. See the examples on pp. 21 and 22. Even I can follow these, and statistics seems like magic to me.

That is only valid if the measurement inaccuracies are random.

3 minutes ago, warbird said:

That is only valid if the measurement inaccuracies are random.

Are they not?   
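The distinction being circled here can be shown in a few lines: averaging beats down random error but leaves a shared calibration bias untouched. A toy simulation (the 0.5 °C bias and all other numbers are assumed values, not properties of any real network):

```python
import numpy as np

# Random vs systematic error: averaging kills the first, not the second.
rng = np.random.default_rng(1)
true_temp = 15.0
n = 100_000

random_err = rng.normal(0.0, 1.0, size=n)   # independent +/-1 degC noise
bias = 0.5                                   # shared calibration offset (assumed)

mean_random = (true_temp + random_err).mean()          # ~15.0
mean_biased = (true_temp + bias + random_err).mean()   # ~15.5

print(f"random errors only: {mean_random:.3f}")
print(f"with shared bias  : {mean_biased:.3f}")
```

So the averaging argument holds for random error; systematic offsets have to be handled by calibration and homogenization instead, which is why they get so much attention in the literature.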

1 hour ago, warbird said:

So.....you don't know. Thus you cannot presume.

WarpBird has been thoroughly reamed by science.  Funny as shit to watch.


Warble should call NOAA and let them know their data is all wrong. Because I'm sure none of the thousands of people (scientists, engineers, technicians) working in the climate field over the past hundred years have ever thought about measurement accuracy. Why would they? It's a huge oversight and thank god WB was here to catch it.

Thanks WB, you may have just saved us from saving the world.

dial-meat-thermometer-23-401-2.jpg


Interesting that the surface temperature accuracies quoted are the same for different buoys. You can go to web pages like Sea-Bird Scientific for the specs for the sort of temperature probes that would be used, and they state an accuracy of 25 microkelvin. The values for the ADCP are reasonable, if on the conservative side. The accuracy numbers for the waverider buoys may be OK because of the issues with mixing of air and water temperatures at the interface. Perhaps they take the same approach with all the floating buoys. They are measuring, and stating they are measuring, surface temperature, which is not really valid in the situations we are talking about.

 

On 12/22/2017 at 7:00 PM, Laker said:

Interesting that the surface temperature accuracies quoted are the same for different buoys. You can go to web pages like Sea-Bird Scientific for the specs for the sort of temperature probes that would be used, and they state an accuracy of 25 microkelvin. The values for the ADCP are reasonable, if on the conservative side. The accuracy numbers for the waverider buoys may be OK because of the issues with mixing of air and water temperatures at the interface. Perhaps they take the same approach with all the floating buoys. They are measuring, and stating they are measuring, surface temperature, which is not really valid in the situations we are talking about.

 

...yet, NOAA data is based on those platforms I listed....

On December 20, 2017 at 6:09 AM, slug zitski said:

Extractive industries...minerals...oil.....will always need special treatment.

 

a company may prospect and develop a potential resource at great expense, then find out that the resource is not economically extractable or that market prices have collapsed and the investment wasted.

i have no problem with policies that help these industries. 

I'm not sure I follow. I own an oil and gas E&P company, and we do fine without subsidies. Our biggest problem is regulatory overreach and enforcement by bureaucrats who usually have no industry experience and usually very little industry specific knowledge.

our tax flow has always gone one way, me to the government. 

Your second paragraph is a pretty good description of dry hole risk. Yes, if we find an uneconomic resource, and the host government decides, for whatever reason, that it is in the national interest to develop the resource anyways, then there will have to be some sort of support.  I happen not to like those deals. I, and I think most oil guys, like my profit clean and simple. Find for X, develop for Y, produce for Z, sell for (X+Y+Z)x2. We've been pretty lucky at that. Of course, I only operate in the US.

As an aside, I don't post here often, but occasionally read these threads. You are quite a breath of fresh air in what is normally a grim, self-congratulatory groupthink. Thank you.


I know little of the industry. I do read that in my region there is a valuable offshore gas field. The big oil companies are ready to work. The issue is transporting the gas into the present gas distribution pipeline network. Evidently the oil companies will not proceed unless they receive government help.. taxpayers' money, political support... for onshore infrastructure.

i would imagine that you are the same... ports, transport links...

long term investments that need government guarantees, political approval.

as a taxpayer i view this as in the public interest

7 hours ago, warbird said:

...yet, NOAA data is based on those platforms I listed....

Interesting, isn't it?  You would think those darn scientists didn't know what they were doing.

3 hours ago, Laker said:

Interesting, isn't it?  You would think those darn scientists know who is paying their way...

 

FIFY

10 minutes ago, warbird said:

FIFY

Those rich universities drowning out the poor fossil fuel companies with fake science?   

https://splinternews.com/how-fossil-fuel-money-made-climate-denial-the-word-of-g-1797466298

Conservative groups, funded by fossil fuel magnates, spend approximately one billion dollars every year interfering with public understanding of what is actually happening to our world. Most of that money—most of the fraction of it that can be tracked, anyway—goes to think tanks that produce policy papers and legislative proposals favorable to donors’ interests, super PACs that support politicians friendly to industry or oppose those who are not, or mercenary lobbyists and consultants, in some instances employing the same people who fought to suppress the science on smoking. In terms of impact, however, few investments can rival the return that the conservative donor class has gotten from the small cohort of evangelical theologians and scholars whose work has provided scriptural justifications for apocalyptic geopolitics and economic rapaciousness.

18 minutes ago, warbird said:

FIFY

Sarcasm is such a hard thing to do with an email.  So much so that I must remember to stop trying to do it.  The Canadians, after their revolution with the Harper government, at least offer complete access to the raw data from such things as the Ocean Observatory.

12 hours ago, Lark said:

Those rich universities drowning out the poor fossil fuel companies with fake science?   

https://splinternews.com/how-fossil-fuel-money-made-climate-denial-the-word-of-g-1797466298

Conservative groups, funded by fossil fuel magnates, spend approximately one billion dollars every year interfering with public understanding of what is actually happening to our world. Most of that money—most of the fraction of it that can be tracked, anyway—goes to think tanks that produce policy papers and legislative proposals favorable to donors’ interests, super PACs that support politicians friendly to industry or oppose those who are not, or mercenary lobbyists and consultants, in some instances employing the same people who fought to suppress the science on smoking. In terms of impact, however, few investments can rival the return that the conservative donor class has gotten from the small cohort of evangelical theologians and scholars whose work has provided scriptural justifications for apocalyptic geopolitics and economic rapaciousness.

Tin foil hat?

On 12/22/2017 at 9:45 AM, warbird said:

I do not see a +/- (X × 10^21) error range. What is the accuracy of the measuring devices involved?

You don't know?

5 hours ago, warbird said:

 

OK, let's assume the temperatures stay the same from now on. How much air will you have to put in your tires to offset the extra weight from the load being cold?

On 12/20/2017 at 4:12 AM, random said:

You have led a sheltered life Dunc.  You have no fucking idea do you?

In the US

"(MISI) estimated the total historical federal subsidies for various energy sources over the years 1950–2010. The study found that oil, natural gas, and coal received $369 billion, $121 billion, and $104 billion (2010 dollars), respectively, or 70% of total energy subsidies over that period."

in Australia,

Australian coal, oil and gas companies receive $4b in subsidies: report

A new report finds exploration by coal and energy companies is subsidised by Australian taxpayers by as much as $US3.5 billion ($4 billion) every year in the form of direct spending and tax breaks.

Hmmm, they get 70% of the subsidies and produce 90% of the energy.

Sounds like they are getting screwed

energy_consumption_by_source_large.jpg


Despite a cold ending to 2017, it was the warmest year on record in Austin. And don't be fooled by this brief cold spell--global warming is alive and well. Austin's 5 hottest years have ALL occurred since 2006. Austin's records date to the 1890s.

3D4A7DC4-4D8A-4D2F-98D2-818E04B9D618.jpeg

9 hours ago, Gouvernail said:

Despite cold ending to 2017, it was the warmest year on record in Austin. And, don't be fooled by this brief cold spell--global warming is alive and well. Austin's 5 hottest years have ALL occurred since 2006. Austin's records date to 1890s.

3D4A7DC4-4D8A-4D2F-98D2-818E04B9D618.jpeg

Have you considered plotting the growth in the population in the area against the temperatures?  It looks like your 72.0 to 72.1 is about a 0.14% increase.  In that same time frame, Austin added over 100K in population.  The suburban area expanded as well.

All those new folks have arrived and want air-conditioned houses, which pump the warm air from inside out to where the thermometers are.

Move out to the less built-up area of Elgin and the annual average is in the 60s.


So @Saorsa is claiming, at least in the Austin area, warming is entirely driven by human activity.

Perhaps a similar warming influence from every city in the world could be a factor in the general warming of the planet??!!!

The globe is mighty large but there are billions of these humans each of whom is contributing a little.

What if all the carbon based fuels being burned by those billions are causing changes to the atmosphere?

What if fumes from that burning change how the  atmospheric blanket around the globe absorbs or radiates energy?? 

What if the equilibrium temperature is being altered? 

What if the globe is so big, the forces of change can be operating for hundreds of years before the changes become detectable?

What  if those forces of change are irreversible or take many centuries to “turn off.”?????

 

WOW!! Those thoughts are certainly upsetting. Let’s not think about it!!! 

 

 

7 minutes ago, Gouvernail said:

So @Saorsa is claiming, at least in the Austin area, warming is entirely driven by human activity. 

Perhaps it is true similar warming influence by every  city in the world could possibly be a factor in the general warming of the planet?? !!!

The globe is mighty large but there are billions of these humans each of whom is contributing a little.

What if all the carbon based fuels being burned by those billions are causing changes to the atmosphere?

What if fumes from that burning change how the  atmospheric blanket around the globe absorbs or radiates energy?? 

What if the equilibrium temperature is being altered? 

What if the globe is so big, the forces of change can be operating for hundreds of years before the changes become detectable?

What  if those forces of change are irreversible or take many centuries to “turn off.”?????

 

WOW!! Those thoughts are certainly upsetting. Let’s not think about it!!! 

 

 

Now, shitwit, where did I say entirely?

I've never said the world isn't getting warmer, just that none of the idiotic proposals to date will make one bit of difference.

After Greenies denied the idea for years even the EPA now recognizes urban heat islands.

As long as you keep making the island bigger, the temperature will increase.

You're the dunce that picked a rapidly growing urban center for your example.  There's a clue in that old saying:

THINK GLOBAL ACT LOCAL.

 

18 minutes ago, Gouvernail said:

So @Saorsa is claiming, at least in the Austin area, warming is entirely driven by human activity. 

Perhaps it is true similar warming influence by every  city in the world could possibly be a factor in the general warming of the planet?? !!!

The globe is mighty large but there are billions of these humans each of whom is contributing a little.

What if all the carbon based fuels being burned by those billions are causing changes to the atmosphere?

What if fumes from that burning change how the  atmospheric blanket around the globe absorbs or radiates energy?? 

What if the equilibrium temperature is being altered? 

What if the globe is so big, the forces of change can be operating for hundreds of years before the changes become detectable?

What  if those forces of change are irreversible or take many centuries to “turn off.”?????

 

WOW!! Those thoughts are certainly upsetting. Let’s not think about it!!! 

 

 

You could use that case to stop all economic activity whatsoever.

What if the sun god is displeased because we haven't sacrificed a virgin in a while?  Is it really worth the risk?

Just now, jzk said:

You could use that case to stop all economic activity whatsoever.

What if the sun god is displeased because we haven't sacrificed a virgin in a while?  Is it really worth the risk?

The supply of virgins is being depleted by Islamic martyrs.

 

10 hours ago, By the lee said:

Two good examples.  The first shows that if you stop polluting, the earth will wipe its ass.

The second shows the relationship between ocean O2 levels and nitrogen fertilizers, which make feeding the world cheaper.

Conservation is still the key

31 minutes ago, warbird said:

The front is in the picture......

The sun rises in the west on your world?

Rotating_globe.gif

14 hours ago, Ishmael said:

For one horrible moment I thought those were pictures of my eyeballs.

Colored floaters!

On January 5, 2018 at 10:07 AM, Saorsa said:

The supply of virgins is being depleted by Islamic martyrs.

 

Now hold on just a second, Wilbur!  (If I may call you Wilbur.  Just pretend I'm Mr Ed, just for a leetle bit....)   The virgins that the Martyrs do whatever they do with are in heaven, right?  And that means everyone in heaven has expired? (How old are these virgins?)  Or are they just on loan to heaven, but when they get back to earth they are officially undeflowered?  Or is the deflowering retroactively withdrawn?  (eewww....)  Or maybe all the females who go to heaven become virgins?   Perhaps ozone is caused as a byproduct of any type of virgin churning?  I call this the Pence Multiphasic No Man Made Global Warming Virgin Churning Theory.  Or maybe the Martyrs, never having tasted alcohol, get really shit faced for the very first time, and just think the 'virgins' are virgins?

 

 

2 hours ago, Amati said:

Now hold on just a second, Wilbur!  (If I may call you Wilbur.  Just pretend I'm Mr Ed, just for a leetle bit....)   The virgins that the Martyrs do whatever they do with are in heaven, right?  And that means everyone in heaven has expired? (How old are these virgins?)  Or are they just on loan to heaven, but when they get back to earth they are officially undeflowered?  Or is the deflowering retroactively withdrawn?  (eewww....)  Or maybe all the females who go to heaven become virgins?   Perhaps ozone is caused as a byproduct of any type of virgin churning?  I call this the Pence Multiphasic No Man Made Global Warming Virgin Churning Theory.  Or maybe the Martyrs, never having tasted alcohol, get really shit faced for the very first time, and just think the 'virgins' are virgins?

 

 

Just draw pretty faces on the Martyrs' backs and you'll have an undulating line of happy drunks.


Globally, 2017 likely to be the second- or third-warmest year on record

The estimated global mean temperature for 2017 (January–November) was 0.77 ± 0.09 °C above the 1961–1990 average, and it is likely 2017 will be the second- or third-warmest year on record since 1850. The warmest two years currently are 2016 (+0.87 °C) and 2015 (+0.76 °C), records that were assisted by a strong El Niño. Conversely, the exceptional warmth of 2017 has occurred in the absence of El Niño.

Global temperatures have increased by just over one degree since the pre-industrial period, and all of the ten warmest years on record have occurred between 1998 and the present. No year since 1985 has observed a global mean temperature below the 1961–1990 average.

image.png.219edc90f8e5a40f3a489c2d2af82ec0.png
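For what it's worth, the "second- or third-warmest" hedge follows directly from the quoted numbers: with the stated ±0.09 °C uncertainty, 2017 (+0.77 °C) is statistically indistinguishable from 2015 (+0.76 °C) but just distinguishable from 2016 (+0.87 °C). A few lines make this concrete (applying the 2017 error bar to both comparisons is my simplifying assumption):

```python
# Ranking 2017 with the anomalies and uncertainty quoted above.
a_2017, u_2017 = 0.77, 0.09   # degC above 1961-1990, +/- uncertainty
a_2015, a_2016 = 0.76, 0.87

vs_2015_ambiguous = abs(a_2017 - a_2015) <= u_2017  # within the error bar
vs_2016_ambiguous = abs(a_2016 - a_2017) <= u_2017  # just outside it

print(f"2017 vs 2015 indistinguishable: {vs_2015_ambiguous}")
print(f"2017 vs 2016 indistinguishable: {vs_2016_ambiguous}")
```

Hence the careful wording: 2017 is clearly below 2016, but the 2015 comparison sits inside the uncertainty, so "second- or third-warmest" is all the data supports.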

3 hours ago, random said:

Globally, 2017 likely to be the second- or third-warmest year on record

The estimated global mean temperature for 2017 (January–November) was 0.77 ± 0.09 °C above the 1961–1990 average, and it is likely 2017 will be the second- or third-warmest year on record since 1850. The warmest two years currently are 2016 (+0.87 °C) and 2015 (+0.76 °C), records that were assisted by a strong El Niño. Conversely, the exceptional warmth of 2017 has occurred in the absence of El Niño.

Global temperatures have increased by just over one degree since the pre-industrial period, and all of the ten warmest years on record have occurred between 1998 and the present. No year since 1985 has observed a global mean temperature below the 1961–1990 average.

image.png.219edc90f8e5a40f3a489c2d2af82ec0.png

Adjustments to the raw data.

2cwn0ww.jpg

On 1/5/2018 at 1:03 PM, Saorsa said:

Now, shitwit, where did I say entirely?

I've never said the world isn't getting warmer, just that none of the idiotic proposals to date will make one bit of difference.

After Greenies denied the idea for years even the EPA now recognizes urban heat islands.

As long as you keep making the island bigger, the temperature will increase.

You're the dunce that picked a rapidly growing urban center for your example.  There's a clue in that old saying

THINK GLOBAL ACT LOCAL.

 

Goodness, I would like to see evidence of this denial. We certainly talked about urban heat islands as undergrad geography students in the late 1960s. I do not agree that warmer temps in one city matter much in this debate. In fact, the number of warmest years in Austin seems less than for the whole world.

1 hour ago, Bristol-Cruiser said:

Goodness I would like to see evidence of this denial. We certainly talked about urban heat islands as an undergrad geography student in the late 1960s. I do agree that saying that warmer temps in one city matters in this debate. In fact, the number of warmest years in Austin seems less than for the whole world.

Well, google 'urban heat island myth' and look at the results from the early 21st century.

Here is one reference from Greenpeace.

 

7 hours ago, Dog said:

Adjustments to the raw data.

2cwn0ww.jpg

I got my graph from BOM.

You got yours from people who lie for a living.


New developments in settled science...

"Princeton University researchers have found that the climate models scientists use to project future conditions on our planet underestimate the cooling effect that clouds have on a daily — and even hourly — basis, particularly over land".

https://environment.princeton.edu/news/spotty-coverage-climate-models-underestimate-cooling-effect-daily-cloud-cycle


New developments in settled science continued...

It's often claimed that the current rate of warming is unprecedented and that it must therefore be anthropogenic, and further that widespread extinctions will result from this rate of warming. However, a new study published in Nature, using noble gases in ice cores as a proxy, finds oceanic heat uptake about 12,000 years ago exceeded today's rate of warming. No word yet on whether the authors will face criminal charges for publishing their findings.

https://www.nature.com/articles/nature25152

On 1/15/2018 at 1:03 AM, Dog said:

"Princeton University researchers have found that the climate models scientists use to project future conditions on our planet underestimate the cooling effect that clouds have on a daily — and even hourly — basis, particularly over land".

So please tell us how they 'found' something that has not happened yet?

Did they use a 'model'?

3 hours ago, random said:

So please tell us how they 'found' something that has not happened yet?

Did they use a 'model'?

What do you mean, not happened yet? The cooling effect of clouds discussed in the article has been happening all along. It just hasn't been factored into the climate models, which we know to be running hot. Think there might be a connection?


Just to chime in with one of my personal issues.

http://www.theenergycollective.com/edfenergyex/2419654/nasa-study-underscores-urgency-solving-global-methane-problem

"A growing number of leading global oil and gas companies including BP, Exxon and Shell have embraced methane reductions as a priority, while others have pledged to a near zero methane emissions future."

"Oil and gas methane is a significant global problem, but it is also a problem with a relatively simple solution. The International Energy Agency singled out methane as a central business issue for oil and gas companies, concluding that the industry can reduce its worldwide emissions by 75 percent – and that up to two thirds of those reductions can be realized at zero net cost. Further, IEA says that just the no net cost reductions would have the same climate impact in 2100 as immediately closing all the coal plants in China."

 

On 12/22/2017 at 7:00 PM, Laker said:

Interesting that the surface temperature accuracies quoted are the same for different buoys.  You can go to web pages like Sea-Bird Scientific for the specs to the sort of temperature probes that would be used and they would state an accuracy of 25 micro Kelvin.  The values for the ADCP are reasonable, if on the conservative side.  The accuracy numbers for the waverider buoys may be OK because of the issues with mixing of air and water temperatures at the interface.  Perhaps they take the same approach with all the floating buoys.  They are measuring and stating they are measuring surface temperature, which is not really valid in the situations we are talking about.   

 

Not looking for a shit fight.  Honest question:

If the instruments are truly that good (I have no reason to doubt the accuracy), then why do the scientists "adjust" their findings?  I honestly do not have the time to get into it right now, but I always hear about actual data vs. revised data.  I figure either you or Lark are the best here to answer that question.  Also, I like the idea of using thermal expansion as a sensor.  

I am a skeptic, not a "denier."  To me the science is not settled.  I draw that conclusion based on the repeated failed predictions of the same group of people over and over again.  I guess I'm getting too old to buy any bucket of shit pushed in my face, from either side.  And yes, there are sides.  Political solutions looking for a convenient problem. Get rid of the political money and I'd bet we would see a better, more believable set of both data and predictions.  Unfortunately that is not going to happen.  At least not anytime soon.

BTW: Anyone who doubts the seriousness of at least some energy companies to improve efficiencies and power consumption does not have a freaking clue.  I have been working in a Southern Company owned lab for the past two months on a project that they are hoping to prove viable.  The project is a new type of HVAC system that is water sourced and geo-assisted. It will significantly reduce the size of the ground loop for geothermal heat pump systems.  The facility is great. 

I will check back, but for the next two weeks I will be very busy with the above project.  More programming tomorrow and then another battery of tests.  I don't manipulate the results of the tests.  Good data is good data.  Learn from it, adapt and build a better machine.

41 minutes ago, plchacker said:

Not looking for a shit fight.  Honest question:

If the instruments are truly that good (I have no reason to doubt the accuracy), then why do the scientists "adjust" their findings?  I honestly do not have the time to get into it right now, but I always hear about actual data vs. revised data.  I figure either you or Lark are the best here to answer that question.  Also, I like the idea of using thermal expansion as a sensor.  

I am a skeptic, not a "denier."  To me the science is not settled.  I draw that conclusion based on the repeated failed predictions of the same group of people over and over again.  I guess I'm getting too old to buy any bucket of shit pushed in my face, from either side.  And yes, there are sides.  Political solutions looking for a convenient problem. Get rid of the political money and I'd bet we would see a better, more believable set of both data and predictions.  Unfortunately that is not going to happen.  At least not anytime soon.

BTW: Anyone who doubts the seriousness of at least some energy companies to improve efficiencies and power consumption does not have a freaking clue.  I have been working in a Southern Company owned lab for the past two months on a project that they are hoping to prove viable.  The project is a new type of HVAC system that is water sourced and geo-assisted. It will significantly reduce the size of the ground loop for geothermal heat pump systems.  The facility is great. 

I will check back, but for the next two weeks I will be very busy with the above project.  More programming tomorrow and then another battery of tests.  I don't manipulate the results of the tests.  Good data is good data.  Learn from it, adapt and build a better machine.

 

There's a lot of reasons actually.  The first is obvious - calibration.  Sensors are almost always designed to measure in regards to something else and truly steady reference conditions are always challenging, particularly when something is exposed to the environment over time (yea, sailors know that one!).  The smaller the change, the bigger issues become with reference conditions.  Imagine your typical bathroom scale.  It measures your weight by measuring the displacement in a spring, measured by either a strain gauge (measures deflection essentially) or an LVDT (linear displacement device).  But there's a lot of underlying assumptions - what's the spring constant of the material?  Are the gauges properly attached?  Is the LVDT secure?  Is your bathroom freaking cold or stupid hot or has someone put something super heavy on the scale and bent something?  Now imagine that instead of measuring your weight, I'm going to measure the weight of say, a fruit fly.  Everything has to be made MUCH more sensitive and now, even things like air currents are going to impact the measurement.  There's a whole host of analytical techniques used to verify data over time and that's what most of the 'corrections' are there to assess and compensate.

Here's an example from a totally unrelated field - long range rifle shooting (https://www.youtube.com/watch?v=jX7dcl_ERNs).  It details how the Coriolis effects - yes, the rotation of the earth - creates a measurable change in bullet accuracy at 1000 yards depending if you're shooting due east or due west.  If you want to measure accurately, you may have to account for such things, depending on what you're trying to measure.

So 'adjusted' in the positive context means that the researchers have collected the data, analyzed it against references, checked it for internal consistency, and presented their best understanding of what the data represents.  The better researchers will detail the raw data, the corrections made, and the final results.  In fact, a lot of Dept. of Energy contracts now REQUIRE the raw data be preserved for analysis later.

The biggest single problem with climate science is a lack of experimental replication and reflection as a means of removing interference.  Here's a few basic concepts from Design of Experiments.  Say you have three variables, A, B, and C, and you want to know what their values are.  To completely define them, you actually need to run 8 experiments - called a 'full factorial' test.  You need A, B, C, and then the interactions AB, AC, BC, and then ABC, plus at least one for comparison to give some sense of the spread of the data.  AB, AC, BC, and ABC are cases where your coefficients are interrelated, which you may or may not know in advance.  Now imagine that you have 5 variables.  You can learn something about those 5 variables by running 8 tests also - called a partial factorial or sometimes a 'Plackett-Burman analysis' - but you give up some information.  For example, you may have an interaction between A and C but wouldn't be able to tell because experimentally it would be superimposed by some other interaction.  If the interactions are within the noise range of the data you might not care, but what if they're not?  The way you gain such knowledge is through 'replication' and 'reflection'.  Replication just means you run the test over, and this tells you if the values you are calculating are statistically significant or if they are within the normal spread of the data.  Reflection basically means you run the opposite set of conditions.  This essentially makes interactions cancel out and tells you if you have interactions or not.  By the way, this all assumes linear functions.  If you don't have linear functions, then you need to run intermediate points to determine curvature.
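A minimal sketch of that full-factorial idea (the response function and its coefficients below are invented purely for illustration):

```python
# 2-level full factorial for three factors A, B, C: 2**3 = 8 runs
# cover every main effect plus the AB, AC, BC, and ABC interactions.
from itertools import product

factors = ["A", "B", "C"]
runs = list(product([-1, +1], repeat=len(factors)))  # low/high level per factor
assert len(runs) == 8  # the 'full factorial' count

def response(a, b, c):
    # Toy linear model with a deliberate A*C interaction baked in.
    return 10 + 3 * a - 2 * b + 0.5 * a * c

def main_effect(i):
    # Mean response at the factor's high level minus mean at its low level.
    hi = [response(*r) for r in runs if r[i] == +1]
    lo = [response(*r) for r in runs if r[i] == -1]
    return sum(hi) / 4 - sum(lo) / 4

for i, name in enumerate(factors):
    print(name, main_effect(i))  # A 6.0, B -4.0, C 0.0
```

Note that C's main effect comes out to zero even though C matters through the A*C interaction - exactly the kind of thing a partial factorial without replication or reflection could hide entirely.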

There are some advanced mathematics to deconvolve data sets which are also sometimes employed but that's a science in and of itself.

Now you're a climate scientist.  How many variables do you want to look at per experiment?  How many conditions?  And can you actually set up a 'negative' test to determine interactions?  There's only one earth.  And you're measuring very small changes, often over large areas.  That's why freak events - like the grounding of all planes during 9/11, actually are statistically really important to climate researchers.  They actually get a chance to test a condition that is outside the normal range of experience.

Hope that helps explain what they're doing.  These techniques have been around for a while and frankly, virtually everything you interact with day to day was developed using these concepts.  They work.  It's not the math.  The challenge is sensitivity required and the lack of control over experimental conditions.

If you wanna talk about the modeling side of this sometime, let me know :)

 


I'll actually give you a real life example that sort of points on the problem with any of this kind of analysis.

About 27 years ago, I was looking at a trade journal for 'starting salaries' for BS, MS, and Ph.D. engineers.  It turned out that 'BS' chemical engineers were actually getting paid, on average, more than 'MS' chemical engineers.  In fact, BS chemical engineers were getting paid more than MOST MS engineers and every other BS engineering field, and by far more that particular year than they had ever been paid before.

Being an engineering journal, they listed the reason why: the CEO of Exxon, a BS Chem E., was counted in the data that year, and his compensation was north of 33 million dollars.

So what do you do?  It's data.  Real and accurately measured.  On average, BS Chem E's did great.  But in particular, they all did about what they had been doing the year before.

Do you change your data representation because of ONE data point that you CLEARLY know is an outlier?   That's biasing data!  Do you do a bunch of analysis that tells you the obvious - it's an outlier - and use that to justify your omission?  Do you accept the data and present it with a description as to why it's weird (what they did) and risk people just ignoring it and reporting the table without the clarification?  Do you change your representation to a median instead of a mean to account for such outliers thereby losing integrity with all of your earlier reported data sets?
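The arithmetic of that single outlier is easy to see in a toy sketch (all salary figures invented):

```python
# 99 ordinary BS Chem E. salaries plus one $33M CEO compensation package.
from statistics import mean, median

salaries = [60_000] * 99 + [33_000_000]

print(round(mean(salaries)))  # 389400 - the 'average' engineer looks rich
print(median(salaries))       # 60000 - unmoved by the single outlier
```

Switching the report from mean to median would neutralize the outlier, but, as noted above, at the cost of breaking comparability with every earlier data set published as a mean.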

One hopes that people doing such analysis are being honest and fair and trying to get to the truth.  There are examples where people have cooked the books.  But are there enough book cookers to actually change the fundamental conclusions, or is it just a CEO from Exxon whose impact will go away as his single result is diluted over time?

Good luck on the HVAC system.  There's analysis out there that shows that leaking ACs from cheap window units that will be installed in India and SE Asia over the next 20 years will release more damaging GHG than all the CO2 released up to this point.  I truly believe that efficient, effective climate control - non-HFC based - is the single most important technical achievement of the next 50 years if we're going to reduce the impact of climate change.

35 minutes ago, cmilliken said:

 

There's a lot of reasons actually.  The first is obvious - calibration.  Sensors are almost always designed to measure in regards to something else and truly steady reference conditions are always challenging, particularly when something is exposed to the environment over time (yea, sailors know that one!).

This. I had some involvement in data analysis for Australia's ocean monitoring program. Instruments come with a calibration sheet and often an algorithm.

You take the raw values, plug the calibration coefficients into the algorithm and out pops the magic number in SI units - hopefully.

Now take an instrument that has been in the water for 12 months and you send it off to be re-calibrated. The numbers are different. Which set do you use, those for when it went into the water or those for after it came out? And where do you document this & other choices?

When I designed one of our databases I kept *all* the metadata. We stored the raw data file off of the instrument, the calibration coefficients and the Java code used to calculate the engineering unit numbers - this last being what got made publicly available. This way if anyone challenged the data we could tell them the whole story and they could re-check it all for themselves. There was no 'secret squirrel' business by design. We had nothing to hide and if a mistake was found, we could re-calculate from original sources.
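As a hedged sketch of that 'keep everything' design - the quadratic calibration form and every number below are invented for illustration:

```python
# Store the raw counts plus both calibration coefficient sets; derive the
# engineering-unit record on demand, so it can always be recomputed.
raw_counts = [10123, 10240, 10310]       # hypothetical raw instrument values

pre_deploy = (-5.0, 1.50e-3, 2.0e-11)    # a0, a1, a2 from pre-deployment cal
post_recal = (-5.1, 1.51e-3, 2.0e-11)    # coefficients after 12 months in the water

def to_celsius(count, coeffs):
    # Typical polynomial calibration: T = a0 + a1*n + a2*n^2
    a0, a1, a2 = coeffs
    return a0 + a1 * count + a2 * count ** 2

# Because the raw file and both coefficient sets are preserved, anyone
# challenging the published numbers can rerun either calibration and
# see exactly how much the instrument drifted.
for c in raw_counts:
    drift = to_celsius(c, post_recal) - to_celsius(c, pre_deploy)
    print(f"{to_celsius(c, pre_deploy):.4f} C  (drift {drift:+.4f})")
```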

FKT


Certainly an even more difficult data set to quantify is the total adjustment to the public perception of the gathered data, accomplished by the investments of those who have paid skilled professionals to adjust that perception. 

It is my hypothesis that a person who wishes to contribute to the welfare of the world's population can and will have much more useful skills if he/she studies public relations, political science, statistics, and learns to excel in thespianism. 

12 hours ago, cmilliken said:

 

There's a lot of reasons actually.  The first is obvious - calibration.  Sensors are almost always designed to measure in regards to something else and truly steady reference conditions are always challenging, particularly when something is exposed to the environment over time (yea, sailors know that one!).  The smaller the change, the bigger issues become with reference conditions.  Imagine your typical bathroom scale.  It measures your weight by measuring the displacement in a spring, measured by either a strain gauge (measures deflection essentially) or an LVDT (linear displacement device).  But there's a lot of underlying assumptions - what's the spring constant of the material?  Are the gauges properly attached?  Is the LVDT secure?  Is your bathroom freaking cold or stupid hot or has someone put something super heavy on the scale and bent something?  Now imagine that instead of measuring your weight, I'm going to measure the weight of say, a fruit fly.  Everything has to be made MUCH more sensitive and now, even things like air currents are going to impact the measurement.  There's a whole host of analytical techniques used to verify data over time and that's what most of the 'corrections' are there to assess and compensate.

Here's an example from a totally unrelated field - long range rifle shooting (https://www.youtube.com/watch?v=jX7dcl_ERNs).  It details how the Coriolis effects - yes, the rotation of the earth - creates a measurable change in bullet accuracy at 1000 yards depending if you're shooting due east or due west.  If you want to measure accurately, you may have to account for such things, depending on what you're trying to measure.

So 'adjusted' in the positive context means that the researchers have collected the data, analyzed it against references, checked it for internal consistency, and presented their best understanding of what the data represents.  The better researchers will detail the raw data, the corrections made, and the final results.  In fact, a lot of Dept. of Energy contracts now REQUIRE the raw data be preserved for analysis later.

The biggest single problem with climate science is a lack of experimental replication and reflection as a means of removing interference.  Here's a few basic concepts from Design of Experiments.  Say you have three variables, A, B, and C, and you want to know what their values are.  To completely define them, you actually need to run 8 experiments - called a 'full factorial' test.  You need A, B, C, and then the interactions AB, AC, BC, and then ABC, plus at least one for comparison to give some sense of the spread of the data.  AB, AC, BC, and ABC are cases where your coefficients are interrelated, which you may or may not know in advance.  Now imagine that you have 5 variables.  You can learn something about those 5 variables by running 8 tests also - called a partial factorial or sometimes a 'Plackett-Burman analysis' - but you give up some information.  For example, you may have an interaction between A and C but wouldn't be able to tell because experimentally it would be superimposed by some other interaction.  If the interactions are within the noise range of the data you might not care, but what if they're not?  The way you gain such knowledge is through 'replication' and 'reflection'.  Replication just means you run the test over, and this tells you if the values you are calculating are statistically significant or if they are within the normal spread of the data.  Reflection basically means you run the opposite set of conditions.  This essentially makes interactions cancel out and tells you if you have interactions or not.  By the way, this all assumes linear functions.  If you don't have linear functions, then you need to run intermediate points to determine curvature.

There are some advanced mathematics to deconvolve data sets which are also sometimes employed but that's a science in and of itself.

Now you're a climate scientist.  How many variables do you want to look at per experiment?  How many conditions?  And can you actually set up a 'negative' test to determine interactions?  There's only one earth.  And you're measuring very small changes, often over large areas.  That's why freak events - like the grounding of all planes during 9/11, actually are statistically really important to climate researchers.  They actually get a chance to test a condition that is outside the normal range of experience.

Hope that helps explain what they're doing.  These techniques have been around for a while and frankly, virtually everything you interact with day to day was developed using these concepts.  They work.  It's not the math.  The challenge is sensitivity required and the lack of control over experimental conditions.

If you wanna talk about the modeling side of this sometime, let me know :)

 

Thanks, that nicely explains the scientific need for the "adjustments" but I don't think it explains what they are doing.

Adjustments based on the considerations you describe would be dispersed on either side of the raw data, and that pattern of adjustment would remain constant over time.  In the case of global temperature, data from 1940 and earlier were overwhelmingly adjusted to the down side. From 1940 to about 1975 the adjustments to the raw were, as we should expect, equally dispersed on either side of the raw data. From 1975 to the present the adjustments were virtually exclusively to the up side. The effect, of course, is a steeper warming trend line in the adjusted data than in the raw data.

I doubt you or anyone here can provide a scientific explanation for this pattern of adjustment. The explanation is political.

17 minutes ago, Dog said:

Thanks, that nicely explains the scientific need for the "adjustments" but I don't think it explains what they are doing.

Adjustments based on the considerations you describe would be dispersed on either side of the raw data, and that pattern of adjustment would remain constant over time.  In the case of global temperature, data from 1940 and earlier were overwhelmingly adjusted to the down side. From 1940 to about 1975 the adjustments to the raw were, as we should expect, equally dispersed on either side of the raw data. From 1975 to the present the adjustments were virtually exclusively to the up side. The effect, of course, is a steeper warming trend line in the adjusted data than in the raw data.

I doubt you or anyone here can provide a scientific explanation for this pattern of adjustment. The explanation is political.

 

In the vernacular, if the data follows a normal distribution (i.e., the bell curve), then corrections would nominally fall on either side of the mean.  In fact, the general rule of statistical process control is that any run of 8 or more consecutive values on one side of the mean is considered to be statistically significant and suggests that something has changed since the mean was first calculated.

It doesn't mean it's 'wrong' - it means the data isn't following a normal distribution.

In truth, I haven't looked at the data set from the global temperature data, but assuming your observation is true, then something has changed in the way the data is analyzed since the time the original methodologies were constructed.  An example from my 'Chem E. Salary' post above might be shifting from a mean value to a median value.  In a large data set, that kind of change usually doesn't matter, but if you're looking at small changes where there are serious outliers, then it can create a shift in the apparent result.

I had this discussion yesterday about the price of natural gas.  For 320 or so days a year, the price of natural gas on any given day is below the mean price for the year which makes for some funny data.  There can be 100 days in a row where all the prices fall below the mean line which would be 'statistically significant' - except that the mean is calculated including the 30 days a year where the spot price of natural gas can skyrocket.  This kind of skewing of data can happen wherever you butt up against a boundary for some reason.  Test scores where the class average is like 95 out of a 100 often show a severe skew.  
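The gas-price example can be sketched numerically - prices below are invented (330 cheap days, 30 spike days):

```python
# A skewed distribution routinely violates the 'eight in a row' run rule:
# the annual mean sits well above the everyday price, so very long runs
# below the mean say nothing about a process change.
from statistics import mean, median

prices = [3.0] * 330 + [15.0] * 30      # $/MMBtu, hypothetical spot prices

m = mean(prices)                        # pulled up by the 30 spike days

def longest_run_below(series, center):
    longest = run = 0
    for x in series:
        run = run + 1 if x < center else 0
        longest = max(longest, run)
    return longest

print(m, median(prices))                # 4.0 3.0 - mean vs. the typical day
print(longest_run_below(prices, m))     # 330 - far past the run-of-8 threshold
```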

I've said this earlier in this thread and I believe it to be true today.  The problem with climate science isn't the science - the problem is about reparations.

That's it.  All this talk about science, models, methods, etc. are a smokescreen to avoid the hard conversation.  On one side, you have plaintiffs that want a down payment from the US of somewhere around 4 trillion dollars for the 'damage we have done' to the world. They want money.  Hard cash.  Moolah.   On the other side, you have the defendants that claim the overall benefit of modern technology vastly outweighs the damage done and the world should be happy with vaccines, TV, cell phones, AI, and all the other advancements created by science which wouldn't have been possible without a stable, reliable, energy infrastructure.

https://motherboard.vice.com/en_us/article/bmj97q/the-us-owes-the-world-4-trillion-for-trashing-the-climate

CO2 isn't going to cook the world.  CH4 leaking might.  And a massive release of HFCs absolutely will.

 


Cape Town is down to 90 days of water (though there will be guarded collection points where residents can get their 25 liters, and tanker trucks will still be available for the wealthy).   https://apple.news/AEYtRQKGMRPCc_oTIrB1KeQ

 

1 hour ago, cmilliken said:

 

In the vernacular, if the data follows a normal distribution (i.e., the bell curve), then corrections would nominally fall on either side of the mean.  In fact, the general rule of statistical process control is that any run of 8 or more consecutive values on one side of the mean is considered to be statistically significant and suggests that something has changed since the mean was first calculated.

It doesn't mean it's 'wrong' - it means the data isn't following a normal distribution.

In truth, I haven't looked at the data set from the global temperature data, but assuming your observation is true, then something has changed in the way the data is analyzed since the time the original methodologies were constructed.  An example from my 'Chem E. Salary' post above might be shifting from a mean value to a median value.  In a large data set, that kind of change usually doesn't matter, but if you're looking at small changes where there are serious outliers, then it can create a shift in the apparent result.

I had this discussion yesterday about the price of natural gas.  For 320 or so days a year, the price of natural gas on any given day is below the mean price for the year which makes for some funny data.  There can be 100 days in a row where all the prices fall below the mean line which would be 'statistically significant' - except that the mean is calculated including the 30 days a year where the spot price of natural gas can skyrocket.  This kind of skewing of data can happen wherever you butt up against a boundary for some reason.  Test scores where the class average is like 95 out of a 100 often show a severe skew.  

I've said this earlier in this thread and I believe it to be true today.  The problem with climate science isn't the science - the problem is about reparations.

That's it.  All this talk about science, models, methods, etc. are a smokescreen to avoid the hard conversation.  On one side, you have plaintiffs that want a down payment from the US of somewhere around 4 trillion dollars for the 'damage we have done' to the world. They want money.  Hard cash.  Moolah.   On the other side, you have the defendants that claim the overall benefit of modern technology vastly outweighs the damage done and the world should be happy with vaccines, TV, cell phones, AI, and all the other advancements created by science which wouldn't have been possible without a stable, reliable, energy infrastructure.

https://motherboard.vice.com/en_us/article/bmj97q/the-us-owes-the-world-4-trillion-for-trashing-the-climate

CO2 isn't going to cook the world.  CH4 leaking might.  And a massive release of HFCs absolutely will.

 

And the reason the distribution isn't normal is that there is an interfering variable.  Dog is pointing out that politics may be the interfering variable.  No?

13 minutes ago, Mckarma said:

And the reason the distribution isn't normal is that there is an interfering variable.  Dog is pointing out that politics may be the interfering variable.  No?

It's possible.  I haven't looked into the particular data sets to which he is referring so I can't comment on the veracity of his claim.

As I said, I think most of the 'dig into the science' questions are really just a smokescreen to avoid the hard questions.

Society for the last half of the 20th century was based upon leveraged energy and an infinite growth model.  Our leveraged energy model produced pollution (not just the CO2 kind - in fact, as I've said, I think that's one of the lesser issues in truth) that has been ignored to the point where it's now piling up.  Furthermore and more problematic, we've run out of runway on the infinite growth model.  What do we owe to those folks who were told to 'wait their turn' now that the ride has basically slowed down?  Nothing?  Something? 

 


Isn't a worsening condition its own interfering variable?   If the boat is settling faster than predicted you don't disbelieve the theory of the hole or the dipstick.  You worry why, every time you put the flashlight on the dipstick, the water was higher than it felt on your fingers.

 If the dipstick proves to be wicking water you celebrate and keep pumping.  That's safer than taking a nap because the data is confusing.  You can stick your hand in the bilge (Houston's having a couple millennia of flooding in five years) and realize it's not your holding tank.  

The evangelical solution of praying for Coast Guard Jesus goes against the conservative self reliance creed.   The other Republican solution of ignoring the water and having all hands trim sails so we don’t risk race (economic) performance assumes we don’t need the boat. 

13 minutes ago, Lark said:

Isn't a worsening condition its own interfering variable?   If the boat is settling faster than predicted you don't disbelieve the theory of the hole or the dipstick.  You worry why, every time you put the flashlight on the dipstick, the water was higher than it felt on your fingers.

 If the dipstick proves to be wicking water you celebrate and keep pumping.  That's safer than taking a nap because the data is confusing.  You can stick your hand in the bilge (Houston's having a couple millennia of flooding in five years) and realize it's not your holding tank.  

The evangelical solution of praying for Coast Guard Jesus goes against the conservative self reliance creed.   The other Republican solution of ignoring the water and having all hands trim sails so we don’t risk race (economic) performance assumes we don’t need the boat. 

What is the specific action that you recommend?

12 minutes ago, jzk said:

What is the specific action that you recommend?

  • Quit deleting carbon info from US gov websites.
  • Gradually increase the gas tax to stop the increase in vehicle size.
    • That will help protect the economy from the next gas bump.  I've lived through a couple and the pattern is obvious.
  • Stop all subsidies of fossil fuels, including protecting tankers in the Gulf with taxpayer-funded navies.
  • Spend tax dollars protecting population centers from flooding, instead of FEMA bailouts.
    • Planned development, not more Houstons.
  • Stop taxpayer-subsidized flood insurance for those that live in harm's way.
    • This distorts the market and relies on government.
  • Capture methane from landfills and compost.
  • As a cultural shift, stop denying an inconvenient reality in the interest of short-term profits.
    • This requires eliminating $peech by the fossil fuel industry.  Of course solar should face a similar gag.
  • Stop subsidizing soy diesel.  This helps German industry anyway.
  • Review and curtail corn ethanol mandates, except as smog control.
2 minutes ago, Lark said:
  • Quit deleting carbon info from US gov websites.
  • Gradually increase the gas tax to stop the increase in vehicle size.
    • That will help protect the economy from the next gas bump.  I've lived through a couple and the pattern is obvious.
  • Stop all subsidies of fossil fuels, including protecting tankers in the Gulf with taxpayer-funded navies.
  • Spend tax dollars protecting population centers from flooding, instead of FEMA bailouts.
    • Planned development, not more Houstons.
  • Stop taxpayer-subsidized flood insurance for those that live in harm's way.
  • Capture methane from landfills and compost.
  • As a cultural shift, stop denying an inconvenient reality in the interest of short-term profits.
    • This requires eliminating $peech by the fossil fuel industry.  Of course solar should face a similar gag.
  • Stop subsidizing soy diesel.  This helps German industry anyway.
  • Review and curtail corn ethanol mandates, except as smog control.

If we do that today, what will be the difference in our climate in 50 years?

Just now, jzk said:

If we do that today, what will be the difference in our climate in 50 years?

Hotter, but maybe less extreme.    Less disruption for future generations as climate fluctuations continue until FEMA is too expensive and we write off a few coastal cities.  

 The current philosophy of extracting all oil immediately clearly shows disdain for our progeny, even if climate change were a giant hoax by university scientists, detected only by the diligence of Exxon executives.

1 minute ago, Lark said:

Hotter, but maybe less extreme.    Less disruption for future generations as climate fluctuations continue until FEMA is too expensive and we write off a few coastal cities.  

 The current philosophy of extract all oil immediately clearly shows disdain for our progeny even if climate change was a giant hoax by university scientists, detecte