|
Post by marchesarosa on Feb 11, 2011 10:00:42 GMT 1
The number of perceived extreme weather events is directly proportional to the development of the news media, compounded by the presence of video cameras and mobile phones everywhere. There is instant footage – vital for today’s ‘news’ – of every incident in every corner of the globe. Images of these extremes are now piped into virtually every home on a twenty-four-hour basis.
The idea prevalent amongst climate alarmists of ever more frequent extreme weather events is therefore an artefact of media coverage, exacerbated by the credulity of gullible western audiences.
The impact of extreme weather is a function of population growth and growing real estate values.
Discuss.
|
|
|
Post by carnyx on Feb 11, 2011 10:22:31 GMT 1
Would it make a nice formula?
, and then it could be used to produce a time-history graph between, say, 1500 and 2100, with the projection from 2011 to 2100 as an extrapolation.
I wonder what shape it would be ...
Quick! I feel a grant coming on!
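For what it's worth, the half-joking formula could even be sketched in a few lines of Python. Everything below is made up purely to draw the curve – the logistic "media reach" and "camera ubiquity" terms, their midpoints and steepness values are all invented stand-ins, fitted to nothing:

```python
import math

# Tongue-in-cheek sketch: perceived extreme events as a function of media
# coverage and camera ubiquity, per the opening post. All functional forms
# and parameters are invented for illustration only.

def media_reach(year, midpoint=1990.0, steepness=0.05):
    """Logistic curve standing in for the growth of 24-hour news coverage."""
    return 1.0 / (1.0 + math.exp(-steepness * (year - midpoint)))

def camera_ubiquity(year, midpoint=2005.0, steepness=0.15):
    """Logistic curve standing in for video cameras and mobile phones."""
    return 1.0 / (1.0 + math.exp(-steepness * (year - midpoint)))

def perceived_extremes(year, actual_rate=1.0):
    """Perceived events = actual rate x media reach, compounded by cameras.

    The actual rate is held flat at 1.0, i.e. the weather itself never
    changes -- only the reporting of it does.
    """
    return actual_rate * media_reach(year) * (1.0 + camera_ubiquity(year))

if __name__ == "__main__":
    # Time history 1500-2100, with everything past 2011 pure extrapolation.
    for year in (1500, 1900, 1950, 2000, 2011, 2050, 2100):
        print(year, round(perceived_extremes(year), 4))
```

Run it and the shape answers the question: essentially flat for four centuries, then an S-curve take-off in the late twentieth century, saturating toward 2100 – grant application practically writes itself.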
|
|
|
Post by marchesarosa on Feb 11, 2011 10:44:35 GMT 1
|
|
|
Post by marchesarosa on Feb 11, 2011 13:49:35 GMT 1
Al Gored at WUWT agrees. He says: February 10, 2011 at 7:10 pm
"I also believe that the recent shift to covering the daily weather porn is no accident. It has, after all, worked to manufacture the consent that "gee, the weather sure is weird" – when it is not."
--------
"weather porn" - that's a good 'un!
|
|
|
Post by helen on Feb 11, 2011 14:13:03 GMT 1
|
|
|
Post by marchesarosa on Feb 11, 2011 14:35:24 GMT 1
Quarterly Journal of the Royal Meteorological Society © 2011 Royal Meteorological Society
January 2011 Part A
Volume 137, Issue 654 Pages 1–27
The Twentieth Century Reanalysis Project G. P. Compo et al (NB the "et al" includes Phil Jones amongst the also rans)
Abstract
The Twentieth Century Reanalysis (20CR) project is an international effort to produce a comprehensive global atmospheric circulation dataset spanning the twentieth century, assimilating only surface pressure reports and using observed monthly sea-surface temperature and sea-ice distributions as boundary conditions. It is chiefly motivated by a need to provide an observational dataset with quantified uncertainties for validations of climate model simulations of the twentieth century on all time-scales, with emphasis on the statistics of daily weather. It uses an Ensemble Kalman Filter data assimilation method with background ‘first guess’ fields supplied by an ensemble of forecasts from a global numerical weather prediction model. This directly yields a global analysis every 6 hours as the most likely state of the atmosphere, and also an uncertainty estimate of that analysis.
The 20CR dataset provides the first estimates of global tropospheric variability, and of the dataset's time-varying quality, from 1871 to the present at 6-hourly temporal and 2° spatial resolutions. Intercomparisons with independent radiosonde data indicate that the reanalyses are generally of high quality. The quality in the extratropical Northern Hemisphere throughout the century is similar to that of current three-day operational NWP forecasts. Intercomparisons over the second half-century of these surface-based reanalyses with other reanalyses that also make use of upper-air and satellite data are equally encouraging.
It is anticipated that the 20CR dataset will be a valuable resource to the climate research community for both model validations and diagnostic studies. Some surprising results are already evident. For instance, the long-term trends of indices representing the North Atlantic Oscillation, the tropical Pacific Walker Circulation, and the Pacific–North American pattern are weak or non-existent over the full period of record. The long-term trends of zonally averaged precipitation minus evaporation also differ in character from those in climate model simulations of the twentieth century. Copyright © 2011 Royal Meteorological Society and Crown Copyright.
|
|
|
Post by marchesarosa on Feb 11, 2011 14:39:36 GMT 1
The bit in red means (as reported in the WSJ) that there is no evidence in the main indices of any increasing variability in the determinants of global weather over the century, so no increase in extreme weather events either.
Looks like louise will have to fall back on Munich Re's insurance claims proxy for "bad weather" since the real indicators are somewhat noticeable by their absence!
|
|
|
Post by marchesarosa on Feb 11, 2011 15:01:32 GMT 1
Just in case you have missed the implications for the putative increase in extreme weather events claimed by alarmists -
“In the climate models, the extremes get more extreme as we move into a doubled CO2 world in 100 years,” atmospheric scientist Gilbert Compo, one of the researchers on the project, tells me from his office at the University of Colorado, Boulder. “So we were surprised that none of the three major indices of climate variability that we used show a trend of increased circulation going back to 1871.”
|
|
|
Post by helen on Feb 11, 2011 17:40:56 GMT 1
Putative? Well, it's real here in Shropshire (I've lived fifty-odd years to note and record it), and in Eastern Australia, in Pakistan and Brazil and Greenland and Hudson Bay. Just because Greenland stays frozen doesn't mean the climate hasn't changed. -20°C is a lot warmer than -40°C! Do you know nothing of science, marchesarosa, other than the names of scientists? No, we know you don't, so pack it in with the science and stick to the politics.
|
|
|
Post by marchesarosa on Feb 11, 2011 18:00:26 GMT 1
Take it up with the Royal Meteorological Society, helen, which sponsored and published the findings. Consult Phil Jones, too, if you wish.
I guess it must have been peer reviewed but with that huge list of co-authors, would there have been anyone left to do it?
|
|
|
Post by marchesarosa on Feb 13, 2011 2:20:49 GMT 1
Roger Pielke Sr comments on The Twentieth Century Reanalysis Project:
"The temporal inhomogeneity of the input fields that Compo et al used [the surface pressure, sea surface temperatures and sea ice] raises questions on the robustness of their results. Nevertheless, this is an interesting study and the results so far are quite provocative, as they raise further questions on the skill of the IPCC models to replicate (i.e. hindcast) the evolution of the climate system in the last century."
|
|
|
Post by marchesarosa on Feb 20, 2011 21:59:32 GMT 1
|
|
|
Post by marchesarosa on Feb 21, 2011 12:32:36 GMT 1
Willis Eschenbach has done an excellent critique of a piece of research purporting to show that extreme rainfall events are increasing due to CO2. Even folk like me can more or less follow the stages of explanation!
Nature Unleashes a Flood … of Bad Science
wattsupwiththat.com/2011/02/20/nature-unleashes-a-flood-of-bad-science/#more-34439
Incidentally, more pal-review stuff from "Nature". Willis does the "audit" of data and method that you can bet the actual reviewers did not undertake.
|
|
|
Post by marchesarosa on Feb 27, 2011 10:48:47 GMT 1
Unscientific hype about the flooding risks from climate change will cost us all dear
The warmists have sound financial grounds for hyping the dangers of flooding posed by climate change, writes Christopher Booker. 26 Feb 2011

As the great global warming scare continues to crumble, attention focuses on all those groups that have a huge interest in keeping it alive. Governments look on it as an excuse to raise billions of pounds in taxes. Wind farm developers make fortunes from the hidden subsidies we pay through our electricity bills. A vast academic industry receives more billions for concocting the bogus science that underpins the scare. Carbon traders hope to make billions from corrupt schemes based on buying and selling the right to emit CO2. But no financial interest stands to make more from exaggerating the risks of climate change than the re-insurance industry, which charges retail insurers for “catastrophe cover”, paid for by all of us through our premiums.

An insight into this was given by a paper published by Nature on February 17, which claimed to show for the first time how man-made climate change greatly increases the risk of flood damage. Among the eight authors of the paper are two of the most influential scientists at the heart of the UN’s Intergovernmental Panel on Climate Change, Prof Peter Stott of the UK Met Office’s Hadley Centre and Dr Myles Allen, head of Oxford’s Climate Dynamics Group. Two of their co-authors are from Risk Management Solutions (RMS), a California-based firm which is the world leader in advising the insurance industry on climate change. The study, based entirely on computer models, focused on the exceptional flooding that took place in England and Wales in the autumn of 2000.

Its conclusion – that climate change could increase the chance of flooding by up to 90 per cent – was widely publicised, without questioning, by all the usual media cheerleaders for global warming, led by the BBC’s Richard Black (“Climate change increases flood risk, researchers say”). When less partisan observers examined the paper, however, they were astonished. Although Nature has long been a leading propagandist for man-made climate change, this example seemed truly bizarre. Why had this strangely opaque study been based solely on the results of a series of computer models – mainly provided by the Hadley Centre and RMS – and not on any historical data about rainfall and river flows?

The Met Office’s own records show no upward trend in UK rainfall between 1961 and 2004. Certainly autumn 2000 showed an unusual rainfall maximum, but it was exceeded in 1930. The graph between then and 2010 shows no significant upward trend. While 2000 may have seen a lot of rain, 1768 and 1872 were even wetter. In the real world, the data show no evidence of an increase in UK rainfall at all. Any idea that there is one seemed to be entirely an artefact of the computer models.

On Friday came the fullest and most expert dissection of the Nature paper so far, published on WUWT by Willis Eschenbach, a very experienced computer modeller: wattsupwiththat.com/2011/02/25/an-open-letter-to-bruce-alberts-of-science-magazine/ His findings are devastating.

After detailed analysis of the study’s multiple flaws, he sums up by accusing Nature of “trying to pass off the end-result of a long daisy-chain of specifically selected, untested, unverified, un-investigated computer models as valid, falsifiable, peer-reviewed science”.

His conclusion is worth quoting at some length: “When your results represent the output of four computer models, fed into a fifth computer model, whose output goes to a sixth computer model, which is calibrated against a seventh computer model, and then your results are compared to a series of different results from the fifth computer model, but run with different parameters, in order to show that flood risks have increased from greenhouse gases…” you cannot pretend that this is “a valid representation of reality”, let alone “a sufficiently accurate representation of reality to guide our future actions”.

This is precisely why the Nature study is of such significance – because it will undoubtedly be used to guide future actions, which will in one way or another impact on all our lives. For a start, consider the players in this drama. Prof Stott and Dr Allen have long been among the most influential scientists in the world in stoking up climate alarmism. A famous analysis by John McClean showed that they played a key part in compiling the single most important chapter in the IPCC’s last report, in 2007. The chapter, entitled “Understanding and attributing climate change”, cited many more papers by them than anyone else. They have now been appointed as lead authors of the relevant chapter in the NEXT IPCC report, “Detection and attribution of climate change”, which will guide the actions of governments all over the world. As for their two colleagues from Risk Management Solutions, this is not the first time that this leading adviser to the world’s re-insurance industry has been involved in a controversial bid to heighten alarm over the consequences of climate change.

In October 2005, in the wake of the Hurricane Katrina disaster, RMS held a meeting in Bermuda with four hurricane specialists, all of the alarmist persuasion, to quiz them as to how they thought hurricane activity was likely to be affected between 2006 and 2010, thanks to climate change, and how this would impact on the southern United States, notably Florida. On the basis of this meeting, RMS advised the re-insurers that the risk of hurricane damage over the next four years was hugely increased. The companies found that their reserves were $82 billion short of what they might be expected to pay. Premiums, particularly in Florida, accordingly rocketed upwards.

Under the heading “The $82 billion prediction”, the details of this episode are chronicled on his blog by Dr Roger Pielke Jr, who in 2008 advised RMS that the methodology on which it relied was so biased that “a group of monkeys would have arrived at the exact same results”. Dr Pielke, an expert in environmental impacts, recently published a chart showing how, although the RMS prediction for hurricane damage between 2006 and 2010 was a third higher than the historical average, the actual cost proved to be well under half the average figure. But, thanks to RMS, the insurance industry had made billions from higher premiums.

In 2008, following the disastrous floods of summer 2007, that vociferous climate alarmist Bob Ward, now at the Grantham Institute but then a director of RMS, called for the British government to work more closely with the insurance industry “to devise mutually beneficial strategies for dealing with flood risks”. We understand how working with RMS might be beneficial to the insurance industry. But whether, in light of the Nature study, the Government would find it beneficial is another matter – never mind the rest of us, as we are asked to pay ever higher insurance premiums, based not least on the findings of those RMS computer models.
www.telegraph.co.uk/comment/columnists/christopherbooker/8349545/Unscientific-hype-about-the-flooding-risks-from-climate-change-will-cost-us-all-dear.html
|
|
|
Post by marchesarosa on Mar 2, 2011 12:35:14 GMT 1
La Nina expected to cause more extreme US tornado effects

Joe D'Aleo CCM, AMS Fellow, says: "Tornado season kicks off in February most years, and yesterday’s storm had tornadoes, and other severe weather and with heavy rains after a snowy winter, major flooding. The tornado seasons tend to be more severe in La Ninas, with larger outbreaks and stronger tornadoes. Tornadoes tend to focus in the central and southern plains and the Gulf Coast during El Nino years, with a shift to the Midwest, the Ohio and Tennessee Valleys, and the mid-Atlantic region during La Nina years. Bove (1999) showed El Ninos tend to produce more tornadoes in the southern plains, while La Nina seasons are more active in the Ohio Valley and the south. The strengths of these cycles also seem to be a factor in this data as well.

"Tornadoes during a La Nina are stronger and remain on the ground longer than those observed during an El Nino. That means an increased danger of large, destructive and deadly tornadoes during the cold phase. There is also an increased risk of 'tornado swarms' or outbreaks of 40 or more twisters from a single weather system in a La Nina season. We believe a recent climate shift favoring a cooler Pacific and more frequent La Nina events suggests we have entered a period of increasing severe storms that could last a decade or more. We saw a burst of activity in the La Nina year of 2008, with 1692 tornadoes. In 2010, an El Nino year, by comparison, 1277 tornadoes occurred. Bove found also that most large outbreaks and major tornadoes occur in cold (La Nina) or neutral (La Nada) years. He refers to the analyses by Grazulis in 1991."

wattsupwiththat.com/2011/03/01/big-time-la-nina-tornado-and-spring-flood-season-possible/#more-35082
|
|