The subscription at physicsweb.org is free. Sidney Redner of Boston University and Mark Petersen of Los Alamos studied the question of how frequently one should expect record-breaking weather events to occur:
- Redner & Petersen, Phys. Rev. E 74 (2006); the original 2005 analysis covered Philadelphia only
However, if you have a long-term trend - a specific kind of autocorrelation - the decrease in the rate of record weather events implied by the inverse proportionality law (for independent, identically distributed data, the T-th year sets a record with probability exactly 1/T, because each of the first T years is equally likely to be the largest) will eventually stop. Instead, the rate of new records will approach a constant. If you have a wiggly curve that can nevertheless be approximated by an increasing linear function in the long run, the values from the distant past clearly have a low probability of being records. Only a smaller, fixed number of recent years has a significant chance to compete as potential record-makers, which leads to an asymptotically time-independent probability of a new record.
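To make the inverse law tangible, here is a minimal Monte Carlo sketch (my own illustration, not code from the paper) that checks the 1/T rate for independent data:

```python
import numpy as np

# Minimal sketch: for i.i.d. data, the T-th year sets a record with
# probability exactly 1/T, because each of the first T values is
# equally likely to be the largest one.
rng = np.random.default_rng(0)
n_years, n_trials = 200, 20_000

series = rng.normal(size=(n_trials, n_years))      # i.i.d. "annual temperatures"
running_max = np.maximum.accumulate(series, axis=1)
is_record = series == running_max                  # True where year T beats all earlier years
rate = is_record.mean(axis=0)                      # empirical record rate at each year

for T in (10, 50, 200):
    print(f"year {T:3d}: simulated rate {rate[T - 1]:.4f}  vs  1/T = {1 / T:.4f}")
```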
On the other hand, the probability of a new record low temperature will go to zero in the presence of a warming trend - faster than 1/T.
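The same toy simulation with an assumed linear trend added on top of the noise (the trend value is purely illustrative, not fitted to any real city) shows both effects at once: the rate of record highs flattens out near a constant while record lows disappear much faster than 1/T:

```python
import numpy as np

# Same sketch with an assumed linear warming trend added to i.i.d. noise.
# Record highs should settle near a constant rate; record lows should die
# out much faster than the 1/T law.
rng = np.random.default_rng(1)
n_years, n_trials = 400, 20_000
v = 0.02                                   # assumed trend, in noise-std units per year

series = rng.normal(size=(n_trials, n_years)) + v * np.arange(n_years)
high_rate = (series == np.maximum.accumulate(series, axis=1)).mean(axis=0)
low_rate = (series == np.minimum.accumulate(series, axis=1)).mean(axis=0)

for T in (50, 200, 400):
    print(f"T={T:3d}: highs {high_rate[T - 1]:.4f}  lows {low_rate[T - 1]:.5f}  1/T = {1 / T:.4f}")
```

With the chosen drift, the record-high rate should barely move between T = 200 and T = 400 even though 1/T halves over the same span.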
Redner and Petersen looked at the data from Philadelphia - 126 years of them - and found no statistically significant deviations from the inverse proportionality law, i.e., no evidence that the rate of new records is approaching a constant. As far as the frequency of measured record weather events in Philadelphia goes, there is no climate change.
More recently, their best additional idea was to look at a city with a longer history and tradition than Philadelphia's, namely Prague, which offers records from the last 231 years. And in this case they did find a signal. Their (so far unpublished) analysis led them to the conclusion that it takes about 130 years for the rate of new records to approach a constant. The precise value of 130 years sounds a bit vague to me because the crossover from the inverse proportionality law to the constant regime is surely not sharp and discontinuous.
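Indeed, the smoothness of the crossover follows from a simple dimensional argument: records effectively compete only among the last roughly sigma/v years, where sigma is the year-to-year noise and v is the linear trend, so the transition is smeared over a timescale of order sigma/v. A back-of-envelope sketch with purely illustrative numbers (my assumptions, not the fitted values from their analysis):

```python
# Back-of-envelope crossover estimate: records effectively compete only among
# the last ~ sigma/v years, whose trend displacement still fits within the
# noise, so T* ~ sigma/v. The numbers below are illustrative assumptions,
# not Redner & Petersen's fitted values.
sigma = 3.5   # assumed year-to-year temperature fluctuation, deg C
v = 0.03      # assumed linear warming trend, deg C per year
print(f"T* ~ sigma/v = {sigma / v:.0f} years")   # ~117 years, the same ballpark as ~130
```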
Nevertheless, the lesson to take away is that, given trends of the existing size, one needs more than a century of data to distinguish noise (weather) from a trend (climate change) in the frequency of record weather events. Equivalently, if someone worries about "climate change" because it could increase the frequency of extreme events, this analysis shows that it can only do so measurably after more than a century. If someone predicts a huge number of extreme weather events in 13 years, he exceeds the crackpot threshold by an order of magnitude.
Moreover, in 130 years we may be able to measure a difference; a significant impact on our lives is a completely different story.
Their conclusion also allows us to disqualify all comparisons of record temperatures based on a timescale of 100 years or shorter as noise or hysteria. Whoever uses these weather events to prove a long-term trend is a charlatan. The same conclusion, however, also implies that the same warming trend in Prague already existed before 1900: the constant regime could not have been reached if the trend had not been operating for more than 130 years.
So it is unlikely that the bulk of the warming observed in the Prague data can be explained by the influence of global industry, which was negligible 100 years ago in comparison with its current size. Instead, the 19th century data from the Czech lands played an important role in revealing the underlying trend, which means that, according to their analysis, the bulk of the post-Little-Ice-Age warming should be of natural origin.