TY - JOUR
T1 - Spare change
T2 - Evaluating revised forecasts
AU - Fowler, Tressa L.
AU - Brown, Barbara G.
AU - Gotway, John Halley
AU - Kucera, Paul
N1 - Publisher Copyright:
© 2015, India Meteorological Department. All rights reserved.
PY - 2015
Y1 - 2015
N2 - The question of the consistency of revised forecasts through time often comes up in the context of weather events such as hurricanes or high-wind days. For a single event, forecasts are made and then revised as the time of the event nears. Ideally, each revision reflects new and better information that yields a better forecast. Nonetheless, if forecasts change frequently or by large amounts, a user may believe they are poor or uncertain. Over time, a user may lose trust in forecasts that are not consistent. This is particularly an issue for decision makers, such as emergency managers, who create plans based on early forecasts and then must change those plans repeatedly as new forecasts arrive. Thus, for forecasts that are revised, the consistency of the revisions is an important aspect of forecast quality. Unfortunately, though everyone knows forecast consistency when they see it, the use of objective measures to evaluate consistency in forecast verification is very limited. A similar problem exists in economic forecasting, where some simple tests are applied to a single time series to measure consistency. However, these measures do not extend easily to weather forecasts, which may be multi-dimensional or a collection of many time series. For example, there is no simple way to measure the back-and-forth (or 'windshield wiper') effect of changing hurricane track forecasts. In this paper, some consistency measures of forecast revision time series are discussed. Extensions of these measures to more complex forecasts are examined using preliminary examples from hurricane forecast tracks and accumulations of multiple time series. Particular attention is paid to comparisons of consistency measures between competing forecasts.
AB - The question of the consistency of revised forecasts through time often comes up in the context of weather events such as hurricanes or high-wind days. For a single event, forecasts are made and then revised as the time of the event nears. Ideally, each revision reflects new and better information that yields a better forecast. Nonetheless, if forecasts change frequently or by large amounts, a user may believe they are poor or uncertain. Over time, a user may lose trust in forecasts that are not consistent. This is particularly an issue for decision makers, such as emergency managers, who create plans based on early forecasts and then must change those plans repeatedly as new forecasts arrive. Thus, for forecasts that are revised, the consistency of the revisions is an important aspect of forecast quality. Unfortunately, though everyone knows forecast consistency when they see it, the use of objective measures to evaluate consistency in forecast verification is very limited. A similar problem exists in economic forecasting, where some simple tests are applied to a single time series to measure consistency. However, these measures do not extend easily to weather forecasts, which may be multi-dimensional or a collection of many time series. For example, there is no simple way to measure the back-and-forth (or 'windshield wiper') effect of changing hurricane track forecasts. In this paper, some consistency measures of forecast revision time series are discussed. Extensions of these measures to more complex forecasts are examined using preliminary examples from hurricane forecast tracks and accumulations of multiple time series. Particular attention is paid to comparisons of consistency measures between competing forecasts.
KW - Revised forecasts
KW - Surface station forecast
KW - TC forecast
KW - Wald-Wolfowitz test
UR - https://www.scopus.com/pages/publications/84951009543
U2 - 10.54302/mausam.v66i3.572
DO - 10.54302/mausam.v66i3.572
M3 - Article
AN - SCOPUS:84951009543
SN - 0252-9416
VL - 66
SP - 635
EP - 644
JO - Mausam
JF - Mausam
IS - 3
ER -