User-Responsive Diagnostic Forecast Evaluation Approaches: Application to Tropical Cyclone Predictions

Barbara G. Brown, Louisa B. Nance, Christopher L. Williams, Kathryn M. Newman, James L. Franklin, Edward N. Rappaport, Paul A. Kucera, Robert L. Gall

Research output: Contribution to journal › Article › peer-review


Abstract

The Hurricane Forecast Improvement Project (HFIP; renamed the “Hurricane Forecast Improvement Program” in 2017) was established by the U.S. National Oceanic and Atmospheric Administration (NOAA) in 2007 with a goal of improving tropical cyclone (TC) track and intensity predictions. A major focus of HFIP has been to increase the quality of guidance products for these parameters that are available to forecasters at the National Weather Service National Hurricane Center (NWS/NHC). One HFIP effort involved the demonstration of an operational decision process, named Stream 1.5, in which promising experimental versions of numerical weather prediction models were selected for TC forecast guidance. The selection occurred every year from 2010 to 2014 in the period preceding the hurricane season (defined as August–October), and was based on an extensive verification exercise of retrospective TC forecasts from candidate experimental models run over previous hurricane seasons. As part of this process, user-responsive verification questions were identified via discussions between NHC staff and forecast verification experts, with additional questions considered each year. A suite of statistically meaningful verification approaches consisting of traditional and innovative methods was developed to respond to these questions. Two examples of the application of the Stream 1.5 evaluations are presented, and the benefits of this approach are discussed. These benefits include the ability to provide information to forecasters and others that is relevant for their decision-making processes, via the selection of models that meet forecast quality standards and are meaningful for demonstration to forecasters in the subsequent hurricane season; clarification of user-responsive strengths and weaknesses of the selected models; and identification of paths to model improvement.
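The abstract refers to verification of TC track and intensity guidance using traditional measures. As a minimal sketch, not drawn from the paper itself, the snippet below computes two of the most common such measures: mean great-circle track error and mean absolute intensity error over matched forecast/observation pairs. All function names, variable names, and values here are illustrative assumptions.

```python
# Minimal sketch (illustrative only): traditional TC verification metrics.
# Track error = great-circle distance (n mi) between forecast and observed
# storm centers; intensity error = |forecast - observed| maximum wind (kt).
import math

EARTH_RADIUS_NMI = 3440.065  # mean Earth radius in nautical miles


def track_error_nmi(lat_f, lon_f, lat_o, lon_o):
    """Haversine great-circle distance between forecast and observed centers."""
    phi_f, phi_o = math.radians(lat_f), math.radians(lat_o)
    dphi = math.radians(lat_o - lat_f)
    dlam = math.radians(lon_o - lon_f)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi_f) * math.cos(phi_o) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_NMI * math.asin(math.sqrt(a))


def mean_errors(forecasts, observations):
    """Mean track and mean absolute intensity error over matched pairs.

    Each element is a tuple: (latitude, longitude, max wind in kt).
    """
    track = [track_error_nmi(f[0], f[1], o[0], o[1])
             for f, o in zip(forecasts, observations)]
    intensity = [abs(f[2] - o[2]) for f, o in zip(forecasts, observations)]
    n = len(track)
    return sum(track) / n, sum(intensity) / n


# Hypothetical 48-h forecast/observation pairs for illustration
fcst = [(25.1, -78.2, 95), (26.0, -79.5, 100)]
obs = [(25.4, -77.9, 90), (26.3, -79.1, 105)]
print(mean_errors(fcst, obs))
```

In a Stream 1.5-style evaluation these per-case errors would be aggregated over many retrospective forecasts and stratified (e.g., by lead time) before comparing candidate models; the sketch shows only the basic error computations.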

Original language: English
Pages (from-to): 2321-2342
Number of pages: 22
Journal: Weather and Forecasting
Volume: 38
Issue number: 11
DOIs
State: Published - Nov 2023

Keywords

  • Decision support
  • Forecast verification/skill
  • Model evaluation/performance
  • Numerical weather prediction/forecasting
  • Operational forecasting

