Numerical Weather Prediction at Air Force Weather Agency



  • Mark T. Surmeier, Deputy Director

  • Air and Space Science

  • Air Force Weather Agency

  • Offutt AFB, NE


Overview





  • First Operational Use of NWP by USAF

    • History, 4th Weather Group, Jan-Dec 1954
    • Introduced the Numerical Predictions Project
    • Conceived at Air Force Cambridge Research Center (R&D)
    • Programmed for operational establishment (USAF Weather Central) on 1 July 1954 at Andrews AF Base as a joint WBAN (Weather Bureau-Air Force-Navy) project
    • Regional baroclinic model, 300km grid, US domain, 36hr
    • IBM 701 (2048 36-bit words)


  • 1955: USAF Weather Central (NWP focus) Moved to Suitland, MD

  • 1957: USAF Weather Central Moved to Offutt AFB, NE to Combine with the Global Weather Central

  • 1958: First Automated “contrail” Forecasts

  • 1960: Global Weather Central Purchased its First Computer: an IBM 7090

  • 1961: Began Computer Wind Factor Forecasts; Added IBM 1401



  • 1962: Computerized Stratospheric Analyses and Numerical Cloud Forecasts; Added IT&T Automatic Data Exchange 6400

  • 1963: First Automated Facsimile Charts; First Receipt of METSAT Data; Upgraded the IBM 7090 to an IBM 7094

  • 1964: Implemented Quasi-geostrophic Prediction Model (SIXLVL)

    • 381km grid; NH; 72hr forecast length
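
A quasi-geostrophic model of this era integrates the conservation of quasi-geostrophic potential vorticity following the geostrophic wind. A standard textbook form in pressure coordinates (a sketch of the model class; the source does not give AFGWC's exact formulation):

\[
\left(\frac{\partial}{\partial t} + \mathbf{v}_g \cdot \nabla\right)
\left[\nabla^2\psi + f + \frac{\partial}{\partial p}\!\left(\frac{f_0^2}{\sigma}\,\frac{\partial\psi}{\partial p}\right)\right] = 0
\]

where ψ is the geostrophic streamfunction, v_g the geostrophic wind, f the Coriolis parameter (f_0 its reference value), and σ the static stability; the name "SIXLVL" reflects the model's six vertical levels.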


  • 1965: DoD Established the Automated Weather Network—Worldwide High-Speed Data Collection; Added Terminal to Process METSAT Data from DOC Satellites

  • 1966: Transmitted NWP-Based Products for Asian and European Theaters; Added a second IBM 7094 Computer

  • 1967: Installed Four UNIVAC 1108 Computers

  • 1970: Computer Flight Plans; First Operational PBL Model (185km grid; 7 layers; regional; 24hr forecast—7LVL)



  • 1972: AWS’s Medium Range Forecast Mission Moved from Suitland, MD to AFGWC; Added Fifth Computer—UNIVAC 1110

  • 1974: AWS Primitive Equation (PE) Model (381km res., 7 layers, NH, 72hr forecast)
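
The primitive equations retain the full hydrostatic dynamics that the quasi-geostrophic system filters out. A textbook set in pressure coordinates (again a sketch of the model class, not AWS's exact discretization):

\[
\frac{d\mathbf{v}}{dt} + f\,\mathbf{k}\times\mathbf{v} = -\nabla_p\Phi,
\qquad
\frac{\partial\Phi}{\partial p} = -\frac{RT}{p},
\qquad
\nabla_p\!\cdot\mathbf{v} + \frac{\partial\omega}{\partial p} = 0,
\qquad
\frac{dT}{dt} - \frac{RT}{c_p\,p}\,\omega = \frac{J}{c_p}
\]

with v the horizontal wind, Φ the geopotential, ω the pressure vertical velocity, and J the diabatic heating rate.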

  • 1976: Upgraded a UNIVAC 1108 to 1110; Added a 1110

  • 1979: Replaced 3 UNIVAC 1108s with 2 UNIVAC 1100/81s

  • 1982: Replaced 2 UNIVAC 1100/81s with 2 Sperry 1100/82s



  • 1986: Advanced Weather Analysis and Prediction System (AWAPS)--Global Spectral Model (GSM): 493 km, 14 layers, global, 96 hr forecast; High-Resolution Analysis System (HIRAS); Cray X-MP Supercomputer
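
A spectral model represents each field as a truncated series of spherical harmonics rather than grid-point values; the quoted 493 km is the equivalent grid spacing (the source does not state the spectral truncation). Schematically:

\[
X(\lambda,\mu,t) \;\approx\; \sum_{m=-M}^{M}\,\sum_{n=|m|}^{N(m)} X_n^m(t)\,P_n^m(\mu)\,e^{im\lambda}
\]

where λ is longitude, μ = sin(latitude), P_n^m are the associated Legendre functions, and M, N(m) define the truncation (triangular truncation sets N(m) = M). Horizontal derivatives are evaluated analytically in spectral space, the main accuracy advantage over finite differences.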

  • 1987: Sperry 1100/82s Upgraded to UNISYS 1100/91s



  • 1992: Quote from one of our Tech Notes:

  • “This [the computational cost] makes increasing the grid resolution an ineffective way to increase forecaster skill.”

  • 1995: AFWA Stopped Using the GSM Global Model as Primary (part of the NAVAF Agreement--Used Navy NOGAPS Data; GSM Used as Back-up into 1997)

  • 1996: RWM (Relocatable Window Model) Run Operationally; First Visualizations; AWAPS-U (IBM SP1 Replaced the Cray X-MP)

  • 1998: Discontinued RWM



  • 1995: Advanced Concept Technology Demo Using Penn State Univ./NCAR Mesoscale Model 5 (MM5) & 14-Node IBM SP2 System

    • Bosnia window run once per day (NOGAPS initialization)
    • 25 Levels
    • 27 km Grid Resolution
    • 24 Hour Forecast
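
The source gives the domain but not the time step. As a rough cost sketch, a common MM5 rule of thumb (an assumption here, not a figure from these slides) sets the time step in seconds to about 3x the grid spacing in kilometers:

```python
# Rough cost sketch for the 1995 Bosnia window (27 km grid, 24 h forecast).
# Assumption: MM5 rule of thumb dt [s] ~= 3 * dx [km]; the operational
# time step actually used is not stated in the source.

def mm5_timestep_s(dx_km: float) -> float:
    """Time step suggested by the common MM5 rule of thumb."""
    return 3.0 * dx_km

def n_steps(forecast_hours: float, dx_km: float) -> int:
    """Number of model time steps spanning the forecast length."""
    return int(forecast_hours * 3600 // mm5_timestep_s(dx_km))

print(n_steps(24, 27.0))  # -> 1066 steps for the 24 h Bosnia run
```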


  • 1997:

    • Added Southwest Asia & CONUS Windows


  • 1999:

    • Ran 18 theaters per day; 80GB/day of model output data; over 90,000 GIF images
    • Images are made available to meteorologists in the field through AFWIN (Air Force Weather Information Network) over the Internet


  • 1999:

    • Parallelized MM5 and supporting applications--significantly reduced application run times
    • Added IBM Silver node production system (110 nodes, 4 CPUs per node, 332 MHz)
      • Provided 2.5 Gigaflops per node
      • GTWAPS (Global Theater Weather Analysis and Prediction System) evolved from a single-system, two-frame SP into a six-system, eighteen-frame SP
      • 41 levels; 72-hour forecast; 36, 12, and 4 km grid spacing
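
The 36/12/4 km figures follow MM5's usual 3:1 nest ratio. Nesting matters because refining the grid is cubically expensive: halving dx quadruples the horizontal point count and roughly halves the stable time step. A minimal illustration (costs per unit area only; real runs also vary domain size, levels, physics, and I/O):

```python
# Relative compute cost of covering the same area at finer grid spacing.
# Scaling ~ (dx_coarse/dx_fine)**3: two horizontal dimensions plus the
# shorter time step. Illustrative only.

def relative_cost(dx_coarse_km: float, dx_fine_km: float) -> float:
    return (dx_coarse_km / dx_fine_km) ** 3

for fine in (12.0, 4.0):
    print(f"36 km -> {fine:g} km: ~{relative_cost(36.0, fine):.0f}x cost per unit area")
# 36 -> 12 km: ~27x;  36 -> 4 km: ~729x
```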





  • 2000:

    • Optimized windows
      • Less overlap--faster processing; more coverage
      • Focused on current requirements (theaters, coverage, etc.)
      • Projected processing timeline changes based on the cumulative effects of the following:
        • OWS: Incremental receipt of MMLITE GriB files
        • OWS: Incremental post-processing (all parameters)
        • OWS: Incremental visualizations
        • AFWA: Domain, forecast length, and output frequency changes based on the new window plan
      • Overall impact: products available at the OWSs roughly 1½ hours earlier
      • 300,000 products per day


  • 2000:

    • Visualization Boom
      • MIKE: Interactive GrADS application (generates customizable 2-D B&W charts)
      • Added more post-processed data types
      • Combined all visualization packages into single GUI
      • Created 2-panel and 4-panel chart options
      • IMaST: Increased Skew-T vertical resolution


  • 2001

    • Implemented Mesoscale Data Assimilation System (MDAS) and Multi-Variate Optimal Interpolation (MVOI)
    • Tropical Storm Bogusing capability for Tropical Theaters (improved track and intensity forecasts for TCs worldwide; an idealized bogus-vortex profile is sketched after this list)
    • AFWA MM5 Window Configurations:
      • Eighteen 45-km theaters (14 regular + 4 Tropical; T1-T6 also run 4x daily)
      • Eleven 15-km nests
      • Two 5-km nests
      • Total of 37 windows every 12 hours
      • 80% earth coverage; 98% land coverage
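
Bogusing replaces the poorly resolved tropical cyclone circulation in the first-guess field with a synthetic vortex built from the advisory position and intensity. The source does not describe AFWA's exact scheme; a common idealized tangential-wind profile for such synthetic vortices is the (modified) Rankine vortex:

\[
v(r) =
\begin{cases}
v_{\max}\,\dfrac{r}{R_{\max}}, & r \le R_{\max},\\[4pt]
v_{\max}\left(\dfrac{R_{\max}}{r}\right)^{\alpha}, & r > R_{\max},
\end{cases}
\]

with v_max the reported maximum wind, R_max the radius of maximum wind, and α a decay exponent (α = 1 for the classical Rankine vortex).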


  • 2001

    • MM5 Model Runs Produced on Two Production Platforms, Prod 2 and Prod 3
      • 94 IBM Silver nodes and 41 WH II nodes, respectively
      • 625 Gflops


  • 2001

    • Added Capability to Initialize with NCEP's AVN/MRF (Aviation and Medium-Range Forecast global model runs)
    • Parameterizations in use for MM5
      • Cumulus: Grell for 45 and 15km; explicit for 5km
      • Planetary boundary layer (PBL): MRF
      • Explicit moisture: Mixed phase (a.k.a. Reisner I)
      • Radiation: Cloud radiation
      • Ground temperature: Five layer soil
    • Post-processing
      • Raw MM5 output used to derive over 100 forecast parameters
      • Algorithms developed by AFWA and external labs
      • AFWIN products (over 400,000 per day; 170,000 GrADS and 231,660 Vis5D)
      • TrimGriB capability (tailored gridded data sets)
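
TrimGriB itself is AFWA-internal software; the sketch below shows the same idea (trimming a gridded GRIB product to a customer's sub-window) using the open-source pygrib library. The file name, field, and box are hypothetical.

```python
# Tailoring a gridded data set: extract one field over a sub-window.
# Assumes pygrib is installed; "mm5_output.grb" is a hypothetical file.
import pygrib

grbs = pygrib.open("mm5_output.grb")
# Pull one message, e.g. temperature at 850 hPa.
grb = grbs.select(name="Temperature", level=850)[0]
# Trim to a lat/lon box (hypothetical Southwest Asia window).
data, lats, lons = grb.data(lat1=10, lat2=40, lon1=30, lon2=70)
print(data.shape, float(data.mean()))
grbs.close()
```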


  • 2001

    • Objective and Subjective Verification of MM5
    • Integration of WRF Early Release into AFWA Processing Environment
    • MM5 Integrated with Other AFWA Models
      • Land Surface Model (near-global land-surface analysis model)
      • Real-Time Cloud Analysis (RTNEPH/CDFS-II; global cloud analysis models; forecaster enhanced)
      • Advect Cloud (ADVCLD), High-Res Cloud Prog (HRCP), & C-MNS fine resolution cloud forecasts (forecaster enhanced global cloud trajectory forecast models)
      • Snow Analysis (global snow & ice areas; forecaster enhanced)
      • Surface Temperature (global temperature analysis model)


  • 2002

    • Improved Objective and Subjective Verification of MM5
    • Redesigned JAAWIN (Joint Air Force and Army Weather Information Network)
    • Operational Implementation of MM5-V3-R5.2 (improved boundary layer physics)


  • 2002

    • Added 2 and 10 meter output of T, RH, u- and v- winds
    • Reduced model biases in forecast T, Td, wind speed, wind direction, ceiling, and visibility
    • MM5-V3-R5.3 Operational Implementation (added options for “unified-LSM” and improved polar physics)
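
The bias being reduced here is the simplest objective-verification statistic: the mean forecast-minus-observed error, usually reported alongside RMSE. A minimal sketch (illustrative; AFWA's actual verification suite is not described in the source):

```python
import numpy as np

def bias(fcst: np.ndarray, obs: np.ndarray) -> float:
    """Mean error: positive means the model runs too high on average."""
    return float(np.mean(fcst - obs))

def rmse(fcst: np.ndarray, obs: np.ndarray) -> float:
    """Root-mean-square error: typical error magnitude."""
    return float(np.sqrt(np.mean((fcst - obs) ** 2)))

# Hypothetical 2 m temperature forecasts vs. station observations (K).
f = np.array([288.1, 290.4, 285.9, 293.2])
o = np.array([287.5, 291.0, 286.4, 292.0])
print(f"bias = {bias(f, o):+.2f} K, rmse = {rmse(f, o):.2f} K")
```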


  • 2003

    • Linked to Defense Research and Engineering Network (DREN)—High bandwidth to use HPC centers
    • Established Classified Modeling Capability (Operational in 2004)
    • Established Visiting Scientist Position (assimilation) at JCSDA
    • Completed Common High-performance S/W Support Initiative ($1.5M over 3 yrs—WRF and WRF 3DVAR development)





  • 2004

    • Halted MM5 Technology Enhancement Efforts for WRF
      • WRF is a Community Model Built in the Same Spirit as MM5, but is Designed for Greater Expansion (numerics, physics, and initialization)
      • WRF is Designed for Cloud-Scale Phenomena (1-10 km horiz. res. grids) that are not Explicitly Resolved and are not Currently Forecast Well
      • WRF Addresses Key Warfighter and National Security Effectiveness Issues Caused by Weather
    • Signed WRF National Concept of Operations Framework with NCEP and FNMOC


  • 2004

    • Running WRF V1.2/1.3 Retrospective Tests at NAVO MSRC and AFWA (executing community WRF Tests)
    • DoD HPCMO funded $3M Navy/AF Distributed Center (fields two platforms to conduct WRF operational tests)
      • Test multiple system configurations
      • Determine configurations that best meet DoD and service unique mesoscale NWP requirements
      • Test operationally capable mesoscale ensemble runs
      • Prototype and test Grid Computing concepts
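
The basic products of a mesoscale ensemble are the member mean (a best single estimate) and spread (a flow-dependent uncertainty measure). A minimal sketch with a hypothetical 10-member set on a small grid:

```python
import numpy as np

# Hypothetical ensemble: 10 members of a 2 m temperature field (K)
# on a 120 x 160 grid. Real members come from perturbed model runs.
rng = np.random.default_rng(0)
members = rng.normal(285.0, 1.5, size=(10, 120, 160))

ens_mean = members.mean(axis=0)            # average over members
ens_spread = members.std(axis=0, ddof=1)   # sample std dev across members
print(ens_mean.shape, float(ens_spread.mean()))
```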



WRF Visualizations--Hurricane Isabel





Summary

  • The Air Force holds a solid position in NWP history and is well poised for the future. Since the first operational use of NWP, the Air Force has emphasized, and continues to emphasize, providing the highest-quality tailored weather products and decision aids for warfighting operations.





