CAPT John Kusters, Commanding Officer; Mike Clancy, Technical Director. Fleet Numerical Meteorology & Oceanography Center Command Overview – Presented to the Committee for Operational Processing Centers.


CAPT John Kusters, Commanding Officer; Mike Clancy, Technical Director. Fleet Numerical Meteorology & Oceanography Center Command Overview – Presented to the Committee for Operational Processing Centers, 13 May 2009. Fleet Numerical… Supercomputing Excellence for Fleet Safety and Warfighter Decision Superiority…

Outline
– Introduction
– Overview of Operations
– Overview of POPS and A2
– Cross Domain Solutions
– NOGAPS Way Ahead
– MILCON Project
– Summary

Operational Excellence
The United States Navy is committed to excellence in operational meteorology and oceanography (METOC):
– Fleet Safety
– Decision Superiority
– Information Assurance
We are stakeholders and partners in National capability:
– Key part of the Nation’s ocean & weather infrastructure
– Expect to be a major player in NUOPC
Near-term goals (CNMOC):
– Tighter link between forecasts and decisions
– Predictive Oceanography
We are the Navy’s Operational Science Community.

FNMOC Mission
We produce and deliver weather, ocean, and climate information for Fleet Safety, Warfighting Effectiveness, and National Defense.
– Numerical Weather Prediction (NWP) is the core of our business: global and regional operational models; assimilation of meteorological and oceanographic data worldwide (~6 million observations daily)
– Scheduled and on-demand products: 0 to 240 hour forecasts, updated at 6-hour intervals; specific to Fleet and Joint operations
– 24x7 Operational Reachback Center supporting National, Navy, DoD/Joint, and Coalition missions
– Direct support for global submarine weather
– Data fusion for planning and operations
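As a rough illustration of the product cadence described above (four synoptic runs per day, forecast leads from 0 to 240 hours at 6-hour steps), the schedule can be sketched in a few lines. The helper names are hypothetical, not FNMOC software:

```python
# Sketch of the forecast cadence described above: a run every 6 hours,
# each producing forecast lead times ("taus") from 0 to 240 hours.
# Illustrative only -- names here are invented, not FNMOC code.

RUN_HOURS = (0, 6, 12, 18)      # synoptic run times (UTC)
TAUS = range(0, 241, 6)         # 0..240 h leads at 6-h intervals

def forecast_valid_times(run_hour):
    """Valid times (hours after 00Z) covered by one model run."""
    return [run_hour + tau for tau in TAUS]

per_run = len(list(TAUS))                   # 41 forecast times per run
daily_fields = per_run * len(RUN_HOURS)     # 164 forecast times per day
print(per_run, daily_fields)
```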

Focus on Strategy
Mission: We produce and deliver weather, ocean, and climate information for Fleet Safety, Warfighting Effectiveness, and National Defense.
Vision: To be the First Choice for environmental prediction and assessment in direct support of National Defense.
Values: People, Teamwork, Communications, Innovation, Technical Excellence.
Basic Strategy: Best-Cost Provider.

Recent Accomplishments
– Commenced FNMOC models Ops Run on Linux Cluster (operational since October 2008), and retired legacy SGI supercomputers
– Commenced operational support for the North American Ensemble Forecast System (NAEFS), with NOGAPS Ensemble fields delivered daily to NCEP
– Upgraded the NOGAPS Ensemble with Ensemble Transform (ET) initialization
– Implemented the coupled air-sea version of the GFDN tropical cyclone model
– Commenced Submarine Enroute Weather Forecast (SUBWEAX) mission at SECRET and TS/SCI levels
– Hosted and upgraded the Naval Oceanography Portal (NOP), the single unclassified and classified operational Web presence for all CNMOC activities
– Delivered the first spiral of the Automated Optimum Track Ship Routing System (AOTSR)
– Completed the A-76 source selection process for IT Services, resulting in selection of the government’s Most Efficient Organization (MEO) bid as the winning proposal; implemented the IT Services Department (ITSD) on February 1, 2009

Operational Watch Presence
– CNMOC Oceanographic Operations Watch (COOW): global and 24x7; administrative reporting; coordinates and provides support for all time-sensitive METOC RFIs
– Fleet Ops Reachback: Oceanography Portal; Fleet products (AREPS, TAWS, staff briefs, exercise support); tailored support (STRATCOM, NOPF Whidbey, NSW, MSC); supporting all forward-deployed METOC assets
– ISR Reachback: TAM; JFCC ISR
– Submarine Weather Direct Support

Models
Fleet Numerical operates a highly integrated and cohesive suite of global, regional, and local state-of-the-art weather and ocean models:
– Navy Operational Global Atmospheric Prediction System (NOGAPS)
– Coupled Ocean/Atmosphere Mesoscale Prediction System (COAMPS)
– Navy Atmospheric Variational Data Assimilation System (NAVDAS)
– Navy Aerosol Analysis and Prediction System (NAAPS)
– GFDN Tropical Cyclone Model
– WaveWatch 3 (WW3) Ocean Wave Model
– Navy Coupled Ocean Data Assimilation (NCODA) System
– Ensemble Forecast System (EFS)
[Figures: cross section of temperature and wind speeds from COAMPS showing mountain waves over the Sierras (Owens Valley); surface pressure and clouds predicted by NOGAPS]

Satellite Products
– SATFOCUS
– SSM/I and SSMI/S
– Scatterometer
– Tropical Cyclone Web Page
– Target Area METOC (TAM)
– Tactically Enhanced Satellite Imagery (TESI)
[Figures: example SATFOCUS products – Dust Enhancement Product, SSM/I Water Vapor, SSM/I Wind Speed]

Models and Applications
– NWP (NOGAPS, COAMPS)
– Aerosol Modeling
– Ensemble Models
– Tropical Cyclone Forecasts
– Optimum Track Ship Routing
– Automated High Seas / Wind Warnings
– Ballistic Wind Computations
– Electro-Optical Forecasts
– Aircraft Routing
– Ocean Acoustic Forecasting
– Long-Range Planning
– Ice Forecasts
– Target Weapon Systems
– Visibility/Dust Forecasts
– WRIP
– CEEMS
– Search and Rescue

Overview of POPS and A2

Primary Oceanographic Prediction System (POPS)
– Program of Record (ACAT IAD program in sustainment)
– Provides HPC platforms to run models and applications at the UNCLAS, SECRET, and TS/SCI levels
– Has traditionally been composed of two subsystems:
– Analysis and Modeling Subsystem (AMS): models (NOGAPS, COAMPS, WW3, etc.); data assimilation (NAVDAS, NAVDAS-AR)
– Applications, Transactions, and Observations Subsystem (ATOS): model pre- and post-processing (e.g., observational data prep/QC, model output formatting, model visualization and verification); full range of satellite processing (e.g., SATFOCUS products, TC Web Page, DMSP, TAM); full range of METOC applications and services (e.g., OPARS, WebSAR, CAGIPS, AREPS, TAWS, APS, ATCF, AOTSR); hosting of the NOP Portal (single Web presence for Naval Oceanography)

New POPS Architecture Strategy
Drivers:
– Knowledge-centric CONOPS, requiring highly efficient reachback operations, including low-latency on-demand modeling
– Growing HPC dominance of Linux clusters based on commodity processors and commodity interconnects
– Resource ($) efficiencies
Solution:
– Combine AMS and ATOS functionality into a single system
– Make the system vendor neutral
– Use Linux-cluster technology based on commodity hardware
Advantages:
– Efficiency for reachback operations and on-demand modeling
– Capability to surge capacity back and forth between the AMS (BonD Tier 1) and ATOS (BonD Tiers 0, 2, 3) applications as needed
– Cost effectiveness of Linux and commodity hardware vice proprietary operating systems and hardware
– Cost savings by converging to a single operating system (Linux)
We call the combined AMS and ATOS functionality A2 (AMS + ATOS = A2).

A2 Hardware Specifications
A2-0 (Opal) Linux Cluster:
– 232 nodes with 2 Intel Xeon 5100-series “Woodcrest” dual-core processors per node (928 processor cores total)
– 1.86 TB memory (8 GB per node)
– 10 TFLOPS peak speed
– 35 TB disk space with 2 GB per second throughput
– Double Data Rate InfiniBand interconnect at 20 Gb per second
– 40 Gb per second connection to external Cisco network
– 4 ClearSpeed floating point accelerator cards
A2-1 (Ruby) Linux Cluster:
– 146 nodes with 2 Intel Xeon 5300-series “Clovertown” quad-core processors per node (1168 processor cores total)
– 1.17 TB memory (8 GB per node)
– 12 TFLOPS peak speed
– 40 TB disk space with 2 GB per second throughput
– Double Data Rate InfiniBand interconnect at 20 Gb per second
– 40 Gb per second connection to external Cisco network
– 2 GRU floating point accelerator cards
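The cluster totals above follow directly from the per-node figures, and the arithmetic can be cross-checked with a short sketch (node counts and per-node specs taken from the slide; the helper function is illustrative, not part of any FNMOC tooling):

```python
# Cross-check of the A2 cluster aggregate specs from per-node figures.

def totals(nodes, sockets_per_node, cores_per_socket, gb_per_node):
    """Aggregate core count and memory (decimal TB) for a cluster."""
    cores = nodes * sockets_per_node * cores_per_socket
    mem_tb = nodes * gb_per_node / 1000.0   # vendor-style decimal TB
    return cores, mem_tb

opal = totals(232, 2, 2, 8)   # Woodcrest: 2 dual-core sockets per node
ruby = totals(146, 2, 4, 8)   # Clovertown: 2 quad-core sockets per node

print(opal)  # (928, 1.856)  -> matches 928 cores, ~1.86 TB
print(ruby)  # (1168, 1.168) -> matches 1168 cores, ~1.17 TB
```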

A2 Middleware
Beyond the hardware components, A2 depends on a complex infrastructure of supporting systems software (i.e., “middleware”):
– SUSE Linux Enterprise Server operating system, soon to transition to Red Hat Linux V5
– GPFS: IBM’s General Parallel File System, commercial software that provides a single file system across all nodes of the system
– ISIS: Fleet Numerical’s in-house database, designed to rapidly ingest and assimilate NWP model output files and external observations
– PBS Pro: job scheduling software
– ClusterWorx: system management software
– Various MPI debugging tools, libraries, performance profiling tools, and utilities
– TotalView: FORTRAN/MPI debugging tool
– PGI’s Cluster Development Kit, which includes parallel FORTRAN, C, and C++ compilers
– Intel FORTRAN compiler
– VMware ESX Server

FNMOC HPC Systems
Systems are linked directly to ~230 TB of disk space and ~160 TB of tape archive space. Aggregate peak speed is ~30 TFLOPS.
– FS1: SGI Origin (TRIX)
– FS2: SGI Origin (TRIX)
– DC3: IBM p (AIX)
– ATOS2: IBM 1350s/x440s/x345s (Linux)
– CAAPS: IBM e1350s (Linux)
– A2-0 (Opal): Linux Cluster (Linux)
– A2-1 (Ruby): Linux Cluster (Linux)
– A2-2 (Topaz): Linux Cluster (Linux)
FS = File Server / Legacy Cross Domain Solution; ATOS = Applications, Transactions, and Observations Subsystem; CAAPS = Centralized Atmospheric Analysis and Prediction System; A2 = Combined AMS/ATOS System

Cross Domain Solutions
– To perform its mission, FNMOC fundamentally requires a robust, high-bandwidth Cross Domain Solution (CDS) system to allow two-way flow of data between the UNCLAS and SECRET enclaves within its HPC environment
– FNMOC’s existing CDS system is nearing the end of its lifecycle: SGI computer hardware (FS1, FS2) running the Trusted IRIX (TRIX) multi-level secure operating system
– The DoD Unified Cross Domain Management Office (UCDMO) has recently indicated that all legacy CDS systems must be replaced by one meeting their approval
– Existing UCDMO solutions do not meet FNMOC’s bandwidth requirements (currently ~1 GB/sec, increasing to ~64 GB/sec by 2013)

CDS Way Ahead
– FNMOC is currently partnered with the National Security Agency (NSA) to achieve an acceptable CDS that meets high-bandwidth requirements:
– Seeking an interim solution based on adaptation of one of the 14 existing UCDMO-approved solutions, allowing phased migration to a long-term solution
– Seeking a long-term solution based on the NSA High Assurance Platform (HAP) project
– Expect that the resulting solution will also meet NAVO’s requirements
– A joint NSA/FNMOC presentation at the Oct 2008 DISN Security Accreditation Working Group (DSAWG) resulted in approval of the CDS way ahead described above

NOGAPS Way Ahead

UKMO Engagement
– CNMOC/FNMOC and NRL explored the possibility of a partnership agreement with UKMO to obtain the UKMO Unified Model (UM) as a new baseline for development of the Navy’s operational global NWP capability
– Exploratory discussions took place between CNMOC (RDML Titley, Steve Payne), NRL (Simon Chang), and UKMO senior leadership in Exeter
– NRL obtained a Research License for the UM code and performed initial testing on FNMOC HPC platforms in late 2008/early 2009
– Issues: UM throughput performance is very poor on the FNMOC Linux Cluster architecture; Navy IA policies created significant impediments for working with the UM code; business and licensing issues are a concern
– NRL will test the UM on the IBM architecture at NAVO/MSRC
– Remote operation of a UM-based ensemble at NAVO/MSRC may be a possibility

Current Thinking on Navy Global NWP
– Fully support NUOPC
– Continue operational transition of the most promising near-term NOGAPS upgrades: 4DVAR (NAVDAS-AR); semi-Lagrangian dynamics
– Continue exploring the possibility of running an operational UKMO/UM-based ensemble (NAVO/MSRC)
– Seek partnerships for development of a next-generation global NWP model for operational implementation at FNMOC
– Link to growing Navy interest in (1) the Arctic, (2) climate change, and (3) energy conservation

Status and Milestones for 4DVAR (NAVDAS-AR)
– NOGAPS/NAVDAS-AR model top raised from 1.0 hPa to 0.04 hPa with the NOGAPS T239L42 implementation
– Began assimilating AIRS, IASI, and SSMI/S radiances
– Produced the following improvements relative to the operational NOGAPS/NAVDAS (3DVAR) implementation: ~6 hours improvement in 500 mb height anomaly correlation at TAU120; ~50 nm improvement in TC track performance at TAU120 (based on Aug/Sep 2008 cases)
– Final NAVDAS-AR beta testing began 16 Mar 2009 and will complete in Jul 2009
– Expect operational implementation in Aug 2009
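For reference, the 500 mb height anomaly correlation cited above is conventionally the correlation between forecast and verifying-analysis anomalies with respect to climatology. A minimal sketch of one common (uncentered) formulation follows; this is an illustrative definition, not FNMOC's verification code, and the sample numbers are invented:

```python
import math

def anomaly_correlation(forecast, analysis, climatology):
    """Uncentered anomaly correlation of forecast vs. analysis,
    with anomalies taken relative to climatology (illustrative)."""
    f = [fv - cv for fv, cv in zip(forecast, climatology)]   # forecast anomalies
    a = [av - cv for av, cv in zip(analysis, climatology)]   # analysis anomalies
    num = sum(fi * ai for fi, ai in zip(f, a))
    den = math.sqrt(sum(fi * fi for fi in f) * sum(ai * ai for ai in a))
    return num / den

# A perfect forecast (forecast == analysis) scores 1.0.
clim = [5500.0, 5520.0, 5480.0, 5510.0]   # made-up 500 mb heights (m)
anal = [5530.0, 5500.0, 5460.0, 5525.0]
print(anomaly_correlation(anal, anal, clim))  # 1.0
```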

[Figure: Impact of NAVDAS-AR on NOGAPS TC Track Skill – number of forecasts and TC track error (nm), August 2008 – 17 September 2008]

MILCON Project

MILCON Project
$9.3M expansion/renovation of FNMOC Building 700 (Computer Center Building):
– Adds 2500 sq ft to the Computer Center for NPOESS IDPS
– Expands Ops Floor from 1550 to 3200 sq ft
– Expands SCIF Ops Floor from 1947 to 3012 sq ft
– Expands SCIF Computer Center from 875 to 2240 sq ft
– Expands Auditorium from 47 seats to 100 seats
Schedule:
– Design/Build contract awarded Sep 2008
– Groundbreaking May 27, 2009
– Completion approximately Oct 2010

[Photos: FNMOC MILCON Project – FNMOC Building 700 Computer Center]

Summary
Recent accomplishments of COPC interest:
– Achieved IOC for the A2 Linux Cluster, the new host for the FNMOC models Ops Run
– Commenced operational support for NAEFS, with NOGAPS Ensemble fields delivered daily to NCEP
– Upgraded the NOGAPS Ensemble with ET initialization
– Implemented the coupled air-sea version of the GFDN TC model
We have embraced Linux-based HPC as we continue to build out our A2 cluster to accommodate the full range of our operational workload.
We are partnered with NSA to achieve a DoD-approved Cross Domain Solution capable of meeting our operational throughput requirements.
Along with ONR, NRL, CNMOC, and OPNAV N84, we are exploring the way ahead for Navy Numerical Weather Prediction.
We will break ground on a $9.3M MILCON project in June to expand and renovate the Building 700 Computer Center.

Questions?

FY09 Objectives

Top Five FY09 Objectives
– Implement the A-76 Most Efficient Organization (MEO) as the Information Technology Services Department (ITSD)
– Achieve Full Operational Capability (FOC) for the A2-0 (Opal), A2-1 (Ruby), and A2-2 (Topaz) Linux Cluster systems
– Achieve and maintain required Information Assurance (IA) accreditations, and implement a DoD-approved Cross Domain Solution (CDS) for transferring data between classification levels
– Design and install satellite processing systems in preparation for the launch of NPOESS and NPP
– Increase skill of METOC model products through implementation of new models, upgrades to existing models, and the assimilation of additional observational data from new sources and sensors

Additional FY09 Objectives
– Execute the Building 700 MILCON Project while maintaining at least 98.5% uptime for operations
– Maintain excellence in Submarine Enroute Weather Forecast (SUBWEAX) support
– Achieve IOC for the Automated Optimum Track Ship Routing (AOTSR) system
– Field and support the next-generation version of the Naval Oceanography Portal (NOP)
– Meet all requirements for the Weather Reentry-Body Interaction Planner (WRIP) project
– Meet all requirements for the North American Ensemble Forecast System (NAEFS) project in preparation for full engagement in the National Unified Operational Prediction Capability (NUOPC) initiative
– Complete all Cost of War (COW) projects
– Maintain excellence in climatology support
– Implement the Resource Protection component of the new Weather Delivery Model
– Develop a robust Continuity of Operations (COOP) plan for key capabilities
– Complete all Primary Oceanographic Prediction System (POPS) Program Decision Board (PDB) actions
– Achieve a Project Management culture aligned to the POPS program