Fleet Numerical Update
CSABC Meeting, 15 April 2009
James A. Vermeulen, Data Ingest Team Supervisor
FNMOC, 7 Grace Hopper Ave., Stop 1, Monterey, CA 93943



Outline
– Introduction
– Overview of Operations
– Overview of POPS and A2
– Cross Domain Solutions
– UKMO Engagement
– MILCON Project
– FY09 Objectives
– Summary

Introduction

Mission
To provide the highest quality, most relevant and timely Meteorological and Oceanographic (METOC) support to U.S. and coalition forces.

Recent Accomplishments
– Achieved Initial Operating Capability (IOC) for the A2 supercomputer system, the new host for the FNMOC models Ops Run
– Commenced operational support for the North American Ensemble Forecast System (NAEFS), with NOGAPS Ensemble fields delivered daily to NCEP
– Upgraded the NOGAPS Ensemble with Ensemble Transform (ET) initialization
– Implemented the coupled air-sea version of the GFDN tropical cyclone model
– Took on the Submarine Enroute Weather Forecast (SUBWEAX) mission at the SECRET and TS/SCI levels
– Hosted and achieved IOC for the Naval Oceanography Portal (NOP), the single unclassified and classified operational Web presence for all CNMOC activities
– Delivered the first spiral of the Automated Optimum Track Ship Routing System (AOTSR)
– Completed the A-76 source selection process for IT services, resulting in selection of the government's Most Efficient Organization (MEO) bid as the winning proposal

Overview of Operations

Operations
Operations Center
– Manned 24x7 by a team of military and civilian watch standers
– Focused on operational mission support, response to requests for special support and products, and customer liaison for DoD operations worldwide
– Joint Task Force capable
– The Navy's worldwide meteorology/oceanography operations watch
– Operates at the UNCLAS, CONFIDENTIAL and SECRET levels
– SUBWEAX mission at the SECRET level
Sensitive Compartmented Information Facility (SCIF)
– Extension of the Ops Center
– Operational communications (including secure video teleconferencing), tasking and processing elevated to the TS/SCI level if needed
– Includes significant supercomputer capacity at the TS/SCI level
– SUBWEAX mission at the TS/SCI level
Ops Run
– Scheduled and on-demand 24x7 production
– 6 million meteorological observations and 2 million products each day
– 15 million lines of code and ~16,000 job executions per day
– Highly automated and very reliable (see the ordering sketch below)
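The Ops Run's automation amounts to ordering thousands of interdependent production jobs every day. As a minimal illustration of the idea (not FNMOC's actual scheduler; the job names are hypothetical, echoing the NWP pipeline shown later in this brief), a dependency graph can be executed in topological order:

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Hypothetical slice of a watch-cycle job graph: each job lists the
# jobs whose output it needs before it can start.
job_graph = {
    "decode_obs":   set(),            # decode incoming observations
    "qc_obs":       {"decode_obs"},   # quality-control the decoded obs
    "navdas":       {"qc_obs"},       # data assimilation
    "nogaps_fcst":  {"navdas"},       # global forecast model
    "post_process": {"nogaps_fcst"},  # format fields for customers
    "satfocus":     set(),            # satellite products run independently
}

# static_order() yields an execution order that respects every dependency.
for job in TopologicalSorter(job_graph).static_order():
    print("run:", job)
```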

Models
Fleet Numerical operates a highly integrated and cohesive suite of global, regional and local state-of-the-art weather and ocean models:
– Navy Operational Global Atmospheric Prediction System (NOGAPS)
– Coupled Ocean/Atmosphere Mesoscale Prediction System (COAMPS)
– Navy Atmospheric Variational Data Assimilation System (NAVDAS)
– Navy Aerosol Analysis and Prediction System (NAAPS)
– GFDN Tropical Cyclone Model
– WaveWatch 3 (WW3) Ocean Wave Model
– Navy Coupled Ocean Data Assimilation (NCODA) System
– Ensemble Forecast System (EFS)
[Figures: cross section of temperature and wind speeds from COAMPS showing mountain waves over the Sierras (Owens Valley; ~30 km observed vs ~28 km modeled wavelength); surface pressure and clouds predicted by NOGAPS]

Satellite Products
– SATFOCUS
– SSM/I and SSMIS
– Scatterometer
– Tropical Cyclone Web Page
– Target Area METOC (TAM)
– Tactically Enhanced Satellite Imagery (TESI)
[Figures: example SATFOCUS products, including the SATFOCUS Dust Enhancement Product, SSM/I Water Vapor, and SSM/I Wind Speed]

Overview of POPS and A2

Primary Oceanographic Prediction System (POPS)
– Program of Record (ACAT IAD program in sustainment)
– Provides HPC platforms to run models and applications at the UNCLAS, SECRET and TS/SCI levels
– Has traditionally been composed of two subsystems:
  – Analysis and Modeling Subsystem (AMS)
    – Models (NOGAPS, COAMPS, WW3, etc.)
    – Data assimilation (NAVDAS, NAVDAS-AR)
  – Applications Transactions and Observations Subsystem (ATOS)
    – Model pre- and post-processing (e.g., observational data prep/QC, model output formatting, model visualization and verification)
    – Full range of satellite processing (e.g., SATFOCUS products, TC Web Page, DMSP, TAM)
    – Full range of METOC applications and services (e.g., OPARS, WebSAR, CAGIPS, AREPS, TAWS, APS, ATCF, AOTSR)
    – Hosting of the NOP Portal (single Web presence for Naval Oceanography)

Battlespace on Demand (BonD) and Current POPS Subsystems
[Diagram: observational data from satellites and Fleet data feed the AMS and ATOS subsystems, which support the BonD tiers: Tier 1 – the Environment Layer, Tier 2 – the Performance Layer, and Tier 3 – the Decision Layer, culminating in the Forecast Battlespace]

POPS Architecture Strategy Going Forward
Drivers:
– Knowledge-centric CONOPS, requiring highly efficient reach-back operations, including low-latency on-demand modeling
– Growing HPC dominance of Linux clusters based on commodity processors and commodity interconnects
– Resource ($) efficiencies
Solution:
– Combine AMS and ATOS functionality into a single system
– Make the system 'vendor agnostic'
– Use Linux-cluster technology based on commodity hardware
Advantages:
– Efficiency for reach-back operations and on-demand modeling: shared file system and shared databases, and lower latency for on-demand model response
– Capability to surge capacity back and forth between the AMS (BonD Tier 1) and ATOS (BonD Tiers 0, 2, 3) applications as needed (see the sketch below)
– Cost effectiveness of Linux and commodity hardware vice proprietary operating systems and hardware
– Cost savings by converging to a single operating system (Linux)
We call the combined AMS and ATOS functionality A2 (AMS + ATOS = A2).
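The surge idea can be illustrated with a toy rebalancing rule over a shared node pool. This is a sketch under invented assumptions (the proportional policy, the 16-node floor, and the pool size, taken here as the combined Opal and Ruby node count); it is not the POPS design.

```python
# Minimal sketch: rebalance a shared node pool between two workloads.
# Policy and numbers are hypothetical.

TOTAL_NODES = 378  # e.g., 232 (Opal) + 146 (Ruby)

def rebalance(model_backlog: int, apps_backlog: int) -> dict:
    """Split nodes in proportion to queued demand, keeping a small
    floor for each workload so neither ever starves."""
    floor = 16
    demand = model_backlog + apps_backlog
    if demand == 0:
        half = TOTAL_NODES // 2
        return {"model": half, "apps": TOTAL_NODES - half}
    model_nodes = max(floor, round(TOTAL_NODES * model_backlog / demand))
    model_nodes = min(model_nodes, TOTAL_NODES - floor)
    return {"model": model_nodes, "apps": TOTAL_NODES - model_nodes}

# An on-demand modeling request arrives: model backlog spikes, pool shifts.
print(rebalance(model_backlog=120, apps_backlog=40))  # most nodes to the model
print(rebalance(model_backlog=5,   apps_backlog=90))  # most nodes to applications
```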

BonD and Future POPS
[Diagram: observational data from satellites and Fleet data feed the single A2 system, which supports Tier 1 – the Environment Layer, Tier 2 – the Performance Layer, and Tier 3 – the Decision Layer, culminating in the Forecast Battlespace]

A2 Hardware Specifications
A2-0 (Opal) - UNCLAS Linux Cluster
– 232 nodes with 2 Intel Xeon 5100-series "Woodcrest" dual-core processors per node (928 processor cores total)
– 1.86 TB memory (8 GB per node)
– 10 TFLOPS peak speed
– 35 TB disk space with 2 GB per second throughput
– Double Data Rate InfiniBand interconnect at 20 Gb per second
– 40 Gb per second connection to external Cisco network
– 4 ClearSpeed floating point accelerator cards
A2-1 (Ruby) - SECRET Linux Cluster
– 146 nodes with 2 Intel Xeon 5300-series "Clovertown" quad-core processors per node (1168 processor cores total)
– 1.17 TB memory (8 GB per node)
– 12 TFLOPS peak speed
– 40 TB disk space with 2 GB per second throughput
– Double Data Rate InfiniBand interconnect at 20 Gb per second
– 40 Gb per second connection to external Cisco network
– 2 GPU floating point accelerator cards
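The quoted peak speeds are consistent with the standard peak-FLOPS arithmetic: peak = cores x FLOPs/cycle x clock. A quick check in Python, assuming 4 double-precision FLOPs per cycle per core and a 2.66 GHz clock (both assumptions; the slide does not state clock rates):

```python
def peak_tflops(nodes: int, sockets: int, cores_per_socket: int,
                flops_per_cycle: int, ghz: float) -> float:
    """Theoretical peak in TFLOPS: cores x FLOPs/cycle x clock rate."""
    cores = nodes * sockets * cores_per_socket
    return cores * flops_per_cycle * ghz / 1000.0

# Opal: 232 nodes x 2 dual-core Woodcrest -> 928 cores
print(round(peak_tflops(232, 2, 2, 4, 2.66), 1))  # ~9.9, quoted as 10 TFLOPS

# Ruby: 146 nodes x 2 quad-core Clovertown -> 1168 cores
print(round(peak_tflops(146, 2, 4, 4, 2.66), 1))  # ~12.4, quoted as 12 TFLOPS
```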

A2 Hardware Specifications
A2-S (Topaz) - TS/SCI Linux Cluster
– Add Topaz h/w specs here

A2 Middleware
Beyond the hardware components, A2 depends on a complicated infrastructure of supporting systems software (i.e., "middleware"):
– SUSE Linux Enterprise Server operating system
– GPFS: IBM's General Parallel File System, commercial software that provides a single file system across all of the nodes of the system
– ISIS: Fleet Numerical's in-house database, designed to rapidly ingest and assimilate NWP model output files and external observations
– PBS Pro: job scheduling software (see the example following this slide)
– ClusterWorx: system management software
– Various MPI libraries, debugging tools, performance profiling tools, and utilities
– TotalView: FORTRAN/MPI debugging tool
– PGI's Cluster Development Kit, which includes parallel FORTRAN, C, and C++ compilers
– Intel compilers
– VMware ESX Server
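As an illustration of the scheduling layer, a PBS Pro batch job is described by a short script of #PBS directives and handed to qsub. The sketch below is generic, not an FNMOC production job; the job name, queue, resource request, and executable are hypothetical.

```python
import subprocess

# Hypothetical PBS Pro job script: request 4 nodes x 8 cores each and
# run an MPI executable. Queue, walltime, and program are invented.
job_script = """#!/bin/bash
#PBS -N nogaps_post
#PBS -q ops
#PBS -l select=4:ncpus=8:mpiprocs=8
#PBS -l walltime=00:30:00
cd $PBS_O_WORKDIR
mpirun -np 32 ./post_process
"""

with open("job.pbs", "w") as f:
    f.write(job_script)

# On success, qsub prints the new job's identifier on stdout.
result = subprocess.run(["qsub", "job.pbs"], capture_output=True,
                        text=True, check=True)
print("submitted:", result.stdout.strip())
```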

A2 High-Level Timeline
A2-0 (Opal) – UNCLAS system
– Procurement started Jun 06
– Full system on site Jan 07
– Achieved IOC 15 Oct 08
A2-1 (Ruby) – SECRET system
– Procurement started May 07
– System on site Sep 07
– Expect to achieve IOC Feb 09
A2-S (Topaz) – TS/SCI system
– Procurement started ????
– System on site ????
– Expect to achieve IOC ????

POPS Computer Systems

NAME           TYPE                    #CPUs   MEMORY (GB)   PEAK SPEED (TFLOPS)   OS
AMS2 [1]       SGI ORIGIN                  ?             ?                     ?   TRIX
AMS3 [2]       SGI ORIGIN                  ?             ?                     ?   TRIX
AMS4 [2]       SGI ORIGIN                  ?             ?                     ?   TRIX
DC3            IBM p                       ?             ?                     ?   AIX
ATOS2          IBM 1350s/x440s/x345s       ?             ?                     ?   Linux
CAAPS          IBM e1350s                  ?             ?                     ?   Linux
A2-0 (Opal)    Linux cluster             928          1856                    10   Linux
A2-1 (Ruby)    Linux cluster            1168          1168                    12   Linux
A2-S (Topaz)   Linux cluster            ????          ????                  ????   Linux
TOTAL                                   ????          ????                   ~??

[1] Used for TS/SCI level modeling in the SCIF. To be replaced in FY09 by A2-S.
[2] No longer used for running models; continues to serve as a legacy Cross Domain Solution (CDS), providing transfer of data between classified and unclassified systems.

Cross Domain Solutions

In order to perform its mission, FNMOC fundamentally requires a robust, high-bandwidth Cross Domain Solution (CDS) system to allow two-way flow of data between the UNCLAS and SECRET enclaves within its HPC environment.
FNMOC's existing CDS system is nearing the end of its lifecycle:
– SGI computer hardware (AMS3, AMS4)
– Trusted IRIX (TRIX) multi-level secure operating system
The DoD Unified Cross Domain Management Office (UCDMO) has recently indicated that all legacy CDS systems must be replaced by one meeting their approval. Existing UCDMO solutions do not meet FNMOC's bandwidth requirements (currently ~1 GB/sec, increasing to ~64 GB/sec by 2013).
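Conceptually, a CDS guard mediates every transfer between enclaves, releasing a file only after it passes a battery of checks. The sketch below is a deliberately simplified, hypothetical illustration of that pattern (a format whitelist, a crude marking screen, and an integrity hash); an accredited guard is vastly more rigorous.

```python
import hashlib
from pathlib import Path

# Hypothetical guard policy; a real, accredited CDS is far more rigorous.
ALLOWED_SUFFIXES = {".grib", ".bufr", ".txt"}  # permitted data formats
BLOCKED_TOKENS = [b"SECRET//", b"TS//"]        # crude classification-marking screen

def guard_release(path: Path) -> str:
    """Return a SHA-256 integrity receipt if the file passes every
    check; raise otherwise, so the caller never forwards it."""
    if path.suffix not in ALLOWED_SUFFIXES:
        raise ValueError(f"format not on transfer whitelist: {path.suffix}")
    data = path.read_bytes()
    for token in BLOCKED_TOKENS:
        if token in data:
            raise ValueError("blocked content marking found")
    return hashlib.sha256(data).hexdigest()

# Example with a throwaway file standing in for a model product.
sample = Path("sample.txt")
sample.write_bytes(b"unclassified test payload")
print("released with receipt", guard_release(sample))
```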

CDS Way Ahead
FNMOC is currently partnered with the National Security Agency (NSA) to achieve an acceptable CDS that meets its high-bandwidth requirements:
– Seeking an interim solution based on modification of one or more of the existing 14 UCDMO-approved solutions, allowing phased migration to a long-term solution
– Seeking a long-term solution based on the NSA High Assurance Platform (HAP) project
– Expect that the resulting solution will also meet NAVO's requirements
A joint NSA/FNMOC presentation at the Oct 2008 DISN Security Accreditation Working Group (DSAWG) resulted in approval of the proposed CDS way ahead described above.

UKMO Engagement

Re-Baselining NOGAPS
– CNMOC/FNMOC and NRL are exploring the possibility of a partnership agreement with UKMO to obtain the UKMO Unified Model (UM) as a new baseline for development of the Navy's operational global NWP capability.
– Intended to foster joint NRL/UKMO development of a new operational global model for FNMOC.
– NRL is to retain its global NWP expertise to ensure that the resulting new model retains the attributes of NOGAPS that are important to the Navy, while building on the advanced dynamics, physics and data assimilation capabilities of the UKMO UM.
– Exploratory discussions between CNMOC (RDML Titley, Steve Payne), NRL (Simon Chang) and UKMO senior leadership took place in Exeter last month.
– NRL has obtained a research license for the UM and is beginning initial testing of the model on the FNMOC computer system.
– Moving toward a go/no-go decision on this course of action in late January.
– The Navy remains firmly committed to NUOPC, and sees this as a way to improve the quality of the NUOPC Ensemble.

Current NWP Configuration
[Diagram: observational data flows through decoders and QC programs into NAVDAS / NAVDAS/AR data assimilation, which feeds the NOGAPS forecast model, whose output goes to applications and users]

Possible Phase 1 Transition
[Diagram: observational data flows through the existing decoders, QC programs and NAVDAS / NAVDAS/AR, now feeding the UM forecast model, whose output goes to applications and users]
Freeze NOGAPS forecast model development and replace the forecast model with the UM. This will require new interface programs to and from the forecast model. A rigorous statistical evaluation over multiple seasons will be performed.

Possible Phase 2 Transition
[Diagram: observational data flows through decoders and QC programs into NAVDAS / NAVDAS/AR, feeding the UM forecast model, whose output goes to applications and users]
Freeze QC development and replace it with the UM QC, and develop decoders for the UM QC programs. A rigorous statistical evaluation over multiple seasons will be performed.

Possible Phase 3 Transition
[Diagram: observational data flows through decoders and UM QC programs into the UM 4D variational analysis, feeding the UM forecast model, whose output goes to applications and users]
Freeze NAVDAS/AR development and replace it with the UM 4D variational analysis system. A rigorous statistical evaluation over multiple seasons will be performed. Conduct tests to determine impact on downstream products.
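Each phase hinges on a rigorous multi-season statistical evaluation of the candidate configuration against the operational one. The sketch below shows the flavor of the verification arithmetic, scoring two forecast sets against verifying analyses with RMSE and anomaly correlation; the arrays are synthetic stand-ins, not real model fields.

```python
import numpy as np

def rmse(forecast: np.ndarray, analysis: np.ndarray) -> float:
    """Root-mean-square error over all grid points and cases."""
    return float(np.sqrt(np.mean((forecast - analysis) ** 2)))

def anomaly_correlation(forecast, analysis, climatology) -> float:
    """Centered anomaly correlation: skill of forecast departures
    from climatology against analyzed departures."""
    fa = (forecast - climatology).ravel()
    aa = (analysis - climatology).ravel()
    fa -= fa.mean()
    aa -= aa.mean()
    return float(fa @ aa / np.sqrt((fa @ fa) * (aa @ aa)))

# Synthetic stand-ins for 500 hPa height fields (cases x lat x lon).
rng = np.random.default_rng(0)
analysis    = rng.normal(5500.0, 100.0, size=(120, 73, 144))
climatology = np.full_like(analysis, 5500.0)
control     = analysis + rng.normal(0.0, 30.0, size=analysis.shape)  # e.g., NOGAPS
candidate   = analysis + rng.normal(0.0, 25.0, size=analysis.shape)  # e.g., UM

for name, fcst in [("control", control), ("candidate", candidate)]:
    print(name, "RMSE:", round(rmse(fcst, analysis), 1),
          "AC:", round(anomaly_correlation(fcst, analysis, climatology), 3))
```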

MILCON Project

$9.3M expansion/renovation of FNMOC Building 700 (Computer Center Building):
– Adds 2500 sq ft to the Computer Center for NPOESS IDPS
– Expands the Ops Floor from 1550 to 3200 sq ft
– Expands the SCIF Ops Floor from 1947 to 3012 sq ft
– Expands the SCIF Computer Center from 875 to 2240 sq ft
– Expands the Auditorium from 47 seats to 100 seats
Schedule:
– Design/Build contract awarded Sep 2008
– Groundbreaking Jan 2009
– Completion approximately Oct 2010

FNMOC MILCON Project
[Image: FNMOC Building 700 Computer Center]

FY09 Objectives

Top Five FY09 Objectives
1. Implement the A-76 Most Efficient Organization (MEO) as the Information Technology Services Department (ITSD).
2. Achieve Full Operational Capability (FOC) for the A2-0 (Opal), A2-1 (Ruby) and A2-S (Topaz) computer systems.
3. Achieve and maintain required Information Assurance (IA) accreditations, and implement a DoD-approved Cross Domain Solution (CDS) for transferring data between classification levels.
4. Design and install satellite processing systems in preparation for the launch of NPOESS and NPP.
5. Increase the skill of METOC model products through implementation of new models, upgrades to existing models, and the assimilation of additional observational data from new sources and sensors.

Additional FY09 Objectives
– Execute the Building 700 MILCON Project while maintaining at least 98.5% uptime for operations.
– Maintain excellence in Submarine Enroute Weather Forecast (SUBWEAX) support.
– Achieve FOC for the Automated Optimum Track Ship Routing (AOTSR) system.
– Field and support the next generation version of the Naval Oceanography Portal (NOP).
– Meet all requirements for the Weather Reentry-Body Interaction Planner (WRIP) project.
– Meet all requirements for the North American Ensemble Forecast System (NAEFS) and Joint Ensemble Forecast System (JEFS) projects in preparation for full engagement in the National Unified Operational Prediction Capability (NUOPC) initiative.
– Complete all Cost of War (COW) projects.
– Maintain excellence in climatology support through FNMOD Asheville.
– Implement the Resource Protection component of the new Weather Delivery Model.
– Develop a robust Continuity of Operations (COOP) plan for key capabilities.
– Complete all Primary Oceanographic Prediction System (POPS) Program Decision Board (PDB) actions.
– Achieve a Project Management culture aligned to the POPS program.

Summary

Several of our recent accomplishments will be of particular interest to COPC:
– Achieved IOC for the A2 supercomputer system, the new host for the FNMOC models Ops Run
– Commenced operational support for NAEFS, with NOGAPS Ensemble fields delivered daily to NCEP
– Upgraded the NOGAPS Ensemble with ET initialization
– Implemented the coupled air-sea version of the GFDN TC model
We have embraced Linux-based HPC as we continue to build out our A2 cluster to accommodate the full range of our operational workload.
We are partnered with NSA to achieve a DoD-approved Cross Domain Solution capable of meeting our operational throughput requirements.
Along with NRL and CNMOC, we are exploring the possibility of a partnership with UKMO to re-baseline the Navy's global NWP model.
We will break ground on the $9.3M MILCON project in Jan 2009 to expand and renovate the Building 700 Computer Center.
We have a full plate of FY09 objectives, not the least of which is implementing the MEO resulting from our completed A-76 study.