
1 A Community Terrain-following Ocean Modeling System (TOMS). 2003 Terrain-Following Ocean Models Users Workshop, PMEL, Seattle, WA, August 5, 2003

2 Developers and Collaborators
Hernan G. Arango (Rutgers University), Alexander F. Shchepetkin (UCLA), W. Paul Budgell (IMR, Norway), Bruce D. Cornuelle (SIO), Emanuele Di Lorenzo (SIO), Tal Ezer (Princeton University), Mark Hadfield (NIWA, New Zealand), Kate Hedstrom (University of Alaska, ARSC), Robert Hetland (TAMU), John Klinck (Old Dominion), Arthur J. Miller (SIO), Andrew M. Moore (University of Colorado), Christopher Sherwood (USGS/WHOI), Rich Signell (SACLANT), John C. Warner (USGS/WHOI), John Wilkin (Rutgers University)

3 Executive Committee
Dale B. Haidvogel (Rutgers University), James C. McWilliams (UCLA), Robert Street (Stanford University)
ONR Support
Manuel Fiadeiro, Terri Paluszkiewicz, Charles Linwood Vincent

4 Objectives
To design, develop, and test an expert ocean modeling system for scientific and operational applications over a wide range of scales, from coastal to global
To provide a platform for coupling with operational atmospheric models, sediment models, and ecosystem models
To support multiple levels of nesting and composed grids
To provide tangent linear and adjoint models for variational data assimilation, ensemble forecasting, and stability analysis
To provide a framework for massively parallel computations

5 Approach
Use state-of-the-art advances in numerical techniques, subgrid-scale parameterizations, data assimilation, nesting, computational performance, and parallelization
Modular design with ROMS as a prototype
Test and evaluate the computational kernel and various algorithms and parameterizations
Build a suite of test cases and application databases
Provide web-based support to the user community and a linkage to the primary developers

6 Accomplishments
ROMS/TOMS 2.0 released to beta testers on January 16, 2003, and to the full user community on June 30, 2003.
Built tangent linear and adjoint models and tested them on realistic applications off the US West and East Coasts: eigenmodes and adjoint eigenmodes, singular vectors, pseudospectra, forcing singular vectors, stochastic optimals, and ensemble forecasting.

7 The model is used in oceanographic studies in over 30 countries by universities, government agencies, and research organizations. (Relief image from NOAA; animation by Rutgers)

8 Ocean Modeling Web Site http://www.ocean-modeling.org/

9 Kernel Attributes
Free-surface, hydrostatic, primitive equation model
Generalized, terrain-following vertical coordinates
Boundary-fitted, orthogonal curvilinear, horizontal coordinates on an Arakawa C-grid
Non-homogeneous predictor/corrector time-stepping algorithm
Accurate discretization of the baroclinic pressure gradient term
High-order advection schemes
Continuous, monotonic reconstruction of vertical gradients to maintain high-order accuracy

10 Vertical Terrain-following Coordinates (section between Dubrovnik, Croatia, and Vieste, Italy; axes: longitude vs. depth in m)
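
For reference, a common form of this kind of generalized terrain-following (s) coordinate is the Song and Haidvogel (1994) transform; the slide does not state the exact transform used here, so take this only as an illustration:

    z(x, y, s, t) = \zeta(x, y, t)\,(1 + s) + h_c\,s + [h(x, y) - h_c]\,C(s),   -1 \le s \le 0

where \zeta is the free surface, h the bathymetry, h_c a critical depth, and C(s) a monotonic stretching function, so that s = 0 follows the free surface and s = -1 follows the bottom.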

11 Curvilinear Transformation (example grids: Cartesian, Spherical, Polar)

12 Code Design
Modular, efficient, and portable F90/F95 Fortran code with dynamic allocation of memory via de-referenced pointer structures
C-preprocessor (cpp) management of compile-time options
Multiple levels of nesting and composed grids
Lateral boundary condition options: closed, periodic, and radiation
Arbitrary number of tracers (active and passive)
NetCDF input and output data structure
Support for parallel execution on both shared- and distributed-memory architectures

13 Model Grid Configuration (nested and composed grid examples)

14 Parallel Framework: coarse-grained parallelization

15 Parallel Tile Partitions, 8 x 8 (Nx by Ny tiles)

16 Parallel Framework
Coarse-grained parallelization
Shared-memory: compiler-dependent directives in MAIN (OpenMP 2.0 standard)
Distributed-memory (MPI)
Optimized for cache-bound computers
ZIG-ZAG cycling sequence of tile partitions
Few synchronization points
Serial and parallel I/O (via NetCDF)
Efficient on 4-64 threads
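
As a rough sketch of what coarse-grained tiling involves (this is not the actual ROMS/TOMS routine; the subroutine and variable names below are invented for illustration), each of the NtileI x NtileJ tiles is assigned a contiguous block of the Lm x Mm interior grid points:

      SUBROUTINE tile_range (tile, NtileI, NtileJ, Lm, Mm,              &
     &                       Istr, Iend, Jstr, Jend)
!
!  Illustrative sketch only: map tile number "tile" (0-based) to its
!  interior index ranges for an Lm x Mm grid split into NtileI x
!  NtileJ tiles.
!
      integer, intent(in)  :: tile, NtileI, NtileJ, Lm, Mm
      integer, intent(out) :: Istr, Iend, Jstr, Jend
      integer :: itile, jtile, ChunkI, ChunkJ

      itile=MOD(tile,NtileI)                ! tile column
      jtile=tile/NtileI                     ! tile row
      ChunkI=(Lm+NtileI-1)/NtileI           ! points per tile in I
      ChunkJ=(Mm+NtileJ-1)/NtileJ           ! points per tile in J
      Istr=1+itile*ChunkI
      Iend=MIN(Lm,Istr+ChunkI-1)
      Jstr=1+jtile*ChunkJ
      Jend=MIN(Mm,Jstr+ChunkJ-1)
      RETURN
      END SUBROUTINE tile_range

Under shared memory each thread typically loops over its share of tiles; under MPI each process typically owns a tile plus halo points exchanged at the few synchronization points mentioned above.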

17 (Ezer)

18 The cost of saving output and global averaging is much higher for the MPI code (run on the shared-memory SGI machine) (Ezer)

19 Subgrid-Scale Parameterizations
Horizontal mixing of tracers along level, geopotential, or isopycnic surfaces
Transverse, isotropic stress tensor for momentum
Local, Mellor-Yamada, level 2.5 closure scheme
Non-local, K-profile, surface and bottom closure scheme
Generic Length-Scale turbulence closure (GOTM)
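
For context on the Generic Length-Scale (GLS) option (see Warner et al. in the publications list), the closure evolves the turbulent kinetic energy k together with a generic quantity

    \psi = (c_\mu^0)^p \, k^m \, \ell^n

where \ell is a turbulent length scale and c_\mu^0 a stability constant; different choices of the exponents (p, m, n) recover the k-kl (Mellor-Yamada 2.5), k-epsilon, and k-omega closures. This is the standard GLS definition of Umlauf and Burchard (2003), included here only as a reminder.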

20 Boundary Layers
Air-sea interaction boundary layer from COARE (Fairall et al., 1996)
Oceanic surface boundary layer (KPP; Large et al., 1994)
Oceanic bottom boundary layer (inverted KPP; Durski et al., 2001)
Wave/current/sediment bed boundary layer (Styles and Glenn, 2000; Blaas; Sherwood)
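
As a reminder of what a bulk air-sea flux scheme such as COARE supplies (generic bulk formulas only; the actual COARE transfer coefficients depend on stability, gustiness, and sea state):

    \tau = \rho_a C_D |U_{10}| U_{10}
    Q_H = \rho_a c_p C_H |U_{10}| (\theta_s - \theta_a)
    Q_E = \rho_a L_v C_E |U_{10}| (q_s - q_a)

for wind stress, sensible heat flux, and latent heat flux, with U_{10} the 10-m wind relative to the surface, \theta the potential temperature, and q the specific humidity.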

21 Modules
Lagrangian Drifters (Klinck, Hadfield, Capet)
Tidal Forcing (Hetland, Signell)
River Runoff (Hetland, Signell, Geyer)
Sea-Ice (Budgell, Hedstrom)
Biology: Fasham-type Model (Moisan, Di Lorenzo, Shchepetkin, Frenzel, Fennel, Wilkin); EcoSim Bio-Optical Model (Bissett, Wilkin)
Sediment erosion, transport, and deposition (Warner, Sherwood, Blaas)

22 Ongoing and Future Work
One- and two-way nesting
Wetting and drying capabilities
Sediment model
Bottom boundary layer models
Ice model
Parallelization of adjoint model
Variational data assimilation
Parallel I/O
Framework (ESMF)
Web-based dynamic documentation
Test cases
WRF coupling

23 One-Way Nesting

24 North Atlantic Basin
1/10 degree resolution (1002x1026x30)
Levitus climatology
NCEP daily winds: 1994-2000
COADS monthly heat fluxes
Requirements:
Memory: 11 Gb
Input data disk space: 16 Gb
Output data disk space: 280 Gb
32 processors (Origin 3800, 4x16)
CPU: 46 hours per day of simulation
Wall clock: 153 days for the 7-year simulation
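
A rough consistency check on these throughput figures, assuming the 46 CPU hours are summed over the 32 processors:

    46 CPU-hours / 32 processors ≈ 1.44 wall-clock hours per simulated day
    7 years ≈ 2,557 simulated days x 1.44 h ≈ 3,680 h ≈ 153 days of wall-clock time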

25 Bathymetry (m) at 1/10 degree resolution, from ETOPO5, r-factor = 3.2

26 Free-Surface (m)

27 Temperature at 100 m

28 US East Coast
30 km resolution (192x64x30)
Initialized from the North Atlantic Basin simulation
NCEP daily winds
COADS monthly heat fluxes with an imposed daily shortwave radiation cycle
One-way nesting:
Boundary conditions from 3-day averages
Flather/Chapman OBC for 2D momentum
Clamped OBC for 3D momentum and tracers
Rivers
Fasham-type biology model
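
For reference, the Flather-type open boundary condition mentioned above sets the barotropic velocity normal to the boundary from the outer solution and the local free-surface mismatch (the sign depends on the orientation of the outward normal), while the companion Chapman condition radiates the free surface at the shallow-water wave speed:

    \bar{u} = \bar{u}_{ext} \pm \sqrt{g/h}\,(\zeta - \zeta_{ext})

where the "ext" fields in this application are the 3-day averages supplied by the North Atlantic Basin run.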

29 Temperature at 100 m, One-Way Coupling (Wilkin)

30 Potential Temperature at 50 m (Celsius), 30 km Resolution (Wilkin)

31 Surface Temperature Surface Chlorophyll

32 Publications
Ezer, T., H.G. Arango, and A.F. Shchepetkin, 2002: Developments in Terrain-Following Ocean Models: Intercomparisons of Numerical Aspects, Ocean Modelling, 4, 249-267.
Haidvogel, D.B., H.G. Arango, K. Hedstrom, A. Beckmann, P. Malanotte-Rizzoli, and A.F. Shchepetkin, 2000: Model Evaluation Experiments in the North Atlantic: Simulations in Nonlinear Terrain-Following Coordinates, Dyn. Atmos. Oceans, 32, 239-281.
MacCready, P. and W.R. Geyer, 2001: Estuarine Salt Flux through an Isohaline Surface, J. Geophys. Res., 106, 11629-11639.
Malanotte-Rizzoli, P., K. Hedstrom, H.G. Arango, and D.B. Haidvogel, 2000: Water Mass Pathways Between the Subtropical and Tropical Ocean in a Climatological Simulation of the North Atlantic Ocean Circulation, Dyn. Atmos. Oceans, 32, 331-371.
Marchesiello, P., J.C. McWilliams, and A.F. Shchepetkin, 2003: Equilibrium Structure and Dynamics of the California Current System, J. Phys. Oceanogr., 34, 1-37.
Marchesiello, P., J.C. McWilliams, and A.F. Shchepetkin, 2001: Open Boundary Conditions for Long-Term Integration of Regional Ocean Models, Ocean Modelling, 3, 1-20.
Moore, A.M., H.G. Arango, A.J. Miller, B.D. Cornuelle, E. Di Lorenzo, and D.J. Neilson, 2003: A Comprehensive Ocean Prediction and Analysis System Based on the Tangent Linear and Adjoint Components of a Regional Ocean Model, Ocean Modelling, submitted.
Penven, P., C. Roy, A. Colin de Verdiere, and J. Largier, 2000: Simulation and Quantification of a Coastal Jet Retention Process Using a Barotropic Model, Oceanol. Acta, 23, 615-634.
Penven, P., J.R.E. Lutjeharms, P. Marchesiello, C. Roy, and S.J. Weeks, 2001: Generation of Cyclonic Eddies by the Agulhas Current in the Lee of the Agulhas Bank, Geophys. Res. Lett., 27, 1055-1058.
Shchepetkin, A.F. and J.C. McWilliams, 2003: The Regional Ocean Modeling System: A Split-Explicit, Free-Surface, Topography-Following Coordinates Ocean Model, J. Comp. Phys., submitted.
Shchepetkin, A.F. and J.C. McWilliams, 2003: A Method for Computing Horizontal Pressure-Gradient Force in an Oceanic Model with a Non-Aligned Vertical Coordinate, J. Geophys. Res., 108, 1-34.
She, J. and J.M. Klinck, 2000: Flow Near Submarine Canyons Driven by Constant Winds, J. Geophys. Res., 105, 28671-28694.
Warner, J.C., H.G. Arango, C. Sherwood, B. Butman, and R.P. Signell, 2003: Performance of Four Turbulence Closure Methods Implemented Using a Generic Length Scale Method, Ocean Modelling, revised and resubmitted.

34 Modular Design

47 Code Design

48
#include "cppdefs.h"
      MODULE mod_ocean
!
!  Ocean state arrays: one structure instance per nested grid,
!  dynamically allocated and accessed via de-referenced pointers.
!
      USE mod_kinds

      implicit none

      TYPE T_OCEAN
        real(r8), pointer :: rubar(:,:,:)
        real(r8), pointer :: rvbar(:,:,:)
        real(r8), pointer :: rzeta(:,:,:)
        real(r8), pointer :: ubar(:,:,:)
        real(r8), pointer :: vbar(:,:,:)
        real(r8), pointer :: zeta(:,:,:)
#ifdef SOLVE3D
        real(r8), pointer :: pden(:,:,:)
        real(r8), pointer :: rho(:,:,:)
        real(r8), pointer :: ru(:,:,:,:)
        real(r8), pointer :: rv(:,:,:,:)
        real(r8), pointer :: t(:,:,:,:,:)
        real(r8), pointer :: u(:,:,:,:)
        real(r8), pointer :: v(:,:,:,:)
        real(r8), pointer :: W(:,:,:)
        real(r8), pointer :: wvel(:,:,:)
# ifdef SEDIMENT
        real(r8), pointer :: bed(:,:,:,:)
        real(r8), pointer :: bed_frac(:,:,:,:)
        real(r8), pointer :: bottom(:,:,:)
# endif
#endif
      END TYPE T_OCEAN

      TYPE (T_OCEAN), allocatable :: OCEAN(:)

      CONTAINS

49
      SUBROUTINE allocate_ocean (ng, LBi, UBi, LBj, UBj)
!
!  Allocate ocean state arrays for nested grid "ng" over the array
!  bounds (LBi:UBi, LBj:UBj).
!
      USE mod_param
#ifdef SEDIMENT
      USE mod_sediment
#endif
      integer, intent(in) :: ng, LBi, UBi, LBj, UBj

      IF (ng.eq.1) allocate ( OCEAN(Ngrids) )

      allocate ( OCEAN(ng) % rubar(LBi:UBi,LBj:UBj,2) )
      allocate ( OCEAN(ng) % rvbar(LBi:UBi,LBj:UBj,2) )
      allocate ( OCEAN(ng) % rzeta(LBi:UBi,LBj:UBj,2) )
      allocate ( OCEAN(ng) % ubar(LBi:UBi,LBj:UBj,3) )
      allocate ( OCEAN(ng) % vbar(LBi:UBi,LBj:UBj,3) )
      allocate ( OCEAN(ng) % zeta(LBi:UBi,LBj:UBj,3) )
#ifdef SOLVE3D
      allocate ( OCEAN(ng) % pden(LBi:UBi,LBj:UBj,N(ng)) )
      allocate ( OCEAN(ng) % rho(LBi:UBi,LBj:UBj,N(ng)) )
      allocate ( OCEAN(ng) % ru(LBi:UBi,LBj:UBj,0:N(ng),2) )
      allocate ( OCEAN(ng) % rv(LBi:UBi,LBj:UBj,0:N(ng),2) )
      allocate ( OCEAN(ng) % t(LBi:UBi,LBj:UBj,N(ng),3,NT(ng)) )
      allocate ( OCEAN(ng) % u(LBi:UBi,LBj:UBj,N(ng),2) )
      allocate ( OCEAN(ng) % v(LBi:UBi,LBj:UBj,N(ng),2) )
      allocate ( OCEAN(ng) % W(LBi:UBi,LBj:UBj,0:N(ng)) )
# ifdef SEDIMENT
      allocate ( OCEAN(ng) % bed(LBi:UBi,LBj:UBj,Nbed,MBEDP) )
      allocate ( OCEAN(ng) % bed_frac(LBi:UBi,LBj:UBj,Nbed,NST) )
      allocate ( OCEAN(ng) % bottom(LBi:UBi,LBj:UBj,MBOTP) )
# endif
#endif
      RETURN
      END SUBROUTINE allocate_ocean

50
      SUBROUTINE initialize_ocean (ng, tile)
!
!  Initialize ocean state arrays to zero over the tile bounds.
!
      USE mod_param
#ifdef SEDIMENT
      USE mod_sediment
#endif
      integer, intent(in) :: ng, tile

      integer :: IstrR, IendR, JstrR, JendR, IstrU, JstrV

      real(r8), parameter :: IniVal = 0.0_r8

#include "tile.h"
#ifdef DISTRIBUTE
      IstrR=LBi
      IendR=UBi
      JstrR=LBj
      JendR=UBj
#else
# include "set_bounds.h"
#endif
      OCEAN(ng) % rubar(IstrR:IendR,JstrR:JendR,1:2) = IniVal
      OCEAN(ng) % rvbar(IstrR:IendR,JstrR:JendR,1:2) = IniVal
      OCEAN(ng) % rzeta(IstrR:IendR,JstrR:JendR,1:2) = IniVal
      OCEAN(ng) % ubar(IstrR:IendR,JstrR:JendR,1:3) = IniVal
      OCEAN(ng) % vbar(IstrR:IendR,JstrR:JendR,1:3) = IniVal
      OCEAN(ng) % zeta(IstrR:IendR,JstrR:JendR,1:3) = IniVal
      ...
      RETURN
      END SUBROUTINE initialize_ocean

      END MODULE mod_ocean

51
#include "cppdefs.h"
      MODULE omega_mod
!
!  Compute the s-coordinate vertical velocity.
!
      implicit none

      PRIVATE
      PUBLIC omega

      CONTAINS

      SUBROUTINE omega (ng, tile)
!
!  Driver: unpack the grid and ocean structures for grid "ng" and
!  call the tiled computational routine.
!
      USE mod_param
      USE mod_grid
      USE mod_ocean

      integer, intent(in) :: ng, tile

# include "tile.h"
# ifdef PROFILE
      CALL wclock_on (ng, 13)
# endif
      CALL omega_tile (ng, Istr, Iend, Jstr, Jend,                      &
     &                 LBi, UBi, LBj, UBj,                              &
     &                 GRID(ng) % Huon,                                 &
     &                 GRID(ng) % Hvom,                                 &
     &                 GRID(ng) % z_w,                                  &
     &                 OCEAN(ng) % W)
# ifdef PROFILE
      CALL wclock_off (ng, 13)
# endif
      RETURN
      END SUBROUTINE omega

52
      SUBROUTINE omega_tile (ng, Istr, Iend, Jstr, Jend,                &
     &                       LBi, UBi, LBj, UBj,                        &
     &                       Huon, Hvom, z_w, W)
!
!  Tiled computation: vertically integrate the divergence of the
!  horizontal mass fluxes to obtain the vertical mass flux W.
!
      USE mod_param
      USE mod_scalars

      USE bc_3d_mod, ONLY : bc_w3d_tile

      integer, intent(in) :: ng, Iend, Istr, Jend, Jstr
      integer, intent(in) :: LBi, UBi, LBj, UBj

      real(r8), intent(in) :: Huon(LBi:,LBj:,:)
      real(r8), intent(in) :: Hvom(LBi:,LBj:,:)
      real(r8), intent(in) :: z_w(LBi:,LBj:,0:)
      real(r8), intent(out) :: W(LBi:,LBj:,0:)

      integer :: IstrR, IendR, JstrR, JendR, IstrU, JstrV
      integer :: i, j, k

      real(r8), dimension(PRIVATE_1D_SCRATCH_ARRAY) :: wrk

# include "set_bounds.h"

      DO j=Jstr,Jend
        DO i=Istr,Iend
          W(i,j,0)=0.0_r8
        END DO
        DO k=1,N(ng)
          DO i=Istr,Iend
            W(i,j,k)=W(i,j,k-1)-                                        &
     &               (Huon(i+1,j,k)-Huon(i,j,k)+                        &
     &                Hvom(i,j+1,k)-Hvom(i,j,k))
          END DO
        END DO
        ...
      END DO
      RETURN
      END SUBROUTINE omega_tile

