SEE-GRID-SCI – WRF-ARW model: Grid usage (www.see-grid-sci.eu)

SEE-GRID-SCI WRF-ARW model: Grid usage
The SEE-GRID-SCI initiative is co-funded by the European Commission under the FP7 Research Infrastructures contract no.
Davor Davidović, Centre for Informatics and Computing, Ruđer Bošković Institute, Croatia
SEE-GRID-SCI Kick-off meeting, Athens, May 22-23

Program
- Introduction
- Storage schema
- WRF-ARW package
- Data upload
- Submitting the model
- Retrieving model data

Why WRF-ARW on the grid?
- Access to "free" resources
- The possibility of running the model on a larger number of CPUs, i.e. shorter execution times
- The possibility of storing large amounts of data (grid storage elements)
- The possibility of producing more accurate forecasts (better resolution, shorter time intervals, etc.)

Storage schema (1)
- A unified system for storing and collecting meteorological data (initial and boundary conditions, model output results, etc.)
- The LFC file catalogue is used for storing all data
- WRF-ARW is executed within the METEO VO
- Every VO has its own unique root directory within LFC, of the form /grid/VO_NAME/

Storage schema (2)
Before starting to use LFC we have to point to the proper LFC server:
$ export LFC_HOST=grid02.rcub.bg.ac.yu
We can then check our file structure using:
$ lfc-ls -l /grid/meteo.see-grid-sci.eu

Storage schema – example of use
Root folder for WRF model data in LFC: /grid/meteo.see-grid-sci.eu/WRF-ARW
It consists of 3 main sub-folders:
- /bin
- /input_data
- /output_data
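A quick way to confirm this layout from the UI is to list the folder directly (the output shown is illustrative; the actual catalogue contents may differ):
$ lfc-ls /grid/meteo.see-grid-sci.eu/WRF-ARW
bin
input_data
output_data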

Storage schema – /bin
- Stores all executable and auxiliary files
- Divided into two folders:
  - WPS – executables for pre-processing
  - WRF – executables for the WRF-ARW core
- Naming convention for tar-ed executables: WRFV$VERSION_$ARCH.tar.gz and WPSV$VERSION_$ARCH.tar.gz
- WRF_STATIC.tar.gz and WPS_STATIC.tar.gz contain all auxiliary static files
- These data are rarely changed (only with new versions of the model)
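As a sketch of how a packaged executable could be registered in this folder (the version/architecture string and the target SE are illustrative; the lcg-cr pattern is the same one shown on the data upload slide later):
$ lcg-cr --vo meteo.see-grid-sci.eu -d $SE file:$PWD/WRFV3.1_x86_64.tar.gz -l lfn:/grid/meteo.see-grid-sci.eu/WRF-ARW/bin/WRF/WRFV3.1_x86_64.tar.gz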

Storage schema – /input_data
- Stores all input model data (geo and grib)
- /terrain – static geographical data
  - /raw – raw data
  - /precompiled – data processed with geogrid.exe
- /boundary – initial and boundary conditions
  - /REGION – region sub-folder (user-defined)
    - /raw – raw data from global model servers (NCEP, ECMWF, etc.)
    - /precompiled – data processed with ungrib.exe
- All boundary condition data must be saved within a REGION (see the example paths below)
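For illustration (using CROATIA, the region that also appears in the output example on a later slide), raw and ungrib-processed boundary data would sit under paths of the form:
/grid/meteo.see-grid-sci.eu/WRF-ARW/input_data/boundary/CROATIA/raw/
/grid/meteo.see-grid-sci.eu/WRF-ARW/input_data/boundary/CROATIA/precompiled/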

Storage schema – /output_data
- Stores model output data
- All data are stored in a REGION sub-folder defined by the user
- Consists of:
  - /REGION – region of interest (user-defined)
    - /sci – results of the scientific mode
    - /oper – results of the operational mode
- In operational mode outputs are saved under the current date
- Naming convention for zipped model outputs: $username_WRF_ARW_YYYYMMDD_$code.tar.gz
- For example, the full path to an operational output archive has the form: /grid/meteo.see-grid-sci.eu/WRF-ARW/output_data/CROATIA/oper/YYYYMMDD/ddavid_WRF_ARW_YYYYMMDD_$code.tar.gz

WRF-ARW grid workflow

Before submitting the model
Before using the scripts one must set the environment variables:
- WRF_HOME – full path to the WRF_ARW folder
- PATH – add WRF_ARW/bin to PATH
For example:
$ export WRF_HOME=/home/home_folder/WRF_ARW
$ export PATH=$PATH:$WRF_HOME/bin

WRF-ARW – uploading data (1)
There are three ways to upload data (initial and boundary conditions):
- using the standard lcg-* tools
- using the DM-Web web application
- using the wrf-upload-data script
Example of copying data from the UI node to a specified grid SE:
$ lcg-cr --vo $VO -d $SE $LOCAL_FILE -l lfn:/LFC_LOCATION/FILE_NAME

WRF-ARW – uploading data (2)
The second way is by using the DM-Web application (demo example: web interface)
The third way is by using the wrf-upload-data script (most appropriate for downloading data for the operational forecast):
$ wrf-upload-data -h
$ wrf-upload-data -l limit -d destination
For uploading data for the scientific mode, use the lcg-* tools

WRF-ARW – submit
Submit the model to the WMS (grid) using this script:
$ wrf-submit [options]
- Simple to use
- Automatically generates the *.jdl script and submits the job to the grid
- The complexity of the gLite middleware is hidden from the end user
$ wrf-submit -h

WRF-ARW – submit
Options for starting the model (see the example below):
- -m – type of the model run (o – operational, s – scientific)
- -r – name of the region
- -d – SE for storing the model outputs
- -t – set the Vtable (defaults: GFS for operational, ECMWF for scientific)
- -p – number of processors
- -i – name of the input file
- -o – name of the output file (mandatory for scientific mode)
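Putting the options together, a hypothetical scientific run for the CROATIA region on 8 processors could look like this (the SE and file names are placeholders):
$ wrf-submit -m s -r CROATIA -d $SE -p 8 -i test_sci -o $USERNAME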

WRF-ARW – operational mode
How to start in operational mode:
- The script automatically updates the start and end dates in the namelist files
- namelist.input and namelist.wps must be located in the folder from which wrf-submit is started
- Start the model by executing:
$ wrf-submit -m o -p $NUM_PROC
for example:
$ wrf-submit -m o -p 8 -o $USERNAME
- A file with the grid job id given by the WMS (extension *.job) is generated

WRF-ARW – scientific mode
Starting the scientific mode:
- namelist.input and namelist.wps must be located in the folder from which wrf-submit is run
- The user must manually set the parameters in the namelist files (start and end dates, period of the simulation, etc.)
- The simplest way to start the model:
$ wrf-submit -m s -i $INPUT_FILE
for example:
$ wrf-submit -m s -i test_sci -p 8 -o $USERNAME
- A file with the saved grid job id is also generated

Checking the job state
Name format of the generated jobid file: username_YYYY-MM-DD_run_run-id.job
The jobid file is necessary for checking your job status.
Check the job status using the gLite tools:
$ glite-wms-job-status -i JOBID_FILE.job
or using the script:
$ wrf-job-status -i JOBID_FILE.job
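A minimal polling sketch from the UI, assuming wrf-job-status simply prints the current state to standard output (the 5-minute interval is arbitrary):
$ while true; do wrf-job-status -i JOBID_FILE.job; sleep 300; done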

WRF-ARW – retrieving outputs
After the run is finished, all output results are packed and stored on the (user-defined or default) grid SE and registered in the LFC catalogue under:
/WRF-ARW/output_data/REGION/sci/ or /WRF-ARW/output_data/REGION/oper/YYYYMMDD/
Script for downloading the output files:
$ wrf-get-data -i JOBID_FILE.job
Downloading is possible on the UI node or on some other user-defined server
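As a sketch of manual retrieval with the standard tools (region, date and archive name are placeholders following the naming convention from the output_data slide):
$ lcg-cp --vo meteo.see-grid-sci.eu lfn:/grid/meteo.see-grid-sci.eu/WRF-ARW/output_data/REGION/oper/YYYYMMDD/username_WRF_ARW_YYYYMMDD_code.tar.gz file:$PWD/wrf_output.tar.gz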

Thank you for your attention! Questions?