
1 uMeteo-K - 1 - Principal Investigator: Jai-Ho Oh (Pukyong National Univ., Korea) Co-researchers: In-Sik Kang (Seoul National Univ., Korea), Byung-Lyol Lee (NCAM/KMA, Korea) 17th APAN Meetings, Research Collaborations Among Institutions Employing Grid Technology, January 29, 2004, Hawaii Imin International Conference Center, University of Hawaii, Honolulu, Hawaii, USA

2 uMeteo-K - 2 - Main Goals: Establishment of uMeteo-K, a ubiquitous Korean meteorological research cooperation system

3 uMeteo-K - 3 - About uMeteo-K: The concept of a virtual laboratory for interdisciplinary meteorological research - Cooperative meteorological research environment (Access Grid) - Parallelized numerical weather/climate prediction modeling (Computational Grid) - Virtual server for large meteorological data (Data Grid). Grid technology is essential to accomplishing all three.

4 uMeteo-K - 4 - uMeteo-K Achievements (Apr. 2003 - present): Establishment of a virtual cooperative research environment using the Access Grid; pilot operation of a numerical weather prediction system under a Computational Grid environment; establishment of a Data Grid and application of virtual scenarios for uMeteo-K

5 uMeteo-K - 5 - PKNU video-conferencing & AG room

6 uMeteo-K - 6 - PKNU room-node AG server specifications:
- Processor & memory: Pentium Xeon 2.4 GHz dual CPU, 1 GB RAM (Windows XP)
- Video x 2: Matrox G550 Dual (AGP slot type) and ATI 7000 32 MB DDR (PCI slot type); more than 1024 x 768
- Audio: Sound Blaster Live (full duplex)
- Animation capture: Osprey 210 (PCI, 64-bit support)
- Speaker & microphone: 5.1-ch. PC speaker and mic.
- Cameras: Sony EVI-D30 (Polycom ViewStation), Logitech QuickCam Pro 4000 (independent duplex video conferencing)
- Monitors: 1024 x 768 x 4 (four monitors installed)
[Diagram labels: video devices 1-2, animation capture device, audio device, motherboard, CPUs, RAM, ODD, FDD, SCSI-FDD]

7 uMeteo-K - 7 - PKNU QuickBridge server installed: bridge.pknu.ac.kr, the uMeteo-K bridge server - supports connecting a PIG without a multicast network - operates the uMeteo-K homepage server - registered with the uMeteo-K virtual venues service (ANL) - operates the QuickBridge server (bridge.pknu.ac.kr: 210.107.209.219)

8 uMeteo-K - 8 - uMeteo-K portable AG server specifications:
- System (processor & memory): dual Intel Xeon 2.4 GHz / 1 GB DDR / 40 GB SCSI HDD / Gigabit Ethernet
- Video cards: Matrox G550 Dual (AGP), ATI Radeon 9000 Dual (PCI)
- Audio capture card: SoundBlaster Live DE
- Video capture card: Osprey 230 (PCI)
- Speaker & microphone: AKG SR40 + HT40 (cordless mic. & receiver), Creative INSPiRE 5.1 5700 speakers
- Cameras: Sony EVI-D30 (analog), Logitech QuickCam Pro 4000 (USB)
- Display devices: Sharp LCD projector XG-C40, Samsung SyncMaster 157X LCD monitor
- Etc.: 8U rack cabinet, projector screen, triangle support
- S/W: MS Windows XP Professional, MS Office XP Standard

9 uMeteo-K - 9 - uMeteo-K AG instruments - Camera: EVI-D30 - Mic.: cordless & boundary mics. - Echo-removal unit

10 uMeteo-K - 10 - uMeteo-K new AG configuration [diagram]: multicast connections among PKNU, KISTI, KMA, and ANL; unicast connections through the Quick Bridge for meteorological universities (PKNU, KISTI, KMA, SNU, KAIST, KJIST), meteorological organizations (KMA), other users, and the APAG AG

11 uMeteo-K - 11 - PKNU Venue-server registered to ANL

12 uMeteo-K - 12 - Remote PPT - Shares presentation data over an AG connection - Data sharing through a simple virtual host configuration

13 uMeteo-K - 13 - VNC Server & Viewer - VNC stands for Virtual Network Computing. - VNC allows you to view and interact with one computer (the "server") using a simple program (the "viewer") on another computer anywhere. - The two computers don't even have to run the same type of OS.
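As a rough sketch, a session with the two programs named above typically looks like this. The hostname and display number are placeholders, not taken from the slides; the commands are assigned to variables and printed rather than executed, since running them needs a live VNC installation.

```shell
# Hypothetical VNC session (hostname and display number are placeholders).
SERVER_CMD='vncserver :1'             # run on the machine to be viewed
VIEWER_CMD='vncviewer server.host:1'  # run on any other machine, any OS
echo "1) $SERVER_CMD"
echo "2) $VIEWER_CMD"
```

The viewer then shows and controls the server's desktop over the network.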

14 uMeteo-K - 14 - Sample of VNC Server & Viewer

15 uMeteo-K - 15 - Samples of uMeteo-K AG operation:
1. 2003. 4. 23. uMeteo-K AG test (PKNU, SNU, NCAM)
2. 2003. 6. 3. uMeteo-K AG test
3. 2003. 6. 20. uMeteo-K AG test with VRVS
4. 2003. 7. 8. (KISTI)
5. 2003. 7. 13. SNU & Tokyo Univ. seminar
6. 2003. 7. 31. RPPT test
7. 2003. 8. 26. GFK
8. 2003. 8. 27. 16th APAN
9. 2003. 9. 3.
10. 2003. 9. 25. CAgM/ETSIDN
11. 2003. 10. 23. AG
12. 2003. 11. 9. uMeteo-K 8th Workshop
13. 2003. 11. 21. 2nd workshop
14. 2003. 12. 1. Winter G.F.K. Workshop on Grid (KISTI, PKNU, SNU, NCAM, SKKU)
15. 2003. 12. 18. GLOBEC Workshop (NFRDI, KISTI, PKNU, Japan)
16. 2003. 12. 23. (PKNU, SNU, NCAM, KISTI)
17. 2004. 1. 6. (PKNU, Japan Frontier Institution)
18. 2004. 1. 9. Workshop on Grid-tech.
Seminars: international (4) + domestic (14); co-work and general use: 60

16 uMeteo-K - 16 - uMeteo-K AG operation

17 uMeteo-K - 17 - Meteorology Session at the 16th APAN meeting, Marriott Hotel, Busan, Korea

18 ICSYM/PKNU 2003 uMeteo-K - 18 - 2003 Grid Forum Korea Winter meetings at Chosun Hotel, Seoul

19 uMeteo-K - 19 - GLOBEC workshop at NFRDI, Korea

20 uMeteo-K - 20 - Tera Computing and Network Demonstration, KISTI

21 uMeteo-K - 21 - Workshop on Grid Technology for Environmental study at PKNU

22 uMeteo-K - 22 - uMeteo-K AG 2.0 tutorials uMeteo-K meeting & tutorials at PKNU, Busan, Korea

23 uMeteo-K - 23 - Costs & Benefits Analysis for uMeteo-K AG (estimated benefit per event, KRW):
1. 2003. 4. 23. uMeteo-K AG test: 400,000 +α
2. 2003. 6. 3. uMeteo-K AG test: 400,000 +α
3. 2003. 6. 20. uMeteo-K AG test: 600,000 +α
4. 2003. 7. 8.: 1,600,000 +α
5. 2003. 7. 13. SNU & Tokyo Univ.: 8,800,000 +α
6. 2003. 7. 31.: 300,000 +α
7. 2003. 8. 26. GFK: 300,000 +α
8. 2003. 8. 27. 16th APAN: 8,000,000 +α
9. 2003. 9. 3. KISTI: 800,000 +α
10. 2003. 9. 25. CAgM/ETSIDN: 7,800,000 +α
11. 2003. 10. 23.: 200,000 +α
12. 2003. 11. 9. uMeteo-K 8th Workshop: 2,700,000 +α
13. 2003. 11. 21. 2nd workshop: 1,200,000 +α
14. 2003. 12. 1. Winter G.F.K. workshop: 750,000 +α
15. 2003. 12. 18. GLOBEC Workshop: 400,000 +α
16. 2003. 12. 23.: 400,000 +α
17. 2004. 1. 6.: 1,000,000 +α
18. 2004. 1. 9. Workshop on Grid-tech.: 2,700,000 +α
Costs (AG equipment etc.): 4,000,000 / 4,000,000 / 8,000,000; total 15,000,000 KRW
Benefits/Costs: USD 32k +α vs. USD 27k

24 uMeteo-K - 24 - Differences between AG 1.2 and AG 2.0:
1. Easier Venue Server configuration and usage
2. Individual Venue Servers used instead of a PIG
3. Complexity around the Certification Authority
4. Shared Data, Shared Applications, and Shared Services reinforced
5. GUI
6. Total management of Access Grid utility services
AG 1.2: client connects through direct bridge access; applications managed through different programs. AG 2.0: client-server model with data sharing and remote control; all applications managed through one program.

25 uMeteo-K - 25 - Operate AG 2.0 – (1) Video Conferencing

26 uMeteo-K - 26 - Operate AG 2.0 – (2) Sharing Data

27 uMeteo-K - 27 - Operate AG 2.0 – (3) Data Download

28 uMeteo-K - 28 - Operate AG 2.0 – (4) Shared Browser

29 uMeteo-K - 29 - Operate AG 2.0 – (5) Shared Presentation

30 uMeteo-K - 30 - Enable face-to-face meeting activities What can be done: –Sharing Data –Shared Applications –Text chat Applications: –Shared Presentation –Shared Web browser –Whiteboard –Voting Tool –Question & Answer Tool –Shared Desktop Tool Integrate legacy single-user apps Communicates media addresses to node service Client Services & Applications

31 uMeteo-K - 31 - uMeteo-K CG Testbed: uMeteo-K computational grid test-bed (two clusters utilized; each cluster has 4 nodes)
- CPU: Pentium 4 1.8 GHz
- RAM: 1 GB SDRAM
- HDD: EIDE 40 GB
- VGA: none
- Network: internal 10/100 Fast Ethernet; external KOREN

32 uMeteo-K - 32 - uMeteo-K CG Testbed Configuration [diagram]: two 4-node (single-CPU) clusters, a NAS storage server, a 10/100 switch hub, a monitoring system, a UPS, and an electrometer, connected via 10/100 Ethernet to KOREN

33 uMeteo-K - 33 - Globus linkage between testbed clusters: An independent simple CA is installed at each master node (CA-A on Master A, CA-B on Master B). A group of slave nodes is controlled by each master node through the PBS scheduler. Application: simulating severe weather, typhoon Mae-mi.
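A master node's PBS job for one cluster might look like the following sketch. The job name, resource request, and binary path are assumptions for illustration, not taken from the slides.

```shell
# Write a hypothetical PBS script for a 4-node slave run, then show it.
# Job name, node count, and the mm5 binary path are all assumptions.
cat > mm5.pbs <<'EOF'
#PBS -N mm5_run
#PBS -l nodes=4:ppn=1
#PBS -j oe
cd $PBS_O_WORKDIR
mpirun -np 4 ./mm5.mpp
EOF
cat mm5.pbs
# A master node would then submit it with: qsub mm5.pbs
```

The PBS scheduler dispatches the job across the slave nodes controlled by that master.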

34 uMeteo-K - 34 - subject : /O=uMeteoK/OU=pknuGB_A01/OU=pknu.ac.kr/CN=globus/CN=proxy issuer : /O=uMeteoK/OU=pknuGB_A01/OU=pknu.ac.kr/CN=globus identity : /O=uMeteoK/OU=pknuGB_A01/OU=pknu.ac.kr/CN=globus type : full legacy globus proxy strength : 512 bits path : /tmp/x509up_u533 timeleft : 7:52:58 subject : /O=uMeteoK/OU=pknuGB_B01/OU=pknu.ac.kr/CN=globus05/CN=proxy issuer : /O=uMeteoK/OU=pknuGB_B01/OU=pknu.ac.kr/CN=globus05 identity : /O=uMeteoK/OU=pknuGB_B01/OU=pknu.ac.kr/CN=globus05 type : full legacy globus proxy strength : 512 bits path : /tmp/x509up_u535 timeleft : 7:52:43 - CA-A - CA-B CA information of each cluster
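The listings above are what grid-proxy-info prints after a proxy has been created on each cluster; the two commands involved are sketched below. They are shown rather than run, since they require a Globus installation and a user certificate.

```shell
# Commands that produce the proxy listings above (shown, not executed).
STEP1='grid-proxy-init'   # prompts for the pass phrase, writes /tmp/x509up_u<uid>
STEP2='grid-proxy-info'   # prints subject, issuer, type, strength, path, timeleft
echo "1) $STEP1"
echo "2) $STEP2"
```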

35 uMeteo-K - 35 - uMeteo-K CG testbed monitoring

36 uMeteo-K - 36 - MM5 (Mesoscale Model Version 5): a non-hydrostatic NWP model developed by PSU/NCAR; KMA's operational model for short-term forecasts; parallel NWP model for real-time short-term forecasting

37 uMeteo-K - 37 - Sep. 12, 2003 09:00 LST~ Sep. 13, 2003 09:00 LST Precipitation, MSLP & Wind for 24 HRs PrecipitationMSLP and Wind

38 uMeteo-K - 38 - Parallel MM5 Benchmarks with Globus:
Average job waiting time (including CA): 25 sec
Required time for a 3600 sec (1 hour) model integration: MPICH 35 sec; MPICH-G2 42 sec
Required time for an 86400 sec (1 day) model integration: MPICH 10 min 38 sec; MPICH-G2 12 min 47 sec; single CPU 67 min 12 sec
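A quick consistency check on the 1-day figures above, using only numbers from the table (shell integer arithmetic):

```shell
# Speedup and Globus overhead from the 1-day MM5 integration times.
single=$((67*60 + 12))   # single CPU:   67 min 12 sec = 4032 s
mpich=$((10*60 + 38))    # 8-way MPICH:  10 min 38 sec =  638 s
g2=$((12*60 + 47))       # MPICH-G2:     12 min 47 sec =  767 s
echo $(( single * 100 / mpich ))        # speedup in hundredths -> 631 (about 6.3x)
echo $(( (g2 - mpich) * 100 / mpich ))  # MPICH-G2 overhead over MPICH, percent -> 20
```

So the parallel run is roughly 6.3 times faster than a single CPU, and running under Globus (MPICH-G2) costs about 20% extra relative to plain MPICH.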

39 uMeteo-K - 39 - Connecting to the KISTI testbed
- CA information:
subject : /O=Grid/O=Globus/OU=pknu.ac.kr/CN=cgtest/CN=proxy/CN=proxy
issuer : /O=Grid/O=Globus/OU=pknu.ac.kr/CN=cgtest/CN=proxy
type : full
strength : 512 bits
path : /tmp/x509up_p2358.fileEaIRDc.1
timeleft : 7:28:17
- Connecting to venus.gridcenter.or.kr using gsissh:
gsissh -p 2222 venus.gridcenter.or.kr
- File transfer to venus.gridcenter.or.kr using gsincftpput and gsincftpget:
gsincftpput -p 2811 venus.gridcenter.or.kr ./ ~/MM5.tar
gsincftpput -p 2811 venus.gridcenter.or.kr ./ ~/CReSS.tar

40 uMeteo-K - 40 - Running the numerical weather models: MM5 (Mesoscale Model Version 5) and CReSS (Cloud Resolving Storm Simulator) both compile successfully. CReSS runs successfully, but MM5 fails to run because some libraries and routines are not supported on the testbed.
- RSL file to run CReSS:
+
( &(resourceManagerContact="venus.gridcenter.or.kr")
  (count=8)
  (label="subjob 0")
  (environment=(GLOBUS_DUROC_SUBJOB_INDEX 0) (LD_LIBRARY_PATH /usr/local/gt2/lib/))
  (directory="/home/ngrid/ngrid017/CReSS1.3m")
  (executable="/home/ngrid/ngrid017/CReSS1.3m/run.sh") )
( &(resourceManagerContact="jupiter.gridcenter.or.kr")
  (count=8)
  (label="subjob 8")
  (environment=(GLOBUS_DUROC_SUBJOB_INDEX 1) (LD_LIBRARY_PATH /usr/local/gt2/lib/))
  (directory="/home/ngrid/ngrid017/CReSS1.3m")
  (executable="/home/ngrid/ngrid017/CReSS1.3m/run.sh") )

41 uMeteo-K - 41 - CReSS (Cloud Resolving Storm Simulator): researched and developed at Nagoya Univ., Japan - Non-hydrostatic, compressible dynamic model - Meso-to-cloud scale simulation system - Includes detailed cloud microphysical processes - Simulations take a long time, so parallel-processing performance matters

42 uMeteo-K - 42 - Downburst Simulation for 21000 sec.

43 uMeteo-K - 43 - Parallel CReSS Benchmarks with Globus:
Required time for a 21000 sec model integration on the KISTI testbed: Jupiter 8 nodes 16 min 39 sec; Venus 8 nodes 15 min 35 sec; Jupiter 4 nodes + Venus 4 nodes 715 min 13 sec; Jupiter 8 nodes + Venus 8 nodes
Required time for a 21000 sec model integration on the uMeteo-K testbed: PKNU cluster 8 nodes (MPICH) 15 min 49 sec; PKNU A 4 nodes + PKNU B 4 nodes 16 min 39 sec

44 uMeteo-K - 44 - Data Grid in Globus Tools
Data transport & access:
- GASS: simple, multi-protocol file transfer tools, integrated with GRAM
- GridFTP: provides high-performance, reliable data transfer for modern WANs
Data replication and management:
- Replica Catalog: provides a catalog service for keeping track of replicated datasets
- Replica Management: provides services for creating and managing replicated datasets

45 uMeteo-K - 45 - Data grid for climate prediction [diagram]: observation data and model output move among SNU, the KISTI supercomputer, PKNU, KMA, NASA, and NCEP; wu-ftp connects the KMA supercomputer; model output and forecast output feed the atmospheric e-science Data Grid

46 uMeteo-K - 46 - Hardware structure of the Data Grid [diagram]:
- Seoul National University: Linux servers (Intel dual CPU) cdldata, cpsdata, and climate, with disk RAIDs of 2.3 TB, 1.8 TB, and 500 GB
- Korea Meteorological Administration: Linux server (Intel dual CPU) neosky15, with a 1.2 TB disk RAID
- Pukyong National University: Linux server (Intel dual CPU) pknuGB01, with a 500 GB disk RAID
- All sites connected through the KOREN network

47 uMeteo-K - 47 - Overview of the Data Grid scenario:
1. Add the uMeteoK-DataGrid to the LDAP server (ldapadd)
2. Add the Test-catalog to the LDAP server (ldapadd)
3. Create the Test-Collection in the Test-catalog (globus-replica-management)
4. Register locations for the Test-Collection: snu-kma-pknu (globus-replica-management)
5. Create, register, and list file lists for all locations (globus-replica-management & -catalog)
6. Copy files snu->kma or snu->pknu (globus-replica-management)
7. List all locations where given files can be found (globus-replica-catalog)
8. Delete the files & locations (globus-replica-management)
9. Delete the Test-Collection & Test-catalog (globus-replica-management)

48 uMeteo-K - 48 - Data grid middleware
Data transfer & access:
- Grid FTP using globus-url-copy: secured FTP, included in the Globus Toolkit
- Grid FTP using the GSINCFTP programs
- Features: reliable and restartable data transfer; parallel and striped data transfer; automatic negotiation of TCP buffer/window sizes
User certification: grid-proxy-init (Globus Toolkit)
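For illustration, a globus-url-copy invocation in the style of the hosts above might look like this. The URLs are placeholders; the command is printed rather than executed, since a real transfer needs a valid proxy and a reachable GridFTP server.

```shell
# Hypothetical GridFTP copy: pull one file from an SNU server to local disk.
SRC='gsiftp://cdldata.snu.ac.kr/home/globus/data/climatePrediction1'
DST='file:///tmp/climatePrediction1'
echo "globus-url-copy $SRC $DST"
```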

49 uMeteo-K - 49 - Data Grid in Globus Tools: Globus Toolkit installation process. The test of the Globus software (globusrun) appears successful. Grid software: Globus Toolkit 2.4; CA software: SimpleCA. Result of running /bin/date via grid-proxy-init, globus-personal-gatekeeper, and globusrun.

50 uMeteo-K - 50 - Data Grid in Globus Tools: GSINCFTP testing - all successful
- SNU-SNU: cdldata -> climate
- SNU-KMA: cdldata -> neosky15
- SNU-PKNU: cdldata -> pknuGB01

51 uMeteo-K - 51 - Replica catalog
Replica management:
- Layered on other Grid services: GSI, transport, information service
- Uses LDAP as the catalog format and protocol, for consistency
- Used as a building block for other tools
Characteristics of replica management:
- LDAP object classes represent the logical-to-physical mapping in an LDAP catalog
- Low-level replica catalog: the globus_replica_catalog library manipulates the replica catalog
- High-level replica management: the globus_replica_manager library combines file transfer operations with calls to the low-level API functions

52 uMeteo-K - 52 - Replica Catalog Structure [diagram]: the replica catalog holds logical collections (e.g., "Global & regional observation data", "Global & regional prediction data"), each containing logical files (e.g., "Jan 1997", "Feb 1997", "Jan 1999"). Each logical file (e.g., "NCEP Jan 1997") maps to physical replicas reached by protocol gsi-ftp, e.g., gsi-ftp://cdldata.snu.ac.kr/cdldata1 and gsi-ftp://cpsdata.snu.ac.kr/cdldata2.
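The logical-to-physical mapping the figure describes can be illustrated with a toy listing. The entries mirror the figure; these are plain shell variables, not real catalog calls.

```shell
# One logical file name mapped to its two physical replica addresses.
logical='NCEP Jan 1997'
replica1='gsi-ftp://cdldata.snu.ac.kr/cdldata1'
replica2='gsi-ftp://cpsdata.snu.ac.kr/cdldata2'
for addr in "$replica1" "$replica2"; do
  echo "$logical -> $addr"
done
```

Resolving the logical name to any one of its replicas is exactly the lookup the replica catalog performs.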

53 uMeteo-K - 53 - Data Replication Management Test
Location A: host climate.snu.ac.kr (slapd & Grid-FTP running); logical name "climate prediction A"; location /home/globus/data; files: climatePrediction 1, climatePrediction 2, climatePrediction 3
Location B: host cdldata.snu.ac.kr (slapd & Grid-FTP running); logical name "climate prediction B"; location /home/ymyang/data; files: climatePrediction 4, climatePrediction 5, climatePrediction 6
Copy and delete operations were tested between the two locations.

54 uMeteo-K - 54 - Data Grid in Globus Tools: scenario for the Globus replica service between cdldata (SNU) and pknuGB01 (PKNU), driven by a shell script for the Globus replica services. In the copy command there is a problem: copying a file from SNU to PKNU works, but the opposite direction produces an error. We suppose this results from the firewall policy of the PKNU computing agency.

55 uMeteo-K - 55 - In the future:
- A multicast network might be necessary for duplex Access Grid. (Currently, ICSYM/PKNU and NCAM/KMA within uMeteo-K use the unicast QuickBridge network supported by KISTI.)
- Enhancement of the display panel and improvement of sound quality might be necessary: update to AG version 2.0.
- An urgent repair might be necessary for the failure of common DNS identification among uMeteo-K members.
- Additional uMeteo-K CG testbed nodes.
- Between SNU and KMA, we are about to test the Globus replica service.
- Application of realistic scenarios for uMeteo-K.

56 uMeteo-K - 56 - Future plan of uMeteo-K Ocean Grid NDP Grid APCN Grid Environ Grid Energy Grid Hydrologic Grid N*Grid CAgM Grid uMeteo-K

57 uMeteo-K - 57 - Thank you for your attention!

