e-AIRS Reporting and Issues, Resource Working Group, PRAGMA 15. Jongbae Moon, Byungsang Kim, Kum Won Cho, Korea Institute of Science and Technology Information (KISTI).


Resources (the slide showed a map of participating sites)
- Korea: KISTI (Daejeon) and SNU (Seoul), connected via KREONET, hosting the portal server, DB server, storage, and clusters
- Clusters at partner sites: Sakura/AIST (Japan), SDSC (USA), UPRM (Puerto Rico), NCHC (Taiwan), UZH (Switzerland)

Statistics
How many PRAGMA resources are used in e-AIRS?
- Ver. 1: AIST (Japan) and Sakura (Japan); GT4 only
- Ver. 2: 15 sites available; so far about 300 CPUs across 9 sites, including AIST (Japan)

Education Service
Effectiveness for education
- Undergraduate and graduate students are introduced to wind tunnel experiments and CFD using e-AIRS
- Breaks away from the traditional text-based CFD lecture, leading to higher interest in using e-AIRS in class
- Already used in classes at two universities; more universities and CFD classes are planned
CFD lecture, fluid computation results, and remote wind tunnel lecture (using the Access Grid)
- Students can watch the wind tunnel experiment in progress via the AG Toolkit
- No need to program solvers: just click the simulation button for a quick review of the flow

Survey Results
The slide showed a table of average scores and "above average" percentages for each topic (values not preserved in this transcript):
1. Increasing the understanding of the CFD simulation process
2. Convenience of using the portlet-based web portal
3. Functionality and convenience of the mesh generation
4. Functionality and convenience of the CFD simulation
5. Functionality and convenience of the visualization

Current and Future Services
1. Cyber Education Service
2. Insect Flapping Simulation Service
3. WIG Craft Simulation Service
4. Black Hole Simulation Service

Case Study
Busan University conducted a term project using e-AIRS.
- 5 meshes of NACA 4-digit series airfoils (NACA 4012, 4112, 4212, 4312, 4412)
- Parametric study variables: angle of attack (AOA) and Mach number
  - AOA from 0 to 30 degrees in steps of 2 => 16 cases
  - Mach number from 0.2 to 3.0 in steps of 0.2 => 15 cases
  - (Slide figures illustrated the freestream flow direction at AOA = 0 and 30 degrees, and at Mach 0.2 and 3.0)
- Each student submits at least 5 x 16 x 15 = 1,200 cases; with 50 students this is 60,000 jobs
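
As a rough illustration of the scale of this parametric study, the following sketch (Python; the mesh names and the submit_job helper are hypothetical placeholders, since the real jobs go through the e-AIRS portal) enumerates the full sweep of meshes, angles of attack, and Mach numbers described above.

    # Minimal sketch of the e-AIRS class parametric sweep described on this slide.
    # Mesh names and submit_job() are placeholders, not part of e-AIRS itself.

    meshes = ["NACA4012", "NACA4112", "NACA4212", "NACA4312", "NACA4412"]
    aoas = range(0, 31, 2)                              # 0..30 deg in steps of 2 -> 16 cases
    machs = [round(0.2 * i, 1) for i in range(1, 16)]   # 0.2..3.0 in steps of 0.2 -> 15 cases

    def submit_job(mesh, aoa, mach):
        # Placeholder: in practice each case becomes one CFD solver job on the grid.
        print(f"submit {mesh} AOA={aoa} Mach={mach}")

    cases = [(m, a, M) for m in meshes for a in aoas for M in machs]
    print(len(cases))   # 5 * 16 * 15 = 1200 cases per student
    for mesh, aoa, mach in cases:
        submit_job(mesh, aoa, mach)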

Computing resources needed for classes
Average need: 6 universities x 30 students x (200 to 400 cases) x 20 min/case = 720,000 to 1,440,000 min = 12,000 to 24,000 hours
- At 100% utilization over one week (24 h x 7 days = 168 hours), this requires about 70 to 140 CPUs
Peak need (during a class): 1 university x 30 students x (100 to 200 cases) x 20 min/case = 60,000 to 120,000 min = 1,000 to 2,000 hours
- At 100% utilization during a 6-hour class, this requires about 167 to 330 CPUs
Solving 3-dimensional problems would need even more CPU time.
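
The arithmetic behind these estimates can be reproduced directly; the short calculation below only restates the slide's own inputs (cases per student, 20 minutes per case, a one-week window for the average case and a 6-hour class for the peak case).

    # Reproduces the resource estimates on this slide (all inputs taken from the slide).

    MIN_PER_CASE = 20

    # Average need: 6 universities x 30 students x 200-400 cases, spread over one week.
    avg_minutes = [6 * 30 * c * MIN_PER_CASE for c in (200, 400)]   # 720,000 - 1,440,000 min
    week_hours = 24 * 7                                             # 168 h
    avg_cpus = [m / 60 / week_hours for m in avg_minutes]           # about 71 - 143 CPUs

    # Peak need: 1 university x 30 students x 100-200 cases, finished within a 6-hour class.
    peak_minutes = [30 * c * MIN_PER_CASE for c in (100, 200)]      # 60,000 - 120,000 min
    peak_cpus = [m / 60 / 6 for m in peak_minutes]                  # about 167 - 333 CPUs

    print(avg_cpus, peak_cpus)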

Need More Resources
There are many resources in PRAGMA (over 1,000 CPUs), but several problems keep us from using them:
- Grid authentication problems
- Grid-mapfile typos, e.g. "/C=KR/O=KISTI/O=GRID/O=KISTI/CN= Jongbae Moon". eairs
- CA files not being updated
- Network problems
- SSH access denied; gsi-ssh may be an alternative
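
Many of these failures come down to malformed grid-mapfile entries like the one quoted above (a stray space inside the DN and a period before the local account). A small checker along the following lines, shown here only as an illustrative sketch and not an official Globus tool, can flag the most common typos before users hit them.

    import re

    # Illustrative grid-mapfile typo checker (sketch only).
    # A well-formed entry looks like:  "/C=KR/O=KISTI/.../CN=Jongbae Moon" eairs

    ENTRY_RE = re.compile(r'^"(/[^"]+)"\s+[A-Za-z_][A-Za-z0-9_.-]*$')

    def check_gridmap_line(line):
        """Return a list of problems found in one grid-mapfile line."""
        problems = []
        line = line.strip()
        if not ENTRY_RE.match(line):
            problems.append('not of the form "<DN>" <local-account>')
        dn = re.search(r'"(/[^"]*)"', line)
        if dn and re.search(r'=\s', dn.group(1)):
            problems.append("stray space after '=' inside the DN (e.g. 'CN= Jongbae Moon')")
        return problems

    # The entry quoted on this slide triggers both checks:
    bad = '"/C=KR/O=KISTI/O=GRID/O=KISTI/CN= Jongbae Moon". eairs'
    print(check_gridmap_line(bad))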

Global Scheduler or Resource Broker
e-AIRS needs a resource broker:
- During classes, students submit many jobs at the same time, which causes problems for some sites
- Currently only static resource information is used
SCMSweb + GridWay is a candidate
- "Sugree" can help us
- GridWay would need to be installed on all PRAGMA sites
Alternatively, develop a global scheduler, possibly based on SCMSweb
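
The core idea of such a broker is to spread class-time job bursts over sites according to current load rather than static information. The toy sketch below only illustrates that idea; the site names and free-CPU counts are invented, and it stands in for what SCMSweb monitoring plus a scheduler such as GridWay would actually provide.

    import heapq

    # Toy least-loaded broker: send each job to the site with the most free CPUs.
    # Site names and capacities are invented; a real broker would pull live load
    # figures from monitoring (e.g. SCMSweb) instead of this static table.

    sites = {"site-a": 64, "site-b": 32, "site-c": 128}   # free CPUs per site (hypothetical)

    def dispatch(jobs, free_cpus):
        """Assign each job to the currently least-loaded site (most free CPUs)."""
        heap = [(-cpus, name) for name, cpus in free_cpus.items()]
        heapq.heapify(heap)
        plan = {}
        for job in jobs:
            cpus, name = heapq.heappop(heap)
            plan[job] = name
            heapq.heappush(heap, (cpus + 1, name))   # one fewer free CPU (negated count)
        return plan

    print(dispatch([f"case-{i}" for i in range(6)], sites))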

Authentication Problem
e-AIRS is used by many students:
- It is very difficult to register all of them on the grid
- VOMS is mandatory for e-AIRS, and the PRAGMA VOMS is a good solution
- Each student has their own portal account, but currently they share one account and one grid proxy
- Could short-term user certificates be issued by the PRAGMA CA?
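
One way to avoid every student sharing a single grid proxy is for the portal to fetch a short-lived credential per portal account, for example from a MyProxy server backed by the PRAGMA CA, and then attach the VO attribute. The fragment below is only a sketch of that flow; the server name, VO name, and option values are assumptions and should be checked against the actual deployment.

    import subprocess

    # Sketch: fetch a short-lived proxy for one portal user from a MyProxy server,
    # then add a VOMS attribute for the e-AIRS VO. Host name, VO name, and option
    # values are assumptions, not a description of the actual PRAGMA setup.

    def get_short_lived_proxy(portal_user, myproxy_host="myproxy.example.org",
                              vo="pragma", hours=12):
        # myproxy-logon retrieves a delegated proxy for this user (password prompt not shown).
        subprocess.run(["myproxy-logon", "-s", myproxy_host,
                        "-l", portal_user, "-t", str(hours)], check=True)
        # voms-proxy-init adds VO membership attributes to the retrieved credential.
        subprocess.run(["voms-proxy-init", "-noregen", "-voms", vo,
                        "-valid", f"{hours}:00"], check=True)

    get_short_lived_proxy("student001")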

Thank you for your attention!