Commodity Grid Kits
Gregor von Laszewski (ANL), Keith Jackson (LBL)

Many state-of-the-art scientific applications, such as climate modeling, astrophysics, high energy physics, structural biology, chemistry, and tele-immersive engineering, require the coordinated use of numerous distributed and heterogeneous components, including advanced networks, computers, storage devices, display devices, and scientific instruments. A national collaborative Grid infrastructure of this kind is being developed and supported by DOE, NSF, and NASA. Developing advanced scientific applications for these emerging national-scale computational Grid infrastructures remains a difficult task, however. Although elementary Grid services exist that enable scientific application developers to authenticate, access, manage, and discover sophisticated remote resources, these services are not compatible with the commodity technologies and frameworks used by application scientists today. In addition, many computational scientists lack the expertise needed to deal with such a complex infrastructure.

To address this problem, we are developing Commodity Grid (CoG) Kits that enable scientific application programmers and middleware developers to readily make use of Grid services from a higher-level framework. We focus this effort on the development of two CoG Kits: one for Java and one for Python. Based on component models, these kits encourage collaborative code reuse and avoid duplication of effort among problem solving environments, science portals, Grid middleware, and collaboratory pilot developers. The CoG Kits provide compatibility with the Globus Toolkit, through which terascale Grid facilities can share resources among distributed domains. In Java we developed protocol-compatible components that interface with the Grid infrastructure, enabling us to build sophisticated Web-enabled clients and services. In Python we developed components that wrap precompiled Globus Toolkit libraries. Both promote Grid programming within their respective frameworks, making it possible today to develop interfaces and programs for the Grid in Java and Python.

The CoG Kit technology has been reused in building a number of scientific portals to the Grid; indeed, it has already become the de facto community standard for developing Grid portal applications. One exciting example is the NSF portal for atmospheric simulations, which is being used to analyze the April 1996 tornado that struck Illinois and nearby states (see Figure 1). Other successful applications of our technologies include portals for astrophysical black-hole simulations, portals for structural biology, the SciDAC Earth System Grid, and the SciDAC Particle Physics Data Grid. Furthermore, our technology is an integral part of the development of the current generation of a DOE-wide Science Grid, allowing the larger DOE science community to utilize the emerging DOE-wide Grid infrastructure. CoG Kits have also been incorporated in the Access Grid (Figure 3), an ensemble of resources that enables group-to-group collaborations among interdisciplinary scientists; here the CoG Kit provides an efficient means for authentication and secure communication between rooms. Finally, CoG Kits provide elementary building blocks for developing the next-generation Grid infrastructure as part of upcoming Globus Toolkit releases.
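For illustration only (the poster itself contains no code): a minimal sketch of the kind of protocol-compatible Java component described above, submitting a simple job through the Java CoG Kit GRAM client in the org.globus.gram package. The gatekeeper contact grid.example.org is a placeholder, and credential handling and status callbacks are omitted.

  // Minimal sketch: submit a remote job via the Java CoG Kit GRAM client.
  import org.globus.gram.GramJob;
  import org.globus.gram.GramException;
  import org.ietf.jgss.GSSException;

  public class SubmitDate {
      public static void main(String[] args) {
          // RSL string describing a trivial job: run /bin/date on the remote resource.
          GramJob job = new GramJob("&(executable=/bin/date)");
          try {
              // Authenticate with the user's Grid proxy and hand the job to the
              // remote GRAM gatekeeper; status updates arrive asynchronously.
              job.request("grid.example.org");   // placeholder gatekeeper contact
              System.out.println("Job submitted to grid.example.org");
          } catch (GramException | GSSException e) {
              System.err.println("Submission failed: " + e.getMessage());
          }
      }
  }

The same pattern, wrapped behind portal and workflow interfaces, is what allows the portals mentioned above to submit and monitor work on remote Grid resources without exposing users to the underlying protocols.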
The SciDAC project has enabled us to work closely as an interdisciplinary team of scientists and technology providers across a variety of SciDAC projects. Interdisciplinary interactions, access to supercomputing facilities, and new collaborative uses of sophisticated resources are among the results of our group efforts. In the coming years, additional steps are needed that will allow us to use our basic middleware as part of high-end scientific collaborative environments while reusing the methodologies and frameworks that make the construction of increasingly sophisticated collaborative environments possible. We will continue to collaborate with scientists to make the use of terascale facilities transparent to the scientist.

Summary of Available Tools and Technologies

Java
– Globus protocol-compatible toolkit
– Example GUI
– De facto standard for portal developers
– GSS-based security for OGSA (see the credential sketch at the end of this page)
Third party
– CCA Toolkit (XCAT)
Python
– Globus API-compatible toolkit
– GSS-based security for OGSA
– Example GUIs
– Access Grid 2.0 controller based on the Python CoG Kit
Tutorials are available for both kits.

Objectives
– Allow application developers to make use of Grid services from higher-level frameworks such as Java and Python.
– Easier development of advanced Grid services.
– Easier and more rapid application development.
– Encourage code reuse and avoid duplication of effort.
– Encourage the reuse of Web Services as part of Grids (now part of a separate proposal).

Collaborations and Interactions with Other Projects

SciDAC
– Middleware technologies to support science portals
– CMCS pilot
– Middleware to support group-to-group collaboration
– DOE Science Grid
– Particle Physics Data Grid Collaborative Pilot
– Earth System Grid II
Others
– Alliance Expeditions (astrophysics, climatology, computational biology, BioCore)
– GridLab
– Access Grid
– OGSA
There are more; technology developed as part of the CoG Kit project is used in most projects that access Globus Toolkit services from Java or Python.

For further information on this subject contact:
– Dr. Gregor von Laszewski, Principal Investigator, Argonne National Laboratory, Mathematics and Computer Science Division. Phone:
– Keith R. Jackson, Principal Investigator, Lawrence Berkeley National Laboratory, Distributed Systems Department. Phone:
– Mary-Anne Scott:

Figure 1: Our technology is already being used by researchers outside of DOE, for example in this sophisticated Grid portal for atmospheric simulations (NCSA WARF).
Figure 2: The Argonne climate group uses CoG Kits to access Grid resources.
Figure 3: The Access Grid uses the Python CoG Kit (pyGlobus) as a component to enable scientific collaborative sessions.
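As a companion to the GSS-based security items in the tools summary above, the following hedged sketch (class names taken from the Java CoG Kit's JGSS bindings; exact package layout may differ across releases) shows how an application might load the user's default Grid proxy credential, the same credential used for mutual authentication by GRAM, GridFTP, and Access Grid sessions.

  // Minimal sketch: load the default GSI proxy credential via the JGSS bindings.
  import org.gridforum.jgss.ExtendedGSSManager;
  import org.ietf.jgss.GSSCredential;
  import org.ietf.jgss.GSSException;
  import org.ietf.jgss.GSSManager;

  public class LoadProxy {
      public static void main(String[] args) throws GSSException {
          // The extended manager locates the proxy created by grid-proxy-init.
          GSSManager manager = ExtendedGSSManager.getInstance();
          GSSCredential cred =
              manager.createCredential(GSSCredential.INITIATE_AND_ACCEPT);
          // The credential can then be attached to GRAM or GridFTP clients.
          System.out.println("Proxy valid for " + cred.getRemainingLifetime() + " s");
      }
  }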