Foundations for an LHC Data Grid
Stu Loken, Berkeley Lab

The Message
Large-scale distributed computing (known as Grids) is a major thrust of the U.S. computing community
Annual investment in Grid R&D and infrastructure is ~$100M
This investment can and should be leveraged to provide the regional computing model for the LHC

The Vision for the Grid
Persistent, Universal and Ubiquitous Access to Networked Resources
Common Tools and Infrastructure for Building 21st-Century Applications
Integrating HPC, Data-Intensive Computing, Remote Visualization and Advanced Collaboration Technologies

The Grid from a Services View
Resource-specific implementations of basic services:
–E.g., transport protocols, name servers, differentiated services, CPU schedulers, public key infrastructure, site accounting, directory service, OS bypass
Resource-independent and application-independent services:
–E.g., authentication, authorization, resource location, resource allocation, events, accounting, remote data access, information, policy, fault detection
[Layer diagram, bottom to top: Grid Fabric (Resources); Grid Services (Middleware); Application Toolkits (Distributed Computing, Data-Intensive, Collaborative, Remote Visualization, Problem Solving, Remote Instrumentation); Applications (Chemistry, Biology, Cosmology, High Energy Physics, Environment)]
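To make the layering concrete, here is a minimal Python sketch of a resource- and application-independent middleware service sitting on top of resource-specific implementations. All class, job, and site names are hypothetical illustrations, not an actual Globus or Grid middleware API.

    # Minimal sketch of the layered services view. All names here are
    # hypothetical illustrations, not an actual Globus or middleware API.
    from abc import ABC, abstractmethod

    class CpuScheduler(ABC):
        """Resource-specific basic service: each site supplies its own."""
        @abstractmethod
        def submit(self, command: str) -> str: ...

    class PbsScheduler(CpuScheduler):
        def submit(self, command: str) -> str:
            return f"pbs-job-42: {command}"      # stand-in for a real PBS call

    class CondorScheduler(CpuScheduler):
        def submit(self, command: str) -> str:
            return f"condor-job-7: {command}"    # stand-in for a real Condor call

    class GridAllocator:
        """Resource- and application-independent middleware: one allocation
        interface over many site-specific schedulers."""
        def __init__(self, schedulers: dict[str, CpuScheduler]):
            self.schedulers = schedulers

        def allocate(self, site: str, command: str) -> str:
            # Resource location and allocation collapse to a lookup here; a
            # real Grid service would also authenticate and apply policy.
            return self.schedulers[site].submit(command)

    grid = GridAllocator({"anl": PbsScheduler(), "wisc": CondorScheduler()})
    print(grid.allocate("wisc", "run-analysis"))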

Grid-based Computing Projects
China Clipper
Particle Physics Data Grid
NASA Information Power Grid: Distributed Problem Solving
Access Grid: The Future of Distributed Collaboration

Clipper Project
ANL-SLAC-Berkeley
Pushes the limits of very high-speed data transmission
Builds on Globus middleware and high-performance distributed storage
Demonstrated data rates up to 50 Mbytes/sec
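For scale, 50 Mbytes/sec is about 400 Mbit/s, a large fraction of an OC-12 link's 622 Mbit/s. Rates like this rest on striping reads across parallel storage servers so the aggregate bandwidth exceeds any single stream. A Python sketch of the striping idea follows; the server names are hypothetical and this is not the actual DPSS client interface.

    # Illustration of striping a read across parallel storage servers so
    # aggregate bandwidth exceeds any single stream; hypothetical server
    # names, not the actual DPSS client interface.
    import concurrent.futures

    BLOCK_SIZE = 8 * 1024 * 1024             # 8 MB blocks

    def fetch_block(server: str, offset: int) -> bytes:
        # Stand-in for a network read of one block from one server.
        return bytes(BLOCK_SIZE)

    def striped_read(servers: list[str], n_blocks: int) -> bytes:
        """Fetch blocks round-robin across servers, in parallel."""
        with concurrent.futures.ThreadPoolExecutor(len(servers)) as pool:
            futures = [pool.submit(fetch_block,
                                   servers[i % len(servers)],
                                   i * BLOCK_SIZE)
                       for i in range(n_blocks)]
            return b"".join(f.result() for f in futures)

    data = striped_read(["dpss1.lbl.gov", "dpss2.lbl.gov"], n_blocks=4)
    print(len(data) / 1e6, "MB assembled")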

China Clipper Tasks
High-Speed Testbed
–Computing and networking infrastructure
Differentiated Network Services
–Traffic shaping on ESnet
Monitoring Architecture
–Traffic analysis to support traffic shaping and CPU scheduling
Data Architecture
–Transparent management of data
Application Demonstration
–Standard Analysis Framework (STAF)

China Clipper Testbed

Clipper Architecture

Monitoring
End-to-end monitoring of the assets in a computational grid is necessary both for resolving network throughput problems and for dynamically scheduling resources.
China Clipper adds precision-timed event monitor agents to:
–ATM switches
–DPSS servers
–Testbed computational resources
Produce trend analysis modules for monitor agents
Make results available to applications
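As one way to picture the precision-timed event agents, here is a minimal Python sketch of an agent emitting timestamped monitoring events, in the spirit of (but not identical to) LBNL's NetLogger approach; the event and field names are illustrative.

    # Minimal sketch of a precision-timed event monitor agent, in the
    # spirit of (not identical to) NetLogger-style monitoring; the event
    # and field names are illustrative.
    import json
    import time

    def emit_event(event: str, host: str, **fields) -> str:
        """Emit one monitoring event with a high-resolution timestamp."""
        record = {"ts": time.time_ns() / 1e9, "event": event,
                  "host": host, **fields}
        line = json.dumps(record)
        print(line)        # a real agent would ship this to a collector
        return line

    # Bracketing a transfer with start/end events lets trend-analysis
    # modules reconstruct per-component latency and throughput.
    emit_event("transfer.start", host="dpss1.lbl.gov", file="run42.dat")
    time.sleep(0.01)
    emit_event("transfer.end", host="dpss1.lbl.gov",
               file="run42.dat", nbytes=50_000_000)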

Monitoring

Particle Physics Data Grid
HENP labs and universities (Caltech-SLAC lead)
Extends the Grid concept to large-scale distributed data analysis
Uses NGI testbeds as well as production networks
Funded by the DOE NGI program

NGI: “Particle Physics Data Grid”
ANL (CS/HEP), BNL, Caltech, FNAL, JLAB, LBNL (CS/HEP), SDSC, SLAC, U. Wisconsin
High-Speed Site-to-Site File Replication Service
FIRST YEAR:
–SLAC-LBNL at least
–Goal intentionally requires > OC12
–Use existing hardware and networks (NTON)
–Explore “DiffServ”, instrumentation, reservation/allocation
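As a rough illustration of what a site-to-site replication service must do (copy, verify, register), here is a Python sketch. The checksum choice, file names, and the in-memory catalog are placeholders, not the project's actual design.

    # Sketch of a site-to-site replication step: copy, verify by checksum,
    # then register the replica. Paths and the in-memory catalog are
    # placeholders for a real replica-catalog service.
    import hashlib
    import shutil
    from pathlib import Path

    replica_catalog: dict[str, list[str]] = {}

    def checksum(path: Path) -> str:
        h = hashlib.md5()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):  # 1 MB chunks
                h.update(chunk)
        return h.hexdigest()

    def replicate(src: Path, dst: Path, logical_name: str) -> None:
        shutil.copyfile(src, dst)
        if checksum(src) != checksum(dst):
            raise IOError(f"replica of {logical_name} is corrupt")
        replica_catalog.setdefault(logical_name, []).append(str(dst))

    Path("run42.dat").write_bytes(b"event data")          # toy source file
    replicate(Path("run42.dat"), Path("run42.copy"), "lfn:run42")
    print(replica_catalog)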

NGI: “Particle Physics Data Grid”
Deployment of Multi-Site Cached File Access
FIRST YEAR:
–Read access only
–Optimized for 1-10 GB files
–File-level interface to ODBMSs
–Maximal use of Globus, MCAT, SAM, OOFS, Condor, Grand Challenge, etc.
–Focus on getting users
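The cached-access model is easy to illustrate: the first read of a logical file triggers one wide-area fetch, and all later reads are served from local disk. A minimal read-only Python sketch, with a hypothetical fetch function standing in for the real transfer machinery:

    # Read-only cached file access: first read triggers one wide-area
    # fetch, later reads are served from local disk. fetch_from_site is a
    # hypothetical stand-in for the project's real transfer machinery.
    from pathlib import Path

    CACHE_DIR = Path("ppdg_cache")           # illustrative local cache dir

    def fetch_from_site(logical_name: str) -> bytes:
        return b"event data"                 # stand-in for a WAN transfer

    def cached_read(logical_name: str) -> bytes:
        local = CACHE_DIR / logical_name
        if not local.exists():               # cache miss: one remote fetch
            CACHE_DIR.mkdir(parents=True, exist_ok=True)
            local.write_bytes(fetch_from_site(logical_name))
        return local.read_bytes()            # subsequent reads stay local

    print(len(cached_read("run42.dat")), "bytes (fetched)")
    print(len(cached_read("run42.dat")), "bytes (from cache)")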

Information Power Grid
Distributed High-Performance Computing, Large-Scale Data Management, and Collaboration Environments for Science and Engineering
Building Problem-Solving Environments
William E. Johnston, Dennis Gannon, William Nitzberg

IPG Problem Environment

IPG Requirements
Multiple datasets
Complex workflow scenarios
Data streams from instrument systems
Sub-component simulations coupled simultaneously
Appropriate levels of abstraction
Search, interpret and fuse multiple data archives
Share all aspects of work processes
Bursty resource availability and scheduling
Sufficient available resources
VR and immersive techniques
Software agents to assist in routine/repetitive tasks
All of this will be supported by the Grid. PSEs (problem-solving environments) are the primary scientific/engineering user interface to the Grid.
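One requirement worth unpacking is simultaneously coupled sub-component simulations: a PSE must orchestrate codes that exchange boundary values every step. A toy Python sketch of that coupling pattern; the two "models" are arbitrary placeholder arithmetic, not real simulation codes.

    # Toy illustration of simultaneously coupled sub-component simulations:
    # two placeholder "models" exchange boundary values each step. Real
    # PSE-managed codes would be full simulations on separate resources.
    def airframe_step(state: float, coupling: float) -> float:
        return 0.9 * state + 0.1 * coupling

    def engine_step(state: float, coupling: float) -> float:
        return 0.8 * state + 0.2 * coupling

    airframe, engine = 1.0, 0.0
    for step in range(5):
        # Each step consumes the other model's previous output.
        airframe, engine = (airframe_step(airframe, engine),
                            engine_step(engine, airframe))
        print(f"step {step}: airframe={airframe:.3f} engine={engine:.3f}")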

The Future of Distributed Collaboration Technology: The Access Grid
Ian Foster, Rick Stevens
Argonne National Laboratory

Beyond Teleconferencing
Physical spaces to support distributed groupwork
Virtual collaborative venues
Agenda-driven scenarios and work sessions
Integration with Integrated Grid Services

Access Grid Project Goals
Enable Group-to-Group Interactions at a Distance
Provide a Sense of Presence
Use Quality but Affordable Digital IP-Based Audio/Video (Open Source)
Enable Complex Multi-Site Visual and Collaborative Experiences
Build on the Integrated Grid Services Architecture
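The audio/video goals rest on IP multicast, which lets media sent by one site reach every node that has joined the venue's group address. A minimal Python sketch of joining a multicast group follows; the group address and port are illustrative, and a real Access Grid node hands the received RTP packets to tools such as vic and rat rather than printing them.

    # Sketch of the multicast idea behind Access Grid venues: every node
    # joins a shared group address, so media sent by one site reaches all.
    # The group/port here are illustrative, not a real venue address.
    import socket
    import struct

    GROUP, PORT = "224.2.2.2", 50000

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", PORT))

    # Ask the kernel to join the venue's multicast group on all interfaces.
    mreq = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

    # A real node would feed received packets to audio/video tools;
    # here we just report one arrival.
    data, sender = sock.recvfrom(2048)
    print(f"received {len(data)} bytes of media from {sender}")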

The Docking Concept for Access Grid
Private workspaces, “docked” into the group workspace

Access Grid Nodes
[Node diagram: per-site A/V components include an ambient mic (tabletop), presenter mic, presenter camera, and audience camera]
Under Development
–Library, Workshop
–ActiveMural Room
–Office
–Auditorium

Conclusion
A set of closely coordinated projects is laying the foundation for a high-performance distributed computing environment.
There appear to be good prospects for a significant long-term investment to deploy the infrastructure needed to support particle physics data analysis.