Presentation transcript: "Information Technology and Computing Infrastructure for U.S. LHC Physics" (Lothar A.T. Bauerdick, Fermilab; Meeting with the DOE, Apr 18, 2003)

Slide 1: Information Technology and Computing Infrastructure for U.S. LHC Physics
Lothar A.T. Bauerdick, Fermilab
Project Manager, U.S. CMS Software and Computing

Slide 2: LHC Physics Discovery through Information Technology and Computing Infrastructure
- LHC computing is unprecedented in scale and complexity (and cost)
- It calls for an advanced, coherent, global "information infrastructure"
- Partnerships: international, interdisciplinary, inter-agency!

Slide 3: LHC Physics Discoveries by Researchers at U.S. Universities and Labs
- U.S. LHC is committed to empowering U.S. scientists to do research on LHC physics data
- This is why we are interested in the Grid as an enabling technology

Slide 4: Distributed Grid Computing and Data Model
- Adopted as the LHC baseline model: distributed communities, distributed resources
- Coherent global data access, analysis, and management
- Major successes and advances of Grid infrastructure in the U.S.
  - R&D and testbeds: prototyping and hardening of Grid infrastructure, deploying Grid tools, developing and integrating Grid applications
  - Example: the US ATLAS Grid Testbed

Slide 5: Tiered System of Regional Centers
[Diagram: the LHC experiment online system feeds the CERN Computer Center (Tier 0, > 20 TIPS); below it sit Tier-1 regional centers (USA, Japan, France, UK), Tier-2 centers, Tier-3 institute servers, and Tier-4 physics caches, PCs, and other portals. Link bandwidths in the figure run from ~2.5 Gbit/s near the top of the hierarchy down to ~0.6 Gbit/s and MByte/s rates toward the institutes.]
- Developing further the hierarchically organized fabric of "Grid Nodes" …
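The tier hierarchy is naturally modeled as a tree of centers, each with an uplink toward its parent tier. Below is a minimal C++ sketch of that structure; the tier numbering follows the diagram, but the site names, bandwidth values, and data layout are illustrative assumptions, not planning figures.

    #include <iostream>
    #include <string>
    #include <vector>

    // Minimal sketch of the tiered "Grid node" fabric from the slide.
    // Bandwidth and site values below are illustrative assumptions.
    struct GridNode {
        std::string name;
        int tier;                        // 0 = CERN, 1 = regional center, ...
        double uplinkGbps;               // bandwidth toward the parent tier
        std::vector<GridNode*> children;
    };

    // Print the fabric as an indented tree.
    void dump(const GridNode& n, int depth = 0) {
        std::cout << std::string(2 * depth, ' ')
                  << "Tier " << n.tier << ": " << n.name
                  << " (uplink " << n.uplinkGbps << " Gbit/s)\n";
        for (const GridNode* c : n.children) dump(*c, depth + 1);
    }

    int main() {
        GridNode institute{"Institute", 3, 0.6, {}};
        GridNode tier2{"University Tier-2 Center", 2, 1.0, {&institute}};
        GridNode tier1{"U.S. Tier-1", 1, 2.5, {&tier2}};
        GridNode cern{"CERN Computer Center", 0, 0.0, {&tier1}};
        dump(cern);
    }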

Slide 6: Transition to a Production-Quality Grid
- Centers taking part in the LHC Grid 2003 production service
- Around the world and around the clock!

Slide 7: … Towards Dynamic Workspaces for Scientists
- Communities of scientists working locally within a global context
- Infrastructure for sharing and for keeping physics data, calibration data, and software consistent
- New IT is needed!

Slide 8: The LHC Research Program Has Started Strongly!
- Sizable R&D efforts and major investments in Tier-1 centers have started
- First Grid infrastructure is in place, in collaboration with the LHC Computing Grid Project at CERN and elsewhere
- U.S. LHC scientists will profit in major ways
  - Develop the strong U.S. LHC environment for physics analysis
- Address the core issues in U.S. LHC Software and Computing:
  - Developing and implementing the distributed computing model, central to the success of U.S. university participation
  - Focus on end-to-end services
  - Focus on distributed data access and management
- Work with Grid projects such as PPDG, the NSF projects, and the DOE Science Grid; work with CERN and other centers around the world to set up a global information infrastructure ("info-structure") that enables U.S. scientific discovery at the LHC

Slide 9: The Goal
- Provide capabilities to individual physicists and communities of scientists that allow them:
  - To participate as equals in LHC research
  - To be fully represented in the global experiment enterprise
  - To receive on demand whatever resources and information they need to explore their science interests, while respecting collaboration-wide priorities and needs
- Provide massive computing, storage, and networking resources
  - Including "opportunistic" use of resources that are not LHC-owned!
- Provide full access to dauntingly complex "meta-data"
  - Which must be kept consistent to make sense of the event data
- Provide a collaborative environment and info-systems

Slide 10: These Goals Require Substantial R&D
- Global access to, and global management of, massive and complex data
- Location transparency of complex processing environments and of vast data collections
- Virtual data, workflow, and knowledge-management technologies
- Monitoring, simulation, scheduling, and optimization on a heterogeneous Grid of computing facilities and networks
- End-to-end networking performance and application integration
- Management of virtual organizations across the Grid; technologies and services for security, privacy, and accounting
- Scientific collaboration over distance
- Etc. …
- Grids are the enabling technology; LHC needs are pushing the limits
- Technology and architecture are still evolving; new research and development in IT is required

Slide 11: U.S. LHC Grid Technology Cycles
- "Rolling prototypes": evolution of the facility and data systems
- Prototyping and early roll-out to production-quality services
- Participation in computing and physics data challenges
- Emphasis on quality, documentation, dissemination, and tracking of external "practices"

Slide 12: Grid Testbeds and Production Grids
[Map: testbed sites, with expressions of interest from MIT, Rice, Minnesota, Iowa, Princeton, Brazil, and South Korea.]
- Grid testbeds: research, development, and dissemination!
  - The LHC Grid testbeds are among the first real-life large Grid installations, and they are becoming production quality
  - Strong partnership between labs and universities, with Grid projects (iVDGL, GriPhyN, PPDG) and middleware projects (Condor, Globus)
  - Strong dissemination component, together with the Grid projects
- E.g. the U.S. CMS testbed: Caltech, UCSD, U. Florida, UW Madison, Fermilab, CERN

Slide 13: Example: Grid Monitoring and Information Services
- MonALISA monitoring system (Caltech) deployed on the U.S. and CERN Grid testbeds
- Dynamic information services and Grid resource-discovery mechanisms using "intelligent agents"
- Use and further develop novel Grid technologies and Grid interfaces
  - A "Grid control room" for the LHC Grid
  - Technology driver for other projects
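The core pattern behind such information services is that site agents publish resource metrics and clients discover resources by querying a registry. Here is a minimal C++ sketch of that pattern, with hypothetical class and field names; it illustrates the idea and is not the MonALISA API.

    #include <iostream>
    #include <map>
    #include <string>
    #include <vector>

    // Hypothetical sketch of a monitoring/information service:
    // site agents publish resource metrics to a registry, and
    // clients discover resources by querying it. All names are
    // illustrative, not the MonALISA interfaces.
    struct ResourceReport {
        std::string site;     // e.g. "Caltech", "FNAL"
        int freeCpus = 0;
        double freeDiskTB = 0.0;
    };

    class Registry {
        std::map<std::string, ResourceReport> latest_;
    public:
        // An agent pushes its most recent measurement.
        void publish(const ResourceReport& r) { latest_[r.site] = r; }

        // Discovery: return all sites with at least `cpus` free CPUs.
        std::vector<std::string> discover(int cpus) const {
            std::vector<std::string> hits;
            for (const auto& [site, rep] : latest_)
                if (rep.freeCpus >= cpus) hits.push_back(site);
            return hits;
        }
    };

    int main() {
        Registry reg;
        reg.publish({"Caltech", 120, 4.0});
        reg.publish({"FNAL", 30, 12.5});
        for (const auto& s : reg.discover(100))
            std::cout << s << " has >= 100 free CPUs\n";
    }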

Slide 14: Distributed Collaborative Engineering
- "Projectization" is essential for a software and computing effort of this complexity
  - Requires expert manpower and engineering
  - Physics and detector groups at universities are the first to profit from this
- Example: ATLAS detector geometry description databases
  - Idea and concept
    - Geometry modeller based on CDF [U. Pittsburgh]
  - Massive development effort
    - NOVA MySQL database [BNL]: repository of persistent configuration information
    - NOVA service [ANL]: retrieval of transient C++ objects from the NOVA database
    - Conditions database service [ANL/Lisbon]: access to time-varying information based on type, time, version, and key; used in conjunction with other persistency services (e.g. the NOVA service)
    - Interval-of-validity service [LBNL]: registration of clients; retrieval of updated information when validity expires; caching-policy management
  - Released as scheduled to detector and physics groups
    - Prototyped at the silicon alignment workshop in December 2002
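The conditions database and interval-of-validity services described above support lookups of time-varying data: a query at time t returns the payload whose validity interval contains t. Here is a minimal C++ sketch of that lookup, assuming a single folder of non-overlapping intervals; the names are hypothetical and are not the actual ANL/Lisbon or LBNL interfaces.

    #include <iostream>
    #include <map>
    #include <optional>
    #include <string>

    // Hypothetical sketch of a conditions/interval-of-validity lookup.
    // Each payload is valid for a time interval [since, until); a query
    // at time t returns the payload whose interval contains t.
    struct IOVPayload {
        long since = 0;          // start of validity (run/event or timestamp)
        long until = 0;          // end of validity (exclusive)
        std::string blob;        // e.g. serialized alignment constants
    };

    class ConditionsFolder {
        // Keyed by interval start; assumes non-overlapping intervals.
        std::map<long, IOVPayload> intervals_;
    public:
        void store(const IOVPayload& p) { intervals_[p.since] = p; }

        // Find the payload valid at time t, if any.
        std::optional<IOVPayload> find(long t) const {
            auto it = intervals_.upper_bound(t); // first interval starting after t
            if (it == intervals_.begin()) return std::nullopt;
            --it;                                // candidate containing t
            if (t < it->second.until) return it->second;
            return std::nullopt;
        }
    };

    int main() {
        ConditionsFolder alignment;              // one (type, version, key) folder
        alignment.store({0, 1000, "alignment v1"});
        alignment.store({1000, 2000, "alignment v2"});
        if (auto p = alignment.find(1500))
            std::cout << "valid at t=1500: " << p->blob << "\n";
    }

A client-registration layer on top of this, as the slide describes for the LBNL service, would notify cached clients when the interval they hold expires.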

Slide 15: Example: Detector Description
[Images: detail from the TRT; detail from the barrel liquid argon calorimeter (parameterized, 40 kB in memory).]
- Geometry modeller, database, visualization, optimization

Slide 16: "Work Packages" for LHC Computing
- Facilities and fabric infrastructure
  - U.S. Tier-1 and Tier-2 centers, U.S. university infrastructure
- Distributed computing infrastructure
  - Networks, throughput, servers, catalogs
- Grid services
  - Middleware, "virtual organization" support, end-to-end and higher-level services, troubleshooting and fault tolerance, distributed science environment
- Experiment-specific software
  - Core software, frameworks, architectures, applications, physics and detector support
- Collaboratory tools and support
  - Communication, conferencing, sharing, virtual control room
- Support services
  - Training, info services, help desk

Slide 17: "Map" of Grid Projects Directly Related to the LHC
- E.g. the U.S. Particle Physics Data Grid (PPDG), funded by SciDAC
- The Grid projects are a large international effort

Slide 18: Partnership for a Global Infostructure
- Physics + computer science/information technology + funding agencies
- (I. Gaines, four-agency meeting at CERN, March 21, 2003)

Slide 19: DOE Labs Have a Great Impact on U.S. LHC Science
- Inter-agency partnership between the DOE-funded Tier-1 and NSF-funded Tier-2 efforts
- The Tier-1 centers at Fermilab and BNL provide the major Grid infrastructure expertise and 24x7 support
- Information technology and computing infostructure for LHC physics discovery requires research, development, deployment, and dissemination, and sustained, reliable running of facilities and support services!

Slide 20: The U.S. LHC Mission Is Physics Discovery at the Energy Frontier!
- This partnership takes advantage of the significant strengths of U.S. labs and universities in CS and IT
- It exploits the synergy between U.S. universities and national labs, between software professionals and physicists, and between computer scientists and high-energy physicists
- The LHC is among the first to put a truly distributed "info-structure" in place, spearheading important innovations in how we do science

