High Energy Physics and Grids at UF (Dec. 13, 2002)
Paul Avery, University of Florida


Slide 1: High Energy Physics and Grid Projects at the University of Florida
Paul Avery, University of Florida
Dell Visit to the University of Florida, Dec. 13, 2002

Slide 2: High Energy Physics
HEP data:
- Collected at large facilities: Fermilab, Brookhaven, CERN
- Records collisions of oppositely moving beams
- Each collision is stored and analyzed independently
- 100M to 1000M collisions collected per year

Slide 3: Today: High Energy Physics at Fermilab
- CDF experiment
- An international experiment: 600 physicists from several countries

Slide 4: Tomorrow: High Energy Physics at the LHC
The "Compact" Muon Solenoid (CMS) at the LHC (CERN), pictured with the Smithsonian "standard man" for scale.

Slide 5: CMS Data Complexity
"Events" resulting from beam-beam collisions:
- The signal event is obscured by ~20 overlapping, uninteresting collisions in the same crossing (1 MB stored per event)
- The CPU time needed to analyze each event rises dramatically
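As a back-of-the-envelope illustration (not on the slides), combining the 1 MB/event figure above with the 100M to 1000M collisions per year from Slide 2 gives the raw-data scale; a minimal Python sketch:

```python
# Rough annual raw-data volume, assuming ~1 MB stored per event (Slide 5)
# and 100M-1000M events collected per year (Slide 2).
MB = 1e6  # bytes
event_size_bytes = 1 * MB

for events_per_year in (100e6, 1000e6):
    volume_bytes = events_per_year * event_size_bytes
    print(f"{events_per_year:.0e} events/yr -> {volume_bytes / 1e15:.1f} PB of raw data")
# -> roughly 0.1 PB/yr at 100M events, 1 PB/yr at 1000M events
```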

Slide 6: CMS Analysis: Higgs Decay to 4 Muons
Roughly 10^9 events/sec, with a selectivity of about 1 in 10^13 for the Higgs signal.

Slide 7: LHC Computing Challenges
(Physicists from 150 institutes in 32 countries)
- Complexity: millions of detector channels, complex events
- Scale: PetaOps of CPU, Petabytes of data
- Distribution: global distribution of people and resources

Slide 8: Global LHC Data Grid (tiered architecture diagram; experiment example: CMS)
- Tier 0: the online system feeding the CERN Computer Center (> 20 TIPS), linked at MBytes/s rates
- Tier 1: national centers (USA, Korea, Russia, UK), linked at 2.5 Gbits/s
- Tier 2: regional centers, including a Florida Tier2 Center, linked at ~2.5 Gbits/s
- Tier 3: institute servers, linked at ~0.6 Gbits/s
- Tier 4: physics caches, PCs, and other portals
- Resource ratio: Tier0 : (sum of Tier1) : (sum of Tier2) ~ 1:1:1
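To give a feel for what those link speeds imply, here is a small illustrative Python sketch (the 1 TB dataset size and the tier-to-link mapping are assumptions layered on the reconstruction above, not figures from the slide) estimating how long a dataset takes to cross each link:

```python
# Illustrative only: time to move a 1 TB dataset over each tier-to-tier
# link speed quoted on the slide (raw link rate, no protocol overhead).
link_speeds_gbps = {
    "Tier 0 -> Tier 1": 2.5,
    "Tier 1 -> Tier 2": 2.5,
    "Tier 2 -> Tier 3": 0.6,
}

dataset_bits = 1e12 * 8  # 1 TB in bits

for link, gbps in link_speeds_gbps.items():
    hours = dataset_bits / (gbps * 1e9) / 3600
    print(f"{link}: ~{hours:.1f} h per TB at {gbps} Gbits/s")
# ~0.9 h per TB at 2.5 Gbits/s; ~3.7 h per TB at 0.6 Gbits/s
```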

Slide 9: Florida Tier2 Center (2003)
Diagram: a "hierarchical" switching topology; a router on the WAN feeds a GEth switch, which connects GEth/FEth switches for the worker nodes and a data server with more than one RAID array.
Florida in 2003:
- 300 CPUs (> 2.5 GHz Pentium 4)
- 7 TBytes of RAID

Slide 10: "Trillium": US Data Grid Projects
- GriPhyN: Grid research and toolkits ($12M, 15 institutions)
- iVDGL: deploying a global Grid laboratory ($14M, 17 institutions)
- PPDG: a Data Grid for HEP experiments ($9.5M, 12 institutions)
Common threads: data-intensive experiments; physicists plus computer scientists; infrastructure development and deployment. Florida leads GriPhyN and iVDGL.

Slide 11: Goal: PetaScale Virtual-Data Grids (architecture diagram)
- Users (production teams, individual investigators, workgroups) work through interactive user tools
- Tool layers: virtual data tools; request planning and scheduling tools; request execution and management tools
- Supporting services: resource management services; security and policy services; other Grid services
- Beneath these sit transforms, raw data sources, and distributed resources (code, storage, CPUs, networks)
- Scale: ~1 Petaflop of computing and ~100 Petabytes of data
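The "virtual data" idea behind this layering is that a derived data product is described by the transformation and inputs that produce it, so a request can be served either from storage or by re-executing the recipe. A minimal, hypothetical Python sketch of that lookup logic (the catalog structure and all names are illustrative, not the actual GriPhyN virtual data catalog interface):

```python
# Toy virtual-data catalog: each derived product records the transformation
# and inputs that produce it.  A request is served from storage when the
# product is materialized; otherwise its recipe is handed to the planner.
catalog = {
    "higgs_4mu_ntuple": {                       # hypothetical product name
        "transformation": "simulate_and_reconstruct",
        "inputs": ["generator_config_v1"],
        "materialized_at": None,                # e.g. a storage URL once produced
    },
}

def plan_request(product: str) -> str:
    entry = catalog[product]
    if entry["materialized_at"]:
        return f"fetch {product} from {entry['materialized_at']}"
    return (f"run {entry['transformation']} on {entry['inputs']}, "
            f"then register and return the output")

print(plan_request("higgs_4mu_ntuple"))
```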

Slide 12: US-iVDGL Data Grid (Spring 2003)
Map of Tier1, Tier2, and Tier3 sites: UF, Wisconsin, Fermilab, BNL, Indiana, Boston U, SKC, Brownsville, Hampton, PSU, J. Hopkins, Caltech, FIU, FSU, Arlington, Michigan, LBL, Oklahoma, Argonne, Vanderbilt, UCSD/SDSC, NCSA.

Slide 13: Florida-Led US-CMS Testbed

Slide 14: CMS Production Simulations (workflow diagram)
- The master site runs IMPALA and mop_submitter, which drive DAGMan and Condor-G; GridFTP moves data between the master and remote sites
- Remote sites 1 through N each provide a batch queue and GridFTP
- Several productions were run in 2002, with sites in the US and Europe
- The effort uncovered many Grid problems; a 1M-event production is almost complete
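In essence, the MOP/DAGMan/Condor-G chain fans independent simulation jobs out to the remote batch queues and follows each with a GridFTP stage-out step. A toy Python sketch that writes a DAGMan-style DAG for such a fan-out (the site names, file names, and submit descriptions are placeholders, not actual IMPALA/MOP output):

```python
# Write a toy DAGMan-style DAG: simulation jobs fanned across remote sites,
# each followed by a (placeholder) stage-out job for the GridFTP transfer.
sites = ["remote_site_1", "remote_site_2"]   # placeholder site names
jobs_per_site = 2

lines = []
for site in sites:
    for i in range(jobs_per_site):
        sim, xfer = f"sim_{site}_{i}", f"xfer_{site}_{i}"
        lines.append(f"JOB {sim} {sim}.sub")        # Condor-G submit file (not shown)
        lines.append(f"JOB {xfer} {xfer}.sub")
        lines.append(f"PARENT {sim} CHILD {xfer}")  # stage out only after simulation

with open("production.dag", "w") as dag:
    dag.write("\n".join(lines) + "\n")

print(f"wrote production.dag with {len(sites) * jobs_per_site} simulation chains")
```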

Slide 15: WorldGrid
- Joint US-Europe effort, with resources from both sides (15 sites)
- Uses several visualization tools (Nagios, MapCenter, Ganglia) and several monitoring tools (Ganglia, MDS, NetSaint, ...)
- Applications: CMS (CMKIN, CMSIM) and ATLAS (ATLSIM)
- Jobs are submitted from the US or the EU and can run on any cluster (a site-selection sketch follows below)
- Demonstrated at IST2002 (Copenhagen) and at SC2002 (Baltimore)
- Brochures available describing the Grid projects
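Letting jobs "run on any cluster" implies a site-selection step driven by the monitoring data. A hypothetical Python sketch of such a choice, using made-up load figures rather than real Ganglia/MDS output:

```python
# Pick a WorldGrid site for the next CMKIN/CMSIM or ATLSIM job by choosing
# the site with the most free CPUs (a stand-in for a real resource broker).
monitoring_snapshot = {            # made-up numbers, not real monitoring data
    "us_tier2_florida": {"cpus": 300, "busy": 250},
    "eu_site_a":        {"cpus": 120, "busy": 40},
    "eu_site_b":        {"cpus": 200, "busy": 190},
}

def pick_site(snapshot: dict) -> str:
    return max(snapshot, key=lambda s: snapshot[s]["cpus"] - snapshot[s]["busy"])

print("submit next job to:", pick_site(monitoring_snapshot))
# -> eu_site_a (80 free CPUs) in this made-up snapshot
```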

Slide 16: Collaborative Possibilities with Dell
- Raw computing power for production simulations: hundreds of CPUs, Terabytes of RAID
- High-performance I/O: need to move ~5 GBytes/sec between remote sites (a quick link count follows below); bottlenecks include network protocols, clusters, components, software, ...
- Managed clusters: the goal is high-efficiency use through cluster management tools, automatic operation, fewer people, ...
- Campus Grid operations: large-scale operation permits many interesting "stress" tests of cluster, network, and Grid software components
- International Grid operations: tools for monitoring dozens of sites; automatic operation, fewer people, high throughput, ...
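To put the ~5 GBytes/sec goal next to the ~2.5 Gbits/s wide-area links mentioned earlier, a quick back-of-the-envelope Python calculation (raw link rates, ignoring protocol overhead and disk limits):

```python
# How many fully saturated 2.5 Gbits/s links would ~5 GBytes/s require?
target_bytes_per_s = 5e9      # ~5 GBytes/s between remote sites (Slide 16)
link_bits_per_s = 2.5e9       # one wide-area link (Slide 8)

links_needed = target_bytes_per_s * 8 / link_bits_per_s
print(f"~{links_needed:.0f} links")   # -> ~16
```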