Presentation transcript: High Energy Physics and Grid Projects at U of Florida (Paul Avery, University of Florida, Dec. 13, 2002)

Slide 1: High Energy Physics and Grid Projects at U of Florida
Paul Avery, University of Florida
http://www.phys.ufl.edu/~avery/  |  avery@phys.ufl.edu
Dell Visit to University of Florida, Dec. 13, 2002

Slide 2: High Energy Physics
• HEP data
  • Collected at large facilities: Fermilab, Brookhaven, CERN
  • Detectors record collisions of oppositely moving beams
  • Each collision is stored and analyzed independently
  • 100M – 1000M collisions collected per year

Slide 3: Today: High Energy Physics at Fermilab
• CDF experiment
  • International experiment: 600 physicists, several countries

Slide 4: Tomorrow: High Energy Physics at LHC
[Figure: the "Compact" Muon Solenoid (CMS) at the LHC (CERN), with a Smithsonian standard man shown for scale]

Slide 5: CMS Data Complexity
• "Events" resulting from beam-beam collisions:
  • The signal event is obscured by 20 overlapping, uninteresting collisions in the same crossing (1 MB stored per event; see the volume estimate below)
  • CPU time to analyze each event rises dramatically
[Figure: event-display comparison, 2000 vs. 2007]
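The event size quoted here, combined with the 100M – 1000M collisions/year from slide 2, gives a rough feel for the raw data volume. The sketch below is only back-of-envelope arithmetic on those two slide figures:

```python
# Back-of-envelope raw data volume from the slide figures:
# 100M - 1000M stored collisions/year (slide 2), ~1 MB per event (slide 5).
EVENT_SIZE_MB = 1.0

for events_per_year in (100e6, 1000e6):
    volume_tb = events_per_year * EVENT_SIZE_MB / 1e6   # MB -> TB
    print(f"{events_per_year:.0e} events/year -> ~{volume_tb:,.0f} TB/year")

# Roughly 100 TB/year to 1 PB/year of raw event data, which is why
# slide 7 talks about Petabyte-scale storage.
```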

Slide 6: CMS Analysis: Higgs Decay to 4 Muons
• 10^9 events/sec, selectivity: 1 in 10^13
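A minimal sketch of what this rate and selectivity imply for the surviving event rate, using only the two numbers on the slide:

```python
# Selectivity arithmetic using the two numbers on this slide:
# 10^9 collisions/sec and a selectivity of 1 in 10^13.
collision_rate_hz = 1e9
selectivity = 1e-13

signal_rate_hz = collision_rate_hz * selectivity   # 1e-4 events/sec
events_per_day = signal_rate_hz * 86400            # seconds per day

print(f"selected rate: {signal_rate_hz:.1e} events/sec "
      f"(~{events_per_day:.0f} events/day)")
# On the order of ten selected events per day, buried in a billion
# collisions every second.
```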

Slide 7: LHC Computing Challenges
• Complexity: millions of detector channels, complex events
• Scale: PetaOps (CPU), Petabytes (data)
• Distribution: global distribution of people and resources (1800 physicists, 150 institutes, 32 countries)

Slide 8: Global LHC Data Grid (for an experiment such as CMS)
[Figure: tiered data-flow diagram]
• Tier 0: Online System feeding the CERN Computer Center (> 20 TIPS) at 100-200 MBytes/s
• Tier 1: national centers (USA, Korea, Russia, UK), with 2.5 Gbits/s links
• Tier 2: regional centers, e.g. the Florida Tier2 Center
• Tier 3: institute servers
• Tier 4: physics caches, PCs, other portals
• Wide-area links of 2.5, ~0.6, and 0.1-1 Gbits/s connect the lower tiers (see the transfer-time sketch below)
• Capacity ratio Tier0 / (sum of Tier1) / (sum of Tier2) ~ 1:1:1
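To give a feel for the link speeds in the diagram, the sketch below estimates how long moving a dataset takes at each quoted bandwidth. The 1 TB dataset size is an illustrative assumption, not a figure from the slide:

```python
# Time to move a dataset over the link bandwidths quoted in the diagram.
# The 1 TB dataset size is an illustrative assumption.
DATASET_BITS = 1.0 * 1e12 * 8      # 1 TB in bits

links_gbits_per_s = {
    "100-200 MBytes/s (taking 200 MB/s ~ 1.6 Gbits/s)": 1.6,
    "2.5 Gbits/s": 2.5,
    "~0.6 Gbits/s": 0.6,
    "0.1-1 Gbits/s (taking the 1 Gbit/s upper end)": 1.0,
}

for label, gbps in links_gbits_per_s.items():
    hours = DATASET_BITS / (gbps * 1e9) / 3600
    print(f"{label}: ~{hours:.1f} h per TB")
# Even at 2.5 Gbits/s, one TB takes close to an hour assuming the link
# is fully and efficiently used, hence the caching role of Tier2 centers.
```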

Slide 9: Florida Tier2 Center (2003)
[Figure: "hierarchical" switching topology: WAN, router, GEth switch, GEth/FEth switches, data server with >1 RAID array]
• Florida in 2003:
  • 300 CPUs, >2.5 GHz P4
  • 7 TBytes RAID
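A short sketch relating this hardware to the 1 MB event size from slide 5. The per-event CPU time used for the throughput line is an assumed placeholder, not a number from the talk:

```python
# Capacity sketch for the 2003 Florida Tier2 hardware on this slide.
RAID_TB = 7.0
EVENT_SIZE_MB = 1.0          # event size from slide 5
N_CPUS = 300

cached_events = RAID_TB * 1e6 / EVENT_SIZE_MB
print(f"7 TB RAID holds ~{cached_events:.0e} events of 1 MB each")

# Illustrative throughput only: the per-event CPU time is an assumed
# placeholder, not a figure from the talk.
CPU_SEC_PER_EVENT = 60.0
events_per_day = N_CPUS * 86400 / CPU_SEC_PER_EVENT
print(f"{N_CPUS} CPUs at {CPU_SEC_PER_EVENT:.0f} s/event "
      f"-> ~{events_per_day:.1e} events/day")
```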

Slide 10: "Trillium": US Data Grid Projects
• GriPhyN: Grid research, toolkits ($12M, 15 institutions)
• iVDGL: deploy a global Grid laboratory ($14M, 17 institutions)
• PPDG: Data Grid for HEP experiments ($9.5M, 12 institutions)
• Data-intensive experiments
• Physicists + computer scientists
• Infrastructure development & deployment
[Slide legend: marker = Florida leads]

Slide 11: Goal: PetaScale Virtual-Data Grids (~1 Petaflop, ~100 Petabytes)
[Figure: layered architecture diagram]
• Users: production teams, individual investigators, workgroups; interactive user tools
• Tools: virtual data tools, request planning & scheduling tools, request execution & management tools
• Services: resource management services, security and policy services, other Grid services
• Resources: distributed resources (code, storage, CPUs, networks), transforms, raw data source

Slide 12: US-iVDGL Data Grid (Spring 2003)
[Map of Tier1, Tier2, and Tier3 sites: UF, Wisconsin, Fermilab, BNL, Indiana, Boston U, SKC, Brownsville, Hampton, PSU, J. Hopkins, Caltech, FIU, FSU, Arlington, Michigan, LBL, Oklahoma, Argonne, Vanderbilt, UCSD/SDSC, NCSA]

Slide 13: Florida-Led US-CMS Testbed

Slide 14: CMS Production Simulations
[Figure: master site running IMPALA, mop_submitter, DAGMan, Condor-G, and GridFTP, feeding remote sites 1..N, each with a batch queue and GridFTP]
• Several productions in 2002
• Sites in US & Europe
• Uncovered many Grid problems
• 1M events almost complete
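A minimal sketch of how the master-site pieces in this diagram fit together: a script writes one Condor-G submit description per remote site and a DAGMan DAG that runs them. The hostnames, wrapper script, and file names are hypothetical placeholders; in the actual production, IMPALA and mop_submitter generated the equivalent files, and GridFTP handled data staging (omitted here):

```python
# Sketch of the master-site submission machinery: one Condor-G submit
# file per remote site, tied together by a DAGMan DAG.
from pathlib import Path

# Hypothetical Globus gatekeepers at remote sites 1..N (placeholders).
SITES = {
    "site1": "gatekeeper.site1.example.edu/jobmanager-condor",
    "site2": "gatekeeper.site2.example.edu/jobmanager-pbs",
}

dag_lines = []
for name, gatekeeper in SITES.items():
    submit = Path(f"{name}.sub")
    # Legacy Condor-G "globus" universe submit description (circa 2002).
    submit.write_text(
        f"""universe        = globus
globusscheduler = {gatekeeper}
executable      = cmsim_wrapper.sh
arguments       = {name}
output          = {name}.out
error           = {name}.err
log             = production.log
queue
""")
    dag_lines.append(f"JOB {name} {submit}")

# No PARENT/CHILD lines: the per-site production jobs are independent.
Path("production.dag").write_text("\n".join(dag_lines) + "\n")
print("Submit with: condor_submit_dag production.dag")
```

Output files would then be staged back to the master site with GridFTP, which this sketch leaves out.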

Slide 15: WorldGrid
• Joint US - Europe effort
  • Resources from both sides (15 sites)
  • Several visualization tools (Nagios, MapCenter, Ganglia)
  • Several monitoring tools (Ganglia, MDS, NetSaint, ...)
• Applications
  • CMS: CMKIN, CMSIM
  • ATLAS: ATLSIM
• Submit jobs from US or EU; jobs can run on any cluster
• Demonstrated at IST2002 (Copenhagen) and at SC2002 (Baltimore)
• Brochures available describing the Grid projects

Slide 16: Collaborative Possibilities with Dell
• Raw computing power for production simulations: 100s of CPUs, Terabytes of RAID
• High-performance I/O
  • Need to move ~5 GBytes/sec between remote sites (see the bandwidth sketch below)
  • Bottlenecks: network protocols, clusters, components, software, ...
• Managed clusters
  • Goal is high-efficiency use
  • Cluster management tools, automatic operation, fewer people, ...
• Campus Grid operations
  • Large-scale operation permits many interesting "stress" tests of cluster, network, and Grid software components
• International Grid operations
  • Tools for monitoring dozens of sites
  • Automatic operation, fewer people, high throughput, ...
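To put the ~5 GBytes/sec target in context against the 2.5 Gbits/s wide-area links of slide 8, a short arithmetic sketch:

```python
# The ~5 GBytes/sec I/O target on this slide versus the 2.5 Gbits/s
# wide-area links quoted in the Tier diagram (slide 8).
target_gbits_per_s = 5.0 * 8          # 5 GBytes/s -> 40 Gbits/s
link_gbits_per_s = 2.5

links_needed = target_gbits_per_s / link_gbits_per_s
print(f"~{target_gbits_per_s:.0f} Gbits/s sustained "
      f"= ~{links_needed:.0f} fully used 2.5 Gbits/s links")
# ~16 links at 100% efficiency; in practice the protocol, cluster,
# component, and software bottlenecks listed above dominate.
```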

