Slide 1: Physics Data Processing at NIKHEF. Jeff Templon, WAR, 7 May 2004

Slide 2: Goals
1. Realize an LHC physics computing infrastructure optimized for use by NIKHEF physicists
2. Where possible, apply the expertise built up for goal 1 to other projects with NIKHEF participation
3. Capitalize on expertise and available funds by participating in closely related EU and NL projects
4. Use NIKHEF Grid-computing expertise and capacity as currency

Slide 3: NIKHEF-optimal LHC Computing Infrastructure
- Operation of an LCG core site
  - Build experience with site operation and discover "external" site issues traditionally ignored by CERN
  - Leverage the front-runner position earned by our EDG effort
- Strong participation in LHC/LCG/HEP Grid framework projects
  - Meten is weten (to measure is to know); premature optimization is the root of all evil (Knuth)
  - Leverage the front-runner position earned by the EDG effort
- Leading role in the Architecture/Design arm of EGEE
  - The AA model is the fulcrum of the balance between "CERN-centric" and "really distributed" models
  - Use accumulated security expertise to gain a position in middleware design
- Preparation for Tier-1
  - Avoids having others determine NIKHEF computing priorities

Slide 4: LHC/LCG/HEP Projects
- Strong coupling to NIKHEF LHC experiment analysis
  - One grad student per experiment, working with the ARDA project; early influence, experience, and expertise with LHC analysis frameworks
  - Room for more participation in the medium term (postdocs, staff)
- Continuing work on D0 reprocessing
  - The D0 metadata model is far advanced compared to the LHC model
  - Influence US computing via our (LHC) task-distribution expertise
- Investigations of an ATLAS distributed Level-3 trigger
  - Precursor for LOFAR/Km3NeT activities

Slide 5: Preparation for Tier-1
- Tier-1 for LHC
  - Archive ~1/7 of raw data, all ESDs produced on site, all MC produced on site, and full copies of AODs and tags
  - Contribute ~1/7 of the twice-yearly reprocessing power
- End result: a major computing facility in the Watergraafsmeer
  - 1 petabyte each of disk cache and tape store per year, starting in 2008
  - ~2000 CPUs in 2008
  - ~1.5 Gbit/s network to CERN
  - These numbers are per experiment
- NIKHEF contributes research; SARA eventually takes the lion's share of operation
- NCF must underwrite this effort (MoU with CERN)
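Since the capacity numbers above are per experiment, the facility totals follow from simple multiplication. A back-of-envelope sketch, assuming the site serves all four LHC experiments (ATLAS, CMS, LHCb, ALICE; the count is my assumption, not stated on the slide):

```python
# Aggregate the per-experiment Tier-1 numbers from the slide.
# Assumption: all 4 LHC experiments are hosted; scale down for a subset.

N_EXPERIMENTS = 4          # ATLAS, CMS, LHCb, ALICE (assumed)
DISK_PB_PER_YEAR = 1       # disk cache per experiment per year
TAPE_PB_PER_YEAR = 1       # tape store per experiment per year
CPUS_2008 = 2000           # CPUs per experiment in 2008
NET_GBPS = 1.5             # network bandwidth to CERN per experiment

totals = {
    "disk_PB_per_year": N_EXPERIMENTS * DISK_PB_PER_YEAR,
    "tape_PB_per_year": N_EXPERIMENTS * TAPE_PB_PER_YEAR,
    "cpus_2008": N_EXPERIMENTS * CPUS_2008,
    "net_gbps": N_EXPERIMENTS * NET_GBPS,
}
print(totals)
```

Even this rough aggregate (thousands of CPUs, multiple petabytes per year) shows why NCF underwriting and SARA operations are part of the plan.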

Slide 6: Overlap with other NIKHEF projects
- Other HEP experiments
  - D0 work: Q4 2003, continuing
  - BaBar project together with Miron Livny (Wisconsin)
- Astroparticle physics
  - The LOFAR SOC is much like an LHC Tier-1
  - Km3NeT on-demand repointing is much like the ATLAS Level-3 trigger

Slide 7: EU & NL Projects
- EGEE (EU FP6 project, 2+2 years, 30M)
  - Funding for site operation (together with SARA)
  - Funding for Grid Technology projects (together with UvA)
  - Funding for "generic applications" (read: non-LHC)
- BSIK/VL-E
  - Funding for Data-Intensive Science (everything we do)
  - Funding for Scaling and Validation (large-scale site operation)
- Cooperation with other disciplines
  - Leverage multi-disciplinary use of our infrastructure into a large NCF-funded facility (Tier-1)

Slide 8: Currency
- Advantages of Grid computing for external funding
- Grid computing (cycles & expertise) in exchange for membership fees

Slide 9: People
- LHC applications: Templon, Bos, "postdoc", 3 grad students
- Non-LHC applications: Van Leeuwen (CT), Grijpink (CT), Bos, Templon, Groep
- Grid Technology: Groep, Koeroo (CT), Venekamp (CT), Steenbakkers (UvA), Templon
- Site Operations: Salomoni (CT), Groep, Templon, other CT support

Slide 10: People / Funding
- EGEE: 1 FTE Generic Apps, 1 FTE Site Operations, 1 FTE AA
- BSIK/VL-E: 1 FTE Scaling & Validation, 1 FTE Data-Intensive Sciences
- Both projects require local 1:1 matching (50% cost model)
- Assignments can overlap by ±15%
- Possible additional money from the bio-range project
- Possible to replace some manpower with equivalent equipment
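A minimal sketch of how the 50% cost model adds up, assuming "1-1 matching" means one locally funded FTE for each externally funded FTE (my reading of the slide, not an official definition):

```python
# Externally funded FTE counts are taken from the slide;
# the 1:1 matching interpretation is an assumption.

egee_fte = 1 + 1 + 1   # Generic Apps, Site Operations, AA
vle_fte = 1 + 1        # Scaling & Validation, Data-Intensive Sciences
external_fte = egee_fte + vle_fte

# 50% cost model: each external FTE is matched by one local FTE,
# so the institute carries half of the total effort.
local_fte = external_fte
total_fte = external_fte + local_fte

print(external_fte, local_fte, total_fte)
```

Under this reading, 5 externally funded FTE commit the institute to 5 matching FTE, or 10 FTE of total effort before any equipment substitution.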

Slide 11: Possible "Funding Model"

