User Board or User Bored? Glenn Patrick, GridPP19, 29 August 2007

2 Grids need users: GridPP, WLCG, EGEE, OSG, NGS, USER

3 Users need a Board? Not quite – more of a User Forum

4 Board needs a Chair: LHCb, BaBar, Other Experiments, CMS

5 Experiments are really the users

6 Tier 1 Fairshares – last 12 months
[Charts: CPU shares for ATLAS, BaBar, CMS and LHCb, as requested versus in reality.]
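The comparison behind these charts is straightforward: normalise each experiment's requested CPU and its delivered CPU to fractions of the respective totals and put them side by side. A minimal Python sketch of that bookkeeping follows; the KSI2K figures are hypothetical placeholders, not the actual Tier 1 numbers.

```python
# Illustrative only: compare requested fairshares with delivered shares.
# The KSI2K numbers below are hypothetical placeholders, not real Tier 1 figures.

requested_ksi2k = {"ATLAS": 400.0, "BaBar": 150.0, "CMS": 300.0, "LHCb": 250.0}
delivered_ksi2k = {"ATLAS": 250.0, "BaBar": 300.0, "CMS": 200.0, "LHCb": 150.0}

def shares(usage):
    """Convert absolute CPU numbers into fractional shares of the total."""
    total = sum(usage.values())
    return {exp: value / total for exp, value in usage.items()}

requested = shares(requested_ksi2k)
reality = shares(delivered_ksi2k)

for exp in sorted(requested):
    print(f"{exp:6s} requested {requested[exp]:5.1%}  delivered {reality[exp]:5.1%}")
```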

7 Tier 1 CPU – 2007 Profile
[Table: UK Tier 1 CPU requests (KSI2K) by month, Jan–Dec 2007, for ALICE, ATLAS, CMS, LHCb, BaBar (Plan B), CDF, D0, H1, MICE, MINOS, SNO, ZEUS, Pheno, UKQCD, Dteam/Ops and Other, against May 07 usage and allocation, total LCG, total fabric capacity and headroom.]
Also an underlying profile for LHC! [Sketch labelled 'Napes Needle' and 'SuperLHC'.]
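Reading a profile like this comes down to summing the experiment allocations month by month and comparing them with the fabric capacity; whatever is left is the headroom. Below is a small Python sketch of that calculation, with invented monthly values standing in for the table entries.

```python
# Illustrative headroom calculation for a CPU allocation profile.
# All KSI2K values are invented placeholders, not the real 2007 table entries.

capacity_ksi2k = {"Sep": 1500.0, "Oct": 1500.0, "Nov": 1500.0, "Dec": 1500.0}

allocations_ksi2k = {
    "ATLAS": {"Sep": 500.0, "Oct": 600.0, "Nov": 650.0, "Dec": 700.0},
    "CMS":   {"Sep": 400.0, "Oct": 500.0, "Nov": 550.0, "Dec": 600.0},
    "LHCb":  {"Sep": 200.0, "Oct": 250.0, "Nov": 300.0, "Dec": 300.0},
    "Other": {"Sep": 150.0, "Oct": 150.0, "Nov": 150.0, "Dec": 150.0},
}

for month, capacity in capacity_ksi2k.items():
    allocated = sum(per_exp[month] for per_exp in allocations_ksi2k.values())
    headroom = capacity - allocated
    status = "OVER-ALLOCATED" if headroom < 0 else "ok"
    print(f"{month}: allocated {allocated:7.1f} of {capacity:7.1f} KSI2K, "
          f"headroom {headroom:+7.1f} ({status})")
```

Months where the sum exceeds the capacity are exactly the over-allocated periods described on the next slide.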

8 ... at the Tier 1
CPU. Largely met demand through the first half of the year. Resources start to become over-allocated in September, when ATLAS and CMS activities are expected to peak. An increasing shortfall of CPU across Q4 is predicted, largely due to the LHC experiments ramping up in preparation.
DISK. After the difficulties of 2006, the disk situation through the first 6 months of 2007 has been good, with sufficient headroom to provide experiments with the requested capacity as well as extra resources to assist with the testing of Castor and the migration from dCache. There is still headroom through Q3, but Q4 will be challenging and experiments will probably have to wait for the new disk deployment in January.
TAPE. As always with tape (and the vagaries of Castor repacking, etc.), it is difficult to be certain of the physical headroom. Estimates of the unwanted tape storage of the disk1tape0 storage class have had to be included for Castor (hopefully solved when v2.1.4 is deployed). Some allowance has also been made for migration from dCache to Castor.

9 LHC Schedule – End/Start in Sight!
[Schedule chart (General schedule, Baseline rev. 4.0): interconnection of the continuous cryostat, leak tests of the last sub-sectors, inner triplet repairs and interconnections, global pressure test and consolidation, flushing, cool-down, warm-up, powering tests. Marked: 'We are here!', 8 months, May 2008.]

10 Planning Underway – Tier 1 Hardware (GridPP3)
[Table: CPU (KSI2K), Disk (TB) and Tape (TB), now versus 2008, for ALICE, ATLAS, CMS, LHCb, BaBar and Other. TOTAL row growth factors: x 2.7 (CPU), x 3.3 (Disk), x 2.8 (Tape).]
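The x 2.7, x 3.3 and x 2.8 in the TOTAL row are simply the ratios of the planned 2008 capacity to the current capacity for CPU, disk and tape. A quick Python sketch of that arithmetic, using placeholder totals chosen only so that the ratios come out at the quoted factors:

```python
# Growth-factor arithmetic for the Tier 1 hardware plan.
# Totals are hypothetical placeholders picked to reproduce the quoted factors,
# not the real table values.

totals = {
    # resource: (now, planned_2008)
    "CPU (KSI2K)": (1500.0, 4050.0),
    "Disk (TB)":   (750.0, 2475.0),
    "Tape (TB)":   (1000.0, 2800.0),
}

for resource, (now, planned) in totals.items():
    factor = planned / now
    print(f"{resource}: {now:g} -> {planned:g}  (x{factor:.1f})")
```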

11 dCache – Castor2 Migration
[Chart: Castor data.]
The migration to Castor continues to be a challenge! At the UB meeting on 20 June it was agreed that 6 months' notice be given for dCache termination. Experiments have to fund storage costs past March 2008 for the ADS/vtp tape service.

12 Castor Progress at RAL
Separate instances for the LHC experiments:
ATLAS instance (version ...) in production.
CMS instance (version ...) in production.
LHCb instance (version ...) in testing.
Issues
Strategy meetings and weekly experiment technical meetings helped a lot with progress.
Current issues: tape migration rates; monitoring at RAL; SRM development (v2.2 timescale); disk1tape0 capability; Repack.
But the upcoming data challenges of CMS (CSA07) and ATLAS (M4, FDR), plus LHCb and ALICE, will be the real test of Castor.
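For readers unfamiliar with the jargon in the issues list: a storage class such as disk1tape0 keeps one copy of a file on disk and none on tape, whereas disk0tape1 keeps a tape copy only; the "unwanted tape storage" mentioned on the headroom slide arises when disk1tape0 data ends up occupying tape anyway. The sketch below is a conceptual illustration of that accounting, not Castor's actual implementation, and the dataset sizes are hypothetical.

```python
# Conceptual sketch of storage-class accounting (not the Castor implementation).
# A storage class is described by how many disk and tape copies it keeps.

from dataclasses import dataclass

@dataclass
class StorageClass:
    name: str
    disk_copies: int
    tape_copies: int

D1T0 = StorageClass("disk1tape0", disk_copies=1, tape_copies=0)
D0T1 = StorageClass("disk0tape1", disk_copies=0, tape_copies=1)

def footprint(datasets):
    """Return total disk and tape usage (TB) for (size_tb, storage_class) pairs."""
    disk = sum(size * sc.disk_copies for size, sc in datasets)
    tape = sum(size * sc.tape_copies for size, sc in datasets)
    return disk, tape

# Hypothetical datasets: 100 TB meant to live on disk only, 200 TB on tape only.
disk_tb, tape_tb = footprint([(100.0, D1T0), (200.0, D0T1)])
print(f"disk: {disk_tb} TB, tape: {tape_tb} TB")
```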

13 LHC Computing Model 2001
[Diagram ('The LHC Computing Centre'): CERN at the centre; Tier 1 centres in Germany, USA, UK, France, Italy, ..., and CERN; Tier 2 regional groups of labs and universities (Lab a, Uni a, Lab b, Uni b, ...); Tier 3 physics departments; desktops; physics groups.]
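The 2001 model is a strict hierarchy: CERN at the centre, national Tier 1 centres, Tier 2 regional groups of labs and universities, Tier 3 physics departments and desktops. A toy Python representation of that hierarchy (site names below the Tier 1 level are hypothetical) might look like this:

```python
# Toy representation of the 2001 hierarchical LHC computing model.
# Site names below the Tier 1 level are hypothetical placeholders.

tier_model = {
    "CERN Tier 0 (The LHC Computing Centre)": {
        "UK Tier 1": {
            "Tier 2 regional group": ["Lab a", "Uni a", "Uni b (Tier 3 / desktops)"],
        },
        "Germany Tier 1": {},
        "USA Tier 1": {},
        "France Tier 1": {},
        "Italy Tier 1": {},
    }
}

def walk(node, depth=0):
    """Print the hierarchy with indentation reflecting the tier structure."""
    if isinstance(node, dict):
        for name, children in node.items():
            print("  " * depth + name)
            walk(children, depth + 1)
    else:
        for name in node:
            print("  " * depth + name)

walk(tier_model)
```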

14 Grid-Only Tier 1
After several discussions, non-Grid access to the Tier 1 is scheduled to finish at the end of 2007, apart from a few exceptions. Use cases for the exceptions are being identified: debugging production jobs, maintaining the experiment environment, start-up of new experiments, etc.
Limited User Interface service. Important to retain functionality and flexibility.
Implications for the RAL AFS service (cell). This is bad news for BaBar, who rely on AFS for software distribution.

15 SL4 and 64-Bit Migration
Migration to SL4 discussed several times. A new SL4 CE with 20% of the batch capacity was commissioned at the Tier 1 during the first week of August. Generally not an issue for the experiments, but some teething problems (e.g. LHCb and CMS).
Experiment attitudes towards true 64-bit applications (as opposed to 32-bit applications running in compatibility mode) surveyed:
ATLAS – not important at the moment.
LHCb – can test, but what about middleware?
BaBar/MINOS – no immediate interest.
MICE – plan to move to 64-bit computing as soon as the underlying computing resources become available.
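The distinction being surveyed is between applications built natively for a 64-bit architecture and 32-bit builds run in compatibility mode on a 64-bit machine. As a minimal illustration, a process can check which situation it is in using only the Python standard library:

```python
# Report whether this process is a native 64-bit build or a 32-bit build,
# and what the underlying machine architecture is (standard library only).

import platform
import struct

pointer_bits = struct.calcsize("P") * 8   # 32 for a 32-bit build, 64 for a 64-bit build
machine = platform.machine()              # e.g. "i686" or "x86_64"

if pointer_bits == 32 and machine in ("x86_64", "amd64"):
    mode = "32-bit application in compatibility mode on a 64-bit machine"
else:
    mode = f"{pointer_bits}-bit application on {machine}"

print(mode)
```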

16 Real LHC Data now on the Horizon
Already here in the case of ATLAS cosmics!
[Luminosity plot, cm⁻² s⁻¹ (CMS).]
Need to be prepared for increasing luminosity and surprises (good and bad). Important to handle issues as they arise (and, if possible, anticipate them). More direct communication needed in addition to quarterly UB meetings.
Variable backgrounds. Changing beam energies.

17 User Board Future
UB and experiment interaction needs to progress as experiments evolve. Suggestions on this are welcome.

18 The End (and The Start)