
Slide 1 – LHCb Data Challenge 2004
A. Tsaregorodtsev, CPPM, Marseille
LCG-France Meeting, 22 July 2004, CERN

Slide 2 – Goals of DC'04
- Main goal: gather information to be used for writing the LHCb computing TDR/TP.
- Robustness test of the LHCb software and production system, using software as realistic as possible in terms of performance.
- Test of the LHCb distributed computing model, including distributed analysis (a realistic test of the analysis environment needs realistic analyses).
- Incorporation of the LCG application area software into the LHCb production environment.
- Use of LCG resources as a substantial fraction of the production capacity.

Slide 3 – DC 2004 phases
- Phase 1 – MC data production: 180M events of different signals, background and minimum bias; simulation + reconstruction; DSTs are copied to Tier1 centres.
- Phase 2 – Data reprocessing: selection of various physics streams from the DSTs; copy the selections to all Tier1 centres.
- Phase 3 – User analysis: user analysis jobs run on the DST data distributed across the Tier1 centres.

Slide 4 – Phase 1: MC production

Slide 5 – DIRAC Services and Resources
[Architecture diagram: user interfaces (Production manager, GANGA UI, user CLI, job monitor, BK query web page, FileCatalog browser) talk to the DIRAC services (Job Management Service, JobMonitorSvc, JobAccountingSvc with its AccountingDB, InformationSvc, FileCatalogSvc, MonitoringSvc, BookkeepingSvc), which dispatch work to the DIRAC resources: the DIRAC CE, the LCG Resource Broker with its CEs, DIRAC site agents, and DIRAC Storage Elements accessed via disk file, gridftp, bbftp or rfio.]

Slide 6 – Software to be installed
- Before an LHCb application can run on a Worker Node, the following software components must be installed: the application itself; the software packages it depends on; the necessary (file-based) databases; the DIRAC software.
- A single untar command installs everything in place.
- All the necessary libraries are included; no assumption is made about what software is available on the destination site (except a recent Python interpreter): external libraries, compiler libraries, ld-linux.so.
- The same binary distribution runs on Red Hat 7.1–9.0.

Slide 7 – Software installation
- Software repository: a web server (HTTP protocol) or an LCG Storage Element.
- Installation in place, the DIRAC way: by the Agent upon reception of a job with particular software requirements, or by the running job itself.
- Installation in place, the LCG2 way: a special kind of job that runs the standard DIRAC software installation utility (a minimal installation sketch follows below).
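A minimal sketch of the "install in place" step described above: fetch a package tarball from the software repository (web server or SE) and untar it into the local install area. The repository URL, package name and version are hypothetical examples, not actual DC'04 values.

```python
# Minimal sketch of DIRAC-style "install in place": download a package tarball
# from a software repository and unpack it with a single untar into the local
# install area. Repository URL and package/version below are hypothetical.
import os
import tarfile
import urllib.request

REPO_URL = "http://lhcb-sw-repo.example.org/packages"     # hypothetical repository
INSTALL_DIR = os.path.join(os.getcwd(), "lhcb_software")   # agent/job work area

def install_package(name, version):
    """Download <name>-<version>.tar.gz and unpack it in place."""
    tarball = f"{name}-{version}.tar.gz"
    local = os.path.join(INSTALL_DIR, tarball)
    os.makedirs(INSTALL_DIR, exist_ok=True)
    urllib.request.urlretrieve(f"{REPO_URL}/{tarball}", local)  # fetch from repository
    with tarfile.open(local, "r:gz") as tar:                    # single untar installs in place
        tar.extractall(INSTALL_DIR)
    os.remove(local)

if __name__ == "__main__":
    # e.g. triggered by the Agent when a job declares this software requirement
    install_package("Gauss", "v15r2")   # hypothetical package/version
```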

Slide 8 – Software installation in the job
- A job may need extra software packages not yet in place on the CE: a special version of the geometry, user analysis algorithms, etc.
- Any number of packages can be installed by the job itself (up to all of them).
- Packages are installed in the job's user space.
- The structure of the standard LHCb software directory tree is imitated with symbolic links (see the sketch below).
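A minimal sketch of how a job could imitate the standard LHCb software tree in its own user space: packages already present on the site are symlinked into a local tree, missing ones are flagged for local installation. The directory layout and package list are illustrative assumptions, not the actual DC'04 layout.

```python
# Sketch: rebuild the standard software tree in the job user space with symlinks
# to packages already installed on the CE; anything missing gets installed locally.
import os

SHARED_AREA = "/opt/lhcb/software"                        # hypothetical site-wide area
LOCAL_AREA = os.path.join(os.getcwd(), "lhcb_software")   # job user space

def build_software_tree(required_packages):
    """Link available packages into the local tree; return the missing ones."""
    os.makedirs(LOCAL_AREA, exist_ok=True)
    missing = []
    for pkg in required_packages:
        shared = os.path.join(SHARED_AREA, pkg)
        local = os.path.join(LOCAL_AREA, pkg)
        if os.path.isdir(shared):
            # package is in place on the CE: just link it into the local tree
            if not os.path.exists(local):
                os.symlink(shared, local)
        else:
            # package missing: it will be installed in the job user space
            missing.append(pkg)
    return missing

if __name__ == "__main__":
    to_install = build_software_tree(["Gauss", "Boole", "Brunel", "UserAnalysis"])
    print("packages to install locally:", to_install)
```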

Slide 9 – 3rd party components
- Originally DIRAC aimed at producing the following components itself: production database; metadata and job provenance database; workload management.
- Expected third-party components: data management (file catalogue, replica management); security services; information and monitoring services.
- Expectations of an early delivery of the ARDA prototype components were not met.

Slide 10 – File catalog service
- The LHCb Bookkeeping was not meant to be used as a file (replica) catalog: its main use is as a metadata and job provenance database, with the replica catalog based on specially built views.
- The AliEn File Catalog was chosen to get a full set of the necessary functionality:
  - hierarchical structure: logical organization of data with optimized queries; ACLs by directory; metadata by directory; file-system paradigm;
  - robust, proven implementation;
  - easy to wrap as an independent service (inspired by the ARDA RTAG work).

Slide 11 – AliEn FileCatalog in DIRAC
- The AliEn FC SOAP interface was not ready at the beginning of 2004, so we had to provide our own XML-RPC wrapper, compatible with the XML-RPC BK File Catalog.
- The wrapper drives the AliEn command line via "alien -exec": ugly, but it works (a minimal sketch of such a wrapper is shown below).
- The service is built on top of AliEn and run by the lhcbprod AliEn user, so the AliEn security mechanisms are not really used.
- Using AliEn version 1.32.
- So far in DC2004: >100,000 files with >250,000 replicas; very stable performance.
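A minimal sketch of an XML-RPC service wrapping the AliEn command line, in the spirit of the wrapper described above. The exact client invocation, the catalogue commands and their output format are assumptions for illustration only, not the actual DC'04 implementation.

```python
# Sketch: expose a couple of catalogue operations over XML-RPC by shelling out
# to the AliEn command-line client. Command names/arguments are illustrative.
import subprocess
from xmlrpc.server import SimpleXMLRPCServer

def _alien_exec(command):
    """Run one catalogue command through the AliEn command-line client."""
    result = subprocess.run(["alien", "-exec", command],
                            capture_output=True, text=True, check=False)
    return {"ok": result.returncode == 0,
            "output": result.stdout,
            "error": result.stderr}

def addFile(lfn, pfn, size):
    # hypothetical catalogue command registering a file with one replica
    return _alien_exec(f"register {lfn} {pfn} {size}")

def getReplicas(lfn):
    # hypothetical catalogue command listing the replicas of an LFN
    return _alien_exec(f"whereis {lfn}")

if __name__ == "__main__":
    server = SimpleXMLRPCServer(("0.0.0.0", 8085), allow_none=True)
    server.register_function(addFile)
    server.register_function(getReplicas)
    server.serve_forever()
```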

Slide 12 – File catalogs
[Diagram: a DIRAC application or service talks to a common FileCatalog client, behind which sit two services: the AliEn FileCatalog service (an XML-RPC server wrapping the AliEn UI in front of the MySQL-based AliEn FC) and the BK FileCatalog service (an XML-RPC server in front of the Oracle-based LHCb Bookkeeping DB).]

Slide 13 – Data Production – 2004
- Currently distributed data sets: CERN holds the complete DST set (copied directly from the production centres); each Tier1 holds the master copy of the DSTs produced at its associated sites.
- DIRAC sites: Bologna, Karlsruhe, Spain (PIC), Lyon, UK sites (RAL); everything else goes to CERN.
- LCG sites: currently only 3 Grid (MSS) SE sites, all CASTOR: Bologna, PIC, CERN.
  - Bologna: ru, pl, hu, cz, gr, it
  - PIC: us, ca, es, pt, tw
  - CERN: elsewhere

Slide 14 – DIRAC DataManagement tools
- DIRAC Storage Element: an Information Service description plus a server (bbftpd, sftpd, httpd, gridftpd, xmlrpcd, file, rfio, etc.); no special service installation is needed on the site.
- Description in the Information Service: host, protocol, local path.
- ReplicaManager API for common operations: copy(), copyDir(), get(), exists(), size(), mkdir(), etc. (a usage sketch follows below).
- Examples of usage: dirac-rm-copyAndRegister; dirac-rm-copy dc2004.dst CERN_Castor_BBFTP
- Tier0 and Tier1 SEs are defined in the central Information Service.
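A possible usage sketch of the ReplicaManager API named on the slide. Only the method names (copy, copyDir, get, exists, size, mkdir) and the SE name come from the slide; the import path and call signatures are assumptions for illustration.

```python
# Usage sketch of the ReplicaManager API; import path and signatures assumed.
from DIRAC.DataManagement.ReplicaManager import ReplicaManager  # hypothetical path

rm = ReplicaManager()

# copy a freshly produced DST to the CERN Castor SE (SE name as on the slide)
rm.copy("dc2004.dst", "CERN_Castor_BBFTP")

# basic queries on the remote storage
if rm.exists("dc2004.dst", "CERN_Castor_BBFTP"):
    print("size:", rm.size("dc2004.dst", "CERN_Castor_BBFTP"))

# prepare a directory and copy a whole directory of log files
rm.mkdir("/lhcb/production/logs/run0001", "CERN_Castor_BBFTP")
rm.copyDir("./logs", "/lhcb/production/logs/run0001", "CERN_Castor_BBFTP")
```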

Slide 15 – Reliable Data Transfer
- Any data transfer should be completed despite temporary failures of services or networks: failed transfers are retried, with whatever delay is necessary, until the services are up and running again (not applicable to LCG jobs); registration in the Catalog is also retried.
- Transfer Agent: maintains a database of transfer requests; transfers datasets or whole directories together with their log files; retries transfers until they succeed (see the retry-loop sketch below).
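A minimal sketch of the Transfer Agent retry loop: requests are kept in a small database and retried pass after pass until both the transfer and the catalog registration succeed. The transfer and registration functions are stubs standing in for the real DIRAC tools; the retry delay is an illustrative value.

```python
# Sketch of a Transfer Agent: requests persist in a database and are retried
# until they succeed; failures simply leave the request queued for the next pass.
import sqlite3
import time

RETRY_DELAY = 300  # seconds between retry passes (illustrative value)

def transfer(source, destination_se):
    """Stub for the actual data transfer (bbftp/gridftp)."""
    raise NotImplementedError

def register_replica(lfn, destination_se):
    """Stub for the registration in the File Catalog."""
    raise NotImplementedError

def agent_pass(db):
    rows = db.execute(
        "SELECT id, lfn, source, dest FROM requests WHERE done = 0").fetchall()
    for req_id, lfn, source, dest in rows:
        try:
            transfer(source, dest)
            register_replica(lfn, dest)
        except Exception:
            continue  # leave the request in the queue; retry on the next pass
        db.execute("UPDATE requests SET done = 1 WHERE id = ?", (req_id,))
    db.commit()

if __name__ == "__main__":
    db = sqlite3.connect("transfer_requests.db")
    db.execute("CREATE TABLE IF NOT EXISTS requests "
               "(id INTEGER PRIMARY KEY, lfn TEXT, source TEXT, dest TEXT, done INTEGER)")
    while True:            # the agent keeps retrying until everything succeeds
        agent_pass(db)
        time.sleep(RETRY_DELAY)
```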

Slide 16 – DIRAC DataManagement tools
[Diagram showing the Transfer Agent and its Transfer DB of transfer requests, the Job, the Data Manager, the Data Optimizer, a cache, and two Storage Elements (SE 1, SE 2).]

Slide 17 – DIRAC DC2004 performance
- In May–July: simulation + reconstruction; >80,000 jobs; ~75M events; ~25 TB of data stored at the CERN, PIC, Lyon, CNAF and RAL Tier1 centres.
- >150,000 files in the catalogs.
- ~2,000 jobs running continuously, up to 3,000 at peak.

Slide 18 – DC2004 at CC/IN2P3
- The main DIRAC development site.
- The CC/IN2P3 contribution is nevertheless very small: production runs stably and continuously, but the resources are very limited.
- HPSS performance is stable.

Slide 19 – Note on BBFTP
- A nice product: stable, performant, complete, grid-enabled.
- Lightweight: easy deployment of the statically linked executable.
- Good performance.
- A parallelized, load-balancing server would be a welcome addition.
- Functionality is not complete with respect to GridFTP: no remote storage management (ls(), size(), remove()); no transfers between remote servers.

Slide 20 – LCG experience

Slide 21 – Production jobs
- Long jobs: 23 hours on average on a 2 GHz Pentium IV.
- Simulation + digitization + reconstruction steps, 5 to 10 steps in one job.
- No event input data.
- Output data: 1–2 output files of ~200 MB each, stored to Tier1 and Tier0 SEs.
- Log files are copied to an SE at CERN.
- The AliEn and Bookkeeping catalogues are updated.

Slide 22 – Using LCG resources
- Different ways of scheduling jobs to LCG: standard (jobs go via the Resource Broker); direct (jobs go directly to a CE); resource reservation.
- The reservation mode is used for the DC2004 production: agents are deployed to Worker Nodes as LCG jobs; DIRAC jobs are fetched by the agents only if the environment is OK; the agent steers the job execution, including data transfers and updates of the catalogs and bookkeeping (a minimal agent sketch follows below).
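A minimal sketch of the reservation mode described above: the agent arrives on the Worker Node as an LCG job, checks the local environment, and only then pulls a real DIRAC job from the Job Management Service. The service URL and the request/report calls are hypothetical names used for illustration.

```python
# Sketch of a pilot agent: check the environment, then fetch and steer a DIRAC job.
import os
import shutil
import subprocess
import xmlrpc.client

JOB_SERVICE_URL = "http://dirac-jms.example.org:8080"  # hypothetical service endpoint

def environment_ok():
    """Basic sanity checks before accepting a payload job."""
    enough_disk = shutil.disk_usage(os.getcwd()).free > 2 * 1024**3  # ~2 GB per job
    has_python = shutil.which("python3") is not None
    return enough_disk and has_python

def run_agent():
    if not environment_ok():
        return  # do not fetch a job; the LCG slot is simply released
    jms = xmlrpc.client.ServerProxy(JOB_SERVICE_URL)
    job = jms.requestJob()        # hypothetical call: fetch a matching DIRAC job
    if not job:
        return
    # the agent steers execution: run the payload, then report the outcome
    status = subprocess.call(job["executable"], shell=True)
    jms.reportStatus(job["jobID"], "Done" if status == 0 else "Failed")

if __name__ == "__main__":
    run_agent()
```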

Slide 23 – Using LCG resources (2)
- Using the DIRAC DataManagement tools: DIRAC SE + gridftp + sftp.
- Starting to populate the RLS from the DIRAC catalogues: for evaluation, and for use with the LCG ReplicaManager.

Slide 24 – Resource Broker I
- No trivial use of the tools for large numbers of jobs, i.e. for production.
- The command is re-authenticated for every job.
- Errors are produced when operating on lists of jobs (e.g. retrieving non-terminated jobs).
- Slow to respond when a few hundred jobs are in the RB, e.g. 15 seconds to schedule one job.
- The ranking mechanism is meant to provide an even distribution of jobs.
- The number of CPUs published is per site, not per user/VO, so requesting free CPUs in the JDL does not help.

Slide 25 – Resource Broker II
- LCG, in general, does not advertise normalised time units.
  - Solution: request CPU resources for the slowest CPU (500 MHz).
  - Problem: only very few sites have long enough queues.
  - Solution: the DIRAC agent scales the CPU requirement for the particular WN before requesting a job from DIRAC (a worked example follows below).
  - Problem: some sites have already normalised their units!
- Jobs with infinite loops: a 3-day job in a week-long queue is killed by proxy expiry rather than by the CPU-time requirement.
- Jobs aborted with "proxy expired": the RB was re-using old proxies!
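A worked example of the CPU-time scaling the agent has to do when queue limits are unnormalised: the job's requirement, stated for the reference (slowest) CPU, is rescaled to the clock speed of the local Worker Node. The numbers are illustrative, not actual DC'04 benchmarks.

```python
# Sketch of the CPU-time rescaling done by the agent before requesting a job.
REFERENCE_CPU_MHZ = 500.0   # slowest CPU the requirement is expressed for

def scale_cpu_requirement(required_minutes_at_reference, local_cpu_mhz):
    """Return the wall-clock CPU minutes the job should need on this WN."""
    return required_minutes_at_reference * REFERENCE_CPU_MHZ / local_cpu_mhz

if __name__ == "__main__":
    # a 23-hour job on a 2 GHz PIV (slide 21) is ~4x longer at 500 MHz
    required_at_reference = 23 * 60 * (2000.0 / REFERENCE_CPU_MHZ)   # ~5520 minutes
    hours_on_wn = scale_cpu_requirement(required_at_reference, 2000.0) / 60
    print(hours_on_wn, "hours on a 2 GHz WN")   # prints 23.0
```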

Slide 26 – Resource Broker III
- Jobs cancelled by the RB, but with the message "cancelled by user": due to a loss of communication between the RB and the CE, the job is rescheduled and killed on the original CE; some jobs are not actually killed until they fail through an inability to transfer data; DIRAC also reschedules!
- The RB lost control of the status of all jobs: the RB was "stuck", not responding to any request; solved without loss of jobs.

Slide 27 – Disk Storage
- Jobs run in a directory without enough space: running jobs need ~2 GB, which is a problem where a site has jobs sharing the same disk server rather than using local WN space.

Slide 28 – Reliable Data Transfer
- In case of a data transfer failure, the data on LCG is lost: there is no retry mechanism if the destination SE is temporarily unavailable.
- Problems with the GridFTP server at CERN: certificates not understood; refused connections.

Slide 29 – Odds & Sods
- The LDAP of the globus-mds server stops: OK, no jobs can be submitted to the site, but this also causes problems with authentication of GridFTP transfers.
- Empty output sandbox: tricky to debug!
- Jobs cancelled by the retry count: occurs on sites with many jobs running; DIRAC just submits more agents.

Slide 30 – Conclusions

Slide 31 – Demand 2004
- CPU: 14 M UI hours (1.4 M UI hours consumed so far).
- Storage: HPSS 20 TB; disk 2 TB, accessible from the LCG grid.

Slide 32 – Demand 2005
- CPU: ~15 M UI hours.
- Storage: HPSS 30 TB (~15 TB recycled); disk 2 TB.

Slide 33 – Tier2 centers
- Feasible; good network connectivity is essential.
- Limited functionality: number crunching (production/simulation type tasks).
- Standard technical solution: hardware (CPU + storage); cluster software; central consultancy support.
- Housing space: adequate rooms in the labs (cooling, electric power, etc.).

Slide 34 – Tier2 centers (2)
- Local support: staff to be found (a remote central "watch tower"?); 24/7 or best-effort support.
- Serving the community:
  - regional: a possible financing source, but extra clients bring security and resource-sharing policy issues;
  - national: a French grid segment?

Slide 35 – LHCb DC'04 Accounting

Slide 36 – Next phases: reprocessing and analysis

Slide 37 – Data reprocessing and analysis
- Preparing the data reprocessing phase: stripping, i.e. selecting events from the DST files into several output streams, one per physics group.
- Jobs are scheduled to the sites where the needed data are: the Tier1s (CERN, Lyon, PIC, CNAF, RAL, Karlsruhe).
- The workload management system is capable of automatically scheduling a job to a site that has the data (a minimal matching sketch follows below).
- Tools are being prepared to formulate reprocessing tasks.
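A minimal sketch of the data-driven scheduling just described: a reprocessing job is sent to a Tier1 that holds replicas of all its input DSTs. The replica lookup is a stub standing in for a FileCatalog query; only the site names come from the slide.

```python
# Sketch: pick a destination site that holds every input file of the job.
TIER1_SITES = ["CERN", "Lyon", "PIC", "CNAF", "RAL", "Karlsruhe"]

def get_replica_sites(lfn):
    """Stub: return the set of sites holding a replica of this LFN
    (in DIRAC this would be a FileCatalog query)."""
    raise NotImplementedError

def candidate_sites(input_lfns):
    """Sites that hold all the input data for the job."""
    sites = set(TIER1_SITES)
    for lfn in input_lfns:
        sites &= get_replica_sites(lfn)
    return sites

def schedule(job):
    sites = candidate_sites(job["inputData"])
    if not sites:
        return None          # no single site has everything; data must be replicated first
    job["destination"] = sorted(sites)[0]   # trivial choice; a real ranking would go here
    return job["destination"]
```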

Slide 38 – Data reprocessing and analysis (2)
- User analysis: interfacing GANGA to submit jobs to DIRAC.
- Submitting user jobs to DIRAC sites: security concerns, since jobs are executed by the agent account on behalf of the user.
- Submitting user jobs to LCG sites: through DIRAC, to have common job monitoring and accounting; using user certificates to submit to LCG; no agent submission, so a high failure rate is expected.

