1 Workfest Goals
- Develop the tools for CDR simulations: HDFast, HDGEANT, geometry definitions, remote access
- Education of the rest of the collaboration
- Needs for CDR: data model
- Other items: software web page, distributed meetings
- Long term simulation effort – goals & design

2 Workfest Minutes
Speaker – Richard Jones
Subject – Status of GEANT
10 months until the CDR is due
Software: Design → Create → Publish → Monitor Development

3 HDFAST
- MCFAST – fairly mature (monitoring development); reasonably stable at this point.
- GEANT 3 – designed and in prototype; how to compare with MCFAST?
- GEANT 4 – foreseen; will probably have a compatible geometry definition (GEANT4 might change).
- Event generators – cwrap, genR8, weight
- Facilities – JLab, Regina, IU, FSU and UConn
- Expected simulation projects – PWA, detector optimizations, background studies

4 HDFAST – I/O Summary
Genr8 → ascii → (ascii2stdhep) → stdhep
Command & geometry → HDFast → Root (RDT) or stdhep → ascii
Using Root (Root Data Tree). Root available at http://root.cern.ch

5 UConn Cluster
Pentium 450/800 MHz, 36 processors; rack-mounted (2U cases); dual-CPU nodes with 512 MByte each.
[Diagram: dual-CPU nodes connected through a switch to PVFS and RAID disk storage; link speeds of 80 Mb/s and 35 Mb/s.]

6 UConn Computing Cluster
[Diagram: cluster connectivity to Internet2.]

7 Mantrid – Indiana University
- Processing/storage model: 32 processors in 16 nodes; 32 × 45 GB disks (1.44 TB total)
- Disk layout (diagram): /home/HallD; per node (e.g. Node 00), /data0/HallD and /data1/HallD
- Has slides.
- Has a prototype web-based access system for event generation (cwrap): http://anthrax.physics.indiana.edu/~teige/tests/cwrap_request.html

8 U Regina – 50 Alpha Cluster
- 50 nodes, 500 MHz Alphas, 9 GByte disk per node, PBS batch system
- Note: a 500 MHz Alpha is roughly comparable to a 1 GHz Pentium III.
- Has slides.
- See: www.phys.uregina.ca/~brash/openspace.ps and www.phys.uregina.ca/~brash/connectivity.ps

9 Second Day: To Do
- Running HDFAST and HDGEANT. Paul gave a demo.
- Decide on a data model → how information is moved from package to package. (Data Model Working Group)
- Web site (Working Group)
- Features required to integrate clusters

10 Hall D Computing Web Site
Goals:
- Everyone can contribute without excessive site management.
- XML-based description of documents.
- Automatic searching and organization tools.
- Still need overview documents.

11 Contents: Hall D Website
- Everyone maintains their own site.
- Everyone has a summary page and a link to Hall D computing resources and searching tools.
- Links and searches will be managed automatically.
- Everyone contributes documents to the Hall D computing archive describing their computing activities.
- Each document has an XML metadata description of what it contains.

12 How To: Hall D Website
- Within your website, create a single XML metadata document describing all of your documents.
- Let me know where it is.
- Publish DTD so local sites can be validated (http://comphy.fsu.edu/~dennisl/halld/dtds/website.dtd).

13 How To: Hall D Website (example metadata entry)
Title: Hall D Computing Design Page
Type: Design
Author: Larry Dennis
Date: May 21, 2001
Keywords: grid computing, acquisition, analysis, simulations
Abstract: This is the final word on Hall D computing.
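As a rough sketch only, such an entry could be written in XML along the following lines; the element names here are assumptions for illustration, and the actual structure is whatever the published website.dtd defines:

  <!-- illustrative only: element names are not from the slide; validate against website.dtd -->
  <document>
    <title>Hall D Computing Design Page</title>
    <type>Design</type>
    <author>Larry Dennis</author>
    <date>May 21, 2001</date>
    <keywords>grid computing, acquisition, analysis, simulations</keywords>
    <abstract>This is the final word on Hall D computing.</abstract>
  </document>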

14 GEANT I/O Package
[Diagram: binary event stream, events in and events out, control input, stderr/stdout log, and metadata.]

15 GEANT 3 – Richard’s Plan
- Produce a standard geometry. See http://zeus.phys.uconn.edu/halld/geometry
- Use the geometry for Monte Carlo, event display, and as a logical geometry model for use in analysis.
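Purely as a hypothetical illustration of what a shared geometry definition might look like if it is expressed in XML (the slide does not specify a format, and the volume, shape, and material names below are invented, not taken from the Hall D geometry page):

  <!-- illustrative only: one detector volume described once, for reuse by simulation, event display, and analysis -->
  <volume name="BarrelDC" material="ArCO2">
    <tube rmin="10.0" rmax="60.0" length="400.0" unit="cm"/>
  </volume>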

16 Monte Carlo Data Model – Input
Example structure (from the slide):
  <event>
    <interaction>
      <vertex> …
      <particle> …
    …
- Data model levels: conceptual model, logical model, physical model → open.
- Start with an I/O API. Some others exist.
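A minimal sketch of what one such input event could look like if the open physical model ends up as plain XML; the attribute names and values below are hypothetical illustrations, not part of the slide:

  <!-- hypothetical attributes: run/event numbers, a vertex position, and particle four-momenta -->
  <event run="1" number="42">
    <interaction type="gamma p">
      <vertex x="0.0" y="0.0" z="65.0"/>
      <particle id="pi+" px="0.1" py="-0.2" pz="3.5" E="3.51"/>
      <particle id="pi-" px="-0.1" py="0.2" pz="4.0" E="4.01"/>
    </interaction>
  </event>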

17 Monte Carlo Data Model – Output
Example structure (from the slide):
  <detector>
    <BarrelDC>
      <ring>
        <sector>
          <strawhit>
            <eloss> …
            <time> …
  …
- Data model levels: conceptual model, logical model, physical model → open.
- Start with an I/O API.
- Processing chain: Monte Carlo event generator → interactions; simulation → real data; DAQ → digitized data; calibration translator → hits.
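Again purely as an illustration (assuming an XML physical model; the attribute names, units, and numbers are invented, not from the slide), a single digitized barrel drift chamber hit might be recorded roughly as:

  <!-- hypothetical hit: ring/sector/straw indices with energy loss and drift time -->
  <detector>
    <BarrelDC>
      <ring id="3">
        <sector id="17">
          <strawhit straw="42">
            <eloss keV="2.3"/>
            <time ns="118.5"/>
          </strawhit>
        </sector>
      </ring>
    </BarrelDC>
  </detector>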

18 DOE/NSF Initiatives & Resources
- Groups working on software: CMU, U Conn, U Regina, FSU, IU, FIU, JLab (Watson, Bird, Heyes, Hall D), RPI, ODU, Glasgow
- Processing chain: raw events → hits → tracks/clusters → particles. How much of this can be automated?

19 Larry -- Things To Do
- Give everyone information about ITR and SciDAC
- Get web site started
- Design (Elliott, Scott)
- Prototype grid nodes (Ian, Elliott)

20 Richard -- Things To Do
- Input interface to GEANT from event generators, XML input
- Finish geometry prototype
- Output interface for GEANT prototype
- Document and publish the above

21 Scott -- Things To Do
- Web access for Mantrid
- Interfaces from generators to/from XML

22 Paul -- Things To Do
- Maintain HDFast
- Teach people how to use HDFast

23 Greg -- Things To Do
- DTD for event structure (a hypothetical sketch follows below)
- DTD for cwrap input
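As a purely hypothetical sketch of the first item (the element names are taken from the input data model on slide 16; every attribute is an assumption, and the real DTD is still to be written), an event-structure DTD might begin along these lines:

  <!-- illustrative DTD fragment for the event/interaction/vertex/particle hierarchy -->
  <!ELEMENT event (interaction+)>
  <!ELEMENT interaction (vertex, particle+)>
  <!ELEMENT vertex EMPTY>
  <!ATTLIST vertex x CDATA #REQUIRED y CDATA #REQUIRED z CDATA #REQUIRED>
  <!ELEMENT particle EMPTY>
  <!ATTLIST particle id CDATA #REQUIRED px CDATA #REQUIRED py CDATA #REQUIRED pz CDATA #REQUIRED E CDATA #REQUIRED>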

24 Ed -- Things To Do
- Full OSF support for genr8, HDFAST, GEANT3, translators
- Web interface for the UR farm
- Barrel calorimeter studies with GEANT3

25 Elliott -- Things To Do
- Explore CODA event format
- Assist Greg with the event DTD
- Explore GEANT4
- Hall D computing design
- Prototype for grid nodes
- Remote collaboration tools

26 Design Focus
- Get the job done.
- Minimize the effort required to perform computing: fewer physicists, lower development costs, lower hardware costs.
- Keep it simple.
- Provide for ubiquitous access and participation – improve participation in computing.

27 Goals for the Computing Environment
1. The experiment must be easy to conduct (coded software people → two-person rule).
2. Everyone can participate in solving experimental problems – no matter where they are located.
3. Offline analysis can more than keep up with the online acquisition.
4. Simulations can more than keep up with the online acquisition.
5. Production of tracks/clusters from raw data and simulations can be planned, conducted, monitored, validated and used by a group.
6. Production of tracks/clusters from raw data and simulations can be conducted automatically with group monitoring.
7. Subsequent analysis can be done automatically if individuals so choose.

28 Goal #1: Easy to Operate
- 100 MB/s raw data; need an estimate of the designed good-event rate to set online trigger performance.
- Automated system monitoring
- Automated slow controls
- Automated data acquisition
- Automated online farm
- Collaborative environment for access to experts
- Integrated problem-solving database that links current problems to past problems and solutions
- Well-defined procedures
- Good training procedures

29 Goal #2: Ubiquitous expert participation
- Online system information available from the web.
- Collaborative environment for working with the online team.
- Experts can control systems from elsewhere when the data acquisition team allows it or the DAQ is inactive.

30 Goal #3: Concurrent Offline Production
Offline production (raw events → tracks/clusters) can be completed in the same length of time as is required for data taking (including detector and accelerator down time). This includes:
- Calibration overhead.
- Multiple passes through the data (average of 2).
- Evaluation of results.
- Dissemination of results.
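A rough scaling of what this implies, combining the 100 MB/s raw-data figure from Goal #1 with the average of two passes above (detector and accelerator down time is ignored here, so this is an upper bound and only an order-of-magnitude sketch):

  $R_{\mathrm{offline}} \gtrsim N_{\mathrm{passes}} \times R_{\mathrm{raw}} = 2 \times 100\ \mathrm{MB/s} = 200\ \mathrm{MB/s}$

In other words, reconstruction would need to sustain up to roughly twice the acquisition bandwidth, averaged over the run period, before accounting for down time.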

31 Goal #4: Concurrent Simulations
Simulations can be completed in the same length of time as is required for data taking (including detector and accelerator down time). This includes:
- Simulation planning.
- Systematic studies (up to 5-10 times as much data as is required for the experimental measurements).
- Production processing of simulation results.
- Dissemination of results.

32 Goal #5: Collaborative computing
- Production processing and simulations can be planned by a group.
- Multiple people can conduct, validate, monitor, evaluate and use produced data and simulations without unnecessary duplication.
- A single individual or a large group can manage tasks of the appropriate scale effectively.

33 Goal #6: Automated computing
- Production processing and simulations can be conducted automatically without intervention.
- Progress is reported automatically.
- Quality checking can be performed automatically.
- Errors in automatic processing are automatically flagged.

34 Goal #7: Extensibility
- Subsequent analysis steps can be done automatically if individuals so choose.
- The computational management system can be extended to include any Hall D computing task.

