
DAQ Software
Gordon Watts, UW, Seattle
December 8, 1999 Director's Review
Outline: Introduction to the System, Goals for Installation & Commissioning, Software Tasks & Manpower

Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 2: Run II DAQ
- Readout channels: will be ~800,000 in Run 2, … /event
- Data rates:
  - … crates
  - Initial design capacity: ~1000 Hz, 250 MBytes/sec into the DAQ/L3 farm
- Integration & control: with online

Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 3: Goals
- Continuous operation
  - Version 0: PC and Run 1 DAQ hardware simulate the functionality of the Run 2 system; looks similar to the final system to both Level 3 and the outside user
  - Integrate with hardware as it arrives, with small perturbations
  - Reliability
  - Integration with Online (monitoring, errors, etc.)
- We don't get calls at 4 am
  - Careful testing as we go along: test stand at Brown, Si test and other bootstrap operations here
  - System isn't fragile: if things aren't done in the exact order, deal with it and give understandable error messages
  - All code kept in a code repository (VSS)

[Slide 4: DAQ architecture diagram. Front end crates feed VRC 1 through VRC 8 over segment data cables; primary Fibre Channel loops #1-#8 and front end token readout loops connect the VRCs to Segment Brokers SB 1-4 and the ETG (event tag loop, connected to the Trigger Framework); L3 nodes (1 of 16 shown per group) send events over Ethernet to the Collector Router. Caption: We have to write software.]

Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 5: L3 Software
[Component diagram: VRC, SB, ETG, L3 Farm Node, L3 Supervisor, L3 Monitor, SC, Online System, Collector Router]

Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 6: L3 Software
- During running, the DAQ hardware is stand-alone
  - Running components do not require software intervention on an event-by-event basis (except for monitoring)
  - Software must deal with initialization and configuration only, except for the farm node
- DAQ components require very little software
  - VRC and SB are simple, similar control programs with almost no parameter settings
  - ETG is similar, with more sophisticated software to handle routing table configuration
  - Farm node and supervisor are the only components that require significant programming effort; the monitor node to a lesser extent
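To make "simple control program with almost no parameter settings" concrete, here is a minimal sketch of what a VRC/SB-style component could look like: a small state machine that takes supervisor commands, refuses out-of-order requests with an understandable message, and otherwise does nothing per event. The states, command names, and structure are illustrative assumptions, not the actual D0 code.

```cpp
// Hypothetical sketch of a minimal DAQ control component (VRC/SB style).
// States and command names are illustrative, not the actual D0 interfaces.
#include <iostream>
#include <string>

class ControlComponent {
public:
    enum class State { Idle, Configured, Running };

    // Handle a command from the supervisor; reject out-of-order requests
    // with an understandable message instead of failing (one of the goals).
    bool handle(const std::string& cmd) {
        if (cmd == "configure" && state_ == State::Idle) {
            state_ = State::Configured;
        } else if (cmd == "start" && state_ == State::Configured) {
            state_ = State::Running;
        } else if (cmd == "stop" && state_ == State::Running) {
            state_ = State::Configured;
        } else {
            std::cerr << "ignoring '" << cmd << "' in state " << name()
                      << " (command arrived out of order)\n";
            return false;
        }
        return true;
    }

    const char* name() const {
        switch (state_) {
            case State::Idle:       return "Idle";
            case State::Configured: return "Configured";
            case State::Running:    return "Running";
        }
        return "?";
    }

private:
    State state_ = State::Idle;
};
```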

Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 7: ETG Interface
[Block diagram: ETG node running the ETG program, with program control, local disk, and DCOM links to the L3 Supervisor and L3 Monitor; embedded systems connect to the Trigger Framework (triggers / disable).]
Similar to the VRC (and SB); will reuse software.

Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 8: Filter Process
- The physics filter executes in a separate process
  - Isolates the framework from crashes: the physics analysis code changes much more frequently than the framework once the run has started; crash recovery saves the event, flags it, and ships it up to the online system for debugging
  - Raw event data is stored in shared memory
[Diagram: framework and filter process communicating through shared memory and mutexes; based on Run 1 experience.]
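The isolation described above can be sketched roughly as follows: the framework keeps the raw event in shared memory and lets the physics filter run in a separate process, so a crash in the filter only flags the event. The sketch below uses POSIX shared memory and fork purely for illustration; the real framework ran on NT with Win32 shared memory and mutexes, and every name here is invented.

```cpp
// Sketch of the isolation idea: the framework owns the event buffer in
// shared memory; the physics filter runs in a separate process, so a crash
// there only flags the event instead of killing the readout framework.
// POSIX calls are used here purely for illustration (the real system ran
// on NT with Win32 shared memory and mutexes); all names are invented.
#include <sys/mman.h>
#include <sys/wait.h>
#include <fcntl.h>
#include <unistd.h>
#include <cstring>
#include <iostream>

int main() {
    const size_t kEventSize = 4096;
    int fd = shm_open("/l3_event_buf", O_CREAT | O_RDWR, 0600);
    ftruncate(fd, kEventSize);
    char* event = static_cast<char*>(
        mmap(nullptr, kEventSize, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0));

    std::strcpy(event, "raw event data");        // framework fills the buffer

    pid_t filter = fork();                       // stand-in for the filter process
    if (filter == 0) {
        // Physics filter: reads the event from shared memory and decides.
        // A crash here (e.g. a bad pointer in analysis code) ends only
        // this process, not the framework.
        _exit(event[0] == 'r' ? 0 /* accept */ : 1 /* reject */);
    }

    int status = 0;
    waitpid(filter, &status, 0);
    if (!WIFEXITED(status)) {
        // Crash recovery: keep the event, flag it, ship it to the online
        // system for debugging (as described on the slide).
        std::cerr << "filter crashed; flagging event for online debugging\n";
    }
    munmap(event, kEventSize);
    shm_unlink("/l3_event_buf");
    return 0;
}
```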

Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 9: Physics Filter Interface
- ScriptRunner
  - Framework that runs physics tools and filters to make the actual physics decision
  - Cross-platform code (NT, Linux, IRIX, OSF??)
  - Managed by the L3 Filters group
[Diagram: L3 framework interface to the L3 filter process running ScriptRunner.]
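As a rough picture of the kind of interface a framework like ScriptRunner presents, the hypothetical sketch below has filters implement a common pass/fail method and the framework chain them into a script-level decision. None of the class or method names come from the real L3 filter code.

```cpp
// Hypothetical sketch of a filter interface for a ScriptRunner-like
// framework: filters turn event quantities into a pass/fail decision, and
// the framework chains them per trigger script. These names are invented.
#include <memory>
#include <vector>

struct Event {};                        // placeholder for the raw event

class L3Filter {
public:
    virtual ~L3Filter() = default;
    virtual bool pass(const Event& e) = 0;   // physics decision
};

class MuonFilter : public L3Filter {
public:
    bool pass(const Event&) override { return true; }  // e.g. a pT cut would go here
};

// The framework runs every filter in the script; the event is accepted
// only if all of them pass.
bool run_script(const Event& e,
                const std::vector<std::unique_ptr<L3Filter>>& script) {
    for (const auto& f : script)
        if (!f->pass(e)) return false;
    return true;
}

int main() {
    std::vector<std::unique_ptr<L3Filter>> script;
    script.push_back(std::make_unique<MuonFilter>());
    Event e;
    return run_script(e, script) ? 0 : 1;
}
```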

Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 10: L3 Supervisor
- Manages configuration of the DAQ/trigger farm (about 60 nodes)
- Command planning
  - The online system will send Level 3 simple commands
  - L3 must translate them into the specific commands for each node to achieve the online system's requests
- Supports
  - Multiple runs
  - Partitioning the L3 farm
  - Node crash and recovery
  - Generic error recovery, with minimal impact on running

Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 11: Error Logging & Monitoring
- Error logging
  - The L3 Filters group will use the ZOOM ErrorLogger; a consistent set of standards for reporting errors has been adopted
  - A plug-in module gets the errors off the Level 3 nodes: they are sent to the monitor process for local relay to the online system, and logfiles are written in a standard format (trying to agree with the online group to make this standard across all online components)
- Monitoring
  - Non-critical information: event counters, buffer occupancy, etc.
  - Variables are declared and mapped to shared memory; a slow repeater process copies the data to the monitor process
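The monitoring scheme above amounts to: declare a named variable, have it live in shared memory, update it cheaply, and let a separate repeater read it out. A small sketch of that pattern follows; the real implementation used template classes on NT shared memory, and the fixed-size table, names, and POSIX mmap call here are assumptions for illustration.

```cpp
// Sketch of declaring a monitor variable that lives in shared memory so a
// separate repeater process can read it without disturbing the framework.
// The real system used template classes on NT shared memory; the layout,
// names, and fixed-size table here are invented for illustration.
#include <sys/mman.h>
#include <cstring>
#include <cstdint>
#include <new>

struct MonitorSlot {
    char     name[32];
    uint64_t value;
};

struct MonitorTable {
    uint32_t    n_slots = 0;
    MonitorSlot slots[64];
};

// One table per node, placed in (here anonymous) shared memory.
MonitorTable* attach_table() {
    void* mem = mmap(nullptr, sizeof(MonitorTable), PROT_READ | PROT_WRITE,
                     MAP_SHARED | MAP_ANONYMOUS, -1, 0);
    return new (mem) MonitorTable{};
}

// Declaring a counter just claims a slot; updating it is a plain store,
// so the framework pays essentially nothing per event.
uint64_t* declare_counter(MonitorTable* t, const char* name) {
    MonitorSlot& s = t->slots[t->n_slots++];
    std::strncpy(s.name, name, sizeof(s.name) - 1);
    s.value = 0;
    return &s.value;
}

int main() {
    MonitorTable* table = attach_table();
    uint64_t* events_seen = declare_counter(table, "events_seen");
    ++*events_seen;                    // framework side: cheap increment
    // A slow repeater process would periodically walk table->slots and
    // forward the name/value pairs to the monitor process.
}
```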

Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 12: DAQ/Trigger Integration
- Sits between the hardware and the online system
- The interface is minimal: data output, control, monitor information
- The implications are not minimal
[Diagram: detector, L1/L2 trigger, TCC, readout crates and data cables into the DAQ system (L3 nodes, L3 supervisor, L3 monitor, NT Level 3); Ethernet to the collector/router, data logger, RIP disk, and FCC; UNIX hosts running COOR and the monitor; DAQ and detector consoles.]

Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 13: Software Integration
- Integration outside of the Level 3 software
- Integration with offline (where we meet)
  - Level 3 filter: must run the same offline and online
  - Doom/dspack
- Control & monitor communication
  - Uses the ITC package, the online group's standard communications package
- Requires offline-like code releases built on Online

Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 14: NT Releases
- The build is controlled by SoftRelTools
  - Hundreds of source files; a build system is required
  - UNIX-centric (offline); too much work to maintain two
- SRT2 NT integration is done
  - SRT2 is the build system
  - Set us back several months; no assigned person
- Li (NIU MS) is building NT releases now
  - Just starting…
  - Starting with a small DAQ-only release: DSPACK + friends, itc, thread_util, l3base
  - Next step is to build a 30+ size release: everything we had in the NT release

Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 15: NT Releases
- Progress is slow: the build system is still in flux!
- What does it affect?
  - ScriptRunner + filters + tools + ITC
  - The 10% test right now
  - Our ability to test the system now (dummy versions of the ScriptRunner interface)
  - Regular NT trigger releases must occur by March 15, 2000: muon filtering in L3 is one of the commissioning milestones

Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 16: Scheduling
- Conflicting requirements
  - Must be continuously available starting now
  - Must upgrade and integrate final hardware as it arrives
- Software is impacted: it must upgrade in tandem, without disturbing the running system
- Tactic
  - Version 0 of the software, upgraded adiabatically
  - The interface to internal components remains similar
  - The interface to the online system does not change
  - Test stand at Brown University for testing

Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 17: VRC Interface
[Block diagram: VBD data cables (50 MB/s) and FCI from the last SB (100 MB/s) into the VRC node; the VRC program, with program control and local disk, connects via DCOM to the L3 Supervisor and L3 Monitor and to embedded systems; FCI out to the 1st SB at 100 MB/s.]

Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 18: VRC Interface (V0)
[Block diagram: VBD data cables (50 MB/s) into the VRC node; the VRC program, with program control and local disk, connects via DCOM to the L3 Supervisor and L3 Monitor; output over VME/MPM (2) to the SB/ETG at 100 Mb/s.]

Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 19: November 1, 1999
- Read raw data from the FEC into the VRC
- Send raw data in offline format to the online system
- Control via COOR (held up by NT releases)
[Status diagram: COOR and L3 Supervisor (ITC/DCOM), Detector/VRC to Collector Router (ITC), Auto Start Utility (DCOM); legend: done / started / not started; FEC path at 50%.]

Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 20: February 15, 2000
- Multi-crate readout
- Internal communication done via ACE (already implemented)
[Status diagram: COOR and L3 Supervisor (ITC/DCOM), Detector/VRC to SB/ETG and L3 farm node (ACE), Collector Router (ITC), Auto Start Utility (DCOM); legend: done / started / not started; completion marks at 25%, 50%, 75%.]

Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 21: March 15, 2000
- Muon filtering in Level 3
  - The ScriptRunner interface must be up
  - NT releases must be regular
[Status diagram: same components as above; completion marks at 20% and 50%.]

Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 22: May 1, 2000
- Multi-stream readout
  - Ability to partition the L3 farm: multiple simultaneous runs
  - Route events by trigger bits
  - ScriptRunner does output streams
[Status diagram: same components as above; completion marks at 10%, 25%, 45%.]

Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 23: Test Stands
- Detector subsystems have individual setups
  - Allows them to test readout with the final configuration
  - Allows us to test our software early: high-speed running, stress tests for the DAQ software
- Subsystems have some unique requirements
  - Necessary for error-rate checking in the Si, for example
  - Separate software development branches; we attempt to keep as close as possible to the final L3 design to avoid support headaches

Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 24: Test Stands
- Three test stands currently in operation
  - Brown test stand: tests hardware prototypes; primary software development
  - Silicon test stand: an L3 node directly reads out a front-end crate; helping us and the Si folks test readout, perform debugging, and make system improvements
  - CFT test stand: instrumented and ready to take data (missing one tracking board (VRBC) to control readout)

Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 25: 10% Test
- The Si test stand will evolve into a full-blown readout
  - 10% test: single-barrel readout
  - Requires a full L3 node
  - Tests the silicon filter code (ScriptRunner, trigger tools, etc.); NT releases must be up to speed for this
- This is in progress as we speak
  - The ScriptRunner components are held up by NT releases

Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 26: People
- Joint effort: Brown University and the University of Washington, Seattle
- People:
  - Gennady Briskin, Brown
  - Dave Cutts, Brown
  - Sean Mattingly, Brown
  - Gordon Watts, UW
  - +1 post-doc from UW
  - Students (>1)

Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 27: Tasks
- VRC: simple; once done it will require few modifications (1/4 FTE)
- SB: simple; once done it will require few modifications, very similar to the VRC (1/4)
- ETG: complex initialization required; the hardware interface is not well understood yet, so it requires little work now. By the time the VRC ramps down, this will ramp up (1/2)
- Farm node: a large amount of work left to do in communication with the supervisor and with ScriptRunner. Will require continuous work as the system gains in complexity (3/4)

Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 28: Tasks
- L3 Supervisor: complex communication with COOR; started, but will require continuous upgrades as the system develops in complexity (1/2)
- Monitoring: initial work done by undergraduates; still have to interface to the outside world. No one is working on it at the moment (1/4)
- NT releases: offloading to the NIU student. Requires continuous work and interfacing with many different software developers (1)
- L3 filter integration: done by hand now, will have to be made automatic, taking advantage of offline tools (1/2)

Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 29: Conclusions
- NT releases have been the biggest delay
  - Keeping up with the offline changes requires constant vigilance
  - Offloading this task to a dedicated person
  - Impacts the 10% test and the March 15 milestone
- The group is the correct size to handle the task
  - Continuous operation
  - Integrating the new hardware with the software
  - As long as this group isn't also responsible for releases
- Current weak points
  - Monitoring
  - Integration with the online system (log files, error messages, etc.)

Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 30: L3 Farm Node
[Block diagram: VME crate with MPMs and a DMA-capable VME-PCI bridge (48 MB/s each) feeding the L3 node framework; the framework contains a node-VME I/O module, shared memory buffers, an L3 filter interface module to the L3 filter processes, a control/monitoring/error module, and a collector router module; dedicated 100 Mbit/s Ethernet to the online collector/router.]
- A prototype of the framework is finished and runs in the silicon test stand
- A second version will be finished by Jan 1, with improved speed, interaction between processes, a new interface, and better stability

Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 31: Details of Filter Node
[Data-flow diagram of the filter node, with shared-memory event buffers, a command/monitor/error shared memory region, and interfaces to the L3 supervisor, L3 monitor, and L3 error system.]
- MPM reader: gets a pointer to an event buffer, configures the MPMs for receiving a new event, waits until the complete event arrives in the MPM, loads the event data into a shared memory buffer, and inserts the event pointer into the next queue
- Validation queue: event validation, FEC-presence validation, checksum validation
- Filter queue and the L3 filter input/output interface processes feed the L3 filter process
- Output events queue and collector/router network interface: determine where the event should be sent and send it to the collector/router node; data flows to the online host system
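A rough sketch of the queue-per-stage flow described on this slide (MPM reader into a validation stage, then the filter, then output) is shown below. The blocking queue, threads, and names are illustrative assumptions, not the real framework code.

```cpp
// Sketch of the queue-per-stage structure on a filter node: events flow
// MPM reader -> validation -> filter -> output, each stage pulling from one
// queue and pushing to the next. Thread and queue details are illustrative.
#include <condition_variable>
#include <mutex>
#include <queue>
#include <thread>
#include <iostream>

struct Event { int id; bool valid = false; bool accepted = false; };

template <typename T>
class BlockingQueue {
public:
    void push(T v) {
        { std::lock_guard<std::mutex> l(m_); q_.push(std::move(v)); }
        cv_.notify_one();
    }
    T pop() {
        std::unique_lock<std::mutex> l(m_);
        cv_.wait(l, [&] { return !q_.empty(); });
        T v = std::move(q_.front()); q_.pop();
        return v;
    }
private:
    std::queue<T> q_;
    std::mutex m_;
    std::condition_variable cv_;
};

int main() {
    BlockingQueue<Event> validation_q, filter_q, output_q;

    std::thread validator([&] {            // FEC-presence / checksum checks
        Event e = validation_q.pop();
        e.valid = true;
        filter_q.push(e);
    });
    std::thread filter([&] {               // physics decision (ScriptRunner side)
        Event e = filter_q.pop();
        e.accepted = e.valid;
        output_q.push(e);
    });

    validation_q.push(Event{1});           // MPM reader side
    Event out = output_q.pop();            // collector/router sender side
    std::cout << "event " << out.id << (out.accepted ? " accepted\n" : " rejected\n");

    validator.join();
    filter.join();
}
```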

Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 32: L3 Supervisor Interface
- Receives and interprets COOR commands and turns them into internal state objects
- Next step is communication to the clients (VRC/ETG/SB/L3 nodes)
[Diagram: the online system (COOR) sends commands and configuration requests to the supervisor's COOR command interface; a resource allocator works from the current configuration database and the desired configuration; a command generator and sequencer issue direct commands to the L3 node clients.]
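Conceptually, the supervisor's command planning is a diff between the current and desired configurations, emitted as per-node commands. The sketch below illustrates that idea with invented structures and command strings; it is not the actual COOR command set or supervisor code.

```cpp
// Sketch of the supervisor's job as described on the slide: take a simple
// request from the online system (COOR), compare the desired configuration
// with the current one, and emit the specific commands each node needs.
// Structures and command strings here are invented for illustration.
#include <map>
#include <string>
#include <vector>
#include <iostream>

struct NodeConfig { std::string trigger_list; bool running = false; };

using Configuration = std::map<std::string, NodeConfig>;   // node -> config

std::vector<std::string> plan_commands(const Configuration& current,
                                       const Configuration& desired) {
    std::vector<std::string> cmds;
    for (const auto& [node, want] : desired) {
        const NodeConfig* have = current.count(node) ? &current.at(node) : nullptr;
        if (!have || have->trigger_list != want.trigger_list)
            cmds.push_back(node + ": load trigger list " + want.trigger_list);
        if ((!have || !have->running) && want.running)
            cmds.push_back(node + ": start run");
    }
    return cmds;
}

int main() {
    Configuration current = {{"node01", {"global_v1", true}}};
    Configuration desired = {{"node01", {"global_v2", true}},
                             {"node02", {"muon_commissioning", true}}};
    for (const auto& c : plan_commands(current, desired))
        std::cout << c << "\n";
}
```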

Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 33: Auto Start System
- Designed to automatically start after a cold boot and bring a client to a known idle state
- Also manages software distribution
[Diagram: the auto start service on each client machine gets the package list from the configuration database, installs packages from the package database, and runs them; it can also change packages, report status, reboot, etc.]
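The auto-start flow on this slide (query the configuration database for the package list, install what is missing, start everything, end in a known idle state) might be sketched like this; all of the function and package names are placeholders, not the real service's interface.

```cpp
// Sketch of the auto-start idea: after a cold boot the service asks the
// configuration database which packages this node should run, installs any
// that are missing, starts them, and leaves the node in a known idle state.
// All functions below are placeholders, not the real service's API.
#include <string>
#include <vector>
#include <iostream>

std::vector<std::string> get_package_list(const std::string& node) {
    // Placeholder for a query to the configuration database.
    return {"l3_framework", "monitor_repeater"};
}

bool is_installed(const std::string& pkg) { return false; }   // stub
void install(const std::string& pkg) { std::cout << "installing " << pkg << "\n"; }
void start(const std::string& pkg)   { std::cout << "starting "  << pkg << "\n"; }

int main() {
    for (const auto& pkg : get_package_list("l3node07")) {
        if (!is_installed(pkg)) install(pkg);      // software distribution
        start(pkg);                                // bring up the known state
    }
    std::cout << "node idle, waiting for supervisor commands\n";
}
```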

Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 34: Timeline
[Gantt-style timeline, early 1999 through early 2001: ICD, FPS, Lum (L0), CFT install/hookup; 1/2 VLPCs installed, first CFT crate operational, waveguide production, all VLPCs installed; SMT install/hookup; forward muon MDT and pixel plane install/survey (A & B, then C); CC and ECN cosmics; EMC assembly/install/alignment; end toroids installed; ECS hookup; CAL BLS install/checkout; final CFT electronics; tracking front-end installation; roll-in, shield wall removal; beam-ready.]
- Cosmic ray commissioning: Phase I: central muon, DAQ, RECO, trigger, tracker front-end; Phase II: fiber tracker, preshowers, VLPCs, CFT, forward muon; Phase III: full cosmic ray run (add TRIG, SMT, CAL)
- 1st collaboration commissioning milestone: Feb 15, 2000
- Run II begins: Mar 1, 2001
- DAQ available

Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 35: Silicon Test Display
[Screenshots: master GUI, monitor counters, raw data viewer, CPU1 and CPU2 displays.]

Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 36: Monitoring
- Non-essential information, helpful for debugging
- Two sources of information:
  - Level 3 physics trigger info: accept rates, filter timing; the framework ships a binary block of data out to a concentrator (1-of-50), which combines it and re-presents it
  - Framework items: event counters, node state
- So others can read without impacting the system

Director's Review Dec 8, 1999 Gordon Watts, UW Seattle 37: Monitoring
- Framework items use a shared memory scheme
[Diagram: framework processes 1-3 write to shared memory; a slow retransmitter process serves the rest of the world over TCP/IP (ACE).]
- Rest of the world: requests particular items and an update frequency
- Framework process: saves the name, type, and data of each monitor item; the data type is arbitrary; implemented with template classes
- NT-native now, soon ACE; try to reuse online software
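A sketch of the slow-retransmitter pattern: framework processes update items in shared memory with cheap writes, and a separate reader periodically ships a snapshot at the requested frequency. The real repeater spoke TCP/IP via ACE; this sketch uses a thread and a print statement as stand-ins, and all names are invented.

```cpp
// Sketch of the slow-retransmitter pattern: the framework updates monitor
// items with cheap atomic writes; a separate reader (a thread here, for
// brevity) periodically snapshots them and ships the requested items at the
// client's update frequency. The real repeater used ACE's TCP wrappers;
// the print below stands in for the network, and all names are invented.
#include <atomic>
#include <chrono>
#include <iostream>
#include <thread>

// Stand-in for the shared-memory block of monitor items.
struct MonitorShared {
    std::atomic<long> events_seen{0};
    std::atomic<long> buffer_occupancy{0};
};

MonitorShared shared_block;

// Ships only the items a client asked for, at the client's requested
// update frequency, so readers never touch the framework itself.
void repeater(std::chrono::milliseconds period, int n_updates) {
    for (int i = 0; i < n_updates; ++i) {
        std::this_thread::sleep_for(period);
        std::cout << "events_seen = " << shared_block.events_seen
                  << ", buffer_occupancy = " << shared_block.buffer_occupancy << "\n";
    }
}

int main() {
    std::thread client(repeater, std::chrono::milliseconds(100), 3);
    for (int i = 0; i < 1000; ++i) {          // framework side: cheap updates
        ++shared_block.events_seen;
        shared_block.buffer_occupancy = i % 16;
    }
    client.join();
}
```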