Presentation transcript:

11/12/2003 LIGO-G030554-01-Z (slide 1)
Data reduction for S3
P Charlton (CIT), I Leonor (UOregon), S Anderson (CIT), K Bayer (MIT), M Foster (PSU), S Grunewald (AEI), B Johnson (LHO), S Koranda (UWM), G Mendell (LHO), H Pulapaka (CIT), I Yakushin (LLO)
LSC Meeting, LIGO Hanford Observatory, Nov 9-13
Detector Characterization Session

11/12/2003 LIGO-G030554-01-Z (slide 2)
Tools: Data Reduction
• The RDS is generated by submitting createRDS user commands to LDAS via createrdsGUI.tcl, a graphical interface to LDAS createRDS (a non-GUI version is also available)
• LDAS can perform channel cuts, decimation and merging of LHO/LLO frame files; these operations are driven by the createrds scripts
• The RDS scripts are designed to run 24/7 with minimal supervision and keep running through LDAS downtime
• The GUI provides visual feedback on RDS progress and error states, plus web monitoring and notices
• With the current optimisation and hardware upgrades, performance at the sites is sufficient to reduce data faster than real-time
• When necessary, the scripts can run many createRDS threads in parallel for high-speed reduction (see the sketch below)
• The S3 RDS scripts use 2 threads for L1 (>4x real-time at LHO, >5x at LLO) and 1 thread for L2 and L3 (>10x real-time)
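The parallel-thread reduction described above can be pictured with a short sketch. This is a hypothetical Python illustration, not the actual Tcl scripts: the `createRDS` command line and its arguments are assumptions standing in for a real job submission to LDAS.

```python
# Minimal sketch: fan frame-file reduction jobs out over a fixed pool of
# worker threads, in the spirit of the parallel createRDS threads above.
import queue
import subprocess
import threading

def reduce_worker(jobs):
    """Pull raw frame files off the queue and reduce each one."""
    while True:
        frame = jobs.get()
        if frame is None:            # sentinel: no more work
            jobs.task_done()
            return
        # Placeholder command line; the real pipeline submits a createRDS
        # user command to LDAS rather than invoking a local binary.
        subprocess.run(["createRDS", "--level", "L1", frame], check=True)
        jobs.task_done()

def reduce_in_parallel(frames, n_threads=2):
    """Reduce a list of frame files using n_threads parallel workers."""
    jobs = queue.Queue()
    workers = [threading.Thread(target=reduce_worker, args=(jobs,))
               for _ in range(n_threads)]
    for w in workers:
        w.start()
    for frame in frames:
        jobs.put(frame)
    for _ in workers:
        jobs.put(None)               # one sentinel per worker
    jobs.join()
```

With 2 threads on L1 and 1 thread each on L2 and L3, this pattern matches the S3 configuration quoted above.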

11/12/2003 LIGO-G030554-01-Z (slide 3)
(figure-only slide; no transcribed text)

11/12/2003 LIGO-G030554-01-Z (slide 4)
Tools: Data Propagation
• The Lightweight Data Replicator (LDR), developed by Scott Koranda, is used to propagate data from LHO/LLO to CIT and from CIT to the Tier 2 centres (MIT, PSU, UWM, AEI)
• Helper applications (Pulapaka/Koranda/Johnson); a sketch of the publishing step follows this slide:
   - Script for moving data from the framebuilder to the LDAS tape archive
   - LDRLockedSegments.tcl - retrieves information about locked time segments for each IFO from LDAS
   - RDS-publish.py - verifies RDS files (FrCheck) and publishes metadata to the LDR database (name, size, md5sum, locked status, ...)
   - Local Storage Module - a plugin to LDR which organises data into user-defined locations
• See Scott Koranda's talk for more details
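The publishing step lends itself to a short sketch. The following Python is a hypothetical reconstruction of what a tool like RDS-publish.py must do, verify a file and assemble its catalogue record; the FrCheck invocation flag and the record layout are assumptions, and the real LDR database interface is not shown in the slides.

```python
# Hypothetical sketch of an RDS publishing step: run an integrity check
# on a frame file, then build the metadata record that would be pushed
# into the LDR database.
import hashlib
import os
import subprocess

def publish_rds_file(path):
    """Verify one RDS frame file and return its metadata record."""
    # Frame-level integrity check; a non-zero exit raises CalledProcessError.
    # The "-i" input-file flag is assumed here.
    subprocess.run(["FrCheck", "-i", path], check=True)

    # md5sum computed in 1 MB chunks so large frame files are not
    # read into memory at once.
    md5 = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            md5.update(chunk)

    return {
        "name": os.path.basename(path),
        "size": os.path.getsize(path),
        "md5sum": md5.hexdigest(),
        # locked status would come from LDRLockedSegments.tcl output
    }
```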

11/12/2003 LIGO-G030554-01-Z (slide 5)
Tools: Monitoring
• createrdsGUI visual alarms
• LDAS Search Summary Pages on the Web
   - LDAS CIT web page -
   - createrds - an HTML version of the GUI visual alerts and logs
   - datamon - monitors how long it takes before reduced data from the IFO sites is visible in LDAS at CIT and MIT
• Alerts - sent to designated recipients in case of errors, e.g.:
   - LDAS job submission errors and job failures
   - RDS data generation rate falling behind
   - LDAS RDS data visibility falling behind (a sketch of such a latency check follows this slide)
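As an illustration of the datamon idea, here is a hedged sketch of a latency check. The 30-minute threshold and the e-mail addresses are illustrative assumptions, and leap seconds are ignored in the GPS conversion; none of these specifics come from the slides.

```python
# Hypothetical datamon-style check: compare the GPS time of the newest
# reduced frame visible at a site against the current GPS time, and send
# an alert when the gap exceeds a threshold.
import smtplib
import time
from email.message import EmailMessage

GPS_EPOCH_UNIX = 315964800   # 1980-01-06 00:00:00 UTC

def gps_now():
    """Approximate current GPS time (leap seconds ignored for monitoring)."""
    return int(time.time()) - GPS_EPOCH_UNIX

def check_latency(newest_frame_gps, threshold_s=1800):
    """Alert designated recipients if reduced data visibility falls behind."""
    lag = gps_now() - newest_frame_gps
    if lag > threshold_s:
        msg = EmailMessage()
        msg["Subject"] = f"RDS visibility falling behind: lag {lag} s"
        msg["From"] = "rdsmon@example.org"     # illustrative addresses
        msg["To"] = "oncall@example.org"
        msg.set_content(f"Newest reduced frame is {lag} s behind real time.")
        with smtplib.SMTP("localhost") as smtp:
            smtp.send_message(msg)
```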

11/12/2003 LIGO-G030554-01-Z (slide 6)
Tools: Validation
• Prior to the Mock Data Challenge:
   - Installed pre-release LDAS at LHO, LLO, CIT
   - Installed updated LDR with the new GLOBUS toolkit and the LDR helper apps
   - Installed the RDS GUI and web monitoring
• RDS MDC, Oct
   - L1, L2, L3 RDS data generated at LHO and LLO
   - Tapes shipped to CIT
   - Data reduced at CIT starting 9 Oct
   - LDR began moving data from LHO on Oct 10, though not in "near real-time"
• E10 was used as a re-run of the MDC with LDAS
• The MDC did not achieve all of its goals, but many problems were solved! Far fewer problems arose in E10, and all were resolved before S3

11/12/2003 LIGO-G030554-01-Z (slide 7)
S3 Reduced Data Set
• Site-specific files (H and L), gzip compressed
• Level 0 - raw data, copied to the LDAS tape archive at each site as it is written by the framebuilder
   - LHO: 9739 channels, 6.7 MB/s. LLO: 4435 channels, 2.9 MB/s
   - TOTAL: 14174 channels, 9.6 MB/s
• Level 1 - first level of reduction
   - LHO: 264 channels, ~0.9 MB/s. LLO: 139 channels, ~0.44 MB/s
   - TOTAL: 403 channels, 1.34 MB/s (~1/7 of L0)
• Level 2 - second level (requested by the PULG)
   - Channels: AS_Q, DARM_CTRL, DARM_CTRL_EXC_DAQ, ETMX_EXC_DAQ, DARM_GAIN, ICMTRX_01, SV_STATE_VECTOR
   - LHO: 14 channels, 0.2 MB/s. LLO: 7 channels, 0.1 MB/s
   - TOTAL: 21 channels, 0.3 MB/s (~1/5 of L1)
• Level 3 - third level (AS_Q only)
   - LHO: H1 & H2 AS_Q, 0.11 MB/s. LLO: L1 AS_Q, 0.06 MB/s
   - TOTAL: 3 channels, 0.17 MB/s (~1/2 of L2)
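As a quick check of the quoted reduction factors, the per-site rates above can be summed and compared level to level; this is plain arithmetic on the numbers in this slide:

```python
# Verify the level-to-level reduction factors from the S3 RDS rates (MB/s).
levels = {
    "L0": 6.7 + 2.9,    # LHO + LLO raw data
    "L1": 0.9 + 0.44,
    "L2": 0.2 + 0.1,
    "L3": 0.11 + 0.06,
}
names = list(levels)
for prev, cur in zip(names, names[1:]):
    print(f"{cur} is 1/{levels[prev] / levels[cur]:.1f} of {prev}")
# -> L1 is 1/7.2 of L0, L2 is 1/4.5 of L1, L3 is 1/1.8 of L2
```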

11/12/2003 LIGO-G030554-01-Z (slide 8)
[Data-flow diagram] Nodes: LHO RDS (L1, L2, L3), LLO RDS (L1, L2, L3), CIT RDS (L1, L2), and the Tier 2 LDR sites MIT, PSU, AEI and UWM (each holding L1, L2, L3). Links: L3 RDS reaches CIT from each observatory via LDR; L0 raw data and L1 RDS travel by FedEx tape shipment; L1, L2, L3 RDS are distributed from CIT to the Tier 2 centres via LDR.

11/12/2003 LIGO-G030554-01-Z (slide 9)
S3 RDS Contacts
• RDS
   - LHO: Greg Mendell, Ben Johnson
   - LLO: Igor Yakushin
   - CIT: Philip Charlton, Isabel Leonor, Stuart Anderson
• LDR
   - LHO, LLO, CIT: Hari Pulapaka
   - MIT: Keith Bayer
   - UWM: Scott Koranda
   - PSU: Mike Foster
   - AEI: Steffen Grunewald

11/12/2003 LIGO-G030554-01-Z (slide 10)
S3 Data Reduction Rates
• Rate of RDS generation as a multiple of real-time
   - LHO: L1 - ?x, L2 - 18x, L3 - 23x
   - LLO: L1 - 6x, L2 - 11x, L3 - 13x
   - CIT: L1 - ?x, L2 - 23x
• Time between frame-file GPS time and reduction
   - LHO: L1 - ? min, L2 - ? min, L3 - ? min
   - LLO: L1 - 5 min, L2 - ? min, L3 - ? min
• GridFTP transfer rates
   - LHO->CIT 4 MB/s, LLO->CIT 0.1 MB/s
   - CIT->MIT 1 MB/s, CIT->UWM 4 MB/s
• Delay in the L0 -> L1 -> L2 -> L3 -> CIT -> Tier 2 pipeline: the typical Level 3 RDS delay to Tier 2 is 2-3 hours
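To make the real-time multiples concrete: at N x real-time, one day of data is processed in 24/N hours. A one-line check using a few of the surviving figures above:

```python
# How long one day of data takes at the quoted real-time multiples.
for label, multiple in [("LLO L1", 6), ("LHO L2", 18), ("LHO L3", 23)]:
    print(f"{label}: {24 / multiple:.1f} h per day of data")
# -> LLO L1: 4.0 h, LHO L2: 1.3 h, LHO L3: 1.0 h
```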