Problems and strategies for VIRGO off-line computing — Laura Brocco, Università di Roma "La Sapienza" & INFN Roma1, for the VIRGO collaboration.


Outline
Part I
- Data Production
- Data Transfer and Storage
Part II
- Search for gravitational wave pulses and quasi-periodic signals
- Search for periodic signals
- Conclusions

I - Data Production and Storage

Status of Virgo
- CITF commissioning ended in September; Engineering Runs (three days long) done
- ITF commissioning started in September 2003 (ends in September 2004); 4 Engineering Runs done until now
- Full Virgo locked before the end of 2004

Virgo Data Production
5 different data streams are produced:
- Raw data: time series containing information from the different sub-systems, recorded in 1 sec long frames. Each file is made of 300 frames (≈ 1.8 GByte size). The data flow is 6 MByte/sec.
- Processed data: h-recon, quality channels. Stored in 1 sec long frames. Expected data flow 0.6 MByte/sec.
- Trend data: slowly acquired information, global information, fast quantities. These are stored in 1 hour long frames. The expected data flow is about 10 kByte/sec.
- 50 Hz data: fast channels at 50 Hz for long-term studies. Data flow 140 kByte/sec.
- Network analysis data: data made available to external collaborations (i.e. LIGO). These data contain environmental data, h-recon, etc. Expected data flow ~ 1 MByte/sec (depending on the agreement on data exchange among the different experiments).
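The quoted flows translate into sizable daily volumes. A quick back-of-the-envelope calculation (stream names and rates taken from the list above; the per-day figures are derived here for orientation only):

```python
# Daily data volumes implied by the data flows quoted above.
RATES_MB_S = {
    "raw": 6.0,
    "processed": 0.6,
    "trend": 0.01,          # ~10 kByte/sec
    "50 Hz": 0.14,          # 140 kByte/sec
    "network analysis": 1.0,
}
SECONDS_PER_DAY = 86400

for stream, rate in RATES_MB_S.items():
    print(f"{stream:16s} {rate * SECONDS_PER_DAY / 1024:7.1f} GByte/day")

total = sum(RATES_MB_S.values()) * SECONDS_PER_DAY / 1024
print(f"{'total':16s} {total:7.1f} GByte/day")  # roughly 0.65 TByte/day
```

Raw data alone fills one 1.8 GByte file every 5 minutes, which is why Cascina keeps a 70 TByte buffer rather than permanent storage.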

Data Transfer & Storage I – Present Situation
Data are transferred via bbftp from Cascina (VIRGO) to CNAF and to Lyon.
- CASCINA: 70 TByte storage (as data buffer for daily activities) + LTO tapes.
- CNAF: nas1, nas2 & nas3 storage; 9.96 TByte full with ER data (from E0 to C3); asked up to 20 TByte for 2004. Transfer performed by the virgo-gateway machine (Dell 1 GHz); data flow 3 MByte/sec.
- LYON: data stored with HPSS (from E0 to C3); data flow 6.4 MByte/sec.

Data Transfer & Storage II – Future Plans
[Diagram: planned transfer chain. At Cascina, an on-line temporary buffer feeds the local storage through SRM clients (C1, C2, C3) backed by a MySQL archive and a bbftp server; data flow via bbftp to the Bologna-CNAF storage, whose own SRM clients, MySQL archive and bbftp server forward them onward to Lyon.]

Book-Keeping Data-Base
Oracle database, generated by SRM Client C3 in Cascina and hosted in Lyon; replicated both in Bologna and in Cascina. One record per file:
- Cascina / Bologna / Lyon information: directory, plus a status flag (1 = yes, 0 = no/deleted, 2 = in transfer)
- File information: name, size, GPS time, DAQ information, event information
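As an illustration of the record structure, a hypothetical sketch in SQLite (the real system is an Oracle database; table, column and file names here are invented for the example, only the per-site status codes come from the slide):

```python
import sqlite3

# One row per file, one status column per site, with the slide's codes:
# 1 = yes (present), 0 = no/deleted, 2 = in transfer.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE bookkeeping (
        name      TEXT PRIMARY KEY,  -- file name
        size      INTEGER,           -- size in bytes
        gps_time  INTEGER,           -- GPS time of the frames
        cascina   INTEGER,           -- status at Cascina
        bologna   INTEGER,           -- status at Bologna
        lyon      INTEGER            -- status at Lyon
    )""")
conn.execute("INSERT INTO bookkeeping VALUES ('raw_file_A', 1800000000, 700000000, 1, 1, 1)")
conn.execute("INSERT INTO bookkeeping VALUES ('raw_file_B', 1800000000, 700000300, 1, 2, 0)")

# Files that could be purged from the Cascina buffer: those already
# present at both permanent storage sites (Bologna and Lyon).
purgeable = conn.execute(
    "SELECT name FROM bookkeeping WHERE bologna = 1 AND lyon = 1").fetchall()
print(purgeable)
```

A replicated status table of this kind is what lets the automatic transfer processes decide, per site, which files still need to be moved.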

II - Off-line Analysis Procedures

Data Analysis Requirements I – Search for bursts & coalescing binary gravitational signals
Bursts: short signals (4–100 ms) of unknown shape, with frequencies between 50 Hz and 6000 Hz and very small amplitude h.
Specific burst-oriented software has been developed:
- Burst Library (BuL): C++ library containing several packages dedicated to the search for burst gravitational waves. BuL is developed on DEC/OSF1 V5.2, Linux/RH 6.1 and Linux/RH 7.2, and all the packages are managed and built using CMT.
- SNAG (Signal and Noise for Gravitational Antennas): MatLab toolbox containing filters to perform burst searches both in the frequency and in the time domain. SNAG is developed on Windows & Linux (to be completed).

Data Analysis Requirements I – Search for bursts & coalescing binary gravitational signals
Preprocessing for burst analysis:
- Whitening: library dedicated to data whitening. There exists a C version (LIB_Whitening, original) and a C++ version (Whitening, interfaced with BuL).
- Ana Batch: C++ framework which provides facilities to extract data from Virgo data files (in Frame format).
- NAP (Noise Analysis Package): C & C++ library containing all the packages dedicated to noise studies and simulations (in development).
Typical duration of jobs:
- 1 hour CPU time for 1 hour of data samples (on a Xeon 1.7 GHz with 1.5 GByte RAM)
- from 1/2 to 1 hour CPU time for 1/2 hour of data samples in MatLab (Windows), depending on the number of templates and on the threshold values
- some algorithms need a machine cluster (matched filtering with ≈ 1000 templates)
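A minimal sketch of the whitening operation these libraries perform before burst filtering, assuming a Welch-style averaged PSD estimate divided out in the frequency domain (the actual LIB_Whitening/Whitening implementations are more sophisticated; the segment count and test signal below are arbitrary):

```python
import numpy as np

def whiten(x, fs, n_seg=8):
    """Flatten the spectrum of x by dividing out an estimated noise ASD."""
    n = len(x)
    seg = n // n_seg
    # crude Welch-style PSD estimate from n_seg non-overlapping segments
    psd = np.zeros(seg // 2 + 1)
    for i in range(n_seg):
        s = x[i * seg:(i + 1) * seg]
        psd += np.abs(np.fft.rfft(s)) ** 2 / (fs * seg)
    psd /= n_seg
    # interpolate the coarse ASD onto the full frequency grid, divide,
    # and transform back to the time domain
    asd = np.interp(np.fft.rfftfreq(n, 1 / fs),
                    np.fft.rfftfreq(seg, 1 / fs), np.sqrt(psd))
    return np.fft.irfft(np.fft.rfft(x) / asd, n)

rng = np.random.default_rng(0)
fs = 4096
colored = np.cumsum(rng.standard_normal(4 * fs))  # strongly "red" noise
w = whiten(colored, fs)                           # roughly flat spectrum
```

Whitening equalizes the noise across frequencies, so that the burst and matched filters downstream can use simple white-noise statistics.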

Data Analysis Requirements I – Search for bursts & coalescing binary gravitational signals
Coalescing binary systems: compact stars (NS/NS, NS/BH, BH/BH) emit a characteristic "chirp" signal. The exact shape of the signal is accurately predictable, but it depends on the two masses of the stars, on their spin rates, and on several relativistic effects.
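The mass dependence enters, at leading (Newtonian) order, mainly through the "chirp mass"; a sketch of the relations templates are built from (spin and the higher-order relativistic corrections mentioned above are ignored here; constants in SI units):

```python
import numpy as np

G = 6.674e-11       # gravitational constant
C = 2.998e8         # speed of light
M_SUN = 1.989e30    # solar mass, kg

def chirp_mass(m1, m2):
    """Chirp mass Mc = (m1*m2)^(3/5) / (m1+m2)^(1/5)."""
    return (m1 * m2) ** 0.6 / (m1 + m2) ** 0.2

def gw_frequency(tau, mc):
    """GW frequency a time tau (s) before coalescence, Newtonian order:
    f = (1/pi) * (5 / (256 tau))^(3/8) * (G Mc / c^3)^(-5/8)."""
    k = G * mc / C ** 3
    return (1.0 / np.pi) * (5.0 / (256.0 * tau)) ** 0.375 * k ** -0.625

mc = chirp_mass(1.4 * M_SUN, 1.4 * M_SUN)
# one second before a NS/NS merger the signal is already at ~10^2 Hz,
# sweeping upward through the detector band
print(gw_frequency(1.0, mc))
```

Because the waveform changes with the (unknown) masses, the search must cover a grid of templates rather than a single filter, which is what drives the computing requirements below.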

Data Analysis Requirements I – Search for bursts & coalescing binary gravitational signals
Coalescing binary systems: matched filtering techniques have been developed, with banks of thousands of filters (average template size 4 MByte):
1. Single frequency band analysis (Flat Search), running with the Merlino framework (written in ANSI C, communication based on MPI, on a Beowulf cluster)
2. Two frequency band analysis (Multi-Band Template Analysis), with the same template grids for all frequency bands
3. Dynamic matched filter techniques (Price Algorithm)
4. Hierarchical strategies using ALE (Adaptive Line Enhancer filters)
High computing power is needed (~300 Gflops for in-time analysis, 3 times more for off-line analysis), together with a distributed framework for parallel computation.
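A toy version of the matched-filter statistic underlying the Flat Search: correlate the (assumed whitened) data stream against each template of a bank and maximize over arrival time and template. Real banks contain thousands of ~4 MByte templates; the short sinusoidal "templates" and the injection below are purely illustrative:

```python
import numpy as np

def matched_filter_snr(data, template):
    """SNR time series via FFT-based correlation (white-noise units)."""
    n = len(data)
    corr = np.fft.irfft(np.fft.rfft(data) * np.conj(np.fft.rfft(template, n)), n)
    return corr / np.sqrt(np.sum(template ** 2))

rng = np.random.default_rng(1)
# hypothetical 3-template "bank" of sinusoids at different frequencies
bank = [np.sin(2 * np.pi * f * np.arange(256) / 256) for f in (8, 16, 32)]
data = rng.standard_normal(4096)
# inject a copy of template 1 with SNR ~ 50 at sample 1000
data[1000:1256] += 50 * bank[1] / np.sqrt(np.sum(bank[1] ** 2))

snrs = [np.max(np.abs(matched_filter_snr(data, t))) for t in bank]
best = int(np.argmax(snrs))
print(best)  # index of the loudest template
```

Each template requires one FFT-length correlation per data stretch, so the cost scales with the bank size; this is the product that adds up to the ~300 Gflops quoted above and motivates the MPI-based parallelization.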

Data Analysis Requirements I – Search for bursts & coalescing binary gravitational signals
Scheme for burst and coalescing binary detection (raw data and selected events go to storage; part of the processing to be moved to Bologna):
h reconstruction @ 20 kHz → lines removal → whitening → decimation/re-sampling → burst filters and C.B. filters → event selection → event storage

Data Analysis Requirements II – Search for periodic gravitational signals
- Periodic gravitational signals are emitted, e.g., by asymmetric rotating neutron stars.
- The amplitude of the signals is very low → long integration times (~ months) are needed.
- A hierarchical strategy has been developed, based on the alternation of "coherent" and "incoherent" steps.
- Large computing resources are needed for the analysis: Tflops range. The larger the computing power we can access, the wider the portion of source parameter space we can explore.
- Low granularity: the analysis method is well suited to a distributed computing environment.
- Two main computing centers, Bologna and Lyon, plus Napoli and Roma.
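An illustration of why the hierarchical strategy alternates coherent and incoherent steps: summing the power spectra of N short stretches incoherently grows a periodic signal linearly in N while noise fluctuations grow only like √N, at a fraction of the cost of one huge coherent FFT over months of data. Frequencies, amplitudes and stretch lengths below are made up for the demonstration:

```python
import numpy as np

rng = np.random.default_rng(2)
fs, seg, n_seg = 256, 256, 64          # 64 coherent 1-second stretches
f0 = 50.0                              # hypothetical source frequency
t = np.arange(seg * n_seg) / fs
x = rng.standard_normal(t.size) + 0.3 * np.sin(2 * np.pi * f0 * t)

power = np.zeros(seg // 2 + 1)
for i in range(n_seg):                 # incoherent sum over coherent FFTs
    s = x[i * seg:(i + 1) * seg]
    power += np.abs(np.fft.rfft(s)) ** 2

peak_bin = int(np.argmax(power[1:])) + 1
print(peak_bin * fs / seg)             # recovered frequency in Hz
```

Because each stretch can be transformed and summed independently, the workload splits naturally into many small jobs; this is the "low granularity" that makes the method a good match for grid computing.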

Periodic-source analysis on the GRID (preliminary):
- Input files: typical size ~1.2 MB for 6 months of data; replicated among the SEs.
- Jobs: ~10^5 jobs sent in 3 months (incoherent steps); typical job duration ~5–10 hours on a 2.4 GHz Xeon processor, depending on the source frequency.
- Candidates: copied back to a local machine for the further steps of the analysis; typical output file size ~200 kB, ~2·10^4 candidates.
- Coherent steps performed locally at the computing centers.

We are carrying out test activities on the data analysis software in two computing environments: local batch systems (PBS) and grid (INFN-Grid). Main activities so far:
- adaptation of the data analysis procedures to work in a distributed environment;
- tests of the "incoherent" part of the analysis pipeline (several software versions) using simulated data (thousands of jobs submitted). Machines used: Roma, Bologna, Napoli (about 30 machines) within INFN-Grid; Lyon (25 processors) as a classic batch system;
- full-scale test of the "coherent" part of the analysis (28 processors for ~3 months, 24 hours/day; farms in Bologna and Roma).
Results: very good scaling of performance with the number of nodes involved (but only small-scale tests done up to now); grid software more and more stable and reliable.

Conclusions
The Virgo experiment will complete its commissioning in 2004.
Data Production: 5 kinds of data will be produced, with data flows from 10 kByte/sec (trend data) up to 6 MByte/sec (raw data); typical raw-data file size 1.8 GByte.
Storage: 2 permanent storage sites, Bologna-CNAF and Lyon, plus Cascina. Automatic processes to transfer data from Cascina to Bologna and from Bologna to Lyon are in development.
Data Analysis: several filters have been developed to search for gravitational waves; all the filtering techniques need high computing power and parallel computation. 4 M.D.C. (productions) performed until now, the next foreseen in June. GRID tests have been performed using the Roma, Bologna and Napoli farms; larger-scale tests will be performed in the next months.
The analysis of scientific data will start in 2005.

Merlino Framework (by Leone B. Bosi)
- Distributed framework for parallel data analysis
- Composed of 4 main processes
- Written in ANSI C; communication based on MPI; running on a Beowulf cluster
- Customization through "plug-in" functions (dynamic libraries) and configurable data flow
- Plug-ins currently used, tested or under development: Matched Filter, Inspiral Generator, Mean Filter, PC, Damped Sine Filter

Next steps (in 2004):
- integration and validation of the whole analysis software;
- larger-scale grid tests (up to ~100 processors and more involved).

GRID deployment, Scenario 1 (by Antonia Ghiselli)
[Diagram: Virgo grid layout. Italian INFN-Grid sites (Cnaf, Roma, Napoli) and the French site at Lyon each expose CE/SE services with GIIS/GRIS information providers; separate MDS trees for Virgo-I and Virgo-F feed a Virgo BDII and Resource Broker.]