What does LOFAR have to do with the Virtual Observatory (VO)?
LOFAR Science Day, 16 December 2003, Melbourne
David Barnes, The University of Melbourne

VO is Standards (1)

data description
– metadata, content tags, context tags
– IVOA: VOTable, UCD WGs (see the sketch below)
data retrieval
– http, ftp, web services, replica catalogs
– IVOA: Data Access Layer, Grid & WS WGs
data models
– structure and relationships in data
– IVOA: Data Modelling WG
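As a concrete illustration of the data-description standards, here is a minimal sketch of a VOTable: a two-column source list whose FIELDs carry UCDs as machine-readable content tags, so any VO tool can recognise the columns as RA and Dec. It uses only the Python standard library; the columns and values are invented for illustration.

import xml.etree.ElementTree as ET

votable = ET.Element("VOTABLE", version="1.0")
table = ET.SubElement(ET.SubElement(votable, "RESOURCE"), "TABLE")

# FIELD metadata: the 'ucd' attribute is the machine-readable content tag
ET.SubElement(table, "FIELD", name="RA", datatype="double",
              unit="deg", ucd="POS_EQ_RA_MAIN")
ET.SubElement(table, "FIELD", name="Dec", datatype="double",
              unit="deg", ucd="POS_EQ_DEC_MAIN")

data = ET.SubElement(ET.SubElement(table, "DATA"), "TABLEDATA")
for ra, dec in [(83.633, 22.014), (83.825, 21.893)]:   # made-up sources
    row = ET.SubElement(data, "TR")
    for value in (ra, dec):
        ET.SubElement(row, "TD").text = str(value)

print(ET.tostring(votable, encoding="unicode"))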

VO is Standards (2)

data and service compilation and indexing
– registries, resource description
– IVOA: Registry WG
finding data and services (see the sketch below)
– VO Query Language, SQL, brokers, agents
– IVOA: VO Query Language WG
workflow and interface
– portals, portlets, authentication, authorisation
– IVOA: no WG yet
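The simplest standard realisation of "finding data" in this era is a cone search: a registry lookup yields a service base URL, and an HTTP GET with the standard RA, DEC and SR parameters returns a VOTable of matching sources. A hedged sketch; the endpoint below is an assumption, not a real service.

from urllib.parse import urlencode
from urllib.request import urlopen

base_url = "http://example.org/cgi-bin/conesearch"   # found via a registry query
params = urlencode({"RA": 83.63, "DEC": 22.01, "SR": 0.5})   # degrees

with urlopen(f"{base_url}?{params}") as response:
    votable_xml = response.read()   # a VOTable, as in the previous sketch
print(votable_xml[:200])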

VO is Grid Computing

local, national, global compute grids
– transparent, secure access to processors
– job submission & monitoring, workflows
local, national, global data grids
– replica catalogues (caching)
– global filesystems
local, national, global networks
– move code to data or data to code (see the sketch below)
– realtime client-server processing, analysis and visualisation
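The "move code to data or data to code" choice is, at bottom, a transfer-time comparison. A toy sketch with illustrative numbers (4 TB dataset, 10 MB pipeline, 1 Gbit/s link); the decision rule and figures are not from the slide.

def transfer_time_s(size_bytes: float, link_bits_per_s: float) -> float:
    return size_bytes * 8 / link_bits_per_s

data_bytes, code_bytes, link = 4e12, 10e6, 1e9   # 1 Gbit/s

move_data = transfer_time_s(data_bytes, link)    # ~8.9 hours
move_code = transfer_time_s(code_bytes, link)    # ~0.08 seconds

print(f"ship data: {move_data / 3600:.1f} h, ship code: {move_code:.2f} s")
print("=> run the code at the data centre" if move_code < move_data
      else "=> fetch the data and run locally")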

VO Utopia (1)

diverse data (e.g. X-ray photon event lists and SPH cosmological simulations)
– found in standard VO registries
– described using standard vocabulary
– tagged with pointers into a standard astronomy data model
– accessed via standard mechanisms, e.g. a SOAP query to a Web Service (see the sketch below)
– provided in standard storage formats, e.g. VOTable, HDF, FITS, …
data sets are interoperable
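The "SOAP query to a Web Service" access mechanism might look like the following sketch: POST a minimal SOAP 1.1 envelope and read back a response (e.g. a VOTable). The endpoint, operation name and namespace are all hypothetical; a real service would publish its interface through the registry.

from urllib.request import Request, urlopen

envelope = """<?xml version="1.0"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <getData xmlns="http://example.org/vo"><!-- hypothetical operation -->
      <target>Crab Nebula</target>
    </getData>
  </soap:Body>
</soap:Envelope>"""

request = Request("http://example.org/vo/service",   # hypothetical endpoint
                  data=envelope.encode("utf-8"),
                  headers={"Content-Type": "text/xml; charset=utf-8",
                           "SOAPAction": "getData"})
with urlopen(request) as response:
    print(response.read()[:200])   # e.g. a VOTable inside the SOAP body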

VO Utopia (2)

diverse tools (e.g. source extractor, volume visualiser, PCA algorithm)
– found in standard VO registries
– described using standard vocabulary
– tagged with pointers into a standard astronomy data model
– accessed via standard mechanisms, e.g. a SOAP query to a Web Service
– read and write standard storage formats, e.g. VOTable, HDF, FITS, …
tools are interoperable
tools AND data are interoperable

VO Utopia (3): LOFAR

naïf: registry search: LOFAR exists! free data is available!
traditionalist: query the LOFAR archive via the VO, download images and analyse at home (see the sketch below)
aspirant: use an on-line VO LOFAR pipeline to recalibrate intermediate data products and re-image with modified visibility weights
braveheart: incorporate any or all LOFAR data products in VO workflows exploiting several other data sets at once
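The traditionalist's archive query could go through the VO Simple Image Access protocol: an HTTP GET with the standard POS and SIZE parameters returns a VOTable listing matching images and their access URLs. The LOFAR service URL below is an assumption (no such service existed at the time); the position is Cas A.

from urllib.parse import urlencode
from urllib.request import urlopen

siap_url = "http://example.org/lofar/siap?"        # hypothetical archive
query = urlencode({"POS": "350.85,58.815",         # RA,Dec in degrees
                   "SIZE": "1.0",                  # search box, degrees
                   "FORMAT": "image/fits"})

with urlopen(siap_url + query) as response:
    votable = response.read()   # VOTable of matching images + access URLs
# parse the VOTable, pick an access reference URL, then download it, e.g.
# urllib.request.urlretrieve(access_url, "lofar_field.fits")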

VO Utopia (4): LOFAR

Grid: this data is popular in France: replicate it at CDS and provide it transparently to UK users
Grid: this data is owned by David and in its proprietary period: you can't have it yet
Grid: you have requested 4 TB of data: click here to upload and run your code at the data centre rather than fetching for days
Telescope: I am just a long-queue data archive: your search for data returned a null result: would you like to queue an observing request based on your search metadata?

LOFAR data products will:
– be large … accurate and standardised metadata will reduce unnecessary downloads and processing
– be heavily qualified … pointers into a global data model will help software and astronomers make valid use of the data; LOFAR should make a substantial contribution to data modelling in the VO
– experience widely varying usage patterns … VO and Grid will provide intelligent caching and data distribution mechanisms

Adopting the VO buys:
– functionality: new and emerging VO tools and services, plus wrapped legacy software, immediately available
– interoperability: all VO tools and services work with and understand LOFAR data to the best of their ability
– archive & data centre: reference implementations of the standards will provide a "data centre in a box"
– a wider user community

Reality…

VO standards are moving targets
VOTable stores metadata and small catalogues; most real data is still stored in legacy formats (FITS, HDF, …)
Tools implementing or supporting VO standards are rare!
Portals, workflows, embracing the Grid paradigm, …, are all in the future

end (unless you care about LOFAR data rates …)

LOFAR data: baseband input & output

total baseband data rate is ~20 Terabit/s
– 13000 antennae, dual pol
– 64 Msamples/s
– 12-bit sampling
total station data rate is ~150 Gigabit/s
– 100 stations
– 16 beams
– 4 Msamples/s (2 MHz bandwidth)
– 6-bit sampling
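These figures can be checked with a few lines of arithmetic. The baseband rate follows directly from the stated numbers; the station rate only reaches ~150 Gigabit/s if the 4 Msample/s streams are dual-polarisation complex samples, a factor of 4 the slide leaves implicit (our assumption).

TBIT, GBIT = 1e12, 1e9

baseband = 13000 * 2 * 64e6 * 12        # antennae x pols x samples/s x bits
print(f"baseband: {baseband / TBIT:.1f} Tbit/s")    # -> 20.0 Tbit/s

station = 100 * 16 * 4e6 * 6 * 2 * 2    # stations x beams x samples/s x bits
                                        #   x 2 pols x 2 (complex: re + im)
print(f"stations: {station / GBIT:.1f} Gbit/s")     # -> 153.6 Gbit/s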

LOFAR data: correlator input & output

central correlator input rate is ~150 Gigabit/s; output rate is ~10 GB per cycle
– 5000 baselines (100 stations)
– 2000 channels (2 MHz bandwidth, 1 kHz resolution)
– 16 beams
– 4 polarisation products
– 32-bit IEEE floats
~36 TB/hr for 1 s samples
~3.6 TB/hr for 10 s samples
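Reconstructing the output figures: assuming each polarisation product is a complex pair of 32-bit floats (8 bytes per visibility, an assumption the slide does not state), the listed counts give ~5 GB per cycle and ~18 TB/hr at 1 s sampling, within a factor of two of the slide's figures; weights, flags and headers would close part of the gap.

GB, TB = 1e9, 1e12

vis = 5000 * 2000 * 16 * 4          # baselines x channels x beams x pols
per_cycle = vis * 8                 # bytes, at 8 bytes per complex visibility
print(f"per cycle: {per_cycle / GB:.2f} GB")    # ~5 GB (slide: ~10 GB)

for dt in (1, 10):                  # integration times in seconds
    print(f"{dt:2d} s samples: {per_cycle * 3600 / dt / TB:.1f} TB/hr")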