The Australian Virtual Observatory: Clusters and Grids
David Barnes, Astrophysics Group

Overview
– What is a Virtual Observatory?
– Scientific motivation
– International scene
– Australian scene
– DataGrids for VOs
– ComputeGrids for VOs
– Sketch of AVO DataGrid and ComputeGrid
– Clustering experience at Swinburne

What is a Virtual Observatory?
A Virtual Observatory (VO) is a distributed, uniform interface to the data archives of the world's major astronomical observatories. A VO is explored with advanced data mining and visualisation tools which exploit the unified interface to enable cross-correlation and combined processing of distributed and diverse datasets. VOs will rely on, and provide motivation for, the development of national and international computational and data grids.

Scientific motivation
Understanding of astrophysical processes depends on multi-wavelength observations and input from theoretical models. As telescopes and instruments grow in complexity, surveys generate massive databases which require increasing expertise to comprehend. Theoretical modelling codes are growing in sophistication and readily consume all available compute time. Major advances in astrophysics will be enabled by transparently cross-matching, cross-correlating and jointly processing otherwise disparate data.
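As a purely illustrative example of cross-matching (not part of any AVO tool), the sketch below pairs a radio catalogue with an optical catalogue by sky position using astropy; the coordinates and the 5-arcsecond match tolerance are invented placeholders.

```python
# Minimal cross-matching sketch: for each radio source, find the nearest
# optical counterpart and accept it if within 5 arcsec. All positions here
# are made-up placeholders.
import numpy as np
import astropy.units as u
from astropy.coordinates import SkyCoord

radio = SkyCoord(ra=[150.1, 150.5] * u.deg, dec=[-30.2, -30.8] * u.deg)
optical = SkyCoord(ra=[150.1001, 150.7, 151.0] * u.deg,
                   dec=[-30.2002, -30.9, -31.0] * u.deg)

idx, sep2d, _ = radio.match_to_catalog_sky(optical)   # nearest-neighbour match
matched = sep2d < 5 * u.arcsec
for i, (j, s, ok) in enumerate(zip(idx, sep2d.arcsec, matched)):
    print(f"radio {i} -> optical {j}: {s:.2f} arcsec, matched={ok}")
```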

[Figure: sample multi-wavelength data for the galaxy IC5332 (Ryan-Weber) – optical blue band, H-alpha spectral line, infrared, HI spectral line column density, HI spectral line velocity field and HI spectral line velocity dispersion.]

[Figure: HI profile from the public data release.]

International scene
– AstroGrid – phase A (1 yr R&D) complete; phase B (3 yr implementation) funded at £3.7M.
– Astrophysical Virtual Observatory – phase A (3 yr R&D) funded at €4.0M.
– National Virtual Observatory (us-vo.org) – 5 yr framework development funded at USD 10M.

Australian scene
Australian Virtual Observatory – phase A (1 yr common-format archive implementation) funded at AUD 260K (2003 LIEF grant [Melb, Syd, ATNF, AAO]). Data archives are:
– HIPASS: 1.4 GHz continuum and HI spectral line survey
– SUMSS: 843 MHz continuum survey
– S4: digital images of the southern sky in five optical filters
– ATCA archive: continuum and spectral line images of the southern sky
– 2dFGRS: optical spectra of >200K southern galaxies
– and more...

DataGrids for VOs
– The archives listed on the previous slide range from ~10 GB to ~10 TB in processed (reduced) size.
– Providing just the processed images and spectra on-line requires a distributed, high-bandwidth network of data servers – that is, a DataGrid.
– Users may want simple operations, such as smoothing or filtering, applied at the data server – this is a Virtual DataGrid (see the sketch below).
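As an illustration of such a server-side operation (not part of any AVO implementation), the following Python sketch smooths a FITS image before it is returned to the user; the file name and the 2-pixel Gaussian width are placeholder assumptions.

```python
# Minimal sketch of a "virtual data" operation: smooth a FITS image on the
# data server instead of shipping the raw pixels. File name and smoothing
# width are illustrative assumptions only.
from astropy.io import fits
from scipy.ndimage import gaussian_filter

def smoothed_image(path, sigma_pix=2.0):
    """Read the primary image HDU and return a Gaussian-smoothed copy."""
    with fits.open(path) as hdul:
        data = hdul[0].data.astype(float)
        header = hdul[0].header
    return fits.PrimaryHDU(gaussian_filter(data, sigma=sigma_pix), header=header)

# e.g. serve the smoothed product rather than the full-resolution original:
# smoothed_image("hipass_field.fits").writeto("hipass_field_sm.fits", overwrite=True)
```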

ComputeGrids for VOs
More complex operations requiring significant processing may also be applied:
– source detection and parameterisation (a minimal sketch follows this slide)
– reprocessing of raw or intermediate data products with new calibration algorithms
– combined processing of raw, intermediate or "final product" data from different archives
These operations require a distributed, high-bandwidth network of computational nodes – that is, a ComputeGrid.
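To make the first of these concrete, here is a minimal, hypothetical sketch of source detection and parameterisation (threshold, label connected pixels, report centroid and summed flux); the 5-sigma threshold and the synthetic test image are assumptions, not the AVO pipeline.

```python
# Minimal sketch of source detection and parameterisation: threshold the image
# at nsigma times the rms, label connected islands of pixels, and report the
# centroid and total flux of each island.
import numpy as np
from scipy import ndimage

def detect_sources(image, nsigma=5.0):
    """Return a list of (y, x, total_flux) for each island above nsigma*rms."""
    rms = np.std(image)                              # crude noise estimate
    labels, nsrc = ndimage.label(image > nsigma * rms)
    centroids = ndimage.center_of_mass(image, labels, range(1, nsrc + 1))
    fluxes = ndimage.sum(image, labels, range(1, nsrc + 1))
    return [(y, x, f) for (y, x), f in zip(centroids, fluxes)]

# Synthetic test: one bright "source" embedded in unit-variance noise.
img = np.random.normal(0.0, 1.0, (64, 64))
img[30:33, 40:43] += 50.0
print(detect_sources(img))
```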

[Diagram: possible initial players in the Australian Virtual Observatory Data and Compute Grids – sites (Melbourne, Adelaide, Canberra, Sydney, Parkes?, Swinburne, ATNF/AAO, APAC, VPAC) contributing data (HIPASS, ATCA, 2dFGRS, SUMSS, RAVE, Gemini?), CPU and theory resources, connected via GrangeNet.]

Swinburne
– 1998–2000: 40 Compaq Alpha workstations
– 2001: +16 Dell dual-PIII rackmount servers
– 2002: +30 Dell dual-P4 workstations
– mid 2002: +60 Dell dual-P4 rackmount servers
– November 2002: placed 180th in the Top500 with 343 sustained Gflop/s (APAC 63rd with 825 Gflop/s)
– +30 Dell dual-P4 rackmount servers installed mid 2002 at the Parkes telescope in NSW
– pseudo-Grid with data pre-processed in real time at the telescope, shipped back in slow time

Swinburne activities
– N-body simulation codes: galaxy formation, stellar disk astrophysics, cosmology
– Pulsar searching and timing (1 GB/min data recording)
– Survey processing as a coarse-grained problem (see the sketch below)
– Rendering of virtual reality content
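Survey processing is coarse-grained because each field or cube can be reduced independently of the others. A minimal Python sketch of that pattern follows, with the file list and the per-cube "reduce" step as placeholders rather than the actual Swinburne codes.

```python
# Minimal sketch of coarse-grained survey processing: independent fields are
# farmed out to worker processes, one per CPU. Inputs and the reduction step
# are placeholders.
from concurrent.futures import ProcessPoolExecutor

def reduce_cube(path):
    """Stand-in for the expensive per-field reduction (calibration, gridding, ...)."""
    return path, "done"

cubes = [f"survey_field_{i:03d}.fits" for i in range(8)]   # hypothetical inputs

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:                     # one worker per CPU
        for path, status in pool.map(reduce_cube, cubes):
            print(path, status)
```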

Clustering costs (estimates incl. on-site warranty; 2nd-fastest CPU; excl. infrastructure)

  node configuration                                price/node   price/cpu
  1 cpu, 256 MB std mem, 20 GB disk, ethernet       1.3K         1.3K
  2 cpu, 1 GB fast mem, 20 GB disk, ethernet        4.4K         2.2K
  2 cpu, 2 GB fast mem, 60 GB SCSI disk, ethernet   8.0K         4.0K
  Giganet, Myrinet, ... interconnect                1.5K         1.5K (1 cpu) / 0.8K (2 cpu)

Some ideas...
– Desktop cluster: the astro group has 6 dual-cpu workstations.
  – Add MPI, PVM and Nimrod libraries and the Ganglia monitoring tool to get a 12-cpu loose cluster with 8 GB of memory (see the sketch below).
  – Use MOSIX to provide transparent job migration, with workstations joining the cluster at night-time.
– Pre-purchase cluster: the university buys ~500 desktops/yr – use them for ~6 months!
  – Build up a cluster of desktops purchased ahead of demand, and replace them as they are deployed to desks.
  – Gain the compute power of new CPUs without any real effect on end-users.
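As a sanity check on the first idea, a minimal sketch (assuming mpi4py is installed on the six workstations) has every MPI rank report its host, confirming that the 12-cpu loose cluster is visible to MPI jobs; the launch command, machinefile and script name are hypothetical.

```python
# Minimal MPI check for the "loose cluster" idea: each rank reports its host.
# Launch with something like:  mpirun -np 12 -machinefile hosts python check.py
import socket
from mpi4py import MPI

comm = MPI.COMM_WORLD
hosts = comm.gather(socket.gethostname(), root=0)   # collect hostnames on rank 0

if comm.Get_rank() == 0:
    print(f"{comm.Get_size()} MPI processes across {len(set(hosts))} workstations")
    for r, h in enumerate(hosts):
        print(f"  rank {r}: {h}")
```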