EVLA Transition to Science Operations: An Overview EVLA Advisory Committee Meeting, March 19-20, 2009 Bob Dickman AD, NM Operations.


The EVLA is an Unusual Project
– More than a decade in duration
– The EVLA will produce a tenfold increase in capabilities while keeping the old instrument operational and supporting users, to maintain the user base:
  – >60% of hours kept available to the community
  – EVLA-converted antennas must be usable with the VLA correlator (hardware and software implications); the cost of this decision was built into the EVLA budget
– The EVLA will be delivered in an operational state that offers massively enhanced capabilities to the user community relative to the VLA: the commissioning of basic functionality must be carried out prior to delivery of the instrument
– Our mission is to deliver science opportunity to the community as rapidly as is practical:
  – To facilitate this, we will take advantage of synergies with the Observatory’s other telescopes, especially ALMA
  – Users will see the EVLA as one instrument within a single, unified Observatory

What Will Be Delivered?
The EVLA will be delivered four years hence (1/1/13) with:
– All hardware complete
– Support for essential correlator capabilities, sufficient to serve the vast majority of EVLA users (see Rick Perley’s talk on science use cases):
  – For continuum observations: 8 GHz bandwidth, dual polarization
  – For spectral line observations: up to 64 separately steerable sub-bands, with adjustable frequency resolution
– Archived raw visibilities and calibration tables, with reference images compatible with the VAO for projects carried out in standard modes
– NRAO data reduction package(s) available to the community that are capable of supporting the analysis of data obtained with the completed EVLA
– User support that leverages ALMA as much as possible
More specialized capabilities (e.g., production of noise-limited, low-frequency, wide-band images; phased array; pulsar gating; radar modes) will be added as resources permit.
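As a back-of-the-envelope check on the spectral-line numbers above, the frequency resolution follows from dividing the total bandwidth among sub-bands and channels. The channel counts below are illustrative assumptions, not actual WIDAR configurations:

```python
# Illustrative arithmetic only: channels_per_subband values are assumed,
# not taken from the WIDAR configuration tables.

def subband_resolution(total_bw_hz, n_subbands, channels_per_subband):
    """Frequency resolution (Hz) if the total bandwidth is split evenly."""
    subband_bw = total_bw_hz / n_subbands
    return subband_bw / channels_per_subband

# 8 GHz total bandwidth across 64 sub-bands, 64 channels each (assumed):
res = subband_resolution(8e9, 64, 64)
print(res / 1e6, "MHz per channel")  # 1.953125 MHz
```

Trading sub-bands for channels (the "adjustable frequency resolution" above) just moves factors between the two divisors.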

EVLA Commissioning: Some High-level Milestones
[Original slide: a year-by-year timeline chart; the specific years are not recoverable from this transcript. Recoverable milestones:]
– WIDAR operational; VLA correlator construction turned off, then completed
– Array configurations cycle through D, C, B, A over the period, with the number of EVLA antennas growing throughout
– WIDAR bandwidth released to the community at large in stages: 256 MHz, then 2 GHz, then 8 GHz
– New receivers completed, in order: K, Q; Ka; C; {L, S, X, Ku}
– Reference image pipeline: narrow-band reference images

Integrating the EVLA’s Transition to Full Science Operations
Areas to be coordinated:
– EVLA construction
– NM management and operations
– EVLA user support and integration
– Post-processing: software packages, software algorithms, and hardware
I shall summarize the status of these areas; details for those not already covered will be provided in the talks that follow.

EVLA Construction and Commissioning
Complete delivery, verify, and integrate:
– WIDAR
– Receivers
– Antennas
– Backend electronics
– Software (M&C, SSS, post-processing)
Here, “verify and integrate” includes:
– Test and diagnose problems with hardware, software, and electronics
– Certify that EVLA performance meets the specifications required in the Project Book
– Test and debug the WIDAR correlator
– Test and certify supporting functional software (user support, calibration, etc.)
– Test and certify the first supported scientific observing modes
– Test and certify science support software
Plenty of challenges remain, but the project is well in hand.

NM Operations and Management
– Ensure that the scientific productivity of the EVLA is optimized, and place the array in the hands of the user community as rapidly as possible
– Allocate the additional resources available and necessary to the EVLA construction project to deliver the instrument on time and on budget
– Science operations is a central aspect (see Claire’s talk)
– Maintain the operational infrastructure of the EVLA

Post-processing Computing: Software Packages
CASA was recently restructured (FY09):
– Now managed jointly by ALMA and EVLA/NM Operations
– Enhanced opportunities for synergy
– CASA capabilities will support WIDAR capabilities as they are released to the OSRO community
– AIPS will also be able to handle all but the most complex data sets
While many users will use the EVLA at its higher frequencies (Q, Ka, and K bands), where fractional bandwidths are relatively small, there will also be strong pressure to address the special opportunities offered by data at the lower frequencies:
– This is new territory: low-frequency, high-dynamic-range, wide-band, wide-field imaging (see the talks by Sanjay Bhatnagar and Urvashi Rau)
– Development of the specialized algorithms required to take full advantage of the EVLA’s low-frequency capabilities is being accelerated by an Algorithm Development Working Group led by Gareth Hunt and Frazer Owen (see my talk following Sanjay’s)

Post-Processing in the EVLA Era
– EVLA data rates and data sets will be much larger than their VLA counterparts
– Archive disk space will be expanded to keep pace
– Bandwidths from the EVLA site to the DSOC, and from the DSOC to Internet2, will be increased to 1 Gbps; if necessary, we will physically ship the largest EVLA data sets to users on media
– With the advent of the EVLA, the current mutual compatibility between data set size, data analysis programs (CASA, AIPS, etc.), and the typical computing platforms available to the astronomer (laptops, multiprocessor desktops, etc.) will be seriously challenged
– NRAO is exploring the extent to which parallel processing can be used to expedite the analysis of larger EVLA data sets:
  – Part of the EVLA project’s deliverables
  – Exploiting synergies with ALMA
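The scale of the problem can be sketched with a simple raw-visibility data-rate estimate. Every parameter below other than the 27-antenna array size is an illustrative assumption, not an official EVLA specification:

```python
# Rough visibility data-rate estimate. Channel count, dump time, and
# bytes per visibility (one complex number as two 4-byte floats) are
# assumed values for illustration only.

def vis_rate_bytes_per_s(n_ant, n_chan, n_pol=4, dump_time_s=1.0,
                         bytes_per_vis=8):
    """Bytes/s of raw visibilities at a given correlator dump cadence."""
    n_baselines = n_ant * (n_ant - 1) // 2
    return n_baselines * n_chan * n_pol * bytes_per_vis / dump_time_s

# 27 antennas, 16384 channels, full polarization, 1 s dumps (assumed):
rate = vis_rate_bytes_per_s(27, 16384)
print(rate / 1e6, "MB/s")
```

Even these modest assumptions give rates of order a hundred MB/s, which is what drives the archive, network, and shipping-on-media plans above.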

Post-Processing Hardware
Tests have begun with a small prototype computing cluster, purchased jointly with ALMA funds last autumn and now in hand, to define how best to optimize the science pipeline for our users:
– Tests are underway with a parallelized version of CASA to determine the optimum cluster architecture for WIDAR data sets (e.g., number of nodes, how to overcome I/O bottlenecks, memory requirements; see the Bhatnagar/Rau talk)
The results of this testing program will:
– Help define high-end post-processing computing requirements for the EVLA user base
– Guide the purchase of the “final” post-processing cluster included in the EVLA construction budget
– Determine the optimum use of this cluster based on what we find in the testing program:
  – Remote access for multiple users?
  – Users travel to NM to use it?
  – Other, more specialized applications?
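One natural parallelization being explored is per-sub-band processing, since WIDAR sub-bands can largely be calibrated independently before being combined. This is a minimal sketch of that pattern, not the CASA implementation; `calibrate_subband` is a hypothetical placeholder for the real per-sub-band work:

```python
# Sketch of the embarrassingly-parallel pattern under test: handle each
# WIDAR sub-band on its own worker. calibrate_subband is a hypothetical
# stand-in, not a CASA API call.
from concurrent.futures import ThreadPoolExecutor

def calibrate_subband(subband_id):
    # Placeholder for per-sub-band calibration/imaging work.
    return subband_id, "done"

def process_all(n_subbands, n_workers=8):
    """Fan the sub-bands out across a worker pool and collect results."""
    with ThreadPoolExecutor(max_workers=n_workers) as ex:
        return dict(ex.map(calibrate_subband, range(n_subbands)))

results = process_all(64)
print(len(results), "sub-bands processed")
```

In practice the cluster tests must also answer how such workers share storage, which is why I/O bottlenecks appear alongside node count in the list above.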

Questions?