Forward Observer In-Flight Dual Copy System
Richard Knepper, Matthew Standish
NASA Operation IceBridge Field Support
Research Technologies, Indiana University
April 3, 2013

Overview
- Project Overview
- Workflow
- Requirements and Constraints
- In Flight
- Proposed Improvements
- Further Applications

Project History: IU/CReSIS Partnership
- Airborne synthetic aperture radar systems
- NSF PolarGrid project
- Operation IceBridge, 2009
- NSF Science & Technology Center grant for CReSIS
- Operation IceBridge: Multichannel Radar Depth Sounder, Snow Band, Ku Band
- KU does radar well, IU does data well

Workflow (original)
- Radar systems on the aircraft connect to a machine running LabVIEW
- After the flight, drives are unloaded to the ground lab
- Backup/copy operations
- MATLAB processing of radar data
- Final processing on IU's Quarry cluster

Issues:
- Delays returning results to the data processing team
- Overnight turnaround
- Physical drive management

System Requirements / Constraints
- Intake of data at rates that increase every 6 months
- Multiple sources: 3 or 4 instruments
- File consistency and security throughout
- Multiple copies
- Ability to process data quickly
- Staffing: do we want to send an "IT guy" on 2 or more missions a year?
- Ideal: archive and process data while in flight, with a system simple enough for the radar/data processing team to use
- Open questions: FPGAs? SSDs? Vibration issues?

Forward Observer System
Replace the radar storage array with a networked system:
- 40 Gb InfiniBand transport infrastructure
- 3 servers with 24 SSDs each:
  - Head: Windows share presented to the radar systems
  - Science: MATLAB processing
  - Archive: checksum and copy to vibration-mounted mechanical drives for cycling data out to ground processing
- Monitoring/management server
Iteration 2: no mechanical drives; process management allows processing during collection. (A sketch of the checksum-and-copy step follows this slide.)
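
The Archive role above amounts to a verify-after-copy loop. Here is a minimal sketch of that idea in Python, assuming a simple two-destination layout; the paths, file names, and SHA-256 as the checksum are illustrative assumptions, not details from the talk.

```python
import hashlib
import shutil
from pathlib import Path

CHUNK = 1 << 20  # read files in 1 MiB blocks

def sha256_of(path: Path) -> str:
    """Stream a file through SHA-256 without loading it all into memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for block in iter(lambda: f.read(CHUNK), b""):
            digest.update(block)
    return digest.hexdigest()

def dual_copy(src: Path, dest_dirs: list[Path]) -> str:
    """Copy src into every destination directory, verifying each copy."""
    want = sha256_of(src)
    for dest_dir in dest_dirs:
        dest_dir.mkdir(parents=True, exist_ok=True)
        copy = Path(shutil.copy2(src, dest_dir))
        if sha256_of(copy) != want:
            raise IOError(f"checksum mismatch on {copy}")
    return want

if __name__ == "__main__":
    # Hypothetical layout: SSD archive plus the vibration-mounted drives.
    print(dual_copy(Path("/data/incoming/radar_0001.dat"),
                    [Path("/archive/ssd"), Path("/archive/mechanical")]))
```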

Benefits from a computational science system in the plane
- Better data assurance across multiple copies
- Possible to monitor data rates from the radar computers more closely
- Possible to process in flight
- Keeps the data processing and radar teams in sync
- Significant improvement in usability
- Faster storage and processing (for some tasks) than the systems at IU and KU

Monitoring
- Storage utilization
- File counts
- Current reads/writes
- Status of processing queues
- Environmental status of servers
- Error tracking
(A metrics-collection sketch follows this slide.)
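
A dashboard like this could gather its numbers with the psutil library; below is a minimal, hedged sketch of one snapshot. The /data mount point, queue spool, and error-log path are hypothetical stand-ins, not the project's actual layout, and environmental sensor readings (platform-specific) are omitted.

```python
from pathlib import Path

import psutil

DATA_MOUNT = "/data"                         # assumed SSD-array mount point
QUEUE_DIR = Path("/data/queue")              # assumed processing-queue spool
ERROR_LOG = Path("/var/log/fo-errors.log")   # assumed error log

def collect_status() -> dict:
    """Gather one snapshot of the metrics listed on the slide."""
    usage = psutil.disk_usage(DATA_MOUNT)    # storage utilization
    io = psutil.disk_io_counters()           # cumulative reads/writes
    files = sum(1 for p in Path(DATA_MOUNT).rglob("*") if p.is_file())
    queued = len(list(QUEUE_DIR.glob("*.job"))) if QUEUE_DIR.exists() else 0
    return {
        "percent_used": usage.percent,
        "read_bytes": io.read_bytes,
        "write_bytes": io.write_bytes,
        "file_count": files,
        "queued_jobs": queued,
        "error_log_bytes": ERROR_LOG.stat().st_size if ERROR_LOG.exists() else 0,
    }

if __name__ == "__main__":
    for name, value in collect_status().items():
        print(f"{name}: {value}")
```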

Monitoring (continued): what the radar team sees
- Radar status
- GPS info
- Results of "Quick-Look" MATLAB processing to show the ice bed

Future Improvements
- Improved drive management: handling 24 SSDs at a time for sync/backup (a parallel-sync sketch follows this slide)
- Better management of MATLAB processing
- Workflow documentation and automation
- End goal: remove the "IT guy" and make the system more manageable
- Apply to new instruments and new platforms; provide data and computational capability in about 10 RU of space on a single 7500 VA UPS
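
Syncing 24 SSDs one after another is the slow path; fanning one copy process out per drive is the natural fix. A minimal sketch, assuming rsync is available and the drives sit at hypothetical /mnt/ssd00 through /mnt/ssd23 mount points:

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

SOURCE = "/data/flight/"                           # assumed flight-data directory
DRIVES = [f"/mnt/ssd{i:02d}" for i in range(24)]   # assumed mount points

def sync_one(drive: str) -> tuple[str, int]:
    """Run one rsync against a single drive and report its exit status."""
    result = subprocess.run(
        ["rsync", "-a", "--checksum", SOURCE, drive],
        capture_output=True, text=True,
    )
    return drive, result.returncode

# Eight workers at a time keeps the source array busy without thrashing it;
# the right width depends on the controller and would need measurement.
with ThreadPoolExecutor(max_workers=8) as pool:
    for drive, status in pool.map(sync_one, DRIVES):
        print(f"{drive}: {'ok' if status == 0 else 'FAILED'}")
```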

Questions?
Work supported by:
- NASA Operation IceBridge
- NSF STC for CReSIS award
- NSF PolarGrid MRI award
- IU Pervasive Technology Institute (Lilly Endowment)
Thanks!