CReSIS, Lawrence, Kansas, February 12, 2009. Geoffrey Fox (PI): Computer Science, Informatics, Physics; Chair, Informatics Department; Director, Digital Science Center.

Presentation transcript:

CReSIS, Lawrence, Kansas, February 12, 2009
Geoffrey Fox (PI), Computer Science, Informatics, Physics; Chair, Informatics Department; Director, Digital Science Center and Community Grids Laboratory, Indiana University, Bloomington, IN
Linda Hayden (co-PI), ECSU
PolarGrid

Support CReSIS with Cyberinfrastructure
- Base and field camps for Arctic and Antarctic expeditions
- Training and education resources at ECSU
- Collaboration technology at ECSU
- Lower-48 system at Indiana University and ECSU to support offline data analysis and large-scale simulations (next stage)
- Initially a modest system at IU/ECSU for data analysis

Cyberinfrastructure Center for Polar Science (CICPS)

PolarGrid Greenland 2008
Base System (Ilulissat, Airborne Radar)
- 8U, 64-core cluster; 48TB external fibre-channel array
- Laptops (one-off processing and image manipulation)
- 2TB MyBook tertiary storage
- Total data acquisition 12TB (plus 2 backup copies)
- Satellite transceiver available if needed, but the wired network at the airport was used for sending data back to IU
Base System (NEEM, Surface Radar, Remote Deployment)
- 2U, 8-core system using internal hot-swap hard drives for data backup
- 4.5TB total data acquisition (plus 2 backup copies)
- Satellite transceiver used for sending data back to IU
- Laptops (one-off processing and image manipulation)
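The slides specify two backup copies of every acquisition but not how the copies are made or verified. A minimal sketch of a checksummed replication step, with hypothetical paths and no claim about the actual field procedure:

```python
import hashlib
import shutil
from pathlib import Path

def checksum(path, algo="md5", chunk=1 << 20):
    """Stream a file through a hash so multi-GB radar files fit in memory."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def replicate(source, backup_dirs):
    """Copy one acquisition file to each backup target, verifying every
    copy against the source checksum. Returns the checksum on success."""
    src = Path(source)
    expected = checksum(src)
    for d in backup_dirs:
        dest = Path(d) / src.name
        dest.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dest)
        if checksum(dest) != expected:
            raise IOError(f"checksum mismatch on {dest}")
    return expected
```

Verifying each copy against the source hash would catch silent corruption on the hot-swap or MyBook drives before the originals leave the field.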

Cyberinfrastructure Center for Polar Science (CICPS): PolarGrid goes to Greenland

NEEM 2008 Base Station

PolarGrid Antarctic 2008/2009
Base System (Thwaites Glacier, Surface Radar)
- 2U, 8-core system using internal hot-swap hard drives for data backup
- 11TB total data acquisition (plus 2 backup copies)
- Satellite transceiver used for sending data back to IU
- Laptops (one-off processing and image manipulation)
IU-funded system administrators
- 1 admin, Greenland NEEM
- 1 admin, Greenland 2009 (March 2009)
- 1 admin, Antarctica 2009/2010 (Nov 09 – Feb 2010)
Note that the IU effort is a collaboration between the research group and University Information Technology support groups

ECSU and PolarGrid
Initially:
- A base-camp 64-core cluster, allowing near real-time analysis of radar data by the polar field teams
- An educational videoconferencing Grid to support educational activities
- A PolarGrid laboratory for students
ECSU supports PolarGrid Cyberinfrastructure in the field: Assistant Professor Eric Akers and graduate student Je'aime Powell from ECSU travel to Greenland

PolarGrid Lab
- Operating systems: Mac OS X, Ubuntu Linux, Windows XP
- Public IP accessible through the ECSU firewall
- Additional software: desktop publishing, word processing, web design, programming, mathematical applications, Geographic Information Systems (GIS)

Experience from Supporting Expeditions I
- Base processing (NEEM 2008): 600GB – 1TB on 8 cores takes ~8–12 hours
- Expeditions are data-collection intensive, with the goal of pre-processing for validation of each day's data gathering within 24 hours
- Laptops used for one-off pre-processing and image manipulation/visualization
- Heavy use of MATLAB for all processing (both pre-processing and full processing)
- CReSIS is using the PolarGrid base cluster for full data processing of all data collected so far
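The figures above imply a rough throughput budget. A back-of-envelope check that the 24-hour validation goal is attainable on the 8-core base system (rates derived from the slide's numbers; the 600GB day is a hypothetical example):

```python
def hours_to_validate(daily_gb, gb_per_core_hour, cores):
    """Estimated wall-clock hours to pre-process one day's acquisition,
    assuming the work parallelizes evenly across cores."""
    return daily_gb / (gb_per_core_hour * cores)

# Slide figures at the slow end: ~1TB (1000GB) on 8 cores in ~12 hours,
# i.e. roughly 1000 / (8 * 12) ~= 10.4GB per core-hour.
gb_per_core_hour = 1000 / (8 * 12)

# A hypothetical 600GB collection day on the 8-core base system:
hours = hours_to_validate(600, gb_per_core_hour, 8)  # well under 24 hours
```

Even at the pessimistic processing rate, a heavy collection day fits comfortably inside the 24-hour validation window.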

Experience from Supporting Expeditions II
- Lessons from field use include the necessity of smaller computing engines due to size, weight, and power limitations
- Greenland 2008 successes demonstrated the importance of PolarGrid equipment: CReSIS is now using PolarGrid gear to store and process data from 2 additional radar systems
- A smaller system footprint and simpler data management have driven cost per system down; complex storage environments are not practical in a mobile data-processing environment
- Pre-processing data in the field has allowed validation of data acquisition during the collection phases

Field Results – 2008/09
"Without on-site processing enabled by POLARGRID, we would not have identified aircraft inverter-generated RFI. This capability allowed us to replace these 'noisy' components with better-quality inverters, incorporating CReSIS-developed shielding, to solve the problem mid-way through the field experiment."
Jakobshavn 2008, NEEM 2008, GAMBIT 2008/09
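The slides do not say how the inverter RFI was spotted beyond "on-site processing". A hedged NumPy sketch of the kind of narrowband check that could flag such interference in a radar trace (the threshold and the injected test tone are illustrative, not CReSIS's actual method):

```python
import numpy as np

def flag_rfi(trace, fs, threshold=10.0):
    """Return the frequencies (Hz) of spectral bins whose power exceeds
    `threshold` times the median bin power -- a crude narrowband-RFI flag
    standing in for the visual inspection field teams would perform."""
    spectrum = np.abs(np.fft.rfft(trace)) ** 2
    freqs = np.fft.rfftfreq(trace.size, d=1.0 / fs)
    return freqs[spectrum > threshold * np.median(spectrum)]

# Illustrative check: unit-variance noise plus a strong 60kHz tone,
# sampled at 1MHz (an integer number of cycles, so no spectral leakage).
rng = np.random.default_rng(0)
n, fs = 5000, 1e6
t = np.arange(n) / fs
trace = rng.normal(size=n) + 20 * np.sin(2 * np.pi * 60e3 * t)
flagged = flag_rfi(trace, fs)
```

A tone injected well above the noise floor is flagged at its frequency; a clean trace yields few or no flags.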

TeraGrid High Performance Computing Systems: map of computational resources at SDSC, TACC, NCSA, ORNL, PU, IU, PSC, NCAR, Tennessee, LONI/LSU, and UC/ANL (sizes approximate, not to scale; 504TF, growing to ~1PF in 2008). Slide courtesy Tommy Minyard, TACC.

Future Features of PolarGrid
- PolarGrid will give all of CReSIS access to TeraGrid to help with large-scale computing
- PolarGrid will support the CyberInfrastructure Center for Polar Science (CICPS) concept, i.e. the national distributed collaboration to understand ice-sheet science
- Cyberinfrastructure levels the playing field in research and learning: students and faculty can contribute based on interest and ability, not on affiliation
- PolarGrid will be configured as a cloud for ease of use; virtual machine technology is especially helpful for education
- The PolarGrid portal will use Web 2.0-style tools to support collaboration
- ECSU can use PolarGrid to enhance local facilities and Internet2 connectivity