FutureGrid Overview. The goal of FutureGrid is to support the research that will invent the future.




The goal of FutureGrid is to support the research that will invent the future of distributed, grid, and cloud computing. FutureGrid will build a robustly managed simulation environment, or testbed, to support the development and early scientific use of new technologies at all levels of the software stack: from networking to middleware to scientific applications. The environment will mimic TeraGrid and/or general parallel and distributed systems. This testbed will enable dramatic advances in science and engineering through the collaborative evolution of science applications and related software.

FutureGrid Network

FutureGrid Hardware

FutureGrid Partners
– Indiana University
– Purdue University
– San Diego Supercomputer Center at University of California San Diego
– University of Chicago/Argonne National Labs
– University of Florida
– University of Southern California Information Sciences Institute
– University of Tennessee Knoxville
– University of Texas at Austin/Texas Advanced Computing Center
– University of Virginia
– Center for Information Services and GWT-TUD from Technische Universität Dresden

Other Important Collaborators
– Early users from an application and computer science perspective, from both research and education
– Grid5000/Aladin and D-Grid in Europe
– Commercial partners such as Eucalyptus and Microsoft (Dryad + Azure); note Azure is external to FutureGrid, like GPU systems. We should identify other partners – should we have a formal Corporate Partners program?
– TeraGrid
– Open Grid Forum
– NSF

FutureGrid Architecture

FutureGrid Architecture
– An open architecture allows resources to be configured based on images
– Shared images allow similar experiment environments to be created
– Experiment management allows reproducible activities to be managed
– Through our “stratosphere” design, we allow different clouds and images to be “rained” onto hardware
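The provisioning idea above can be sketched as a toy model. This is a minimal illustration of the concept, not FutureGrid's actual RAIN interface; every class and method name here (`Rain`, `Image`, `provision`, etc.) is hypothetical. The point it shows: a shared image registry plus a single provision step lets the same image be deployed ("rained") onto bare-metal and cloud nodes alike, which is what makes experiments reproducible across environments.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch of "raining" a shared image onto heterogeneous
# resources. Names are illustrative only, not a real FutureGrid API.

@dataclass(frozen=True)
class Image:
    name: str
    stack: tuple  # software layers baked into the image

@dataclass
class Node:
    hostname: str
    target: str                    # e.g. "bare-metal", "eucalyptus", "nimbus"
    image: Optional[Image] = None  # image currently deployed on this node

class Rain:
    def __init__(self):
        self.registry = {}  # shared image registry: name -> Image

    def register(self, image: Image):
        self.registry[image.name] = image

    def provision(self, image_name: str, nodes):
        """Deploy one registered image onto every node, whatever its target."""
        image = self.registry[image_name]
        for node in nodes:
            node.image = image
        return nodes

rain = Rain()
rain.register(Image("hadoop-0.20", stack=("linux", "java", "hadoop")))

# One bare-metal node and one cloud node receive the identical environment.
cluster = [Node("i01", "bare-metal"), Node("c01", "eucalyptus")]
rain.provision("hadoop-0.20", cluster)

assert cluster[0].image == cluster[1].image
```

The design choice this models is that the image, not the resource, defines the experiment environment: rerunning an experiment means re-provisioning the same registered image, on whatever hardware is available.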

FutureGrid Usage Scenarios
– Developers of end-user applications who want to develop new applications in cloud or grid environments, including analogs of commercial cloud environments such as Amazon or Google. (Is a Science Cloud for me?)
– Developers of end-user applications who want to experiment with multiple hardware environments.
– Grid middleware developers who want to evaluate new versions of middleware or new systems.
– Networking researchers who want to test and compare different networking solutions in support of grid and cloud applications and middleware. (Some types of networking research will likely best be done through the GENI program.)
– Interest in performance requires that bare-metal access be available.

Selected FutureGrid Timeline
– October 2009: Project starts
– November 2009: SC09 demo; face-to-face committee meetings
– March 2010: FutureGrid network complete
– March 2010: FutureGrid annual meeting
– September 2010: All hardware (except Track IIC lookalike) accepted
– October 2010: FutureGrid allocatable via the TeraGrid process; for the first two years, allocation by a user/science board led by Andrew Grimshaw