PolarGrid and FutureGrid


PolarGrid and FutureGrid
State IT Conference Panel, October 1, 2009, Indianapolis
Geoffrey Fox gcf@indiana.edu www.infomall.org
School of Informatics and Computing and Community Grids Laboratory, Digital Science Center, Pervasive Technology Institute, Indiana University

FutureGrid
FutureGrid is part of TeraGrid, NSF's national network of supercomputers, and is aimed at providing a distributed testbed of ~9 clusters for both application and computer scientists exploring Clouds, Grids, Multicore, and architecture diversity.
The testbed is enabled by virtual machine technology, including virtual networks.
Dedicated network connections allow experiments to be isolated.
It has a modest number of cores (5000) but will be relatively large as a Science Cloud.
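The isolation described above comes from giving each experiment its own virtual machines and virtual network. As a minimal sketch of that idea, assuming a libvirt-managed hypervisor rather than FutureGrid's actual provisioning stack, the libvirt Python bindings can create an isolated virtual network for experiment VMs to attach to; the network name and address range below are placeholders chosen for illustration.

```python
import libvirt

# An isolated network: with no <forward> element, guests on this network can
# reach each other and the host, but have no route to the outside world.
NETWORK_XML = """
<network>
  <name>experiment-net</name>
  <bridge name='virbr-exp' stp='on' delay='0'/>
  <ip address='192.168.100.1' netmask='255.255.255.0'>
    <dhcp>
      <range start='192.168.100.10' end='192.168.100.50'/>
    </dhcp>
  </ip>
</network>
"""

def create_isolated_network():
    conn = libvirt.open("qemu:///system")      # connect to the local hypervisor
    try:
        net = conn.networkCreateXML(NETWORK_XML)  # transient network, started immediately
        print("Created isolated network:", net.name())
    finally:
        conn.close()

if __name__ == "__main__":
    create_isolated_network()
```

On a real testbed this step would be handled by the provisioning middleware; the point is only that per-experiment virtual networks give each experiment its own address space, so one experiment cannot disturb another.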

Add a 768-core Windows Server cluster at IU and a Network Fault Generator

Pairwise Clustering of 30,000 Points on Tempest
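Pairwise clustering works from a matrix of point-to-point dissimilarities rather than raw coordinates, which is what makes it useful for data (such as biological sequences) where only distances are available. The sketch below illustrates that idea with SciPy's agglomerative clustering on synthetic data; it is not the parallel code run on Tempest, and the point count and cluster count are placeholder assumptions.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster

# Placeholder data: random 2-D points stand in for the slide's 30,000-point
# dataset, with fewer points so the O(N^2) distance matrix stays small.
rng = np.random.default_rng(seed=0)
points = rng.normal(size=(3000, 2))

# Pairwise clustering is driven entirely by the dissimilarity matrix.
distances = pdist(points, metric="euclidean")        # condensed pairwise distances
tree = linkage(distances, method="average")          # agglomerative merge tree
labels = fcluster(tree, t=10, criterion="maxclust")  # cut into 10 clusters (assumed count)

print("cluster sizes:", np.bincount(labels)[1:])
```

The pairwise formulation is also what makes the problem demanding at scale: 30,000 points imply roughly 9x10^8 pairwise distances, about 7 GB in double precision, which is why a run of this size was done on a parallel cluster such as Tempest rather than a single workstation.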