Student Visits, August 26, 2009 – Geoffrey Fox


e-moreorlessanything

– "e-Science is about global collaboration in key areas of science, and the next generation of infrastructure that will enable it." – John Taylor, inventor of the term and Director General of Research Councils UK, Office of Science and Technology
– e-Science is about developing tools and technologies that allow scientists to do "faster, better or different" research
– Similarly, e-Business captures the emerging view of corporations as dynamic virtual organizations linking employees, customers and stakeholders across the world
– This generalizes to e-moreorlessanything, including e-PolarGrid, e-Bioinformatics, e-HavingFun and e-Education
– A deluge of data of unprecedented and inevitable size must be managed and understood
– People (virtual organizations), computers, and data (including sensors and instruments) must be linked via hardware and software networks

What is Cyberinfrastructure?

– Cyberinfrastructure is (from NSF) infrastructure that supports distributed research and learning (e-Science, e-Research, e-Education); it links data, people, and computers
– It exploits Internet technology (Web 2.0 and Clouds), adding (via Grid technology) management, security, supercomputers, etc.
– It has two aspects: parallel – low latency (microseconds) between nodes – and distributed – higher latency (milliseconds) between nodes
– The parallel aspect is needed to get high performance on individual large simulations and data analyses; the problem must be decomposed
– The distributed aspect integrates already distinct components – especially natural for data (as in biology databases)
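The "must decompose problem" point above can be made concrete with a minimal sketch, using only the Python standard library (the function names are illustrative, not from any project mentioned in the talk): a large sum is split into chunks, each chunk is computed by a separate worker process, and the partial results are combined.

```python
from multiprocessing import Pool

def partial_sum(bounds):
    """Each worker independently sums its own slice of the range."""
    lo, hi = bounds
    return sum(range(lo, hi))

def parallel_sum(n, workers=4):
    """Decompose [0, n) into one chunk per worker, then combine the results."""
    step = n // workers
    chunks = [(i * step, (i + 1) * step if i < workers - 1 else n)
              for i in range(workers)]
    with Pool(workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    # The decomposed computation must agree with the serial one.
    assert parallel_sum(1000) == sum(range(1000))
```

The same decompose/compute/combine shape underlies real parallel simulations and data analyses; only the per-chunk work and the combination step change.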

Relevance of Web 2.0 to Academia

– Web 2.0 can help e-Research in many ways
– Its tools (web sites) can enhance scientific collaboration, i.e. effectively support virtual organizations, in different ways from grids
– The popularity of Web 2.0 provides high-quality technologies and software that (due to large commercial investment) can be very useful in e-Research and preferable to complex Grid or Web Service solutions
– The usability and participatory nature of Web 2.0 can bring science and its informatics to a broader audience
– Cyberinfrastructure is the research analogue of major commercial initiatives – leading, e.g., to important job opportunities for students!
– Web 2.0 is the major commercial use of computers, and "Google/Amazon" server farms spurred cloud computing
– The same computers that answer your Google query can do bioinformatics
– They can be accessed from a web page with a credit card, i.e. as a Service

Clouds v Grids Philosophy

– Clouds are (by definition) the commercially supported approach to large-scale computing
– So we should expect Clouds to replace Compute Grids
– Current Grid technology involves "non-commercial" software solutions that are hard to evolve and sustain
– Grid approaches to distributed data and sensors remain valid
– Information Retrieval is the major data-intensive commercial application, so we can expect technologies from this field (Dryad, Hadoop) to be relevant for related scientific (file/data-parallel) applications
– These technologies are still immature but can be expected to rapidly become mainstream
– Data is becoming more and more important in all fields, including science
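The file/data-parallel pattern that Dryad and Hadoop embody can be sketched without either system: a map phase processes each input record independently, a shuffle groups intermediate results by key, and a reduce phase combines each group. The classic word-count example below is a minimal plain-Python illustration (the function names are ours, not Hadoop's API):

```python
from collections import defaultdict

def map_phase(document):
    """Map: emit (word, 1) pairs independently for each input record."""
    return [(word.lower(), 1) for word in document.split()]

def shuffle(pairs):
    """Shuffle: group intermediate (key, value) pairs by key."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: combine the grouped values for each key."""
    return {key: sum(values) for key, values in groups.items()}

def word_count(documents):
    """Run map, shuffle, and reduce over a list of text records."""
    pairs = [p for doc in documents for p in map_phase(doc)]
    return reduce_phase(shuffle(pairs))
```

Because the map calls are independent, a framework like Hadoop can scatter them across a cluster of files or machines; that independence is exactly what makes many scientific data analyses "file/data parallel".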

Activities in CGL/DSC

Project leaders:
– Gregor von Laszewski (mainly FutureGrid, GreenIT, GPU)
– Marlon Pierce (mainly Grids, Portals, Web 2.0, PolarGrid, QuakeSim)
– Judy Qiu (mainly Multicore, Data-Intensive Computing, Data Mining)
Highlighted facilities:
– 32 nodes, each with 24 cores
– Tempest, a 768-core cluster
– A cloud testbed running Nimbus and Eucalyptus
Collaborations:
– UITS, to get good facilities and explore the implications of new technologies for computing infrastructure
– Applications are needed to test and motivate new technologies: bioinformatics, cheminformatics, health informatics, polar science, earthquake science, particle physics, geographic information systems, and sensor nets

FutureGrid

FutureGrid is expected to start next month and will use modern virtual machine technology to build test environments for new distributed applications on 8 distributed systems. Partners in the FutureGrid project include Purdue University, University of California San Diego, University of Chicago/Argonne National Labs, University of Florida, University of Southern California Information Sciences Institute, University of Texas Austin/Texas Advanced Computing Center, University of Tennessee Knoxville, University of Virginia, and the Center for Information Services and High Performance Computing at the Technische Universitaet Dresden, Germany. It could define the next generation of scientific computing environments. Details are on Gregor's current web page.

Multicore and Cloud Technologies to Support Data-Intensive Applications

Using Dryad (Microsoft) and MPI to study the structure of gene sequences on the Tempest cluster. See infomall.org/salsa for Judy's projects.
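The gene-sequence structure studies mentioned above rest on computing distances between all pairs of sequences, a workload that parallelizes naturally because each pair is independent. As an illustrative sketch only (not the SALSA group's Dryad/MPI code), here is an all-pairs computation in Python using Levenshtein edit distance as a simple stand-in for the alignment scores used in practice:

```python
from itertools import combinations
from multiprocessing import Pool

def edit_distance(a, b):
    """Levenshtein distance via the standard row-by-row dynamic program."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,            # deletion
                           cur[j - 1] + 1,         # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def distance_pair(args):
    """Compute one (i, j) entry of the pairwise distance matrix."""
    i, j, seqs = args
    return (i, j, edit_distance(seqs[i], seqs[j]))

def all_pair_distances(seqs, workers=2):
    """Scatter the independent pairs across worker processes.
    (Sequences are copied into every task here for simplicity; a real
    system would share or partition the data instead.)"""
    tasks = [(i, j, seqs) for i, j in combinations(range(len(seqs)), 2)]
    with Pool(workers) as pool:
        return pool.map(distance_pair, tasks)
```

The resulting distance matrix is the input to clustering and dimension-reduction steps such as the MDS work mentioned in the similar-presentation titles.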

OGCE Project: Open Social Gadget Containers and Mash-Ups for Scientific Communities (Raminder Singh and Gerald Guo).

Daily RDAHMM Updates

QuakeSim: daily analysis and event classification of GPS data from REASoN's GRWS (Xiaoming Gao).

FloodGrid and Swarm: Integrating GIS, Workflows, and Grid Job Management (Marie Ma and Jun Wang).