The National e-Science Centre and UK e-Science Programme
Muffy Calder, University of Glasgow
http://www.nesc.ac.uk


The National e-Science Centre and UK e-Science Programme
Muffy Calder, University of Glasgow
http://www.nesc.ac.uk

An overview
E-Science
– challenges and opportunities
Grid computing
– challenges and opportunities
Activities
– UK and international
– Scotland and the National e-Science Centre

e-Science
‘e-Science is about global collaboration in key areas of science, and the next generation of infrastructure that will enable it.’
‘e-Science will change the dynamic of the way science is undertaken.’
John Taylor, Director General of Research Councils, Office of Science and Technology

The drivers for e-Science
More data
– instrument resolution and laboratory automation
– storage capacity and data sources
More computation
– computations available; simulations doubling every year
Faster networks
– bandwidth
– need to schedule
More interplay and collaboration
– between scientists, engineers, computer scientists etc.
– between computation and data

The drivers for e-Science
Exploration of data and models
– in silico discovery
Floods of public data
– gene sequence data doubling every 9 months
– searches required doubling every 4–5 months
In summary: shared data, information and computation by geographically dispersed communities.
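The doubling times quoted above compound quickly. A minimal sketch of the implied annual growth factors (illustrative arithmetic only, using the doubling periods from this slide and taking 4.5 months as the midpoint of the 4–5 month range):

```python
# Annual growth factor implied by a given doubling time (illustrative only).
def annual_factor(doubling_months: float) -> float:
    """How much a quantity grows in one year if it doubles every `doubling_months`."""
    return 2 ** (12.0 / doubling_months)

print(f"Gene-sequence data (doubles every 9 months):  ~{annual_factor(9):.1f}x per year")
print(f"Searches required (doubles every 4.5 months): ~{annual_factor(4.5):.1f}x per year")
print(f"Simulations (double every 12 months):         ~{annual_factor(12):.1f}x per year")
```

The demand for searching and integrating public data therefore grows considerably faster than the data itself, which is part of the case for shared infrastructure.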

Current model: ad-hoc client-server
[Diagram: the scientist connects point-to-point to each resource: experiment, computing, HPC, storage and analysis]

The vision: computation & information utility M ID L E W A R E Experiment Computing Storage Analysis Scientist

e-Science examples
Bioinformatics / functional genomics
Collaborative engineering
Medical / healthcare informatics
Earth observation systems (flood monitoring)
TeleMicroscopy
Virtual observatories
Robotic telescopes
Particle physics at the LHC
– EU DataGrid: particle physics, biology & medical imaging, Earth observation
– GridPP, ScotGrid
– AstroGrid

Multi-disciplinary simulations
[Diagram: coupled sub-system models (engine, airframe, wing, landing gear, stabilizer, human/crew), each characterised by properties such as lift, drag, thrust, fuel consumption, braking, traction and crew reaction times]
Whole-system simulations are produced by coupling all of the sub-system simulations.

Multi-disciplinary simulations: the Virtual National Air Space (VNAS)
[Diagram: US National Air Space Simulation Environment, with engine models (GRC), airframe and landing-gear models (LaRC), wing, stabilizer and human models (ARC), plus FAA ops data, weather data, airline schedule data, digital flight data, radar tracks, terrain and surface data; simulation drivers being pulled together under the NASA AvSP Aviation ExtraNet (AEN)]
22,000 commercial US flights a day drive 50,000 engine runs, 22,000 airframe impact runs, 132,000 landing/take-off gear runs, 48,000 human crew runs, 66,000 stabilizer runs and 44,000 wing runs.
Many aircraft, flight paths, airport operations, and the environment are combined to give a virtual national airspace.

Global in-flight engine diagnostics
[Diagram: in-flight data is sent via a ground station and a global network (e.g. SITA) to the DS&S Engine Health Center, the airline maintenance centre and a data centre; results are delivered over the internet, e-mail and pager]
Distributed Aircraft Maintenance Environment (DAME): Universities of Leeds, Oxford, Sheffield & York

LHC computing
[Diagram: the LHC tiered computing model; data rates range from ~PByte/sec off the detector to ~100 MByte/sec into offline processing, with ~Gbit/sec links (or air freight) between centres]
– One bunch crossing every 25 ns; 100 triggers per second; each event is ~1 MByte.
– Online system and offline farm (~20,000 PCs) feed Tier 0.
– Tier 0: CERN Computer Centre (>20,000 PCs).
– Tier 1: regional centres (RAL, US, French, Italian, …).
– Tier 2: centres of ~1,000 PCs, e.g. ScotGRID++.
– Tier 3: institute servers (~200 PCs); each institute has ~10 physicists working on one or more analysis “channels”, and data for these channels is cached by the institute server.
– Tier 4: physicists’ workstations.
(Assumes one PC ≈ 25 SpecInt95.)
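A back-of-envelope check of the recorded data rate these figures imply (a sketch only; the effective running time per year is an assumption, not a number from the slide):

```python
# Back-of-envelope check of the LHC figures quoted above (illustrative only).
trigger_rate_hz = 100        # events selected per second
event_size_mb = 1.0          # each event is ~1 MByte
seconds_per_year = 1e7       # assumed effective running time per year

rate_mb_per_s = trigger_rate_hz * event_size_mb
annual_pb = rate_mb_per_s * seconds_per_year / 1e9   # MByte -> PByte

print(f"Recorded data rate: ~{rate_mb_per_s:.0f} MByte/sec")   # matches the ~100 MByte/sec above
print(f"Annual volume:      ~{annual_pb:.1f} PByte/year")
```

Petabytes per year is roughly why storage and analysis are spread across the tiers rather than concentrated at CERN.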

Emergency response teams
Bring sensors, data, simulations and experts together
– wildfire: predict movement of fire & direct fire-fighters
– also earthquakes, peacekeeping forces, battlefields, …
Los Alamos National Laboratory: wildfire; National Earthquake Simulation Grid

Earth observation
ENVISAT
– €3.5 billion
– 400 terabytes/year
– 700 users
[Image: ground deformation prior to a volcanic eruption]

To achieve e-Science we need…

The Grid: the vision
[Diagram: computing resources, data, knowledge, instruments and people, brought together by the GRID to turn a complex problem into a solution]

The Grid
Computing cycles, data storage, bandwidth and facilities viewed as commodities
– like the electricity grid
Software and hardware infrastructure to support a model of computation and information utilities on demand
– middleware
An emergent infrastructure
– delivering dependable, pervasive and uniform access to a set of globally distributed, dynamic and heterogeneous resources

The Grid
Supporting computations with differing characteristics:
High throughput
– unbounded, robust, scalable; use otherwise idle machines (see the sketch below)
On demand
– short-term, bounded, hard timing constraints
Data intensive
– computed, measured, stored, recalled, e.g. particle physics
Distributed supercomputing
– classical CPU- and memory-intensive
Collaborative
– mainly in support of human–human interaction, e.g. the Access Grid
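The high-throughput style above amounts to a task farm: many independent jobs scattered across whatever capacity is free. A minimal local sketch of the idea (purely illustrative; the analyse function and job count are made up, and a real deployment would farm jobs out to idle machines through middleware such as Condor rather than a thread pool on one machine):

```python
# Minimal task-farm sketch: many independent "jobs", run wherever capacity is free.
# Illustrative only -- a real high-throughput grid uses middleware (e.g. Condor)
# to scatter jobs over idle machines, not local worker threads.
from concurrent.futures import ThreadPoolExecutor

def analyse(job_id: int) -> str:
    """Hypothetical stand-in for one independent analysis job."""
    return f"job {job_id}: done"

jobs = range(100)                                  # 100 independent work units
with ThreadPoolExecutor(max_workers=8) as pool:    # stand-in for a pool of idle machines
    for result in pool.map(analyse, jobs):
        print(result)
```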

Grid data
Generated by sensors
From a database
Computed on request
Measured on request

Grid services
Deliver bandwidth, data, computation cycles
Architecture and basic services for
– authentication
– authorisation
– resource scheduling and co-scheduling
– monitoring, accounting and payment
– protection and security
– quality-of-service guarantees
– secondary storage
– directories
– interoperability
– fault tolerance
– reliability

But… has the emperor got any clothes on?
Maybe a string vest and pants:
– semantic web: machine-understandable information on the web
– virtual organisations
– pervasive computing: “… a billion people interacting with a million e-businesses with a trillion intelligent devices interconnected” (Lou Gerstner, IBM, 2000)

National and international activities
USA
– NASA Information Power Grid
– DOE Science Grid
– NSF National Virtual Observatory, etc.
UK – e-Science Programme
Japan – Grid Data Farm, ITBL
Netherlands – VLAM, PolderGrid
Germany – UNICORE, Grid proposal
France – Grid funding approved
Italy – INFN Grid
Eire – Grid proposals
Switzerland – Network/Grid proposal
Hungary – DemoGrid, Grid proposal
…

UK e-Science centres
[Map: centres at Edinburgh, Glasgow, Newcastle, Belfast, Manchester, Daresbury Laboratory (DL), Cambridge, Hinxton, Oxford, RAL, Cardiff, Southampton and London]
AccessGrid: always-on video walls

National e-Science Centre
Edinburgh + Glasgow Universities
– Physics & Astronomy (× 2)
– Informatics, Computing Science
– EPCC
£6M EPSRC/DTI + £2M SHEFC over 3 years
e-Science Institute
– visitors, workshops, co-ordination, outreach
Middleware development
– 50 : 50 industry : academia
UK representation at GGF
Coordinate regional centres

Generic Grid middleware
All e-Science Centres will donate resources to form a UK ‘national’ Grid
All Centres will run the same Grid software
– based on Globus*, SRB and Condor
Work with the Global Grid Forum (GGF) and major computing companies
*Globus: a project to develop open-architecture, open-source Grid software; Globus Toolkit 2.0 is now available (a usage sketch follows below)
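For a flavour of what running common Grid software means in practice, here is a minimal sketch of submitting a trivial job through a Globus Toolkit 2 gatekeeper, assuming the GT2 client command globusrun is installed and a user proxy has already been created with grid-proxy-init; the gatekeeper contact string below is hypothetical:

```python
# Sketch only: submit a trivial job to a (hypothetical) Globus Toolkit 2 gatekeeper.
# Assumes the GT2 client `globusrun` is on PATH and `grid-proxy-init` has been run.
import subprocess

contact = "gridgate.example.ac.uk/jobmanager-pbs"    # hypothetical resource contact string
rsl = "&(executable=/bin/hostname)(count=1)"         # minimal RSL job description

# -o streams the job's stdout/stderr back to the submitting machine
result = subprocess.run(["globusrun", "-o", "-r", contact, rsl],
                        capture_output=True, text=True)
print(result.stdout or result.stderr)
```

In day-to-day use a Condor submit description or a portal would typically sit on top of this, but the gatekeeper/RSL layer is what the common Globus installation provides.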

How can you participate?
Informal collaborations
Formal collaborative LINK-funded projects
Attend seminars at eSI
Suggest visitors for eSI
Suggest themes and topics for eSI

eSI forthcoming events
Friday, March 15 – Blue Gene
Monday, March 18 – IWOX: Introductory Workshop on XML Schema
Tuesday, March 19 – AWOX: Advanced Workshop on XML: XML Schema, Web Services and Tools
Thursday, March 21 – GOGO: Getting OGSA Going 1
Monday, April 8 – Software Management for Grid Projects
Monday, April 15 – DAI: Database Access and Integration
Wednesday, April 17 – DBFT Meeting
Monday, April 29 – The Information Grid
Sunday, July 21 – GGF5 / HPDC11

Watch the web pages… Thank you!