White Rose Grid Infrastructure Overview


1 White Rose Grid Infrastructure Overview
Chris Cartledge, Deputy Director, Corporate Information and Computing Services, The University of Sheffield

2 Contents
History
Web site
Current computation capabilities
Planned machines
Usage
YHMAN
Grid capabilities
Contacts
Training
FEC, Futures

3 White Rose Grid History
2001: SRIF opportunity, joint procurement, led by Leeds (Peter Dew, Joanna Schmidt)
3 clusters of Sun SPARC systems running Solaris:
Leeds, Maxima: 6800 (20 processors), 4 * V880 (8 processors each)
Sheffield, Titania: 10 (later 11) * V880 (8 processors each)
York, Pascali: 6800 (20 processors); Fimbrata: V880
1 cluster of 2.2/2.4GHz Intel Xeons with Myrinet:
Leeds, Snowdon: 292 CPUs, Linux

4 White Rose Grid History (continued)
Joint working to enable use across sites, but heterogeneous: a range of systems
Each system primarily meets local needs, with up to 25% for users from the other sites
Key common services:
Sun Grid Engine to control work in the clusters
Globus to link the clusters
Registration

5 WRG Web Site
There is a shared web site: http://www.wrgrid.org.uk/
Linked to/from local sites
Covers other related projects and resources:
e-Science Centre of Excellence
Leeds: SAN and specialist graphics equipment
Sheffield: ppGrid node
York: UKLight work

6 Current Facilities: Leeds
Everest: supplied by Sun/Streamline
Dual-core Opterons: power and space efficient
404 CPU cores, 920GB memory
64-bit Linux (SuSE 9.3) OS
Low-latency Myrinet interconnect
7 * 8-way (4 chips with 2 cores each), 32GB
64 * 4-way (2 chips with 2 cores each), 8GB

7 Leeds (continued)
SGE, Globus/GSI
Intel, GNU, PGI compilers
Shared-memory and Myrinet MPI
NAG, FFTW, BLAS, LAPACK, etc. libraries
32- and 64-bit software versions
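The SGE and MPI stack above is what users script against day to day. As a minimal sketch, a batch job requesting MPI slots might look like the following; the parallel environment name "mpi", the resource limits, and the ./my_solver binary are illustrative assumptions, since these names are site-specific:

```shell
#!/bin/bash
# Hypothetical SGE batch script -- the "mpi" parallel environment name,
# the limits, and ./my_solver are illustrative, not actual WRG settings.
#$ -cwd                  # run the job from the submission directory
#$ -pe mpi 8             # request 8 slots in an MPI parallel environment
#$ -l h_rt=01:00:00      # one-hour wall-clock limit
mpirun -np $NSLOTS ./my_solver   # SGE sets $NSLOTS to the granted slot count
```

Such a script would be submitted with qsub and monitored with qstat; it is a job-script fragment, so it only runs under a Grid Engine installation.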

8 Maxima and Snowdon Transitions
Maxima:
Maintenance to June 2006, expensive
Need to move all home directories to the SAN
Users can still use it, but "at risk"
Snowdon:
Maintenance until June 2007
Home directories already on the SAN
Users encouraged to move

9 Sheffield
Iceberg: supplied by Sun Microsystems/Streamline
160 * 2.4GHz AMD Opteron (PC technology) processors
64-bit Scientific Linux (Red Hat based)
20 * 4-way, 16GB, fast Myrinet, for parallel/large jobs
40 * 2-way, 4GB, for high throughput
GNU and Portland Group compilers, NAG
Sun Grid Engine (6), MPI, OpenMP, Globus
Abaqus, Ansys, Fluent, Maple, Matlab

10 Also at Sheffield: GridPP (Particle Physics Grid)
160 * 2.4GHz AMD Opteron: 80 * 2-way, 4GB
32-bit Scientific Linux
ppGrid stack
2nd most productive node
Very successful!

11 Popular!
Sheffield utilisation is high
Lots of users: 827 (White Rose: 37)
Utilisation since installation: 40%
Last 3 months: 80% (White Rose: 26%)

12 York
£205k from SRIF 3:
£100k computing systems
£50k storage system
Remainder: ancillary equipment, contingency
Shortlist agreed(?) - for June
Compute: possibly multi-core Opteron
Storage: possibly 10TB

13 Other Resources: YHMAN
Leased fibre, 2Gb/s performance
Wide-area MetroLAN
UKLight
Archiving
Disaster recovery

14 Grid Resources
Sun Grid Engine (6) for queuing
Globus Toolkit 2.4 is installed and working
Issue over GSI-SSH on the 64-bit OS (ancient GTK)
Globus 4 being looked at
Storage Resource Broker being worked on
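For illustration, a GT2-era session tying these pieces together might look like the following; the host name is a hypothetical placeholder, not an actual WRG endpoint:

```shell
# Hypothetical GSI session -- grid.example.ac.uk is a placeholder host.
grid-proxy-init                                  # create a short-lived GSI proxy credential
globus-job-run grid.example.ac.uk /bin/hostname  # run a simple test job via the remote gatekeeper
gsissh -p 2222 grid.example.ac.uk                # interactive login over GSI-authenticated SSH
grid-proxy-destroy                               # discard the proxy when finished
```

These commands assume a working Globus 2.x client installation and a valid user certificate, so the fragment is not runnable stand-alone; the GSI-SSH issue noted above would affect the gsissh step on the 64-bit systems.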

15 Training
Available across the White Rose universities
Sheffield: RTP - 4 units, 5 credits each:
High Performance and Grid Computing
Programming and Application Development for Computational Grids
Techniques for High Performance Computing including Distributed Computing
Grid Computing and Application Development

16 Contacts
Leeds: Joanna Schmidt, +44 (0)
Sheffield: Michael Griffiths or Peter Tillotson, +44 (0) , +44 (0)
York: Aaron Turner, +44 (0)

17 Futures
FEC will have an impact:
Can we maintain 25% use from other sites?
How can we fund continuing Grid work?
Different funding models are a challenge:
Leeds: departmental shares
Sheffield: unmetered service
York: based in Computer Science
Relationship opportunities: NGS, WUN, the region, suppliers?

18 Achievements
White Rose Grid: not hardware, but services
People(!): familiar with working on a Grid
Experience of working as a virtual organisation
Intellectual property in training
Success:
Research
Engaging with industry
Solving user problems

