DiRAC-3 – The future Jeremy Yates, STFC DiRAC HPC Facility.

1 DiRAC-3 – The future Jeremy Yates, STFC DiRAC HPC Facility

2 Three Services and a data heart
DiRAC-2 has successfully managed 5 services at four sites for 3 years.
Science Case and Technical Case reviewed – a 40x uplift in computing and data systems is needed:
– 10x from hardware, 4x from better code and workflows
Code and workflows will need to be developed and optimised to run on new systems and architectures:
– Very few users write efficient code
– Investment and methodology are in place to do this (for now)
User centricity requires:
– Cloud – the user sees a single “service”
– Data federation and data services
– Adoption of AAAI

3 DiRAC-3: Illustrative Structural Diagram
Science areas served: Lattice QCD, Astrophysics, Cosmology, Nuclear Physics, Particle Physics, Particle Astrophysics, Planetary Science, Solar Physics
Service pillars:
– Extreme Scaling: maximal computational effort applied to a problem of fixed size
– Data Intensive: tight compute/storage coupling to facilitate confrontation of complex simulations with Petascale data sets
– Memory Intensive: larger memory footprint as the problem size grows with increasing machine power
– Data Management: Internet Analytics, Data Handling, Archiving, Disaster Recovery
Supporting activities: Many Core Coding, Data Analytics Programming, Code Optimization, Fine Tuning, Parallel Management, Multi-threading

4 DiRAC-3 service specifications (from the structural diagram)
– Extreme Scaling: 10 Pflops, many-core, 100GB/s 3D memory with 16GB cache per card, 100Gb/s IO, InfiniBand standard, 10PB parallel filesystem (PFS)
– Memory Intensive: 1–3 Pflops, ≥256TB RAM, 150GB/s IO, all-to-all IB, virtualization (to desktop)
– Data Intensive: >8GB/core, 1.25 Pflops x86, 24TB SMP, 14PB secondary disk, 2PB PFS (very fast SSD?), >250GB/s all-to-all IB
– Data Management: Internet Analytics, Disaster Recovery, 100PB tape, 10–20PB data storage, 1PB buffer, Analytics TBD
– Connectivity to the outside world

5 Where do we fit in?
Sibling facility to UK-T0 – joined by the need to confront models with data
– Physically this is how we should be joined up
Need to establish relationships with LSST, SKA, Euclid…
AAAI and Cloud mean that although operationally we may be distinct, from a user’s perspective we can look like another resource/service

6 Workshop – Algorithms to Architecture:
– An NeI Project Directors Group workshop for Data Intensive Computing in the Physical Sciences
HPC is very data intensive these days…
– RCUK will fund this
– Jan/Feb next year
– Essentially for the ARCHER, DiRAC, UK-T0, JASMIN communities
A real drill-down into our problem sets
Knowledge and skills sharing

7 Thanks for inviting me – it’s been great so far

