Presentation transcript: "Research Infrastructure" – Simon Hood, Research Infrastructure Coordinator

1 Research Infrastructure – Simon Hood, Research Infrastructure Coordinator – its-ri-team@manchester.ac.uk

2 [Diagram: the Campus CIR Ecosystem. Compute clusters – CSF (job queue, backend compute nodes), iCSF (backend LVS nodes: VMs, GPUs), Redqueen/RQ (job queue, backend compute nodes), and Zreck (backend ET nodes: FPGA, GPU, Phi) – connect over 20 Gb/s links to RDS (Isilon: home dirs, shared areas). Access routes: SSH, X2GO/NX via the Research Virtual Desktop Service, and SSHFS command-line "mounts", all behind the firewall and router ACLs – 130.88.99.0/27 is campus-only, 10.99.0.0/16 is reachable on and off campus. RVMS hosts research VMs. Building/faculty links: Michael Smith (FLS); MIB? MHS? Materials (EPS)?]
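As an illustration of the SSHFS "mounts" route into RDS, here is a minimal sketch of mounting a home directory on a local Linux machine; the gateway hostname, username, and paths are placeholder assumptions, not the actual service details.

    # Mount an RDS home directory locally via SSHFS.
    # Hostname, username, and paths below are hypothetical.
    mkdir -p ~/rds
    sshfs mxyzabc1@rds-gateway.manchester.ac.uk:/home/mxyzabc1 ~/rds -o reconnect
    ls ~/rds             # browse RDS files as if they were local
    fusermount -u ~/rds  # unmount when finished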

3 Ecosystem Workflow
1. Input preparation. E.g. upload data to RDS and set job parameters in the application GUI, in the office on campus. (iCSF, RVDS, SSHFS, RDS)
2. Submit compute job. E.g. a long-running, parallel, high-memory heat/stress analysis, from home; see the sketch after this list. (iCSF, RVDS, CSF, RDS)
3. Check on compute job. E.g. while away at a conference in Barcelona; submit other jobs. (SSH, CSF, RDS)
4. Analyse results. E.g. in the application GUI, on a laptop in the hotel and back in the office. (iCSF, RVDS, RDS)
5. Publish results. E.g. a front-end web server running on RVMS accessing an Isilon share. (RVMS, RDS)
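To make step 2 concrete: the CSF job queue is Grid Engine-style, so a submission looks roughly like the sketch below. The parallel environment, resource flag, module, and program names are assumptions for illustration only.

    #!/bin/bash --login
    # jobscript.sh – illustrative CSF-style batch job.
    #$ -cwd                 # run from the submission directory
    #$ -pe smp.pe 8         # 8 cores on one node (hypothetical PE name)
    #$ -l mem512            # high-memory node (hypothetical resource flag)
    module load apps/heatstress     # hypothetical application module
    ./heat_stress_analysis input.dat

Submitted with "qsub jobscript.sh" from an SSH or RVDS session; "qstat" then reports its state, which is all step 3 needs from anywhere with gateway access.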

4 CIR Stats
CSF – £1.3m academic contribution since Dec 2010; 5000 CPU cores – £175k more lined up (Jan 2014?) – awaiting outcome of some big bids… Kilburn???
Storage – Isilon – 500 PB per year – current: 120 TB for each faculty – going fast!
Network – July: £1.5m on Cisco kit – 80 Gb core, 10 Gb buildings
People – Pen, George, Simon

5 Recent and Current Work
Redqueen – Summer: RGF-funded refresh of 50% of the cluster – integration with Isilon (RDN)
CSF (mostly batch compute) – Summer: £300k procurement – RDN: moving all home dirs to Isilon (keeping local Lustre-based scratch)
Gateways – SSH and SSHFS/SFTP – Research Virtual Desktop Service: NX, X2GO – see the sketch below
New clusters – Incline/iCSF: interactive compute – Zreck: GPGPUs, Xeon Phi, FPGA, …
RDN (20 Gb) – CSF, Redqueen, Incline/iCSF, Zreck – Michael Smith (FLS)
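As a sketch of the SSH gateway route above, checking on jobs from off campus might look like this; the gateway and login-node hostnames and the username are assumptions.

    # Hop through the SSH gateway to the CSF login node (hypothetical names).
    ssh -J mxyzabc1@ssh-gateway.manchester.ac.uk mxyzabc1@csf.manchester.ac.uk
    qstat              # list my queued and running jobs
    qstat -j 123456    # detailed status of one job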

6 Thank you! Email questions to me: Simon.Hood@manchester.ac.uk

