Christian Kocks April 3, 2012 High-Performance Computing Cluster in Aachen.

Christian Kocks April 3, 2012 Slide 2 Typical Simulation Workflow
Simulator development → Simulation → Evaluation and visualization of results

Christian Kocks April 3, 2012 Slide 3 Workflow for Simulations on HPC Cluster
1. Develop simulator on local PC / server
2. Perform short tests on local PC / server
3. Transfer simulator to HPC cluster (Subversion!)
4. Perform short tests on HPC cluster using the Linux login shell
5. Enqueue simulations on HPC cluster
6. Wait for notification from HPC cluster
7. Transfer results to local PC / server (e.g. with WinSCP using the SCP protocol)
8. Evaluate and visualize results on local PC
HPC: High-Performance Computing
SCP: Secure Copy
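The result-transfer step above (WinSCP / SCP) can also be done from a shell. A minimal sketch: the host name is the login node from slide 5, while the remote path and the results.mat file name are assumptions for illustration. With DRY_RUN=1 (the default here) the command is only printed, not executed.

```shell
#!/usr/bin/env sh
# Sketch of the result-transfer step: copy results from the cluster to the
# local PC via SCP. The host name is the login node from slide 5; the remote
# path and the results.mat file name are assumptions for illustration.
DRY_RUN=${DRY_RUN:-1}                     # 1 = only print the command
CLUSTER=cluster-linux.rz.rwth-aachen.de
REMOTE_DIR='svn/lib/simulators/sim_awgn_matlab'

run() {
    # Print the command in dry-run mode, otherwise execute it.
    if [ "$DRY_RUN" = 1 ]; then echo "$@"; else "$@"; fi
}

run scp "$USER@$CLUSTER:$REMOTE_DIR/results.mat" .
```

On Windows, WinSCP performs the same copy through a graphical client, as the slide notes.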

Christian Kocks April 3, 2012 Slide 4 Local Simulations vs. HPC Cluster Simulations
Local simulations                                    | HPC cluster simulations
Execution of simulations can be directly controlled  | Simulations must be enqueued
Very low setup time necessary to start simulation    | Some preparation necessary for running simulations
Number of available resources very limited           | "Unlimited" resources
Number of parallel jobs limited                      | "Unlimited" number of parallel jobs
Arrangements with colleagues necessary               | No arrangements necessary
[Chart: total simulation time over number of simulations, comparing local and HPC cluster]

Christian Kocks April 3, 2012 Slide 5 Using Matlab on HPC Cluster
Connect to Linux login shell using SSH client (e.g. PuTTY):
  Server name: cluster-linux.rz.rwth-aachen.de
Load Matlab modules:
  module load MISC
  module load matlab
Start Matlab:
  matlab -nodisplay -nodesktop -nosplash -nojvm -logfile job.log
Alternative: use graphical remote session (with NX client)
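For quick tests, the load-and-start steps above can be chained into a single remote command string. This is a sketch under my own naming (remote_matlab_cmd is a hypothetical helper, and "user" is a placeholder login); whether a one-shot SSH session suits an interactive workflow is up to the reader.

```shell
#!/usr/bin/env sh
# Assemble the slide's module-load and Matlab-start steps into one command
# string that could be passed to ssh in a single shot. remote_matlab_cmd is
# a helper name invented for this sketch.
remote_matlab_cmd() {
    printf '%s' 'module load MISC && module load matlab && matlab -nodisplay -nodesktop -nosplash -nojvm -logfile job.log'
}

# Interactive use (not executed here; replace "user" with your login):
#   ssh user@cluster-linux.rz.rwth-aachen.de "$(remote_matlab_cmd)"
echo "$(remote_matlab_cmd)"
```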

Christian Kocks April 3, 2012 Slide 6 Using Subversion on HPC Cluster
Use Subversion module svntest for first tests
Check out Subversion module (e.g. svntest) from KT server:
  svn checkout [dst]
Update local working copy:
  svn update
Add file test.m:
  svn add test.m
Commit changes to KT server:
  svn commit

Christian Kocks April 3, 2012 Slide 7 Sample Queue Script File

#!/usr/bin/env zsh
#BSUB -J sim_awgn_matlab        # job name
#BSUB -o sim_awgn_matlab.%J     # job output (use %J for job id)
#BSUB -e sim_awgn_matlab.e%J    # error output
#BSUB -W 0:20                   # hard limits in hours:minutes
#BSUB -M 512                    # memory in MB
#BSUB -u                        # e-mail address for notification
#BSUB -N                        # enable notification
#BSUB -n 2                      # request number of compute slots
#BSUB -a openmp                 # use esub for OpenMP/shared memory jobs

### load matlab modules
module load MISC
module load matlab

### change to the work directory
cd $HOME/svn/lib/simulators/sim_awgn_matlab

### run matlab
matlab -nodisplay -nodesktop -nosplash -nojvm -logfile job.log <sim_awgn_matlab_run.m

Christian Kocks April 3, 2012 Slide 8 Sample Queue Script File – Advanced

#!/usr/bin/env zsh
#BSUB -J sim_awgn_matlab        # job name
#BSUB -o sim_awgn_matlab.%J     # job output (use %J for job id)
#BSUB -e sim_awgn_matlab.e%J    # error output
#BSUB -W 0:20                   # hard limits in hours:minutes
#BSUB -M 512                    # memory in MB
#BSUB -u                        # e-mail address for notification
#BSUB -N                        # enable notification
#BSUB -n 2                      # request number of compute slots
#BSUB -a openmp                 # use esub for OpenMP/shared memory jobs

### load matlab modules
module load MISC
module load matlab

### change to the work directory
cd $HOME/svn/lib/simulators/sim_awgn_matlab

### run matlab
matlab -nodisplay -nodesktop -nosplash -nojvm -logfile job.log <<EOF
sim_awgn_matlab('ebn0', , 'ModulationOrder', 2, 'Log', 'true', 'NumSymbols', 1000, 'Filename', 'sample_advanced');
EOF

Christian Kocks April 3, 2012 Slide 9 Job Management
Enqueue a job:
  bsub <myscript.sh
Query unfinished jobs:
  bjobs
Kill unfinished job:
  bkill [job ID]
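The three commands above can be wrapped in small shell functions, e.g. for scripting repeated submissions. A minimal sketch: the function names are invented for illustration, and with DRY_RUN=1 (the default here) the underlying LSF command is only printed, so the sketch also runs off-cluster.

```shell
#!/usr/bin/env sh
# Thin wrappers around the LSF commands on the slide. With DRY_RUN=1 (default)
# the underlying command is only printed, so the sketch also works without an
# LSF installation. Function names are invented for illustration.
DRY_RUN=${DRY_RUN:-1}

enqueue() {   # bsub reads the job script on stdin
    if [ "$DRY_RUN" = 1 ]; then echo "bsub <$1"; else bsub <"$1"; fi
}
pending() {   # list unfinished jobs
    if [ "$DRY_RUN" = 1 ]; then echo "bjobs"; else bjobs; fi
}
cancel() {    # kill an unfinished job by its job ID
    if [ "$DRY_RUN" = 1 ]; then echo "bkill $1"; else bkill "$1"; fi
}

enqueue myscript.sh   # prints: bsub <myscript.sh
```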

Christian Kocks April 3, 2012 Slide 10 General Hints for using HPC Cluster
Ask Mrs. Tiedtke from Uni DuE for HPC cluster account
Write the simulator in a way that allows the execution of multiple small simulations instead of one long simulation
Collect simulation parameters and results in a MAT file
Deactivate all graphical outputs
Read the "HPC Primer" for further information on using the cluster
Visit
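The "multiple small simulations" hint can be sketched as a script generator: one LSF job script per parameter value, each submitted separately. The Eb/N0 values, file names and output directory below are assumptions for illustration; the #BSUB header follows the sample script on slide 7, and the bsub command is only printed, not executed.

```shell
#!/usr/bin/env sh
# Sketch of splitting one long run into many small jobs: generate one LSF job
# script per Eb/N0 value. Parameter values, file names and the output
# directory are assumptions; the #BSUB header follows slide 7's sample script.
make_jobs() {
    outdir=$1
    mkdir -p "$outdir"
    for ebn0 in 0 2 4 6 8 10; do
        script="$outdir/sim_awgn_ebn0_${ebn0}.sh"
        cat >"$script" <<EOF
#!/usr/bin/env zsh
#BSUB -J sim_awgn_${ebn0}
#BSUB -W 0:20
#BSUB -M 512
module load MISC
module load matlab
matlab -nodisplay -nodesktop -nosplash -nojvm -logfile job.log <<MEOF
sim_awgn_matlab('ebn0', ${ebn0}, 'ModulationOrder', 2, 'NumSymbols', 1000, 'Filename', 'sim_ebn0_${ebn0}');
MEOF
EOF
        echo "bsub <$script"   # print the submission instead of running it
    done
}

make_jobs ./jobs
```

Each generated script is an independent job, so results arrive incrementally and a failed run only costs one small slice of the total simulation time.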

Christian Kocks April 3, 2012 Slide 11 High-Performance Computing Cluster in Aachen Demonstration…