My Road to Supercomputing
Holly Krutka, The University of Oklahoma

Research Background  Using the computational fluid dynamics (CFD) package Fluent to simulate air flow  The air flow drives the melt-blowing process for polymers  Chemical engineer  NOT A COMPUTER SCIENTIST!!!

Examples of Melt Blowing Dies

Why Use CFD?  Experiments had already been completed  There is no affordable way to include the polymer in the air flow and take experimental measurements  To be on the cutting edge of research in our field  FAST AND CHEAP!!!

How Fluent Works  Define a computational domain  Split the domain into finite volumes  Set boundary conditions  Set model parameters  Fluent solves the continuity equation, momentum equations, etc.  Analyze the simulation results
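For an incompressible flow, the equations Fluent discretizes over those finite volumes are the continuity and momentum (Navier–Stokes) equations, shown here in a standard constant-property form; the non-isothermal and turbulent cases described later add an energy equation and turbulence-model terms:

```latex
\nabla \cdot \mathbf{u} = 0
\qquad
\rho \left( \frac{\partial \mathbf{u}}{\partial t}
      + (\mathbf{u} \cdot \nabla)\mathbf{u} \right)
  = -\nabla p + \mu \nabla^{2}\mathbf{u}
```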

What can be gained from CFD?

Undergraduate Research  2D simulations  Symmetric  Isothermal initially  Grid cells were small  About 100,000 cells per grid  Used a single processor on a lab computer  Simulations required 2–6 days to finish

Graduate Research  Simulations become much more complex  Non-isothermal  3D  700,000 – 2,000,000+ cells  Including polymers  Time-dependent simulations  Lots of simulations to run… and only two processors?!

Supercomputing  The CFD package is actually designed to run on a supercomputer  OU has a great supercomputing program  One problem… still NOT a computer scientist  YOU DON'T NEED TO BE!

Using Sooner  CFD package was loaded onto Sooner (Brandon George, OSCER Manager of Operations)  Started using sooner.oscer.ou.edu (IBM p690, 32 POWER4 1.1 GHz CPUs, 32 GB RAM -- now decommissioned)  Used Secure Shell (SSH) to move my simulations to Sooner  Wrote a simple script file (with help) to run the simulations  Submitted the simulations to the queue and waited
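A submission script of that kind might look like the sketch below. This is a hypothetical example assuming a PBS-style scheduler and a journal file named run_case.jou; the actual scheduler, directives, and file names on Sooner differed:

```shell
#!/bin/bash
# Hypothetical batch script for a parallel Fluent run -- a sketch,
# not the exact script used on Sooner.
#PBS -N meltblow_3d
#PBS -l nodes=1:ppn=4        # request 4 processors
#PBS -l walltime=48:00:00

cd $PBS_O_WORKDIR            # run from the submission directory

# Run Fluent in batch mode: 3ddp = 3-D double precision, -g = no GUI,
# -t4 = 4 parallel processes, -i = journal file of solver commands
fluent 3ddp -g -t4 -i run_case.jou > fluent.log 2>&1
```

The journal file replaces the GUI: it lists the read-case, iterate, and write-data commands, so the whole run can sit in a queue unattended.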

From Sooner to Topdawg  Fluent loaded onto Topdawg (Brandon George)  Later transitioned to topdawg.oscer.ou.edu (Dell Xeon cluster, 1024 Xeon 3.2 GHz CPUs, 2176 GB RAM)  Simulations are submitted through a queue, just as on Sooner  Great improvement in speed: Desktop → Sooner → Topdawg

Advantages  Using more processors at a time (4–8)  Much faster  More simulations at once  Much more stable  My desktop computer is free for other work

Conclusions  You do not need to be a computer scientist to use OU’s supercomputing resources  Utilizing OSCER has greatly improved both the quality and quantity of my PhD research

Questions?