NeSC Workshop July 20, 2002

Simulation of Chemical Reactor Performance – A Grid-Enabled Application

Kenneth A. Bishop, Li Cheng, Karen D. Camarda
The University of Kansas
Presentation Organization
- Application Background: Chemical Reactor Performance Evaluation
- Grid Assets In Play: Hardware Assets; Software Assets
- Contemporary Research: NCSA Chemical Engineering Portal Application; Cactus Environment Application
Chemical Reactor Description
- V2O5 Catalyst in Tubes; Molten-Salt Coolant
- Feed: o-Xylene : Air Mixture; Product: Phthalic Anhydride
- Reaction Conditions: Temperature 640 – 770 K; Pressure 2 atm
Simulator Capabilities
- Reaction Mechanism: Heterogeneous Or Pseudo-homogeneous
- Reaction Path: Three-Species Or Five-Species Paths
- Flow Phenomena: Diffusive vs Bulk And Radial vs Axial
- Excitation: Composition And/Or Temperature
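The reaction-path options above can be illustrated with a minimal sketch of a pseudo-homogeneous three-species path (o-xylene → phthalic anhydride → COx) with Arrhenius temperature dependence. The rate parameters below are hypothetical placeholders, not the values used in the KU simulator:

```python
import math

# Hypothetical Arrhenius parameters (illustrative only, NOT the
# simulator's values): k = A * exp(-E / (R * T)).
R = 8.314  # J/(mol K)
PARAMS = {
    "k1": (1.0e5, 6.0e4),   # o-xylene -> phthalic anhydride
    "k2": (5.0e4, 7.0e4),   # phthalic anhydride -> COx (over-oxidation)
}

def arrhenius(A, E, T):
    """Rate constant at temperature T (K)."""
    return A * math.exp(-E / (R * T))

def rates(c_ox, c_pa, T):
    """Net production rates for the three-species path.

    c_ox, c_pa: concentrations of o-xylene and phthalic anhydride.
    Returns (d[o-xylene]/dt, d[PA]/dt, d[COx]/dt).
    """
    k1 = arrhenius(*PARAMS["k1"], T)
    k2 = arrhenius(*PARAMS["k2"], T)
    r1 = k1 * c_ox          # o-xylene consumed
    r2 = k2 * c_pa          # PA over-oxidised to COx
    return (-r1, r1 - r2, r2)
```

The strong exponential temperature dependence is what makes the start-up transients on the following slides (and the hot-spot behavior near the tube entrance) sensitive to the feed and coolant temperature excitation.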
Chemical Reactor Start-up
[Figure: temperature (K) surface over tube axial position (entrance – exit) and tube radius (center outward)]
- Initial Condition: Feed Nitrogen; Feed Temp. 640 K; Coolant Temp. 640 K
- Final Condition: Feed 1% Ortho-Xylene; Feed Temp. 683 K; Coolant Temp. 683 K
Reactor Start-up: t = 60+
[Figure: contour plots (low – high scale) of temperature, ortho-xylene, phthalic anhydride, tolualdehyde, phthalide, and COx]
Reactor Start-up: t = +
[Figure: contour plots (low – high scale) of temperature, ortho-xylene, phthalic anhydride, tolualdehyde, phthalide, and COx]
Grid Assets In Play – Hardware
- The University of Kansas
  - JADE: O2K, 250 MHz R10000, 512 MB RAM
  - PILTDOWN: Indy, 175 MHz R4400, 64 MB RAM
  - Linux Workstations; Windows Workstations
- University of Illinois (NCSA)
  - MODI4: O2K, 195 MHz R10000, 12 GB RAM
  - Linux (IA32 & IA64 Clusters)
- Boston University
  - LEGO: O2K, 195 MHz R10000, 8 GB RAM
Grid Assets In Play – Software
- The University of Kansas
  - IRIX 6.5: Globus 2.0 (host); COG [Java] (client); Cactus
  - Linux: Globus 2.0 (host); COG [Java] (client); Cactus
  - Windows 2K: COG (client); Cactus
- University of Illinois (NCSA)
  - IRIX 6.5: Globus 2.0 (host); COG (client); Cactus
  - Linux: Cactus
- Boston University
  - IRIX 6.5: Globus (host); COG (client); Cactus
Research Projects
- Problem Complexity: Initial (Target)
  - Pseudo-homogeneous (Heterogeneous) Kinetics
  - Temperature And Feed Composition Excitation
  - 1,500 (70,000) Grid Nodes & 200 (1,000) Time Steps
- Alliance Chemical Engineering Portal; Li Cheng
  - Thrust: Distributed Computation Assets
  - Infrastructure: Method Of Lines, XCAT Portal, DDASSL
- Cactus Environment; Karen Camarda
  - Thrust: Parallel Computation Algorithms
  - Infrastructure: Crank-Nicolson, Cactus, PETSc
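The portal project's method-of-lines infrastructure replaces the spatial derivatives of the reactor PDEs with finite differences and hands the resulting ODE system to DDASSL. A minimal sketch of the idea for a toy 1D plug-flow balance, with a hand-rolled RK4 loop standing in for DDASSL (the model, grid size, and parameters here are illustrative, not the reactor model itself):

```python
import numpy as np

# Method-of-lines sketch for a toy 1D plug-flow species balance
#   dc/dt = -v * dc/dz - k * c
# The spatial derivative is replaced by an upwind difference; the
# resulting ODE system would go to DDASSL in the portal (RK4 here).
def rhs(c, v, k, dz, c_feed):
    dcdz = np.empty_like(c)
    dcdz[0] = (c[0] - c_feed) / dz      # upwind difference at the inlet
    dcdz[1:] = (c[1:] - c[:-1]) / dz
    return -v * dcdz - k * c

def integrate(n=50, L=1.0, v=1.0, k=2.0, c_feed=1.0, t_end=2.0, dt=1e-3):
    """March the semi-discrete system to t_end; returns c(z)."""
    dz = L / n
    c = np.zeros(n)                     # start-up from an inert bed
    t = 0.0
    while t < t_end:
        k1 = rhs(c, v, k, dz, c_feed)
        k2 = rhs(c + 0.5 * dt * k1, v, k, dz, c_feed)
        k3 = rhs(c + 0.5 * dt * k2, v, k, dz, c_feed)
        k4 = rhs(c + dt * k3, v, k, dz, c_feed)
        c += dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += dt
    return c
```

The initial problem size on the slide (1,500 grid nodes, 200 time steps) corresponds to a system of this shape but with coupled energy and multi-species balances in two spatial dimensions; the stiff coupling is why an implicit DAE solver such as DDASSL is used rather than an explicit scheme.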
ChE Portal Project Plan
- Grid Asset Deployment
  - Client: KU; Host: KU or NCSA or BU
- Grid Services Used
  - Globus Resource Allocation Manager
  - GridFTP
- Computation Distribution (File Transfer Load)
  - Direct-To-Host Job Submission (Null)
  - Client: Job Submission; Host: Simulation (Negligible)
  - Client: Simulation; Host: ODE Solver (Light)
  - Client: Solver; Host: Derivative Evaluation (Heavy)
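Each distribution level moves more of the computation across the client–host boundary; at the "Heavy" level every derivative evaluation crosses it, so per-call messaging cost dominates. A sketch of why file transfer as a message-passing mechanism is expensive, with local files standing in for the GridFTP transfers (the derivative function and file layout are invented for illustration):

```python
import json, os, tempfile, time

def derivative(y):
    """Toy derivative evaluation: dy/dt = -y."""
    return [-v for v in y]

def derivative_via_files(y, workdir):
    """The same evaluation, but with arguments and results serialized
    through files, standing in for GridFTP transfers when the solver
    runs on the client and derivative evaluation runs on the host."""
    req = os.path.join(workdir, "request.json")
    rsp = os.path.join(workdir, "response.json")
    with open(req, "w") as f:          # client ships the state out...
        json.dump(y, f)
    with open(req) as f:               # ..."host" reads the request...
        result = derivative(json.load(f))
    with open(rsp, "w") as f:          # ...and writes the response
        json.dump(result, f)
    with open(rsp) as f:               # client reads the result back
        return json.load(f)

def time_calls(fn, n=2000):
    y = [1.0] * 100
    t0 = time.perf_counter()
    for _ in range(n):
        fn(y)
    return time.perf_counter() - t0

with tempfile.TemporaryDirectory() as d:
    direct = time_calls(derivative)
    filed = time_calls(lambda y: derivative_via_files(y, d))
# the file-based round trip is typically orders of magnitude slower
# than a direct call, before any real network latency is even added
```

Multiplied by the 211,121 derivative evaluations of a full run, this per-call overhead is the effect quantified on the results slide.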
ChE Portal Project Results
Run Times (Wall Clock Minutes)

| Load \ Host | PILTDOWN | JADE | MODI4   |
| Null        |          |      |         |
| Negligible  | NA       |      |         |
| Light       | NA       |      |         |
| Heavy       | 2540*    | NA   | 15.00** |

* 211,121 Derivative Evaluations
** Exceeded Interactive Queue Limit After 3 Time Steps (10,362 Derivative Evaluations)
ChE Portal Project Conclusions
Conclusions
- The Cost For The Benefits Associated With The Use Of Grid-Enabled Assets Appears Negligible.
- The Portal Provides Robust Mechanisms For Managing Grid-Distributed Computations.
- The Cost Of Standard File-Transfer Procedures As A Message-Passing Mechanism Is Extremely High.
Recommendation
- A High Priority Must Be Assigned To Development Of High-Performance Alternatives To Standard File-Transfer Protocols.
Cactus Project Plan
- Grid Asset Deployment
  - Client: KU; Host: NCSA (O2K, IA32 Cluster, IA64 Cluster)
- Grid Services Used: MPICH-G
- Cactus Environment Evaluation
  - Shared Memory vs Message Passing
  - Problem Size: 5×10^5 – 1×10^8 Algebraic Equations
  - Grid Assets: 0.5 – 8.0 O2K Processor Minutes; 0.1 – 4.0 IA32 Cluster Processor Minutes
  - Application Script Use
Cactus Project Results
[Figure: run-time and speedup results]
Cactus Project Conclusions
Conclusions
- The IA32 Cluster Outperforms The O2K On The Small Problems Run To Date. (IA32 Is Faster Than The O2K, And Its Speedup Exceeds The O2K's.)
- The Cluster Computations Appear Somewhat Fragile. (Convergence Problems Encountered Above A 28-Node Cluster Configuration; Apparently Similar Problems With The IA64 Cluster.)
- The Grid Service (MPICH-G) Evaluation Has Only Begun.
Recommendations
- Continue The Planned Evaluation Of Grid Services.
- Continue The Planned IA64 Cluster Evaluation.
Overall Conclusions
- The University Of Kansas Is Actively Involved In Developing The Grid-Enabled Computation Culture Appropriate To Its Research & Teaching Missions.
- Local Computation Assets Appropriate To Topical Application Development And Use Are Necessary.
- Understanding Of And Access To Grid-Enabled Assets Are Necessary.