Simulation at NASA for the Space Radiation Effort
Dr. Robert C. Singleterry Jr.
NASA Administrator's Fellow (Cohort 6)
NASA Langley Research Center
HPC Users Forum, Stuttgart, Germany
Overview
- Simulation at NASA for Space Radiation
- What Are Our Problems?
- Am I (We) Unique?
- Possible Solutions?
- No Real Conclusion!
Simulation at NASA for Space Radiation
What is space radiation? One of the top five problems that must be solved for extended space travel.
Simulation at NASA for Space Radiation
Just solve the Boltzmann equation: easy as pie... there's pie?
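The equation itself did not survive the transcript. As a sketch, using conventional symbols rather than ones taken from the slides, the one-dimensional, straight-ahead form of the linear Boltzmann transport equation used in space radiation work can be written as:

```latex
\left[ \frac{\partial}{\partial x}
     - \frac{\partial}{\partial E}\, S_j(E)
     + \sigma_j(E) \right] \phi_j(x, E)
  = \sum_{k} \int_{E}^{\infty} \sigma_{jk}(E, E')\, \phi_k(x, E')\, \mathrm{d}E'
```

where \(\phi_j(x,E)\) is the flux of particle species \(j\) at depth \(x\) and energy \(E\), \(S_j\) is the stopping power, \(\sigma_j\) is the total macroscopic cross section, and \(\sigma_{jk}\) is the cross section for producing species \(j\) from collisions of species \(k\). The coupling of every species to every other species through the right-hand side is what makes "just solve it" anything but easy.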
Simulation at NASA for Space Radiation
Not so easy!!
- Stochastic methods: Monte Carlo is the most prevalent
- Deterministic methods: discrete ordinates is the most prevalent, and finite elements can now be used for the geometry solution
- Straight-ahead method (what we use now)
- Dozens, if not hundreds, of other methods exist as well
Possible solutions for space radiation? The physics allows the straight-ahead method, and there are many ways to solve even with this method:
- Interpolation
- Ray-by-ray
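To make the "stochastic" entry above concrete, here is a minimal Monte Carlo sketch in Python: a toy, absorption-only, straight-ahead slab problem. The function name `mc_transmission` and all the numbers are invented for illustration, not taken from any NASA code.

```python
import math
import random

def mc_transmission(sigma, thickness, n_particles, seed=0):
    """Estimate the fraction of particles that cross a slab without
    colliding, by sampling exponential free paths (toy absorption-only,
    straight-ahead model)."""
    rng = random.Random(seed)
    survived = 0
    for _ in range(n_particles):
        # Distance to the first collision is exponentially distributed
        # with mean 1/sigma; 1 - rng.random() avoids log(0).
        path = -math.log(1.0 - rng.random()) / sigma
        if path > thickness:
            survived += 1
    return survived / n_particles

# For this toy model the analytic answer is exp(-sigma * thickness).
estimate = mc_transmission(sigma=1.0, thickness=1.0, n_particles=100_000)
```

With enough samples the estimate converges statistically to the analytic `exp(-sigma * thickness)`; that slow, sample-by-sample convergence is exactly why Monte Carlo is both the most general and often the most expensive of the options listed.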
Simulation at NASA for Space Radiation
Today's method of choice for us
What Are Our Problems?
- As vendors move toward a multi-/many-core environment, the individual cores slow down to beat thermal limits
- The NASA space radiation code is a physics research code with a web-based front end
  - The last thing thought about was execution time
  - The physics code was written from 1970 to the present; "legacy" does not begin to explain it
  - It is serial!!! (we have added dynamic memory allocation)
- All is not lost, just limited
What Are Our Problems?
- Interpolation: not much to do here
  - Thread the mathematics
  - Coarse parallelism over the interpolation points
- Ray-by-ray: since the rays are independent, each ray can go on a core
  - Thread the mathematics
- Still hit a wall at a relatively small number of cores
- It would take more money than we have to rewrite our code for a cluster environment
- Latest NASA machine: 43,008 cores!!!
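The ray-by-ray independence noted above is what lets each ray be dispatched to its own core. A minimal sketch in Python, assuming a toy attenuation model: `transport_along_ray`, the segment format, and the inputs are all invented for illustration, not taken from the actual code.

```python
import math
from concurrent.futures import ThreadPoolExecutor

def transport_along_ray(ray):
    """Hypothetical stand-in for the per-ray transport solve:
    attenuate an incident fluence through the (sigma, thickness)
    material segments seen along one ray."""
    fluence = ray["incident"]
    for sigma, thickness in ray["segments"]:
        fluence *= math.exp(-sigma * thickness)
    return fluence

# Three made-up rays through different material stack-ups.
rays = [
    {"incident": 1.0, "segments": [(0.5, 2.0)]},
    {"incident": 1.0, "segments": [(0.5, 1.0), (1.0, 0.5)]},
    {"incident": 2.0, "segments": []},
]

# Because the rays share no state, each call can run on its own core;
# a thread pool stands in here for the threaded mathematics.
with ThreadPoolExecutor() as pool:
    results = list(pool.map(transport_along_ray, rays))
```

This coarse decomposition is embarrassingly parallel, but it also shows the wall: once every ray (or interpolation point) has its own core, there is no more parallelism to hand out without reworking the algorithm itself.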
What Are Our Problems?
- As cores go from 43,008 to 1,000,000:
  - Our algorithm is stuck at its current core count without a major rework, which we cannot afford
  - Not sure how many more cores we could utilize if we could afford a rework, but << 1,000,000!!!
SUMMARY
- As the cores get slower, our execution time gets longer because we cannot use more cores
- Yet users want more and better answers as computers get more powerful (OK, as users' needs become more demanding)
What Are Our Problems?
HPCWire, 9/24/08, "Intel: CPUs Will Prevail Over Accelerators in HPC":
"What we're finding is that if someone is going to go to the effort of optimizing an application to take advantage of an offload engine, whatever it may be, the first thing they have to do is parallelize their code."
Richard Dracott, General Manager, HPC Business Unit
Am I (We) Unique?
- There are many small and large ISVs
  - Abaqus: < 128 cores
  - Few open-source packages can use >> 128 cores
  - Most (if not all) engineering, day-to-day packages cannot use more than 1,000 cores
  - MCNPX can use only a limited number of cores for certain types of problems
- Most everybody needs help! Those that do not need help can afford:
  - To rewrite code when new architectures arrive
  - To write code from scratch to fit an architecture
Possible Solutions?
- Smarter compilers (the user's point of view!)
- No new language; just amend Fortran and C
  - Like OpenMP, but with MPI (a programming environment)
  - Nice if housed in current environments: Intel, PGI, Absoft, etc.
  - Do not care if a production compile takes days
- Enable non-x86 hardware in current compilers: ASICs, GPGPU/Cell, FPGAs, etc.
No Real Conclusions
- Nothing but the possible solutions:
  - Develop new algorithms to solve the Boltzmann equation so that > 1,000,000 cores can be utilized
  - Over $10M and 3 years to parallelize and V&V what we already have, and this must be done first!!
  - Over $100M and 10 years to develop new methods of solution to fit the vision of the chip makers, and then V&V those methods
- Space radiation is a small but unique solution domain within the total radiation analysis world