Presentation on theme: "Condor in Louisiana. Tevfik Kosar, CCT: Center for Computation & Technology, Louisiana State University."— Presentation transcript:

1 AT LOUISIANA STATE UNIVERSITY CCT: Center for Computation & Technology @ LSU Condor in Louisiana Tevfik Kosar Center for Computation & Technology Louisiana State University April 30, 2008

2 Roadmap
▫ HPC Resources in Louisiana
▫ Need for Condor?
▫ Reservoir Simulations
▫ Coastal Modeling
▫ Conclusions

3 Louisiana Optical Network Initiative (LONI) - http://loni.org
▫ Next-generation network for research
▫ 40 Gb/sec bandwidth state-wide
▫ Connected to the National LambdaRail (NLR, 10 Gb/sec) in Baton Rouge
▫ Spans 6 universities and 2 health centers
LONI is a high-speed computing and networking resource supporting scientific research and the development of new technologies, protocols, and applications to positively impact higher education and economic development in Louisiana.

4 LONI Computing Resources
1 x Dell 50 TF Intel Linux cluster housed at the state's Information Systems Building (ISB)
▫ "Queen Bee", named after Governor Kathleen Blanco, who pledged $40 million over ten years for the development and support of LONI
▫ 680 nodes (5,440 CPUs), 5,440 GB RAM
 Two quad-core 2.33 GHz Intel Xeon 64-bit processors per node
 8 GB RAM per node
▫ Measured 50.7 TF peak performance
▫ According to the June 2007 Top500 listing*, Queen Bee ranked as the 23rd fastest supercomputer in the world
6 x Dell 5 TF Intel Linux clusters housed at 6 LONI member institutions
▫ 128 nodes (512 CPUs), 512 GB RAM
 Two dual-core 2.33 GHz Xeon 64-bit processors per node
 4 GB RAM per node
▫ Measured 4.772 TF peak performance
5 x IBM Power5 575 AIX clusters housed at 5 LONI member institutions
▫ 13 nodes (104 CPUs), 208 GB RAM
 Eight 1.9 GHz IBM Power5 processors per node
 16 GB RAM per node
▫ Measured 0.851 TF peak performance
Combined total of ~84 Teraflops
* http://top500.org/list/2007/06/100
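The combined figure follows from the per-cluster numbers above; a quick sanity check:

```python
# Sanity check: combined LONI compute capacity from the per-cluster figures above.
queen_bee_tf = 50.7   # 1 x Dell "Queen Bee" cluster
dell_tf = 4.772       # each of the 6 smaller Dell clusters
p5_tf = 0.851         # each of the 5 IBM Power5 clusters

total_tf = queen_bee_tf + 6 * dell_tf + 5 * p5_tf
print(round(total_tf, 3))  # 83.587, i.e. the ~84 TF quoted on the slide
```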

5 LONI: The big picture (diagram by Chris Womack) [Figure: National LambdaRail, Louisiana Optical Network, LONI members, IBM P5 supercomputers, Dell 80 TF cluster]

6 Why Condor?
▫ Who would say NO to more free cycles?
▫ Condor is more than cycle-stealing
The Condor project for us:
▫ Batch scheduler (Condor)
▫ Gateway to the Grid (Condor-G)
▫ Grid software stack (DAGMan, NeST, Stork, ...)
▫ Open source (do your own thing)
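In its batch-scheduler role, Condor runs jobs described by plain submit files; a minimal vanilla-universe sketch (the executable and file names here are hypothetical, not from the talk) looks like:

```
# Minimal Condor submit description (hypothetical job and file names)
universe   = vanilla
executable = simulate
arguments  = --input case01.dat
output     = case01.out
error      = case01.err
log        = case01.log
queue
```

Submitting it with `condor_submit` queues the job; Condor-G uses the same submit-file mechanism to route jobs to remote Grid resources.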

7 UCoMS: Ubiquitous Computing and Monitoring System for Discovery and Management of Energy Resources
Goals:
▫ Reservoir simulation and uncertainty analysis
▫ 26M simulations, each generating 50 MB of data --> 1.3 PB of data in total
▫ Drilling processing and real-time monitoring is data-intensive as well --> real-time visualization and analysis of TBs of streaming data
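The 1.3 PB figure follows directly from the simulation count and per-run output size; a quick sanity check:

```python
# Sanity check: total UCoMS data volume from the figures above.
n_sims = 26_000_000   # 26M reservoir simulations
mb_per_sim = 50       # 50 MB of output each

total_pb = n_sims * mb_per_sim / 1e9   # MB -> PB (decimal units)
print(total_pb)  # 1.3 PB, as stated on the slide
```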

8 UCoMS Abstract Workflow

9 UCoMS Concrete Workflow
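Concretely, DAGMan expresses such a workflow as a plain-text DAG file; a hedged sketch (node names and submit-file names are hypothetical) pairing Stork data-placement steps with a compute step might be:

```
# Hypothetical DAGMan description for one simulation run:
# stage input in via Stork, simulate under Condor, stage results out.
# DATA nodes run under Stork in the DAGMan versions that supported it.
DATA StageIn  stage_in.stork
JOB  Simulate simulate.submit
DATA StageOut stage_out.stork
PARENT StageIn  CHILD Simulate
PARENT Simulate CHILD StageOut
```

DAGMan only releases a node once all of its parents have completed, which is what turns the abstract workflow of the previous slide into an executable one.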

10 Putting It Together

11 Monitoring DAGs via Web: UCoMS Closed-Loop Demonstration (SC07)

12 Monitoring DAGs via Web

13 Monitoring DAGs via Web

14 SCOOP: SURA Coastal Ocean Observing and Prediction Program (SURA: Southeastern Universities Research Association)
Goals:
▫ Execution of atmospheric (WindGen) and hydrodynamic (WW3, ADCIRC) models for predicting the effects of a storm (e.g. storm surge)
▫ 32 tracks per storm, every six hours
▫ Each track may have a different priority
▫ Issues related to data and workflow management, resource discovery and brokering, and task farming

15 SCOOP Scheduling
Scheduling issues:
– Dynamic prioritization based on scenario and resources
– Three queues: on-demand (preemptive), priority, and best-effort
– Co-scheduling, advance reservation
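The three-queue policy can be sketched as a toy dispatcher (a minimal illustration of the ordering only, not SCOOP's actual implementation): on-demand work goes first, then priority jobs, then best-effort ones.

```python
import heapq

# Toy three-queue dispatcher: on-demand (0) beats priority (1) beats
# best-effort (2). Job names below are made up for illustration.
ON_DEMAND, PRIORITY, BEST_EFFORT = 0, 1, 2

def dispatch_order(jobs):
    """jobs: list of (queue_class, name); return names in service order."""
    heap = [(cls, i, name) for i, (cls, name) in enumerate(jobs)]
    heapq.heapify(heap)
    return [heapq.heappop(heap)[2] for _ in range(len(heap))]

jobs = [(BEST_EFFORT, "wind-archive"),
        (ON_DEMAND, "storm-surge-now"),
        (PRIORITY, "adcirc-track-07")]
print(dispatch_order(jobs))
# ['storm-surge-now', 'adcirc-track-07', 'wind-archive']
```

A real on-demand queue is preemptive, i.e. it also suspends or evicts already-running lower-class jobs; this sketch only captures the relative ordering of queued work.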

16 Best Effort Scheduling

17 On-demand Scheduling

18 Conclusions
▫ Condor is more than a cycle-stealing tool
▫ We have used Condor, DAGMan and Stork successfully in end-to-end processing of:
 Coastal modeling
 Reservoir simulations
▫ Visually monitoring DAG progress
▫ Willing to share experience
▫ New Stork release coming soon

