1 Performance Estimation of Distributed Real-time Embedded Systems by Discrete Event Simulations
Gabor Madl (1), Nikil Dutt (1), Sherif Abdelwahed (2)
(1) University of California, Irvine
(2) Mississippi State University
{gabe,
EMSOFT 2007

2 Outline
Motivation
  Distributed Real-time Embedded (DRE) systems
  Need to combine various analysis methods for performance estimation
Modeling DRE systems
Formal performance estimation
  Branching intervals, race conditions
  Event order tree
Real-time Verification
  On-the-fly construction of the event order tree
Evaluation
  Comparison with random simulations
  Comparison with timed automata model checking

3 DRE Systems
Distributed real-time embedded (DRE) systems are often reactive and event-driven
  Better latency than in synchronous/time-triggered systems
  Easier to implement, no need for global synchronization
Computations are driven by events – complex model
  Asynchrony, concurrency, race conditions
  Hard to predict all behaviors
Performance estimation is a key challenge
  Task execution times
  Communication delays
  Degree of parallelism
  Throughput

4 Static Analysis Methods
Classical scheduling theory [Liu, Layland 1973]
Symta/S [Henia et al. 2005]
  Application of heterogeneous scheduling methods for component-based systems
Modular Performance Analysis [Wandeler et al. 2006]
  Application of real-time calculus for performance estimation
Synchronous languages [Benveniste et al. 2003]
  Alternative approach for defining real-time constraints in globally synchronous systems with deterministic concurrency
Giotto [Henzinger et al. 2003]
  Time-triggered language for schedulability analysis using the concept of logical execution time

5 Simulations
Several variations used in industry (e.g. RTL languages, SystemC, etc.)
  Various abstraction levels trade off performance and accuracy
Ptolemy II [Lee et al. 2001]
  Complex framework for the heterogeneous modeling of embedded systems
  Focuses on non-deterministic systems
Simulation-based performance estimation method for MPSoCs [Lahiri et al. 2001]
  Execution traces represented as symbolic graphs
  Analysis is based on simulations of the symbolic model

6 Model Checking
Task timed automata [Ericsson et al. 1999]
  Approach for deciding non-preemptive schedulability of asynchronous event-driven task graphs
Task-level analysis of DRE systems [Madl et al. 2004, 2005, 2006]
  Extends task timed automata with event channels and schedulers to model distributed real-time embedded systems
Thread-level analysis of real-time middleware using timed automata [Subramonian et al. 2006]
  Detailed analysis of middleware services
  Low-level abstraction

7 Evaluation of Existing Methods
Static analysis methods
  Often assume independent, periodic tasks
  Cannot capture dynamic effects such as race conditions
Simulations
  Can show the presence of an error, but never its absence
  Ad hoc; hard to measure coverage
  Limited design space exploration
Model checking
  State space explosion problem
  No partial results
  Time-consuming and costly
Each method has its advantages and disadvantages

8 Need to Combine Methods
Providing formal guarantees on real-time properties is often infeasible in distributed asynchronous systems
  Combine simulations and model checking to increase the coverage of simulations
Need to capture dynamic effects for accurate performance estimation
  Race conditions, non-deterministic execution times, arbitration, etc.
Need to capture execution intervals in continuous time, as non-WCET deadline misses are possible in distributed asynchronous systems

9 Model-based Design and Analysis
The domain-specific model (DSM) captures key properties of the design
The DSM is mapped to a formal executable model that drives the real-time verification and performance estimation
The formal executable model can be mapped to heterogeneous models of computation (MoC)
The same formal model is used for both performance estimation and real-time verification
Parameters are obtained by simulations

10 Model for Real-time Systems
DRE = {T, M, C, TR, D}
  T is the set of tasks
  M is the set of machines
  C is the set of channels, C ⊆ T
  TR is the set of timers, TR ⊆ T
  D is the task dependency relationship, D ⊆ T × T
  machine(tk): T → M assigns each task to a machine
Each task's execution interval is given by [bcet, wcet]
Channels are special tasks that model FIFO buffers and delays
Deadlines are defined for tasks
Timers trigger the execution of tasks periodically
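The model tuple above can be sketched as plain data structures. This is an illustrative sketch only: the names (`Task`, `DREModel`) and fields are assumptions, not the DREAM tool's actual API.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass(frozen=True)
class Task:
    name: str
    bcet: float                      # best-case execution time
    wcet: float                      # worst-case execution time
    machine: str                     # machine(tk): T -> M
    deadline: Optional[float] = None
    is_channel: bool = False         # channels are special tasks (C subset of T)
    period: Optional[float] = None   # set only for timers (TR subset of T)

@dataclass
class DREModel:
    tasks: dict = field(default_factory=dict)   # T (with C and TR as subsets)
    deps: set = field(default_factory=set)      # D subset of T x T

    def add_task(self, t):
        self.tasks[t.name] = t

    def add_dep(self, src, dst):
        self.deps.add((src, dst))

    def machines(self):
        # M, derived from the task-to-machine assignment
        return {t.machine for t in self.tasks.values()}

m = DREModel()
m.add_task(Task("A", bcet=1.0, wcet=3.0, machine="cpu0"))
m.add_task(Task("B", bcet=2.0, wcet=4.0, machine="cpu1"))
m.add_dep("A", "B")
print(m.machines())  # {'cpu0', 'cpu1'}
```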

11 Events
The DRE MoC is a continuous-time extension of DES
  Can be mapped to timed automata
Event labels are time-stamped values from the domain of non-negative real numbers
  Global time
  During simulation, "jump" to the event with the smallest timestamp
A task's execution time is defined as a constraint between the timestamps of its input and output events
Tasks, channels, and timers compose using events
  Event flow follows dependencies
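The "jump to the event with the smallest timestamp" rule is the core of any discrete event simulation loop. A minimal sketch, with illustrative event labels (the class and method names are assumptions, not from the paper):

```python
import heapq
import itertools

class EventQueue:
    """Minimal DES core: always advance to the earliest pending event."""
    def __init__(self):
        self._heap = []
        self._counter = itertools.count()  # tie-breaker for equal timestamps

    def schedule(self, timestamp, label):
        heapq.heappush(self._heap, (timestamp, next(self._counter), label))

    def run(self):
        trace = []
        while self._heap:
            t, _, label = heapq.heappop(self._heap)
            trace.append((t, label))  # global time "jumps" to t
        return trace

q = EventQueue()
q.schedule(3.0, "oA")
q.schedule(1.0, "iA")
q.schedule(3.0, "iB")
print(q.run())  # events come out in timestamp order
```

In a full simulator each popped event would trigger task handlers that schedule further events; here the loop only records the resulting trace.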

12 Task States, Schedulers
Abstract model for scheduling
Fixed-priority scheduler composes with tasks using events
  Scheduler triggers the transition from the wait state to the run state
  Channels are not triggered by the scheduler
Scheduling policy is defined as priorities between transitions
Non-preemptive scheduling
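A minimal sketch of a non-preemptive fixed-priority dispatcher with wait/run states, assuming lower numbers mean higher priority; names and structure are illustrative, not the paper's formal scheduler model.

```python
class Scheduler:
    def __init__(self):
        self.waiting = {}      # task name -> priority (lower value = higher)
        self.running = None

    def release(self, task, priority):
        self.waiting[task] = priority   # task enters the "wait" state

    def dispatch(self):
        # Non-preemptive: a new task is picked only when the machine is idle.
        if self.running is None and self.waiting:
            task = min(self.waiting, key=self.waiting.get)
            del self.waiting[task]
            self.running = task         # transition: wait -> run
        return self.running

    def finish(self):
        done, self.running = self.running, None
        return done

s = Scheduler()
s.release("low", 2)
s.release("high", 1)
print(s.dispatch())  # 'high' runs first
s.release("mid", 0)  # arrives while 'high' runs; no preemption occurs
print(s.dispatch())  # still 'high'
```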

13 Performance Estimation Problem
A run, or execution trace, of the DRE model is the chronological sequence of events that occurred in the model
In the DRE MoC, two execution traces are equivalent if they contain the same events and the chronological order of the events is the same
We define the end-to-end computation time between two events as the maximum possible difference between those events along all possible runs of the model
The DRE model implies a partial ordering on events
Enumerate equivalent execution traces to obtain the end-to-end computation time
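The two definitions above can be sketched directly: equivalence compares only event order (timestamps may differ), and end-to-end time takes the worst case over runs. Trace representation and names are illustrative assumptions.

```python
def equivalent(trace_a, trace_b):
    """Traces are equivalent iff they contain the same events in the same order."""
    # each trace: list of (timestamp, event_label) in chronological order
    return [e for _, e in trace_a] == [e for _, e in trace_b]

def end_to_end(runs, start_event, end_event):
    """Maximum timestamp difference between two events over all given runs."""
    worst = 0.0
    for trace in runs:
        times = {label: t for t, label in trace}
        worst = max(worst, times[end_event] - times[start_event])
    return worst

run1 = [(0.0, "iA"), (2.0, "oA"), (5.0, "oB")]
run2 = [(0.0, "iA"), (3.0, "oA"), (7.0, "oB")]
print(equivalent(run1, run2))                 # True: same event order
print(end_to_end([run1, run2], "iA", "oB"))   # 7.0
```

This is why only one simulation per equivalence class is needed: within a class, the ordering constraints are identical and the extreme timestamps can be derived symbolically.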

14 Branching Intervals
In the examples below, the total orderings are: iA,oA,iB,oB,iC,oC and iA,iC,oC,oA,iB,oB and iA,oA,iB,iC,oB,oC
  Scheduler events are omitted to keep the example simple
  More total orderings are possible
A total ordering of events implies time constraints on the execution intervals of tasks
  In the example shown on the left, A's execution time has to be less than 3, otherwise oB will not happen before iC
Whenever the order may change, the event order tree has to branch to capture these cases (branching interval)
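The threshold-3 example can be illustrated as follows. Both functions are hypothetical stand-ins for the paper's branching-interval detection; the threshold value 3 comes from the slide's example.

```python
def event_order(a_exec_time, threshold=3.0):
    """Resulting order of oB and iC for a given execution time of task A."""
    return ["oB", "iC"] if a_exec_time < threshold else ["iC", "oB"]

def branches(bcet, wcet, threshold=3.0):
    """Split [bcet, wcet] where the event order can change.

    If the threshold falls strictly inside the execution interval, both
    orderings are possible, so the event order tree must branch here.
    """
    if bcet < threshold < wcet:
        return [(bcet, threshold), (threshold, wcet)]
    return [(bcet, wcet)]

print(event_order(2.0))    # ['oB', 'iC']
print(event_order(4.0))    # ['iC', 'oB']
print(branches(1.0, 5.0))  # [(1.0, 3.0), (3.0, 5.0)] -- two branches
print(branches(1.0, 2.5))  # [(1.0, 2.5)] -- no branching needed
```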

15 Race Conditions
If two tasks receive input events at the same time from tasks that are assigned to different machines, race conditions may be present
  In the example shown on the right, there can be a race condition between task_C and task_D
  task_B and task_E are assigned to different machines
Race conditions can affect the total ordering of events
  This problem may result in priority inversion
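The race criterion above (inputs arriving from predecessor tasks on different machines) can be sketched as a simple check over the dependency relation. The data structures are illustrative assumptions, not the paper's detection algorithm.

```python
def may_race(deps, machine, t1, t2):
    """Conservatively flag a possible race between tasks t1 and t2.

    deps: set of (src, dst) dependency pairs; machine: task -> machine name.
    A race is possible when some pair of predecessors runs on different
    machines, so the arrival order of their output events is not determined.
    """
    preds1 = {src for src, dst in deps if dst == t1}
    preds2 = {src for src, dst in deps if dst == t2}
    return any(machine[p] != machine[q] for p in preds1 for q in preds2)

# Toy version of the slide's example: task_B and task_E feed task_C and
# task_D from different machines.
deps = {("task_B", "task_C"), ("task_E", "task_D")}
machine = {"task_B": "m0", "task_E": "m1"}
print(may_race(deps, machine, "task_C", "task_D"))  # True
```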

16 Event Order Tree
Combinations of branching intervals and race conditions are interesting corner cases
  Extremely hard to find by simulations
[Figure: event order tree annotated with branching points and race conditions]

17 Real-time Verification by DES
The event order tree provides a way for the real-time verification of a large class of DRE models
  The proposed method is not guaranteed to terminate
Restriction: the model has to return to its initial state
  Otherwise use finite-horizon analysis, or timed automata
If the restriction above is satisfied, then the event order tree is repeatable; it repeats itself from all the leaves
  There is a finite number of nodes in the tree
  Only one simulation is needed per class of equivalent execution traces
  Timers generate events at discrete time steps
  Each task generates a finite number of output events for each input event

18 On-the-fly Detection of Branching
Both the performance estimation and the real-time verification are based on the event order tree
The event order tree can be extremely large; there is a need to obtain it on the fly
We utilize a DFS-based exploration of the tree
  Based on repetitive simulations
  Memory use is not significant
Each task detects its branching intervals during DES
  Generate all permutations of branching intervals
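The DFS exploration by repeated simulation can be sketched as follows, with `choices_fn` and `simulate` as hypothetical stand-ins for branching detection and the DES engine. Only the current path is kept in memory, which is why memory use stays small.

```python
def explore(prefix, choices_fn, simulate, visit):
    """DFS over the event order tree, re-simulating along each path."""
    choices = choices_fn(prefix)       # branching choices enabled after prefix
    if not choices:
        visit(simulate(prefix))        # leaf: one representative simulation
        return
    for c in choices:
        # Recursion depth equals tree depth, so memory is O(depth),
        # never the size of the whole tree.
        explore(prefix + [c], choices_fn, simulate, visit)

# Toy model: two independent binary branching points.
def choices_fn(prefix):
    if len(prefix) == 0:
        return ["a1", "a2"]
    if len(prefix) == 1:
        return ["b1", "b2"]
    return []

traces = []
explore([], choices_fn, simulate=lambda p: list(p), visit=traces.append)
print(traces)  # all 4 total orderings, enumerated depth-first
```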

19 Case Studies Used for Analysis
Set of examples based on real-time CORBA avionics applications
  Software timers, asynchronous event propagation
  Thread pool policy follows the half-sync/half-async pattern
We have assumed non-preemptive scheduling for comparisons with other methods

20 Compared to Model Checking
Model checking is tailored to answer yes/no questions
  Is the execution time less than x?
Storing timed states is a major contributor to memory use
  Checking for previously visited states is inefficient
Realistic models are often too complex
  Exhaustive analysis is infeasible – the state space explosion problem
Compared performance to the UPPAAL and Verimag IF tools
  Timed automata model checkers are faster in general
  On larger models they run out of memory
The proposed method is CPU-bound – it yields partial results when exhaustive analysis is infeasible (most of the time)
  Well suited for multicore/distributed implementation

21 Compared to Random Simulations
Random simulations often revisit the same total orderings of events; rare events are hard to find
Symbolic simulation model – faster than naive simulations
  Simulation speed: ~30 ms for 100 tasks, ~1-5 ms for 5-15 tasks
The DES-based method increases coverage over time
  Focuses on corner cases: branching intervals, race conditions, and their combinations
Improves on the existing practice of combining directed and random simulations ad hoc for performance estimation

22 DREAM Analysis Framework
Open-source DREAM tool
  Automatic timed automata model generation for the UPPAAL and Verimag IF tools
  Simulation-based model checking
  Performance estimation using DES
  Random testing
  Schedulability optimizations
[Figure: tool flow – the ALDERIS model built in the GME tool is exported as an XML representation, consumed by the DREAM tool, which targets the UPPAAL and Verimag IF model checkers]

23 Work in Progress & Future Work
DREAM was integrated into the Scenery project by Fujitsu Labs of America
  Analyze real-time properties in embedded software and SoCs
Composing with other MoCs
  System-level performance estimation
  Wireless & multimedia case studies
Looking for domain-specific modeling languages with large-scale DRE examples (hard to find)
Possible optimizations by hierarchical model checking
Distributing the model checker algorithm to increase scalability
  Not the main focus at the moment

24 Questions?
EMSOFT 2007

