Software Dynamics: A New Method of Evaluating Real-Time Performance of Distributed Systems
Janusz Zalewski
Computer Science, Florida Gulf Coast University, Ft. Myers, FL
FALSE2002, Nashville, Nov. 2002
Talk Outline
* RT Software Architecture
* Evaluating S/W Architectures
* Timeliness & S/W Dynamics
* Conclusion
Feedback Control System
Generic Real-Time Software Architecture
Basic Components of Real-Time Software Architecture
* Sensor/Actuator component
* User Interface component
* Communication Link component
* Database component
* Processing component
* Timing component
Air-Traffic Control System Physical Diagram
Air-Traffic Control System Context Diagram
The idea of grouping I/O information into different categories, which later determine the software architecture, follows the fundamental software engineering principle of separation of concerns (Parnas, 1970s).
Model of a Distributed Embedded Simulation
We are missing good (any) measures to characterize Behavioral Properties of a software module (its dynamics).
Interrupt Latency
The time interval between the occurrence of an external event and the start of the first instruction of the interrupt service routine.
Interrupt Latency Involves
* H/W logic processing
* Interrupt disable time
* Handling higher H/W priorities
* Switching to handler code
Real-Time System Responsiveness
Dispatch Latency
The time interval between the end of the interrupt handler code and the first instruction of the process activated (made runnable) by this interrupt.
Dispatch Latency Involves
* OS decision time to reschedule (non-preemptive kernel state)
* Context switch time
* Return from OS call
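Taken together, the system's response to an external event decomposes into interrupt latency, handler execution, and dispatch latency. A minimal Python sketch of that decomposition, following the component breakdown on the last two slides; all microsecond values are invented for illustration:

```python
# Sketch: total event response time as the sum of the latency
# components listed above. All timing values are illustrative
# assumptions, not measurements from this talk.

def interrupt_latency_us(hw_logic, irq_disabled, higher_prio, switch_to_handler):
    """Interrupt latency: event occurrence -> first ISR instruction."""
    return hw_logic + irq_disabled + higher_prio + switch_to_handler

def dispatch_latency_us(resched_decision, context_switch, os_return):
    """Dispatch latency: end of ISR -> first instruction of the awakened task."""
    return resched_decision + context_switch + os_return

def response_time_us(isr_body_us, **c):
    il = interrupt_latency_us(c['hw_logic'], c['irq_disabled'],
                              c['higher_prio'], c['switch_to_handler'])
    dl = dispatch_latency_us(c['resched_decision'],
                             c['context_switch'], c['os_return'])
    return il + isr_body_us + dl

total = response_time_us(isr_body_us=12,
                         hw_logic=1, irq_disabled=5, higher_prio=2,
                         switch_to_handler=1,
                         resched_decision=4, context_switch=3, os_return=1)
print(total)  # 29
```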
Real-Time Properties
* Responsiveness
* Timeliness
* Schedulability
* Predictability
How to measure these properties?
* Responsiveness - just outlined
* Timeliness - proposed below
* Schedulability - rate monotonic and deadline monotonic analyses
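For schedulability, the rate monotonic side can be sketched with the classic Liu & Layland utilization bound U ≤ n·(2^(1/n) − 1). The task set below is an invented illustration, not the benchmark from this talk:

```python
# Sketch of a rate-monotonic schedulability check using the
# Liu & Layland utilization bound U <= n * (2^(1/n) - 1).
# The task periods and execution times are made-up illustration values.

def rm_utilization_bound(n):
    """Liu & Layland bound for n periodic tasks under RM scheduling."""
    return n * (2 ** (1.0 / n) - 1)

def rm_schedulable(tasks):
    """tasks: list of (wcet, period); returns (utilization, bound, verdict)."""
    u = sum(c / t for c, t in tasks)
    bound = rm_utilization_bound(len(tasks))
    return u, bound, u <= bound

tasks = [(1, 4), (1, 5), (2, 10)]   # (C_i, T_i) pairs
u, bound, ok = rm_schedulable(tasks)
print(round(u, 3), round(bound, 3), ok)  # 0.65 0.78 True
```

If the utilization exceeds the bound, the test is inconclusive and an exact response-time analysis would be needed.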
Two measures of timeliness:
* Overall time deadlines are missed (by a task)
* Number of times deadlines are missed by X percent
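A sketch of computing both measures from a log of (deadline, completion time) pairs, one per task release; the sample values are invented:

```python
# Sketch: the two proposed timeliness measures computed from a run of
# (deadline_ms, completion_ms) samples. The sample data is invented.

def overall_miss_time(samples):
    """Total time by which deadlines were missed, summed over all releases."""
    return sum(max(0, done - dl) for dl, done in samples)

def misses_by_percent(samples, x_percent):
    """Number of completions that missed their deadline by more than X%."""
    return sum(1 for dl, done in samples if done > dl * (1 + x_percent / 100.0))

samples = [(100, 98), (100, 104), (100, 101), (100, 130), (100, 100)]
print(overall_miss_time(samples))     # 4 + 1 + 30 = 35
print(misses_by_percent(samples, 2))  # 104 and 130 exceed 102 -> 2
```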
5-task Benchmark
Overall time the deadlines are missed for 100 experiments.
The number of times the deadlines are missed by 2%.
Overall time the deadlines are missed for 100 experiments (CORBA).
The number of times the deadlines are missed by 2% (CORBA).
ATCS: Software Components Communicating via CORBA
Overall time (in milliseconds) deadlines are missed for 20 aircraft (in 100 experiments).
Number of times deadlines are missed by more than 20% for 20 aircraft (in 100 experiments).
Satellite Ground Control Station
SGCS Implementation
SGCS Physical Architecture
Single DB Client Request Processing Time.
Percent of deadlines missed for one DB Client.
Five DB Clients Request Processing Time.
Percent of deadlines missed for five DB Clients.
Sensitivity: a measure of the magnitude of the system’s response to changes.
Sensitivity = [(y1 – y0) / ((y1 + y0)/2)] / [(x1 – x0) / ((x1 + x0)/2)]
(the relative change of the output over the relative change of the input, each normalized by its midpoint)
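A direct transcription of this definition in Python; the (x, y) values in the example are invented:

```python
# Sketch of the sensitivity measure above: relative change in output
# over relative change in input, both normalized by the midpoint.
# The example values are invented for illustration.

def sensitivity(x0, x1, y0, y1):
    dy_rel = (y1 - y0) / ((y1 + y0) / 2.0)
    dx_rel = (x1 - x0) / ((x1 + x0) / 2.0)
    return dy_rel / dx_rel

# e.g. doubling the load (x: 10 -> 20) doubles the response time (y: 5 -> 10)
print(round(sensitivity(10, 20, 5, 10), 2))  # 1.0
```

A sensitivity above 1 means the output changes proportionally more than the input, as in the 1.73 and 1.64 cases on the next slides.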
Sensitivity = 1.73
Sensitivity = 1.00
Sensitivity = 1.64
First Order Dynamics
G(s) = K / (τ·s + 1)
Time constant τ: a measure of the speed of the system’s response to changes.
Settling Time: the time at which the response settles to within 2% of its final value.
Time Constant τ = 0.25 × Settling Time
τ = 165 ms
τ = 87.5 ms
τ = 15 ms
Distributed Embedded Simulation Architecture
Statistical measures of timeliness:
* Round-trip time stability
* Service time effect
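Round-trip time stability can be summarized with simple statistics: mean, standard deviation, and jitter (max minus min). A sketch with invented sample round-trip times:

```python
# Sketch: simple statistics for round-trip message times as a
# stability indicator. The sample round-trip times are invented.
import statistics

def rtt_stability(rtts_ms):
    """Return (mean, sample standard deviation, jitter) of round-trip times."""
    mean = statistics.mean(rtts_ms)
    stdev = statistics.stdev(rtts_ms)
    jitter = max(rtts_ms) - min(rtts_ms)
    return mean, stdev, jitter

rtts = [12.0, 11.5, 12.5, 13.0, 11.0]
mean, stdev, jitter = rtt_stability(rtts)
print(round(mean, 2), round(jitter, 2))  # 12.0 2.0
```

A small standard deviation and jitter relative to the mean indicate a stable round-trip time across the run.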
Service time effect for a specific architecture
Round-trip message time for 5-task simulation
Conclusion
* Behavioral Properties are crucial for successful software development
* Sensitivity is one important property
* Software Dynamics seems to be a measurable property as well