Adaptive Offloading for Pervasive Computing


1 Adaptive Offloading for Pervasive Computing
Amin Saremi, 7/6/2005

2 Introduction: Pervasive Computing
Challenge: running complex applications on resource-constrained mobile devices such as PDAs.
Solutions:
rewrite applications according to the resource capacity of each mobile device
application-based or system-based adaptations
Adaptive Offloading

3 Decision-Making Problems for Adaptive Offloading
The offloading inference engine should trigger offloading at the right time and offload the right program objects to achieve low offloading overhead and efficient program execution.
The two decision-making problems:
adaptive offloading triggering
efficient application partitioning

4 Solution Overview: Our Assumptions
the application is written in an object-oriented language
the user's environment contains powerful surrogates and plentiful wireless bandwidth

5

6 Offloading Inference Engine
does not require any prior knowledge about an application's execution pattern or the runtime environment's resource status
employs the Fuzzy Control model as the basis for offloading triggering
its inference module selects an effective application partitioning from many possible partition plans, driven by the memory constraint or CPU speed
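A minimal Java sketch of the two responsibilities just described, assuming hypothetical names (OffloadingInferenceEngine, ResourceStatus, PartitionPlan) that are not taken from the actual platform:

// Placeholder types so the sketch is self-contained; real definitions would be richer.
class ResourceStatus { long availMemBytes; double availBandwidthKbps; }
class PartitionPlan  { java.util.Set<String> classesToOffload = new java.util.HashSet<>(); }

// Hypothetical interface capturing the two inference tasks named on this slide.
interface OffloadingInferenceEngine {
    // Offloading triggering: should offloading start, and how much memory should be freed?
    boolean shouldTrigger(ResourceStatus status);
    long targetBytesToFree(ResourceStatus status);

    // Application partitioning: pick one plan out of the candidate partition plans.
    PartitionPlan selectPartition(java.util.List<PartitionPlan> candidates, ResourceStatus status);
}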

7 Distributed Offloading Platform
Application execution monitoring: an application execution graph
Each graph node represents a Java class: memory size, AccessFreq, Location, IsNative
Each graph edge represents the interactions between the objects of two classes: InteractionFreq, BandwidthRequirement
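A minimal sketch, assuming made-up class names, of the execution graph and the per-node and per-edge attributes listed on this slide:

import java.util.ArrayList;
import java.util.List;

// One node per Java class in the monitored application.
class ClassNode {
    String className;
    long   memSize;       // memory size consumed by the class's objects
    int    accessFreq;    // how frequently the class is accessed
    String location;      // currently on the "device" or on the "surrogate"
    boolean isNative;     // native classes cannot be offloaded
    List<Interaction> edges = new ArrayList<>();   // interactions with other classes
}

// One edge per pair of interacting classes.
class Interaction {
    ClassNode peer;
    int  interactionFreq;        // number of interactions between the two classes
    long bandwidthRequirement;   // wireless traffic those interactions would generate if split
}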

8 Candidate Partition Plan Generation
Resource Monitoring: the mobile device, the surrogate, and the wireless network
available memory in the Java heap, wireless bandwidth, and delay
Candidate Partition Plan Generation
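As an illustration, the free space in the Java heap can be sampled with the standard java.lang.Runtime API; how the platform actually measures wireless bandwidth and delay is not stated on the slide, so those methods are left as stubs:

// Sketch of periodic resource monitoring on the mobile device.
class ResourceMonitor {
    // Heap memory still available before the JVM reaches its configured maximum.
    long freeHeapBytes() {
        Runtime rt = Runtime.getRuntime();
        return rt.maxMemory() - (rt.totalMemory() - rt.freeMemory());
    }

    // Stub: could be estimated from timed transfers to the surrogate.
    double availableBandwidthKbps() { return 0.0; }

    // Stub: could be estimated from the round-trip time of a small probe message.
    double wirelessDelayMs() { return 0.0; }
}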

9 Surrogate Discovery
Transparent RPC Platform
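The slide gives no implementation detail; purely as an illustration, standard Java RMI could play the role of surrogate discovery plus transparent remote invocation (the registry name and interface below are assumptions, not the platform's actual mechanism):

import java.rmi.Naming;
import java.rmi.Remote;
import java.rmi.RemoteException;

// Illustrative stand-in for the transparent RPC platform.
interface SurrogateService extends Remote {
    byte[] invoke(String className, String method, byte[] serializedArgs) throws RemoteException;
}

class SurrogateDiscovery {
    // Look up a surrogate registered under an assumed name in an RMI registry on the given host.
    static SurrogateService find(String host) throws Exception {
        return (SurrogateService) Naming.lookup("rmi://" + host + "/surrogate");
    }
}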

10 Adaptive Offloading Inference Engine
Overhead of offloading:
transferring objects between the mobile device and the surrogate
performing remote data accesses and function invocations over a wireless network
Offloading Triggering Inference:
examines the current resource usage and the available resources
decides whether offloading should be triggered
decides what level of resource utilization to target
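A rough, illustrative cost sketch for the two overhead components listed above (the arithmetic is generic back-of-the-envelope accounting, not the engine's actual model):

// Offloading overhead, in milliseconds, for a given wireless link.
class OffloadingOverhead {
    // Time to transfer an object of the given size to the surrogate.
    static double transferTimeMs(long objectBytes, double bandwidthKbps) {
        return (objectBytes * 8.0) / bandwidthKbps;   // bits / (kbit/s) = ms
    }

    // Time added by remote data accesses / function invocations after offloading.
    static double remoteCallTimeMs(int calls, double rttMs, long bytesPerCall, double bandwidthKbps) {
        return calls * (rttMs + (bytesPerCall * 8.0) / bandwidthKbps);
    }
}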

11 Simple threshold-based approach
“if the current amount of free memory on the mobile device is less than 20% of its total memory, then trigger offloading and offload enough program objects to free up at least 40% of the mobile device’s memory.”
Fuzzy Control model:
linguistic decision-making rules provided by system or application developers
membership functions
generic fuzzy inference engine based on fuzzy logic theory
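The quoted rule, written out as a small Java sketch (the 20% and 40% figures come from the slide; the class and method names are illustrative):

// Simple threshold-based offloading trigger.
class ThresholdTrigger {
    static final double TRIGGER_FRACTION = 0.20;  // trigger when free memory < 20% of total
    static final double TARGET_FRACTION  = 0.40;  // then free up at least 40% of total memory

    // Returns the number of bytes worth of program objects to offload, or 0 if not triggered.
    static long bytesToOffload(long freeMem, long totalMem) {
        if (freeMem < TRIGGER_FRACTION * totalMem) {
            long targetFree = (long) (TARGET_FRACTION * totalMem);
            return Math.max(0, targetFree - freeMem);
        }
        return 0;
    }
}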

12 Offloading memory size
Offloading rules:
if (AvailMem is low) and (AvailBW is high) then NewMemSize := low;
if (AvailMem is low) and (AvailBW is moderate) then NewMemSize := average;
if (AvailMem is high) and (AvailBW is low) then NewMemSize := high;
If any of these rules is matched by the current system conditions, the offloading inference engine triggers offloading.
offloading memory size = current memory consumption - new memory utilization
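A deliberately crisp (non-fuzzy) rendering of the three rules and the final subtraction, just to make the control flow concrete; a real fuzzy controller combines membership degrees and defuzzifies rather than switching on discrete categories, and the enum below is an assumption:

// Crisp sketch of the offloading rules above.
enum Level { LOW, MODERATE, AVERAGE, HIGH }

class OffloadingRules {
    // Returns the target NewMemSize, or null if no rule matches (offloading not triggered).
    static Level newMemSize(Level availMem, Level availBW) {
        if (availMem == Level.LOW  && availBW == Level.HIGH)     return Level.LOW;
        if (availMem == Level.LOW  && availBW == Level.MODERATE) return Level.AVERAGE;
        if (availMem == Level.HIGH && availBW == Level.LOW)      return Level.HIGH;
        return null;
    }

    // offloading memory size = current memory consumption - new memory utilization
    static long offloadingMemSize(long currentConsumptionBytes, long newUtilizationBytes) {
        return currentConsumptionBytes - newUtilizationBytes;
    }
}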

13 Mappings between numerical and linguistic values for each linguistic variable
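The figure with the actual mappings is omitted here; as a generic illustration of how a numerical reading maps to a degree of membership in a linguistic value, a triangular membership function is a common choice (the breakpoints in the usage comment are invented):

// Generic triangular membership function used in many fuzzy controllers.
class Membership {
    // Degree in [0,1] to which x belongs to the triangular fuzzy set (a, peak, c).
    static double triangle(double x, double a, double peak, double c) {
        if (x <= a || x >= c) return 0.0;
        return x <= peak ? (x - a) / (peak - a) : (c - x) / (c - peak);
    }
}

// Example: degree to which 35% available bandwidth is "moderate",
// assuming a hypothetical "moderate" set from 20% to 50% peaking at 35%:
// double m = Membership.triangle(35.0, 20.0, 35.0, 50.0);   // -> 1.0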

14

15 Application Partition Selection
considering the target memory utilization on the mobile device
multiple offloading requirements:
minimizing wireless bandwidth overhead
minimizing average response time
minimizing total execution time
For each neighbor node Vk of Vi:
B_{i,k} denotes the total amount of data traffic transferred between Vi and Vk
F_{i,k} denotes the total number of interactions between Vi and Vk
MS_k denotes the memory size of Vk
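A small sketch that gathers the three quantities above for one neighbor Vk of Vi, reusing the ClassNode / Interaction classes from the execution-graph sketch earlier; how the resulting metrics are compared is the subject of the next slide:

// Per-neighbor cost metric built from the execution graph.
class CostMetric {
    long bandwidth;     // B_{i,k}: total data traffic transferred between Vi and Vk
    int  interactions;  // F_{i,k}: total number of interactions between Vi and Vk
    long memSize;       // MS_k  : memory size of Vk

    static CostMetric forNeighbor(Interaction edgeToVk) {
        CostMetric c = new CostMetric();
        c.bandwidth    = edgeToVk.bandwidthRequirement;
        c.interactions = edgeToVk.interactionFreq;
        c.memSize      = edgeToVk.peer.memSize;
        return c;
    }
}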

16 For cost metrics C_k and C_l: C_k >= C_l if and only if …
Splitting Large Classes

17 Trace-Driven Simulation Experiments
Algorithms compared:
least recently used (LRU)
Split Class
Fuzzy Trigger
our approach

18

19 References
X. Gu, A. Messer, I. Greenberg, D. Milojicic, and K. Nahrstedt, “Adaptive Offloading for Pervasive Computing”, IEEE Pervasive Computing, 2004.
X. Gu, K. Nahrstedt, A. Messer, I. Greenberg, and D. Milojicic, “Adaptive Offloading Inference for Delivering Applications in Pervasive Computing Environments”, Proc. of the IEEE International Conference on Pervasive Computing and Communications (PerCom 2003), Dallas-Fort Worth, Texas, March 2003.

