Sofia Pediaditaki and Mahesh Marina, University of Edinburgh


A Machine Learning Approach to Loss Differentiation in 802.11 Wireless Networks
Sofia Pediaditaki (s.pediaditaki@sms.ed.ac.uk) and Mahesh Marina (mmarina@inf.ed.ac.uk)
University of Edinburgh

Introduction
Causes of packet losses in 802.11:
- Channel errors
- Interference (collisions or hidden terminals)
- Mobility, handoffs, queue overflows, etc.
How can a sender infer the actual cause of a loss with:
- No or little receiver feedback?
- A lot of uncertainty (time-varying channels, interference, traffic patterns, etc.)?
Use machine learning algorithms!

Do We Need Loss Differentiation?
Rate adaptation:
- Channel error: lowering the rate makes transmissions more robust at a given SNR
- Collision: lowering the rate worsens the problem (longer transmissions occupy the medium longer)
802.11 DCF mechanism:
- In 802.11, the cause of loss is assumed to be a collision by default
- Doubling the contention window hurts performance if the actual cause is a channel error
Various other applications (e.g., carrier-sensing threshold adaptation [Ma et al., ICC '07])

State of the Art
Rate adaptation algorithms [CARA, Infocom '06; RRAA, MobiCom '06] use RTS/CTS to infer the cause of loss:
- Small RTS/CTS frames are resilient to channel errors
- If the medium is captured (RTS/CTS succeeds) but the data packet is still lost, the loss is attributed to channel error
Drawbacks:
- RTS/CTS is rarely used in practice
- Extra overhead
- The hidden terminal issue is not fully resolved
- Potential unfairness

Our Aim
A general-purpose loss differentiator that is:
- Accurate and efficient: responsive and robust to the operational environment
- Supported by commodity hardware: fully implementable in the device driver without, e.g., MAC changes
- Of acceptable computational cost and low overhead
- Dependent on no (or little) information from the receiver

The Proposed Approach
Loss differentiation can be cast as a classification problem:
- Class labels: types of losses
- Features: observable data
- Goal: assign each loss to a class
The classification process:
- Training phase: <attributes, class> pairs serve as training data
- Operational phase: classify new "unlabeled" data (test data)

The Classification Process
Training phase: input dataset → pre-processing → training data (<attributes, class> pairs) → learning algorithm (Naive Bayes, Bayesian nets, decision trees, etc.) → trained model
Operational phase: "unlabeled" data (<attributes, ?>) → trained model → prediction (<attributes, class>)
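The two phases above can be sketched as follows, using scikit-learn's Gaussian Naive Bayes in place of Weka. The feature values, class labels, and query point below are synthetic placeholders for illustration, not the paper's dataset.

```python
# Sketch of the training and operational phases of the classifier.
# Feature vector per loss event: [rate_mbps, retransmission_no, channel_busy_fraction]
from sklearn.naive_bayes import GaussianNB

# Training phase: <attributes, class> pairs (made-up examples)
X_train = [
    [54, 0, 0.10], [54, 1, 0.15], [48, 0, 0.20],   # losses labeled as channel errors
    [11, 2, 0.70], [11, 3, 0.80], [5.5, 2, 0.65],  # losses labeled as collisions
]
y_train = ["channel_error"] * 3 + ["collision"] * 3

model = GaussianNB().fit(X_train, y_train)

# Operational phase: classify a new "unlabeled" loss event
prediction = model.predict([[54, 0, 0.12]])
```

A high rate with few retransmissions and a mostly idle medium falls squarely in the channel-error region of this toy training set, so the model labels it accordingly.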

Performance Evaluation (1/2)
Training data generated with the QualNet simulator:
- Single-hop random topologies (WLANs): varying numbers of rates and flows, with or without fading
- Multi-hop random topologies: one-hop traffic, multiple rates, with or without fading
Learning algorithms from the Weka workbench (University of Waikato, New Zealand)
Classes of interest:
- Channel errors
- Interference

Performance Evaluation (2/2)
Classification features (all easily obtained at the sender):
- Rate: the higher the rate, the higher the channel-error probability
- Retransmission number: due to backoff, the collision probability decreases across retransmissions
- Channel busy time: captures observed channel errors and collisions
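As a sketch, the three sender-side features might be assembled into a per-loss feature vector like this; the record field names are illustrative assumptions, not actual driver API fields.

```python
# Hypothetical mapping from a logged loss event to the feature vector
# <rate, retransmission number, channel busy time> used for classification.

def extract_features(loss_event):
    """Build the classifier's feature vector from one logged loss event."""
    return [
        loss_event["rate_mbps"],      # higher rate -> higher channel-error probability
        loss_event["retry_count"],    # backoff lowers collision probability per retry
        loss_event["busy_fraction"],  # fraction of time carrier sense saw a busy medium
    ]

features = extract_features({"rate_mbps": 54, "retry_count": 1, "busy_fraction": 0.4})
```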

Preliminary Results: No Fading
Try the simple things first (the K.I.S.S. rule)!
Naive Bayes, 10-fold cross-validation:
- WLAN (29303 instances): 99.5% prediction accuracy
- WLAN-MH (55140 instances): 95.9% prediction accuracy
- Training time: 0.01 sec
Almost a perfect predictor. But things are not that simple!

Preliminary Results: All Scenarios Together
A small step for man...
125213 instances, 10-fold cross-validation:
- Naive Bayes: 87.0% prediction accuracy, 0.06 sec training time
- Bayesian Net: 87.7% prediction accuracy, 0.15 sec training time
Naive Bayes assumes the attributes are independent; Bayesian networks make Naive Bayes less "naive".
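For reference, the 10-fold cross-validation protocol used in both result sets can be sketched as a minimal index-splitting routine (this is not Weka's implementation, just the idea: train on nine folds, test on the held-out tenth, rotate):

```python
# Minimal k-fold cross-validation index generator.
def k_fold_indices(n_instances, k=10):
    """Yield (train, test) index lists for k-fold cross-validation."""
    folds = [list(range(i, n_instances, k)) for i in range(k)]
    for held_out in range(k):
        test = folds[held_out]
        train = [i for f, fold in enumerate(folds) if f != held_out for i in fold]
        yield train, test

# With 20 instances and k=10: 10 splits, each holding out 2 instances.
splits = list(k_fold_indices(20))
```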

Discussion
- Which machine learning algorithm is the most appropriate?
- Which features are the most representative?
- Is this solution generalizable?
- Can we use the solution as-is on real hardware?
- How much training is required?
- What if we use semi-supervised learning?

Summary
Why we need a loss differentiator: rate adaptation algorithms, the 802.11 DCF mechanism, ...
We propose a machine-learning-based predictor that handles loss differentiation as a classification problem.
There are still many things we should consider...
So, can we use such a solution? "Yes, we can" [Obama '08]: preliminary results show we could.

Thank you Questions?

Naive Bayes
Bayes' theorem: posterior probability = prior probability × likelihood / evidence
p(C | F1, ..., Fn) = p(C) · p(F1, ..., Fn | C) / p(F1, ..., Fn)
This is the basis of the Bayesian classifier.
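A worked instance of the theorem for the two loss classes; the priors and likelihoods are made-up numbers chosen only to illustrate the arithmetic:

```python
# Bayes' theorem for one observed feature vector F1..Fn and two classes.
priors = {"channel_error": 0.4, "collision": 0.6}        # p(C)
likelihoods = {"channel_error": 0.30, "collision": 0.05}  # p(F1..Fn | C)

# Evidence p(F1..Fn) = sum over classes of prior x likelihood
evidence = sum(priors[c] * likelihoods[c] for c in priors)

# Posterior p(C | F1..Fn) = prior x likelihood / evidence
posteriors = {c: priors[c] * likelihoods[c] / evidence for c in priors}

# The classifier picks the class with the highest posterior.
predicted = max(posteriors, key=posteriors.get)
```

Here the evidence is 0.4 × 0.30 + 0.6 × 0.05 = 0.15, so the posterior for a channel error is 0.12 / 0.15 = 0.8, and the loss is classified as a channel error despite the lower prior.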

Bayesian Networks
A Bayes net (also called a belief network) is an augmented directed acyclic graph, represented by the pair (V, E), where:
- V is a set of vertices
- E is a set of directed edges joining vertices
Each vertex in V contains:
- The name of a random variable
- A probability distribution table: how the probability of this variable's values depends on each possible combination of parent values
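A minimal sketch of one such vertex as a data structure: a named random variable plus a table mapping each combination of parent values to a distribution over the variable's own values. The variable names and probabilities are illustrative, not taken from the paper's model.

```python
# One Bayes-net vertex: name, parents, and its conditional probability table.
loss_node = {
    "name": "LossCause",
    "parents": ("HighRate", "BusyMedium"),
    # CPT: P(LossCause | HighRate, BusyMedium), one row per parent combination
    "cpt": {
        (True,  True):  {"channel_error": 0.5, "collision": 0.5},
        (True,  False): {"channel_error": 0.8, "collision": 0.2},
        (False, True):  {"channel_error": 0.2, "collision": 0.8},
        (False, False): {"channel_error": 0.5, "collision": 0.5},
    },
}

# Look up the distribution for one parent-value combination:
dist = loss_node["cpt"][(True, False)]
```

Edges into this vertex (from HighRate and BusyMedium) are exactly what lets the net drop Naive Bayes' independence assumption: the table conditions on parent values instead of treating every feature independently.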