
1
**Active Learning to Classify Email**

4/22/05

2
What’s the problem? How will I ever sort all these new emails?

3
What’s the problem? To get an idea of what mail I have gotten, I will need to sort these new messages. A great solution would be if I could sort just a few and my computer could sort the rest for me. To make it really accurate, the assistant could even pick which messages I should manually sort, so that it can learn to do the best job possible. (Active Learning)

4
What’s the solution? To solve this problem, we need a way to choose the most informative training examples. This requires some way of ranking emails by how informative they would be for classification. So, where do we start?

5
**Email Classification**

So, what do we know about email classification?

SVM and Naïve Bayes significantly outperform many other methods (Brutlag 2000, Kiritchenko 2001).
Both SVM and Naïve Bayes are suitable for the “online” learning required to solve this problem effectively (Cauwenberghs 2000).
Classifier accuracy varies more between users than between algorithms (Kiritchenko 2001).
SVM performs better for users with more email in each folder (Brutlag 2000).
Users with more email, such as in our example problem, tend to have more messages in each folder than other users (Klimt 2004).
Thus, we have chosen SVM as the basis for this research.

6
**“Bag-of-Words” Model**

[Diagram: email data → “bag of words” → SVM → classification decision]
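To make the model concrete, here is a minimal, illustrative sketch (not the authors' code) of turning an email's text into a bag-of-words count vector over a fixed vocabulary; the vocabulary and tokenization are simplified assumptions:

```python
# Illustrative bag-of-words featurization: each email becomes a vector of
# word counts over a fixed vocabulary, ignoring word order entirely.
def bag_of_words(text, vocabulary):
    """Count how often each vocabulary word occurs in the text."""
    tokens = text.lower().split()
    return [tokens.count(word) for word in vocabulary]

vocab = ["meeting", "report", "lunch"]
features = bag_of_words("Meeting notes: send the report before the meeting", vocab)
# features == [2, 1, 0]
```

A real system would add tokenization of punctuation and weighting (e.g. TF-IDF), but the vector above is the kind of input an SVM receives.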

7
**Multiple SVMs**

Using separate SVMs for each section, with LLSF classification.

[Diagram: email data → per-section SVMs → classification decision]

8
**Active Learning with SVM**

In general, examples closer to the decision hyperplane will cause a larger displacement of that boundary when labeled. (Schohn and Cohn 2000, Tong 2001)
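That selection rule can be sketched as follows; `margins` stands for the SVM decision values of an unlabeled pool, and both the name and the data are illustrative assumptions, not from the talk:

```python
def pick_most_uncertain(margins):
    """Return the index of the unlabeled example closest to the SVM
    hyperplane, i.e. the one with the smallest absolute decision value."""
    return min(range(len(margins)), key=lambda i: abs(margins[i]))

# Example: the fourth example (decision value -0.1) sits nearest the boundary.
queried = pick_most_uncertain([-1.8, 0.2, 0.9, -0.1])
# queried == 3
```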

9
**What if our prediction is right?**

Labeling the closer example:
Labeling the farther example:

10
**And if our prediction is wrong?**

Picking the closer example:
Picking the farther example:

11
**Incorporating Diversity**

In this example, the instance near the top is intuitively more likely to be informative. This is known as “diversity” (Brinker 2003).
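One common way to formalize this, loosely following Brinker's angle-diversity idea, trades off closeness to the hyperplane against similarity to examples already chosen for the batch. The function names and the trade-off weight `lam` are illustrative assumptions, not the exact formulation from the slides:

```python
import math

def cosine(u, v):
    """Cosine similarity between two feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def angle_diversity_score(margin, candidate, selected, lam=0.5):
    """Brinker-style batch score (lower is better): lam weights closeness
    to the hyperplane, (1 - lam) weights similarity to already-selected
    examples, so near-duplicate queries are penalized."""
    max_sim = max((abs(cosine(candidate, s)) for s in selected), default=0.0)
    return lam * abs(margin) + (1 - lam) * max_sim
```

With `lam = 1` this reduces to plain margin-based sampling; lowering `lam` pushes the batch toward mutually dissimilar examples.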

12
**Active Learning with SVM**

But what about when you have multiple SVMs (like one-vs-rest)? (Yan 2003)
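A common heuristic for the one-vs-rest case, in the spirit of (though not necessarily identical to) Yan 2003, is to query the example whose top two class scores are closest, i.e. the most confusable one. This is an illustrative sketch under that assumption:

```python
def pick_for_multiclass(scores):
    """scores[i] holds the one-vs-rest decision values for unlabeled
    example i.  Query the example whose best and second-best class scores
    are closest together (smallest margin = most confusable)."""
    def gap(vals):
        top_two = sorted(vals, reverse=True)[:2]
        return top_two[0] - top_two[1]
    return min(range(len(scores)), key=lambda i: gap(scores[i]))

# The second example's top classes (0.3 vs 0.2) are nearly tied, so it wins.
queried = pick_for_multiclass([[2.0, -1.0, -2.0], [0.3, 0.2, -1.0]])
# queried == 1
```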

13
The Enron Corpus: 150+ users, 200,000 emails

14
Initial Results: Trained on 10%, Tested on 90%

15
**Chrono-Diverse Algorithm**

The way a user sorts email changes over time. Pick training data that are maximally different from previously selected data with respect to time.
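A minimal sketch of that idea, assuming each message carries a numeric timestamp; the function and variable names are invented for illustration:

```python
def pick_chrono_diverse(candidate_times, labeled_times):
    """Query the unlabeled message whose timestamp is farthest in time
    from every already-labeled message, covering periods the classifier
    has not yet seen."""
    def distance_to_labeled(t):
        return min(abs(t - s) for s in labeled_times)
    return max(range(len(candidate_times)),
               key=lambda i: distance_to_labeled(candidate_times[i]))

# With labels at times 0 and 10, the candidate at time 30 is most isolated.
queried = pick_chrono_diverse([1, 5, 30], [0, 10])
# queried == 2
```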

16
**Combination Algorithm**

Combine the strengths of Standard and Chrono-Diverse. Take a weighted combination of their results, adjusting the weighting with a parameter lambda.
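A sketch of the weighted combination, assuming each algorithm produces a rank per candidate (lower = more preferred); the exact scores and normalization in the talk may differ:

```python
def pick_combined(standard_ranks, chrono_ranks, lam):
    """Blend the Standard (margin-based) and Chrono-Diverse rankings:
    lam = 1 recovers Standard, lam = 0 recovers Chrono-Diverse.
    Returns the index of the candidate with the best blended rank."""
    combined = [lam * s + (1 - lam) * c
                for s, c in zip(standard_ranks, chrono_ranks)]
    return min(range(len(combined)), key=lambda i: combined[i])

# With equal weighting, the candidate ranked 1st by Chrono-Diverse and
# 2nd by Standard beats candidates that only one algorithm favors.
queried = pick_combined([0, 1, 2], [2, 0, 1], lam=0.5)
# queried == 1
```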

17
Results: Trained on 10%, Tested on 90%

18
Parameter Tuning

19
Conclusions

The state-of-the-art algorithm for active learning with text classification performs horribly on this data!
Choosing emails for time diversity works very well.
Combining the two works best.

20
**Future Work**

Improve the efficiency of SVM or find a better alternative.
Determine when chronological diversity performs best and worst.
Adapt the algorithm to online classification.
