Some Technical Considerations in Negative Selection Algorithms, 4/15/2005


1 Some Technical Considerations in Negative Selection Algorithms, 4/15/2005

2 Two issues
Boundary-aware versus point-wise: negative selection can do something positive selection cannot.
Geometric shape of detectors: whether and why it matters.

3 Background
The “greatest common factor” of negative selection algorithms: detectors that represent the complementary space.

4 Boundary-aware vs. point-wise
The self region is not equivalent to the set of training points.
Boundary-aware: the training data are used as a group rather than as individual points.
This rests on an assumption about the self samples.

5 Training points are discrete
Where does ‘self’ stop? (We will see it is not only a question of the self threshold.)

6 Boundary-aware NS algorithm
The sample points are used collectively: this provides more information than can be obtained from the points individually (the whole is larger than the sum of its parts).
Positive selection cannot do this.
Drawback: clusters are not represented explicitly.
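One way to make the point-wise vs. boundary-aware contrast concrete is in how a detector's maximal radius is computed. The sketch below is illustrative only: the function names and the two specific radius rules are assumptions, not the exact algorithms from the talk.

```python
import math

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def radius_pointwise(center, self_samples, self_threshold):
    # Point-wise interpretation: self is the union of balls of radius
    # self_threshold around each individual sample, so a detector must
    # stop self_threshold short of its nearest sample.
    d_min = min(dist(center, s) for s in self_samples)
    return d_min - self_threshold

def radius_boundary_aware(center, self_samples):
    # Boundary-aware interpretation (illustrative reading): the samples
    # are treated collectively as marking the self/nonself boundary, so
    # a detector may extend all the way to its nearest sample.
    return min(dist(center, s) for s in self_samples)
```

For a candidate center at distance 3 from its nearest sample with self threshold 1, the point-wise rule allows radius 2 while the boundary-aware rule allows radius 3, so the boundary-aware detectors cover more of the space near the training data.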

7 Simplified example

8 Fair assumption
It is fair to assume that the density of the self-sample distribution is related to the “self threshold”.
There is no reason to assume that self samples are not close to the boundary between the self and nonself regions.

9

10 Aggressive or conservative to detect?

11 Geometric shape of detectors
Depends on the data representation and the generation algorithm.
Comparison of different simple shapes.
Potential benefit of various shapes, or of multiple shapes combined.
The sphere is minimal.

12 Two examples of different algorithms
GA with niching (hyper-cube detectors):
1. raw_fitness(R) = volume(R) − C · num_elements(R)
2. fitness(R) = raw_fitness(R) − volume(R ∩ Rj), penalizing overlap with the other detectors Rj
V-detector (spherical detectors):
1. Random position.
2. Largest size that does not cover any self sample.
Possible extension: accommodate noisy data.
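The two V-detector steps above can be sketched in a few lines. This is a hedged sketch of the idea only: the parameter names, the [0,1]^dim search space, and the retry bound are assumptions, not the published algorithm's exact controls.

```python
import math
import random

def euclid(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def generate_v_detectors(self_samples, self_threshold, n_detectors,
                         dim=2, max_tries=10000, seed=None):
    # V-detector idea: pick a random center in the unit hypercube, then
    # give it the largest radius that does NOT cover any self sample.
    rng = random.Random(seed)
    detectors = []
    for _ in range(max_tries):
        if len(detectors) >= n_detectors:
            break
        center = tuple(rng.random() for _ in range(dim))
        # Largest admissible radius: distance to the nearest self
        # sample, minus the self threshold around that sample.
        radius = min(euclid(center, s) for s in self_samples) - self_threshold
        if radius > 0:  # center lies outside the self region: keep it
            detectors.append((center, radius))
    return detectors
```

Because each detector's size is determined independently by its own position, the detectors are variable-sized, which is the point of the scheme.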

13 Comparison
(Hyper)sphere: position + one size.
Hyper-ellipse: position + a size in each direction.
Rectangle / (hyper)cube: rules (value ranges) in each direction.
“Hyper” means high-dimensional, or more than that?
More parameters per detector is not necessarily bad, but we need a concrete advantage to justify using them.
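The parameter counts implied by the slide can be tallied directly; this small helper is an illustration of that counting scheme, with the caveat that the ellipse count assumes axis-aligned ellipses (a freely rotated hyper-ellipsoid needs extra orientation parameters on top of these).

```python
def free_parameters(shape, dim):
    # Free parameters per detector in `dim` dimensions, following the
    # slide's scheme: a position plus one or more size parameters.
    if shape == "sphere":      # a center point plus a single radius
        return dim + 1
    if shape == "ellipse":     # a center plus a radius per axis (axis-aligned)
        return 2 * dim
    if shape == "rectangle":   # a [low, high] value range per axis
        return 2 * dim
    raise ValueError("unknown shape: " + shape)
```

In 10 dimensions a sphere needs 11 numbers while an ellipse or rectangle needs 20, which is why the sphere is called minimal.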

14 Why different shapes?
Several works have tried various shapes, but comparing performance only empirically is not enough to reveal which shape is better in which scenario.
No single shape can be the best for every possibility. For example, if the nonself region is elongated in one direction, it takes many spherical detectors to cover, but not necessarily many rectangular ones.
A combination may capture the strengths of different shapes.
The complexity of generation and representation cannot be ignored.
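The elongated-region example admits a crude back-of-the-envelope bound, sketched below under an assumed setup: the nonself region is a length-by-width strip and detectors must fit inside it so as not to cover self.

```python
import math

def sphere_lower_bound_for_strip(length, width):
    # A spherical detector that fits inside a length x width strip has
    # diameter at most `width`, so it covers at most `width` of the
    # strip's length. A crude lower bound on the number of spheres
    # needed is therefore length / width, while a single rectangular
    # detector covers the entire strip.
    return math.ceil(length / width)
```

A strip of length 10 and width 1 needs at least 10 spherical detectors under this bound, versus one rectangle, which is the asymmetry the slide points at.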

15 Sphere is good
Minimal: a point plus a matching threshold (under some distance measure).
The hyper-sphere is not the limit: think of an abstract sphere.
It applies to different distance measures: under the Manhattan distance, for example, the “sphere” is a square rotated 45°.
It applies to different data representations.
Is the sphere good enough?
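The “abstract sphere” point can be shown in code: the detector is just {x : dist(x, center) ≤ radius}, and swapping the distance measure changes the detector's geometric shape without changing the matching logic. The metric names below are the usual ones; the third (Chebyshev) is an extra illustration not mentioned on the slide.

```python
import math

def in_detector(x, center, radius, metric="euclidean"):
    # One matching rule, several geometric shapes, depending only on
    # the distance measure chosen.
    diffs = [abs(a - b) for a, b in zip(x, center)]
    if metric == "euclidean":   # a round ball
        return math.sqrt(sum(d * d for d in diffs)) <= radius
    if metric == "manhattan":   # in 2-D: a square rotated 45 degrees
        return sum(diffs) <= radius
    if metric == "chebyshev":   # in 2-D: an axis-aligned square
        return max(diffs) <= radius
    raise ValueError("unknown metric: " + metric)
```

The point (1, 1) at Euclidean distance about 1.414 from the origin is inside a Euclidean detector of radius 1.5 but outside the Manhattan one (distance 2), so the same threshold matches different regions under different measures.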

16 References
D. Dasgupta and F. Gonzalez. An Immunity-Based Technique to Characterize Intrusions in Computer Networks. IEEE Transactions on Evolutionary Computation, 6(3), pages 1081-1088, June 2002.
Zhou Ji and Dipankar Dasgupta. Negative Selection Algorithm Using Variable-Sized Detectors in Real-Valued Application. GECCO 2004.
Z. Ji. Influence of Training Data’s Interpretation Strategy on Real-Valued Negative Selection Algorithm. Technical report, The University of Memphis, June 2004.

