Genetic Algorithms for Finite State Machine Inference


Genetic Algorithms for Finite State Machine Inference
Thesis advisor: Assoc. Prof. Dr. Prabhas Chongstitvatana
Committee chair: Prof. Dr. Chidchanok Lursinsap
Committee members: Asst. Prof. Dr. Boonserm Kijsirikul, Asst. Prof. Dr. Nachol Chaiyaratana
Presented by Mr. Nattee Niparnan, student ID 403 02410 21

Introduction
- Inference of an FSM: from observed input/output sequences, build a hypothesis machine equivalent to the target machine
- The learning method drives the hypothesis machine to mimic the target machine
[Diagram: target and hypothesis machines both mapping INPUT to OUTPUT, with the learning method testing for equivalence]
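The inference setting can be made concrete with a minimal Mealy-machine simulator. This is an illustrative sketch only; the function and table names are assumptions, not the thesis's code:

```python
def run_fsm(delta, lam, start, inputs):
    """delta[(state, symbol)] -> next state; lam[(state, symbol)] -> output."""
    state, outputs = start, []
    for sym in inputs:
        outputs.append(lam[(state, sym)])
        state = delta[(state, sym)]
    return outputs

# A two-state target machine over the input alphabet {0, 1}.
delta = {("A", 0): "B", ("A", 1): "A", ("B", 0): "A", ("B", 1): "B"}
lam   = {("A", 0): "x", ("A", 1): "y", ("B", 0): "y", ("B", 1): "x"}

print(run_fsm(delta, lam, "A", [0, 1, 0, 1]))  # ['x', 'x', 'y', 'y']
```

Inference then amounts to recovering `delta` and `lam` when only the input and output sequences are observable.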

Presentation Outline
- Claim
- Some details of claim: legal stuff
- Conviction: experiment, analysis, conclusion
- Extras: summary

Hypothesis
The new genetic algorithm proposed in this thesis solves the finite state machine inference problem better than the former genetic algorithm.

Legal Stuff: Objective
- To develop a better genetic algorithm for the problem

Legal Stuff: Scope
- Compare the new method with the reference genetic algorithm
- The new method must be shown to be better than the reference method
- The solutions from the new method must be shown to be consistent

Former GA (REF)
- Encode δ and λ in a bit string: each state stores (next state, output) for its 0-transition and its 1-transition
- Single-point crossover
- Evaluate by counting matching output bits: run the input sequence through the hypothesis machine and compare its output sequence against the target's output sequence
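A hedged sketch of how such a chromosome might be decoded and scored. Names and structure are assumptions; the reference GA packs these fields into bits, whereas this sketch uses tuples for readability:

```python
def decode(chromosome):
    """chromosome: one ((next0, out0), (next1, out1)) pair per state."""
    delta, lam = {}, {}
    for s, (t0, t1) in enumerate(chromosome):
        delta[(s, 0)], lam[(s, 0)] = t0
        delta[(s, 1)], lam[(s, 1)] = t1
    return delta, lam

def fitness(chromosome, inputs, target_outputs):
    """Count positions where the hypothesis's output matches the target's."""
    delta, lam = decode(chromosome)
    state, score = 0, 0
    for sym, want in zip(inputs, target_outputs):
        if lam[(state, sym)] == want:
            score += 1
        state = delta[(state, sym)]
    return score

# A one-state machine that echoes its input scores perfectly on echo data.
echo = [((0, 0), (0, 1))]
print(fitness(echo, [0, 1, 1, 0], [0, 1, 1, 0]))  # 4
```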

New GA
- New evaluation & encoding
- New crossover operator
- NEW1 method

New Evaluation
- The old evaluation can mislead the search: a correct δ under a wrong λ receives a totally wrong score
[Example: a target machine with states A and B and outputs 0/A, 1/A, 0/B, 1/B; a hypothesis machine with the same transitions but outputs 0/B, 1/B, 0/A, 1/A scores nothing despite its correct structure]
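The pitfall can be seen in a tiny self-contained check (the scoring helper and the output labels are illustrative assumptions): with δ correct but λ swapped, every comparison fails and the score is zero.

```python
def output_score(lam, target_lam, exercised):
    """Count exercised transitions whose output matches the target's."""
    return sum(lam[t] == target_lam[t] for t in exercised)

target_lam  = {("A", 0): "A", ("A", 1): "A", ("B", 0): "B", ("B", 1): "B"}
swapped_lam = {("A", 0): "B", ("A", 1): "B", ("B", 0): "A", ("B", 1): "A"}
exercised = [("A", 0), ("B", 1), ("A", 1), ("B", 0)]

# Correct structure, swapped outputs: zero credit from the old evaluation.
print(output_score(swapped_lam, target_lam, exercised))  # 0
```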

New Evaluation
- Main idea: perform a local search on each output value ("make-then-ask")
- Each transition is evaluated by the IO pairs that exercise it
- Example: a transition 0/? out of state X is exercised by the IO pairs (0,B), (0,B), (0,A). The old method fixes the output to A and is only 1/3 correct; the new method adjusts the output according to the IO pairs, choosing B to get 2/3 correct

New Evaluation: Example
[Diagram: a machine with states a-d traced on Input 0 0 1 0 1 0 1 and Output 0 1 1 0 0 0 0, visiting states a c d a d a d]
Evaluation value = 3 + 0 + 1 + 2 = 6

Output Definition
[Diagram: each transition's output is defined by the IO pairs exercising it; an unexercised transition's output is N/A (any arbitrary value)]

New Encoding
- Encoding λ is futile, so it is omitted: each state stores only the next state for its 0-transition and its 1-transition
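With λ omitted, a chromosome reduces to the next-state table; the outputs are filled in later by the local-search evaluation. A minimal sketch (names assumed):

```python
def decode_delta(chromosome):
    """chromosome: one (next0, next1) pair per state -> transition table."""
    return {(s, bit): nxt
            for s, (n0, n1) in enumerate(chromosome)
            for bit, nxt in ((0, n0), (1, n1))}

print(decode_delta([(1, 0), (0, 1)]))
# {(0, 0): 1, (0, 1): 0, (1, 0): 0, (1, 1): 1}
```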

New Crossover
- The encoding scheme introduces a chance of not having tight linkage, so single-point crossover has a highly destructive effect
[Diagram: states A through G laid out along the chromosome]

New Crossover
- Choose two parents and find the better one
- Rearrange states according to DFS order, discarding inaccessible states
- Perform single-point crossover on the new list of states
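The DFS rearrangement step can be sketched as follows (an assumed implementation, with the 0-transition explored before the 1-transition): states are listed in depth-first order from the start state, and states never reached are simply dropped from the chromosome.

```python
def dfs_order(delta, start):
    """States in depth-first order from start; unreachable states are omitted."""
    order, seen, stack = [], set(), [start]
    while stack:
        s = stack.pop()
        if s in seen:
            continue
        seen.add(s)
        order.append(s)
        # Push the 1-transition first so the 0-transition is explored first.
        stack.extend([delta[(s, 1)], delta[(s, 0)]])
    return order

# State 1 is inaccessible from state 0 and is silently discarded.
delta = {(0, 0): 2, (0, 1): 0,
         (1, 0): 1, (1, 1): 1,
         (2, 0): 0, (2, 1): 2}
print(dfs_order(delta, 0))  # [0, 2]
```

Single-point crossover is then applied to the reordered state lists of the two parents.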

New Crossover: Example
[Diagram: a parent's states (A through G) rearranged into DFS order before single-point crossover]

Experiment
- Compare the performance of REF, NEW1 and NEW2
- Measure the number of generations used, the time used and the number of successful runs

Setup of the Experiment
[Diagram: an FSM generator produces target FSMs; an IO sequence generator derives an IO sequence set from each target; REF, NEW1 and NEW2 each take the IO sequence set as input and output a hypothesis machine]

Experimental Result
[Charts omitted]

Experimental Result: Summary

Generations       REF       NEW1      NEW2
Total             749,246   527,488   505,780
Relative          100%      70.40%    67.51%
Best              31        35        -
Successful Runs   461       561       585

Time              REF       NEW1      NEW2
Total             64,350    46,393    51,801
Relative          100%      72.10%    80.50%
Best              2         52        26

Analysis
- New evaluation & encoding = local search
- Search space reduction
- Schema preservation
- Introns: introns from inaccessible states; intron-nullifying introns from unexercised states

Search Space Reduction
[Charts omitted]

Search Space Reduction: Summary

Experiment              A         A1        A2
Avg. Generations Used   70.40%    61.60%    54.12%
Avg. Time Used          72.09%    55.16%    49.43%
Problems Solved         121.69%   135.62%   150.64%

Additional Experiment
- Compare NEW2 with the heuristic-based method red-blue
- Compare the correctness of the result when the size of the training set is reduced, using cross-validation
- Correctness = proportion of correctly identified data on the test set
- Training set: IO sequence length 5 to 35 (step 2), number of sequences 6 to 36 (step 6)
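The correctness measure defined above amounts to a simple proportion over the test set (a sketch; the function name is an assumption):

```python
def correctness(predicted, actual):
    """Proportion of test outputs the hypothesis reproduces correctly."""
    return sum(p == a for p, a in zip(predicted, actual)) / len(actual)

print(correctness([0, 1, 1, 0], [0, 1, 0, 0]))  # 0.75
```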

Heuristic Method: red-blue
- Uses heuristics to guide the search
- Fast and scalable
- No restriction on the size of the hypothesis

Correctness vs. Sample Size
[Chart omitted]

Additional Experiment: Analysis
- A shorter description of the hypothesis is preferable (Occam's Razor)

Size of Hypothesis vs. Sample Size
[Chart omitted]

What has been done?
- A genetic algorithm for the finite state machine inference problem was presented
- It was empirically shown that the proposed method is better than former methods

What can be extended by others?
- Practical issues: better linkage awareness; chromosome representation
- Theoretical issues: effect of introns; formal analysis of the preference for short hypotheses

What would you like to ask?