1 System Architecture The Integration of Processing Components and Knowledge

2 Introduction
So far:
– Presented methods of achieving goals
Integration of methods?
– Controlling execution
– Incorporating knowledge

3 Knowledge “The fact of knowing a thing, state, person; a state of being aware or informed; consciousness.” Shorter Oxford English Dictionary (1973)

4 Knowledge and Intelligence
[Diagram relating Knowledge, Intelligence and Cognition: language, creativity, planning, thinking, computation]

5 Knowledge and Intelligence Knowledge is acquired and disseminated by intelligent and cognisant beings. The terms knowledge, cognition and intelligence are used interchangeably, and there is a good reason for this: various cognitive processes help in converting information and stimuli into knowledge. Knowledgeable beings then act intelligently because of their greater awareness.

6 What knowledge? What do algorithms achieve? What is known about the problem being solved? Relationship between problem and algorithm?

7 Knowledge representation
Each knowledge representation scheme must include:
– Strategic knowledge about the problem being investigated and potential solutions
– Algorithmic knowledge about how potential solutions translate into computer vision algorithms and how the algorithms interact (about the control)
– Data knowledge about information derived from the image and how this can contribute to the solution

8 Knowledge representation
– Implied
– Feature vectors
– Relational structures
– Hierarchical structures
– Rules
– Frames

9 Implied knowledge
Knowledge encoded in software
Usually inflexible in
– Execution
– Reuse
Simple to design and implement
Systems often unreliable

10 Feature vectors
As seen in statistical representations
Vector elements can be
– Numerical
– Symbolic (coded numerically)

11 Example: the characters A and N described by feature vectors
A: strokes = 3, loops = 1, w-h ratio = 1
N: strokes = 3, loops = 0, w-h ratio = 1
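
A minimal Python sketch of such a feature-vector encoding, using the stroke, loop and width-to-height values from the example above (the distance comparison at the end is purely illustrative):

```python
import numpy as np

# Feature vectors for the characters 'A' and 'N' from the slide:
# [number of strokes, number of loops, width-to-height ratio]
features_A = np.array([3.0, 1.0, 1.0])
features_N = np.array([3.0, 0.0, 1.0])

# Vectors like these feed directly into statistical classifiers;
# a simple Euclidean distance shows they differ only in the 'loops' element.
print(np.linalg.norm(features_A - features_N))  # 1.0
```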

12 Relational structures
Encodes relationships between
– Objects
– Parts of objects
Can become unwieldy for
– Large scenes
– Complex objects

13 Relational structures
[Diagram: blocks in a scene linked by relations such as “supporting”, “adjacent” and “above”]
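
One way such a relational structure could be held in software is as a set of (object, relation, object) triples; a small sketch with hypothetical block names:

```python
# Relational structure stored as (subject, relation, object) triples.
# Block names are hypothetical; the relations follow the diagram above.
relations = {
    ("block_1", "supporting", "block_2"),
    ("block_2", "above", "block_1"),
    ("block_2", "adjacent", "block_3"),
}

def related(subject, relation):
    """Return every object linked to `subject` by `relation`."""
    return [o for (s, r, o) in relations if s == subject and r == relation]

print(related("block_1", "supporting"))  # ['block_2']
```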

14 Hierarchical structures
Follow the natural division of: scene → objects → parts of objects

15 Example:
[Hierarchy: scene → roadway, building, grassland → grass, tree, road, junction → edges]
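
A hierarchical structure of this kind could be sketched as a nested dictionary; the exact grouping below is only illustrative of the tree in the slide:

```python
# Illustrative nesting of the scene hierarchy: scene -> objects -> parts -> edges.
scene = {
    "roadway":   {"road": ["edges"], "junction": ["edges"]},
    "building":  ["edges"],
    "grassland": {"grass": [], "tree": []},
}

def leaves(node):
    """Collect the lowest-level descriptions reachable from a node."""
    if isinstance(node, dict):
        return [leaf for child in node.values() for leaf in leaves(child)]
    return list(node)

print(leaves(scene))  # ['edges', 'edges', 'edges']
```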

16 Uses Structure defines possible appearance of objects Structure guides processing Representing a processing result

17 Rule-Based System
[Diagram: database, rulebase and inference engine]

18 Rules
Rules encode quanta of knowledge
Interpretation
– Forwards
– Backwards

19 Forward chaining
If <antecedent> is TRUE, execute <action>
The antecedent will be a test on some data
The action might modify the data
Suitable for low-level processing
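
A minimal forward-chaining sketch in Python; the rules and data fields are hypothetical, purely to illustrate the antecedent/action cycle:

```python
# Each rule is a pair (antecedent, action): a test on the data and a change to it.
# The rules below are hypothetical low-level processing steps.
rules = [
    (lambda d: d["contrast"] < 0.3,  lambda d: d.update(equalised=True)),
    (lambda d: d.get("equalised"),   lambda d: d.update(edges_found=True)),
]

def forward_chain(data):
    """Fire every rule whose antecedent is TRUE until the data stop changing."""
    changed = True
    while changed:
        changed = False
        for antecedent, action in rules:
            before = dict(data)
            if antecedent(data):
                action(data)
            if data != before:
                changed = True
    return data

print(forward_chain({"contrast": 0.2}))
# {'contrast': 0.2, 'equalised': True, 'edges_found': True}
```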

20 Backward chaining
The action is some goal to achieve
The antecedent defines how it should be achieved
Suitable for high-level processing
– Guides the focus of the system
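
By contrast, a goal-driven (backward-chaining) sketch, again with hypothetical rules: the goal is looked up and the sub-goals that would achieve it are pursued recursively.

```python
# Hypothetical rules: each goal maps to the sub-goals (antecedents) that achieve it.
rules = {
    "object_recognised": ["edges_found", "model_matched"],
    "edges_found":       ["image_loaded"],
    "model_matched":     ["edges_found"],
}

def backward_chain(goal, facts):
    """Prove a goal by recursively proving the sub-goals of the rule that achieves it."""
    if goal in facts:
        return True
    if goal not in rules:
        return False
    return all(backward_chain(sub, facts) for sub in rules[goal])

print(backward_chain("object_recognised", facts={"image_loaded"}))  # True
```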

21 Frames A “data-structure for representing a stereotyped situation” Slot (attribute) Filler (value: atomic, link to another frame, default or empty, call to a function to fill the slot)
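
A frame could be sketched as a dictionary of slots and fillers; the hypothetical "office" frame below shows the four kinds of filler listed above:

```python
# A hypothetical 'office' frame: each slot (attribute) has a filler (value).
office_frame = {
    "name": "office",                    # atomic value
    "contains": "desk_frame",            # link to another frame
    "number_of_windows": None,           # default / empty slot
    "floor_area": lambda w, d: w * d,    # function called to fill the slot
}

# The function-valued slot is filled once width and depth become known.
office_frame["floor_area"] = office_frame["floor_area"](3.0, 4.0)
print(office_frame["floor_area"])  # 12.0
```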

22 Methods of control
How the use of the system’s knowledge is controlled:
– Hierarchical

23 Hierarchical control “Algorithm” defines control Two extreme variants –Bottom-up –Top-down

24 Bottom-up control
[Pipeline: Image → Feature extraction → Extracted features, attributes, relationships → Decision making → Object recognition]

25 Top-down control
[Pipeline: Hypothesised object → Prediction → Predicted features, attributes, relationships → Directed feature extraction → Features in the image that support or refute the hypothesis]

26 Critique Inflexible methods Errors propagate Hybrid control –Can make predictions –Verify –Modify predictions

27 Hybrid control
[Diagram: the bottom-up pipeline (Image → Feature extraction → Extracted features, attributes, relationships → Decision making → Object recognition) combined with a top-down prediction loop (Prediction → Predicted features, attributes, relationships)]

28 Uncertainty Reasoning
Bayesian methods
– Define a belief network
– A tree structure that reflects the evidential support of a fact
[Diagram: a fact supported by evidence nodes F1, F2, F3]

29 Dempster-Shafer Bayesian theory provides a measure of belief only; there is no measure of disbelief. D-S attempts to define this.

30 Dempster-Shafer Bayesian reasoning allows us to state our belief in a hypothesis and our belief in that same hypothesis when some new data are received. Dempster-Shafer theory (D-S) also provides an assessment of belief in some hypothesis which can be modified in the light of new data. Unlike Bayesian reasoning, D-S takes into account the fact that it may not be possible to assign a belief to every hypothesis set.

31 D-S scenario Mrs Jones has a carton of cream delivered along with the milk early in the morning on some days of the week. On most mornings following delivery of the cream, the carton is found open and the content is gone. Mrs Jones believes that the culprit is one of three animals that stalk the area: one is a dog, another a cat and the third a fox. Occasionally a neighbour will catch sight of the thief in the act, but the delivery is before daylight and no neighbour has been certain about their sighting.

32 D-S scenario There are three suspects: the dog (d), the cat (c) and the fox (f), and each suspect represents a hypothesis. Only a single animal is responsible for the theft.

33 D-S scenario The set of hypotheses is called the frame of discernment Θ. In this example: Θ = {d, c, f}. The thief is either the dog or the cat or the fox. D-S is not limited to assigning a belief to only dog, cat or fox but can assign beliefs to any element that is a member of the power set of Θ. The belief in an element, x, is referred to as a probability mass, denoted m(x).

34 D-S scenario The power set of Θ is the set of all subsets of Θ and is denoted 2^Θ: 2^Θ = {Φ, {d}, {c}, {f}, {d,c}, {d,f}, {c,f}, Θ}, where Φ denotes the empty set. The power set expresses all possibilities. For example, {d} is the hypothesis that the dog takes the cream and {d,f} is the hypothesis that the culprit is either the dog or the fox.
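
The power set can be enumerated directly; a short Python sketch:

```python
from itertools import chain, combinations

theta = ("d", "c", "f")  # frame of discernment

def power_set(elements):
    """Every subset of the frame, from the empty set up to Theta itself."""
    return list(chain.from_iterable(
        combinations(elements, r) for r in range(len(elements) + 1)))

print(power_set(theta))
# [(), ('d',), ('c',), ('f',), ('d', 'c'), ('d', 'f'), ('c', 'f'), ('d', 'c', 'f')]
```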

35 D-S scenario There are restrictions on the values of m(x): m(Φ) = 0 and Σ_{x ∈ 2^Θ} m(x) = 1, which state that the total mass must sum to 1 and that the empty set is not possible (the closed-world assumption, which means that no animal other than the dog, fox or cat is stealing the cream). Any subset x that has a non-zero value for m(x) is called a focal element.

36 D-S scenario Suppose neighbour 1 states that she believes it is either the dog or cat with probability 0.8, so m({d,c}) = 0.8. The probability must sum to 1 and so 0.2 has to be assigned somehow to the other hypothesis sets. The best we can do without any other information is to assign it to the whole frame of discernment: m({d,c,f}) = 0.2.

37 D-S scenario On the following night, another neighbour spots the thief and states that she believes that it was either the cat or fox with probability 0.7. How should these new data be combined with the original data? D-S theory states that the original mass is combined with the new mass according to the rule:
m(C) = Σ_{A∩B=C} m₁(A)·m₂(B)   (eqn 1)

38 D-S scenario A is the set of focal elements identified by neighbour 1 and B those by neighbour 2. This equation states that there is a set C of focal elements formed by the intersection of the sets in A and B and the mass assigned to an element in C is the product of the intersection masses. The result of applying eqn (1) is given in table 1

39 D-S scenario
Table 1. The probability masses from neighbours 1 and 2 are combined

                                   Neighbour 2
                                   m({c,f}) = 0.7       m({d,c,f}) = 0.3
Neighbour 1   m({d,c}) = 0.8       m({c}) = 0.56        m({d,c}) = 0.24
              m({d,c,f}) = 0.2     m({c,f}) = 0.14      m({d,c,f}) = 0.06

40 D-S scenario We shall use the notation mₙ to indicate the evidence encountered at step n. The first step was from neighbour 1 and the second from neighbour 2, which are combined to give a new belief at step 3. So:
m₃({c}) = 0.56
m₃({d,c}) = 0.24
m₃({c,f}) = 0.14
m₃({d,c,f}) = 0.06
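
The combination in Table 1 can be reproduced in a few lines; a Python sketch of eqn (1) using frozensets as focal elements:

```python
from collections import defaultdict

def combine(m1, m2):
    """Eqn (1): for every pair of focal elements, add the product of their masses
    to the mass of their intersection (no normalisation yet)."""
    m = defaultdict(float)
    for a, mass_a in m1.items():
        for b, mass_b in m2.items():
            m[a & b] += mass_a * mass_b
    return dict(m)

m1 = {frozenset({"d", "c"}): 0.8, frozenset({"d", "c", "f"}): 0.2}   # neighbour 1
m2 = {frozenset({"c", "f"}): 0.7, frozenset({"d", "c", "f"}): 0.3}   # neighbour 2

for focal, mass in combine(m1, m2).items():
    print(sorted(focal), round(mass, 2))
# ['c'] 0.56   ['c', 'd'] 0.24   ['c', 'f'] 0.14   ['c', 'd', 'f'] 0.06
```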

41 D-S scenario Two probability measures are provided which assess the belief (Bel) and plausibility (Pl) of any set of hypotheses A:
Bel(A) = Σ_{B ⊆ A} m(B)   (eqn 2)
Pl(A) = Σ_{B ∩ A ≠ Φ} m(B)   (eqn 3)

42 D-S scenario These two measures represent lower and upper bounds on the belief in a set of hypotheses. So the belief in the cat being the culprit is the sum of the masses where the set of hypotheses is a subset of {c}, which in this case is simply: Bel({c})=0.56 The plausibility is the sum of all masses that contain cat as a member: Pl ({c}) = 0.56+0.24+0.14+0.06= 1.0
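
Eqns (2) and (3) translate directly into code: belief sums the masses of focal elements that are subsets of the hypothesis set, plausibility sums the masses of focal elements that intersect it. A sketch, using the masses from step 3:

```python
def belief(m, a):
    """Bel(A): sum of the masses of focal elements that are subsets of A (eqn 2)."""
    return sum(mass for focal, mass in m.items() if focal <= a)

def plausibility(m, a):
    """Pl(A): sum of the masses of focal elements that intersect A (eqn 3)."""
    return sum(mass for focal, mass in m.items() if focal & a)

m3 = {frozenset({"c"}): 0.56, frozenset({"d", "c"}): 0.24,
      frozenset({"c", "f"}): 0.14, frozenset({"d", "c", "f"}): 0.06}

print(belief(m3, frozenset({"c"})))        # 0.56
print(plausibility(m3, frozenset({"c"})))  # ~1.0  (0.56 + 0.24 + 0.14 + 0.06)
```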

43 D-S scenario The belief and plausibility in the dog and fox are: Bel({d}) = 0, Pl({d}) = 0.24 + 0.06 = 0.3; Bel({f}) = 0, Pl({f}) = 0.14 + 0.06 = 0.2

44 D-S scenario The belief and plausibility in it being either the dog or cat are: Bel ({d,c}) = 0.56+0.24=0.8 Pl({d,c}) = 0.56+0.24+0.14+0.06=1.0

45 D-S scenario
Table 2. Combining evidence from neighbour 3 with the evidence derived from combining the sightings of neighbours 1 and 2

                                           Neighbour 3
                                           m₄({f}) = 0.6       m₄({d,c,f}) = 0.4
Existing         m₃({c}) = 0.56            null                m₅({c}) = 0.224
focal            m₃({d,c}) = 0.24          null                m₅({d,c}) = 0.096
elements         m₃({c,f}) = 0.14          m₅({f}) = 0.084     m₅({c,f}) = 0.056
                 m₃({d,c,f}) = 0.06        m₅({f}) = 0.036     m₅({d,c,f}) = 0.024

46 D-S scenario Table 2 is problematic because there are two null entries that indicate an empty intersection between the existing focal elements and the new evidence. In other words, the empty set has been given a mass, which violates the earlier condition that it is not possible to have belief in something outside of the sets of hypotheses. The suggested way around this problem is to normalise the entries using the following equation:
m(C) = Σ_{A∩B=C} m₁(A)·m₂(B) / (1 - K), where K = Σ_{A∩B=Φ} m₁(A)·m₂(B) is the mass assigned to the empty set

47 D-S scenario For our example, this equation suggests that we should divide each new focal element by the sum of all focal elements that do not have a null entry. All we are doing is ensuring that the null entries have a mass of zero and that all other new focal elements sum to 1. The denominator is: 0.084 + 0.036 + 0.224 + 0.096 + 0.056 + 0.024 = 0.52. Each newly calculated focal element in Table 2 is now updated by dividing by 0.52. The updated values are given in Table 3. The final beliefs and plausibilities for each set of hypotheses after all three neighbours have given evidence are listed in Table 4.

48 D-S scenario
Table 3. The combined masses after normalisation (each new focal element from Table 2 divided by 0.52)

                                           Neighbour 3
                                           m₄({f}) = 0.6       m₄({d,c,f}) = 0.4
Existing         m₃({c}) = 0.56            null                m₅({c}) = 0.431
focal            m₃({d,c}) = 0.24          null                m₅({d,c}) = 0.185
elements         m₃({c,f}) = 0.14          m₅({f}) = 0.162     m₅({c,f}) = 0.108
                 m₃({d,c,f}) = 0.06        m₅({f}) = 0.069     m₅({d,c,f}) = 0.046

49 D-S scenario
Table 4. Final beliefs and plausibilities after all three neighbours have given evidence

             Belief    Plausibility
{d}          0         0.231
{c}          0.431     0.770
{f}          0.231     0.385
{d,c}        0.616     0.770
{d,f}        0.231     0.570
{c,f}        0.770     1.0
{d,c,f}      1.0       1.0
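
Putting the steps together, a sketch of the normalised combination (eqn 1 followed by division by 1 minus the conflicting mass) that reproduces the values in Tables 3 and 4:

```python
from collections import defaultdict

# Masses after neighbours 1 and 2 (step 3), and neighbour 3's new evidence (step 4).
m3 = {frozenset({"c"}): 0.56, frozenset({"d", "c"}): 0.24,
      frozenset({"c", "f"}): 0.14, frozenset({"d", "c", "f"}): 0.06}
m4 = {frozenset({"f"}): 0.6, frozenset({"d", "c", "f"}): 0.4}

def combine_normalised(m1, m2):
    """Dempster's rule with normalisation: discard empty intersections and rescale
    the remaining products so that the masses again sum to 1."""
    combined, conflict = defaultdict(float), 0.0
    for a, mass_a in m1.items():
        for b, mass_b in m2.items():
            c = a & b
            if c:
                combined[c] += mass_a * mass_b
            else:
                conflict += mass_a * mass_b   # total conflict here: 0.336 + 0.144 = 0.48
    return {focal: mass / (1.0 - conflict) for focal, mass in combined.items()}

for focal, mass in combine_normalised(m3, m4).items():
    print(sorted(focal), round(mass, 3))
# ['c'] 0.431   ['c', 'd'] 0.185   ['f'] 0.231   ['c', 'f'] 0.108   ['c', 'd', 'f'] 0.046
```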

50 Summary Intelligent (vision) systems – Knowledge representation – Control strategies – Reasoning

