Intelligent Agents CMPT 463
Outline
o Agents and environments
o PEAS description of the task environment (Performance measure, Environment, Actuators, Sensors)
o Environment types
o Agent types
Agents
An AI program = an intelligent agent. An agent is anything that can be viewed as perceiving its environment through sensors and acting upon that environment through actuators. Perception-Action Cycle.
Applications of Intelligent Agents
AI has successfully been used in:
o Finance
o Robotics
o Games
o Medicine
o The Web
AI in Finance
A trading agent perceives rates and online news from the markets (stocks, bonds, commodities) and acts by executing trades (buy/sell).
AI in Robotics
A robot perceives its physical environment through sensors (cameras, microphones, infrared range finders, touch) and acts through physical actuators (motors driving wheels, legs, arms, and grippers; voice).
AI in Games
A game agent (a chess program, or characters in games) perceives your moves and responds with its own moves.
AI in Medicine
A diagnostic agent perceives vital signs (blood pressure, heart signals) and reports diagnostics to a doctor.
Simple diagnostic expert system: http://familydoctor.org/familydoctor/en/health-tools/search-by-symptom.html
AI and the Web
Search engines: a crawler program retrieves Web pages into a database (DB); you submit a query and receive a list of hits.
Percept: the agent's perceptual inputs at any given instant.
Percept sequence: the complete history of everything the agent has ever perceived.
The agent function maps from percept histories to actions: f: P* → A (abstract).
The agent program runs on the physical architecture to produce f (implementation).
Vacuum-Cleaner World
Percepts: location and contents, e.g., [A, Dirty]
Actions: Left, Right, Suck, NoOp
Agent Function
Percept sequence → Action
[A, Clean] → Right
[A, Dirty] → Suck
[B, Clean] → Left
[B, Dirty] → Suck
[A, Clean], [A, Clean] → Right
[A, Clean], [A, Dirty] → Suck
...
[A, Clean], [A, Clean], [A, Clean] → Right
[A, Clean], [A, Clean], [A, Dirty] → Suck
...
Can it be implemented in a small program?
A Vacuum-Cleaner Agent
function REFLEX-VACUUM-AGENT([location, status]) returns an action
  if status = Dirty then return Suck
  else if location = A then return Right
  else if location = B then return Left
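The pseudocode above can be sketched as a few lines of Python (a minimal illustration; the string names for locations, statuses, and actions are our own encoding):

```python
def reflex_vacuum_agent(percept):
    """Map a single percept (location, status) to an action,
    mirroring the REFLEX-VACUUM-AGENT pseudocode."""
    location, status = percept
    if status == "Dirty":
        return "Suck"
    elif location == "A":
        return "Right"
    elif location == "B":
        return "Left"
```

Note that the tiny program implements the unboundedly large agent-function table: it ignores the percept history because, in this environment, the current percept alone determines the right action.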
Good Behavior and Rationality
Rational agent – an agent that does the "right" thing: for each possible percept sequence, a rational agent should select an action that is expected to maximize its performance measure, given the evidence provided by the percept sequence and whatever built-in knowledge the agent has.
Task Environment
To design a rational agent we must specify its task environment. PEAS description of the environment:
o Performance measure
o Environment
o Actuators
o Sensors
PEAS Example
Consider the task of designing an automated taxi:
Performance measure: safety, destination, profits, legality, comfort...
Environment: US streets/freeways, traffic, pedestrians, weather...
Actuators: steering, accelerator, brake, horn, speaker/display...
Sensors: camera, sonar, GPS, odometer, engine sensors...
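A PEAS description is just structured data, so it can be captured directly in code (a sketch; the field names and this encoding are ours, not part of the slides):

```python
from dataclasses import dataclass

@dataclass
class PEAS:
    """A task-environment description: Performance measure,
    Environment, Actuators, Sensors."""
    performance_measure: list
    environment: list
    actuators: list
    sensors: list

# The automated-taxi example from the slide, as a PEAS record.
taxi = PEAS(
    performance_measure=["safety", "destination", "profits", "legality", "comfort"],
    environment=["US streets/freeways", "traffic", "pedestrians", "weather"],
    actuators=["steering", "accelerator", "brake", "horn", "speaker/display"],
    sensors=["camera", "sonar", "GPS", "odometer", "engine sensors"],
)
```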
PEAS Example
Agent: part-picking robot
Performance measure:
o Percentage of parts in correct bins
Environment:
o Conveyor belt with parts, bins
Actuators:
o Jointed arm and hand
Sensors:
o Camera, joint angle sensors
Internet Shopping Agent
Performance measure?
o price, quality, appropriateness, efficiency
Environment?
o WWW sites, vendors, shippers
Actuators?
o display to user, follow URL, fill in form
Sensors?
o HTML pages (text, graphics, scripts)
Environment Types
Fully observable (vs. partially observable): the agent's sensors give it access to the complete state of the environment at each point in time.
o An environment is effectively fully observable when the sensors can detect all aspects that are relevant to the choice of action.
o Set (the card game) vs. poker (which needs internal memory of hidden information)
Environment Types
Deterministic (vs. stochastic): the next state of the environment is completely determined by the current state and the action executed by the agent.
o Chess vs. a game with dice (uncertain, unpredictable)
o An environment is uncertain if it is not fully observable or not deterministic.
Environment Types
Episodic (vs. sequential): the agent's experience is divided into atomic "episodes" (each episode consists of the agent perceiving and then performing a single action), and the choice of action in each episode depends only on the episode itself.
o In sequential environments, the current decision could affect all future decisions.
o Episodic: the vacuum cleaner and the part-picking robot
o Sequential: chess and taxi driving
Environment Types
Single agent (vs. multiagent): an agent operating by itself in an environment.
o Crossword puzzle vs. chess
Environment Types
Static (vs. dynamic): the environment is unchanged while an agent is deliberating. (The environment is semidynamic if the environment itself does not change with the passage of time but the agent's performance score does.)
o Taxi driving (dynamic) vs. chess played with a clock (semidynamic) vs. crossword puzzles (static)
Environment Types
Discrete (vs. continuous): a limited number of distinct, clearly defined states of the environment, percepts, and actions.
o Chess vs. taxi driving (continuous: speed and location take continuous values)
Environment Types: Examples

Property        Solitaire   Chess with a clock   Backgammon   Taxi   Vacuum Cleaner
Observable?     No          Yes                  Yes          No     Yes
Deterministic?  Yes         Yes                  No           No     Yes
Episodic?       No          No                   No           No     Yes
Static?         Yes         Semi                 Yes          No     Yes
Discrete?       Yes         Yes                  Yes          No     Yes
Single-agent?   Yes         No                   No           No     Yes
Environment Types
The simplest environment is fully observable, deterministic, episodic, static, discrete, and single-agent. Most real situations are partially observable, stochastic, sequential, dynamic, continuous, and multi-agent.
The Structure of Agents
How does the inside of the agent work?
o Agent = Architecture + Program
Basic algorithm for a rational agent:
o While (true) do
  Get percept from sensors into memory
  Determine best action based on memory
  Record action in memory
  Perform action
Most AI programs are a variation on this theme.
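The loop above can be sketched as a generic skeleton (a minimal illustration; `sensors`, `choose_action`, and `act` are placeholder names for an actual agent's parts, and the loop is bounded so the sketch terminates):

```python
def run_agent(sensors, choose_action, act, steps=10):
    """Run the basic rational-agent loop for a fixed number of steps."""
    memory = []                            # record of percepts and actions
    for _ in range(steps):                 # "while true", bounded for the sketch
        percept = sensors()                # get percept from sensors into memory
        memory.append(("percept", percept))
        action = choose_action(memory)     # determine best action based on memory
        memory.append(("action", action))  # record action in memory
        act(action)                        # perform action
    return memory
```

Plugging in the different `choose_action` strategies below (table lookup, condition-action rules, goal or utility maximization) gives the different agent types.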
Agent Types
function TABLE-DRIVEN-AGENT(percept) returns an action
  static: percepts, a sequence, initially empty
          table, a table of actions, indexed by percept sequences
  append percept to the end of percepts
  action ← LOOKUP(percepts, table)
  return action
Drawbacks:
o Huge table
o Takes a long time to build the table
This approach is doomed to failure.
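A sketch of the table-driven agent in Python (the table entries here are illustrative vacuum-world fragments; a real table would need one row per possible percept sequence, which is what makes the approach infeasible):

```python
# The table maps entire percept sequences (as tuples) to actions.
table = {
    (("A", "Clean"),): "Right",
    (("A", "Dirty"),): "Suck",
    (("A", "Dirty"), ("B", "Clean")): "Left",
}

percepts = []  # the growing percept sequence

def table_driven_agent(percept):
    """Append the percept and look up the whole sequence in the table."""
    percepts.append(percept)
    return table.get(tuple(percepts))  # None if the sequence is not in the table
```

Even for the two-square vacuum world, covering all sequences of length T requires on the order of 4^T rows, which is why the slide calls the approach doomed.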
Agent Types
Four basic types:
- simple reflex agents
- model-based reflex agents
- goal-based agents
- utility-based agents
Simple Reflex Agents
Select an action on the basis of only the current percept.
o E.g. the vacuum agent
Large reduction in possible percept/action situations (next page).
Implemented through condition-action rules:
o If dirty then suck
The Vacuum-Cleaner World
function REFLEX-VACUUM-AGENT([location, status]) returns an action
  if status = Dirty then return Suck
  else if location = A then return Right
  else if location = B then return Left
The Structure of Agents
Simple Reflex Agent
function SIMPLE-REFLEX-AGENT(percept) returns an action
  static: rules, a set of condition-action rules
  state ← INTERPRET-INPUT(percept)
  rule ← RULE-MATCH(state, rules)
  action ← RULE-ACTION[rule]
  return action
Model-Based Reflex Agents
To tackle partially observable environments:
o Maintain internal state
Over time, update the state using world knowledge (a model of the world):
o How does the world change?
o How do actions affect the world?
The Structure of Agents
Reflex Agent With State
function REFLEX-AGENT-WITH-STATE(percept) returns an action
  static: state, a description of the current world state
          rules, a set of condition-action rules
          action, the most recent action, initially none
  state ← UPDATE-STATE(state, action, percept)
  rule ← RULE-MATCH(state, rules)
  action ← RULE-ACTION[rule]
  return action
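The pseudocode above can be sketched concretely for the two-square vacuum world (an illustration, not the slides' own code: UPDATE-STATE here just tracks which squares are believed clean, and the rules are our choice):

```python
class ModelBasedVacuumAgent:
    """A reflex agent with state: it remembers which squares are clean,
    so it can stop (NoOp) once its model says everything is clean."""

    def __init__(self):
        self.clean = set()  # internal state: squares believed clean

    def __call__(self, percept):
        location, status = percept
        # UPDATE-STATE: fold the new percept into the internal model
        if status == "Clean":
            self.clean.add(location)
        else:
            self.clean.discard(location)
        # RULE-MATCH / RULE-ACTION: condition-action rules over the state
        if status == "Dirty":
            self.clean.add(location)   # the square will be clean after sucking
            return "Suck"
        if self.clean >= {"A", "B"}:
            return "NoOp"              # model says everything is clean
        return "Right" if location == "A" else "Left"
```

A simple reflex agent cannot return NoOp here, because no single percept reveals that both squares are clean; the internal state is what makes that rule possible.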
Goal-Based Agents
The agent needs a goal to know which situations are desirable. Typically investigated in search and planning research. Major difference: the future is taken into account.
Utility-Based Agents
Certain goals can be reached in different ways.
o Some are better: they have a higher utility.
A utility function maps a (sequence of) state(s) onto a real number.
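Utility-based action selection can be sketched as picking the action whose resulting state has the highest utility (an illustration with names of our choosing; `result` plays the role of the world model, `utility` the utility function):

```python
def best_action(state, actions, result, utility):
    """Pick the action leading to the highest-utility successor state."""
    return max(actions, key=lambda a: utility(result(state, a)))

# Toy example: states are numbers, and the utility of a state is the
# number itself, so the agent prefers the action that adds more.
choice = best_action(
    0,
    ["add1", "add2"],
    lambda s, a: s + (1 if a == "add1" else 2),  # world model
    lambda s: s,                                  # utility function
)
```

Where a goal-based agent only distinguishes goal states from non-goal states, the utility function ranks all states, letting the agent trade off competing goals.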
Summary
An agent perceives and acts in an environment, has an architecture, and is implemented by an agent program.
Task environment – PEAS (Performance measure, Environment, Actuators, Sensors).
An ideal agent always chooses the action that maximizes its expected performance, given its percept sequence so far.
An agent program maps from percept to action and updates internal state.
o Reflex agents respond immediately to percepts.
  - simple reflex agents
  - model-based reflex agents
o Goal-based agents act in order to achieve their goal(s).
o Utility-based agents maximize their own utility function.
Try out some intelligent agents!
A chatbot is a computer program designed to simulate an intelligent conversation with one or more human users via auditory or textual methods, primarily for engaging in small talk.
ALICE: http://alice.pandorabots.com/
o Won the Loebner Prize three times (in 2000, 2001, and 2004)
ELIZA: http://nlp-addiction.com/eliza/
o One of the classic chatbots, written at MIT by Joseph Weizenbaum between 1964 and 1966