Embodied Speech and Facial Expression Avatar. Dan Harbin, Evan Zoss, Jaclyn Tech, Brent Sicking. May 10, 2004.

Presentation transcript:

1 Embodied Speech and Facial Expression Avatar. Dan Harbin, Evan Zoss, Jaclyn Tech, Brent Sicking. May 10, 2004

2 Problem Background/Needs Statement
Facial expressions help illustrate verbal communication by revealing what the expresser is feeling or trying to convey. The ability to generate animated facial expressions together with speech is important to many diverse application areas:
– A deaf person could use an animated face as a lip-reading system.
– An autistic child could benefit from a robotic face in terms of social interaction, language development, and learning through structure and repetition.

3 Goals and Objectives
The overall goal of this project is to create a robotic face capable of displaying human emotion accompanied by speech.

4 Goals and Objectives
– Reverse engineer Yano’s motors and sensors so we are able to move them to any desired position.
– Develop a GUI that allows the user to move each motor in either direction to a desired position.
– Research the psychology behind the use of facial expressions to convey emotion, and mimic these facial expressions with the Yano face.
– Develop a GUI that allows the user to select and display real human facial expressions.
– Develop software to mimic speech based on a measure of the intensity of various pre-recorded wave files.

5 Yano Control System

6 Part 1: The Computer
1. Allows the user to directly control the movement of Yano’s eye, cheek, and mouth motors.
2. Provides parameterized control of Yano’s facial expressions by allowing the user both to select from a predefined set of expressions and to control his expression in terms of valence, arousal, and stance.
3. Allows the user to load a pre-recorded wave file and play it back as Yano mimics human speech based on the intensity of the wave file.

7 User Interface: The Main Menu

8 User Interface: Manual Motor Control

9 User Interface: Facial Expressions

10 Arousal – the degree of excitement or stimulation; how stirred up or awake the expresser is.
Valence – the degree of attraction or aversion that an individual feels toward a specific object or event.
Stance – the attitude or position of the expresser; mental posture; point of view.
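One way the parameterized expression control could turn valence and arousal values into a predefined expression is a simple quadrant lookup. The sketch below is purely illustrative: the labels, thresholds, and the decision to omit stance are our assumptions, not the project’s actual mapping.

```python
def select_expression(valence, arousal):
    """Pick an expression label from valence/arousal in [-1.0, 1.0].

    Hypothetical mapping: near the origin is neutral; otherwise the
    sign of each parameter selects a quadrant of expression space.
    """
    if abs(valence) < 0.2 and abs(arousal) < 0.2:
        return "neutral"
    if valence >= 0:
        return "happy" if arousal >= 0 else "content"
    return "angry" if arousal >= 0 else "sad"
```

A GUI slider pair for valence and arousal could call this on every change and send the resulting expression to the face.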

11 User Interface: Sound Processing (screenshot labels: progress bar; file name (.wav); intensity meter, based on the power waveform)

12 User Interface: Sound Processing (plots: original waveform; power waveform)
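The power waveform driving the intensity meter can be computed as a windowed mean square of the original samples. A minimal sketch in Python; the window size and non-overlapping windows are assumptions, not details from the project:

```python
def power_waveform(samples, window=256):
    """Mean-square power over non-overlapping windows of the original
    waveform; this is the kind of signal an intensity meter displays."""
    return [
        sum(s * s for s in samples[i:i + window]) / window
        for i in range(0, len(samples) - window + 1, window)
    ]
```

Squaring makes the measure insensitive to sign, and averaging over a window smooths the sample-to-sample jitter that would otherwise make the meter flicker.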

13 Part 2: SV203 Microcontroller (board ports: serial port; power: Vcc, Gnd; motor control port: SV1–SV6; A/D input port: AD1–AD5)

14 SV203 Functional Description
– Receives commands through the serial port.
– Sets or clears the appropriate motor control pin(s).
– Reads an analog voltage off of the desired input pin(s).
– Transmits a value representing the voltage back up the serial line.

15 SV203 Interface Description
Serial Port – ASCII text commands are sent to the board via the serial port to tell it what to do. Values from the input pins are also sent back to the computer via the serial port. List of commands we use:
– SVxM0 – initialize pin x to use digital logic
– PSx – set pin x high
– PCx – clear pin x to low
– Dn – delay for n milliseconds before the next command
– PC1PC3PC5D300PS1PS3PS5 – a typical motor control command
– ADy – read the voltage of input pin y and transmit it up the serial port
Motor Control Port – sends the logic controls for the motors to the Yano I/O Board. When a pin is set high with PSx, it is driven to 6 V; PCx sets it to 0 V. We use six pins, SV1 through SV6.
A/D Input Port – receives the status of Yano’s switches from the Yano I/O Board. We use five pins, AD1 through AD5. Each pin carries 6 V if its switch is open and near 0 V if it is closed. The SV203 converts these voltages to the numbers 0–255 for 0–6 V.
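The command strings above can be assembled programmatically before being written to the serial port. A minimal sketch in Python; the helper names are ours, not part of the SV203 firmware, and actually transmitting the string (e.g. with a serial library) is omitted:

```python
def pulse(pins, ms):
    """Build the PC...D...PS... pattern from the slide: clear the given
    motor-control pins, wait ms milliseconds, then set them high again."""
    clear = "".join(f"PC{p}" for p in pins)
    set_high = "".join(f"PS{p}" for p in pins)
    return f"{clear}D{ms}{set_high}"

def read_pin(pin):
    """Build the ADy command that requests input pin y's voltage."""
    return f"AD{pin}"

def ad_to_volts(count):
    """Convert the board's 0-255 reply back to the 0-6 V it represents."""
    return count * 6.0 / 255.0
```

For example, `pulse([1, 3, 5], 300)` reproduces the typical motor control command shown above.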

16 Part 3: Yano I/O Board (diagram: connections to the SV203 microcontroller and the Yano switch circuit)

17 Yano I/O Board Functional Description
– Receives logic controls for the motors from the SV203.
– Converts them into powered control for Yano’s motors.
– Reads in the status of Yano’s switches, open or closed.
– Converts this to a voltage, 6 V for open and 0 V for closed, and sends it back to the SV203.

18 Yano I/O Board Interface Description
Motor Control Input – the logic input for the H-bridges that determines motor direction and movement. The pins are paired off, two pins per H-bridge and one H-bridge per motor:
– Mouth: SV5 and SV6
– Cheeks: SV3 and SV4
– Eyes: SV1 and SV2
Motor Outputs – three two-pin ports, one for each motor; each pin carries either Vcc or Gnd. If both pins are at Vcc (the default state) there is no potential between them and the motor will not turn. If one pin drops to Gnd, the motor turns one way; vice versa for the other pin.
Sensor Inputs – these ports connect directly to Yano’s switches. The mouth and cheek motors each have two limit switches to determine when they have run far enough in each direction. The eye motor can run a complete 360-degree rotation, and so has just a single sensor that is triggered when the eye motor reaches a particular point in the rotation.
Sensor Outputs – the interface back to the SV203, which has five pins, each of which is set to 6 V for an open switch and 0 V for a closed switch. They are paired off according to which motor’s limit switches they read:
– Mouth: AD3 and AD4
– Cheeks: AD1 and AD2
– Eyes: AD5
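The motor-output behavior described above amounts to a small truth table over each H-bridge input pair. A sketch of that logic; the pin pairing is taken from the slide, but the state names and which polarity counts as “forward” are our assumptions:

```python
# Pin pairs per motor, as listed on the slide (two SV pins per H-bridge).
MOTOR_PINS = {"mouth": (5, 6), "cheeks": (3, 4), "eyes": (1, 2)}

def motor_state(pin_a_high, pin_b_high):
    """Direction implied by one H-bridge input pair.

    Both pins high (the default) means no potential across the motor,
    so it is stopped; dropping one side to ground drives the motor in
    one direction or the other. Both low also yields no potential.
    """
    if pin_a_high and not pin_b_high:
        return "forward"
    if pin_b_high and not pin_a_high:
        return "reverse"
    return "stopped"
```

This is why the typical command clears pins, delays, then sets them high again: the motor runs only during the window when the pair is unbalanced.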

19 Part 4: Yano

20 Yano Functional Description
Yano has three motors powered by the Yano I/O Board: one for the mouth, one for the cheeks, and one to control the eyelids, eyebrows, and ears. When the mouth and cheek motors reach their endpoints (i.e., fully open or fully closed), they close a switch to indicate that the limit has been reached. These switches are read by the Yano I/O Board.
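Homing a motor against one of these limit switches can be sketched as a step-and-poll loop. Here `read_switch` and `step_motor` are hypothetical stand-ins for the real SV203 serial I/O, and the closed-switch threshold is an assumption:

```python
def home_motor(read_switch, step_motor, max_steps=1000):
    """Drive a motor step by step until its limit switch closes.

    read_switch() returns the A/D reading for the switch (near 0 when
    closed, near 255 when open); step_motor() pulses the motor briefly.
    Returns the number of steps taken, which calibrates the position.
    """
    CLOSED_THRESHOLD = 20  # A/D counts; ~0 V when closed (assumption)
    for steps in range(max_steps):
        if read_switch() < CLOSED_THRESHOLD:
            return steps  # limit reached; motor position is now known
        step_motor()
    raise RuntimeError("limit switch never closed")
```

After homing, the software can count pulses relative to this endpoint to track the motor’s position, which is what the calibration requirement asks for.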

21 Yano Interface Description
Yano’s interfaces are the motor controls and the switch feedbacks. The wires are color-coded as follows:
– Motors: Red/Black – Eyes; Green/Black – Cheeks; White/Black – Mouth
– Sensors: Red/Green/Brown – Mouth; Gray/Yellow/Pink – Cheeks; Green/Yellow/Red/Brown – Eyes

22 Validation and Testing Procedures
– Calibration Test – Calibrate the motors, then run each motor to its limits and back to see if it stays calibrated.
– Expression Test – Change from any one expression to any other expression; the face should show the desired expression each time.
– Speech Test – Using a sample sound file, make sure Yano produces the right mouth movements for the differences in sound volume consistently and accurately.

23 Validation and Testing Procedures
How do we know we accomplished our goal?
– Calibration – We know the exact position of each motor at any given time while the software is running.
– Expressions – We can produce and move between a predefined set of believable and readable facial expressions.
– Speech – Yano consistently moves his mouth in sync with a wave file; the movement and amount of opening are believable and realistic.
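The speech criterion reduces to thresholding the power waveform: the mouth opens while the sound is loud enough and closes otherwise. A minimal sketch; the threshold value and the two-state model are assumptions:

```python
def lip_sync(power_values, threshold=0.1):
    """Map each power-waveform window to a mouth state: open while the
    sound is above the threshold, closed otherwise."""
    return ["open" if p > threshold else "closed" for p in power_values]
```

Playing the wave file and issuing one open/close motor command per window would keep the mouth movement in step with the audio.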

24 Itemized Budget

Part                                    Qty   Cost
Computer                                 1    N/A
Yano                                     1    $65.49
SV203 Microcontroller                    1    $59.98
TC4424 H-Bridges                         9    $9.33
Serial Cable                             1    $11.99
Breadboard                               1    $19.97
2-pin .100" Female Locking Connector     6    $8.94
4-pin .100" Female Locking Connector     1    $1.49
6-pin .100" Female Locking Connector     1    $1.49
8-pin .100" Female Locking Connector     2    $2.98
2-pin .100" Male Locking Connector       4    $5.96
4-pin .100" Male Locking Connector       3    $4.47
6-pin .100" Male Locking Connector       1    $1.49
8-pin .100" Male Locking Connector       1    $1.49
1 kΩ Resistor                            5    $0.99
0.1 µF Capacitor                         1    $0.10
0.01 µF Capacitor                        1    $0.10
Green Wire                              24    $1.00
Red Wire                                17    $1.00
Black Wire                              12    $1.00
Total                                         $199.26

25 Timeline of Tasks

26 Thank you to Applied Materials and the National Science Foundation.

