Motion and optical flow Thursday, Nov 20 Many slides adapted from S. Seitz, R. Szeliski, M. Pollefeys, S. Lazebnik.

Today Pset 3 solutions Introduction to motion Motion fields Feature-based motion estimation Optical flow

Video A video is a sequence of frames captured over time Now our image data is a function of space (x, y) and time (t)
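
As a concrete illustration (not from the original slides), here is a minimal sketch of loading a video into a single grayscale array indexed by (t, y, x), assuming OpenCV and a hypothetical file name video.mp4:

    import cv2
    import numpy as np

    # Read all frames of a video into one grayscale array I[t, y, x],
    # so that intensity is a function of space (x, y) and time (t).
    cap = cv2.VideoCapture("video.mp4")  # hypothetical file name
    frames = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frames.append(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
    cap.release()

    I = np.stack(frames, axis=0)  # shape (T, H, W)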

Applications of segmentation to video Background subtraction A static camera is observing a scene Goal: separate the static background from the moving foreground How to come up with background frame estimate without access to “empty” scene?
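
One common answer, sketched below as an illustration (my own assumptions, not the lecture's code): if the background is visible at each pixel most of the time, a per-pixel temporal median over many frames approximates the empty-scene background, and thresholding the difference to each new frame yields a foreground mask.

    import numpy as np

    def estimate_background(frames):
        # frames: array of shape (T, H, W). If each pixel shows the
        # background in most frames, the per-pixel temporal median is a
        # reasonable estimate of the empty scene.
        return np.median(frames, axis=0)

    def foreground_mask(frame, background, thresh=25.0):
        # Pixels that differ strongly from the background estimate are
        # labeled as moving foreground (threshold is an assumed value).
        return np.abs(frame.astype(float) - background) > thresh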

Applications of segmentation to video Background subtraction Shot boundary detection Commercial video is usually composed of shots or sequences showing the same objects or scene Goal: segment video into shots for summarization and browsing (each shot can be represented by a single keyframe in a user interface) Difference from background subtraction: the camera is not necessarily stationary

Applications of segmentation to video Background subtraction Shot boundary detection For each frame –Compute the distance between the current frame and the previous one »Pixel-by-pixel differences »Differences of color histograms »Block comparison –If the distance is greater than some threshold, classify the frame as a shot boundary
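
A minimal sketch of the color-histogram variant of this procedure (an illustration; the threshold value is an assumption and would be tuned per video):

    import cv2
    import numpy as np

    def histogram_distance(frame_a, frame_b, bins=32):
        # frame_a, frame_b: color (BGR, uint8) frames. Compare their
        # global color histograms; a large distance suggests an abrupt
        # content change, i.e. a possible shot boundary.
        hists = []
        for frame in (frame_a, frame_b):
            h = cv2.calcHist([frame], [0, 1, 2], None,
                             [bins] * 3, [0, 256] * 3)
            hists.append(cv2.normalize(h, None).flatten())
        return np.linalg.norm(hists[0] - hists[1])

    def is_shot_boundary(frame_a, frame_b, thresh=0.5):
        # The threshold is an assumption; in practice it is tuned.
        return histogram_distance(frame_a, frame_b) > thresh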

Applications of segmentation to video Background subtraction Shot boundary detection Motion segmentation Segment the video into multiple coherently moving objects

Motion and perceptual organization Sometimes, motion is the only cue

Motion and perceptual organization Sometimes, motion is foremost cue

Motion and perceptual organization Even “impoverished” motion data can evoke a strong percept

Motion and perceptual organization Even “impoverished” motion data can evoke a strong percept

Uses of motion Estimating 3D structure Segmenting objects based on motion cues Learning dynamical models Recognizing events and activities Improving video quality (motion stabilization)

Today Pset 3 solutions Introduction to motion Motion fields Feature-based motion estimation Optical flow

Motion field The motion field is the projection of the 3D scene motion into the image

Motion field and parallax P(t) is a moving 3D point Velocity of scene point: V = dP/dt p(t) = (x(t), y(t)) is the projection of P in the image Apparent velocity v in the image: given by components v_x = dx/dt and v_y = dy/dt These components are known as the motion field of the image [Figure: 3D point P(t) moving with velocity V to P(t+dt); its image projection p(t) moving with velocity v to p(t+dt)]

Motion field and parallax To find image velocity v, differentiate p with respect to t (using the quotient rule): image motion is a function of both the 3D motion (V) and the depth of the 3D point (Z) Quotient rule: d(f/g) = (g f' – g' f) / g²
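
Spelling out the referenced derivation (a reconstruction under the standard pinhole model, not the slide's own equation image): with perspective projection p = f P / Z,

    \[
      \mathbf{v} \;=\; \frac{d\mathbf{p}}{dt}
        \;=\; f\,\frac{Z\,\mathbf{V} - V_z\,\mathbf{P}}{Z^{2}}
        \;=\; \frac{f\,\mathbf{V} - V_z\,\mathbf{p}}{Z}
    \]

so the image velocity depends on both the 3D velocity V (including its depth component V_z) and the depth Z of the point.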

Motion field and parallax Pure translation: V is constant everywhere

Motion field and parallax Pure translation: V is constant everywhere V_z is nonzero: Every motion vector points toward (or away from) v_0, the vanishing point of the translation direction

Motion field and parallax Pure translation: V is constant everywhere V_z is nonzero: Every motion vector points toward (or away from) v_0, the vanishing point of the translation direction V_z is zero: Motion is parallel to the image plane, all the motion vectors are parallel The length of the motion vectors is inversely proportional to the depth Z

Motion parallax ax/MotionParallax.html

Motion field + camera motion Length of flow vectors is inversely proportional to the depth Z of the 3D point: points closer to the camera move more quickly across the image plane. Figure from Michael Black, Ph.D. Thesis

[Figure: example motion fields for a camera zooming out, zooming in, and panning right to left]

Motion estimation techniques Feature-based methods Extract visual features (corners, textured areas) and track them over multiple frames Sparse motion fields, but more robust tracking Suitable when image motion is large (10s of pixels) Direct methods Directly recover image motion at each pixel from spatio-temporal image brightness variations Dense motion fields, but sensitive to appearance variations Suitable for video and when image motion is small

Feature-based matching for motion [Figure: an interesting point at time t and its best matching neighborhood at time t+1]

A Camera Mouse Video interface: use feature tracking as mouse replacement User clicks on the feature to be tracked Take the 15x15 pixel square of the feature In the next image do a search to find the 15x15 region with the highest correlation Move the mouse pointer accordingly Repeat in the background every 1/30th of a second James Gips and Margrit Betke
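
A minimal sketch of this kind of correlation-based template tracking (an illustration under assumed parameters, using OpenCV's normalized cross-correlation; the original system's exact matching criterion is not specified here):

    import cv2

    def track_template(prev_frame, next_frame, center, size=15, search=30):
        # prev_frame, next_frame: grayscale uint8 frames.
        # Cut a size x size template around the tracked point in the
        # previous frame, then search a window of the next frame for the
        # location with the highest normalized cross-correlation.
        x, y = center
        h = size // 2
        template = prev_frame[y - h:y + h + 1, x - h:x + h + 1]
        sx0, sy0 = max(x - search, 0), max(y - search, 0)
        region = next_frame[sy0:sy0 + 2 * search + 1,
                            sx0:sx0 + 2 * search + 1]
        scores = cv2.matchTemplate(region, template, cv2.TM_CCORR_NORMED)
        _, _, _, best = cv2.minMaxLoc(scores)  # (x, y) of best top-left
        return (sx0 + best[0] + h, sy0 + best[1] + h)  # new center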

A Camera Mouse Specialized software for communication, games James Gips and Margrit Betke

What are good features to track? Recall the Harris corner detector Can measure quality of features from just a single image Automatically select candidate “templates”
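
For instance, a sketch of Harris-based template selection using OpenCV (parameter values are assumptions, not taken from the lecture):

    import cv2

    def select_templates(gray_frame, max_corners=200):
        # gray_frame: grayscale uint8 image. Harris corner quality is
        # measured from this single frame; the strongest corners become
        # candidate tracking templates.
        corners = cv2.goodFeaturesToTrack(
            gray_frame,
            maxCorners=max_corners,
            qualityLevel=0.01,     # fraction of the best corner response
            minDistance=10,        # minimum pixel spacing between corners
            useHarrisDetector=True,
            k=0.04,                # Harris free parameter
        )
        return [] if corners is None else corners.reshape(-1, 2)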

Motion estimation techniques Feature-based methods Extract visual features (corners, textured areas) and track them over multiple frames Sparse motion fields, but more robust tracking Suitable when image motion is large (10s of pixels) Direct methods Directly recover image motion at each pixel from spatio-temporal image brightness variations Dense motion fields, but sensitive to appearance variations Suitable for video and when image motion is small

Optical flow Definition: optical flow is the apparent motion of brightness patterns in the image Ideally, optical flow would be the same as the motion field Have to be careful: apparent motion can be caused by lighting changes without any actual motion

Apparent motion ≠ motion field Figure from Horn's book

Estimating optical flow Given two subsequent frames, estimate the apparent motion field between them. Key assumptions Brightness constancy: projection of the same point looks the same in every frame Small motion: points do not move very far Spatial coherence: points move like their neighbors [Figure: frames I(x, y, t−1) and I(x, y, t)]

Brightness constancy Figure by Michael Black

The brightness constancy constraint Brightness constancy equation: I(x+u, y+v, t) = I(x, y, t−1) Linearizing the left-hand side with a first-order Taylor expansion, this can be written as: I(x+u, y+v, t) ≈ I(x, y, t−1) + I_x·u + I_y·v + I_t So, I_x·u + I_y·v + I_t ≈ 0, i.e., ∇I·[u v]ᵀ + I_t = 0

The brightness constancy constraint How many equations and unknowns per pixel? One equation, two unknowns Intuitively, what does this constraint mean? The component of the flow perpendicular to the gradient (i.e., parallel to the edge) is unknown

The brightness constancy constraint How many equations and unknowns per pixel? One equation, two unknowns Intuitively, what does this constraint mean? The component of the flow perpendicular to the gradient (i.e., parallel to the edge) is unknown If (u, v) satisfies the equation, so does (u+u', v+v') whenever ∇I·[u' v']ᵀ = 0 [Figure: the constraint line in (u, v) space, with the gradient direction and the edge direction]

The aperture problem Perceived motion

The aperture problem Actual motion

The barber pole illusion

Solving the aperture problem (grayscale image) How to get more equations for a pixel? Spatial coherence constraint: pretend the pixel’s neighbors have the same (u,v) If we use a 5x5 window, that gives us 25 equations per pixel

Solving the aperture problem Problem: we have more equations than unknowns Solution: solve a least squares problem The minimum least squares solution is given by the solution (in d) of the normal equations AᵀA d = Aᵀb (written out below) The summations in AᵀA and Aᵀb are over all pixels in the K x K window This technique was first proposed by Lucas & Kanade (1981)
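
The referenced system, written out (a standard reconstruction of the Lucas-Kanade equations for a 5x5 window, not copied from the slide image):

    \[
      \underbrace{\begin{bmatrix}
        I_x(\mathbf{p}_1) & I_y(\mathbf{p}_1) \\
        \vdots & \vdots \\
        I_x(\mathbf{p}_{25}) & I_y(\mathbf{p}_{25})
      \end{bmatrix}}_{A}
      \underbrace{\begin{bmatrix} u \\ v \end{bmatrix}}_{d}
      =
      \underbrace{\begin{bmatrix}
        -I_t(\mathbf{p}_1) \\ \vdots \\ -I_t(\mathbf{p}_{25})
      \end{bmatrix}}_{b},
      \qquad
      A^{\top}A\, d = A^{\top} b
      \;\Longleftrightarrow\;
      \begin{bmatrix}
        \sum I_x^2 & \sum I_x I_y \\
        \sum I_x I_y & \sum I_y^2
      \end{bmatrix}
      \begin{bmatrix} u \\ v \end{bmatrix}
      = -\begin{bmatrix} \sum I_x I_t \\ \sum I_y I_t \end{bmatrix}
    \]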

Conditions for solvability When is this solvable? AᵀA should be invertible AᵀA should not be too small – eigenvalues λ1 and λ2 of AᵀA should not be too small AᵀA should be well-conditioned – λ1/λ2 should not be too large (λ1 = larger eigenvalue) Slide by Steve Seitz, UW
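
A minimal single-window Lucas-Kanade sketch along these lines (an illustration with assumed parameters, not the lecture's code); it forms AᵀA, rejects windows whose smaller eigenvalue is tiny, and solves the normal equations:

    import numpy as np

    def lucas_kanade_window(I0, I1, x, y, half=7, min_eig=1e-2):
        # I0, I1: consecutive grayscale frames (float arrays) at times
        # t-1 and t. Estimate the flow (u, v) of the (2*half+1)^2 window
        # centered at (x, y), assuming all its pixels share one motion.
        Iy, Ix = np.gradient(I0)              # spatial gradients
        It = I1 - I0                          # temporal derivative
        win = np.s_[y - half:y + half + 1, x - half:x + half + 1]
        A = np.stack([Ix[win].ravel(), Iy[win].ravel()], axis=1)
        b = -It[win].ravel()
        AtA = A.T @ A
        # Solvability check: reject flat or edge-like windows where the
        # smaller eigenvalue of A^T A is tiny (min_eig is an assumption).
        if np.linalg.eigvalsh(AtA).min() < min_eig:
            return None
        u, v = np.linalg.solve(AtA, A.T @ b)
        return u, v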

Edge – gradients very large or very small – large λ1, small λ2

Low-texture region – gradients have small magnitude – small λ1, small λ2

High-texture region – gradients are different, large magnitudes – large λ1, large λ2

Example use of optical flow: Motion Paint Use optical flow to track brush strokes, in order to animate them to follow underlying scene motion.

Motion vs. Stereo: Similarities Both involve solving –Correspondence: disparities, motion vectors –Reconstruction

Motion vs. Stereo: Differences Motion: –Uses velocity: consecutive frames must be close to get good approximate time derivative –3d movement between camera and scene not necessarily single 3d rigid transformation Whereas with stereo: –Could have any disparity value –View pair separated by a single 3d transformation

Summary Motion field: 3d motions projected to 2d images; dependency on depth Solving for motion with –sparse feature matches –dense optical flow Optical flow –Brightness constancy assumption –Aperture problem –Solution with spatial coherence assumption