Robust Visual Motion Analysis: Piecewise-Smooth Optical Flow

Robust Visual Motion Analysis: Piecewise-Smooth Optical Flow
Ming Ye, Electrical Engineering, University of Washington
mingye@u.washington.edu

What Is Visual Motion?
2D image velocity: the projection of 3D motion, a temporal correspondence, an image deformation.
Optical flow: an image of 2D velocity. Each pixel s = (x, y) has a flow vector V_s = (u_s, v_s), where u_s and v_s are the displacements in x and y: (x, y, t) -> (x + u, y + v, t + 1).
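To make the notation concrete, here is a minimal sketch (toy array sizes, assumed for illustration, not from the talk) of storing a dense flow field and mapping pixel coordinates from frame t to frame t + 1:

```python
import numpy as np

# A dense optical flow field stores one displacement (u, v) per pixel.
H, W = 4, 6
flow = np.zeros((H, W, 2))         # flow[y, x] = (u_s, v_s) for pixel s = (x, y)
flow[..., 0] = 1.0                 # every pixel moves one pixel to the right

# The correspondence (x, y, t) -> (x + u, y + v, t + 1):
ys, xs = np.mgrid[0:H, 0:W]
x_next = xs + flow[..., 0]
y_next = ys + flow[..., 1]
print(x_next[0, 0], y_next[0, 0])  # pixel (0, 0) maps to (1.0, 0.0) in frame t + 1
```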

Structure From Motion
Rigid scene + camera translation -> estimated horizontal motion -> depth map.

Scene Dynamics Understanding
Estimated horizontal motion (brighter pixels => larger speeds): the flow is smooth within regions and changes abruptly at motion boundaries.
Applications: surveillance, event analysis, video compression.

Target Detection and Tracking
Tracking results: a tiny airplane, observable only by its distinct motion.

Optical Flow Estimation: Basics
Template matching.
Assumptions: brightness conservation; flow smoothness.
Difficulties: aperture problem (local information is insufficient); outliers (motion boundaries, abrupt image noise).
Illustration: red square: homogeneous area (extreme case, motion completely ambiguous); green square: directionally homogeneous (motion parallel to the edge is ambiguous); yellow square: good template (little ambiguity; in the slide show the contents of the two yellow squares match); blue square: motion discontinuity.
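As a hedged sketch of the template-matching idea (window and search sizes are arbitrary choices, not from the talk): compare a patch around a pixel in frame t against shifted patches in frame t + 1 and keep the shift with the smallest sum of squared differences. In a homogeneous or edge-like patch many shifts score nearly the same, which is exactly the aperture problem described above.

```python
import numpy as np

def match_patch(I0, I1, y, x, half=2, search=3):
    """Find the (dy, dx) shift minimizing the SSD between a patch of I0
    centered at (y, x) and the correspondingly shifted patch in I1.
    (y, x) must lie at least half + search pixels away from the border."""
    ref = I0[y - half:y + half + 1, x - half:x + half + 1].astype(float)
    best_ssd, best_shift = np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cand = I1[y + dy - half:y + dy + half + 1,
                      x + dx - half:x + dx + half + 1].astype(float)
            ssd = np.sum((ref - cand) ** 2)      # sum of squared differences
            if ssd < best_ssd:
                best_ssd, best_shift = ssd, (dy, dx)
    return best_shift
```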

Results from Prior Methods
Flow fields (sampled by 2): true, LS-LS, LS-R, R-R, confidence.
Horizontal flow: M-OFC, LS-LMedS, LS-R, R-R.
LS = least squares; LS-R = robust least squares; R = new robust method; M-OFC = solving the optical flow constraint with an M-estimator; LMedS = least median of squares.

Estimating Piecewise-Smooth Optical Flow with Global Matching and Graduated Optimization
A Bayesian Approach

Problem Statement
Assuming only brightness conservation and piecewise-smooth motion, find the optical flow that best describes the intensity change across three frames.

Approach: Matching-Based Global Optimization
Step 1. Robust local gradient-based method for a high-quality initial flow estimate.
Step 2. Global gradient-based method to improve the flow-field coherence.
Step 3. Global matching that minimizes the energy by a greedy approach.

Global Energy Design
Global energy = matching (warping) error + smoothness error, summed over all pixels:
E(V) = Σ_s [ E_B(V_s) + E_S(V_s) ]
V is the optical flow field; V_s is the optical flow at pixel s.
E_B is the brightness-conservation (matching) error: I- and I+ are the previous and next frames, and I-(V_s) is the warped intensity in the previous frame.
E_S is the flow smoothness error in a neighborhood about pixel s.
Both terms are measured with a robust error function.
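The energy above can be sketched in code as follows. This is a toy version: the Lorentzian robust function, the 4-neighborhood smoothness term, and the sigma values are assumptions for illustration, not the talk's exact choices.

```python
import numpy as np

def lorentzian(x, sigma):
    # Robust error function: grows slower than x**2, limiting outlier influence.
    return np.log1p(0.5 * (x / sigma) ** 2)

def global_energy(I_prev_warped, I_next_warped, flow, sigma_b=1.0, sigma_s=0.3):
    """E(V) = sum_s [ E_B(V_s) + E_S(V_s) ]  (illustrative form).

    I_prev_warped, I_next_warped: previous/next frame intensities warped by the
    current flow, so brightness conservation means they should agree per pixel.
    flow: (H, W, 2) flow field V.
    """
    # Matching (warping) error E_B: robust penalty on the warped intensity difference.
    e_b = lorentzian(I_next_warped - I_prev_warped, sigma_b).sum()

    # Smoothness error E_S: robust penalty on flow differences between neighbors.
    dx = flow[:, 1:] - flow[:, :-1]
    dy = flow[1:, :] - flow[:-1, :]
    e_s = lorentzian(dx, sigma_s).sum() + lorentzian(dy, sigma_s).sum()
    return e_b + e_s
```

The two sigmas act as scale parameters that balance the matching and smoothness terms.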

Step 1: Gradient-Based Local Regression
A crude flow estimate is assumed available (and has already been compensated for). A robust gradient-based local regression then computes the incremental flow: the dominant translational motion in the neighborhood of each pixel is found by solving the set of optical flow constraint equations with a least-median-of-squares criterion.
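To picture the least-median-of-squares criterion, here is an illustrative sketch (not the talk's exact estimator): each neighborhood pixel contributes one constraint Ix*u + Iy*v + It = 0; random pairs of constraints are solved exactly, and the candidate whose median squared residual over the neighborhood is smallest is kept, so that a minority of outlier constraints (e.g. from a motion boundary) can be ignored.

```python
import numpy as np

def lmeds_flow(Ix, Iy, It, n_trials=100, rng=np.random.default_rng(0)):
    """Estimate the dominant translational flow (u, v) in a neighborhood.

    Ix, Iy, It: 1-D arrays of spatial/temporal gradients, one constraint
    Ix*u + Iy*v + It = 0 per neighborhood pixel.
    """
    A = np.stack([Ix, Iy], axis=1)
    b = -It
    best_v, best_med = np.zeros(2), np.inf
    for _ in range(n_trials):
        i, j = rng.choice(len(b), size=2, replace=False)
        try:
            v = np.linalg.solve(A[[i, j]], b[[i, j]])  # minimal 2-equation solve
        except np.linalg.LinAlgError:
            continue                                   # degenerate (aperture-like) pair
        med = np.median((A @ v - b) ** 2)              # median squared residual
        if med < best_med:
            best_v, best_med = v, med
    return best_v, best_med
```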

Step 2: Gradient-Based Global Optimization
The coherence of V is improved using a gradient-based global optimization method. The energy to minimize combines the OFC residual at each pixel with a flow smoothness term, where e_B is the residual of the optical flow constraint (OFC), V_s is the corresponding vector of the initial flow, and the sigmas are scale parameters.
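A simplified sketch of what a gradient-based global relaxation of this kind looks like. This is essentially the classic Horn-Schunck update, used here only for illustration; the talk's robust energy and its scale parameters differ.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def global_ofc_relax(Ix, Iy, It, flow, alpha=10.0, n_iters=50):
    """Iteratively trade the OFC residual off against flow smoothness
    (Horn-Schunck-style update; alpha weights the smoothness term)."""
    u, v = flow[..., 0].copy(), flow[..., 1].copy()
    for _ in range(n_iters):
        u_bar = uniform_filter(u, size=3)          # local flow averages
        v_bar = uniform_filter(v, size=3)
        r = (Ix * u_bar + Iy * v_bar + It) / (alpha + Ix ** 2 + Iy ** 2)
        u = u_bar - Ix * r                         # pull the average back toward the OFC
        v = v_bar - Iy * r
    return np.stack([u, v], axis=-1)
```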

Step 3: Global Matching
The new flow estimate still exhibits gross errors at motion boundaries and in other places with poor gradient estimates. These errors are reduced by solving the matching-based formulation through greedy propagation: the energy is calculated for all pixels; each pixel is then visited, and a trial estimate drawn from the candidates in its neighborhood becomes the new estimate whenever it yields a lower energy. This sweep is repeated iteratively.
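The greedy propagation can be sketched as follows. Here `pixel_energy` is a hypothetical callback returning the local contribution of a pixel to the global energy, and the candidates are taken from the 4-neighborhood; the talk's candidate set and visiting order may differ.

```python
import numpy as np

def greedy_propagation(flow, pixel_energy, n_sweeps=3):
    """Visit every pixel and adopt a neighbor's flow if it lowers the energy."""
    H, W = flow.shape[:2]
    for _ in range(n_sweeps):
        for y in range(H):
            for x in range(W):
                best = pixel_energy(flow, y, x)
                for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    ny, nx = y + dy, x + dx
                    if not (0 <= ny < H and 0 <= nx < W):
                        continue
                    old = flow[y, x].copy()
                    flow[y, x] = flow[ny, nx]      # trial estimate from a neighbor
                    e = pixel_energy(flow, y, x)
                    if e < best:
                        best = e                   # keep the better candidate
                    else:
                        flow[y, x] = old           # revert
    return flow
```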

Overall Algorithm
An image pyramid is processed coarse to fine. At each level p: warp the frames by the current estimate, calculate gradients, run the local OFC step, the global OFC step, and global matching; then project the flow to level p-1.
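For illustration, a minimal sketch of the pyramid machinery (SciPy-based helpers; the blur, number of levels, and interpolation order are assumptions): at each level the three steps above run on frames warped by the current estimate, and the resulting flow is projected to the next finer level by upsampling it and doubling the displacements.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

def build_pyramid(img, n_levels=3):
    """Gaussian image pyramid: blur then downsample by 2; coarsest level first."""
    levels = [img.astype(float)]
    for _ in range(n_levels - 1):
        levels.append(zoom(gaussian_filter(levels[-1], sigma=1.0), 0.5))
    return levels[::-1]

def project_flow(flow, finer_shape):
    """Project a flow field to the next finer pyramid level:
    upsample spatially and double the displacements (pixels are twice as fine)."""
    fy = finer_shape[0] / flow.shape[0]
    fx = finer_shape[1] / flow.shape[1]
    return 2.0 * zoom(flow, (fy, fx, 1), order=1)
```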

Advantages: Best of Everything
Local OFC: high-quality initial flow estimates; robust local scale estimates.
Global OFC: improved flow smoothness.
Global matching: the optimal formulation; corrects errors caused by poor gradient quality and the hierarchical process.
Results: fast convergence, high accuracy, simultaneous motion boundary detection.

Experiments Experiments were run on several standard test videos. Estimates of optical flow were made for the middle frame of every three. The results were compared with the Black and Anandan algorithm.

TS: Translating Squares
Homebrew sequence, ideal setting, tests the performance upper bound. 64x64, 1 pixel/frame.
Ground truth (cropped); our estimate looks the same.

TS: Flow Estimate Plots
Panels: LS, BA, S1 (S2 is close); S3 looks the same as the ground truth.
S1, S2, S3: results from our Steps I, II, III (final); BA: Black and Anandan.

TT: Translating Tree
150x150 (Barron 94). Flow estimates: BA, S3.
BA: 2.60, 0.128, 0.0724.
e: error in pixels; cdf: cumulative distribution function over all pixels.
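The per-pixel error and its cumulative distribution can be computed along these lines (a hypothetical helper; the exact error measures tabulated on the slides may differ):

```python
import numpy as np

def flow_error_stats(flow_est, flow_true):
    """Per-pixel flow error in pixels and its empirical CDF."""
    e = np.linalg.norm(flow_est - flow_true, axis=-1).ravel()
    e_sorted = np.sort(e)
    cdf = np.arange(1, e.size + 1) / e.size  # fraction of pixels with error <= e_sorted[i]
    return e.mean(), e_sorted, cdf
```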

DT: Diverging Tree
150x150 (Barron 94). Flow estimates: BA, S3.
BA: 6.36, 0.182, 0.114.

YOS: Yosemite Fly-Through
316x252 (Barron, clouds excluded). Flow estimates: BA, S3.
BA: 2.71, 0.185, 0.118.
S3: 1.92, 0.120, 0.0776.

TAXI: Hamburg Taxi
256x190 (Barron 94), max speed 3.0 pix/frame.
Panels: LMS, BA, ours; error map; smoothness error.

Traffic
512x512 (Nagel), max speed 6.0 pix/frame.
Panels: BA, ours; error map; smoothness error.

Pepsi Can
201x201 (Black), max speed 2 pix/frame.
Panels: BA, ours; smoothness error.

FG: Flower Garden
360x240 (Black), max speed 7 pix/frame.
Panels: LMS, BA, ours; error map; smoothness error.

Contributions (1/2)
Formulation: a more complete design with minimal parameter tuning; adaptive local scales; the strength of the two error terms is balanced automatically; 3-frame matching avoids visibility problems.
Solution (3-step optimization): robust initial estimates and scales; model parameter self-learning; inherits the merits of three methods while overcoming their shortcomings.

Contributions (2/2)
Results: high accuracy; fast convergence; by-product: motion boundaries.
Significance: a foundation for higher-level (model-based) visual motion analysis; a methodology applicable to other low-level vision problems.