Jan Šochman, Jiří Matas
Center for Machine Perception, Czech Technical University, Prague
ACCV 2007, Tokyo, Japan
Importance of Classification Speed
The time-to-decision vs. precision trade-off is inherent in many detection, recognition and matching problems in computer vision.
The trade-off is often not stated explicitly in the problem formulation, but decision time clearly influences the impact of a method.
Example: face detection
- Viola-Jones (2001) – real-time performance
- Schneiderman-Kanade (1998) – smaller error rates, but 1000x slower
Fast Emulation of a Decision Process
The Idea
- Given a black-box algorithm A performing some useful binary decision task
- Train a sequential classifier S to (approximately) emulate the detection performance of algorithm A while minimizing time-to-decision
- Allow the user to control the quality of the approximation
Contribution
- A general framework for speeding up existing algorithms with a sequential classifier learned by the WaldBoost algorithm [1]
- Demonstrated on two interest point detectors
Advantages
- Instead of spending man-months on code optimization, choose a relevant feature class and train the sequential classifier S
- Your (slow) Matlab code can be sped up this way!
[1] J. Šochman and J. Matas. WaldBoost – Learning for Time Constrained Sequential Detection. CVPR 2005.
Black-box Generated Training Set
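Only the slide title survived here. As a rough illustration of how such a training set could be assembled, below is a minimal Python sketch: the slow black-box detector is run over scanning-window positions, windows where it fires become positives and a random subset of the remaining windows becomes negatives. All names (build_training_set, black_box_detect, patch_size, neg_per_image) are hypothetical, and the random subsampling of negatives is a simplification, not the bootstrapping scheme of the original method.

```python
import random
import numpy as np

def build_training_set(images, black_box_detect, patch_size=24, neg_per_image=500):
    """Label scanning-window patches by running the slow black-box detector.

    black_box_detect(img) is assumed to return the (x, y) top-left corners of
    windows where the emulated algorithm fires.  Those windows become positive
    samples; a random subset of the remaining windows becomes the negatives.
    """
    positives, negatives = [], []
    for img in images:
        fired = set(black_box_detect(img))
        h, w = img.shape[:2]
        candidates = [(x, y)
                      for y in range(h - patch_size)
                      for x in range(w - patch_size)]
        for (x, y) in candidates:
            if (x, y) in fired:
                positives.append(img[y:y + patch_size, x:x + patch_size])
        # keep only a manageable random sample of negative windows
        non_fired = [c for c in candidates if c not in fired]
        for (x, y) in random.sample(non_fired, min(neg_per_image, len(non_fired))):
            negatives.append(img[y:y + patch_size, x:x + patch_size])
    return np.array(positives), np.array(negatives)
```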
Basic notions: a sequential strategy S evaluates measurements one at a time; after each measurement it either outputs a decision (+1 / -1) or asks for the next measurement.
Sequential strategy S is characterized by: its false negative rate α_S, its false positive rate β_S, and its average time-to-decision T_S.
Problem formulation: find the strategy with the shortest average time-to-decision whose error rates stay within user-specified bounds α and β (written out below).
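The formulas that accompanied this slide did not survive extraction; in the notation of the WaldBoost paper [1], the constrained problem can be written roughly as follows (T_S is the time-to-decision of strategy S, α_S and β_S its false negative and false positive rates):

```latex
% Sequential strategy: at each step decide +1, decide -1, or continue (\sharp)
S_t(x_1, \dots, x_t) \in \{+1, -1, \sharp\}

% Find the fastest strategy whose errors stay within the user-chosen bounds
S^\ast = \arg\min_{S} \ \mathbb{E}\,[\,T_S\,]
\qquad \text{subject to} \qquad
\alpha_S \le \alpha, \quad \beta_S \le \beta
```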
WaldBoost Sequential Classifier Training
- Combines AdaBoost training (which provides the measurements) with Wald's sequential decision-making theory (which provides the sequential decisions)
- The sequential WaldBoost classifier evaluates the weak classifiers one by one and compares the accumulated response with two per-step thresholds (a sketch of the evaluation is given below)
- The set of weak classifiers (features) and the thresholds are found during training
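A minimal Python sketch of how such a sequential classifier could be evaluated at test time; the function and threshold names are assumptions, and the forced decision after the last weak classifier is a simplification.

```python
def waldboost_classify(x, weak_classifiers, theta_A, theta_B):
    """Sequentially evaluate a trained WaldBoost classifier on a sample x.

    weak_classifiers : list of callables h_t(x) returning a real-valued response
    theta_A, theta_B : per-step rejection / acceptance thresholds (theta_A[t] <= theta_B[t])
    Returns +1 (accept) or -1 (reject); most samples are decided after only a
    few weak classifiers, which is where the speed-up comes from.
    """
    H = 0.0
    for t, h in enumerate(weak_classifiers):
        H += h(x)                     # accumulate one more measurement
        if H >= theta_B[t]:           # enough positive evidence: accept early
            return +1
        if H <= theta_A[t]:           # enough negative evidence: reject early
            return -1
    return +1 if H >= 0 else -1       # forced decision after the last step
```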
Emulation of Similarity-Invariant Regions
Motivation
- Hessian-Laplace is close to the state of the art
- Kadir's detector is very slow (100x slower than Difference of Gaussians)
- Executables of both methods are available at robots.ox.ac.uk
- Both detectors are scale-invariant, which is easily implemented via a scanning window plus a sequential test
Implementation
- Choice of features: various filters computable with integral images – differences of rectangular regions, variance in a window (see the sketch below)
- Positive samples: patches twice the size of the original interest point scale
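As an illustration of the kind of features mentioned above, here is a short Python sketch of a summed-area table and the two filter types (difference of rectangular regions, variance in a window); the function names are assumptions, not the authors' implementation.

```python
import numpy as np

def integral_image(img):
    """Summed-area table with an extra zero row/column so box sums are O(1)."""
    return np.pad(img.astype(np.float64), ((1, 0), (1, 0))).cumsum(0).cumsum(1)

def box_sum(ii, x, y, w, h):
    """Sum of pixels in the rectangle [x, x+w) x [y, y+h)."""
    return ii[y + h, x + w] - ii[y, x + w] - ii[y + h, x] + ii[y, x]

def diff_of_rects(ii, rect_a, rect_b):
    """Difference of two rectangular region sums (one of the filter types)."""
    return box_sum(ii, *rect_a) - box_sum(ii, *rect_b)

def window_variance(img, x, y, w, h):
    """Variance inside a window, from integral images of the image and its square."""
    imgf = img.astype(np.float64)
    ii, ii2 = integral_image(imgf), integral_image(imgf ** 2)
    n = w * h
    s, s2 = box_sum(ii, x, y, w, h), box_sum(ii2, x, y, w, h)
    return s2 / n - (s / n) ** 2
```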
Results: Hessian-Laplace – boat sequence
- Repeatability slightly higher (left graph)
- Matching score slightly higher (middle graph)
- Higher number of correspondences and correct matches for WaldBoost
- Speed comparison (850x680 image)
Approximation Quality – Hessian-Laplace
Yellow circles: repeated detections (85% coverage)
Approximation Quality – Kadir-Brady
Yellow circles: repeated detections (96% coverage)
Red circles: original detections not found by the WaldBoost detector
Conclusions and Future Work
- A general framework for speeding up existing binary decision algorithms has been presented
- To optimize the emulator's time-to-decision, a WaldBoost sequential classifier is trained
- The approach was demonstrated on (but is not limited to) two interest point detectors
Future work
- Precise localization of the detections by interpolation on the grid
- Using the real-valued output of the black-box algorithm ("sequential regression")
- To emulate or to be repeatable?
Thank you for your attention