   Probability density function (pdf)
    estimation using isocontours/isosurfaces
   Application to Image Registration
   Application to Image Filtering
   Circular/spherical density estimation in
    Euclidean space



Histograms · kernel density estimates · mixture models

Parameter selection: bin width / bandwidth / number of components.
Bias/variance tradeoff: a large bandwidth gives high bias; a low bandwidth gives high variance.
Sample-based methods do not treat a signal as a signal.
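The bandwidth tradeoff above can be seen in a minimal Gaussian KDE sketch (plain NumPy; the sample data and the two bandwidth values are illustrative, not from the slides):

```python
import numpy as np

def kde(samples, grid, bandwidth):
    """Gaussian kernel density estimate of 1-D samples, evaluated on grid."""
    z = (grid[:, None] - samples[None, :]) / bandwidth
    return np.exp(-0.5 * z**2).sum(axis=1) / (
        len(samples) * bandwidth * np.sqrt(2.0 * np.pi))

rng = np.random.default_rng(0)
samples = rng.normal(0.0, 1.0, size=200)
grid = np.linspace(-4.0, 4.0, 801)

smooth = kde(samples, grid, bandwidth=1.5)   # large bandwidth: high bias (over-smoothed)
rough = kde(samples, grid, bandwidth=0.05)   # small bandwidth: high variance (spiky)
```

Both estimates integrate to one; the small-bandwidth estimate tracks individual samples (tall spikes), the large-bandwidth one blurs the true shape.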
Continuous image representation: a function I(x,y) obtained from the pixel values using some interpolant. Trace out isocontours of the intensity at several intensity values.
Level curves at intensity \alpha and \alpha + \Delta\alpha:
P(\alpha < I < \alpha + \Delta\alpha) \propto area of the (brown) region between them.

Assume a uniform density on (x,y). Then

p_I(\alpha) = \frac{1}{|\Omega|}\,\frac{d}{d\alpha}\int_{I(x,y)\le\alpha} dx\,dy
            = \frac{1}{|\Omega|}\int_{I(x,y)=\alpha} \frac{du}{\sqrt{I_x^2 + I_y^2}}

This is a random-variable transformation from (x,y) to (I,u), where u is a dummy variable along the level set of intensity I; integrating out u gives the density of intensity. Every point in the image domain contributes to the density.

Published in CVPR 2006, PAMI 2009.
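A quick numerical sanity check of the identity above, on a toy image where both sides have closed forms (the choice I(x,y) = x² on the unit square is an illustration, not from the slides):

```python
import numpy as np

# Toy image I(x, y) = x**2 on the unit square, so |Omega| = 1.
# Left side:  d/d(alpha) of Area{ I <= alpha }, with Area = sqrt(alpha).
# Right side: integral over { I = alpha } of du / |grad I|; the contour
# is the segment x = sqrt(alpha) (length 1), where |grad I| = 2*sqrt(alpha).

def cumulative_area(alpha):
    return np.sqrt(alpha)

def levelset_integral(alpha):
    return 1.0 / (2.0 * np.sqrt(alpha))

alpha, h = 0.25, 1e-5
lhs = (cumulative_area(alpha + h) - cumulative_area(alpha - h)) / (2 * h)
rhs = levelset_integral(alpha)
```

At alpha = 0.25 both sides equal 1: the derivative of the cumulative area matches the level-set integral.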
Level curves at intensity \alpha_1 in I_1 and \alpha_2 in I_2:
P(\alpha_1 < I_1 < \alpha_1 + \Delta\alpha_1,\; \alpha_2 < I_2 < \alpha_2 + \Delta\alpha_2) \propto area of the (black) region.

p_{I_1,I_2}(\alpha_1, \alpha_2) = \frac{1}{|\Omega|}\sum_{C} \frac{1}{|\nabla I_1(x,y)\,\nabla I_2(x,y)\,\sin(\theta(x,y))|}

C = \{(x,y) \mid I_1(x,y) = \alpha_1,\; I_2(x,y) = \alpha_2\}
\theta(x,y) = angle between the gradients at (x,y)

Relationships between geometric and probabilistic entities.
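As a sketch of the sum over level-curve intersections (toy images I₁ = x, I₂ = y on the unit square, chosen here for illustration; their joint intensity density should be uniform):

```python
import numpy as np

# I1(x, y) = x and I2(x, y) = y on the unit square (|Omega| = 1).
# The level curves {I1 = a1} and {I2 = a2} cross at exactly one point,
# which contributes 1 / (|grad I1| |grad I2| |sin(theta)|).

g1 = np.array([1.0, 0.0])          # grad I1
g2 = np.array([0.0, 1.0])          # grad I2

cos_t = g1 @ g2 / (np.linalg.norm(g1) * np.linalg.norm(g2))
sin_t = np.sqrt(1.0 - cos_t**2)    # orthogonal gradients: sin(theta) = 1

p_joint = 1.0 / (np.linalg.norm(g1) * np.linalg.norm(g2) * sin_t)
# p_joint == 1.0: the joint density is uniform on [0,1]^2, as expected.
```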
 Similar density estimator developed by Kadir and Brady (BMVC 2005), independently of us.

 Similar idea, but with several differences in implementation, motivation, derivation of results, and applications.
p_I(\alpha) = \frac{1}{|\Omega|}\int_{I(x,y)=\alpha} \frac{du}{|\nabla I(x,y)|}

p_{I_1,I_2}(\alpha_1, \alpha_2) = \frac{1}{|\Omega|}\sum_{C} \frac{1}{|\nabla I_1(x,y)\,\nabla I_2(x,y)\,\sin(\theta(x,y))|}

C = \{(x,y) \mid I_1(x,y) = \alpha_1,\; I_2(x,y) = \alpha_2\}

Densities (derivatives of the cumulative) do not exist where image gradients are zero, or where the image gradients run parallel. Instead, compute cumulative interval measures P(\alpha < I < \alpha + \Delta).
[Figure: standard histograms vs. the isocontour method at 32, 64, 128, 256, 512, and 1024 bins.]
 A randomized/digital approximation to the area calculation.

 A strict lower bound on the accuracy of the isocontour method, for a fixed interpolant.

 Computationally more expensive than the isocontour method.
[Figure: joint density estimates with 128 x 128 bins.]
 Simplest one: a linear interpolant on each half-pixel (level curves are line segments).

 Low-order polynomial interpolants: high bias, low variance.

 High-order polynomial interpolants: low bias, high variance.
Polynomial interpolant: the accuracy of the estimated density improves as the signal is sampled at finer resolution.

Assumptions on the signal allow a better interpolant. For a bandlimited analog signal, Nyquist-sampled: accurate reconstruction by the sinc interpolant (Whittaker-Shannon sampling theorem).
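The Whittaker-Shannon reconstruction can be sketched directly (the test signal, sampling rate, and evaluation point below are illustrative; the sum is truncated to the available samples):

```python
import numpy as np

def sinc_reconstruct(samples, T, t):
    """Whittaker-Shannon formula x(t) = sum_n x[nT] * sinc((t - nT) / T),
    truncated to the finite set of samples provided. np.sinc is the
    normalized sinc, sin(pi z) / (pi z), exactly as the formula needs."""
    n = np.arange(len(samples))
    return float(np.sum(samples * np.sinc((t - n * T) / T)))

f = 3.0                      # 3 Hz sine: bandlimited
T = 0.1                      # 10 Hz sampling > 2 f (above Nyquist)
n = np.arange(200)
x = np.sin(2 * np.pi * f * n * T)

t = 10.037                   # off-grid time, well inside the sampled window
approx = sinc_reconstruct(x, T, t)
exact = float(np.sin(2 * np.pi * f * t))
```

At a sample instant the sum collapses to that sample; off-grid, the truncated sum recovers the sine to within a small truncation error.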
   Probability density function (pdf) estimation
    using isocontours
   Application to Image Registration
   Application to Image Filtering
   Circular/spherical density estimation in
    Euclidean space



Given two images of an object, find the geometric transformation that "best" aligns one with the other, w.r.t. some image similarity measure.

• Mutual information: well-known image similarity measure, Viola and Wells (IJCV 1997) and Maes et al. (TMI 1997). Insensitive to illumination changes, hence useful in multimodality image registration.
MI(I_1, I_2) = H(I_1) - H(I_1 \mid I_2) = H(I_1) + H(I_2) - H(I_1, I_2)

H(I_1) = -\sum_{\alpha_1} p_1(\alpha_1) \log p_1(\alpha_1)
H(I_2) = -\sum_{\alpha_2} p_2(\alpha_2) \log p_2(\alpha_2)
H(I_1, I_2) = -\sum_{\alpha_1}\sum_{\alpha_2} p_{12}(\alpha_1, \alpha_2) \log p_{12}(\alpha_1, \alpha_2)

Marginal probabilities: p_1(\alpha_1), p_2(\alpha_2). Joint probability: p_{12}(\alpha_1, \alpha_2).
Marginal entropy H(I_1); joint entropy H(I_1, I_2); conditional entropy H(I_1 \mid I_2).
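The entropies above translate directly into code; a minimal sketch on a joint probability table (the two 2x2 example tables are illustrative):

```python
import numpy as np

def mutual_information(p12):
    """MI(I1, I2) = H(I1) + H(I2) - H(I1, I2) from a joint table p12."""
    p12 = np.asarray(p12, dtype=float)
    p12 = p12 / p12.sum()
    p1 = p12.sum(axis=1)            # marginal of I1
    p2 = p12.sum(axis=0)            # marginal of I2
    def H(p):
        p = p[p > 0]                # convention: 0 log 0 = 0
        return -(p * np.log(p)).sum()
    return H(p1) + H(p2) - H(p12.ravel())

# Perfectly dependent intensities: MI = H(I1) = log 2.
dependent = np.array([[0.5, 0.0],
                      [0.0, 0.5]])
# Independent intensities: MI = 0.
independent = np.outer([0.5, 0.5], [0.5, 0.5])
```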
Hypothesis: if the alignment between the images I_1 and I_2 is optimal, then mutual information is maximal.

p_{12}(i, j), H(I_1, I_2), and MI(I_1, I_2) are functions of the geometric transformation.
[Figure: results at noise levels σ = 0.05, 0.2, 0.7.]
[Figure: MI at σ = 0.05, 0.2, 0.7 with 32 and 128 bins; PVI = partial volume interpolation (Maes et al., TMI 1997).]
[Figure: PD slice; T2 slice; warped T2 slice; warped and noisy T2 slice.]

Brute-force search for the maximum of MI.
[Figure: MI with standard histograms vs. MI with our method, as a function of the parameters of the affine transformation; σ = 0.7, θ = 30, s = t = -0.3, φ = 0.]
32 bins:

Method                | Error in θ (avg., var.) | Error in s (avg., var.) | Error in t (avg., var.)
Histograms (bilinear) | 3.7, 18.1               | 0.7, 0                  | 0.43, 0.08
Isocontours           | 0, 0.06                 | 0, 0                    | 0, 0
PVI                   | 1.9, 8.5                | 0.56, 0.08              | 0.49, 0.1
Histograms (cubic)    | 0.3, 49.4               | 0.7, 0                  | 0.2, 0
2DPointProb           | 0.3, 0.22               | 0, 0                    | 0, 0
   Probability density function (pdf) estimation
    using isocontours
   Application to Image Registration
   Application to Image Filtering
   Circular/spherical density estimation in
    Euclidean space



Anisotropic neighborhood filters (kernel-density-based filters), grayscale images:

I(a,b) = \frac{\sum_{(x,y)\in N(a,b)} K(I(x,y) - I(a,b); \sigma)\, I(x,y)}{\sum_{(x,y)\in N(a,b)} K(I(x,y) - I(a,b); \sigma)}

K: a decreasing function (typically a Gaussian). The parameter σ controls the degree of anisotropy of the smoothing. N(a,b): neighborhood around the central pixel (a,b).
Anisotropic neighborhood filters: problems.
Sensitivity to the parameter σ. Sensitivity to the SIZE of the neighborhood. Does not account for gradient information.
Anisotropic Neighborhood filters:
            Problems
Treat pixels as independent samples




Continuous image representation: interpolate in between the pixel values.

Discrete filter:
I(a,b) = \frac{\sum_{(x,y)\in N(a,b)} K(I(x,y) - I(a,b); \sigma)\, I(x,y)}{\sum_{(x,y)\in N(a,b)} K(I(x,y) - I(a,b); \sigma)}

Continuous filter:
I(a,b) = \frac{\iint_{N(a,b)} I(x,y)\, K(I(x,y) - I(a,b); \sigma)\, dx\, dy}{\iint_{N(a,b)} K(I(x,y) - I(a,b); \sigma)\, dx\, dy}
Continuous image representation: the areas between isocontours at intensities α and α + Δ (divided by the area of the neighborhood) equal Pr(α < intensity < α + Δ | N(a,b)).
I(a,b) = \frac{\iint_{N(a,b)} I(x,y)\, K(I(x,y) - I(a,b); \sigma)\, dx\, dy}{\iint_{N(a,b)} K(I(x,y) - I(a,b); \sigma)\, dx\, dy}

= \lim_{\Delta\to 0} \frac{\sum_{\alpha} \Pr(\alpha < I < \alpha + \Delta \mid N(a,b)) \cdot \alpha \cdot K(\alpha - I(a,b); \sigma)}{\sum_{\alpha} \Pr(\alpha < I < \alpha + \Delta \mid N(a,b)) \cdot K(\alpha - I(a,b); \sigma)}

Areas between isocontours contribute to the weights for averaging.

Published in EMMCVPR 2009.
Extension to RGB images:

(\bar{R}(a,b), \bar{G}(a,b), \bar{B}(a,b)) = \frac{\sum_{\vec\alpha} \vec\alpha\, \Pr(\vec\alpha < (R,G,B) < \vec\alpha + \vec\Delta)\, K(\vec\alpha - (R,G,B); \sigma)}{\sum_{\vec\alpha} \Pr(\vec\alpha < (R,G,B) < \vec\alpha + \vec\Delta)\, K(\vec\alpha - (R,G,B); \sigma)}

Joint probability of R, G, B = area of overlap of isocontour pairs from the R, G, B images.
Mean-shift framework
• A clustering method developed by Fukunaga & Hostetler (IEEE Trans. Inf. Theory, 1975).
• Applied to image filtering by Comaniciu and Meer (PAMI 2002).
• Involves an independent update of each pixel by maximization of a local estimate of the probability density of the joint spatial and intensity parameters.
Mean-shift framework
• One step of the mean-shift update around (a, b, c), where c = I(a,b):

1. (\hat{a}, \hat{b}, \hat{c}) := \frac{\sum_{(x,y)\in N(a,b)} (x, y, I(x,y))\, w(x,y)}{\sum_{(x,y)\in N(a,b)} w(x,y)}

   (\hat{a}, \hat{b}) ≡ updated center of the neighborhood; \hat{c} ≡ updated intensity value

   w(x,y) := \exp\left(-\frac{(x-a)^2}{\sigma_s^2} - \frac{(y-b)^2}{\sigma_s^2} - \frac{(I(x,y)-c)^2}{\sigma_r^2}\right)

2. (a, b, c) ⇐ (\hat{a}, \hat{b}, \hat{c})
3. Repeat steps (1) and (2) till (a, b, c) stops changing.
4. Set I(a,b) ⇐ \hat{c}.
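Steps 1–4 can be sketched for a single pixel as follows (a simplified version that keeps the spatial window fixed while the joint mode estimate moves; all parameter values are illustrative):

```python
import numpy as np

def mean_shift_pixel(I, a, b, radius=3, sigma_s=3.0, sigma_r=0.1, iters=20):
    """Mean-shift update of one pixel in joint (x, y, intensity) space.
    Simplification: the window stays centered at (a, b) while the mode
    estimate (ca, cb, cc) moves; intensities assumed in [0, 1]."""
    H, W = I.shape
    ys, xs = np.mgrid[max(0, a - radius):min(H, a + radius + 1),
                      max(0, b - radius):min(W, b + radius + 1)]
    vals = I[ys, xs]
    ca, cb, cc = float(a), float(b), float(I[a, b])
    for _ in range(iters):
        # Gaussian weights in the joint spatial/range domain (step 1).
        w = np.exp(-((ys - ca) ** 2 + (xs - cb) ** 2) / sigma_s ** 2
                   - (vals - cc) ** 2 / sigma_r ** 2)
        na = float((ys * w).sum() / w.sum())
        nb = float((xs * w).sum() / w.sum())
        nc = float((vals * w).sum() / w.sum())
        if abs(na - ca) + abs(nb - cb) + abs(nc - cc) < 1e-6:
            break                      # step 3: stopped changing
        ca, cb, cc = na, nb, nc        # step 2
    return cc                          # step 4: denoised intensity for (a, b)
```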
Our method in the mean-shift setting:

[Figure: the three coordinate images I(x,y), X(x,y) = x, Y(x,y) = y.]
Our method in the mean-shift setting: facets of the tessellation induced by the isocontours and the pixel grid.

(X_k, Y_k) = centroid of facet k. I_k = intensity (from the interpolated image) at (X_k, Y_k). A_k = area of facet k.

(X(a,b), Y(a,b), I(a,b)) = \frac{\sum_{k\in N(a,b)} (X_k, Y_k, I_k)\, A_k\, K(\|(X_k, Y_k, I_k) - (X(a,b), Y(a,b), I(a,b))\|; \sigma)}{\sum_{k\in N(a,b)} A_k\, K(\|(X_k, Y_k, I_k) - (X(a,b), Y(a,b), I(a,b))\|; \sigma)}
Experimental Setup: Grayscale Images
• Piecewise-linear interpolation used for our method in all experiments.
• For our method, the kernel K is a pillbox kernel: K(z; σ) = 1 if |z| ≤ σ, and K(z; σ) = 0 if |z| > σ.
• For discrete mean shift, the kernel K is a Gaussian.
• Parameters used: neighborhood radius ρ = 3, σ = 3.
• Noise model: Gaussian noise of variance 0.003 (on a scale of 0 to 1).
[Figure: original image; noisy image; denoised (isocontour mean shift); denoised (Gaussian-kernel mean shift).]

[Figure: original image; noisy image; denoised (isocontour mean shift); denoised (standard mean shift).]
Experiments on color images
• Pillbox kernels for our method.
• Gaussian kernels for discrete mean shift.
• Parameters used: neighborhood radius ρ = 6, σ = 6.
• Noise model: independent Gaussian noise on each channel with variance 0.003 (on a scale of 0 to 1).
Experiments on color images
• Independent piecewise-linear interpolation
  on R,G,B channels in our method.
• Smoothing of R, G, B values done by coupled
  updates using joint probabilities.




[Figures: original image; noisy image; denoised (isocontour mean shift); denoised (Gaussian-kernel mean shift).]
Observations
• Discrete kernel mean shift performs poorly with
  small neighborhoods and small values of σ.
• Why? Small sample-size problem for kernel
  density estimation.
• Isocontour based method performs well even in
  this scenario (number of isocontours/facets >>
  number of pixels).
• Large σ or large neighborhood values not always
  necessary for smoothing.
Observations
• Superior behavior observed when comparing
  isocontour-based neighborhood filters with
  standard neighborhood filters for the same
  parameter set and the same number of
  iterations.




   Probability density function (pdf) estimation
    using isocontours
   Application to Image Registration
   Application to Image Filtering
   Circular/spherical density estimation in
    Euclidean space



Examples of unit vector data:
1. Chromaticity vectors of color values:
   (r, g, b) = \frac{(R, G, B)}{\sqrt{R^2 + G^2 + B^2}}
2. Hue (from the HSI color scheme) obtained from the RGB values:
   \theta = \arctan\left(\frac{\sqrt{3}\,(G - B)}{2R - G - B}\right)
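Both conversions in code (using arctan2 rather than a plain arctan of the ratio, so the hue lands in the correct quadrant):

```python
import numpy as np

def chromaticity(rgb):
    """Unit chromaticity vector (r, g, b) = (R, G, B) / sqrt(R^2+G^2+B^2)."""
    rgb = np.asarray(rgb, dtype=float)
    return rgb / np.linalg.norm(rgb)

def hue(rgb):
    """Hue angle theta = arctan(sqrt(3) (G - B) / (2R - G - B))."""
    R, G, B = rgb
    return float(np.arctan2(np.sqrt(3.0) * (G - B), 2.0 * R - G - B))
```

Pure red maps to hue 0, pure green to 2π/3, pure blue to -2π/3.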
Convert RGB values to unit vectors:
v_i = (r_i, g_i, b_i) = \frac{(R_i, G_i, B_i)}{\sqrt{R_i^2 + G_i^2 + B_i^2}}

Estimate the density of the unit vectors:
p(v) = \frac{1}{N} \sum_{i=1}^{N} K(v; \kappa, v_i)

K(v; \kappa, u) = C_p(\kappa)\, e^{\kappa v^T u}

K = von Mises-Fisher kernel; \kappa = concentration parameter; C_p(\kappa) = normalization constant.
voMF is popular in mixture models, Banerjee et al. (JMLR 2005). Other kernels: Watson, cosine.
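A minimal sketch of this KDE on the unit sphere in R³, where the normalization constant has the closed form C_3(κ) = κ / (4π sinh κ) (the sample data below are synthetic, for illustration only):

```python
import numpy as np

def vmf_kde(v, data, kappa):
    """von Mises-Fisher KDE on the 2-sphere: average of one kernel
    C_3(kappa) * exp(kappa * v . v_i) per unit data vector v_i."""
    C3 = kappa / (4.0 * np.pi * np.sinh(kappa))
    return float(C3 * np.mean(np.exp(kappa * data @ v)))

rng = np.random.default_rng(0)
data = rng.normal(size=(100, 3))
data /= np.linalg.norm(data, axis=1, keepdims=True)   # project to the sphere

density = vmf_kde(np.array([0.0, 0.0, 1.0]), data, kappa=5.0)
```

Each kernel integrates to one over the sphere, so the mixture is a proper density.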
Estimate the density of RGB using a KDE/mixture model, e.g.

p(R, G, B) = \frac{1}{N} \sum_{i=1}^{N} \exp\left(-\frac{(R - R_i)^2 + (G - G_i)^2 + (B - B_i)^2}{\sigma^2}\right)

(up to normalization). Obtain the density of (magnitude, chromaticity) by a random-variable transformation:

p(m, v) = m^2\, p(R, G, B), \quad m = \sqrt{R^2 + G^2 + B^2}

Density of chromaticity: integrate out the magnitude,

p(v) = \int_0^\infty m^2\, p(R, G, B)\, dm,

or condition on m = 1.

Related: the projected normal estimator and variable-bandwidth voMF KDEs; Watson, "Statistics on Spheres" (1983); Small, "The Statistical Theory of Shape" (1995); Bishop, "Neural Networks for Pattern Recognition" (2006). What's new? The notion that all estimation can proceed in Euclidean space.
Estimate density of RGB using KDE/Mixture models

Use random variable transformation to get density of HSI
              (hue, saturation,intensity)

        Integrate out S,I to get density of hue




 Consistency between densities of Euclidean and unit vector data (in terms of random-variable transformation/conditioning).

 Potential to use the large body of literature available for statistics of Euclidean data (example: the Fast Gauss Transform, Greengard et al. (SIAM Sci. Computing 1991), Duraiswami et al. (IJCV 2003)).

 Model selection can be done in Euclidean space.

CVPR2010: Advanced ITinCVPR in a Nutshell: part 4: additional slides

  • 1. (title slide)
  • 2. Probability density function (pdf) estimation using isocontours/isosurfaces  Application to Image Registration  Application to Image Filtering  Circular/spherical density estimation in Euclidean space
  • 3. Existing density estimators: histograms, kernel density estimates, mixture models. Shared issues: parameter selection (bin width / bandwidth / number of components); the bias/variance tradeoff (large bandwidth: high bias; low bandwidth: high variance); and all are sample-based methods that do not treat a signal as a signal.
  • 4. Continuous image representation: a function I(x, y) obtained from the pixel values using some interpolant. Trace out isocontours of the intensity at several intensity values.
  • 5. Level curves at intensity α and α + Δα: P(α < I < α + Δα) ∝ area of the region between the two level curves (the brown region).
  • 6. Assume a uniform density on (x, y). Random-variable transformation from (x, y) to (I, u), where u is a dummy variable along the level set. Integrating out u along the level set gives the density of the intensity I:
  p_I(\alpha) = \frac{1}{|\Omega|} \frac{d}{d\alpha} \int_{I(x,y) \le \alpha} dx\,dy = \frac{1}{|\Omega|} \int_{I(x,y) = \alpha} \frac{du}{\sqrt{I_x^2 + I_y^2}}
  Every point in the image domain contributes to the density. Published in CVPR 2006, PAMI 2009.
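As a hedged illustration (not the authors' implementation), the derivative-of-area definition above can be checked numerically: compute the cumulative area fraction F(α) = |{I ≤ α}| / |Ω| on a densely sampled image and finite-difference it. The ramp image below is my own choice of test signal, picked because its true intensity density is uniform.

```python
import numpy as np

def intensity_pdf(I, alphas):
    """Approximate p_I(alpha) = (1/|Omega|) d/d(alpha) Area{I <= alpha}
    by finite-differencing the cumulative area fraction F(alpha)."""
    F = np.array([(I <= a).mean() for a in alphas])  # area fraction of {I <= alpha}
    return np.gradient(F, alphas)

# Synthetic check: I(x, y) = x on the unit square, so the intensity
# is uniformly distributed and p_I should be close to 1 everywhere.
n = 512
I = np.tile(np.linspace(0.0, 1.0, n), (n, 1))
alphas = np.linspace(0.05, 0.95, 19)
p = intensity_pdf(I, alphas)
```

The discretization error here shrinks with the sampling resolution, which is exactly the sensitivity the isocontour method avoids by computing areas from the interpolant in closed form.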
  • 7. (figure slide)
  • 8. Level curves at intensity α₁ in I₁ and α₂ in I₂: P(α₁ < I₁ < α₁ + Δα₁, α₂ < I₂ < α₂ + Δα₂) ∝ area of the black region.
  p_{I_1, I_2}(\alpha_1, \alpha_2) = \frac{1}{|\Omega|} \sum_{C} \frac{1}{|\nabla I_1(x,y)|\,|\nabla I_2(x,y)|\,|\sin(\theta(x,y))|}
  where C = \{(x,y) \mid I_1(x,y) = \alpha_1, I_2(x,y) = \alpha_2\} and θ(x, y) is the angle between the gradients at (x, y). Relationships between geometric and probabilistic entities.
  • 9. A similar density estimator was developed by Kadir and Brady (BMVC 2005) independently of us.  Similar idea, but several differences in implementation, motivation, derivation of results and applications.
  • 10. p_I(\alpha) = \frac{1}{|\Omega|} \int_{I(x,y) = \alpha} \frac{du}{|\nabla I(x,y)|}
  p_{I_1, I_2}(\alpha_1, \alpha_2) = \frac{1}{|\Omega|} \sum_{C} \frac{1}{|\nabla I_1(x,y)|\,|\nabla I_2(x,y)|\,|\sin(\theta(x,y))|}, \quad C = \{(x,y) \mid I_1(x,y) = \alpha_1, I_2(x,y) = \alpha_2\}
  Densities (derivatives of the cumulative) do not exist where image gradients are zero, or where image gradients run parallel. Instead, compute cumulative interval measures P(α < I < α + Δ).
  • 11. (figure slide)
  • 12. (Figure: standard histograms vs. the isocontour method at 32, 64, 128, 256, 512 and 1024 bins.)
  • 13. (figure slide)
  • 14. Randomized/digital approximation to the area calculation.  A strict lower bound on the accuracy of the isocontour method, for a fixed interpolant.  Computationally more expensive than the isocontour method.
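A minimal sketch of such a randomized area approximation (the function name and sample counts are my own): draw random sub-pixel locations, evaluate a bilinear interpolant there, and histogram the interpolated intensities. The histogram mass in a bin approximates the area fraction where the interpolated image falls in that bin.

```python
import numpy as np

def sampled_histogram(I, n_bins, samples_per_pixel=16, seed=0):
    """Randomized approximation to the area computation: histogram
    bilinearly interpolated intensities at random sub-pixel points."""
    rng = np.random.default_rng(seed)
    h, w = I.shape
    n = (h - 1) * (w - 1) * samples_per_pixel
    ys = rng.uniform(0, h - 1, n)
    xs = rng.uniform(0, w - 1, n)
    y0 = np.minimum(ys.astype(int), h - 2)
    x0 = np.minimum(xs.astype(int), w - 2)
    fy, fx = ys - y0, xs - x0
    # bilinear interpolation of the four surrounding pixels
    vals = (I[y0, x0] * (1 - fy) * (1 - fx) + I[y0, x0 + 1] * (1 - fy) * fx
            + I[y0 + 1, x0] * fy * (1 - fx) + I[y0 + 1, x0 + 1] * fy * fx)
    hist, _ = np.histogram(vals, bins=n_bins, range=(0.0, 1.0))
    return hist / hist.sum()

# A linear ramp has uniformly distributed intensities, so every bin
# should receive roughly the same mass.
I = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))
h = sampled_histogram(I, n_bins=16)
```

The Monte Carlo error of this scheme decreases only as the square root of the sample count, which is why it lower-bounds the accuracy of (and costs more than) the exact isocontour computation.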
  • 15. 128 × 128 bins.
  • 16. Simplest one: a linear interpolant on each half-pixel (level curves are line segments).  Low-order polynomial interpolants: high bias, low variance.  High-order polynomial interpolants: low bias, high variance.
  • 17. Polynomial interpolant: the accuracy of the estimated density improves as the signal is sampled at finer resolution. Better interpolant, under assumptions on the signal: for a bandlimited analog signal, Nyquist-sampled, accurate reconstruction is given by the sinc interpolant (Whittaker-Shannon sampling theorem).
  • 18. Probability density function (pdf) estimation using isocontours  Application to Image Registration  Application to Image Filtering  Circular/spherical density estimation in Euclidean space
  • 19. Given two images of an object, find the geometric transformation that "best" aligns one with the other, w.r.t. some image similarity measure. • Mutual information: a well-known image similarity measure, Viola and Wells (IJCV 1995) and Maes et al. (TMI 1997). Insensitive to illumination changes: useful in multimodality image registration.
  • 20. MI(I_1, I_2) = H(I_1) - H(I_1 \mid I_2) = H(I_1) + H(I_2) - H(I_1, I_2)
  Marginal probabilities p_1(\alpha_1), p_2(\alpha_2); joint probability p_{12}(\alpha_1, \alpha_2). Marginal entropies H(I_1) = -\sum_{\alpha_1} p_1(\alpha_1) \log p_1(\alpha_1) and H(I_2) = -\sum_{\alpha_2} p_2(\alpha_2) \log p_2(\alpha_2); joint entropy H(I_1, I_2) = -\sum_{\alpha_1} \sum_{\alpha_2} p_{12}(\alpha_1, \alpha_2) \log p_{12}(\alpha_1, \alpha_2); conditional entropy H(I_1 \mid I_2).
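The entropies above can be computed directly from a joint histogram of corresponding pixel intensities; a short sketch (standard-histogram version, not the isocontour estimator):

```python
import numpy as np

def entropy(p):
    """Shannon entropy of a probability table, in nats."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def mutual_information(I1, I2, bins=32):
    """MI(I1, I2) = H(I1) + H(I2) - H(I1, I2), with all entropies
    computed from the joint histogram of corresponding pixels."""
    p12, _, _ = np.histogram2d(I1.ravel(), I2.ravel(), bins=bins)
    p12 = p12 / p12.sum()
    return entropy(p12.sum(axis=1)) + entropy(p12.sum(axis=0)) - entropy(p12)
```

An image always has maximal MI with itself (MI(I, I) = H(I)), and low MI with an unrelated image, which is what makes MI usable as a similarity measure.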
  • 21. Hypothesis: if the alignment between the images is optimal, then the mutual information is maximum. p_{12}(i, j), H(I_1, I_2) and MI(I_1, I_2) are functions of the geometric transformation between I_1 and I_2.
  • 22. (Figure: noise levels σ = 0.05, σ = 0.2, σ = 0.7.)
  • 23. (Figure: σ = 0.05, 0.2, 0.7; 32 bins and 128 bins.) PVI = partial volume interpolation (Maes et al., TMI 1997).
  • 24. PD slice, T2 slice, warped T2 slice, warped and noisy T2 slice. Brute-force search for the maximum of MI.
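A brute-force MI search can be sketched in a few lines. For brevity this restricts the search to integer translations of one image (the experiments here search over affine parameters); the helper names are my own, and a compact MI routine is inlined so the block stands alone.

```python
import numpy as np

def mi(I1, I2, bins=16):
    """Mutual information from a joint histogram (in nats)."""
    p12, _, _ = np.histogram2d(I1.ravel(), I2.ravel(), bins=bins)
    p12 = p12 / p12.sum()
    def H(p):
        p = p[p > 0]
        return -np.sum(p * np.log(p))
    return H(p12.sum(axis=1)) + H(p12.sum(axis=0)) - H(p12)

def register_by_mi(I1, I2, max_shift=4):
    """Brute-force search: try every integer shift of I2 and keep
    the one that maximizes MI with I1."""
    best_mi, best_shift = -np.inf, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            m = mi(I1, np.roll(I2, (dy, dx), axis=(0, 1)))
            if m > best_mi:
                best_mi, best_shift = m, (dy, dx)
    return best_shift

rng = np.random.default_rng(1)
I1 = rng.random((48, 48))
I2 = np.roll(I1, (2, -3), axis=(0, 1))  # misaligned copy of I1
shift = register_by_mi(I1, I2)          # recovers (-2, 3)
```

At the correct shift the joint histogram collapses onto its diagonal, so MI peaks sharply there; every other shift looks statistically independent.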
  • 25. MI with standard histograms vs. MI with our method, σ = 0.7. Parameters of the affine transformation: θ = 30, s = t = −0.3, φ = 0.
  • 26. Registration errors:
  Method                          Error in θ (avg., var.)   Error in s (avg., var.)   Error in t (avg., var.)
  Histograms (bilinear), 32 bins  3.7, 18.1                 0.7, 0                    0.43, 0.08
  Isocontours                     0, 0.06                   0, 0                      0, 0
  PVI                             1.9, 8.5                  0.56, 0.08                0.49, 0.1
  Histograms (cubic)              0.3, 49.4                 0.7, 0                    0.2, 0
  2DPointProb                     0.3, 0.22                 0, 0                      0, 0
  • 27. Probability density function (pdf) estimation using isocontours  Application to Image Registration  Application to Image Filtering  Circular/spherical density estimation in Euclidean space
  • 28. Anisotropic neighborhood filters (kernel-density-based filters) for grayscale images:
  \hat{I}(a,b) = \frac{\sum_{(x,y) \in N(a,b)} K(I(x,y) - I(a,b); \sigma)\, I(x,y)}{\sum_{(x,y) \in N(a,b)} K(I(x,y) - I(a,b); \sigma)}
  K: a decreasing function (typically a Gaussian). The parameter σ controls the degree of anisotropicity of the smoothing; (a, b) is the central pixel and N(a, b) a neighborhood around it.
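The discrete filter above can be sketched directly (a range-weight-only variant, with my own function name and parameter defaults):

```python
import numpy as np

def neighborhood_filter(I, sigma=0.1, radius=2):
    """Anisotropic neighborhood filter: each pixel becomes a K-weighted
    average of its neighborhood, K a Gaussian in the intensity
    difference I(x, y) - I(a, b)."""
    pad = np.pad(I, radius, mode='reflect')
    out = np.empty_like(I)
    h, w = I.shape
    for a in range(h):
        for b in range(w):
            patch = pad[a:a + 2 * radius + 1, b:b + 2 * radius + 1]
            wts = np.exp(-((patch - I[a, b]) ** 2) / sigma ** 2)
            out[a, b] = (wts * patch).sum() / wts.sum()
    return out

# Noise suppression check: a constant image corrupted by Gaussian noise.
rng = np.random.default_rng(0)
clean = np.full((32, 32), 0.5)
noisy = clean + rng.normal(0.0, 0.05, clean.shape)
smoothed = neighborhood_filter(noisy, sigma=0.2, radius=2)
```

Because the weights depend only on intensity differences, edges with large contrast relative to σ are preserved while homogeneous regions are averaged.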
  • 29. Anisotropic neighborhood filters: problems. Sensitivity to the parameter σ; sensitivity to the SIZE of the neighborhood; does not account for gradient information.
  • 30. Anisotropic neighborhood filters: problems. They treat pixels as independent samples.
  • 31. Continuous image representation: interpolate in between the pixel values, and replace the sums with integrals:
  \hat{I}(a,b) = \frac{\int\!\!\int_{N(a,b)} I(x,y)\, K(I(x,y) - I(a,b); \sigma)\, dx\, dy}{\int\!\!\int_{N(a,b)} K(I(x,y) - I(a,b); \sigma)\, dx\, dy}
  • 32. Continuous image representation: areas between isocontours at intensities α and α + Δ (divided by the area of the neighborhood) = Pr(α < intensity < α + Δ | N(a, b)).
  • 33. \hat{I}(a,b) = \lim_{\Delta \to 0} \frac{\sum_{\alpha} \Pr(\alpha < I < \alpha + \Delta \mid N(a,b))\, \alpha\, K(\alpha - I(a,b); \sigma)}{\sum_{\alpha} \Pr(\alpha < I < \alpha + \Delta \mid N(a,b))\, K(\alpha - I(a,b); \sigma)}
  where Pr(α < I < α + Δ | N(a, b)) ∝ Area(α < I < α + Δ | N(a, b)). The areas between isocontours contribute to the weights for averaging. Published in EMMCVPR 2009.
  • 34. Extension to RGB images:
  (\hat{R}(a,b), \hat{G}(a,b), \hat{B}(a,b)) = \frac{\sum_{\vec{\alpha}} \Pr(\vec{\alpha} < (R,G,B) < \vec{\alpha} + \vec{\Delta})\, \vec{\alpha}\, K(\vec{\alpha} - (R,G,B); \sigma)}{\sum_{\vec{\alpha}} \Pr(\vec{\alpha} < (R,G,B) < \vec{\alpha} + \vec{\Delta})\, K(\vec{\alpha} - (R,G,B); \sigma)}
  Joint probability of R, G, B = area of overlap of isocontour pairs from the R, G and B images.
  • 35. Mean-shift framework • A clustering method developed by Fukunaga & Hostetler (IEEE Trans. Inf. Theory, 1975). • Applied to image filtering by Comaniciu and Meer (PAMI 2003). • Involves independent update of each pixel by maximization of a local estimate of the probability density of the joint spatial and intensity parameters.
  • 36. Mean-shift framework • One step of the mean-shift update around (a, b, c), where c = I(a, b):
  1. (\hat{a}, \hat{b}, \hat{c}) := \frac{\sum_{(x,y) \in N(a,b)} (x, y, I(x,y))\, w(x,y)}{\sum_{(x,y) \in N(a,b)} w(x,y)}
  where (\hat{a}, \hat{b}) is the updated center of the neighborhood, \hat{c} the updated intensity value, and
  w(x,y) := \exp\left( -\frac{(x-a)^2}{\sigma_s^2} - \frac{(y-b)^2}{\sigma_s^2} - \frac{(I(x,y)-c)^2}{\sigma_r^2} \right)
  2. (a, b, c) \Leftarrow (\hat{a}, \hat{b}, \hat{c})
  3. Repeat steps (1) and (2) until (a, b, c) stops changing.
  4. Set I(a, b) \Leftarrow c.
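A minimal per-pixel sketch of this update loop (my own simplified variant: a hard neighborhood window combined with the Gaussian weights above):

```python
import numpy as np

def mean_shift_value(I, a, b, sigma_s=3.0, sigma_r=0.1, radius=3, max_iter=100):
    """Iterate the weighted mean of (x, y, I(x, y)) around (a, b, c),
    c = I(a, b), until the triple stops changing; return the final c."""
    h, w = I.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    ca, cb, c = float(a), float(b), float(I[a, b])
    for _ in range(max_iter):
        in_win = (np.abs(ys - ca) <= radius) & (np.abs(xs - cb) <= radius)
        wts = in_win * np.exp(-((ys - ca) ** 2 + (xs - cb) ** 2) / sigma_s ** 2
                              - (I - c) ** 2 / sigma_r ** 2)
        W = wts.sum()
        na, nb, nc = (wts * ys).sum() / W, (wts * xs).sum() / W, (wts * I).sum() / W
        if abs(na - ca) + abs(nb - cb) + abs(nc - c) < 1e-8:
            break
        ca, cb, c = na, nb, nc
    return c

# On a two-region image, a pixel deep inside the bright region keeps
# its value: the range term suppresses the dark region entirely.
I = np.zeros((16, 16))
I[:, 8:] = 1.0
```

The range term (I − c)²/σ_r² is what keeps the mode-seeking iteration from averaging across strong edges.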
  • 37. Our method in the mean-shift setting: I(x, y), X(x, y) = x, Y(x, y) = y.
  • 38. Our method in the mean-shift setting: facets of the tessellation induced by the isocontours and the pixel grid. (X_k, Y_k) = centroid of facet k; I_k = intensity (from the interpolated image) at (X_k, Y_k); A_k = area of facet k.
  (\hat{X}(a,b), \hat{Y}(a,b), \hat{I}(a,b)) = \frac{\sum_{k \in N(a,b)} (X_k, Y_k, I_k)\, A_k\, K(\|(X_k, Y_k, I_k) - (X(a,b), Y(a,b), I(a,b))\|; \sigma)}{\sum_{k \in N(a,b)} A_k\, K(\|(X_k, Y_k, I_k) - (X(a,b), Y(a,b), I(a,b))\|; \sigma)}
  • 39. Experimental setup: grayscale images • Piecewise-linear interpolation used for our method in all experiments. • For our method, the kernel K is a pillbox kernel: K(z; σ) = 1 if |z| ≤ σ, K(z; σ) = 0 if |z| > σ. • For discrete mean shift, the kernel K is a Gaussian. • Parameters used: neighborhood radius ρ = 3, σ = 3. • Noise model: Gaussian noise of variance 0.003 (on a scale of 0 to 1).
  • 40. Original image, noisy image, denoised (isocontour mean shift), denoised (Gaussian-kernel mean shift).
  • 41. Original image, noisy image, denoised (isocontour mean shift), denoised (standard mean shift).
  • 42. Experiments on color images • Pillbox kernels for our method. • Gaussian kernels for discrete mean shift. • Parameters used: neighborhood radius ρ = 6, σ = 6. • Noise model: independent Gaussian noise on each channel with variance 0.003 (on a scale of 0 to 1).
  • 43. Experiments on color images • Independent piecewise-linear interpolation on the R, G, B channels in our method. • Smoothing of the R, G, B values done by coupled updates using the joint probabilities.
  • 44. Original image, noisy image, denoised (isocontour mean shift), denoised (Gaussian-kernel mean shift).
  • 45. Original image, noisy image, denoised (isocontour mean shift), denoised (Gaussian-kernel mean shift).
  • 46. Observations • Discrete kernel mean shift performs poorly with small neighborhoods and small values of σ. • Why? The small-sample-size problem for kernel density estimation. • The isocontour-based method performs well even in this scenario (number of isocontours/facets >> number of pixels). • Large σ or large neighborhoods are not always necessary for smoothing.
  • 47. Observations • Superior behavior observed when comparing isocontour-based neighborhood filters with standard neighborhood filters for the same parameter set and the same number of iterations.
  • 48. Probability density function (pdf) estimation using isocontours  Application to Image Registration  Application to Image Filtering  Circular/spherical density estimation in Euclidean space
  • 49. Examples of unit-vector data: 1. Chromaticity vectors of color values: (r, g, b) = \frac{(R, G, B)}{\sqrt{R^2 + G^2 + B^2}}. 2. Hue (from the HSI color scheme) obtained from the RGB values: \theta = \arctan\left( \frac{\sqrt{3}\,(G - B)}{2R - G - B} \right).
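Both conversions are one-liners; a small sketch (I use atan2 rather than arctan so the hue covers the full circle, which is an implementation choice of mine):

```python
import numpy as np

def chromaticity(rgb):
    """Unit vector (r, g, b) = (R, G, B) / sqrt(R^2 + G^2 + B^2)."""
    rgb = np.asarray(rgb, dtype=float)
    return rgb / np.linalg.norm(rgb, axis=-1, keepdims=True)

def hue(rgb):
    """Hue angle: theta = atan2(sqrt(3) (G - B), 2R - G - B)."""
    R, G, B = np.moveaxis(np.asarray(rgb, dtype=float), -1, 0)
    return np.arctan2(np.sqrt(3) * (G - B), 2 * R - G - B)
```

For example, pure red gives hue 0 and pure green gives hue 2π/3, while gray pixels map to the chromaticity vector (1, 1, 1)/√3.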
  • 50. Convert RGB values to unit vectors: v_i = (r_i, g_i, b_i) = \frac{(R_i, G_i, B_i)}{\sqrt{R_i^2 + G_i^2 + B_i^2}}. Estimate the density of the unit vectors:
  p(v) = \frac{1}{N} \sum_{i=1}^{N} K(v; \kappa, v_i), \quad K(v; \kappa, u) = C_p(\kappa)\, e^{\kappa v^T u}
  K = von Mises-Fisher (voMF) kernel; κ = concentration parameter; C_p(κ) = normalization constant. Other popular kernels: Watson, cosine. voMF mixture models: Banerjee et al. (JMLR 2005).
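A hedged sketch of the voMF KDE for the p = 3 case (unit vectors on the sphere), where the normalization constant has the closed form C_3(κ) = κ / (4π sinh κ); the sanity check via Monte Carlo integration is my own addition:

```python
import numpy as np

def vmf_kde(v, data, kappa):
    """voMF kernel density estimate on the unit sphere in R^3:
    p(v) = (1/N) sum_i C_3(kappa) exp(kappa v . v_i),
    with C_3(kappa) = kappa / (4 pi sinh(kappa))."""
    C = kappa / (4.0 * np.pi * np.sinh(kappa))
    return C * np.mean(np.exp(kappa * data @ v))

rng = np.random.default_rng(0)
data = rng.normal(size=(20, 3))
data /= np.linalg.norm(data, axis=1, keepdims=True)   # unit-vector samples
u = rng.normal(size=(4000, 3))
u /= np.linalg.norm(u, axis=1, keepdims=True)         # uniform test directions
vals = np.array([vmf_kde(ui, data, kappa=5.0) for ui in u])
integral = 4 * np.pi * vals.mean()                    # should be close to 1
```

Normalizing Gaussian draws gives uniformly distributed directions, so averaging the density over them and multiplying by the sphere's area 4π estimates the total probability mass.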
  • 51. Estimate the density of RGB using a KDE/mixture model:
  p(R, G, B) = \frac{1}{N} \sum_{i=1}^{N} \exp\left( -\frac{(R - R_i)^2 + (G - G_i)^2 + (B - B_i)^2}{\sigma^2} \right)
  Density of (magnitude, chromaticity) by random-variable transformation: p(m, v) = m^2\, p(R, G, B), where m = \sqrt{R^2 + G^2 + B^2}. Density of chromaticity: either integrate out the magnitude, p(v) = \int_0^\infty m^2\, p(R, G, B)\, dm, or condition on m = 1; conditioning yields a variable-bandwidth voMF KDE with per-sample concentrations \kappa_i = 2 m_i / \sigma^2, m_i = \sqrt{R_i^2 + G_i^2 + B_i^2}, weights w_i and normalizers 2\pi I_0(\kappa_i). Projected normal estimator: Watson, "Statistics on Spheres", 1983; Small, "The Statistical Theory of Shape", 1995; Bishop, "Neural Networks for Pattern Recognition". What's new? The notion that all estimation can proceed in Euclidean space.
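The conditioning step can be verified numerically: for unit vectors v, ||v − x_i||² = 1 + m_i² − 2 m_i v·v_i, so a Euclidean Gaussian kernel restricted to the sphere is proportional to a voMF kernel with κ_i = 2 m_i / σ². A small check (the sample x_i and σ are hypothetical values of mine):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 0.5
x_i = np.array([0.6, 0.3, 0.2])   # a Euclidean sample (hypothetical values)
m_i = np.linalg.norm(x_i)

# random unit test vectors on the sphere
v = rng.normal(size=(8, 3))
v /= np.linalg.norm(v, axis=1, keepdims=True)

# Euclidean Gaussian kernel evaluated on the sphere ...
gauss = np.exp(-np.sum((v - x_i) ** 2, axis=1) / sigma ** 2)
# ... vs. the voMF kernel with kappa_i = 2 m_i / sigma^2
vmf = np.exp((2 * m_i / sigma ** 2) * (v @ (x_i / m_i)))

ratio = gauss / vmf  # constant: the two kernels have the same shape on the sphere
```

The constant ratio, exp(−(1 + m_i²)/σ²), is exactly the per-sample weight that the magnitude of the Euclidean sample contributes after conditioning.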
  • 52. Estimate the density of RGB using KDE/mixture models. Use a random-variable transformation to get the density of HSI (hue, saturation, intensity). Integrate out S and I to get the density of the hue.
  • 53. Consistency between densities of Euclidean and unit-vector data (in terms of random-variable transformation/conditioning).  Potential to use the large body of literature available for statistics of Euclidean data (example: the fast Gauss transform, Greengard et al. (SIAM Sci. Computing 1991), Duraiswami et al. (IJCV 2003)).  Model selection can be done in Euclidean space.