Bayesian Core: A Practical Approach to Computational Bayesian Statistics
   Image analysis







       7    Image analysis
              Computer Vision and Classification
              Image Segmentation







The k-nearest-neighbor method



       The k-nearest-neighbor (knn) procedure has been used in data
       analysis and machine learning communities as a quick way to
       classify objects into predefined groups.

       This approach requires a training dataset where both the class y
       and the vector x of characteristics (or covariates) of each
       observation are known.







The k-nearest-neighbor method



       The training dataset (y_i, x_i)_{1≤i≤n} is used by the
       k-nearest-neighbor procedure to predict the value of y_{n+1} given a
       new vector of covariates x_{n+1} in a very rudimentary manner.

       The predicted value of y_{n+1} is simply the most frequent class found
       amongst the k nearest neighbors of x_{n+1} in the set (x_i)_{1≤i≤n}.
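
       This majority-vote rule can be sketched in a few lines of Python (a toy
       illustration with synthetic data; the function `knn_predict` and the
       training arrays are ours, not from the slides):

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x_new, k):
    """Predict the class of x_new as the most frequent class among
    its k nearest neighbors (Euclidean distance) in X_train."""
    dists = np.linalg.norm(X_train - x_new, axis=1)
    nearest = np.argsort(dists)[:k]
    return Counter(y_train[nearest]).most_common(1)[0][0]

# Toy training set with two well-separated classes
X_train = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y_train = np.array([1, 1, 2, 2])
print(knn_predict(X_train, y_train, np.array([0.05, 0.1]), k=3))  # → 1
```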







The k-nearest-neighbor method




       The classical knn procedure does not involve much calibration and
       requires no statistical modeling at all!

       There exists a Bayesian reformulation.







vision dataset (1)



       vision: 1373 color pictures, described by 200 variables rather than
       by the whole table of 500 × 375 pixels

       Four classes of images: class C1 for motorcycles, class C2 for
       bicycles, class C3 for humans and class C4 for cars







vision dataset (2)



       Typical issue in computer vision problems: build a classifier to
       identify a picture pertaining to a specific topic without human
       intervention

       We use about half of the images (648 pictures) to construct the
       training dataset and we save the 689 remaining images to test the
       performance of our procedures.







vision dataset (3)







A probabilistic version of the knn methodology (1)


       Symmetrization of the neighborhood relation: if x_i belongs to the
       k-nearest-neighborhood of x_j and x_j does not belong to the
       k-nearest-neighborhood of x_i, the point x_j is added to the set of
       neighbors of x_i

       Notation: i ∼_k j

       The transformed set of neighbors is then called the symmetrized
       k-nearest-neighbor system.
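
       A minimal sketch of this symmetrization (the function name and the
       brute-force pairwise-distance computation are our own illustration):

```python
import numpy as np

def symmetrized_knn(X, k):
    """Build the symmetrized k-nearest-neighbor system: whenever i is among
    the k nearest neighbors of j, j is also made a neighbor of i."""
    n = len(X)
    dists = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    np.fill_diagonal(dists, np.inf)          # a point is not its own neighbor
    nbrs = [set(np.argsort(dists[i])[:k]) for i in range(n)]
    for i in range(n):
        for j in nbrs[i]:
            nbrs[j].add(i)                   # symmetrize the relation i ~k j
    return nbrs
```

After the loop, j ∈ nbrs[i] holds exactly when i ∈ nbrs[j].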







A probabilistic version of the knn methodology (2)







A probabilistic version of the knn methodology (3)

           P(y_i = C_j | y_{-i}, X, β, k)
                 = exp{ β Σ_{ℓ ∼_k i} I_{C_j}(y_ℓ) / N_k } ⧸ Σ_{g=1}^G exp{ β Σ_{ℓ ∼_k i} I_{C_g}(y_ℓ) / N_k },

       N_k being the average number of neighbours over all x_i's, β > 0,
       and

           y_{-i} = (y_1, ..., y_{i-1}, y_{i+1}, ..., y_n)   and   X = {x_1, ..., x_n}.
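
       Once the symmetrized neighbor sets are available, this full conditional
       is a softmax over the G classes; a hedged sketch (the function name and
       argument layout are our own):

```python
import numpy as np

def knn_full_conditional(i, y, neighbors, beta, N_k, G):
    """P(y_i = C_g | y_-i, X, beta, k) for g = 0, ..., G-1, computed from the
    symmetrized neighbor sets (classes coded 0..G-1 here)."""
    counts = np.zeros(G)
    for ell in neighbors[i]:
        counts[y[ell]] += 1                  # sum of I_{C_g}(y_ell) over ell ~k i
    logits = beta * counts / N_k
    p = np.exp(logits - logits.max())        # numerically stable softmax
    return p / p.sum()
```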





A probabilistic version of the knn methodology (4)

               β grades the influence of the prevalent neighborhood class;
               the probabilistic knn model is conditional on the covariate
               matrix X;
               the frequencies n_1/n, ..., n_G/n of the classes within the
               training set are representative of the marginal probabilities
               p_1 = P(y_i = C_1), ..., p_G = P(y_i = C_G).

       If the marginal probabilities p_g are known and different from n_g/n,
       the various classes should be reweighted according to their true
       frequencies.





Bayesian analysis of the knn probabilistic model

       From a Bayesian perspective, given a prior distribution π(β, k)
       with support [0, β_max] × {1, ..., K}, the marginal predictive
       distribution of y_{n+1} is

           P(y_{n+1} = C_j | x_{n+1}, y, X)
                 = Σ_{k=1}^K ∫_0^{β_max} P(y_{n+1} = C_j | x_{n+1}, y, X, β, k) π(β, k | y, X) dβ,

       where π(β, k | y, X) is the posterior distribution given the training
       dataset (y, X).






Unknown normalizing constant


       Major difficulty: it is impossible to compute f(y|X, β, k)

       Use instead the pseudo-likelihood made of the product of the full
       conditionals

           f(y|X, β, k) = ∏_{g=1}^G ∏_{i: y_i = C_g} P(y_i = C_g | y_{-i}, X, β, k)
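
       Since the pseudo-likelihood is a product of the full conditionals, it is
       most safely evaluated on the log scale; a sketch (our own helper,
       inlining the softmax computation):

```python
import numpy as np

def log_pseudo_likelihood(y, neighbors, beta, N_k, G):
    """Log of the product over all i of P(y_i | y_-i, X, beta, k),
    with classes coded 0..G-1 and symmetrized neighbor sets given."""
    total = 0.0
    for i in range(len(y)):
        counts = np.zeros(G)
        for ell in neighbors[i]:
            counts[y[ell]] += 1
        logits = beta * counts / N_k
        m = logits.max()                     # log-sum-exp for stability
        total += logits[y[i]] - (m + np.log(np.exp(logits - m).sum()))
    return total
```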







Pseudo-likelihood approximation

       Pseudo-posterior distribution

           π(β, k | y, X) ∝ f(y|X, β, k) π(β, k)

       Pseudo-predictive distribution

           P(y_{n+1} = C_j | x_{n+1}, y, X)
                 = Σ_{k=1}^K ∫_0^{β_max} P(y_{n+1} = C_j | x_{n+1}, y, X, β, k) π(β, k | y, X) dβ.







MCMC implementation (1)

       MCMC approximation is required

       Random walk Metropolis–Hastings algorithm

       Both β and k are updated using random walk proposals:

       (i) for β, logistic transformation

               β^(t) = β_max exp(θ^(t)) / (exp(θ^(t)) + 1),

       in order to be able to simulate a normal random walk on the θ's,
       θ̃ ∼ N(θ^(t), τ²);





MCMC implementation (2)

       (ii) for k, we use a uniform proposal on the r neighbors of k^(t),
       namely on {k^(t) − r, ..., k^(t) − 1, k^(t) + 1, ..., k^(t) + r} ∩ {1, ..., K}.

       Using this algorithm, we can thus derive the most likely class
       associated with a covariate vector x_{n+1} from the approximated
       probabilities,

           (1/M) Σ_{i=1}^M P( y_{n+1} = l | x_{n+1}, y, X, (β^(i), k^(i)) ).







MCMC implementation (3)




       [Figure: trace plots and histograms]
       β and k sequences for the knn Metropolis–Hastings sampler based on 20,000
       iterations, with β_max = 15, K = 83, τ² = 0.05 and r = 1





Image Segmentation

       The underlying structure of the "true" pixels is denoted by x,
       while the observed image is denoted by y.

       Both objects x and y are arrays, with each entry of x taking a
       finite number of values and each entry of y taking real values.

       We are interested in the posterior distribution of x given y
       provided by Bayes' theorem, π(x|y) ∝ f(y|x)π(x).

       The likelihood, f(y|x), describes the link between the observed
       image and the underlying classification.






Menteith dataset (1)



       The Menteith dataset is a 100 × 100 pixel satellite image of the
       lake of Menteith.

       The lake of Menteith is located in Scotland, near Stirling, and
       offers the peculiarity of being called "lake" rather than the
       traditional Scottish "loch."







Menteith dataset (2)



                                   [Figure] Satellite image of the lake of Menteith




Random Fields



       If we take a lattice I of sites or pixels in an image, we denote by
       i ∈ I a coordinate in the lattice. The neighborhood relation is
       then denoted by ∼.

       A random field on I is a random structure indexed by the lattice
       I, a collection of random variables {x_i ; i ∈ I} where each x_i takes
       values in a finite set χ.







Markov Random Fields


       Let x_{n(i)} be the set of values taken by the neighbors of i.

       A random field is a Markov random field (MRF) if the conditional
       distribution of any pixel given the other pixels only depends on the
       values of the neighbors of that pixel; i.e., for i ∈ I,

           π(x_i | x_{-i}) = π(x_i | x_{n(i)}).







Ising Models

       If pixels of the underlying (true) image x can only take two colors
       (black and white, say), x is then binary, while y is a grey-level
       image.

       We typically refer to each pixel x_i as being foreground if x_i = 1
       (black) and background if x_i = 0 (white).

       We have

           π(x_i = j | x_{-i}) ∝ exp(β n_{i,j}),       β > 0,

       where n_{i,j} = Σ_{ℓ ∈ n(i)} I_{x_ℓ = j} is the number of neighbors of x_i with
       color j.




Ising Models

       The Ising model is defined via full conditionals

           π(x_i = 1 | x_{-i}) = exp(β n_{i,1}) / ( exp(β n_{i,0}) + exp(β n_{i,1}) ),

       and the joint distribution satisfies

           π(x) ∝ exp( β Σ_{j∼i} I_{x_j = x_i} ),

       where the summation is taken over all pairs (i, j) of neighbors.






Simulating from the Ising Models




       The normalizing constant of the Ising Model is intractable except
       for very small lattices I...

       Direct simulation of x is not possible!







Ising Gibbs Sampler

       Algorithm (Ising Gibbs Sampler)
               Initialization: For i ∈ I, generate independently

                   x_i^(0) ∼ B(1/2).

               Iteration t (t ≥ 1):
                    1   Generate u = (u_i)_{i∈I}, a random ordering of the elements of I.
                    2   For 1 ≤ ℓ ≤ |I|, update n_{u_ℓ,0}^(t) and n_{u_ℓ,1}^(t), and generate

                           x_{u_ℓ}^(t) ∼ B( exp(β n_{u_ℓ,1}^(t)) ⧸ [ exp(β n_{u_ℓ,0}^(t)) + exp(β n_{u_ℓ,1}^(t)) ] ).



Ising Gibbs Sampler (2)
       [Figure: ten binary images]
       Simulations from the Ising model with a four-neighbor neighborhood structure
       on a 100 × 100 array after 1,000 iterations of the Gibbs sampler: β varies in
       steps of 0.1 from 0.3 to 1.2.



Potts Models


       If there are G colors and n_{i,g} denotes the number of neighbors of
       i ∈ I with color g (1 ≤ g ≤ G), that is, n_{i,g} = Σ_{j∼i} I_{x_j = g},
       then the full conditional distribution of x_i is chosen to
       satisfy π(x_i = g | x_{-i}) ∝ exp(β n_{i,g}).
       This choice corresponds to the Potts model, whose joint density is
       given by

           π(x) ∝ exp( β Σ_{j∼i} I_{x_j = x_i} ).







Simulating from the Potts Models (1)


       Algorithm (Potts Metropolis–Hastings Sampler)
               Initialization: For i ∈ I, generate independently

                   x_i^(0) ∼ U({1, ..., G}).

               Iteration t (t ≥ 1):
                    1   Generate u = (u_i)_{i∈I}, a random ordering of the elements of I.
                    2   For 1 ≤ ℓ ≤ |I|,
                              generate

                                  x̃_{u_ℓ}^(t) ∼ U({1, ..., x_{u_ℓ}^{(t−1)} − 1, x_{u_ℓ}^{(t−1)} + 1, ..., G}),






Simulating from the Potts Models (2)



       Algorithm (continued)
               compute the n_{u_ℓ,g}^(t) and

                   ρ_ℓ = exp( β n_{u_ℓ, x̃_{u_ℓ}}^(t) ) ⧸ exp( β n_{u_ℓ, x_{u_ℓ}}^(t) ) ∧ 1,

               and set x_{u_ℓ}^(t) equal to x̃_{u_ℓ} with probability ρ_ℓ
               (and to x_{u_ℓ}^{(t−1)} otherwise).
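
       One sweep of this Metropolis–Hastings sampler might be sketched as
       follows (our own implementation on a square lattice with a
       four-neighbor structure; colors are coded 0..G−1 here):

```python
import numpy as np

def potts_mh_sweep(x, beta, G, rng):
    """One Metropolis-Hastings sweep: at each site, propose a color uniformly
    among the other G-1 colors and accept with probability rho."""
    N = x.shape[0]
    for s in rng.permutation(N * N):
        i, j = divmod(s, N)
        nbrs = [x[a, b] for a, b in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                if 0 <= a < N and 0 <= b < N]
        prop = rng.integers(0, G - 1)
        if prop >= x[i, j]:
            prop += 1                                # uniform over colors != current
        n_prop = sum(v == prop for v in nbrs)
        n_cur = sum(v == x[i, j] for v in nbrs)
        rho = min(np.exp(beta * (n_prop - n_cur)), 1.0)   # exp ratio ∧ 1
        if rng.random() < rho:
            x[i, j] = prop
    return x
```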







Simulating from the Potts Models (3)
       [Figure: ten grey-level images]
       Simulations from the Potts model with four grey levels and a four-neighbor
       neighborhood structure based on 1,000 iterations of the Metropolis–Hastings
       sampler. The parameter β varies in steps of 0.1 from 0.3 to 1.2.



Posterior Inference (1)
       The prior on x is a Potts model with G categories,

           π(x|β) = (1/Z(β)) exp( β Σ_{i∈I} Σ_{j∼i} I_{x_j = x_i} ),

       where Z(β) is the normalizing constant.

       Given x, we assume that the observations in y are independent
       normal random variables,

           f(y | x, σ², μ_1, ..., μ_G) = ∏_{i∈I} (2πσ²)^{−1/2} exp( −(y_i − μ_{x_i})² / (2σ²) ).






Posterior Inference (2)



       Priors

           β ∼ U([0, 2]),
           μ = (μ_1, ..., μ_G) ∼ U({μ ; 0 ≤ μ_1 ≤ ... ≤ μ_G ≤ 255}),
           π(σ²) ∝ σ^{−2} I_{(0,∞)}(σ²),

       the last prior corresponding to a uniform prior on log σ.







Posterior Inference (3)


       Posterior distribution

           π(x, β, σ², μ | y) ∝ π(β, σ², μ) × (1/Z(β)) exp( β Σ_{i∈I} Σ_{j∼i} I_{x_j = x_i} )
                 × ∏_{i∈I} (2πσ²)^{−1/2} exp( −(y_i − μ_{x_i})² / (2σ²) ).







Full conditionals (1)

           P(x_i = g | y, β, σ², μ) ∝ exp( β Σ_{j∼i} I_{x_j = g} − (y_i − μ_g)² / (2σ²) ),

       can be simulated directly.

       With

           n_g = Σ_{i∈I} I_{x_i = g}   and   s_g = Σ_{i∈I} I_{x_i = g} y_i,

       the full conditional distribution of μ_g is a truncated normal
       distribution on [μ_{g−1}, μ_{g+1}] with mean s_g/n_g and variance σ²/n_g
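
       The μ_g update can be sketched with a simple rejection sampler for the
       truncation (our own illustration; a dedicated truncated-normal sampler
       such as scipy.stats.truncnorm would be preferable in practice):

```python
import numpy as np

def sample_mu_g(y, x, g, sigma2, lower, upper, rng):
    """Draw mu_g from N(s_g/n_g, sigma^2/n_g) truncated to [lower, upper],
    i.e. to [mu_{g-1}, mu_{g+1}], by rejection from the untruncated normal."""
    mask = (x == g)
    n_g = mask.sum()
    s_g = y[mask].sum()
    mean, sd = s_g / n_g, np.sqrt(sigma2 / n_g)
    while True:                                  # rejection step: keep drawing
        draw = rng.normal(mean, sd)              # until the draw falls in range
        if lower <= draw <= upper:
            return draw
```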





Full conditionals (2)


       The full conditional distribution of σ² is an inverse gamma
       distribution with parameters |I|/2 and Ī£_{i∈I} (y_i āˆ’ μ_{x_i})²/2.

       The full conditional distribution of β is such that

              π(β | y, x) ∝ (1/Z(β)) exp( β Ī£_{i∈I} Ī£_{j∼i} I_{x_j = x_i} ) .
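The σ² draw is standard and can be sketched as follows; the function name and array layout are illustrative assumptions, and the inverse gamma variate is obtained as the reciprocal of a gamma variate.

```python
import numpy as np

def draw_sigma2(y, x, mu, rng):
    """Draw sigma^2 from its inverse-gamma full conditional.

    Shape |I|/2 and scale sum_i (y_i - mu_{x_i})^2 / 2, where |I| = y.size.
    If X ~ Gamma(a, rate=b) then 1/X ~ InvGamma(a, b); NumPy's gamma is
    parameterised by scale = 1/rate, hence scale = 2/resid2 below.
    """
    resid2 = ((y - mu[x]) ** 2).sum()
    shape = y.size / 2.0
    return 1.0 / rng.gamma(shape, 2.0 / resid2)
```

The β conditional, by contrast, involves the intractable normalising constant Z(β) and cannot be drawn this directly, which motivates the path sampling approximation below.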







Path sampling Approximation (1)

              Z(β) = Ī£_x exp{ β S(x) } ,

       where S(x) = Ī£_{i∈I} Ī£_{j∼i} I_{x_j = x_i} .

       Differentiating in β,

              dZ(β)/dβ = Ī£_x S(x) exp{ β S(x) } = Z(β) E_β[S(X)] ,

       so that

              d log Z(β)/dβ = E_β[S(X)] .




Path sampling Approximation (2)




       Path sampling identity:

              log { Z(β₁)/Z(β₀) } = āˆ«_{β₀}^{β₁} E_β[S(x)] dβ .







Path sampling Approximation (3)



       For a given value of β, E_β[S(X)] can be approximated from an
       MCMC sequence.

       The integral itself can be approximated by computing the value of
       f(β) = E_β[S(X)] for a finite number of values of β and
       approximating f(β) by a piecewise-linear function f̂(β) at the
       intermediate values of β.
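Integrating the piecewise-linear interpolant f̂(β) over the grid is exactly the trapezoidal rule, so the log ratio can be estimated in a few lines. A minimal sketch (function and argument names are illustrative):

```python
import numpy as np

def log_z_ratio(beta_grid, es_estimates):
    """Path-sampling estimate of log{Z(beta_1)/Z(beta_0)}.

    beta_grid    : increasing grid of beta values spanning [beta_0, beta_1]
    es_estimates : MCMC estimates of E_beta[S(X)] at each grid point
    The trapezoidal rule integrates the piecewise-linear fit exactly.
    """
    beta_grid = np.asarray(beta_grid, dtype=float)
    es = np.asarray(es_estimates, dtype=float)
    widths = np.diff(beta_grid)
    return float(np.sum(widths * (es[1:] + es[:-1]) / 2.0))
```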







Path sampling Approximation (3)

       [Figure: approximation of f(β) for the Potts model on a 100 Ɨ 100 image, a
       four-neighbor neighborhood, and G = 6, based on 1500 MCMC iterations after
       burn-in.]



Posterior Approximation (1)
       [Figure: Dataset Menteith: sequences of the μ_g's based on 2000 iterations of
       the hybrid Gibbs sampler (read row-wise from μ₁ to μ₆).]



Posterior Approximation (2)


       [Figure: histograms of the μ_g's.]



Posterior Approximation (3)




       [Figure: raw plots and histograms of the σ²'s and β's based on 2000 iterations
       of the hybrid Gibbs sampler.]



Image segmentation (1)
       Based on (x^(t))_{1≤t≤T}, an estimator of x needs to be derived from
       an evaluation of the consequences of wrong allocations.

       Two common choices correspond to two loss functions,

              L₁(x, x̂) = Ī£_{i∈I} I_{x_i ≠ x̂_i} ,        L₂(x, x̂) = I_{x ≠ x̂} .

       Corresponding estimators:

              x̂_i^MPM = arg max_{1≤g≤G} P^π(x_i = g | y) ,   i ∈ I ,

              x̂^MAP = arg max_x π(x | y) .



Image segmentation (2)



       x̂^MPM and x̂^MAP are not available in closed form!

       Approximation:

              x̂_i^MPM = arg max_{g∈{1,...,G}} Ī£_{j=1}^{N} I_{x_i^(j) = g} ,

       based on a simulated sequence x^(1), . . . , x^(N).
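The pixelwise majority vote over the simulated sequence can be sketched as follows; the function name and the (N, H, W) array layout are illustrative assumptions, with labels coded 0, …, G āˆ’ 1.

```python
import numpy as np

def mpm_estimate(samples):
    """Approximate MPM estimate from simulated label images.

    samples : (N, H, W) integer array holding x^(1), ..., x^(N).
    Returns, for each pixel, the label g maximising the empirical
    frequency sum_j I(x_i^(j) = g) over the N draws.
    """
    samples = np.asarray(samples)
    G = int(samples.max()) + 1
    # per-pixel frequency of each class g across the simulated sequence
    counts = np.stack([(samples == g).sum(axis=0) for g in range(G)])
    return counts.argmax(axis=0)
```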







Image segmentation (3)




       [Figure: (top) segmented image based on the MPM estimate produced after 2000
       iterations of the Gibbs sampler and (bottom) the observed image.]

Concurrency Control in Database Management systemConcurrency Control in Database Management system
Concurrency Control in Database Management system
Ā 
THEORIES OF ORGANIZATION-PUBLIC ADMINISTRATION
THEORIES OF ORGANIZATION-PUBLIC ADMINISTRATIONTHEORIES OF ORGANIZATION-PUBLIC ADMINISTRATION
THEORIES OF ORGANIZATION-PUBLIC ADMINISTRATION
Ā 
Karra SKD Conference Presentation Revised.pptx
Karra SKD Conference Presentation Revised.pptxKarra SKD Conference Presentation Revised.pptx
Karra SKD Conference Presentation Revised.pptx
Ā 
Barangay Council for the Protection of Children (BCPC) Orientation.pptx
Barangay Council for the Protection of Children (BCPC) Orientation.pptxBarangay Council for the Protection of Children (BCPC) Orientation.pptx
Barangay Council for the Protection of Children (BCPC) Orientation.pptx
Ā 
Inclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdf
Inclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdfInclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdf
Inclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdf
Ā 
Earth Day Presentation wow hello nice great
Earth Day Presentation wow hello nice greatEarth Day Presentation wow hello nice great
Earth Day Presentation wow hello nice great
Ā 
Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17
Ā 
Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17
Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17
Incoming and Outgoing Shipments in 3 STEPS Using Odoo 17
Ā 
Judging the Relevance and worth of ideas part 2.pptx
Judging the Relevance  and worth of ideas part 2.pptxJudging the Relevance  and worth of ideas part 2.pptx
Judging the Relevance and worth of ideas part 2.pptx
Ā 
Grade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdf
Grade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdfGrade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdf
Grade 9 Quarter 4 Dll Grade 9 Quarter 4 DLL.pdf
Ā 
ISYU TUNGKOL SA SEKSWLADIDA (ISSUE ABOUT SEXUALITY
ISYU TUNGKOL SA SEKSWLADIDA (ISSUE ABOUT SEXUALITYISYU TUNGKOL SA SEKSWLADIDA (ISSUE ABOUT SEXUALITY
ISYU TUNGKOL SA SEKSWLADIDA (ISSUE ABOUT SEXUALITY
Ā 
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptxECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
Ā 
GRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTS
GRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTSGRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTS
GRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTS
Ā 
Model Call Girl in Tilak Nagar Delhi reach out to us at šŸ”9953056974šŸ”
Model Call Girl in Tilak Nagar Delhi reach out to us at šŸ”9953056974šŸ”Model Call Girl in Tilak Nagar Delhi reach out to us at šŸ”9953056974šŸ”
Model Call Girl in Tilak Nagar Delhi reach out to us at šŸ”9953056974šŸ”
Ā 
FINALS_OF_LEFT_ON_C'N_EL_DORADO_2024.pptx
FINALS_OF_LEFT_ON_C'N_EL_DORADO_2024.pptxFINALS_OF_LEFT_ON_C'N_EL_DORADO_2024.pptx
FINALS_OF_LEFT_ON_C'N_EL_DORADO_2024.pptx
Ā 
Procuring digital preservation CAN be quick and painless with our new dynamic...
Procuring digital preservation CAN be quick and painless with our new dynamic...Procuring digital preservation CAN be quick and painless with our new dynamic...
Procuring digital preservation CAN be quick and painless with our new dynamic...
Ā 
LEFT_ON_C'N_ PRELIMS_EL_DORADO_2024.pptx
LEFT_ON_C'N_ PRELIMS_EL_DORADO_2024.pptxLEFT_ON_C'N_ PRELIMS_EL_DORADO_2024.pptx
LEFT_ON_C'N_ PRELIMS_EL_DORADO_2024.pptx
Ā 
call girls in Kamla Market (DELHI) šŸ” >ą¼’9953330565šŸ” genuine Escort Service šŸ”āœ”ļøāœ”ļø
call girls in Kamla Market (DELHI) šŸ” >ą¼’9953330565šŸ” genuine Escort Service šŸ”āœ”ļøāœ”ļøcall girls in Kamla Market (DELHI) šŸ” >ą¼’9953330565šŸ” genuine Escort Service šŸ”āœ”ļøāœ”ļø
call girls in Kamla Market (DELHI) šŸ” >ą¼’9953330565šŸ” genuine Escort Service šŸ”āœ”ļøāœ”ļø
Ā 
4.16.24 21st Century Movements for Black Lives.pptx
4.16.24 21st Century Movements for Black Lives.pptx4.16.24 21st Century Movements for Black Lives.pptx
4.16.24 21st Century Movements for Black Lives.pptx
Ā 
Science 7 Quarter 4 Module 2: Natural Resources.pptx
Science 7 Quarter 4 Module 2: Natural Resources.pptxScience 7 Quarter 4 Module 2: Natural Resources.pptx
Science 7 Quarter 4 Module 2: Natural Resources.pptx
Ā 

Bayesian KNN Approach for Image Classification

  • 1. Bayesian Core: A Practical Approach to Computational Bayesian Statistics — Image analysis. 7 Image analysis: Computer Vision and Classification; Image Segmentation.
  • 2. The k-nearest-neighbor method — The k-nearest-neighbor (knn) procedure has been used in the data analysis and machine learning communities as a quick way to classify objects into predefined groups. This approach requires a training dataset where both the class y and the vector x of characteristics (or covariates) of each observation are known.
  • 3. The k-nearest-neighbor method — The training dataset $(y_i, x_i)_{1 \le i \le n}$ is used by the k-nearest-neighbor procedure to predict the value of $y_{n+1}$ given a new vector of covariates $x_{n+1}$ in a very rudimentary manner: the predicted value of $y_{n+1}$ is simply the most frequent class found amongst the k nearest neighbors of $x_{n+1}$ in the set $(x_i)_{1 \le i \le n}$.
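This majority-vote rule translates into a few lines of code. The sketch below assumes Euclidean distance on the covariates; the function name `knn_predict` is illustrative, not from the slides:

```python
import numpy as np

def knn_predict(X_train, y_train, x_new, k=3):
    """Classical knn prediction: majority vote among the k training
    points closest (in Euclidean distance) to x_new."""
    d = np.linalg.norm(X_train - x_new, axis=1)       # distances to all training points
    nearest = np.argsort(d)[:k]                       # indices of the k nearest neighbors
    classes, counts = np.unique(y_train[nearest], return_counts=True)
    return classes[np.argmax(counts)]                 # most frequent class wins

# toy training set: two well-separated 2-D groups, classes coded 1 and 2
X = np.array([[0.0, 0.0], [0.1, 0.2], [0.2, 0.1],
              [2.0, 2.0], [2.1, 1.9], [1.9, 2.1]])
y = np.array([1, 1, 1, 2, 2, 2])
print(knn_predict(X, y, np.array([0.05, 0.1]), k=3))  # falls in the class-1 region
```

Ties between classes are broken arbitrarily here, which is one reason the procedure is called rudimentary.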
  • 4. The k-nearest-neighbor method — The classical knn procedure does not involve much calibration and requires no statistical modeling at all! There exists a Bayesian reformulation.
  • 5. vision dataset (1) — vision: 1373 color pictures, described by 200 variables rather than by the whole table of 500 × 375 pixels. Four classes of images: class C1 for motorcycles, class C2 for bicycles, class C3 for humans, and class C4 for cars.
  • 6. vision dataset (2) — Typical issue in computer vision problems: build a classifier to identify a picture pertaining to a specific topic without human intervention. We use about half of the images (648 pictures) to construct the training dataset and save the 689 remaining images to test the performance of our procedures.
  • 7. vision dataset (3) — [figure: sample images from the vision dataset]
  • 8. A probabilistic version of the knn methodology (1) — Symmetrization of the neighborhood relation: if $x_i$ belongs to the k-nearest-neighborhood of $x_j$ and $x_j$ does not belong to the k-nearest-neighborhood of $x_i$, the point $x_j$ is added to the set of neighbors of $x_i$. Notation: $i \sim_k j$. The transformed set of neighbors is then called the symmetrized k-nearest-neighbor system.
  • 9. A probabilistic version of the knn methodology (2) — [figure: illustration of the symmetrized k-nearest-neighbor system]
  • 10. A probabilistic version of the knn methodology (3) — $$P(y_i = C_j \mid y_{-i}, X, \beta, k) = \frac{\exp\bigl(\beta \sum_{\ell \sim_k i} \mathbb{I}_{C_j}(y_\ell) \big/ N_k\bigr)}{\sum_{g=1}^{G} \exp\bigl(\beta \sum_{\ell \sim_k i} \mathbb{I}_{C_g}(y_\ell) \big/ N_k\bigr)}\,,$$ $N_k$ being the average number of neighbours over all $x_i$'s, $\beta > 0$, $y_{-i} = (y_1, \ldots, y_{i-1}, y_{i+1}, \ldots, y_n)$ and $X = \{x_1, \ldots, x_n\}$.
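A minimal sketch of this full conditional, assuming a precomputed symmetrized neighbor list and classes coded 1..G; the helper name `knn_conditional` is hypothetical:

```python
import numpy as np

def knn_conditional(i, y, neighbors, beta, Nk, G):
    """Full conditional P(y_i = C_j | y_{-i}, X, beta, k) of the
    probabilistic knn model: a softmax over neighborhood class counts,
    scaled by the average neighborhood size Nk.  neighbors[i] is the
    (symmetrized) list of neighbor indices of x_i."""
    counts = np.array([sum(y[l] == g for l in neighbors[i])
                       for g in range(1, G + 1)])
    w = np.exp(beta * counts / Nk)
    return w / w.sum()                  # probabilities over the G classes

# toy example: point 0 has neighbors 1, 2, 3 with classes 1, 1, 2
y = np.array([0, 1, 1, 2])              # y[0] is the label being conditioned on
neighbors = {0: [1, 2, 3]}
p = knn_conditional(0, y, neighbors, beta=2.0, Nk=3.0, G=2)
print(p)                                # class 1 more probable than class 2
```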
  • 11. A probabilistic version of the knn methodology (4) — $\beta$ grades the influence of the prevalent neighborhood class; the probabilistic knn model is conditional on the covariate matrix $X$; the frequencies $n_1/n, \ldots, n_G/n$ of the classes within the training set are representative of the marginal probabilities $p_1 = P(y_i = C_1), \ldots, p_G = P(y_i = C_G)$. If the marginal probabilities $p_g$ are known and different from $n_g/n$ $\Longrightarrow$ reweight the various classes according to their true frequencies.
  • 12. Bayesian analysis of the knn probabilistic model — From a Bayesian perspective, given a prior distribution $\pi(\beta, k)$ with support $[0, \beta_{\max}] \times \{1, \ldots, K\}$, the marginal predictive distribution of $y_{n+1}$ is $$P(y_{n+1} = C_j \mid x_{n+1}, y, X) = \sum_{k=1}^{K} \int_0^{\beta_{\max}} P(y_{n+1} = C_j \mid x_{n+1}, y, X, \beta, k)\, \pi(\beta, k \mid y, X)\, d\beta\,,$$ where $\pi(\beta, k \mid y, X)$ is the posterior distribution given the training dataset $(y, X)$.
  • 13. Unknown normalizing constant — Major difficulty: it is impossible to compute $f(y \mid X, \beta, k)$. Use instead the pseudo-likelihood made of the product of the full conditionals $$\hat f(y \mid X, \beta, k) = \prod_{g=1}^{G} \prod_{i:\, y_i = C_g} P(y_i = C_g \mid y_{-i}, X, \beta, k)\,.$$
  • 14. Pseudo-likelihood approximation — Pseudo-posterior distribution: $\hat\pi(\beta, k \mid y, X) \propto \hat f(y \mid X, \beta, k)\, \pi(\beta, k)$. Pseudo-predictive distribution: $$\hat P(y_{n+1} = C_j \mid x_{n+1}, y, X) = \sum_{k=1}^{K} \int_0^{\beta_{\max}} P(y_{n+1} = C_j \mid x_{n+1}, y, X, \beta, k)\, \hat\pi(\beta, k \mid y, X)\, d\beta\,.$$
  • 15. MCMC implementation (1) — An MCMC approximation is required: a random walk Metropolis–Hastings algorithm in which both $\beta$ and $k$ are updated using random walk proposals: (i) for $\beta$, the logistic transformation $$\beta^{(t)} = \beta_{\max} \exp(\theta^{(t)}) \big/ \bigl(\exp(\theta^{(t)}) + 1\bigr)\,,$$ in order to be able to simulate a normal random walk on the $\theta$'s, $\tilde\theta \sim \mathcal{N}(\theta^{(t)}, \tau^2)$;
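The logistic reparameterization can be sketched as follows. This is only the proposal mechanism; a full Metropolis–Hastings step would also include the (pseudo-)posterior ratio and the Jacobian of the transform in the acceptance probability:

```python
import numpy as np

rng = np.random.default_rng(1)

def propose_beta(beta, beta_max=15.0, tau2=0.05):
    """Random-walk proposal for beta on (0, beta_max): map beta to
    theta = logit(beta/beta_max), take a N(theta, tau^2) step, and
    map back through the logistic transform of the slides."""
    theta = np.log(beta / (beta_max - beta))      # logit of beta / beta_max
    theta_new = rng.normal(theta, np.sqrt(tau2))  # Gaussian random walk on theta
    return beta_max * np.exp(theta_new) / (np.exp(theta_new) + 1.0)

b = 5.0
for _ in range(100):
    b = propose_beta(b)
print(0.0 < b < 15.0)   # proposals stay inside (0, beta_max) by construction
```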
  • 16. MCMC implementation (2) — (ii) for $k$, we use a uniform proposal on the $r$ neighbors of $k^{(t)}$, namely on $\{k^{(t)} - r, \ldots, k^{(t)} - 1, k^{(t)} + 1, \ldots, k^{(t)} + r\} \cap \{1, \ldots, K\}$. Using this algorithm, we can thus derive the most likely class associated with a covariate vector $x_{n+1}$ from the approximated probabilities $$\frac{1}{M} \sum_{i=1}^{M} P\bigl(y_{n+1} = l \mid x_{n+1}, y, X, (\beta^{(i)}, k^{(i)})\bigr)\,.$$
  • 17. MCMC implementation (3) — [figure: $\beta$ and $k$ sequences for the knn Metropolis–Hastings sampler based on 20,000 iterations, with $\beta_{\max} = 15$, $K = 83$, $\tau^2 = 0.05$ and $r = 1$]
  • 18. Image Segmentation — The underlying structure of the "true" pixels is denoted by $x$, while the observed image is denoted by $y$. Both objects $x$ and $y$ are arrays, with each entry of $x$ taking a finite number of values and each entry of $y$ taking real values. We are interested in the posterior distribution of $x$ given $y$ provided by Bayes' theorem, $\pi(x \mid y) \propto f(y \mid x)\pi(x)$. The likelihood $f(y \mid x)$ describes the link between the observed image and the underlying classification.
  • 19. Menteith dataset (1) — The Menteith dataset is a 100 × 100 pixel satellite image of the lake of Menteith. The lake of Menteith is located in Scotland, near Stirling, and offers the peculiarity of being called "lake" rather than the traditional Scottish "loch."
  • 20. Menteith dataset (2) — [figure: satellite image of the lake of Menteith]
  • 21. Random Fields — If we take a lattice $I$ of sites or pixels in an image, we denote by $i \in I$ a coordinate in the lattice. The neighborhood relation is then denoted by $\sim$. A random field on $I$ is a random structure indexed by the lattice $I$, a collection of random variables $\{x_i;\, i \in I\}$ where each $x_i$ takes values in a finite set $\chi$.
  • 22. Markov Random Fields — Let $x_{n(i)}$ be the set of values taken by the neighbors of $i$. A random field is a Markov random field (MRF) if the conditional distribution of any pixel given the other pixels only depends on the values of the neighbors of that pixel; i.e., for $i \in I$, $$\pi(x_i \mid x_{-i}) = \pi(x_i \mid x_{n(i)})\,.$$
  • 23. Ising Models — If the pixels of the underlying (true) image $x$ can only take two colors (black and white, say), $x$ is then binary, while $y$ is a grey-level image. We typically refer to each pixel $x_i$ as being foreground if $x_i = 1$ (black) and background if $x_i = 0$ (white). We have $$\pi(x_i = j \mid x_{-i}) \propto \exp(\beta n_{i,j})\,, \quad \beta > 0\,,$$ where $n_{i,j} = \sum_{\ell \in n(i)} \mathbb{I}_{x_\ell = j}$ is the number of neighbors of $x_i$ with color $j$.
  • 24. Ising Models — The Ising model is defined via the full conditionals $$\pi(x_i = 1 \mid x_{-i}) = \frac{\exp(\beta n_{i,1})}{\exp(\beta n_{i,0}) + \exp(\beta n_{i,1})}\,,$$ and the joint distribution satisfies $$\pi(x) \propto \exp\Bigl(\beta \sum_{j \sim i} \mathbb{I}_{x_j = x_i}\Bigr)\,,$$ where the summation is taken over all pairs $(i, j)$ of neighbors.
  • 25. Simulating from the Ising Models — The normalizing constant of the Ising model is intractable except for very small lattices $I$... Direct simulation of $x$ is not possible!
  • 26. Ising Gibbs Sampler — Algorithm (Ising Gibbs Sampler). Initialization: for $i \in I$, generate independently $x_i^{(0)} \sim \mathcal{B}(1/2)$. Iteration $t$ ($t \ge 1$): (1) generate $u = (u_i)_{i \in I}$, a random ordering of the elements of $I$; (2) for $1 \le \ell \le |I|$, update $n_{u_\ell,0}^{(t)}$ and $n_{u_\ell,1}^{(t)}$, and generate $$x_{u_\ell}^{(t)} \sim \mathcal{B}\left(\frac{\exp(\beta n_{u_\ell,1}^{(t)})}{\exp(\beta n_{u_\ell,0}^{(t)}) + \exp(\beta n_{u_\ell,1}^{(t)})}\right)\,.$$
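The algorithm above translates almost line for line into code. Below is a sketch with a four-neighbor structure; the 100 × 100, 1,000-iteration setting of the slides is shrunk for speed:

```python
import numpy as np

rng = np.random.default_rng(0)

def ising_gibbs(n, beta, n_iter):
    """Gibbs sampler for the Ising model on an n x n lattice with a
    four-neighbor structure: each sweep visits the sites in a fresh
    random order and draws x_i from its Bernoulli full conditional."""
    x = rng.integers(0, 2, size=(n, n))           # x_i^(0) ~ B(1/2)
    for _ in range(n_iter):
        for s in rng.permutation(n * n):          # random site ordering u
            i, j = divmod(s, n)
            # values of the (up to four) neighbors inside the lattice
            nbrs = [x[a, b] for a, b in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                    if 0 <= a < n and 0 <= b < n]
            n1 = sum(nbrs)                        # neighbors with color 1
            n0 = len(nbrs) - n1                   # neighbors with color 0
            p1 = np.exp(beta * n1) / (np.exp(beta * n0) + np.exp(beta * n1))
            x[i, j] = rng.random() < p1           # Bernoulli draw
    return x

x = ising_gibbs(16, beta=0.8, n_iter=20)
print(x.shape)                                    # a 16 x 16 binary image
```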
  • 27. Ising Gibbs Sampler (2) — [figure: simulations from the Ising model with a four-neighbor neighborhood structure on a 100 × 100 array after 1,000 iterations of the Gibbs sampler; $\beta$ varies in steps of 0.1 from 0.3 to 1.2]
  • 28. Potts Models — If there are $G$ colors and $n_{i,g}$ denotes the number of neighbors of $i \in I$ with color $g$ ($1 \le g \le G$), that is, $n_{i,g} = \sum_{j \sim i} \mathbb{I}_{x_j = g}$, then the full conditional distribution of $x_i$ is chosen to satisfy $\pi(x_i = g \mid x_{-i}) \propto \exp(\beta n_{i,g})$. This choice corresponds to the Potts model, whose joint density is given by $$\pi(x) \propto \exp\Bigl(\beta \sum_{j \sim i} \mathbb{I}_{x_j = x_i}\Bigr)\,.$$
  • 29. Simulating from the Potts Models (1) — Algorithm (Potts Metropolis–Hastings Sampler). Initialization: for $i \in I$, generate independently $x_i^{(0)} \sim \mathcal{U}(\{1, \ldots, G\})$. Iteration $t$ ($t \ge 1$): (1) generate $u = (u_i)_{i \in I}$, a random ordering of the elements of $I$; (2) for $1 \le \ell \le |I|$, generate $$\tilde x_{u_\ell}^{(t)} \sim \mathcal{U}\bigl(\{1, \ldots, x_{u_\ell}^{(t-1)} - 1, x_{u_\ell}^{(t-1)} + 1, \ldots, G\}\bigr)\,,$$
  • 30. Simulating from the Potts Models (2) — Algorithm (continued): compute the $n_{u_\ell,g}^{(t)}$ and $$\rho_\ell = \exp\bigl(\beta n_{u_\ell,\tilde x}^{(t)}\bigr) \big/ \exp\bigl(\beta n_{u_\ell,x_{u_\ell}}^{(t)}\bigr) \wedge 1\,,$$ and set $x_{u_\ell}^{(t)}$ equal to $\tilde x_{u_\ell}$ with probability $\rho_\ell$.
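A sketch of one sweep of this Metropolis–Hastings sampler, again on a small lattice with a four-neighbor structure (the function name `potts_mh_sweep` is illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def potts_mh_sweep(x, beta, G):
    """One MH sweep for the G-color Potts model on a 2-D lattice: each
    site gets a uniform proposal over the other G-1 colors, accepted
    with probability exp(beta*n_prop)/exp(beta*n_cur) ∧ 1."""
    n, m = x.shape
    for s in rng.permutation(n * m):              # random site ordering u
        i, j = divmod(s, m)
        nbrs = [x[a, b] for a, b in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                if 0 <= a < n and 0 <= b < m]
        cur = x[i, j]
        prop = rng.integers(1, G)                 # uniform over the other
        if prop >= cur:                           # G-1 colors, coded 1..G
            prop += 1
        n_cur = sum(v == cur for v in nbrs)       # neighbors matching cur
        n_prop = sum(v == prop for v in nbrs)     # neighbors matching prop
        if rng.random() < min(1.0, np.exp(beta * (n_prop - n_cur))):
            x[i, j] = prop
    return x

x = rng.integers(1, 5, size=(16, 16))             # G = 4, uniform start
for _ in range(20):
    x = potts_mh_sweep(x, beta=0.8, G=4)
print(sorted(int(v) for v in np.unique(x)))
```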
  • 31. Simulating from the Potts Models (3) — [figure: simulations from the Potts model with four grey levels and a four-neighbor neighborhood structure based on 1,000 iterations of the Metropolis–Hastings sampler; $\beta$ varies in steps of 0.1 from 0.3 to 1.2]
  • 32. Posterior Inference (1) — The prior on $x$ is a Potts model with $G$ categories, $$\pi(x \mid \beta) = \frac{1}{Z(\beta)} \exp\Bigl(\beta \sum_{i \in I} \sum_{j \sim i} \mathbb{I}_{x_j = x_i}\Bigr)\,,$$ where $Z(\beta)$ is the normalizing constant. Given $x$, we assume that the observations in $y$ are independent normal random variables, $$f(y \mid x, \sigma^2, \mu_1, \ldots, \mu_G) = \prod_{i \in I} \frac{1}{(2\pi\sigma^2)^{1/2}} \exp\Bigl(-\frac{1}{2\sigma^2}(y_i - \mu_{x_i})^2\Bigr)\,.$$
  • 33. Posterior Inference (2) — Priors: $\beta \sim \mathcal{U}([0, 2])$, $\mu = (\mu_1, \ldots, \mu_G) \sim \mathcal{U}(\{\mu;\, 0 \le \mu_1 \le \cdots \le \mu_G \le 255\})$, $\pi(\sigma^2) \propto \sigma^{-2} \mathbb{I}_{]0,\infty[}(\sigma^2)$, the last prior corresponding to a uniform prior on $\log \sigma$.
  • 34. Posterior Inference (3) — Posterior distribution: $$\pi(x, \beta, \sigma^2, \mu \mid y) \propto \pi(\beta, \sigma^2, \mu) \times \frac{1}{Z(\beta)} \exp\Bigl(\beta \sum_{i \in I} \sum_{j \sim i} \mathbb{I}_{x_j = x_i}\Bigr) \times \prod_{i \in I} \frac{1}{(2\pi\sigma^2)^{1/2}} \exp\Bigl(-\frac{1}{2\sigma^2}(y_i - \mu_{x_i})^2\Bigr)\,.$$
  • 35. Full conditionals (1) — $$P(x_i = g \mid y, \beta, \sigma^2, \mu) \propto \exp\Bigl\{\beta \sum_{j \sim i} \mathbb{I}_{x_j = g} - \frac{1}{2\sigma^2}(y_i - \mu_g)^2\Bigr\}$$ can be simulated directly. With $n_g = \sum_{i \in I} \mathbb{I}_{x_i = g}$ and $s_g = \sum_{i \in I} \mathbb{I}_{x_i = g}\, y_i$, the full conditional distribution of $\mu_g$ is a truncated normal distribution on $[\mu_{g-1}, \mu_{g+1}]$ with mean $s_g/n_g$ and variance $\sigma^2/n_g$.
  • 36. Full conditionals (2) — The full conditional distribution of $\sigma^2$ is an inverse gamma distribution with parameters $|I|/2$ and $\sum_{i \in I}(y_i - \mu_{x_i})^2/2$. The full conditional distribution of $\beta$ is such that $$\pi(\beta \mid y) \propto \frac{1}{Z(\beta)} \exp\Bigl(\beta \sum_{i \in I} \sum_{j \sim i} \mathbb{I}_{x_j = x_i}\Bigr)\,.$$
  • 37. Path sampling Approximation (1) — $Z(\beta) = \sum_x \exp\{\beta S(x)\}$, where $S(x) = \sum_{i \in I} \sum_{j \sim i} \mathbb{I}_{x_j = x_i}$. Then $$\frac{dZ(\beta)}{d\beta} = \sum_x S(x) \exp\{\beta S(x)\} = Z(\beta)\, \mathbb{E}_\beta[S(X)]\,, \qquad \frac{d}{d\beta} \log Z(\beta) = \mathbb{E}_\beta[S(X)]\,.$$
  • 38. Path sampling Approximation (2) — Path sampling identity: $$\log\{Z(\beta_1)/Z(\beta_0)\} = \int_{\beta_0}^{\beta_1} \mathbb{E}_\beta[S(x)]\, d\beta\,.$$
  • 39. Path sampling Approximation (3) — For a given value of $\beta$, $\mathbb{E}_\beta[S(X)]$ can be approximated from an MCMC sequence. The integral itself can be approximated by computing the value of $f(\beta) = \mathbb{E}_\beta[S(X)]$ for a finite number of values of $\beta$ and approximating $f(\beta)$ by a piecewise-linear function $\hat f(\beta)$ for the intermediate values of $\beta$.
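Once $\mathbb{E}_\beta[S(X)]$ has been estimated on a grid, the piecewise-linear approximation of the path sampling integral is just the trapezoidal rule. A sketch with a stand-in for $f(\beta)$ (in practice the `s_means` values would come from MCMC runs at each grid point):

```python
import numpy as np

def log_z_ratio(betas, s_means):
    """Path sampling: approximate log Z(beta_1) - log Z(beta_0) by the
    trapezoidal rule applied to f(beta) = E_beta[S(X)] on a grid."""
    b = np.asarray(betas, dtype=float)
    f = np.asarray(s_means, dtype=float)
    return np.sum((b[1:] - b[:-1]) * (f[1:] + f[:-1]) / 2.0)

# illustrative grid: for a linear f the trapezoidal rule is exact
betas = np.linspace(0.0, 2.0, 21)
s_means = 10000.0 + 5000.0 * betas     # stand-in for MCMC estimates of E_beta[S(X)]
print(log_z_ratio(betas, s_means))     # ≈ 30000 (exact for linear f, up to rounding)
```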
  • 40. Path sampling Approximation (3) — [figure: approximation of $f(\beta)$ for the Potts model on a 100 × 100 image, a four-neighbor neighborhood, and $G = 6$, based on 1,500 MCMC iterations after burn-in]
  • 41. Posterior Approximation (1) — [figure: Menteith dataset — sequences of the $\mu_g$'s based on 2,000 iterations of the hybrid Gibbs sampler (read row-wise from $\mu_1$ to $\mu_6$)]
  • 42. Posterior Approximation (2) — [figure: histograms of the $\mu_g$'s]
  • 43. Posterior Approximation (3) — [figure: raw plots and histograms of the $\sigma^2$'s and $\beta$'s based on 2,000 iterations of the hybrid Gibbs sampler]
  • 44. Image segmentation (1) — Based on $(x^{(t)})_{1 \le t \le T}$, an estimator of $x$ needs to be derived from an evaluation of the consequences of wrong allocations. Two common ways correspond to two loss functions, $$L_1(x, \hat x) = \sum_{i \in I} \mathbb{I}_{x_i \ne \hat x_i}\,, \qquad L_2(x, \hat x) = \mathbb{I}_{x \ne \hat x}\,,$$ with corresponding estimators $$\hat x_i^{MPM} = \arg\max_{1 \le g \le G} P^\pi(x_i = g \mid y)\,, \quad i \in I\,, \qquad \hat x^{MAP} = \arg\max_x \pi(x \mid y)\,.$$
  • 45. Image segmentation (2) — $\hat x^{MPM}$ and $\hat x^{MAP}$ are not available in closed form! Approximation: $$\hat x_i^{MPM} = \arg\max_{g \in \{1,\ldots,G\}} \sum_{j=1}^{N} \mathbb{I}_{x_i^{(j)} = g}\,,$$ based on a simulated sequence $x^{(1)}, \ldots, x^{(N)}$.
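The MPM approximation is a pixelwise majority vote over the simulated images; a minimal sketch (the function name `mpm_estimate` is illustrative, and ties are broken toward the lower color index):

```python
import numpy as np

def mpm_estimate(samples, G):
    """MPM estimate: for each pixel, the color appearing most often
    across the simulated images x^(1), ..., x^(N); colors coded 1..G."""
    samples = np.asarray(samples)                 # shape (N, n, m)
    counts = np.stack([(samples == g).sum(axis=0) for g in range(1, G + 1)])
    return counts.argmax(axis=0) + 1              # back to colors 1..G

# three simulated 2 x 2 "images" with G = 2 colors
sims = [np.array([[1, 2], [2, 2]]),
        np.array([[1, 1], [2, 1]]),
        np.array([[1, 2], [2, 2]])]
print(mpm_estimate(sims, G=2))                    # pixelwise majority vote
```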
  • 46. Image segmentation (3) — [figure: (top) segmented image based on the MPM estimate produced after 2,000 iterations of the Gibbs sampler and (bottom) the observed image]