3D mixture-of-Gaussians classification (RGB)

Visualize distributions

gobj1 = PlotRepresentativeSample3D[data1, 2000, pointStyle1] ;
gobj2 = PlotRepresentativeSample3D[data2, 2000, pointStyle2] ;
gdist1 = Show[{gobj1, gobj2}, yS, ImageSize -> 288] ;

[Graphics: 3D scatter of sampled RGB values from both objects (../HTMLFiles/index_87.gif)]

EM experiment #1: object 1 (Kellogs car)

Normalize 3D data
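The notebook does not show the normalization code itself. Assuming the usual convention of scaling 8-bit RGB values into the unit cube, a minimal sketch (in Python for illustration; the original is Mathematica, and the function name here is an invention):

```python
import numpy as np

def normalize_rgb(pixels):
    """Scale 8-bit RGB triples (0-255) into the unit cube [0, 1]^3,
    a common preprocessing step before fitting a 3D Gaussian mixture."""
    return np.asarray(pixels, dtype=float) / 255.0
```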

Choose number of Gaussians in mixture model

Assumes full covariance matrices

Compute EM algorithm

Timing[em = EM[d3d1, initFull, 0.001] ;]

{3.12 Second, Null}
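`EM` is a notebook-local Mathematica function whose definition is not shown here; the call above fits a full-covariance Gaussian mixture to the normalized data with convergence tolerance 0.001. As a rough illustration of what such a routine computes, here is a self-contained NumPy sketch (initialization scheme, regularization, and names are assumptions, not the notebook's code):

```python
import numpy as np

def em_full_cov(X, k, tol=1e-3, max_iter=200, seed=0):
    """Fit a k-component full-covariance Gaussian mixture to X (n x d)
    by EM; returns (weights, means, covariances, log-likelihood trace)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Initialize: random data points as means, pooled covariance for all.
    mu = X[rng.choice(n, k, replace=False)]
    cov = np.stack([np.cov(X.T) + 1e-6 * np.eye(d)] * k)
    w = np.full(k, 1.0 / k)
    trace = []
    for _ in range(max_iter):
        # E-step: per-component log joint densities log w_j + log N(x | mu_j, cov_j).
        logp = np.empty((n, k))
        for j in range(k):
            diff = X - mu[j]
            _, logdet = np.linalg.slogdet(cov[j])
            maha = np.einsum('ni,ij,nj->n', diff, np.linalg.inv(cov[j]), diff)
            logp[:, j] = np.log(w[j]) - 0.5 * (d * np.log(2 * np.pi) + logdet + maha)
        # Total log-likelihood via a stable log-sum-exp over components.
        m = logp.max(axis=1, keepdims=True)
        ll = (m.squeeze() + np.log(np.exp(logp - m).sum(axis=1))).sum()
        r = np.exp(logp - m)
        r /= r.sum(axis=1, keepdims=True)  # responsibilities
        # M-step: update weights, means, covariances from responsibilities.
        nk = r.sum(axis=0)
        w = nk / n
        mu = (r.T @ X) / nk[:, None]
        for j in range(k):
            diff = X - mu[j]
            cov[j] = (r[:, j, None] * diff).T @ diff / nk[j] + 1e-6 * np.eye(d)
        trace.append(ll)
        if len(trace) > 1 and trace[-1] - trace[-2] < tol:
            break
    return w, mu, cov, trace
```

The returned trace is what the notebook's log-likelihood plots below display: EM guarantees it is (up to the small covariance regularizer) nondecreasing in the iteration number.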

Save solution

Plot log-likelihood of data given the model as a function of EM iteration

PlotLogLikelihood[em, d3d1] ;

[Graphics: log-likelihood vs. EM iteration for object 1 (../HTMLFiles/index_97.gif)]

EM experiment #2: object 2 (Tide car)

Normalize 3D data

Choose number of Gaussians in mixture model

Assumes full covariance matrices

Compute EM algorithm

Timing[em = EM[d3d2, initFull, 0.001] ;]

{4.75 Second, Null}

Save solution

Plot log-likelihood of data given the model as a function of EM iteration

PlotLogLikelihood[em, d3d2] ;

[Graphics: log-likelihood vs. EM iteration for object 2 (../HTMLFiles/index_105.gif)]

Image classification

lpdf1 = Log[MixtureLikelihoodNative[{x, y, z}, model1]] ;
lpdf2 = Log[MixtureLikelihoodNative[{x, y, z}, model2]] ;
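Each pixel is then assigned to whichever fitted mixture gives it the higher log-likelihood. A NumPy sketch of this decision rule (illustrative only; `mixture_logpdf` and `classify_pixels` are invented names, and each model is assumed to be a (weights, means, covariances) triple):

```python
import numpy as np

def mixture_logpdf(X, w, mu, cov):
    """Log-density of each row of X (n x d) under a full-covariance
    Gaussian mixture with weights w, means mu, covariances cov."""
    n, d = X.shape
    logs = np.empty((n, len(w)))
    for j, (wj, mj, cj) in enumerate(zip(w, mu, cov)):
        diff = X - mj
        _, logdet = np.linalg.slogdet(cj)
        maha = np.einsum('ni,ij,nj->n', diff, np.linalg.inv(cj), diff)
        logs[:, j] = np.log(wj) - 0.5 * (d * np.log(2 * np.pi) + logdet + maha)
    m = logs.max(axis=1, keepdims=True)
    return m.squeeze(1) + np.log(np.exp(logs - m).sum(axis=1))

def classify_pixels(X, model1, model2):
    """Pixelwise maximum-likelihood classification:
    label 1 where model1 is more likely, label 2 otherwise."""
    return np.where(mixture_logpdf(X, *model1) >=
                    mixture_logpdf(X, *model2), 1, 2)
```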

class1 = ClassifyImage3DAugmented[obj1, lpdf1, lpd ... 255.] ;
Show[GraphicsArray[{{obj1, class1}, {obj2, class2}}], ImageSize -> 400] ;

[Graphics: original and classified images for both objects (../HTMLFiles/index_108.gif)]

Error plot

cnt1 = CountPixels[class1] ;
cnt2 = CountPixels[class2] ;
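`CountPixels` is another notebook-local helper. Assuming it tallies how many pixels of each classified image were assigned to each class (the quantity the bar chart below shows), a minimal sketch:

```python
import numpy as np

def count_labels(classified):
    """Count pixels assigned to class 1 and class 2 in a label array,
    mirroring the notebook's CountPixels step (assumed behavior)."""
    labels = np.asarray(classified)
    return int((labels == 1).sum()), int((labels == 2).sum())
```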

bar = {cnt1, cnt2} // Transpose ;
BarChart[bar[[1]], bar[[2]], BarStyle -> {pointStyle1[[ ... , Frame -> True, BarLabels -> {"Kellogs", "Tide"}, imSize] ;

[Graphics: bar chart of per-class pixel counts for the two objects (../HTMLFiles/index_111.gif)]


Created by Mathematica  (September 8, 2003)