
6.3.1 Estimating the Number of MRFs

In the previous discussions on parameter estimation of multiple MRFs, the number of distinct MRF models (regions, textures, etc.), here denoted by $K$, is assumed to be known. In completely unsupervised estimation and clustering schemes, this number is an unknown parameter and has to be estimated along with the other parameters. The selection of the number of models is generally a difficult issue in cluster analysis [Jain and Dubes 1988]. Various criteria have been proposed for the estimation of $K$. They can be classified into two broad categories: heuristic approaches and statistical approaches. An example of the former category is the bootstrap technique [Jain and Moreau 1987]; there, the value of $K$ which provides the most stable partitions (in terms of a heuristic criterion) is the estimate of the number of clusters in the data set. In the following, a particular class of statistically based criteria is discussed.

Let $K$ be the number to be chosen from the set of numbers $\{1,\ldots,K_{\max}\}$, $\hat{\theta}_K$ the ML estimate of the model parameter vector of the $K$ models, $L(\hat{\theta}_K)$ the likelihood maximized with $\hat{\theta}_K$, $M_K$ the number of independent adjustable parameters in the $K$ models, and $N$ the sample size. Various maximization criteria take the form [Sclove 1987]

$$C(K) = \ln L(\hat{\theta}_K) - a(N)\,M_K - b(K,N) \eqno(6.81)$$

Basically, the value of $C(K)$ increases with the maximized likelihood but, owing to the penalty terms, decreases with the value $M_K$. Criteria differ in the choices of $a(N)$ and $b(K,N)$.

For $a=b=0$, the $K$ which maximizes $C(K)$ reduces to the maximum likelihood estimate. Such an estimate has been shown to be biased [Sakamoto et al. 1987]. The Akaike information criterion (AIC) [Akaike 1974] is obtained as an expected entropy, giving $a(N)=1$ and $b=0$. The AIC is shown to be asymptotically unbiased [Sakamoto et al. 1987]. From the Bayes viewpoint [Schwarz 1978; Kashyap 1988], $C(K)$ can be obtained by expanding the posterior probability of the model given the data. In [Schwarz 1978], $a(N)=\frac{1}{2}\ln N$ and $b=0$. Kashyap has the same $a(N)$ as Schwarz but with $b(K,N)=\frac{1}{2}\ln\det B(\hat{\theta}_K)$, where $B$ is the Hessian of $\ln L$ evaluated at the maximum likelihood estimate for the given $K$. Rissanen (1978) formulates $C(K)$ in terms of the minimum description length, which again penalizes each free parameter by roughly $\frac{1}{2}\ln N$. Some results comparing the performance of the criteria are presented in [Sclove 1987], showing that Schwarz's criterion gives better results than the AIC.
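To make the role of the penalty weight concrete, here is a minimal sketch of selecting $K$ by maximizing a penalized log-likelihood of the form (6.81), comparing the AIC penalty $a(N)=1$ with Schwarz's $a(N)=\frac{1}{2}\ln N$. The function and variable names are illustrative, and the log-likelihood values are made-up toy numbers, not results from any of the cited experiments.

```python
import math

def select_model_order(log_likelihoods, n_params, N, a=None, b=None):
    """Pick the K maximizing C(K) = lnL_K - a(N)*M_K - b(K, N).

    log_likelihoods: dict {K: maximized log-likelihood ln L(theta_K)}
    n_params:        dict {K: number M_K of free parameters}
    a: penalty weight as a function of N (defaults to 1.0, the AIC choice)
    b: extra penalty as a function of (K, N) (defaults to 0)
    """
    a_val = a(N) if a else 1.0
    scores = {}
    for K, lnL in log_likelihoods.items():
        b_val = b(K, N) if b else 0.0
        scores[K] = lnL - a_val * n_params[K] - b_val
    best = max(scores, key=scores.get)
    return best, scores

# Toy numbers: the likelihood grows with K but with diminishing returns.
lnL = {1: -520.0, 2: -455.0, 3: -451.0, 4: -450.5}
M = {1: 2, 2: 5, 3: 8, 4: 11}
N = 400

k_aic, _ = select_model_order(lnL, M, N, a=lambda n: 1.0)                # AIC: a(N) = 1
k_bic, _ = select_model_order(lnL, M, N, a=lambda n: 0.5 * math.log(n))  # Schwarz: a(N) = (1/2) ln N
# With these numbers AIC picks K = 3 while Schwarz's heavier penalty picks K = 2.
```

The example shows the qualitative behavior discussed above: because Schwarz's $a(N)$ grows with the sample size, it discounts small likelihood gains from extra parameters more aggressively than the AIC does.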

Sclove (1983) uses the AIC to verify the number of region classes in image segmentation. Zhang and Modestino (1990) present a cluster-validation scheme based on the AIC [Akaike 1974] and argue that the AIC approach is advantageous over other heuristic schemes for cluster validation. Some modified criteria based on (6.81) have also been proposed. Bouman and Liu (1991) adopt a modified criterion for texture segmentation in which each texture is modeled as a Gaussian AR random field [Jain 1981] with $H$ nonzero prediction coefficients. They replace the likelihood function with the joint function of the data and the labeling $f$, where $f$ determines the partition of the set of sites into regions. The criterion is optimized with respect to the parameters, the partition $f$, and the number of textures $K$. Won and Derin (1992) propose a particular choice of $a(N)$ involving a prespecified constant $c$ for determining the number of texture regions.
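The idea behind information-criterion-based cluster validation can be illustrated with a small sketch. The code below is not Zhang and Modestino's implementation: it fits 1-D Gaussian mixtures with a plain EM loop (all names are invented for the example) and picks the component count $K$ maximizing the Schwarz-penalized log-likelihood $\ln L - \frac{1}{2}(\ln N)M_K$, taking $M_K = 3K-1$ free parameters (means, variances, and mixing weights).

```python
import math
import random

def normal_pdf(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

def em_gmm_1d(data, K, iters=100):
    """Fit a K-component 1-D Gaussian mixture by EM.
    Returns (log-likelihood, M_K) with M_K = 3K - 1 free parameters."""
    n = len(data)
    xs = sorted(data)
    mean = sum(data) / n
    mu = [xs[(2 * k + 1) * n // (2 * K)] for k in range(K)]  # quantile init
    var = [sum((x - mean) ** 2 for x in data) / n] * K
    w = [1.0 / K] * K
    lnL = float("-inf")
    for _ in range(iters):
        # E-step: responsibilities and current log-likelihood.
        resp, lnL = [], 0.0
        for x in data:
            p = [w[k] * normal_pdf(x, mu[k], var[k]) for k in range(K)]
            s = sum(p) or 1e-300
            lnL += math.log(s)
            resp.append([pk / s for pk in p])
        # M-step: reestimate weights, means, variances.
        for k in range(K):
            nk = max(sum(r[k] for r in resp), 1e-9)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = max(sum(r[k] * (x - mu[k]) ** 2
                             for r, x in zip(resp, data)) / nk, 1e-6)
            w[k] = nk / n
    return lnL, 3 * K - 1

random.seed(0)
data = ([random.gauss(0.0, 1.0) for _ in range(200)] +
        [random.gauss(8.0, 1.0) for _ in range(200)])
N = len(data)

scores = {}
for K in range(1, 5):
    lnL, M_K = em_gmm_1d(data, K)
    scores[K] = lnL - 0.5 * math.log(N) * M_K  # Schwarz criterion
best_K = max(scores, key=scores.get)
```

On this two-mode sample the criterion should recover $K=2$: the likelihood gain from a third component is far smaller than the $\frac{1}{2}(\ln N)$ penalty per extra parameter.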


