
8.2.1 Iterated Conditional Modes

Since it is difficult to maximize the joint probability of an MRF, Besag (1986) proposed a deterministic algorithm called iterated conditional modes (ICM), which maximizes local conditional probabilities sequentially. The ICM algorithm uses the ``greedy'' strategy in the iterative local maximization. Given the data $d$ and the other labels $f^{(k)}_{S-\{i\}}$, the algorithm sequentially updates each $f^{(k)}_i$ into $f^{(k+1)}_i$ by maximizing $P(f_i \mid d, f_{S-\{i\}})$, the conditional (posterior) probability, with respect to $f_i$.

Two assumptions are made in calculating $P(f_i \mid d, f_{S-\{i\}})$: First, the observation components $d_1, \ldots, d_m$ are conditionally independent given $f$, and each $d_i$ has the same known conditional density function $p(d_i \mid f_i)$ dependent only on $f_i$. Thus

\[ p(d \mid f) = \prod_{i} p(d_i \mid f_i)  \qquad (8.11) \]

The second assumption is that $f$ depends on the labels in the local neighborhood, which is the Markovianity. From the two assumptions and the Bayes theorem, it follows that

\[ P(f_i \mid d, f_{S-\{i\}}) \propto p(d_i \mid f_i)\, P(f_i \mid f_{N_i})  \qquad (8.12) \]

Obviously, $P(f_i \mid d, f_{N_i})$ is much easier to maximize than $P(f \mid d)$, which is the point of ICM.

Maximizing (8.12) is equivalent to minimizing the corresponding posterior potential using the following rule

\[ f_i \leftarrow \arg\min_{f_i} E(f_i \mid d_i, f_{N_i}) \]

where

\[ E(f_i \mid d_i, f_{N_i}) = V(d_i \mid f_i) + \sum_{i' \in N_i} V_2(f_i, f_{i'}) \]

For example, for the discrete restoration formulated in Section 2.2.2, the posterior potential for (2.19) is

\[ E(f_i \mid d_i, f_{N_i}) = \frac{(f_i - d_i)^2}{2\sigma^2} + v_{20}\, n_i \]

where $n_i$ is the number of neighboring sites whose labels differ from $f_i$.

For discrete $f_i$, $E(f_i \mid d_i, f_{N_i})$ is evaluated with each $f_i \in \mathcal{L}$, and the label causing the lowest value is chosen as the new value for $f_i$. When applied to each $i$ in turn, the above defines an updating cycle of ICM. The iteration continues until convergence. The convergence is guaranteed for the serial updating and is rapid [Besag 1986].
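The discrete updating cycle above can be sketched as follows for a 1-D two-level restoration. The posterior potential is the one from the restoration example, $E(f_i) = (f_i - d_i)^2/(2\sigma^2) + v_{20}\, n_i$; the particular values of $\sigma$, $v_{20}$, and the test signal are illustrative choices, not from the text.

```python
import numpy as np

def icm_restore(d, labels, sigma=1.0, v20=1.0, max_cycles=20):
    """Discrete ICM restoration of a 1-D signal (illustrative sketch).

    Per-site posterior potential, as in the text's restoration example:
        E(f_i) = (f_i - d_i)^2 / (2 sigma^2) + v20 * n_i,
    where n_i counts neighbors whose labels differ from f_i.
    """
    f = d.copy()                      # ML initialization: f^(0) = d
    n = len(f)
    for _ in range(max_cycles):
        changed = False
        for i in range(n):            # serial (raster-order) updating
            neighbors = [f[j] for j in (i - 1, i + 1) if 0 <= j < n]
            best_label, best_energy = f[i], None
            for lab in labels:        # evaluate E for each candidate label
                n_i = sum(lab != g for g in neighbors)
                energy = (lab - d[i]) ** 2 / (2 * sigma ** 2) + v20 * n_i
                if best_energy is None or energy < best_energy:
                    best_label, best_energy = lab, energy
            if best_label != f[i]:
                f[i], changed = best_label, True
        if not changed:               # converged: a full cycle with no change
            break
    return f

# Usage: clean up isolated flips in a noisy two-level signal
d = np.array([0, 0, 1, 0, 0, 1, 1, 0, 1, 1])
f_hat = icm_restore(d, labels=(0, 1), sigma=1.0, v20=0.4)
# → [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]
```

With $v_{20} = 0.4$ the smoothness term outweighs the data term only for sites whose two neighbors both disagree, so the two isolated flips are removed in one cycle.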

The result obtained by ICM depends very much on the initial estimate $f^{(0)}$, as widely reported. Currently, it is not known how to set the initialization properly to obtain a good solution. A natural choice for $f^{(0)}$ is the maximum likelihood estimate

\[ f^{(0)} = \arg\max_{f} p(d \mid f) \]

which is the same as the data, $f^{(0)} = d$, when the noise is identically, independently distributed Gaussian. ICM can also be applied to problems where $f_i$ takes a continuous value. In minimizing (2.22) for continuous restoration, for example, one needs to maximize

\[ P(f_i \mid d, f_{N_i}) \]

for each $i$. To do this, it is necessary to solve $\mathrm{d}E(f_i \mid d_i, f_{N_i}) / \mathrm{d}f_i = 0$.
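In the continuous case the site-wise minimization can often be done in closed form. As a sketch, assume a quadratic smoothness potential $V_2(f_i, f_{i'}) = \lambda (f_i - f_{i'})^2$ (an illustrative choice, not necessarily the exact form of (2.22)); then setting $\mathrm{d}E/\mathrm{d}f_i = 0$ gives a weighted average of the datum and the neighboring labels:

```python
def icm_continuous_update(f, d, i, sigma=1.0, lam=0.5):
    """One ICM site update for continuous restoration, assuming the
    quadratic prior V2(f_i, f_i') = lam * (f_i - f_i')^2 (an assumed form).

    For E(f_i) = (f_i - d_i)^2 / (2 sigma^2) + lam * sum_{i'} (f_i - f_i')^2,
    dE/df_i = 0 yields the closed-form minimizer:
        f_i = (d_i / sigma^2 + 2 lam * sum f_i') / (1 / sigma^2 + 2 lam n_i).
    """
    neighbors = [f[j] for j in (i - 1, i + 1) if 0 <= j < len(f)]
    num = d[i] / sigma ** 2 + 2 * lam * sum(neighbors)
    den = 1 / sigma ** 2 + 2 * lam * len(neighbors)
    return num / den

# Usage: the update pulls an outlying value toward its neighbors
f = [0.0, 2.0, 0.0]
d = [0.0, 2.0, 0.0]
f1 = icm_continuous_update(f, d, 1, sigma=1.0, lam=0.5)  # → 2/3
```

Since $E(f_i)$ is quadratic in $f_i$, the stationary point is the unique minimizer, so no inner search over labels is needed at each site.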

In a genuine MRF algorithm, no two neighboring sites should be updated simultaneously. The ``coding method'' [Besag 1974] may be incorporated into ICM to parallelize the iteration. Using codings, the sites in $S$ are partitioned into several sets such that no two sites in one set are neighbors (cf. Section 6.1.3). Therefore, all $f_i$ on a single coding can be updated in parallel.
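For a 2-D lattice with the 4-neighborhood, two codings suffice and form a checkerboard; a minimal sketch of the partition (the function name is an illustrative choice):

```python
import numpy as np

def codings_4neighborhood(h, w):
    """Partition an h-by-w lattice into two codings (a checkerboard) so that
    no two sites within one coding are 4-neighbors; all f_i on one coding
    can then be updated in parallel within an ICM iteration."""
    parity = np.add.outer(np.arange(h), np.arange(w)) % 2  # (i + j) mod 2
    black = np.argwhere(parity == 0)   # coding 1
    white = np.argwhere(parity == 1)   # coding 2
    return black, white

# Usage: one parallel ICM cycle would update all "black" sites, then all
# "white" sites, since sites of equal parity share no 4-neighbor edges.
black, white = codings_4neighborhood(3, 3)
```

Larger neighborhood systems (e.g. the 8-neighborhood) require more than two codings, which is why the text speaks of "several sets" in general.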