On two-dimensional spectral realization


[The scanned page begins with the closing portion of the preceding correspondence.]

Given the observations $y$, we suppose that associated with each pixel $l \in L$ is an independent probability distribution $p(l) = (p_1(l), p_2(l), \dots, p_M(l))$ over $S$. Write $p = (p(l),\ l \in L)$ and $E_p$ for the expectation under $p$. We now wish to determine the distribution $p$ which maximizes $E_p \Psi(x)$; that is, we wish to maximize $E_p \Psi(x)$ subject to the constraints

$$p_j(l) \ge 0, \qquad \forall\, l \in L,\ j = 1, \dots, M,$$

and

$$\sum_{j=1}^{M} p_j(l) = 1, \qquad \forall\, l \in L.$$

To effect this, consider real variables $\rho_j(l)$ such that $\rho_j(l)^2 = p_j(l)$. Then we require $\sum_{j=1}^{M} \rho_j(l)^2 = 1$ for all $l \in L$. Write

$$L(\rho, \lambda) = E_\rho \Psi(x) + \sum_{l \in L} \lambda_l \Big( \sum_{j=1}^{M} \rho_j(l)^2 - 1 \Big).$$

Differentiating $L(\rho, \lambda)$ with respect to $\rho_j(l)$ and $\lambda_l$, for $1 \le j \le M$ and $1 \le l \le L$, gives a sparse system of $(M+1)L$ equations in $(M+1)L$ unknowns. Once a candidate for the critical value $\hat{\rho} = (\hat{\rho}_j(l),\ j = 1, \dots, M,\ l \in L)$ has been found, an estimate $\hat{x} = (\hat{x}_l,\ l \in L)$ is obtained by choosing, for each $l \in L$, the state $e_j$ in $S$ corresponding to the maximum value in $\hat{\rho}(l) = (\hat{\rho}_1(l), \dots, \hat{\rho}_M(l))$. For example, to simplify the notation, let us consider the case when the observations $y$ are not "blurred"; then $y_l = \langle g, X_l \rangle + \langle \sigma, X_l \rangle w_l$.
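As a rough illustration of this scheme (a sketch under stated assumptions, not the authors' procedure), the Python snippet below replaces the direct solution of the stationarity system by projected gradient ascent on the unit spheres $\sum_j \rho_j(l)^2 = 1$, using a toy separable objective built from the unblurred observation model above with Gaussian noise; the interaction terms of the actual functional are omitted, and all numerical values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (illustrative values): M states with levels g_j and noise scales sigma_j.
g = np.array([0.0, 1.0, 2.0])
sigma = np.array([0.3, 0.3, 0.3])
M, L_pix = len(g), 50

# Simulated unblurred observations y_l = g_{x_l} + sigma_{x_l} * w_l, w_l Gaussian.
x_true = rng.integers(0, M, size=L_pix)
y = g[x_true] + sigma[x_true] * rng.standard_normal(L_pix)

# Per-pixel, per-state log-likelihood; the expected objective under p is
# sum_l sum_j p_j(l) * ll[l, j] with p_j(l) = rho_j(l)**2.
ll = -np.log(sigma) - (y[:, None] - g[None, :]) ** 2 / (2.0 * sigma**2)

# Projected gradient ascent: keep each row of rho on the unit sphere by renormalizing.
rho = np.full((L_pix, M), 1.0 / np.sqrt(M))
step = 0.002                      # small enough to keep 1 + 2*step*ll positive here
for _ in range(500):
    rho += step * 2.0 * rho * ll  # gradient of sum_j rho_j(l)**2 * ll[l, j] in rho
    rho /= np.linalg.norm(rho, axis=1, keepdims=True)

# Decision rule: for each pixel, pick the state with the largest rho_j(l).
x_hat = np.argmax(np.abs(rho), axis=1)
print("agreement with the simulated truth:", np.mean(x_hat == x_true))
```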

Consequently,

[display equation involving $\log \sigma_i$, with sums over $i = 1, \dots, M$ and $l \in L$; not recoverable from the scan]

Let us write $k(l)$ for those $k$ such that $l \in N_k$. The critical values of the $\rho_j(l)$ are therefore determined by the sparse system of equations

[left-hand sides not recoverable from the scan] $= 0, \qquad 1 \le i \le M,\ 1 \le l \le L,$

together with the constraint equations $\sum_{j=1}^{M} \rho_j(l)^2 = 1$ for $l \in L$.

ACKNOWLEDGMENT

The authors are grateful to Donald Geman for comments on an early version of this paper.


On Two-Dimensional Spectral Realization

F. Gamboa and M. Lavielle

Abstract—Reconstruction of a spectral density function from a finite set of covariances can be performed by maximizing an entropy functional. The method of maximum entropy on the mean is used for computing a discrete version of this spectral density and allows one to give a new interpretation of these reconstruction methods. In fact, we show that the choice of the entropy is directly related to a prior distribution. In particular, we consider processes on $\mathbb{Z}^2$. Steepest descent procedures permit the numerical computation of discrete realizations for a wide class of entropies. To ensure the nonnegativity of the solution related to the Burg entropy, we present a new algorithm based on a fixed-point method and the Yule-Walker equations to compute this solution. Then, the solution of the dual problem is obtained as the limit of the trajectory of an ordinary differential equation.

Index Terms—Maximum entropy, spectral density, stationary process, Bayesian reconstruction, Markov random field.

Manuscript received July 23, 1992; revised November 17, 1993. F. Gamboa is with the Laboratoire de Statistiques, Université Paris-Sud, 91405 Orsay, France, and Université Paris-Nord, Institut Galilée, 93430 Villetaneuse, France. M. Lavielle is with the Laboratoire de Statistiques, Université Paris-Sud, 91405 Orsay, France, and Université René Descartes, U.F.R. Mathématique, 45 Rue des Saints-Pères, 75006 Paris. IEEE Log Number 9403711.

I. INTRODUCTION

Let $\Lambda$ be a given finite subset of $\mathbb{Z}^2$ such that $0 \in \Lambda$ and $-s \in \Lambda$ whenever $s \in \Lambda$. Let $f$ be a given continuous, nonnegative, even function on the two-dimensional torus $T^2 = \,]-\pi, \pi] \times ]-\pi, \pi]$. Therefore, $f$ can be viewed as the spectral density of a stationary real-valued process. Let $(X_s)_{s \in \mathbb{Z}^2}$ be such a process, so we have

$$r(t) := E(X_s X_{s+t}) = \int_{T^2} e^{i\langle t, \lambda\rangle} f(\lambda)\, d\lambda. \qquad (1.1)$$

Here, $d\lambda$ is the uniform probability on $T^2$ and $E$ denotes the expectation. Let $r_\Lambda := \{r(t)\}_{t \in \Lambda}$ be given. A spectral density which satisfies (1.1) for every $t \in \Lambda$ will be called an absolutely continuous (a.c.) realization of $(r(t))_{t \in \Lambda}$.
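The defining relation (1.1) is easy to check numerically. The Python sketch below (an illustration, not taken from the correspondence) approximates $r(t)$ for a small symmetric index set $\Lambda$ by averaging $e^{i\langle t,\lambda\rangle} f(\lambda)$ over a regular grid on $T^2$; the density $f$ and the set $\Lambda$ are assumptions chosen for the example.

```python
import numpy as np

N = 64                                      # grid size for the discretized torus
lam = 2 * np.pi * np.arange(N) / N - np.pi  # regular grid in ]-pi, pi]
L1, L2 = np.meshgrid(lam, lam, indexing="ij")

# Illustrative even, nonnegative spectral density on T^2 (not taken from the paper).
f = 1.0 + 0.5 * np.cos(L1) + 0.4 * np.cos(L2)

# Symmetric index set Lambda containing 0, as required in the Introduction.
Lam = [(0, 0), (1, 0), (-1, 0), (0, 1), (0, -1)]

def r(t):
    """r(t) = int_{T^2} exp(i<t, lambda>) f(lambda) dlambda, with dlambda the uniform
    probability on T^2, approximated here by the average over the N x N grid."""
    phase = np.exp(1j * (t[0] * L1 + t[1] * L2))
    return (phase * f).mean()

for t in Lam:
    print(t, round(float(r(t).real), 6))    # r is real and even, e.g. r((1, 0)) = 0.25
```

Because $d\lambda$ is the uniform probability on $T^2$, the integral reduces to a plain grid average, which is why no $1/(2\pi)^2$ factor appears in the code.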


To solve this problem, maximum entropy methods have been widely used and discussed [1]-[5]. Using the maximum entropy on the mean (MEM) techniques [6], we present here a new interpretation of these methods.

We begin by looking for a spectral measure $dF_N$ supported on the discrete torus $T_N$ which maximizes an entropy function $W$. Using the maximum entropy on the mean (MEM) principle, this measure is constructed as the mean of a distribution which minimizes the Kullback information with respect to a given prior measure [7], [8]. An absolutely continuous realization of $r_\Lambda$ is then defined as the weak limit of this sequence of discrete spectral measures. Using results of [7] and [8], we will obtain reconstructions of the form

$$f_\psi(\lambda) = \psi'\Big(\sum_{s \in \Lambda} u_s^* \exp(i\langle s, \lambda\rangle)\Big), \qquad (1.2)$$

where $\psi$ is an appropriate convex function and $u^*$ is chosen such that $f_\psi$ is a realization of $r_\Lambda$. Furthermore, $u^*$ is the limit of a sequence $(u_N^*)_{N > 0}$, where $u_N^*$ minimizes a function of the form

$$\frac{1}{N^2} \sum_{k_1=1}^{N} \sum_{k_2=1}^{N} \psi\Big(\sum_{s \in \Lambda} u_s \exp(i\langle s, \lambda_{k_1,k_2}\rangle)\Big) - \sum_{s \in \Lambda} u_s r(s), \qquad (1.3)$$

where the $\lambda_{k_1,k_2}$ are the points of the discrete torus $T_N$ defined below. For a given $N$, the coefficients $(u_{s,N}^*)_{s \in \Lambda}$ and the discrete version $f_{\psi,N}$ can be computed by using an optimization procedure.

We propose various realizations, obtained with different entropies such as the Shannon entropy, the Fermi-Dirac entropy, or the generalized Burg entropy, for which the realization possesses special probabilistic properties (in particular cases, the underlying process has an autoregressive (AR) or a Markov representation). The choice of the entropy is directly related to the prior distribution.
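To make this optimization step concrete, here is a minimal steepest-descent sketch for a criterion of the type (1.3), written for the illustrative choice $\psi(\tau) = e^\tau - 1$ (so that the discrete realization $\psi'(P) = e^P$ is nonnegative by construction); the index set, target covariances, grid size, and step size are assumptions for the example, not values from the paper.

```python
import numpy as np

# Steepest descent on a dual criterion of the type (1.3), for the illustrative
# choice psi(tau) = exp(tau) - 1; then f = psi'(P) = exp(P) is nonnegative by
# construction.  Index set, covariances, and step size are toy assumptions.
N = 32
k = 2 * np.pi * np.arange(1, N + 1) / N             # T_N grid: 2*k*pi/N, 1 <= k <= N
L1, L2 = np.meshgrid(k, k, indexing="ij")

# Half of a symmetric index set Lambda (r(-s) = r(s) is handled by the cosines).
Lam = [(0, 0), (1, 0), (0, 1)]
r_target = {(0, 0): 1.0, (1, 0): 0.25, (0, 1): 0.2}

def trig_poly(u):
    """P(lambda) = sum_s u_s cos(<s, lambda>) evaluated on the grid."""
    return sum(u[s] * np.cos(s[0] * L1 + s[1] * L2) for s in Lam)

u = {s: 0.0 for s in Lam}
step = 0.5
for _ in range(500):
    f_disc = np.exp(trig_poly(u))                   # discrete realization f_{psi,N}
    for s in Lam:
        # gradient of (1/N^2) sum_k psi(P(lambda_k)) - sum_s u_s r(s) in u_s
        grad_s = (f_disc * np.cos(s[0] * L1 + s[1] * L2)).mean() - r_target[s]
        u[s] -= step * grad_s

# At the minimizer the discrete realization reproduces the prescribed covariances.
f_disc = np.exp(trig_poly(u))
for s in Lam:
    print(s, round(float((f_disc * np.cos(s[0] * L1 + s[1] * L2)).mean()), 4))
```

At the minimizer, the vanishing gradient says precisely that the discrete realization reproduces the prescribed covariances on the grid, which is what the printed check verifies.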

[The next section's heading and opening words are not legible in the scan; the text resumes mid-sentence.] ... probability distribution on $]a, b[$. Let

$$\psi(\tau) := \log \int_a^b \exp(\tau y)\, dF(y). \qquad (2.1)$$

We will make the following assumptions on $F$:

(H1) The convex hull of the support of $F$ is $]a, b[$.

(H2) $\psi$ is defined on an open interval $D(F) := \,]-\infty, \alpha[$, with $\alpha \in \,]0, +\infty]$.

(H3) If $\alpha < \infty$, then $1/\psi'(\tau) = O(\tau - \alpha)$ as $\tau \to \alpha^-$.

Let

$$\psi^*(x) := \sup_{\tau} \{\tau x - \psi(\tau)\}$$

be the Legendre dual function of $\psi$. Our particular family of realizations is constructed using the maximum entropy principle, through the maximum entropy method on the mean (MEM). The MEM originates with the work of Navaza [6], where reconstruction problems in crystallography are considered.

Let us describe how this technique can be used for spectral realization. We discretize $T^2$ by setting $T_N := \{\lambda_{k_1,k_2} := (2k_1\pi/N,\ 2k_2\pi/N),\ 1 \le k_1, k_2 \le N\}$. At stage $N$, we look for a spectral measure supported on $T_N$ which fits our given covariances $\{r(t)\}_{t \in \Lambda}$. For this, we use the maximum entropy principle: we look for $P_N^\psi$, the probability of maximum entropy, defined by

$$K(P_N^\psi, F^{\otimes N^2}) := \min_{P \in \mathcal{A}} K(P, F^{\otimes N^2}), \qquad (2.3)$$

where the admissible class $\mathcal{A}$ consists of the probabilities $P$ on the discretized space under which the mean spectral measure reproduces the given covariances $\{r(t)\}_{t \in \Lambda}$ (the displayed definition of $\mathcal{A}$ is not legible in the scan). $K$ denotes the Kullback information (or cross entropy); that is, for $P$ and $Q$ probabilities on an arbitrary probability space,

$$K(P, Q) = \int \log\frac{dP}{dQ}\, dP, \quad \text{if } P \ll Q,$$

and $K(P, Q) = +\infty$ otherwise.
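To connect (2.1) with the claim that each entropy corresponds to a prior distribution, the following sketch evaluates $\psi$ and its Legendre dual numerically for one assumed prior, $F = \mathrm{Exp}(1)$ on $]0, +\infty[$; for this $F$, $\psi(\tau) = -\log(1 - \tau)$ and $\psi^*(x) = x - 1 - \log x$, a Burg-type entropy. The prior, the Monte Carlo sample size, and the search grid are illustrative assumptions.

```python
import numpy as np

# Log-Laplace transform psi of (2.1) and its Legendre dual, computed numerically
# for the assumed prior F = Exp(1) on ]0, +inf[ (a = 0, b = +inf).  For this F,
# psi(tau) = -log(1 - tau) on ]-inf, 1[, which serves as an exact check.
rng = np.random.default_rng(1)
y = rng.exponential(1.0, size=200_000)              # draws from F

def psi_mc(tau):
    """Monte Carlo estimate of psi(tau) = log E_F[exp(tau * Y)], eq. (2.1)."""
    return np.log(np.mean(np.exp(tau * y)))

def psi_exact(tau):
    return -np.log(1.0 - tau)

def psi_star(x, taus=np.linspace(-10.0, 0.95, 2000)):
    """Legendre dual psi*(x) = sup_tau (tau * x - psi(tau)), by grid search."""
    return np.max(taus * x - psi_exact(taus))

for tau in (-1.0, 0.0, 0.3):
    print(f"psi({tau:+.1f}): MC {psi_mc(tau):.4f}   exact {psi_exact(tau):.4f}")
for x in (0.5, 1.0, 2.0):
    # For this F the dual is x - 1 - log(x), i.e. a Burg-type entropy integrand.
    print(f"psi*({x:.1f}): grid {psi_star(x):.4f}   exact {x - 1.0 - np.log(x):.4f}")
```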