STAT 804
Lecture 11
Likelihood Theory
First we review likelihood theory for conditional and full
maximum likelihood estimation.
Suppose the data is X = (Y, Z) and write the density of X as
\[
f_X(x;\theta) = f_{Y|Z}(y|z;\theta)\, f_Z(z;\theta) .
\]
Differentiate the identity
\[
\int f_{Y|Z}(y|z;\theta)\, dy = 1
\]
with respect to \(\theta_j\) (the jth component of \(\theta\))
and pull the derivative under the integral sign to get
\[
\int U_{Y|Z,j}(\theta)\, f_{Y|Z}(y|z;\theta)\, dy = 0 ,
\]
where \(U_{Y|Z,j}\) is the jth component of
\[
U_{Y|Z}(\theta) = \frac{\partial \log f_{Y|Z}(Y|Z;\theta)}{\partial\theta} ,
\]
the derivative of the log conditional
likelihood; \(U_{Y|Z}\) is called a conditional score. Since
this identity says precisely that
\(E_\theta(U_{Y|Z} \mid Z) = 0\),
we may take expected values to see
that
\[
E_\theta(U_{Y|Z}) = 0 .
\]
It is also true
that the other two scores
\[
U_X(\theta) = \frac{\partial \log f_X(X;\theta)}{\partial\theta}
\]
and
\[
U_Z(\theta) = \frac{\partial \log f_Z(Z;\theta)}{\partial\theta}
\]
have mean 0 (when \(\theta\) is the
true value of the parameter).
Differentiate the identity a further time with respect to \(\theta_k\) to get
\[
\int \left[ \frac{\partial^2 \log f_{Y|Z}}{\partial\theta_j\,\partial\theta_k}
 + U_{Y|Z,j}\, U_{Y|Z,k} \right] f_{Y|Z}(y|z;\theta)\, dy = 0 .
\]
We define the conditional Fisher information matrix \(I_{Y|Z}(\theta)\)
to have jkth entry
\[
E_\theta\!\left( U_{Y|Z,j}\, U_{Y|Z,k} \mid Z \right)
\]
and get
\[
I_{Y|Z}(\theta)_{jk}
= -E_\theta\!\left( \frac{\partial^2 \log f_{Y|Z}}{\partial\theta_j\,\partial\theta_k}
 \,\Big|\, Z \right) .
\]
The corresponding identities based on \(f_X\) and \(f_Z\) are
\[
I_X(\theta)_{jk} = E_\theta\!\left( U_{X,j}\, U_{X,k} \right)
= -E_\theta\!\left( \frac{\partial^2 \log f_X}{\partial\theta_j\,\partial\theta_k} \right)
\]
and
\[
I_Z(\theta)_{jk} = E_\theta\!\left( U_{Z,j}\, U_{Z,k} \right)
= -E_\theta\!\left( \frac{\partial^2 \log f_Z}{\partial\theta_j\,\partial\theta_k} \right) .
\]
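Both identities — the mean-zero score and the equality of the two forms of the information — are easy to check by Monte Carlo. The following sketch (an illustration, not part of the notes; the exponential model and parameter value are arbitrary choices) does this for \(f(x;\theta) = \theta e^{-\theta x}\):

```python
import numpy as np

# Monte Carlo check of the score identities for an illustrative model:
# X ~ Exponential with rate theta, so log f = log(theta) - theta*x,
# the score is U = 1/theta - X, and the Fisher information is 1/theta^2.
rng = np.random.default_rng(0)
theta = 2.0
x = rng.exponential(scale=1.0 / theta, size=200_000)

u = 1.0 / theta - x       # score, one value per simulated observation
print(u.mean())           # close to 0, as E_theta(U) = 0
print((u**2).mean())      # close to 1/theta^2 = 0.25, as E_theta(U^2) = I(theta)
```

The same check works for any density smooth enough that the derivative can be pulled under the integral sign.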
Now let's look at the model
\[
X_k = \rho X_{k-1} + \epsilon_k , \qquad \epsilon_k \ \text{iid}\ N(0,\sigma^2) .
\]
Putting \(\theta = (\rho, \sigma)\), \(Y = (X_1, \ldots, X_T)\)
and \(Z = X_0\) we find (up to an additive constant)
\[
\ell(\theta) \equiv \log f_{Y|Z}
= -T \log\sigma - \frac{1}{2\sigma^2} \sum_{k=1}^T (X_k - \rho X_{k-1})^2 ,
\]
so that
\[
\frac{\partial\ell}{\partial\rho}
= \frac{1}{\sigma^2} \sum_{k=1}^T X_{k-1} (X_k - \rho X_{k-1}) ,
\qquad
\frac{\partial\ell}{\partial\sigma}
= -\frac{T}{\sigma} + \frac{1}{\sigma^3} \sum_{k=1}^T (X_k - \rho X_{k-1})^2 .
\]
Differentiating again gives the matrix of second derivatives
\[
\frac{\partial^2\ell}{\partial\rho^2}
= -\frac{1}{\sigma^2} \sum_{k=1}^T X_{k-1}^2 ,
\qquad
\frac{\partial^2\ell}{\partial\rho\,\partial\sigma}
= -\frac{2}{\sigma^3} \sum_{k=1}^T X_{k-1} (X_k - \rho X_{k-1}) ,
\]
\[
\frac{\partial^2\ell}{\partial\sigma^2}
= \frac{T}{\sigma^2} - \frac{3}{\sigma^4} \sum_{k=1}^T (X_k - \rho X_{k-1})^2 .
\]
Taking conditional expectations given \(X_0\) (the errors
\(\epsilon_k = X_k - \rho X_{k-1}\) have conditional mean 0 and conditional
variance \(\sigma^2\)) gives
\[
-E\!\left( \frac{\partial^2\ell}{\partial\rho^2} \,\Big|\, X_0 \right)
= \frac{1}{\sigma^2} \sum_{k=1}^T E(X_{k-1}^2 \mid X_0) ,
\qquad
-E\!\left( \frac{\partial^2\ell}{\partial\rho\,\partial\sigma} \,\Big|\, X_0 \right) = 0 ,
\]
\[
-E\!\left( \frac{\partial^2\ell}{\partial\sigma^2} \,\Big|\, X_0 \right)
= \frac{2T}{\sigma^2} .
\]
To compute
\[
W_k \equiv E(X_k^2 \mid X_0)
\]
write
\[
X_k^2 = \rho^2 X_{k-1}^2 + 2\rho X_{k-1}\epsilon_k + \epsilon_k^2
\]
and get
\[
W_k = \rho^2 W_{k-1} + \sigma^2 ,
\]
with \(W_0 = X_0^2\).
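The recursion is easy to iterate numerically. A minimal sketch (the values of rho, sigma and X_0 below are arbitrary choices for illustration):

```python
# Iterate W_k = rho^2 * W_{k-1} + sigma^2 from W_0 = X_0^2 and compare
# with the limiting value sigma^2 / (1 - rho^2) derived below.
rho, sigma, x0 = 0.6, 1.5, 3.0
w = x0**2
for k in range(1, 101):
    w = rho**2 * w + sigma**2
print(w, sigma**2 / (1 - rho**2))  # the two agree closely after ~100 steps
```

The geometric factor rho**2 < 1 wipes out the dependence on the starting value X_0 exponentially fast, which is the mechanism behind the remark at the end of this section.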
You can check carefully that in fact \(W_k\) converges to some
\(W_\infty\) as \(k \to \infty\), provided \(|\rho| < 1\).
This \(W_\infty\)
satisfies
\[
W_\infty = \rho^2 W_\infty + \sigma^2 ,
\]
which gives
\[
W_\infty = \frac{\sigma^2}{1-\rho^2} .
\]
It follows that, for large T,
\[
\frac{1}{\sigma^2} \sum_{k=1}^T W_{k-1} \approx \frac{T}{1-\rho^2} ,
\]
so that the conditional Fisher information matrix is approximately
\[
I_{Y|Z} \approx
\begin{pmatrix}
T/(1-\rho^2) & 0 \\
0 & 2T/\sigma^2
\end{pmatrix} .
\]
Notice that although the conditional Fisher information
might have been expected to depend on X_0, it does not, at least for
long series.
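This insensitivity to X_0 can be seen by simulation. The sketch below (an illustration, not from the notes; parameter values are arbitrary) averages the rho-entry of the observed information, sum(X_{k-1}^2)/sigma^2, over repeated simulated series started from two very different values of X_0, and compares with T/(1 - rho^2):

```python
import numpy as np

# Simulation check for the AR(1) model X_k = rho*X_{k-1} + eps_k,
# eps_k ~ N(0, sigma^2): the conditional expectation of
# sum(X_{k-1}^2)/sigma^2 should be close to T/(1 - rho^2) for a long
# series, whatever starting value X_0 is used.
rng = np.random.default_rng(1)
rho, sigma, T = 0.5, 2.0, 5000

def info_rho(x0, reps=200):
    """Average sum(X_{k-1}^2)/sigma^2 over repeated simulated series."""
    total = 0.0
    for _ in range(reps):
        eps = sigma * rng.standard_normal(T)
        x, s = x0, 0.0
        for e in eps:
            s += x * x
            x = rho * x + e
        total += s / sigma**2
    return total / reps

print(info_rho(0.0), info_rho(10.0), T / (1 - rho**2))
# all three are close: the information hardly depends on X_0 for long series
```

Starting the series at X_0 = 10 (far from the stationary distribution) changes the average information by well under one percent when T = 5000, consistent with the transient in the W_k recursion dying out geometrically.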
Richard Lockhart
1999-11-01