Hi! I have quite a specific question regarding the use of estimating equations, so I hope it's in the right place here. I'm currently reading a paper that uses estimating equations for variance estimation. I'm not familiar with estimating equations, am a bit confused about two of the equations used, and am in search of some help or perhaps some references.

The problem is the following: you have a dataset of N people, and assume that all N people have two repeated measures of an endpoint that are contaminated with measurement error (V_1 and V_2). Additionally, M people (M < N) have an error-free measure of the endpoint, Y. Further, we assume that the expected value of V equals \theta_0 + \theta_1 Y (and that the two measures of V are independent). Using this, we would like to form the estimating equations for the parameters \theta_0 and \theta_1 (and eventually use them to predict Y in all people without a measure of Y). According to the paper these are, respectively:

N^{-1}\sum_{i=1}^{N}(Y_i - \theta_0 - \theta_1 V_{1i})\, I_i \,\frac{N}{M} = 0

N^{-1}\sum_{i=1}^{N}(Y_i - \theta_0 - \theta_1 V_{1i})\, V_{2i}\, I_i \,\frac{N}{M} = 0

where I_i equals 1 if individual i belongs to the subset of individuals with a measure of Y, and 0 otherwise.
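To make the notation concrete, here is a small Python sketch (my own toy simulation, not from the paper, with hypothetical parameter values). It generates data under the assumed model E[V | Y] = a + bY with independent errors on the two repeats, then solves the two estimating equations above directly; since both are linear in \theta_0 and \theta_1, this is a 2×2 linear system over the I_i = 1 subset.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical simulation: E[V | Y] = a + b*Y, independent errors on repeats.
N, M = 5000, 1000            # M people also have the error-free Y
a, b = 2.0, 0.8
Y  = rng.normal(10, 2, N)
V1 = a + b * Y + rng.normal(0, 1, N)
V2 = a + b * Y + rng.normal(0, 1, N)
I  = np.zeros(N, dtype=bool)
I[:M] = True                 # indicator: Y observed for these individuals

# The two estimating equations restrict the sums to the I_i = 1 subset:
#   sum_i (Y_i - t0 - t1*V1_i) * I_i        = 0
#   sum_i (Y_i - t0 - t1*V1_i) * V2_i * I_i = 0
# Both are linear in (t0, t1), so solve them as a 2x2 system.
A = np.array([[I.sum(),      V1[I].sum()],
              [V2[I].sum(), (V1[I] * V2[I]).sum()]])
rhs = np.array([Y[I].sum(), (Y[I] * V2[I]).sum()])
t0, t1 = np.linalg.solve(A, rhs)

# Under this model the equations target the inverse calibration
# Y ~ -a/b + (1/b)*V1, so t1 should land near 1/b = 1.25.
print(t0, t1)
```

The constant N/M cancels from both equations, which is why it drops out of the linear system; it matters for the variance calculations, not for the point estimates.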

I’m a bit confused about why you would use only the first measure (V_1) inside the residual of both estimating equations, and then, for the estimating equation of \theta_1, multiply by the second measure (V_2). To me it would make more sense to take the mean of the two measures.

Really would appreciate help of any kind!

Best,

Linda


Linda, this is precisely why I am (methodologically) Bayesian. I bet it would be quicker to write out a reasonable toy model of the physiology, put some priors on its parameters, and estimate it with JAGS, than to read that paper (would you add a citation to your Q?) and retrofit its off-the-shelf model to your particular problem. I bet in fact you could then proceed to advance a criticism of the assumptions embedded in an estimating-equations approach, based on simulations from this very same model. (JAGS has the lovely feature that it is fully declarative, and therefore permits ‘forward’ simulation as well as ‘backward’ inference from data to parameters.)

One obvious problem with the reasonableness of said ‘retrofit’ would be the assumed independence of repeated measures within individuals, if indeed this estimating-equations approach requires this.
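The independence concern can be checked by exactly the kind of forward simulation suggested above. In my reading, the second estimating equation effectively uses V_2 as an instrument for V_1 (which is the usual reason to multiply by a second, independently-errored measure rather than by V_1 itself or the mean of the two); if the two repeats share an error component, that instrument is no longer valid. A hypothetical Python sketch, with made-up parameter values:

```python
import numpy as np

rng = np.random.default_rng(1)

def ee_solve(Y, V1, V2):
    """Solve the two (linear) estimating equations for (theta0, theta1).
    Y is taken as observed for everyone, purely for the simulation."""
    n = len(Y)
    A = np.array([[n,         V1.sum()],
                  [V2.sum(), (V1 * V2).sum()]])
    rhs = np.array([Y.sum(), (Y * V2).sum()])
    return np.linalg.solve(A, rhs)

n, a, b = 20000, 2.0, 0.8
Y = rng.normal(10, 2, n)

results = {}
for label, shared_sd in [("independent errors", 0.0),
                         ("shared error component", 1.0)]:
    u  = rng.normal(0, shared_sd, n)        # error common to both repeats
    V1 = a + b * Y + u + rng.normal(0, 1, n)
    V2 = a + b * Y + u + rng.normal(0, 1, n)
    results[label] = ee_solve(Y, V1, V2)[1]  # keep theta1

# theta1 targets 1/b = 1.25 only when the repeat errors are independent;
# a shared component pulls the estimate toward cov(Y,V2)/cov(V1,V2) < 1/b.
print(results)
```

With the shared component switched on, \theta_1 is noticeably attenuated, which is one way to put numbers on the criticism of the independence assumption.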


Thanks David! I’m certainly interested in finding new methods to deal with these kinds of problems in data, because the independence of the two measures is indeed a strong assumption that is often violated. It’s this paper, which implements a methodology proposed by Buonaccorsi in this book (I’m not completely sure about the original source, and I’m only able to put two links as I’m a new user here ;-)). The paper proposes an MLE and compares it with the Buonaccorsi method (where the estimating equations are needed for variance estimation). A (new?) Bayesian approach would be interesting.


Don’t despair, in my experience the various privileges on this platform accrue rapidly! Looking briefly at that first paper, I note that indeed it pursued a simulation agenda, and that to this end it actually wrote out its hierarchical models in just about the same manner you’d lay them out in JAGS or Stan. So the modeling work’s done … except for the “typing”! A great reference on Bayesian analysis of such models is [1].

1. Gelman A, Hill J. *Data Analysis Using Regression and Multilevel/Hierarchical Models*. Cambridge; New York: Cambridge University Press; 2007. [Home page for book]
