Bayesian and frequentist methods for approximate inference in generalized linear mixed models

Closed-form expressions for the likelihood and the predictive density in the generalized linear mixed model setting are often unavailable because they involve integrating a nonlinear function over a high-dimensional space. We derive approximations to these quantities that are useful for estimation and prediction from both a Bayesian and a frequentist point of view. Our asymptotic approximations hold under the assumption that the sample size grows at a faster rate than the number of random effects.

The first part of the thesis presents results related to frequentist methodology. We derive an approximation to the log-likelihood of the parameters which, when maximized, yields estimates with low mean squared error compared to other methods. Similar techniques are used for the prediction of the random effects, where we propose an approximate predictive density from the Gaussian family of densities. Our simulations show that the predictions obtained by our method are comparable to those of other, computationally intensive methods. Particular attention is given to the analysis of spatial data; as an example, an analysis of the rhizoctonia root rot data is presented.

The second part of the thesis is concerned with the Bayesian prediction of the random effects. First, an approximation to the Bayesian predictive distribution function is derived, which can be used to obtain prediction intervals for the random effects without the use of Monte Carlo methods. In addition, given a prior for the covariance parameters of the random effects, we derive approximations to the coverage probability bias and the Kullback–Leibler divergence of the predictive distribution constructed under that prior. A simulation study is performed in which we compute these quantities for different priors, in order to select the prior with the smallest coverage probability bias and Kullback–Leibler divergence.
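To illustrate the kind of intractable integral the thesis approximates, the following is a minimal sketch of a standard Laplace approximation to the marginal log-likelihood of a toy Poisson GLMM with one random intercept per group. This is a generic textbook construction for illustration only, not the thesis's specific higher-order approximation; the model, function names, and parameter values are assumptions introduced here.

```python
import numpy as np
from scipy.special import gammaln

def laplace_loglik(y_groups, beta, sigma2):
    """Laplace approximation to the marginal log-likelihood of a toy
    Poisson GLMM with one random intercept per group:
        y_ij ~ Poisson(exp(beta + b_i)),   b_i ~ N(0, sigma2).
    Each group's intractable integral over b_i is replaced by a Gaussian
    approximation around the mode of the joint log-density log p(y_i, b_i)."""
    total = 0.0
    for y in y_groups:
        y = np.asarray(y, dtype=float)
        n, s = len(y), y.sum()

        def h(b):  # joint log-density log p(y, b) for this group
            eta = beta + b
            return (s * eta - n * np.exp(eta) - gammaln(y + 1).sum()
                    - 0.5 * b ** 2 / sigma2 - 0.5 * np.log(2 * np.pi * sigma2))

        # Newton iterations for the mode b_hat (h is strictly concave in b)
        b = 0.0
        for _ in range(100):
            grad = s - n * np.exp(beta + b) - b / sigma2   # h'(b)
            hess = n * np.exp(beta + b) + 1.0 / sigma2     # -h''(b) > 0
            step = grad / hess
            b += step
            if abs(step) < 1e-12:
                break

        # Laplace: log ∫ exp(h(b)) db ≈ h(b_hat) + 0.5 log(2π) - 0.5 log(-h''(b_hat))
        hess = n * np.exp(beta + b) + 1.0 / sigma2
        total += h(b) + 0.5 * np.log(2 * np.pi) - 0.5 * np.log(hess)
    return total
```

In practice one would maximize this approximate log-likelihood over `(beta, sigma2)` with a numerical optimizer such as `scipy.optimize.minimize`; the mode `b_hat` found for each group also serves as a natural point predictor of that group's random effect.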