We propose a general Bayesian method of comparing models. The approach is based on the Kullback-Leibler distance between two families of models, one nested within the other. For each parameter value ...
Bayesian variable selection has recently achieved considerable empirical success in a variety of applications in which the number K of explanatory variables $(x_{1},\ldots ,x_{K})$ is possibly much larger than the ...
Many response variables are handled poorly by regression models when the errors are assumed to be normally distributed. For example, modeling the damaged/not damaged state of cells after treatment with ...
In this module, we will introduce generalized linear models (GLMs) through the study of binomial data. In particular, we will motivate the need for GLMs; introduce the binomial regression model, ...
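To make the binomial regression setup in this module concrete, below is a minimal sketch of fitting a logistic (binomial) GLM by iteratively reweighted least squares (IRLS), the standard fitting algorithm for GLMs. The data here are simulated for illustration, and the function name `fit_binomial_glm` and the chosen coefficients are my own assumptions, not from the excerpted texts.

```python
import numpy as np

def fit_binomial_glm(X, y, n_iter=25, tol=1e-8):
    """Fit a binomial GLM with logit link by iteratively reweighted least squares.

    This is an illustrative sketch: no regularization, no separation checks.
    """
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        eta = X @ beta
        mu = 1.0 / (1.0 + np.exp(-eta))   # inverse logit link: mean response
        W = mu * (1.0 - mu)               # binomial variance function as IRLS weights
        z = eta + (y - mu) / W            # working response
        # Weighted least-squares step: solve (X'WX) beta = X'Wz
        beta_new = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta

# Simulated example: binary outcomes from a known logistic model
rng = np.random.default_rng(0)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + one covariate
true_beta = np.array([-0.5, 1.2])
p = 1.0 / (1.0 + np.exp(-(X @ true_beta)))
y = (rng.random(n) < p).astype(float)

beta_hat = fit_binomial_glm(X, y)
```

With a few hundred observations, `beta_hat` should land close to the generating coefficients; the same IRLS skeleton extends to other GLM families by swapping the link inverse and variance function.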
In the early 1970s, statisticians had difficulty in analysing data where the random variation of the errors did not come from the bell-shaped normal distribution. Besides normality, these traditional ...