Abstract
We consider Bayesian shrinkage predictions for the Normal regression problem under the frequentist Kullback-Leibler risk function. First, we consider the multivariate Normal model with an unknown mean and a known covariance. While the unknown mean is fixed, the covariance of future samples may differ from that of the training samples. We show that the Bayesian predictive distribution based on the uniform prior is dominated by that based on a class of priors if the prior distributions for the covariance and future covariance matrices are rotation invariant. Next, we consider a class of priors for the mean parameters that depend on the future covariance matrix. With such a prior, we can construct a Bayesian predictive distribution dominating the one based on the uniform prior. Finally, applying this result to the prediction of response variables in the Normal linear regression model, we show that there exists a Bayesian predictive distribution dominating the one based on the uniform prior. Minimaxity of these Bayesian predictions follows from these results.
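For context, a minimal sketch of the standard framework the abstract refers to; the notation ($y$, $\tilde y$, $\mu$, $\Sigma$, $\tilde\Sigma$, $\pi$) is assumed here for illustration and is not taken from the paper itself.

```latex
% Sketch of the standard setup (notation assumed, not from the paper):
% training sample y ~ N_d(mu, Sigma), future sample ytilde ~ N_d(mu, Sigmatilde),
% with mu unknown and the covariance matrices known.
\[
  p_{\pi}(\tilde y \mid y)
    = \frac{\int p(\tilde y \mid \mu, \tilde\Sigma)\, p(y \mid \mu, \Sigma)\, \pi(\mathrm{d}\mu)}
           {\int p(y \mid \mu, \Sigma)\, \pi(\mathrm{d}\mu)},
\qquad
  R(\mu, \hat p)
    = \int p(y \mid \mu, \Sigma)
      \int p(\tilde y \mid \mu, \tilde\Sigma)
      \log \frac{p(\tilde y \mid \mu, \tilde\Sigma)}{\hat p(\tilde y \mid y)}
      \,\mathrm{d}\tilde y \,\mathrm{d}y .
\]
% Here p_pi is the Bayesian predictive density under a prior pi, and R is the
% frequentist Kullback-Leibler risk. A predictive density \hat p_1 dominates
% \hat p_2 if R(mu, \hat p_1) <= R(mu, \hat p_2) for all mu, with strict
% inequality for some mu.
```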
| Original language | English |
| --- | --- |
| Pages (from-to) | 1888-1905 |
| Number of pages | 18 |
| Journal | Journal of Multivariate Analysis |
| Volume | 99 |
| Issue number | 9 |
| DOIs | |
| Publication status | Published - 2008 Oct |
| Externally published | Yes |
Keywords
- 62C10
- 62F07
- 62F15
- 62J07
- Bayesian prediction
- Kullback-Leibler divergence
- Minimaxity
- Normal regression
- Shrinkage estimation
- Superharmonic function
ASJC Scopus subject areas
- Statistics and Probability
- Numerical Analysis
- Statistics, Probability and Uncertainty