Generalized predictive information criteria for the analysis of feature events

Mike K.P. So, Tomohiro Ando

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)


This paper develops two weighted measures for model selection by generalizing the Kullback-Leibler divergence measure. Using weighted measures, the authors introduce a model selection process that takes into account the special features of the underlying model. New information criteria are defined via a bias correction of an expected weighted log-likelihood estimator. Using weight functions matched to the features of interest in the underlying statistical models, the new information criteria are applied in simulation studies of spline regression and copula model selection. Real-data applications are also given for predicting the incidence of disease and for quantile modeling of environmental data.

Original language: English
Pages (from-to): 742-762
Number of pages: 21
Journal: Electronic Journal of Statistics
Issue number: 1
Publication status: Published - 2013
Externally published: Yes


Keywords

  • Feature matching
  • Information criteria
  • Model selection
  • Weighted Kullback-Leibler measure

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty
