Generalization of the Akaike Information Criterion for Choosing Values of Continuous Parameters in Data Models

The crucial restriction of the Akaike Information Criterion (AIC) as a means of fitting a model to a given data set within a succession of nested parametric model classes is the assumption that the classes are rigidly defined by the growing dimension of an unknown parameter vector. We extend the Kullback information maximization principle underlying the classical AIC to a wider class of data models in which the dimension of the parameter is fixed, but the freedom of its values is softly constrained by a class of continuously nested a priori probability distributions. We illustrate the proposed continuous generalization of the AIC by applying it to the problem of time-varying regression estimation, which inevitably requires choosing the degree of time-variability of the regression coefficients, treated as a nonstationary model of the given signal.
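As a point of reference for the classical setting that the paper generalizes, the following sketch (not from the paper; all names and data are illustrative assumptions) applies the standard AIC to a succession of nested parametric classes, here polynomials of growing degree, and selects the class minimizing the criterion:

```python
# Illustrative sketch: classical AIC over nested model classes indexed by
# the dimension of the parameter vector (polynomial degree + 1).
import numpy as np

def aic(y, y_hat, k):
    """Gaussian-noise AIC up to an additive constant: n*ln(RSS/n) + 2k."""
    n = len(y)
    rss = float(np.sum((y - y_hat) ** 2))
    return n * np.log(rss / n) + 2 * k

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 60)
# Synthetic signal with true polynomial degree 2 plus noise (assumed data).
y = 1.0 + 2.0 * x - 3.0 * x**2 + rng.normal(0.0, 0.1, x.size)

scores = []
for deg in range(6):                       # nested classes: degree 0..5
    coeffs = np.polyfit(x, y, deg)
    y_hat = np.polyval(coeffs, x)
    scores.append(aic(y, y_hat, deg + 1))  # deg + 1 free parameters

best = int(np.argmin(scores))              # selected model dimension
```

Here the classes are "rigidly" nested: each degree strictly contains the previous one, and model choice reduces to picking a discrete dimension, which is precisely the restriction the continuous generalization relaxes.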
UDC: 004.9311