On a consistency of orthogonal series estimators with respect to Jacobi polynomials system.

Authors:
Novikov V. V. On a consistency of orthogonal series estimators with respect to Jacobi polynomials system // Taurida Journal of Computer Science Theory and Mathematics. 2019. Vol. 18, No. 2. P. 67-76.
DOI: https://doi.org/10.37279/1729-3901-2019-2-43-67-76

Consider a nonparametric regression model
$$Y_i = m(X_i) + \varepsilon_i, \quad i = 1, \ldots, n,$$
where $m(x)$ is the unknown regression function to be estimated, $\{(X_i, Y_i)\}^n_{i=1}$ is a dataset and $\{\varepsilon_i\}^n_{i=1}$ are observation errors. Suppose that the regression function can be represented as a Fourier series
$$m(x) = \sum^{\infty}_{j=0} \beta_j\varphi_j (x) ,$$
where the system of functions $\{\varphi_j(x)\}^\infty_{j=0}$ constitutes an orthonormal basis on $[-1, 1]$ with respect to the inner product
$$(f, g) = \int_{-1}^{1} f(x)g(x)\rho(x)\,dx,$$
and $\{\beta_j\}$ are the Fourier coefficients. Next, assume that the observations $\{Y_i\}^n_{i=1}$ have been taken at equidistant points $\{X_i\}^n_{i=1}$ in the interval $[-1, 1]$, and let $\{A_i\}^n_{i=1}$ be a set of disjoint intervals such that $\bigcup^n_{i=1}A_i = [-1, 1]$ and $X_i \in A_i$, $i = 1, \ldots, n$. Put
$$\hat{m}_{N(n)}(x) = \sum^{N(n)}_{j=0} \hat{\beta}_j\varphi_j(x), \qquad \hat{\beta}_j = \sum^{n}_{i=1}Y_i \int_{A_i} \varphi_j(x)\rho(x)\,dx,$$
where $N(n)$ is a suitable finite number. This estimator is called an orthogonal series estimator of $m(x)$.
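As an illustration (not part of the paper), the orthogonal series estimator defined above can be sketched in Python. The cell integrals $\int_{A_i} \varphi_j(x)\rho(x)\,dx$ are approximated here by one-point Riemann sums over the cells $A_i$, and the orthonormal Jacobi polynomials are built from `scipy.special.eval_jacobi` using the standard normalization constant; function names and the cell construction are this sketch's own choices.

```python
import numpy as np
from scipy.special import eval_jacobi, gammaln

def jacobi_phi(j, x, a, b):
    """Orthonormal Jacobi polynomial of degree j for the weight
    rho(x) = (1-x)^a (1+x)^b on [-1, 1]."""
    # log of the squared norm h_j of the classical Jacobi polynomial P_j^{(a,b)}
    log_h = ((a + b + 1) * np.log(2.0) - np.log(2 * j + a + b + 1)
             + gammaln(j + a + 1) + gammaln(j + b + 1)
             - gammaln(j + a + b + 1) - gammaln(j + 1))
    return eval_jacobi(j, a, b, x) / np.exp(0.5 * log_h)

def series_estimator(X, Y, N, a=0.0, b=0.0):
    """Return hat{m}_N as a callable; X must be sorted points in [-1, 1].

    beta_j = sum_i Y_i * integral_{A_i} phi_j * rho dx, with each integral
    approximated by phi_j(X_i) * rho(X_i) * |A_i| (one-point Riemann sum).
    """
    # cells A_i: split [-1, 1] at midpoints between consecutive design points
    edges = np.concatenate(([-1.0], 0.5 * (X[:-1] + X[1:]), [1.0]))
    rho = (1.0 - X) ** a * (1.0 + X) ** b
    w = np.diff(edges) * rho                      # |A_i| * rho(X_i)
    beta = np.array([np.sum(Y * jacobi_phi(j, X, a, b) * w)
                     for j in range(N + 1)])
    return lambda x: sum(beta[j] * jacobi_phi(j, x, a, b)
                         for j in range(N + 1))
```

With $a = b = 0$ the weight is constant and the $\varphi_j$ reduce to normalized Legendre polynomials; fitting noiseless data $Y_i = X_i$ on a fine equidistant grid then recovers $m(x) = x$ up to the Riemann-sum discretization error.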
In the present paper we give a consistency condition for $\hat{m}_{N(n)}(x)$ provided that the regression function $m(x)$ is Lipschitz continuous and $\varphi_j(x) = P^{(\alpha,\beta)}_j(x)$, $j = 0, 1, \ldots$, is the system of orthonormal Jacobi polynomials with certain restrictions on the exponents $\alpha, \beta$. The main result is as follows.

Theorem 1.$\ Suppose\ that\ the\ following\ conditions\ are\ satisfied:$
$\qquad i)\ E\varepsilon_i = 0,\ E(\varepsilon_i\varepsilon_j) = 0,\ i \neq j,\ and\ E\varepsilon^2_i < C,\ i = 1, \ldots, n;$
$\qquad ii)$
$$m(\cdot) \in \mathrm{Lip}_M 1;$$

$\qquad iii)$
$$p := \min\{\alpha; \beta\} \geq -1/2;$$
$\qquad iv)$
$$(N(n))^2 = o\{A_n(\alpha, \beta)\}, \quad n \to \infty,$$
$where\ A_n(\alpha, \beta) = n\ if\ p > -1/2,\ and\ A_n(\alpha, \beta) = n/\log n\ if\ p = -1/2.\ Then\ \hat{m}_N(x) \overset{p}{\rightarrow} m(x),\ N(n) \rightarrow \infty,\ for\ every\ x \in (-1, 1).$
Theorem 2.$\ Let\ conditions\ i)–iii)\ of\ the\ previous\ theorem\ be\ satisfied,\ q = \max\{\alpha; \beta\} < 1/2,\ and$
$$(N(n))^{2q+3} = o\{A_n(\alpha, \beta)\}, \quad n \to \infty.$$
$Then\ \hat{m}_N(x) \overset{p}{\rightarrow} m(x),\ N(n) \rightarrow \infty,\ for\ every\ x \in [-1, 1]$.
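As an illustration not stated in the abstract: when $p > -1/2$, so that $A_n(\alpha, \beta) = n$, condition iv) of Theorem 1 admits any truncation level of the form $N(n) = \lfloor n^{\gamma} \rfloor$ with $0 < \gamma < 1/2$, since
$$(N(n))^2 \leq n^{2\gamma} = o(n), \quad n \to \infty.$$
Under the stronger condition of Theorem 2 the same choice requires $\gamma < 1/(2q+3)$, because now $(N(n))^{2q+3} \leq n^{\gamma(2q+3)}$ must be $o(n)$; thus uniform consistency on the closed interval is bought at the price of a slower-growing $N(n)$.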
Keywords: nonparametric regression, consistency, estimator, orthogonal series, Jacobi polynomials

UDC: 519.23