MATHEMATICS
Estimating parameters in a regression model with dependent noises
M. A. Povzun, E. A. Pchelintsev, Tomsk State University, Tomsk, Russian Federation
Abstract:
Let the observations on the probability space $(\Omega, \mathcal{F}, \mathrm{P})$ be described by the equation
\begin{equation}
Y=\theta+\nu\xi, \tag{1}
\end{equation}
where $\theta\in\Theta\subset\mathbb{R}^d$ is a vector of unknown parameters, $\nu$ is a known positive number, $\xi$ is the
vector of first $d$ values of the $\mathrm{AR(p)/ARCH(q)}$ process which satisfies the equation
\begin{equation}
\xi_t=\beta_0+\sum_{i=1}^p\beta_i\xi_{t-i}+\sqrt{\alpha_0+\sum_{j=1}^q\alpha_j\xi^2_{t-j}}\,\varepsilon_t.\tag{2}
\end{equation}
We suppose that the noise $\xi$ has a conditionally Gaussian distribution with respect to some
$\sigma$-algebra $\mathcal{G}$, with zero mean and a conditional covariance matrix $D(\mathcal{G})$ such that
$$
\operatorname{tr} D(\mathcal{G})-\lambda_{\max}(D(\mathcal{G}))\geqslant\kappa(d)\geqslant0
$$
and
$$
\mathrm{E}\lambda_{\max}(D(\mathcal{G}))\leqslant\lambda^*.
$$
Let $\xi_0$ be a random variable with a zero mean and variance $s^2$. The matrix $D(\mathcal{G})$ may depend
on $\nu, \beta_i,\alpha_j,s^2$. The coefficients $\alpha_0,\dots,\alpha_q$ are assumed to be nonnegative. The noise $(\varepsilon_t)_{t\geqslant0}$ in (2) is a sequence of i.i.d. random variables with a finite mean and constant variance $\sigma^2$ [13]. The
nuisance parameters $(\beta_i)_{1\leqslant i\leqslant p}$, $(\alpha_j)_{1\leqslant j\leqslant q}$, and $s^2$
of the noise are unknown.
The problem is to estimate the vector of unknown parameters $\theta=(\theta_1,\dots, \theta_d)$ in the model (1)
by observations $Y$.
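For illustration, the following is a minimal Python sketch of how observations from model (1) with AR(1)/ARCH(1) noise (2) could be simulated; the function name, the choice $p=q=1$, and all numeric parameter values are assumptions made for the example, not taken from the paper.
```python
# A minimal simulation sketch for model (1) with AR(1)/ARCH(1) noise (2).
# The function name and all numeric values are illustrative assumptions.
import numpy as np

def simulate_observation(theta, nu, beta0, beta1, alpha0, alpha1, rng):
    """Return Y = theta + nu*xi, where xi collects the first d values of (2)."""
    d = len(theta)
    xi = np.empty(d)
    xi_prev = rng.normal(0.0, 1.0)            # xi_0: zero mean, variance s^2 = 1 (assumed)
    for t in range(d):
        eps = rng.normal(0.0, 1.0)            # i.i.d. noise epsilon_t
        xi[t] = beta0 + beta1 * xi_prev + np.sqrt(alpha0 + alpha1 * xi_prev**2) * eps
        xi_prev = xi[t]
    return theta + nu * xi

rng = np.random.default_rng(0)
theta = np.ones(100)                           # illustrative true parameter, d = 100
Y = simulate_observation(theta, nu=0.5, beta0=0.0, beta1=0.3,
                         alpha0=0.1, alpha1=0.2, rng=rng)
```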
It is known that, in the class of linear unbiased estimators, the best one is the least-squares
estimator (LSE)
\begin{equation}
\hat{\theta}=Y.\tag{3}
\end{equation}
However, in the case of pulse-type disturbances, for example, such an estimate may have low
accuracy. In [8–12], special modifications of this estimate were developed for discrete and
continuous models with dependent conditionally Gaussian noises. Following this approach, this
paper proposes the following shrinkage procedure for estimating the parameter $\theta$:
\begin{equation}
\theta^*=\left(1-\frac c{|Y|}\right)Y,
\tag{4}
\end{equation}
where $c=\nu^2\kappa(d)\delta_d$, $\delta_d=\left(\rho+\sqrt{2\lambda^*}\nu\frac{\Gamma((d+1)/2)}{\Gamma(d/2)}\right)^{-1}$, $\rho=\sup\limits_{\theta\in\Theta}\{|\theta|\}$.
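As an illustration, here is a sketch of computing the estimate (4), assuming the quantities $\kappa(d)$, $\lambda^*$, and $\rho$ are supplied by the user; the function name and interface are hypothetical.
```python
# A sketch of the shrinkage estimate (4); kappa_d, lambda_star, and rho are
# model-dependent quantities assumed to be supplied by the user.
import numpy as np
from scipy.special import gammaln

def shrinkage_estimate(Y, nu, kappa_d, lambda_star, rho):
    """Compute theta* = (1 - c/|Y|) Y with c = nu^2 * kappa(d) * delta_d."""
    d = len(Y)
    log_ratio = gammaln((d + 1) / 2) - gammaln(d / 2)   # log of Gamma((d+1)/2)/Gamma(d/2)
    delta_d = 1.0 / (rho + np.sqrt(2.0 * lambda_star) * nu * np.exp(log_ratio))
    c = nu**2 * kappa_d * delta_d
    return (1.0 - c / np.linalg.norm(Y)) * Y
```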
The main result of this paper is the following theorem.
Theorem. There exists an integer $d_0\geqslant2$ such that, for any $d\geqslant d_0$, the estimate $\theta^*$ given by (4) outperforms the LSE (3) in mean square accuracy. Moreover, the minimal gain in
mean square accuracy satisfies the inequality
$$
\Delta(\theta)=R(\theta^*,\theta)-R(\hat{\theta},\theta)\leqslant-c^2.
$$
The results of numerical simulation of the empirical risks of the proposed improved estimate
and LSE for the $\mathrm{AR(1)/ARCH(1)}$ noise model confirm the statement of the theorem.
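The following sketch shows the kind of Monte Carlo comparison of empirical risks meant here, reusing the hypothetical simulate_observation and shrinkage_estimate helpers from the sketches above; the replication count and parameter values are illustrative and do not reproduce the paper's exact experiment.
```python
# An illustrative Monte Carlo comparison of the empirical mean square risks of
# the LSE (3) and the shrinkage estimate (4); all values below are assumptions.
n_rep = 10_000
risk_lse = risk_star = 0.0
rho = np.linalg.norm(theta)                    # sup over Theta of |theta|, taken at the true theta
kappa_d, lambda_star = 50.0, 2.0               # assumed values of kappa(d) and lambda*
for _ in range(n_rep):
    Y = simulate_observation(theta, 0.5, 0.0, 0.3, 0.1, 0.2, rng)
    risk_lse += np.sum((Y - theta) ** 2)
    risk_star += np.sum((shrinkage_estimate(Y, 0.5, kappa_d, lambda_star, rho) - theta) ** 2)
print("empirical risk, LSE:      ", risk_lse / n_rep)
print("empirical risk, shrinkage:", risk_star / n_rep)
```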
Keywords:
regression, improved estimation, mean square risk, conditionally Gaussian noise, $\mathrm{AR/ARCH}$ process.
Received: 11.07.2017
Citation:
M. A. Povzun, E. A. Pchelintsev, “Estimating parameters in a regression model with dependent noises”, Vestn. Tomsk. Gos. Univ. Mat. Mekh., 2017, no. 49, 43–51
Linking options:
https://www.mathnet.ru/eng/vtgu606 https://www.mathnet.ru/eng/vtgu/y2017/i49/p43