Time Series: Theory and Methods (Foreign Literature Translation)


Undergraduate Graduation Project (Thesis)

Foreign Literature Translation

Time Series: Theory and Methods

Authors: Peter J. Brockwell, Richard A. Davis

Nationality: U.S.A.

Source: Peter J. Brockwell, Richard A. Davis. Time Series: Theory and Methods [M]. 2nd ed. Springer, 2008: 274-289.

Translated Text:

9.1 ARIMA Models for Nonstationary Time Series

We have already discussed the importance of the class of ARMA models for representing stationary series. A generalization of this class, which incorporates a wide range of nonstationary series, is provided by the ARIMA processes, i.e., processes that reduce to ARMA processes after being differenced finitely many times.

Definition 9.1.1 (The ARIMA(p, d, q) process). If d is a non-negative integer, then {X_t} is said to be an ARIMA(p, d, q) process if Y_t := (1 - B)^d X_t is a causal ARMA(p, q) process.

This definition means that {X_t} satisfies a difference equation of the form

φ*(B)X_t ≡ φ(B)(1 - B)^d X_t = θ(B)Z_t,   {Z_t} ~ WN(0, σ²),   (9.1.1)

where φ(z) and θ(z) are polynomials of degrees p and q respectively and φ(z) ≠ 0 for |z| ≤ 1. The polynomial φ*(z) has a zero of order d at z = 1. The process {X_t} is stationary if and only if d = 0, in which case it reduces to an ARMA(p, q) process.

Notice that if d ≥ 1, we can add an arbitrary polynomial trend of degree (d - 1) to {X_t} without violating the difference equation (9.1.1). ARIMA models are therefore useful for representing data with trend (see Sections 1.4 and 9.2). It should be noted, however, that ARIMA processes can also be appropriate for modelling series with no trend. Except when d = 0, the mean of {X_t} is not determined by equation (9.1.1). Since, for d ≥ 1, equation (9.1.1) determines the second-order properties of {(1 - B)^d X_t} but not those of {X_t} (Problem 9.1), estimation of φ, θ and σ² will be based on the observed differences (1 - B)^d X_t, and prediction requires additional assumptions (see Section 9.5).
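To make this concrete, here is a minimal Python sketch (an illustration of ours, not code from the book or from PEST; it assumes only numpy is available). It simulates an ARIMA(1, 1, 0) path and checks numerically that adding a constant, i.e. a polynomial trend of degree d - 1 = 0, leaves the differenced series, and hence equation (9.1.1), unchanged.

```python
import numpy as np

rng = np.random.default_rng(0)
n, phi = 300, 0.8

# Y_t = phi * Y_{t-1} + Z_t is a causal AR(1); X_t = X_0 + Y_1 + ... + Y_t is then
# an ARIMA(1, 1, 0) process, since (1 - B)X_t = Y_t.
z = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + z[t]
x = np.cumsum(y)                    # take X_0 = 0

# Adding a degree-0 polynomial trend (a constant) does not change the differences.
print(np.allclose(np.diff(x), np.diff(x + 17.3)))       # True

# The differenced series is the AR(1) itself, so its lag-1 sample autocorrelation
# should be close to phi = 0.8 (up to sampling error).
d = np.diff(x)
print(np.corrcoef(d[:-1], d[1:])[0, 1])
```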

Example 9.1.1. {X_t} is an ARIMA(1, 1, 0) process if, for some φ ∈ (-1, 1),

(1 - φB)(1 - B)X_t = Z_t,   {Z_t} ~ WN(0, σ²).

We can then write

X_t = X_0 + Y_1 + ... + Y_t,   t ≥ 1,

where

Y_t = (1 - B)X_t = Σ_{j=0}^∞ φ^j Z_{t-j}.

A realization of {X_t} with X_0 = 0, φ = .8 and σ² = 1 is shown in Figure 9.1, together with the sample autocorrelation and partial autocorrelation functions.

A distinctive feature of the data that suggests the appropriateness of an ARIMA model is the slowly decaying positive sample autocorrelation function seen in Figure 9.1. If, therefore, we were given only the data and wished to find an appropriate model, it would be natural to apply the operator ∇ = 1 - B repeatedly, in the hope that for some j the series {∇^j X_t} will have a rapidly decaying sample autocorrelation function, compatible with that of an ARMA process whose autoregressive polynomial has no zeroes near the unit circle. For the particular time series in this example, one application of the operator ∇ produces the realization shown in Figure 9.2, whose sample autocorrelation and partial autocorrelation functions suggest an AR(1) model for {∇X_t}. The maximum likelihood estimates of φ and σ² obtained from PEST (under the assumption that E(∇X_t) = 0) are .808 and .978 respectively, giving the model

(1 - .808B)∇X_t = Z_t,   {Z_t} ~ WN(0, .978),   (9.1.2)

which bears a close resemblance to the true underlying process,

(1 - .8B)(1 - B)X_t = Z_t,   {Z_t} ~ WN(0, 1).   (9.1.3)
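This difference-then-fit step can be reproduced with standard software. The sketch below is our own illustration, not the book's PEST session: it assumes statsmodels is installed and uses its ARIMA class as a stand-in for PEST's Gaussian maximum likelihood estimation. The fitted AR coefficient for the differenced data should land near the true value .8, although the exact numbers depend on the simulated sample.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)
n, phi = 200, 0.8

# Simulate the true underlying process (1 - 0.8B)(1 - B)X_t = Z_t with X_0 = 0.
z = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + z[t]
x = np.cumsum(y)

# Fitting ARIMA(1, 1, 0) differences the series once and estimates an AR(1) for
# (1 - B)X_t by Gaussian maximum likelihood, the analogue of the PEST fit in the text.
fit = ARIMA(x, order=(1, 1, 0), trend="n").fit()
print(fit.params)    # [ar.L1, sigma2]; ar.L1 should be close to 0.8
```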

Instead of differencing the series in Figure 9.1, we could proceed more directly by attempting to fit an AR(2) process, as suggested by the sample partial autocorrelation function. Maximum likelihood estimation, carried out using the program PEST under the assumption that EX_t = 0, gives the model (9.1.4),

which, although stationary, has coefficients closely resembling those of the true nonstationary process (9.1.3).

From a sample of finite length it will be extremely difficult to distinguish between a nonstationary process such as (9.1.3), whose autoregressive polynomial has a zero on the unit circle, and a process such as (9.1.4), which has very similar coefficients but whose autoregressive polynomial has all of its zeroes outside the unit circle. In either case, however, if it is possible by differencing to generate a series with a rapidly decaying sample autocorrelation function, then the differenced data can be fitted by a low-order ARMA process whose autoregressive polynomial has zeroes comfortably outside the unit circle. This means that the fitted parameters will be well away from the boundary of the allowable parameter set, which is desirable for the numerical computation of parameter estimates and can be quite critical for some methods of estimation. For example, if we apply the Yule-Walker equations to fit an AR(2) model to the data in Figure 9.1, we obtain the model

which bears little resemblance to either the maximum likelihood model (9.1.4) or the true model (9.1.3). In this case the matrix appearing in (8.1.7) is nearly singular.
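The contrast between the Yule-Walker and maximum likelihood fits near the unit circle is easy to reproduce. The following sketch is our own illustration under the assumption that statsmodels is available; yule_walker solves the sample Yule-Walker equations, while ARIMA performs Gaussian maximum likelihood. On data like that of Figure 9.1 the two AR(2) fits can differ sharply, because the autocovariance matrix being inverted is nearly singular.

```python
import numpy as np
from statsmodels.regression.linear_model import yule_walker
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(2)
n, phi = 200, 0.8

# ARIMA(1, 1, 0) realization; viewed as an AR(2), its polynomial is (1 - 0.8B)(1 - B),
# which has a zero exactly on the unit circle.
z = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + z[t]
x = np.cumsum(y)

# AR(2) via the Yule-Walker equations (moment-based, sensitive to near-singularity).
ar_yw, sigma_yw = yule_walker(x, order=2, method="mle")
print("Yule-Walker AR(2):", ar_yw)

# AR(2) via Gaussian maximum likelihood; its coefficients should resemble (1.8, -0.8),
# the coefficients of the true nonstationary polynomial (1 - 0.8B)(1 - B).
fit = ARIMA(x, order=(2, 0, 0), trend="n").fit()
print("Maximum likelihood AR(2):", fit.params[:2])
```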

An obvious limitation in fitting an ARIMA(p, d, q) process {X_t} to data is that {X_t} is permitted to be nonstationary only in a very special way, namely by allowing the polynomial φ*(z) in the representation φ*(B)X_t = θ(B)Z_t to have a zero of positive multiplicity d at the point 1 on the unit circle. Such models are appropriate when the sample autocorrelation function of the data is a slowly decaying positive function, as in Figure 9.1, since sample autocorrelation functions of this form are associated with models φ(B)X_t = θ(B)Z_t in which φ(z) has a zero either at or close to 1.

Sample autocorrelations with slowly decaying oscillatory behaviour, as in Figures 9.3 and 9.4, are associated with models φ(B)X_t = θ(B)Z_t in which φ(z) has a zero close to e^{iω} for some ω other than ω = 0. Figure 9.3 was obtained from 200 simulated observations of a process for which φ(z) has a zero near e^{iπ}. Figure 9.4 shows the sample autocorrelation function of 200 observations from a process for which φ(z) has zeroes near e^{±iπ/3}. In such cases the sample autocorrelations can be made to decay more rapidly by applying to the data the operator 1 - 2cos(ω)B + B^2, whose zeroes are e^{±iω}, instead of the operator (1 - B) used in the previous paragraph. If 2π/ω is close to some integer s, the sample autocorrelation function will be nearly periodic with period s, and the operator ∇_s = 1 - B^s (whose zeroes are the s-th roots of unity) can also be applied to produce a series with a more rapidly decaying autocorrelation function (see also Section 9.6). The sample autocorrelation functions in Figures 9.3 and 9.4 are nearly periodic with periods 2 and 6 respectively. Applying the operator 1 - B^2 to the first series and 1 - B^6 to the second gives two new series with the much more rapidly decaying sample autocorrelation functions shown in Figures 9.5 and 9.6 respectively. For the new series it is then not difficult to fit ARMA models whose autoregressive polynomials have all their zeroes well outside the unit circle. Techniques for identifying and fitting such ARMA models are discussed in the following sections.
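The effect of such an operator on oscillatory, slowly decaying sample autocorrelations can be illustrated directly. The sketch below is our own example (numpy only; the simulated AR(2) is an assumption for illustration, not one of the book's Figures 9.3-9.6). Its autoregressive polynomial has zeroes just outside the unit circle near e^{±iπ/3}, so the sample autocorrelation function is nearly periodic with period 6, and applying the operator 1 - 2cos(ω)B + B^2 makes it decay much faster.

```python
import numpy as np

def sample_acf(x, nlags):
    """Sample autocorrelations at lags 0, 1, ..., nlags."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    denom = float(np.dot(x, x))
    return np.array([np.dot(x[: len(x) - h], x[h:]) / denom for h in range(nlags + 1)])

rng = np.random.default_rng(3)
n, r, omega = 200, 0.99, np.pi / 3   # zeroes of the AR polynomial at exp(+-i*omega)/r

# X_t - 2 r cos(omega) X_{t-1} + r^2 X_{t-2} = Z_t, with a burn-in period discarded.
z = rng.standard_normal(n + 200)
x = np.zeros(n + 200)
for t in range(2, n + 200):
    x[t] = 2 * r * np.cos(omega) * x[t - 1] - r**2 * x[t - 2] + z[t]
x = x[200:]

print(np.round(sample_acf(x, 12), 2))  # slowly decaying, nearly periodic with period 6

# Apply the operator 1 - 2cos(omega)B + B^2, whose zeroes are e^{+-i*omega}.
w = x[2:] - 2 * np.cos(omega) * x[1:-1] + x[:-2]
print(np.round(sample_acf(w, 12), 2))  # decays much more rapidly
```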

9.2 Identification Techniques

(a) Preliminary Transformations. The estimation methods described in Chapter 8 enable us to find, for given values of p and q, an ARMA(p, q) model to fit a given series of data. For this procedure to be meaningful, it must be at least plausible that the data are in fact a realization of an ARMA process, and in particular a realization of a stationary process. If the data display characteristics suggesting nonstationarity (e.g., trend and seasonality), then it may be necessary to make a transformation so as to produce a new series that is more compatible with the assumption of stationarity.

Deviations from stationarity may be suggested by the graph of the series itself, by its sample autocorrelation function, or by both.

If the variability of the series depends on its level, the data should first be transformed to reduce or eliminate this dependence. For example, Figure 9.7 shows the international airline passenger data {U_t, t = 1, ..., 144} of Box and Jenkins (1976). It is clear from the graph that the variability increases as U_t increases. On the other hand, the transformed series V_t = ln U_t, shown in Figure 9.8, does not display increasing variability. The logarithmic transformation used here is in fact appropriate whenever {U_t} is a series whose standard deviation increases linearly with its mean. For a systematic account of a general class of variance-stabilizing transformations, we refer the reader to the Box-Cox (1964) transformations. The defining equations for the general Box-Cox transformation f_λ are

f_λ(U_t) = (U_t^λ - 1)/λ,   U_t ≥ 0, λ > 0,
f_0(U_t) = ln U_t,   U_t > 0.

The program PEST provides the option of applying a Box-Cox transformation, which, if required, should be used before trend or seasonality is removed from the data. In practice, if a Box-Cox transformation is necessary, it is frequently the case that either λ = 0 or λ = .5 is adequate.
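As a concrete illustration of the variance-stabilizing effect, here is a small sketch of ours (numpy only; the simulated series and the way variability is measured are assumptions for illustration, not the airline data). It applies the Box-Cox transformation for λ = 1, .5 and 0 to a series whose standard deviation grows with its level and compares the variability of the two halves of the differenced, transformed series.

```python
import numpy as np

def box_cox(u, lam):
    """General Box-Cox transformation applied elementwise to u > 0."""
    u = np.asarray(u, dtype=float)
    if lam == 0:
        return np.log(u)                 # f_0(u) = ln u
    return (u**lam - 1.0) / lam          # f_lambda(u) = (u^lambda - 1) / lambda

rng = np.random.default_rng(4)
level = np.linspace(100.0, 500.0, 144)
u = level * np.exp(0.05 * rng.standard_normal(144))   # noise proportional to the level

for lam in (1.0, 0.5, 0.0):
    dv = np.diff(box_cox(u, lam))
    # With lam = 1 the second half is clearly more variable than the first;
    # the gap shrinks as lam decreases and roughly disappears at lam = 0.
    print(lam, round(float(np.std(dv[:71])), 3), round(float(np.std(dv[71:])), 3))
```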

Trend and seasonality are usually detected by inspecting the graph of the (possibly transformed) series. However, they are also characterized by the sample autocorrelation function, which is slowly decaying in the presence of trend and nearly periodic in the presence of seasonality. The elimination of trend and seasonality was discussed in Section 1.4, where we described two methods:

(i) "classical decomposition" of the series into a trend component, a seasonal component, and a random residual component, and

(ii) differencing.

The program PEST (Option 1) offers a choice between these techniques. Both methods were applied to the transformed airline data V_t = ln U_t of the preceding paragraph. Figures 9.9 and 9.10 show the results of (i) estimating and removing from {V_t} a linear trend component and a seasonal component with period 12, and (ii) applying the differencing operator (1 - B)(1 - B^12) to {V_t}. Neither of the resulting series displays any apparent deviation from stationarity, nor do their sample autocorrelation functions (the sample autocorrelation function of {(1 - B)(1 - B^12)V_t} is shown in Figure 9.11).
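The differencing route (ii) can be sketched in a few lines. The code below is our own illustration and uses a synthetic series with the same multiplicative trend-plus-seasonal structure as the airline data, since the actual Box and Jenkins series is not reproduced here; the real data could be substituted for u without other changes.

```python
import numpy as np

rng = np.random.default_rng(5)
t = np.arange(144)

# Synthetic stand-in for the airline passenger counts: a rising level with a
# period-12 seasonal whose amplitude grows with the level, plus multiplicative noise.
u = (120.0 + 2.0 * t) * (1.0 + 0.25 * np.sin(2 * np.pi * t / 12)) \
    * np.exp(0.02 * rng.standard_normal(144))

v = np.log(u)               # step (a): variance-stabilizing transformation
s = v[12:] - v[:-12]        # (1 - B^12) v : removes the period-12 seasonal component
w = np.diff(s)              # (1 - B)(1 - B^12) v : also removes the trend

# The doubly differenced series should look compatible with stationarity; its
# lag-1 and lag-12 sample autocorrelations are the ones to inspect (cf. Figure 9.11).
w0 = w - w.mean()
acf = [float(np.dot(w0[:-h], w0[h:]) / np.dot(w0, w0)) for h in (1, 12)]
print(len(w), np.round(acf, 2))
```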

After the elimination of trend and seasonality, it is still possible that the sample autocorrelation function appears to be that of a nonstationary or nearly nonstationary process, in which case further differencing, as described in Section 9.1, may be carried out.

(b) The Identification Problem. Let {X_t} denote the mean-corrected, transformed series obtained as described in (a). The problem now is to find the most satisfactory ARMA(p, q) model to represent {X_t}. If p and q were known in advance, this would be a straightforward application of the estimation techniques developed in Chapter 8. However, this is usually not the case, so it is also necessary to identify appropriate values of p and q.

At first glance it might seem that the higher the values of p and q we choose, the better the fitted model will be. For example, if we fit a sequence of AR(p) processes, p = 1, 2, ..., the maximum likelihood estimate of the white noise variance typically decreases monotonically as p increases (see Table 9.2). However, we must beware of the danger of overfitting, i.e., of tailoring the fit too closely to the particular numbers observed. An extreme case of overfitting would occur if we fit a polynomial of degree 99 to 100 observations generated from the model X_t = a + bt + Z_t, where {Z_t} is an independent sequence of standard normal random variables. Such a fit is perfect for the given data set, but using the fitted model to predict future values may lead to gross errors.
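The phenomenon is easy to see numerically. The sketch below (our own illustration, numpy only) generates 100 observations from X_t = a + bt + Z_t and shows that the in-sample mean squared residual of a fitted polynomial can only decrease as the degree grows, so goodness of fit alone would always favour the larger model, just as the maximum likelihood variance estimates in Table 9.2 decrease with p.

```python
import numpy as np

rng = np.random.default_rng(6)
a, b, n = 2.0, 0.5, 100
t = np.arange(n)
x = a + b * t + rng.standard_normal(n)   # X_t = a + b t + Z_t, Z_t iid N(0, 1)
u = 2 * t / (n - 1) - 1                  # rescale to [-1, 1] for numerical stability

# In-sample fit improves monotonically with the polynomial degree, even though the
# true model is the straight line; this is the overfitting trap described above.
for deg in (1, 2, 5, 10, 20):
    resid = x - np.polyval(np.polyfit(u, x, deg), u)
    print(deg, round(float(np.mean(resid**2)), 4))
```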

Criteria have been developed, in particular Akaike's AIC criterion and Parzen's CAT criterion, which attempt to prevent overfitting by effectively assigning a cost to the introduction of each additional parameter. In Section 9.3 we discuss a bias-corrected form of the AIC, defined by

AICC := -2 ln L(φ_p, θ_q, S(φ_p, θ_q)/n) + 2(p + q + 1)n/(n - p - q - 2),   (9.2.1)

where L(φ_p, θ_q, σ²) is the likelihood of the data under the Gaussian ARMA model with parameters (φ_p, θ_q, σ²), and S(φ_p, θ_q) is the residual sum of squares defined in Section 8.7. On the basis of the analysis given in Section 9.3, the model selected is the one that minimizes the value of AICC. Intuitively, the term 2(p + q + 1)n/(n - p - q - 2) in (9.2.1) can be regarded as a penalty to discourage over-parameterization. Once a model minimizing the AICC value has been found, it must then be checked for goodness of fit (essentially by checking that the residuals behave like white noise), as described in Section 9.4.
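A model-selection loop based on (9.2.1) can be sketched as follows. This is our own illustration, assuming statsmodels is available: the Gaussian likelihood is taken from the fitted ARIMA model and plugged into the AICC penalty of (9.2.1), and the order minimizing AICC is reported. The simulated ARMA(1, 1) series is an assumption for the example; a real mean-corrected series would be used in practice.

```python
import warnings
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

def aicc(loglik, n, p, q):
    """Bias-corrected AIC of (9.2.1): -2 ln L + 2 (p + q + 1) n / (n - p - q - 2)."""
    return -2.0 * loglik + 2.0 * (p + q + 1) * n / (n - p - q - 2)

rng = np.random.default_rng(7)
n = 200
z = rng.standard_normal(n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.6 * x[t - 1] + z[t] + 0.4 * z[t - 1]     # an ARMA(1, 1) series

warnings.simplefilter("ignore")                        # silence convergence chatter
best = None
for p in range(3):
    for q in range(3):
        fit = ARIMA(x, order=(p, 0, q), trend="n").fit()
        score = aicc(fit.llf, n, p, q)
        print((p, q), round(score, 2))
        if best is None or score < best[0]:
            best = (score, (p, q))
print("minimum-AICC order (p, q):", best[1])
```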

Appendix: Original Text

§9.1 ARIMA Models for Non-Stationary Time Series

We have already discussed the importance of the class of ARMA models for representing stationary series. A generalization of this class, which incorporates a wide range of non-stationary series, is provided by the ARIMA processes, i.e. processes which, after differencing finitely many times, reduce to ARMA processes.

Definition 9.1.1 (The ARIMA(p, d, q) Process). If d is a non-negative integer, then {X_t} is said to be an ARIMA(p, d, q) process if Y_t := (1 - B)^d X_t is a causal ARMA(p, q) process.

This definition means that {X_t} satisfies a difference equation of the form

φ*(B)X_t ≡ φ(B)(1 - B)^d X_t = θ(B)Z_t,   {Z_t} ~ WN(0, σ²),   (9.1.1)

where φ(z) and θ(z) are polynomials of degrees p and q respectively and φ(z) ≠ 0 for |z| ≤ 1. The polynomial φ*(z) has a zero of order d at z = 1. The process {X_t} is stationary if and only if d = 0, in which case it reduces to an ARMA(p, q) process.

Notice that if d ≥ 1 we can add an arbitrary polynomial trend of degree (d - 1) to {X_t} without violating the difference equation (9.1.1). ARIMA models are therefore useful for representing data with trend (see Sections 1.4 and 9.2). It should be noted however that ARIMA processes can also be appropriate for modelling series with no trend. Except when d = 0, the mean of {X_t} is not determined by equation (9.1.1) and it can in particular be zero. Since for d ≥ 1, equation (9.1.1) determines the second order properties of {(1 - B)^d X_t} but not those of {X_t} (Problem 9.1), estimation of φ, θ and σ² will be based on the observed differences (1 - B)^d X_t. Additional assumptions are needed for prediction (see Section 9.5).

EXAMPLE 9.1.1. {X_t} is an ARIMA(1, 1, 0) process if for some φ ∈ (-1, 1),

(1 - φB)(1 - B)X_t = Z_t,   {Z_t} ~ WN(0, σ²).

We can then write

X_t = X_0 + Y_1 + ... + Y_t,   t ≥ 1,

where

Y_t = (1 - B)X_t = Σ_{j=0}^∞ φ^j Z_{t-j}.

A realization of {X_t} with X_0 = 0, φ = .8 and σ² = 1 is shown in Figure 9.1 together with the sample autocorrelation and partial autocorrelation functions.

A distinctive feature of the data which suggests the appropriateness of an ARIMA model is the slowly decaying positive sample autocorrelation function seen in Figure 9.1. If therefore we were given only the data and wished to find an appropriate model it would be natural to apply the operator ∇ = 1 - B repeatedly in the hope that for some j, {∇^j X_t} will have a rapidly decaying sample autocorrelation function compatible with that of an ARMA process with no zeroes of the autoregressive polynomial near the unit circle. For the particular time series in this example, one application of the operator ∇ produces the realization shown in Figure 9.2, whose sample autocorrelation and partial autocorrelation functions suggest an AR(1) model for {∇X_t}. The maximum likelihood estimates of φ and σ² obtained from PEST (under the assumption that E(∇X_t) = 0) are .808 and .978 respectively, giving the model

(1 - .808B)∇X_t = Z_t,   {Z_t} ~ WN(0, .978),   (9.1.2)

which bears a close resemblance to the true underlying process,

(1 - .8B)(1 - B)X_t = Z_t,   {Z_t} ~ WN(0, 1).   (9.1.3)

Instead of differencing the series in Figure 9.1 we could proceed more directly by attempting to fit an AR(2) process as suggested by the sample partial autocorrelation function. Maximum likelihood estimation, carried out using the program PEST and assuming that EX_t = 0, gives the model (9.1.4),

which, although stationary, has coefficients which closely resemble those of the true non-stationary process (9.1.3).

From a sample of finite length it will be extremely difficult to distinguish between a non-stationary process such as (9.1.3), whose autoregressive polynomial has a zero on the unit circle, and a process such as (9.1.4), which has very similar coefficients but whose autoregressive polynomial has all of its zeroes outside the unit circle. In either case however, if it is possible by differencing to generate a series with rapidly decaying sample autocorrelation function, then the differenced data can be fitted by a low order ARMA process whose autoregressive polynomial has zeroes which are comfortably outside the unit circle. This means that the fitted parameters will be well away from the boundary of the allowable parameter set. This is desirable for numerical computation of parameter estimates and can be quite critical for some methods of estimation. For example if we apply the Yule-Walker equations to fit an AR(2) model to the data in Figure 9.1, we obtain the model

which bears little resemblance to either the maximum likelihood model (9.1.4) or the true model (9.1.3). In this case the matrix appearing in (8.1.7) is nearly singular.

An obvious limitation in fitting an ARIMA(p, d, q) process {X_t} to data is that {X_t} is permitted to be non-stationary only in a very special way, i.e. by allowing the polynomial φ*(z) in the representation φ*(B)X_t = θ(B)Z_t to have a zero of positive multiplicity d at the point 1 on the unit circle. Such models are appropriate when the sample autocorrelation function of the data is a slowly decaying positive function as in Figure 9.1, since sample autocorrelation functions of this form are associated with models φ(B)X_t = θ(B)Z_t in which φ(z) has a zero either at or close to 1.

Sample autocorrelations with slowly decaying oscillatory behavior as in Figures 9.3 and 9.4 are associated with models φ(B)X_t = θ(B)Z_t in which φ(z) has a zero close to e^{iω} for some ω other than ω = 0. Figure 9.3 was obtained from a sample of 200 simulated observations from a process for which φ(z) has a zero near e^{iπ}. Figure 9.4 shows the sample autocorrelation function of 200 observations from a process for which φ(z) has zeroes near e^{±iπ/3}. In such cases the sample autocorrelations can be made to decay more rapidly by applying the operator 1 - 2cos(ω)B + B^2, whose zeroes are e^{±iω}, to the data, instead of the operator (1 - B) as in the previous paragraph. If 2π/ω is close to some integer s then the sample autocorrelation function will be nearly periodic with period s and the operator ∇_s = 1 - B^s (whose zeroes are the s-th roots of unity) can also be applied to produce a series with more rapidly decaying autocorrelation function (see also Section 9.6). The sample autocorrelation functions in Figures 9.3 and 9.4 are nearly periodic with periods 2 and 6 respectively. Applying the operator 1 - B^2 to the first series and 1 - B^6 to the second gives two new series with the much more rapidly decaying sample autocorrelation functions shown in Figures 9.5 and 9.6 respectively.
