Contents

Preface to the Fourth Edition xxi
Preface to the Third Edition xxiii

1 Introduction 1
1.1 Five Important Practical Problems, 2
1.1.1 Forecasting Time Series, 2
1.1.2 Estimation of Transfer Functions, 3
1.1.3 Analysis of Effects of Unusual Intervention Events to a System, 4
1.1.4 Analysis of Multivariate Time Series, 5
1.1.5 Discrete Control Systems, 5
1.2 Stochastic and Deterministic Dynamic Mathematical Models, 7
1.2.1 Stationary and Nonstationary Stochastic Models for Forecasting and Control, 7
1.2.2 Transfer Function Models, 12
1.2.3 Models for Discrete Control Systems, 14
1.3 Basic Ideas in Model Building, 16
1.3.1 Parsimony, 16
1.3.2 Iterative Stages in the Selection of a Model, 17

Part One Stochastic Models and Their Forecasting 19

2 Autocorrelation Function and Spectrum of Stationary Processes 21
2.1 Autocorrelation Properties of Stationary Models, 21
2.1.1 Time Series and Stochastic Processes, 21
2.1.2 Stationary Stochastic Processes, 24
2.1.3 Positive Definiteness and the Autocovariance Matrix, 25
2.1.4 Autocovariance and Autocorrelation Functions, 29
2.1.5 Estimation of Autocovariance and Autocorrelation Functions, 31
2.1.6 Standard Errors of Autocorrelation Estimates, 33
2.2 Spectral Properties of Stationary Models, 35
2.2.1 Periodogram of a Time Series, 35
2.2.2 Analysis of Variance, 37
2.2.3 Spectrum and Spectral Density Function, 38
2.2.4 Simple Examples of Autocorrelation and Spectral Density Functions, 43
2.2.5 Advantages and Disadvantages of the Autocorrelation and Spectral Density Functions, 45
A2.1 Link between the Sample Spectrum and Autocovariance Function Estimate, 45

3 Linear Stationary Models 47
3.1 General Linear Process, 47
3.1.1 Two Equivalent Forms for the Linear Process, 47
3.1.2 Autocovariance Generating Function of a Linear Process, 50
3.1.3 Stationarity and Invertibility Conditions for a Linear Process, 51
3.1.4 Autoregressive and Moving Average Processes, 53
3.2 Autoregressive Processes, 55
3.2.1 Stationarity Conditions for Autoregressive Processes, 55
3.2.2 Autocorrelation Function and Spectrum of Autoregressive Processes, 57
3.2.3 First-Order Autoregressive (Markov) Process, 59
3.2.4 Second-Order Autoregressive Process, 61
3.2.5 Partial Autocorrelation Function, 66
3.2.6 Estimation of the Partial Autocorrelation Function, 69
3.2.7 Standard Errors of Partial Autocorrelation Estimates, 70
3.3 Moving Average Processes, 71
3.3.1 Invertibility Conditions for Moving Average Processes, 71
3.3.2 Autocorrelation Function and Spectrum of Moving Average Processes, 72
3.3.3 First-Order Moving Average Process, 73
3.3.4 Second-Order Moving Average Process, 75
3.3.5 Duality Between Autoregressive and Moving Average Processes, 78
3.4 Mixed Autoregressive–Moving Average Processes, 79
3.4.1 Stationarity and Invertibility Properties, 79
3.4.2 Autocorrelation Function and Spectrum of Mixed Processes, 80
3.4.3 First-Order Autoregressive–First-Order Moving Average Process, 82
3.4.4 Summary, 86
A3.1 Autocovariances, Autocovariance Generating Function, and Stationarity Conditions for a General Linear Process, 86
A3.2 Recursive Method for Calculating Estimates of Autoregressive Parameters, 89

4 Linear Nonstationary Models 93
4.1 Autoregressive Integrated Moving Average Processes, 93
4.1.1 Nonstationary First-Order Autoregressive Process, 93
4.1.2 General Model for a Nonstationary Process Exhibiting Homogeneity, 95
4.1.3 General Form of the Autoregressive Integrated Moving Average Model, 100
4.2 Three Explicit Forms for the Autoregressive Integrated Moving Average Model, 103
4.2.1 Difference Equation Form of the Model, 103
4.2.2 Random Shock Form of the Model, 104
4.2.3 Inverted Form of the Model, 111
4.3 Integrated Moving Average Processes, 114
4.3.1 Integrated Moving Average Process of Order (0, 1, 1), 115
4.3.2 Integrated Moving Average Process of Order (0, 2, 2), 119
4.3.3 General Integrated Moving Average Process of Order (0, d, q), 123
A4.1 Linear Difference Equations, 125
A4.2 IMA(0, 1, 1) Process with Deterministic Drift, 131
A4.3 ARIMA Processes with Added Noise, 131
A4.3.1 Sum of Two Independent Moving Average Processes, 132
A4.3.2 Effect of Added Noise on the General Model, 133
A4.3.3 Example for an IMA(0, 1, 1) Process with Added White Noise, 134
A4.3.4 Relation between the IMA(0, 1, 1) Process and a Random Walk, 135
A4.3.5 Autocovariance Function of the General Model with Added Correlated Noise, 135

5 Forecasting 137
5.1 Minimum Mean Square Error Forecasts and Their Properties, 137
5.1.1 Derivation of the Minimum Mean Square Error Forecasts, 139
5.1.2 Three Basic Forms for the Forecast, 141
5.2 Calculating and Updating Forecasts, 145
5.2.1 Convenient Format for the Forecasts, 145
5.2.2 Calculation of the ψ Weights, 147
5.2.3 Use of the ψ Weights in Updating the Forecasts, 148
5.2.4 Calculation of the Probability Limits of the Forecasts at Any Lead Time, 150
5.3 Forecast Function and Forecast Weights, 152
5.3.1 Eventual Forecast Function Determined by the Autoregressive Operator, 152
5.3.2 Role of the Moving Average Operator in Fixing the Initial Values, 153
5.3.3 Lead l Forecast Weights, 154
5.4 Examples of Forecast Functions and Their Updating, 157
5.4.1 Forecasting an IMA(0, 1, 1) Process, 157
5.4.2 Forecasting an IMA(0, 2, 2) Process, 160
5.4.3 Forecasting a General IMA(0, d, q) Process, 163
5.4.4 Forecasting Autoregressive Processes, 164
5.4.5 Forecasting a (1, 0, 1) Process, 167
5.4.6 Forecasting a (1, 1, 1) Process, 169
5.5 Use of State-Space Model Formulation for Exact Forecasting, 170
5.5.1 State-Space Model Representation for the ARIMA Process, 170
5.5.2 Kalman Filtering Relations for Use in Prediction, 171
5.5.3 Smoothing Relations in the State Variable Model, 175
5.6 Summary, 177
A5.1 Correlations Between Forecast Errors, 180
A5.1.1 Autocorrelation Function of Forecast Errors at Different Origins, 180
A5.1.2 Correlation Between Forecast Errors at the Same Origin with Different Lead Times, 182
A5.2 Forecast Weights for Any Lead Time, 182
A5.3 Forecasting in Terms of the General Integrated Form, 185
A5.3.1 General Method of Obtaining the Integrated Form, 185
A5.3.2 Updating the General Integrated Form, 187
A5.3.3 Comparison with the Discounted Least Squares Method, 187

Part Two Stochastic Model Building 193

6 Model Identification 195
6.1 Objectives of Identification, 195
6.1.1 Stages in the Identification Procedure, 195
6.2 Identification Techniques, 196
6.2.1 Use of the Autocorrelation and Partial Autocorrelation Functions in Identification, 196
6.2.2 Standard Errors for Estimated Autocorrelations and Partial Autocorrelations, 198
6.2.3 Identification of Some Actual Time Series, 200
6.2.4 Some Additional Model Identification Tools, 208
6.3 Initial Estimates for the Parameters, 213
6.3.1 Uniqueness of Estimates Obtained from the Autocovariance Function, 213
6.3.2 Initial Estimates for Moving Average Processes, 213
6.3.3 Initial Estimates for Autoregressive Processes, 215
6.3.4 Initial Estimates for Mixed Autoregressive–Moving Average Processes, 216
6.3.5 Initial Estimate of Error Variance, 218
6.3.6 Approximate Standard Error for w, 218
6.3.7 Choice Between Stationary and Nonstationary Models in Doubtful Cases, 220
6.4 Model Multiplicity, 221
6.4.1 Multiplicity of Autoregressive–Moving Average Models, 221
6.4.2 Multiple Moment Solutions for Moving Average Parameters, 224
6.4.3 Use of the Backward Process to Determine Starting Values, 225
A6.1 Expected Behavior of the Estimated Autocorrelation Function for a Nonstationary Process, 225
A6.2 General Method for Obtaining Initial Estimates of the Parameters of a Mixed Autoregressive–Moving Average Process, 226

7 Model Estimation 231
7.1 Study of the Likelihood and Sum-of-Squares Functions, 231
7.1.1 Likelihood Function, 231
7.1.2 Conditional Likelihood for an ARIMA Process, 232
7.1.3 Choice of Starting Values for Conditional Calculation, 234
7.1.4 Unconditional Likelihood; Sum-of-Squares Function; Least Squares Estimates, 235
7.1.5 General Procedure for Calculating the Unconditional Sum of Squares, 240
7.1.6 Graphical Study of the Sum-of-Squares Function, 245
7.1.7 Description of "Well-Behaved" Estimation Situations; Confidence Regions, 248
7.2 Nonlinear Estimation, 255
7.2.1 General Method of Approach, 255
7.2.2 Numerical Estimates of the Derivatives, 257
7.2.3 Direct Evaluation of the Derivatives, 258
7.2.4 General Least Squares Algorithm for the Conditional Model, 260
7.2.5 Summary of Models Fitted to Series A to F, 263
7.2.6 Large-Sample Information Matrices and Covariance Estimates, 264
7.3 Some Estimation Results for Specific Models, 268
7.3.1 Autoregressive Processes, 268
7.3.2 Moving Average Processes, 270
7.3.3 Mixed Processes, 271
7.3.4 Separation of Linear and Nonlinear Components in Estimation, 271
7.3.5 Parameter Redundancy, 273
7.4 Likelihood Function Based on the State-Space Model, 275
7.5 Unit Roots in ARIMA Models, 280
7.5.1 Formal Tests for Unit Roots in AR Models, 281
7.5.2 Extensions of Unit-Root Testing to Mixed ARIMA Models, 286
7.6 Estimation Using Bayes's Theorem, 287
7.6.1 Bayes's Theorem, 287
7.6.2 Bayesian Estimation of Parameters, 289
7.6.3 Autoregressive Processes, 290
7.6.4 Moving Average Processes, 293
7.6.5 Mixed Processes, 294
A7.1 Review of Normal Distribution Theory, 296
A7.1.1 Partitioning of a Positive-Definite Quadratic Form, 296
A7.1.2 Two Useful Integrals, 296
A7.1.3 Normal Distribution, 297
A7.1.4 Student's t Distribution, 300
A7.2 Review of Linear Least Squares Theory, 303
A7.2.1 Normal Equations and Least Squares, 303
A7.2.2 Estimation of Error Variance, 304
A7.2.3 Covariance Matrix of Least Squares Estimates, 305
A7.2.4 Confidence Regions, 305
A7.2.5 Correlated Errors, 305
A7.3 Exact Likelihood Function for Moving Average and Mixed Processes, 306
A7.4 Exact Likelihood Function for an Autoregressive Process, 314
A7.5 Asymptotic Distribution of Estimators for Autoregressive Models, 323
A7.6 Examples of the Effect of Parameter Estimation Errors on Variances of Forecast Errors and Probability Limits for Forecasts, 327
A7.7 Special Note on Estimation of Moving Average Parameters, 330

8 Model Diagnostic Checking 333
8.1 Checking the Stochastic Model, 333
8.1.1 General Philosophy, 333
8.1.2 Overfitting, 334
8.2 Diagnostic Checks Applied to Residuals, 335
8.2.1 Autocorrelation Check, 337
8.2.2 Portmanteau Lack-of-Fit Test, 338
8.2.3 Model Inadequacy Arising from Changes in Parameter Values, 343
8.2.4 Score Tests for Model Checking, 344
8.2.5 Cumulative Periodogram Check, 347
8.3 Use of Residuals to Modify the Model, 350
8.3.1 Nature of the Correlations in the Residuals When an Incorrect Model Is Used, 350
8.3.2 Use of Residuals to Modify the Model, 352

9 Seasonal Models 353
9.1 Parsimonious Models for Seasonal Time Series, 353
9.1.1 Fitting versus Forecasting, 353
9.1.2 Seasonal Models Involving Adaptive Sines and Cosines, 354
9.1.3 General Multiplicative Seasonal Model, 356
9.2 Representation of the Airline Data by a Multiplicative (0, 1, 1) × (0, 1, 1)₁₂ Model, 359
9.2.1 Multiplicative (0, 1, 1) × (0, 1, 1)₁₂ Model, 359
9.2.2 Forecasting, 360
9.2.3 Identification, 367
9.2.4 Estimation, 370
9.2.5 Diagnostic Checking, 375
9.3 Some Aspects of More General Seasonal ARIMA Models, 375
9.3.1 Multiplicative and Nonmultiplicative Models, 375
9.3.2 Identification, 379
9.3.3 Estimation, 380
9.3.4 Eventual Forecast Functions for Various Seasonal Models, 381
9.3.5 Choice of Transformation, 383
9.4 Structural Component Models and Deterministic Seasonal Components, 384
9.4.1 Structural Component Time Series Models, 384
9.4.2 Deterministic Seasonal and Trend Components and Common Factors, 388
9.4.3 Estimation of Unobserved Components in Structural Models, 390
9.5 Regression Models with Time Series Error Terms, 397
9.5.1 Model Building, Estimation, and Forecasting Procedures for Regression Models, 399
9.5.2 Restricted Maximum Likelihood Estimation for Regression Models, 404
A9.1 Autocovariances for Some Seasonal Models, 407

10 Nonlinear and Long Memory Models 413
10.1 Autoregressive Conditional Heteroscedastic (ARCH) Models, 413
10.1.1 First-Order ARCH Model, 415
10.1.2 Consideration for More General Models, 416
10.1.3 Model Building and Parameter Estimation, 417
10.2 Nonlinear Time Series Models, 420
10.2.1 Classes of Nonlinear Models, 421
10.2.2 Implications and Examples of Nonlinear Models, 424
10.3 Long Memory Time Series Processes, 428
10.3.1 Fractionally Integrated Processes, 429
10.3.2 Estimation of Parameters, 433

Part Three Transfer Function and Multivariate Model Building 437

11 Transfer Function Models 439
11.1 Linear Transfer Function Models, 439
11.1.1 Discrete Transfer Function, 439
11.1.2 Continuous Dynamic Models Represented by Differential Equations, 442
11.2 Discrete Dynamic Models Represented by Difference Equations