PowerPoint Slides for Undergraduate Econometrics
by Lawrence C. Marsh
Copyright 1996 Lawrence C. Marsh
Adaptive Expectations

y_t = αλ + (1 − λ) y_{t-1} + β [x*_t − (1 − λ) x*_{t-1}] + e_t − (1 − λ) e_{t-1}

Since λ x_{t-1} = [x*_t − (1 − λ) x*_{t-1}], we get:

y_t = αλ + (1 − λ) y_{t-1} + βλ x_{t-1} + u_t,   where u_t = e_t − (1 − λ) e_{t-1}

15.28

Adaptive Expectations

y_t = αλ + (1 − λ) y_{t-1} + βλ x_{t-1} + u_t

Use ordinary least squares regression on:

y_t = β_1 + β_2 y_{t-1} + β_3 x_{t-1} + u_t

and we get:

λ̂ = 1 − β̂_2
α̂ = β̂_1 / (1 − β̂_2)
β̂ = β̂_3 / (1 − β̂_2)

15.29
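The recovery of the structural parameters from the OLS estimates can be sketched in a few lines of Python. This is a minimal illustration, not part of the original slides; the input arrays y and x stand for whatever observed series the model is applied to.

```python
# A minimal sketch: estimate the adaptive-expectations reduced form by OLS
# and recover the structural parameters (alpha, beta, lambda).
import numpy as np

def adaptive_expectations_ols(y, x):
    """OLS on y_t = b1 + b2*y_{t-1} + b3*x_{t-1}; recover (alpha, beta, lam)."""
    Y = y[1:]                                  # y_t
    X = np.column_stack([np.ones(len(Y)),      # intercept
                         y[:-1],               # y_{t-1}
                         x[:-1]])              # x_{t-1}
    b1, b2, b3 = np.linalg.lstsq(X, Y, rcond=None)[0]
    lam = 1.0 - b2             # lambda-hat = 1 - b2
    alpha = b1 / lam           # alpha-hat  = b1 / (1 - b2)
    beta = b3 / lam            # beta-hat   = b3 / (1 - b2)
    return alpha, beta, lam
```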
Partial Adjustment

y_t − y_{t-1} = γ (y*_t − y_{t-1})

Inventories partially adjust, 0 < γ < 1, toward the optimal or desired level:

y*_t = α + β x_t + e_t

15.30
Partial Adjustment

y_t − y_{t-1} = γ (y*_t − y_{t-1})
             = γ (α + β x_t + e_t − y_{t-1})
             = γα + γβ x_t − γ y_{t-1} + γ e_t

Solving for y_t:

y_t = γα + (1 − γ) y_{t-1} + γβ x_t + γ e_t

15.31
Partial Adjustment

y_t = γα + (1 − γ) y_{t-1} + γβ x_t + γ e_t

Use ordinary least squares regression on:

y_t = β_1 + β_2 y_{t-1} + β_3 x_t + ν_t

to get:

γ̂ = 1 − β̂_2
α̂ = β̂_1 / (1 − β̂_2)
β̂ = β̂_3 / (1 − β̂_2)

15.32
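A quick simulation check, not from the slides, with hypothetical parameter values α = 10, β = 2, γ = 0.4, confirms that OLS on the reparameterized equation recovers the partial-adjustment parameters:

```python
# A minimal sketch with assumed (hypothetical) data-generating values:
# simulate the partial-adjustment model, then recover gamma, alpha, beta
# from the OLS estimates via the same reparameterization as the slides.
import numpy as np

rng = np.random.default_rng(0)
alpha, beta, gamma, T = 10.0, 2.0, 0.4, 5000
x = rng.standard_normal(T)
y = np.empty(T)
y[0] = alpha                               # start near the steady state
for t in range(1, T):
    e = rng.standard_normal()
    y[t] = gamma*alpha + (1 - gamma)*y[t-1] + gamma*beta*x[t] + gamma*e

# OLS on y_t = b1 + b2*y_{t-1} + b3*x_t
X = np.column_stack([np.ones(T - 1), y[:-1], x[1:]])
b1, b2, b3 = np.linalg.lstsq(X, y[1:], rcond=None)[0]
print("gamma:", 1 - b2, "alpha:", b1/(1 - b2), "beta:", b3/(1 - b2))
```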
Copyright © 1997 John Wiley & Sons, Inc. All rights reserved. Reproduction or translation of this work beyond that permitted in Section 117 of the 1976 United States Copyright Act without the express written permission of the copyright owner is unlawful. Requests for further information should be addressed to the Permissions Department, John Wiley & Sons, Inc. The purchaser may make back-up copies for his/her own use only and not for distribution or resale. The Publisher assumes no responsibility for errors, omissions, or damages caused by the use of these programs or from the use of the information contained herein.

16.1
Previous Chapters Used Economic Models

1. An economic model for the dependent variable of interest.
2. A statistical model consistent with the data.
3. An estimation procedure for the parameters using the data.
4. A forecast of the variable of interest using the estimated model.

Time Series Analysis does not use this approach.

16.2
Time Series Analysis is useful for short-term forecasting only.

Time Series Analysis does not generally incorporate all of the economic relationships found in economic models.

Time Series Analysis uses more statistics and less economics.

Long-term forecasting requires incorporating more involved behavioral economic relationships into the analysis.

16.3
Univariate Time Series Analysis can be used to relate the current values of a single economic variable to:

1. its past values
2. the values of current and past random errors

Other variables are not used in univariate time series analysis.

16.4
Three types of Univariate Time Series Analysis processes will be discussed in this chapter:

1. autoregressive (AR)
2. moving average (MA)
3. autoregressive moving average (ARMA)

16.5
Multivariate Time Series Analysis can be used to relate the current value of each of several economic variables to:

1. its past values
2. the past values of the other forecasted variables
3. the values of current and past random errors

Vector autoregressive models, discussed later in this chapter, are multivariate time series models.

16.6
First-Order Autoregressive Processes, AR(1):

y_t = δ + θ_1 y_{t-1} + e_t,   t = 1, 2, ..., T   (16.1.1)

δ is the intercept.
θ_1 is a parameter generally between −1 and +1.
e_t is an uncorrelated random error with mean zero and variance σ_e².

16.7

Autoregressive Process of order p, AR(p):

y_t = δ + θ_1 y_{t-1} + θ_2 y_{t-2} + ... + θ_p y_{t-p} + e_t   (16.1.2)

δ is the intercept.
The θ_i's are parameters generally between −1 and +1.
e_t is an uncorrelated random error with mean zero and variance σ_e².

16.8

AR models always have one or more lagged dependent variables on the right-hand side. Consequently, least squares is no longer a best linear unbiased estimator (BLUE), but it does have some good asymptotic properties, including consistency.

16.9
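As a sketch of the estimation just described, and not part of the original slides, an AR(p) can be fit by ordinary least squares on lagged values of the series; plain numpy is used here, though any regression routine would do:

```python
# A minimal sketch: fit an AR(p) model by least squares on p lags of y.
import numpy as np

def fit_ar(y, p):
    """Return (delta, [theta_1..theta_p]) from OLS of y_t on y_{t-1}..y_{t-p}."""
    Y = y[p:]                                        # y_t, t = p..T-1
    X = np.column_stack([np.ones(len(Y))] +
                        [y[p - j: len(y) - j]        # column j holds y_{t-j}
                         for j in range(1, p + 1)])
    coefs = np.linalg.lstsq(X, Y, rcond=None)[0]
    return coefs[0], coefs[1:]
```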
AR(2) model of U.S. unemployment rates:

ŷ_t = 0.5051 + 1.5537 y_{t-1} − 0.6515 y_{t-2}
      (0.1267)  (0.0707)       (0.0708)

(standard errors in parentheses; the first-lag coefficient is positive, the second-lag coefficient negative)

Note: Q1-1948 through Q1-1978, from J.D. Cryer (1986); see unempl.dat.

16.10
Choosing the lag length, p, for AR(p): the Partial Autocorrelation Function (PAF)

The PAF is the sequence of correlations between (y_t and y_{t-1}), (y_t and y_{t-2}), (y_t and y_{t-3}), and so on, given that the effects of earlier lags on y_t are held constant.

16.11
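A minimal sketch of inspecting the PAF in Python, assuming the statsmodels package is available; the AR(2) coefficients 0.5 and 0.3 are hypothetical values chosen only for illustration:

```python
# A sketch: simulate a hypothetical AR(2) series, compute its partial
# autocorrelations, and flag those outside the +/- 2/sqrt(T) bounds.
import numpy as np
from statsmodels.tsa.stattools import pacf

rng = np.random.default_rng(1)
T = 500
y = np.zeros(T)
for t in range(2, T):                      # hypothetical AR(2) coefficients
    y[t] = 0.5*y[t-1] + 0.3*y[t-2] + rng.standard_normal()

vals = pacf(y, nlags=10)                   # vals[0] is the lag-0 value (1.0)
bound = 2 / np.sqrt(T)                     # approximate significance bound
for k in range(1, 11):
    mark = "*" if abs(vals[k]) > bound else ""
    print(f"lag {k:2d}: {vals[k]:+.3f} {mark}")
# typically only lags 1 and 2 are flagged, suggesting an AR(2)
```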
Partial Autocorrelation Function

Data simulated from this model: y_t = … y_{t-1} + 0.3 y_{t-2} + e_t

[Figure: sample PAF with significance bounds at ±2/√T. θ_kk is the last (kth) coefficient in a kth-order AR process. This sample PAF suggests a second-order process, AR(2), which is correct.]

16.12
Using the AR Model for Forecasting: unemployment rate

y_{T-1} = 6.63 and y_T = 6.20

ŷ_{T+1} = δ̂ + θ̂_1 y_T + θ̂_2 y_{T-1}
        = 0.5051 + (1.5537)(6.2) − (0.6515)(6.63) = 5.8186

ŷ_{T+2} = δ̂ + θ̂_1 ŷ_{T+1} + θ̂_2 y_T
        = 0.5051 + (1.5537)(5.8186) − (0.6515)(6.2) = 5.5062

ŷ_{T+3} = δ̂ + θ̂_1 ŷ_{T+2} + θ̂_2 ŷ_{T+1}
        = 0.5051 + (1.5537)(5.5062) − (0.6515)(5.8186) = 5.2693

16.13
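The recursion is easy to reproduce; this short sketch (not part of the slides) hard-codes the AR(2) estimates from slide 16.10:

```python
# A sketch reproducing the recursive forecasts above, hard-coding the
# estimated AR(2) coefficients from slide 16.10.
delta_hat, th1_hat, th2_hat = 0.5051, 1.5537, -0.6515
hist = [6.63, 6.20]                      # y_{T-1}, y_T from the slide
for step in range(1, 4):
    f = delta_hat + th1_hat * hist[-1] + th2_hat * hist[-2]
    hist.append(f)                       # forecasts feed back in as "data"
    print(f"y_(T+{step}): {f:.4f}")
# prints 5.8186, 5.5062, 5.2692 (the slide rounds intermediate values,
# which gives 5.2693 at the third step)
```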
Moving Average Process of order q, MA(q):

y_t = µ + e_t + α_1 e_{t-1} + α_2 e_{t-2} + ... + α_q e_{t-q}   (16.2.1)

µ is the intercept.
The α_i's are unknown parameters.
e_t is an uncorrelated random error with mean zero and variance σ_e².

16.14
An MA(1) process:

y_t = µ + e_t + α_1 e_{t-1}   (16.2.2)

Minimize the sum of squared deviations:

S(µ, α_1) = Σ_{t=1}^{T} e_t² = Σ_{t=1}^{T} (y_t − µ − α_1 e_{t-1})²   (16.2.3)

16.15
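In practice the minimization in (16.2.3) is handled by a packaged routine. A sketch assuming statsmodels, with hypothetical MA(1) values µ = 1.0 and α₁ = 0.6 used only to generate data:

```python
# A minimal sketch: simulate a hypothetical MA(1) series and fit it with a
# standard package (statsmodels assumed here).
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(2)
e = rng.standard_normal(500)
y = 1.0 + e[1:] + 0.6*e[:-1]            # hypothetical MA(1): mu=1.0, alpha_1=0.6

res = ARIMA(y, order=(0, 0, 1)).fit()   # (p, d, q) = (0, 0, 1): pure MA(1)
print(res.params)                       # constant, MA(1) coefficient, sigma^2
```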
Stationary vs. Nonstationary

stationary: A stationary time series is one whose mean, variance, and autocorrelation function do not change over time.

nonstationary: A nonstationary time series is one whose mean, variance, or autocorrelation function changes over time.

16.16
First Differencing is often used to transform a nonstationary series into a stationary series:

y_t = z_t − z_{t-1}

where z_t is the original nonstationary series and y_t is the new stationary series.

16.17
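In Python, first differencing is a single call; a trivial sketch with a hypothetical series:

```python
# A sketch: first differencing with numpy on a hypothetical series.
import numpy as np

z = np.array([100.0, 102.0, 105.0, 104.0, 108.0])  # hypothetical nonstationary series
y = np.diff(z)                                     # y_t = z_t - z_{t-1}
print(y)                                           # [ 2.  3. -1.  4.]
```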
Choosing the lag length, q, for MA(q): the Autocorrelation Function (AF)

The AF is the sequence of correlations between (y_t and y_{t-1}), (y_t and y_{t-2}), (y_t and y_{t-3}), and so on, without holding the effects of earlier lags on y_t constant. The PAF controlled for the effects of previous lags, but the AF does not control for such effects.

16.18
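A sketch of computing the AF, again assuming statsmodels; the simulated series anticipates the MA(1) model shown on the next slide:

```python
# A minimal sketch: simulate the MA(1) model y_t = e_t - 0.9 e_{t-1} and
# compute its autocorrelation function (statsmodels assumed).
import numpy as np
from statsmodels.tsa.stattools import acf

rng = np.random.default_rng(3)
e = rng.standard_normal(500)
y = e[1:] - 0.9*e[:-1]                   # MA(1) from the next slide

vals = acf(y, nlags=10)                  # vals[0] is the lag-0 value (1.0)
bound = 2 / np.sqrt(len(y))              # approximate significance bound
print([f"{v:+.2f}" for v in vals[1:]], "bound:", round(bound, 3))
# typically only the lag-1 autocorrelation exceeds the bound: MA(1)
```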
Autocorrelation Function

Data simulated from this model: y_t = e_t − 0.9 e_{t-1}

[Figure: sample AF with significance bounds at ±2/√T. r_kk is the last (kth) coefficient in a kth-order MA process. This sample AF suggests a first-order process, MA(1), which is correct.]

16.19
Autoregressive Moving Average, ARMA(p,q)

An ARMA(1,2) model has one autoregressive lag and two moving average lags:

y_t = δ + θ_1 y_{t-1} + e_t + α_1 e_{t-1} + α_2 e_{t-2}

16.20
Integrated Processes

A time series with an upward or downward trend over time is nonstationary. Many nonstationary time series can be made stationary by differencing them one or more times. Such time series are called integrated processes.

16.21
The number of times a series must be differenced to make it stationary is the order of the integrated process, d. A series whose autocorrelation function, AF, shows large, significant autocorrelations at many lags may require more than one differencing to become stationary. Check the new AF after each differencing to determine whether further differencing is needed.

16.22
Unit Root

z_t = θ_1 z_{t-1} + µ + e_t + α_1 e_{t-1}   (16.3.2)

−1 < θ_1 < 1   stationary ARMA(1,1)
θ_1 = 1        nonstationary process

θ_1 = 1 is called a unit root.

16.23
Unit Root Tests

∆z_t = θ_1* z_{t-1} + µ + e_t + α_1 e_{t-1}   (16.3.3)

Testing θ_1 = 1 is equivalent to testing θ_1* = 0:

z_t − z_{t-1} = (θ_1 − 1) z_{t-1} + µ + e_t + α_1 e_{t-1}

where ∆z_t = z_t − z_{t-1} and θ_1* = θ_1 − 1.

16.24
Unit Root Tests

H_0: θ_1* = 0   vs.   H_1: θ_1* < 0   (16.3.4)

Computer programs typically use one of the following tests for unit roots:

Dickey-Fuller Test
Phillips-Perron Test

16.25
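A sketch of a Dickey-Fuller-style unit root test, assuming statsmodels' adfuller routine; the random walk is a hypothetical series that has a unit root by construction:

```python
# A minimal sketch: augmented Dickey-Fuller test of H0: unit root
# (theta_1* = 0) against H1: theta_1* < 0, on a simulated random walk.
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(4)
z = np.cumsum(rng.standard_normal(300))   # hypothetical random walk: theta_1 = 1

stat, pvalue = adfuller(z)[:2]
print(f"ADF statistic: {stat:.3f}, p-value: {pvalue:.3f}")
# a large p-value means H0 (unit root) is not rejected, as expected here
```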
Autoregressive Integrated Moving Average, ARIMA(p,d,q)

An ARIMA(p,d,q) model represents an AR(p)-MA(q) process applied to a series that has been differenced (integrated, I(d)) d times:

y_t = δ + θ_1 y_{t-1} + ... + θ_p y_{t-p} + e_t + α_1 e_{t-1} + ... + α_q e_{t-q}

16.26
The Box-Jenkins approach:

1. Identification: determining the values of p, d, and q.
2. Estimation: by linear or nonlinear least squares.
3. Diagnostic Checking: does the model fit well, with no autocorrelation left in the errors?
4. Forecasting: short-term forecasts of future y_t values.

16.27
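A sketch of the estimation and forecasting steps, assuming statsmodels and a hypothetical nonstationary series; identification (the choice of p, d, q) is taken as given here:

```python
# A minimal sketch of Box-Jenkins estimation and forecasting for an
# ARIMA(1,1,1): AR(1) plus MA(1) after one differencing.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(5)
z = np.cumsum(0.1 + rng.standard_normal(300))   # hypothetical trending series

res = ARIMA(z, order=(1, 1, 1)).fit()           # step 2: estimation
print(res.summary())                            # step 3: inspect diagnostics
print(res.forecast(steps=4))                    # step 4: short-term forecasts
```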
Vector Autoregressive (VAR) Models

Use a VAR for two or more interrelated time series:

y_t = θ_0 + θ_1 y_{t-1} + ... + θ_p y_{t-p} + φ_1 x_{t-1} + ... + φ_p x_{t-p} + e_t
x_t = δ_0 + δ_1 y_{t-1} + ... + δ_p y_{t-p} + α_1 x_{t-1} + ... + α_p x_{t-p} + u_t

16.28

Vector Autoregressive (VAR) Models

1. An extension of the AR model.
2. All variables are endogenous.
3. No structural (behavioral) economic model.
4. All variables are jointly determined (over time).
5. No simultaneous equations (same time).

16.29
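A sketch of fitting a two-variable VAR, assuming statsmodels; the data-generating coefficients are hypothetical, and the lag length is chosen by AIC as suggested on slide 16.32:

```python
# A minimal sketch: simulate two interrelated series and fit a VAR by
# least squares, letting AIC choose the lag length.
import numpy as np
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(6)
T = 400
y = np.zeros(T)
x = np.zeros(T)
for t in range(1, T):                    # hypothetical, stable VAR(1) dynamics
    y[t] = 0.5*y[t-1] + 0.2*x[t-1] + rng.standard_normal()
    x[t] = 0.3*y[t-1] + 0.4*x[t-1] + rng.standard_normal()

res = VAR(np.column_stack([y, x])).fit(maxlags=8, ic='aic')
print(res.k_ar)        # the chosen lag length p
print(res.params)      # one least-squares equation per variable
```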
The random error terms in a VAR model may be correlated if they are affected by relevant factors that are not in the model, such as government actions or national/international events. Since the VAR equations all have exactly the same set of explanatory variables, the usual seemingly unrelated regressions (SUR) estimation produces exactly the same estimates as least squares applied to each equation separately.

16.30
Least Squares is Consistent

Consequently, regardless of whether the VAR random error terms are correlated or not, least squares estimation of each equation separately will provide consistent regression coefficient estimates.

16.31
VAR Model Specification

To determine the length of the lag, p, use:

1. Akaike's AIC criterion
2. Schwarz's SIC criterion

These methods were discussed in Chapter 15.

16.32
Spurious Regressions

y_t = β_1 + β_2 x_t + ε_t,   where ε_t = θ_1 ε_{t-1} + ν_t

−1 < θ_1 < 1   I(0) (i.e., d = 0)
θ_1 = 1        I(1) (i.e., d = 1)

If θ_1 = 1, least squares estimates of β_2 may appear highly significant even when the true β_2 is zero.

16.33
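A simulation sketch of the spurious-regression problem, assuming statsmodels: the two random walks are generated independently, so the true β_2 is zero, yet the t-statistic on β̂_2 is typically far beyond conventional critical values:

```python
# A minimal sketch: regress one independent random walk on another
# (theta_1 = 1 in both error processes) and inspect the spurious t-statistic.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
y = np.cumsum(rng.standard_normal(300))   # independent I(1) series
x = np.cumsum(rng.standard_normal(300))   # independent I(1) series

res = sm.OLS(y, sm.add_constant(x)).fit()
print(f"beta2-hat = {res.params[1]:.3f}, t = {res.tvalues[1]:.2f}")
# |t| is typically much larger than 2 even though the true beta2 is zero
```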