Learn Before
Concept

Structural Vector Autoregressive (VAR) Analysis

Let us consider a vector $Y_t$ of $k$ time series variables. For example, $Y_t = (x_t, y_t)'$, in which case $k = 2$. We assume that $Y_t$ follows a stochastic process that can be well approximated by a linear VAR process of the form

$$Y_t = \mu + A_1 Y_{t-1} + \cdots + A_p Y_{t-p} + u_t \qquad (1)$$

where $\mu$ is a $k \times 1$ vector of constants, each $A_i$ ($i = 1, \ldots, p$) is a $k \times k$ coefficient matrix, and $u_t$ is a $k \times 1$ vector of white noise, whose elements are referred to as reduced-form residuals.
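A minimal sketch of generating data from a process of the form (1), here a bivariate VAR(1); the intercept, coefficient matrix, and sample size are illustrative assumptions, not values from the text:

```python
import numpy as np

# Simulate a bivariate VAR(1): Y_t = mu + A1 Y_{t-1} + u_t
rng = np.random.default_rng(0)
k, T = 2, 500
mu = np.array([0.5, 1.0])            # assumed intercept vector
A1 = np.array([[0.5, 0.1],
               [0.2, 0.3]])          # assumed coefficients; eigenvalues inside unit circle
Y = np.zeros((T, k))
for t in range(1, T):
    u_t = rng.normal(size=k)         # reduced-form residuals (white noise)
    Y[t] = mu + A1 @ Y[t - 1] + u_t
```

The stability condition (all eigenvalues of $A_1$ inside the unit circle) keeps the simulated series from exploding.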

Each element of $u_t$ is in turn assumed to be a linear combination of latent structural shocks $\varepsilon_{1t}, \ldots, \varepsilon_{kt}$, which are the sources of variation of the system. A usual assumption is that $\varepsilon_{1t}, \ldots, \varepsilon_{kt}$ are mutually independent, although orthogonality is sufficient in many applications. Thus we have

$$u_t = B \varepsilon_t \qquad (2)$$

where $B$ is a $k \times k$ invertible matrix (the impact or mixing matrix) and $\varepsilon_t = (\varepsilon_{1t}, \ldots, \varepsilon_{kt})'$ is a vector of independent shocks. Let $W = B^{-1}$. Then, premultiplying (1) by $W$, we get the structural VAR form

$$W Y_t = \mu' + \Gamma_1 Y_{t-1} + \cdots + \Gamma_p Y_{t-p} + \varepsilon_t \qquad (3)$$

where $\mu' = W\mu$ and $\Gamma_i = W A_i$ for $i = 1, \ldots, p$.
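Equation (2) and its inversion can be illustrated numerically; the particular $B$ below is an assumed example, not one from the text:

```python
import numpy as np

# Assumed impact matrix B; W = B^{-1} unmixes residuals into shocks
B = np.array([[1.0, 0.0],
              [0.5, 1.0]])
W = np.linalg.inv(B)

eps = np.array([0.3, -1.2])   # structural shocks epsilon_t
u = B @ eps                   # reduced-form residuals, Eq. (2)
eps_back = W @ u              # applying W recovers the shocks exactly
```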

The idea of VAR analysis is to follow a two-step procedure:

  • First, Eq. (1) is estimated through standard regression methods to obtain an estimate of the reduced-form residuals $u_t$.
  • Second, the parameters of Eq. (3) can be recovered by analyzing the relationships among the elements of $u_t$. Notice that, having estimated (1), knowing $B$ is sufficient for identifying (3).
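The two-step procedure can be sketched as follows. The lower-triangular $B$ and the Cholesky factorization of the residual covariance are assumptions of this example (one common recursive identification scheme); the text itself does not specify how $B$ is pinned down:

```python
import numpy as np

rng = np.random.default_rng(1)
k, T = 2, 1000

# Generate data from a known structural VAR(1) (illustrative values)
B_true = np.array([[1.0, 0.0],
                   [0.5, 1.0]])      # assumed lower-triangular impact matrix
A1 = np.array([[0.4, 0.1],
               [0.1, 0.3]])
Y = np.zeros((T, k))
for t in range(1, T):
    eps = rng.normal(size=k)         # independent structural shocks
    Y[t] = A1 @ Y[t - 1] + B_true @ eps

# Step 1: estimate the reduced form (1) by OLS, one equation at a time
X = np.column_stack([np.ones(T - 1), Y[:-1]])      # constant + Y_{t-1}
coef, *_ = np.linalg.lstsq(X, Y[1:], rcond=None)
u_hat = Y[1:] - X @ coef                           # reduced-form residuals

# Step 2: recover B from Cov(u) = B B' via a Cholesky factorization
# (this imposes a recursive ordering on the shocks)
Sigma_u = u_hat.T @ u_hat / (T - 1 - X.shape[1])
B_hat = np.linalg.cholesky(Sigma_u)
eps_hat = u_hat @ np.linalg.inv(B_hat).T           # estimated structural shocks
```

With unit-variance shocks, $\mathrm{Cov}(u_t) = BB'$, so the Cholesky factor of the estimated residual covariance recovers the assumed lower-triangular $B$ up to sampling error.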


Updated 2020-07-30

Tags

Data Science