Linearization around the steady state is a very important concept in the Growth Model.
On this page, we try to answer two questions:
- Can we analyze the speed of convergence?
- How do we analyze stability when the state is two- or higher-dimensional? In that case we can no longer draw the phase diagram.
Suppose we have already arrived at the steady state $x^*$. Let $x(t) \in \mathbb{R}^n$ and the function $F: \mathbb{R}^n \to \mathbb{R}^n$ define a dynamic system:
$$\dot{x}(t) = F(x(t))$$
If the system is at the steady state, it is obvious that $F(x^*) = 0$.
Consider a first-order approximation of $F$ around $x^*$ (see Taylor expansion):
$$\dot{x}(t) \approx F(x^*) + \nabla F(x^*)\,(x(t) - x^*) = A\,(x(t) - x^*)$$
where $A = \nabla F(x^*)$ is the Jacobian of $F$ evaluated at $x^*$. We then define the gap of $x$: $\hat{x}(t) = x(t) - x^*$. In the traditional Growth Model, we could have:
$$\dot{c}(t) = \frac{1}{\sigma}\big(f'(k(t)) - \rho\big)c(t), \qquad \dot{k}(t) = f(k(t)) - c(t)$$
The Jacobian of $F$ is $\nabla F$; in this example, evaluated at the steady state (where $f'(k^*) = \rho$ and $c^* = f(k^*)$), $A$ is
$$A = \begin{bmatrix} 0 & \frac{1}{\sigma} f''(k^*)c^* \\ -1 & \rho \end{bmatrix}$$
We have
$$\dot{\hat{x}}(t) = A\hat{x}(t),$$
which simply describes how $x$ would evolve around the steady state if the economy deviates from it.
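As a sanity check, the linearization can be verified numerically. The sketch below uses an illustrative parametrization that is not from the notes: Cobb–Douglas technology $f(k) = k^\alpha$ with $\alpha = 0.3$, $\sigma = 1$, and $\rho = 0.04$. It compares the Jacobian formula for $A$ with a finite-difference Jacobian of $F$ at the steady state.

```python
import numpy as np

# Illustrative parametrization (assumed, not from the notes):
# f(k) = k^alpha, sigma = 1 (log utility), rho = 0.04, alpha = 0.3.
alpha, sigma, rho = 0.3, 1.0, 0.04

f = lambda k: k**alpha
fp = lambda k: alpha * k**(alpha - 1)                  # f'(k)
fpp = lambda k: alpha * (alpha - 1) * k**(alpha - 2)   # f''(k)

# Steady state: f'(k*) = rho  =>  k* = (alpha/rho)^(1/(1-alpha)),  c* = f(k*)
k_star = (alpha / rho)**(1 / (1 - alpha))
c_star = f(k_star)

def F(x):
    """The dynamic system x_dot = F(x) with x = (c, k)."""
    c, k = x
    return np.array([(fp(k) - rho) * c / sigma, f(k) - c])

# Jacobian at the steady state, from the formula in the text
A = np.array([[0.0, fpp(k_star) * c_star / sigma],
              [-1.0, rho]])

# Check: a central finite-difference Jacobian of F at x* matches A
eps = 1e-6
x_star = np.array([c_star, k_star])
num_J = np.column_stack([(F(x_star + eps * e) - F(x_star - eps * e)) / (2 * eps)
                         for e in np.eye(2)])
print(np.allclose(num_J, A, atol=1e-4))  # True
```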
To solve this system, we need to find the eigenvalues and eigenvectors of matrix $A$. From Acemoglu: consider $\dot{x}(t) = Ax(t)$ with initial value $x(0)$, and let $A$ be an $n \times n$ matrix. Suppose that $m \le n$ of the eigenvalues of $A$ have negative real parts. Then there exists an $m$-dimensional subspace $M$ of $\mathbb{R}^n$ such that, starting from any $x(0) \in M$, we have $x(t) \to 0$ as $t \to \infty$. If all eigenvalues of $A$ have negative real parts, then $x(t) \to 0$ from any $x(0) \in \mathbb{R}^n$.
Simply put, for our two-dimensional system with one predetermined variable ($k$):
- if $m = 1$, the system is "saddle-path stable": there is a unique optimal trajectory, and the negative eigenvalue governs the speed of convergence.
- if $m = 0$, the system is "unstable": there are infinitely many trajectories diverging from the steady state, and $x(t)$ does not converge to the steady state.
- if $m = 2$, there are multiple optimal trajectories.
We skip the proof of this theorem because it is unlikely to appear in the exams.
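The classification above can be sketched in code. The helper below is our own illustration (not from Acemoglu): it counts the eigenvalues of $A$ with negative real parts and, for a two-dimensional system with one predetermined variable, maps $m$ to the three cases; the matrices are made-up examples with the required sign patterns.

```python
import numpy as np

# Illustrative helper (not from the notes): classify a 2x2 linearized system
# with one predetermined variable by counting eigenvalues with negative
# real parts (the m in the theorem above).
def classify(A):
    m = int(np.sum(np.linalg.eigvals(A).real < 0))
    return {1: "saddle-path stable",
            0: "unstable",
            2: "multiple trajectories"}[m]

# det < 0 => eigenvalues of opposite signs => m = 1
print(classify(np.array([[0.0, -0.5], [-1.0, 0.04]])))  # saddle-path stable
print(classify(np.array([[1.0, 0.0], [0.0, 2.0]])))     # unstable
print(classify(np.array([[-1.0, 0.0], [0.0, -2.0]])))   # multiple trajectories
```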
Linearized Growth Model
We continue using the traditional Growth Model as an example. The linearized dynamic system is:
$$\begin{bmatrix} \dot{\hat{c}}(t) \\ \dot{\hat{k}}(t) \end{bmatrix} = \begin{bmatrix} 0 & \frac{1}{\sigma} f''(k^*)c^* \\ -1 & \rho \end{bmatrix} \begin{bmatrix} \hat{c}(t) \\ \hat{k}(t) \end{bmatrix}$$
We now introduce how to find the eigenvalues of matrix $A$.
Eigenvalues and Eigenvectors
An eigenvalue $\lambda$ should satisfy:
$$\det(A - \lambda I) = 0$$
where $I$ is the identity matrix. Thus we have:
$$A - \lambda I = \begin{bmatrix} -\lambda & \frac{1}{\sigma} f''(k^*)c^* \\ -1 & \rho - \lambda \end{bmatrix}$$
Then calculate the determinant:
$$\det(A - \lambda I) = -\lambda(\rho - \lambda) + \frac{1}{\sigma} f''(k^*)c^*$$
Thus we have the eigenvalue equation:
$$\lambda^2 - \rho\lambda + \frac{1}{\sigma} f''(k^*)c^* = 0$$
A simple calculation leads to:
$$\lambda = \frac{\rho \pm \sqrt{\rho^2 - \frac{4}{\sigma} f''(k^*)c^*}}{2}$$
Since $f''(k^*) < 0$, the product of the eigenvalues is $\lambda_1\lambda_2 = \frac{1}{\sigma} f''(k^*)c^* < 0$, thus the two eigenvalues have opposite signs. We let $\lambda_1 < 0 < \lambda_2$; then we have $m = 1$, so the system is saddle-path stable.
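To make the saddle-path result concrete, here is a small numerical sketch with assumed values $\sigma = 1$, $\rho = 0.04$, and $f''(k^*)c^* = -0.05$ (not from the notes). It confirms that the eigenvalues of $A$ have opposite signs and match the quadratic formula above.

```python
import numpy as np

# Assumed illustrative numbers (not from the notes): sigma = 1, rho = 0.04,
# and f''(k*) c* = -0.05, preserving the sign pattern derived in the text.
sigma, rho = 1.0, 0.04
fpp_c = -0.05  # f''(k*) * c*  < 0

A = np.array([[0.0, fpp_c / sigma],
              [-1.0, rho]])

eigs = np.sort(np.linalg.eigvals(A).real)
lam1, lam2 = eigs  # lam1 < 0 < lam2: opposite signs, saddle-path stable
print(lam1, lam2)

# Agreement with lambda = (rho +- sqrt(rho^2 - 4*fpp_c/sigma)) / 2
disc = np.sqrt(rho**2 - 4 * fpp_c / sigma)
print(np.allclose(eigs, [(rho - disc) / 2, (rho + disc) / 2]))  # True
```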
Now we introduce how to solve the matrix differential equation.
Solution to Matrix Differential Equation
We try to solve the following equation:
$$\dot{\hat{x}}(t) = A\hat{x}(t)$$
The intuition is that we diagonalize matrix $A$ to decouple the system, which gives us $n$ independent differential equations.

Step 1: We diagonalize matrix $A$.

Suppose $A$ has $n$ linearly independent eigenvectors; we denote them by $v_1, \dots, v_n$ and construct matrix $P$ and the diagonal matrix $\Lambda$ as:
$$P = \begin{bmatrix} v_1 & v_2 & \cdots & v_n \end{bmatrix}, \qquad \Lambda = \begin{bmatrix} \lambda_1 & & \\ & \ddots & \\ & & \lambda_n \end{bmatrix}$$
By definition, we have $Av_i = \lambda_i v_i$, thus $AP = P\Lambda$, i.e. $A = P\Lambda P^{-1}$.
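A quick numerical sketch of this step (the matrix entries are illustrative, chosen only to match the sign pattern of $A$): `np.linalg.eig` returns the eigenvalues and an eigenvector matrix $P$, and we verify both $AP = P\Lambda$ and $A = P\Lambda P^{-1}$.

```python
import numpy as np

# Decoupling via diagonalization: with A = P Lam P^{-1}, the change of
# variables z = P^{-1} x_hat turns x_hat_dot = A x_hat into z_dot = Lam z,
# i.e. n independent scalar ODEs. Matrix values below are illustrative.
A = np.array([[0.0, -0.05],
              [-1.0, 0.04]])

lam, P = np.linalg.eig(A)   # columns of P are the eigenvectors v_i
Lam = np.diag(lam)

print(np.allclose(A @ P, P @ Lam))                 # True: A v_i = lambda_i v_i
print(np.allclose(A, P @ Lam @ np.linalg.inv(P)))  # True: A = P Lam P^{-1}
```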
For example, recall that
$$A = \begin{bmatrix} 0 & \frac{1}{\sigma} f''(k^*)c^* \\ -1 & \rho \end{bmatrix}$$
We suppose the eigenvalues are $\lambda_1$ and $\lambda_2$, and the eigenvectors are:
$$v_1 = \begin{bmatrix} v_{11} \\ v_{12} \end{bmatrix}, \quad v_2 = \begin{bmatrix} v_{21} \\ v_{22} \end{bmatrix}$$
Then:
$$P = \begin{bmatrix} v_{11} & v_{21} \\ v_{12} & v_{22} \end{bmatrix}, \quad \Lambda = \begin{bmatrix} \lambda_1 & 0 \\ 0 & \lambda_2 \end{bmatrix}$$
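Once $A$ is diagonalized, the decoupled system yields the closed-form solution $\hat{x}(t) = P e^{\Lambda t} P^{-1}\hat{x}(0)$, where $e^{\Lambda t}$ is diagonal. A minimal sketch, with illustrative values for $A$ and $\hat{x}(0)$, checks this formula against a crude forward-Euler integration of $\dot{\hat{x}} = A\hat{x}$:

```python
import numpy as np

# Closed-form solution x_hat(t) = P exp(Lam t) P^{-1} x_hat(0).
# A and x0 are illustrative numbers, not from the notes.
A = np.array([[0.0, -0.05],
              [-1.0, 0.04]])
x0 = np.array([0.1, -0.2])

lam, P = np.linalg.eig(A)
Pinv = np.linalg.inv(P)

def x_hat(t):
    # exp(Lam t) is diagonal, so it is just an elementwise exponential
    return P @ np.diag(np.exp(lam * t)) @ Pinv @ x0

# Sanity check against a crude forward-Euler integration of x_dot = A x
x, dt = x0.copy(), 1e-4
for _ in range(int(1.0 / dt)):
    x = x + dt * (A @ x)
print(np.allclose(x, x_hat(1.0), atol=1e-3))  # True
```

Note that only the component of $\hat{x}(0)$ along the stable eigenvector (the one with $\lambda_1 < 0$) decays; this is why, under saddle-path stability, the initial control must be chosen to place the economy exactly on that stable arm.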