
Investigate problem with schur_state_space_transformation #1312

Closed
JohannesPfeifer opened this Issue Oct 11, 2016 · 7 comments

@JohannesPfeifer (Contributor) commented Oct 11, 2016

There is a problem with the initialization of the diffuse filter and smoother via schur_statespace_transformation.m. The mod-file Local_trend_3.mod (sent via email on 11/10/16) by a user contains three independent local trend models as in Durbin/Koopman (2012). Because the three processes are independent, all Kalman filter matrices should be block diagonal. When using the Harvey initialization of the covariance Pinf, this is exactly what happens and the results make sense. But when using the diffuse filter, in dsge_likelihood.m, after

    % Schur-based transformation of the state space used to initialize the diffuse filter;
    % QT is the transformation matrix returned by the decomposition
    [Ztmp,Ttmp,Rtmp,QT,Pstar,Pinf] = schur_statespace_transformation(Z,T,R,Q,DynareOptions.qz_criterium,[1:length(T)]);
    % rotate the diffuse and stationary initial covariances with QT
    Pinf  = QT*Pinf*QT';
    Pstar = QT*Pstar*QT';

Pinf is not block diagonal anymore. Rather, there are significant off-diagonal entries. This introduces a correlation between the independent local trend models that is clearly visible in the last graph: all observed trending variables are perfectly matched by the smoother, but the different trends interact in a way that is incompatible with the initial structure. Thus, there seems to be a problem with schur_statespace_transformation.m: either there is a bug or there are numerical problems.

Weirdly, the same problem is not present when I run just two local trend models (Local_trend_2.mod). Here, Pinf is again block diagonal. So something happens when going from two to three local trend models.
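
As a side note, a quick way to see this numerically is to print the norms of the off-diagonal blocks of Pinf. The following is a minimal sketch, not Dynare code; it assumes the states of each local trend model occupy consecutive positions and uses a hypothetical cell array blocks of state indices, one entry per trend model:

    % Minimal sketch (not Dynare code): report the norms of the off-diagonal
    % blocks of Pinf across the independent local trend models.
    % "blocks" is a hypothetical cell array of state-index vectors, e.g.:
    blocks = {1:2, 3:4, 5:6};
    for i = 1:numel(blocks)
        for j = 1:numel(blocks)
            if i ~= j
                fprintf('||Pinf(block %d, block %d)|| = %g\n', i, j, ...
                        norm(Pinf(blocks{i}, blocks{j})));
            end
        end
    end

With independent local trend models, all reported norms should be numerically zero.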

@JohannesPfeifer (Contributor, Author) commented Oct 12, 2016

@MichelJuillard writes:

The Schur transformation modifies the state space so as to separate trends from stationary variables. The trends in the transformed model are not necessarily the original trends of the model; it is sufficient if they span the same space.

This seems fine for dsge_likelihood. But for DsgeSmoother, we seem to be missing a last step in which the smoothed variables computed with the transformed model are transformed back into the smoothed variables of the original model. We need to multiply by the transpose of the projector of the Schur decomposition (this needs to be checked).
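
For illustration, a minimal sketch of this back-transformation step, assuming QT is the transformation matrix returned by schur_statespace_transformation and atilde_smoothed/Vtilde_smoothed hold the smoothed states and covariances of the transformed model (both names are illustrative, not existing Dynare variables):

    % Minimal sketch (not the actual DsgeSmoother code): map the smoothed output
    % of the transformed model back into the original state space. Following the
    % convention Pinf = QT*Pinf*QT' used in dsge_likelihood.m, the original-model
    % quantities would be recovered by applying QT; whether QT or QT' is the
    % right factor is exactly what "needs to be checked".
    alpha_smoothed = QT * atilde_smoothed;          % smoothed states, original coordinates
    V_smoothed     = QT * Vtilde_smoothed * QT';    % smoothed covariances, original coordinates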

@JohannesPfeifer (Contributor, Author) commented Oct 12, 2016

@MichelJuillard Do we have a reference for what we are doing there? I don't feel comfortable with that part of the code, so someone else would need to take this ticket.

@rattoma (Member) commented Oct 12, 2016

@JohannesPfeifer @MichelJuillard I can look into this, but not very quickly.

@JohannesPfeifer (Contributor, Author) commented Nov 20, 2016

@MichelJuillard Did you fix this with your recent commits 2f9dc09 and 897af97? The mod-file in question seems to be working now.

@MichelJuillard (Member) commented Nov 20, 2016

Yes @JohannesPfeifer, that was the idea. I think it is fixed. I still want to add a test case inspired by the *.mod file and refactor the code.

@JohannesPfeifer (Contributor, Author) commented Nov 20, 2016

@MichelJuillard Perfect. Thanks. But then you should assign yourself the ticket.

@houtanb assigned MichelJuillard and unassigned rattoma Nov 21, 2016

@JohannesPfeifer (Contributor, Author) commented Nov 28, 2016

Closed by 80eeee6

rattoma pushed a commit to rattoma/dynare that referenced this issue Nov 16, 2017

Marco Ratto: fix to the Kitagawa transformation that allows reducing the computing time of the likelihood in large models with a lot of static variables by 30-50%.

This fixes the bug in e97e5c3 that led to 2f9dc09.
It is tested and completely fixes the issue highlighted in DynareTeam#1312.

stepan-a added a commit that referenced this issue Jan 26, 2018

fix to the Kitagawa transformation that allows reducing the computing time of the likelihood in large models with a lot of static variables by 30-50%.

This fixes the bug in e97e5c3 that led to 2f9dc09.
It is tested and completely fixes the issue highlighted in #1312.

stepan-a added a commit that referenced this issue Jan 26, 2018

fix to the Kitagawa transformation that allows reducing the computing time of the likelihood in large models with a lot of static variables by 30-50%.

This fixes the bug in e97e5c3 that led to 2f9dc09.
It is tested and completely fixes the issue highlighted in #1312.

(cherry picked from commit cf8213f)

stepan-a added a commit to stepan-a/dynare that referenced this issue Feb 5, 2018

fix to the Kitagawa transformation that allows reducing the computing time of the likelihood in large models with a lot of static variables by 30-50%.

This fixes the bug in e97e5c3 that led to 2f9dc09.
It is tested and completely fixes the issue highlighted in DynareTeam#1312.

tholden pushed a commit to tholden/dynare that referenced this issue Aug 20, 2018

fix to the Kitagawa transformation that allows reducing the computing time of the likelihood in large models with a lot of static variables by 30-50%.

This fixes the bug in 9b6a640 that led to 7447fb9.
It is tested and completely fixes the issue highlighted in DynareTeam#1312.

(cherry picked from commit 7fbed6f)