I'm trying to figure out why my solutions have significantly larger values at the first time step than at other time steps. I'm not sure if this is an issue, or if fixing it would solve my problem, but I'm wondering whether it's appropriate to use Near for the time indexing in convolve:
return sum([E[ii,:] ⋅ x[Near(t-ll),:] for (ii,ll) in enumerate(lags)])
For times t-ll < t[1], this will just access t[1]. So in my case, where the first year we have data for is 1470yr, the model matrix will assume, when trying to account for the lag at 1470yr, that every preceding year has the same data as 1470yr (at least that's my understanding of what's happening here). Would it make more sense for this to access only times that actually exist?
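As a sanity check of this reading, here is a minimal Python analogue of a Near-style nearest-neighbour lookup (the names `near_index`, `times`, and the toy data are illustrative, not from BLUEs.jl). Any query time before the first stored time clamps to the first sample, so at the first time step every out-of-range lag resolves to the same index:

```python
import numpy as np

def near_index(times, t):
    """Nearest-neighbour lookup, analogous in spirit to DimensionalData's
    Near selector: out-of-range queries clamp to the closest stored time."""
    return int(np.argmin(np.abs(times - t)))

times = np.arange(1470, 1480)   # first year with data: 1470
lags = [0, 1, 2, 3]

# Lagged lookups at the first time step: every t - lag < 1470
# clamps back to 1470, i.e. index 0 is reused once per lag.
t = 1470
idxs = [near_index(times, t - l) for l in lags]
print(idxs)   # [0, 0, 0, 0]

# Away from the boundary the lags resolve to distinct times as intended.
print([near_index(times, 1475 - l) for l in lags])   # [5, 4, 3, 2]
```

So a lagged sum at 1470yr counts the 1470yr value once per out-of-range lag, which is consistent with inflated values at the first time step.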
Okay, new update. The conditional should actually be t - ll >= t1 - 2*unit.(t1). The 2 is there because, for our lagged M matrix, the values at t = 1yr are 0, and we want to actually have values for the first row of our E matrix.
Okay, I think this is my final opinion on this. Except I'm now wondering if this is moot because we shouldn't be subsampling M.
But, for what it's worth, I think the right conditional is
xtime = x.dims[1]
return sum([E[At(ll),:] ⋅ x[At(t-ll),:] for ll in lags if t-ll in xtime])
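A Python analogue of that filtered sum (again with illustrative names, and an At-style exact lookup instead of DimensionalData's selector) skips lags whose shifted time predates the record rather than silently reusing the first sample:

```python
import numpy as np

def exact_index(times, t):
    """Exact lookup, analogous in spirit to the At selector:
    fails if t is not a stored time."""
    hits = np.nonzero(times == t)[0]
    if hits.size == 0:
        raise KeyError(t)
    return int(hits[0])

times = np.arange(1470, 1480)
x = np.arange(1.0, 11.0)        # toy data series
lags = [0, 1, 2, 3]
E = {l: 1.0 for l in lags}      # toy weight per lag

def convolve_at(t):
    # Only include lags whose time t - l actually exists in the record,
    # mirroring the `if t-ll in xtime` filter above.
    return sum(E[l] * x[exact_index(times, t - l)]
               for l in lags if (t - l) in times)

print(convolve_at(1470))   # only lag 0 survives -> x[0] = 1.0
print(convolve_at(1475))   # all lags exist -> x[5]+x[4]+x[3]+x[2] = 18.0
```

One trade-off of this sketch: near the boundary the sum is taken over fewer terms, so if the weights are meant to be normalized, the early time steps may need rescaling rather than just truncation.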
(The call in question is in BLUEs.jl/src/BLUEs.jl, line 605 at commit cabf403.)