Time series prediction #862
Comments
Hello, as you already pointed out, the model from the Keras tutorial uses a single response for the complete sequence, but the model as implemented right now expects one output for each input, so in your case the data should have one target value per time step. On the other hand, it's fairly easy to replicate the Keras behavior; in fact, it worked before the ANN revamp. I will open an issue, and maybe someone would like to implement the feature; if not, I'll do it myself. I hope this was helpful.
Hi @zoq. Thank you for the response!
By "state" I mean this, so that I can do something like online training/prediction with varying sequence lengths.
The implemented LSTM layer can handle data points of variable length, but since the cell states are reset at each sequence, it is stateless in the Keras sense. In a stateful model you have to specify the batch size in the LSTM layer, which isn't implemented at the moment. I agree that a simple regression interface would be great; especially for time series forecasting, a stateful model would be super helpful. I can open an issue for that if you like.
Sure, thank you very much!
Sorry for the slow response; I agree it's sometimes difficult to use all these libraries. If you tell us more about what you'd like to achieve, maybe we can push forward in that direction. And it's just an idea, but I think improving the recurrent neural network infrastructure could be an interesting GSoC project. Let me know what you think.
My main task right now is to build and train a model that predicts a multivariate time series step by step: I have a continuously running process that generates data, and I want to predict some parameters of this process several steps ahead. A stateless LSTM implementation can be used for this kind of task, but I don't want to pass 50 (or so) previous data slices to predict the next value at every step; that's why I need a stateful LSTM, so I can pass only the new data and read the network output at each new time step.
Thanks for these valuable notes; I will definitely take a closer look at how we could design a regression API on top of the recurrent network class. Regarding GPU acceleration, you can link against NVIDIA NVBLAS, a GPU-accelerated implementation of BLAS that can accelerate most BLAS Level-3 routines. However, I agree it would be nice to have a fully GPU-accelerated interface that also includes the convolution operator, which I guess will come in the near future.
Closing for inactivity.
Hello guys.
I'm trying to reproduce this simple Keras tutorial (only the "sinwave" part).
So I have the following model:
The main issue is in the training phase. I would like to predict the next value after feeding some number (50) of previous ones. The model.Train(...) function takes two matrices, inputs and expected outputs, and it throws "error: Mat::rows(): indices out of bounds or incorrectly used" if these matrices are not both of shape (rho, sequences_count). In the Keras tutorial they have only one output value per input sequence. So the main question: how can I reproduce the Keras example with mlpack? I can PR this as an example, a test, or something, if you help me a bit.
Thank you a lot!