prediction problems #39
Hi, I understand the problem you are trying to solve. That being said, it doesn't mean that the Transformer can't be adapted for prediction problems, but it will require some modifications.
Thanks for your answer. However, your repo is not applicable to prediction problems as-is, so how should I modify the model to apply it to my problem? I have replaced the embedding layer with a Linear layer, and I have also changed the output layer to a sigmoid layer. What else should I do to apply the Transformer model to my problem?
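A minimal sketch of the two substitutions described above, assuming a PyTorch-style model (the names `input_proj` and `output_proj` are hypothetical, not from this repo):

```python
import torch
import torch.nn as nn

# Hypothetical sketch: for a real-valued series, the token-embedding lookup
# is replaced by a linear projection into the model dimension, and the
# output head projects back to one value per time step through a sigmoid.
d_model = 64
input_proj = nn.Linear(1, d_model)    # replaces nn.Embedding for continuous inputs
output_proj = nn.Linear(d_model, 1)   # replaces the vocabulary softmax head

x = torch.randn(32, 168, 1)           # [batch_size, seq_len, features]
h = input_proj(x)                     # [32, 168, 64] -- fed to the encoder
y = torch.sigmoid(output_proj(h))     # [32, 168, 1], values in (0, 1)
print(y.shape)
```

Note that the sigmoid bounds outputs to (0, 1), so load values would need to be scaled into that range (or the sigmoid dropped for unbounded regression).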
As I said, in order to adapt this repo to your problem you could start by:
Hi, I have read something about #5, but I am still confused about the applicability to prediction problems. I am now working on applying the Transformer to load forecasting: I want to use the load values of the previous 168 time steps to predict the load values of the next 24 time steps. So I set the input shape (x) to [batch_size, 168, 1], which is the input of the encoder layer, and the target shape (y) to [batch_size, 24, 1], which is part of the input of the decoder layer. Obviously, this does not work in your code, because K is mismatched (168 != 24). The output shape is still [batch_size, 168, 1] instead of [batch_size, 24, 1], which is what I want. So I would like to know whether the original Transformer, or your Transformer, can be applied to my problem.
Also, why is K the same in the encoder layer and the decoder layer in your Transformer? I have looked at other code where the sequence length (K) can be different in the encoder and decoder layers.
Thank you very much!
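For reference, the standard Transformer does allow different encoder and decoder sequence lengths: in cross-attention the decoder provides the queries and the encoder provides the keys/values, so only `d_model` must match, and the output length follows the decoder input length. A hedged sketch using PyTorch's stock `nn.Transformer` (not this repo's code; the projections are hypothetical helpers for a 1-D series):

```python
import torch
import torch.nn as nn

# Sketch: stock nn.Transformer with a 168-step source and a 24-step target.
# Cross-attention attends decoder queries over encoder keys/values, so the
# mismatched lengths (168 vs 24) are fine; output length tracks the target.
d_model = 64
model = nn.Transformer(d_model=d_model, nhead=4,
                       num_encoder_layers=2, num_decoder_layers=2,
                       batch_first=True)
input_proj = nn.Linear(1, d_model)    # hypothetical projection for scalar loads
output_proj = nn.Linear(d_model, 1)

src = torch.randn(8, 168, 1)          # past 168 load values
tgt = torch.randn(8, 24, 1)           # next 24 values (teacher forcing)
out = model(input_proj(src), input_proj(tgt))
pred = output_proj(out)
print(pred.shape)                     # torch.Size([8, 24, 1])
```

So the shape [batch_size, 24, 1] asked about above falls out directly once the decoder input has length 24; an implementation that forces K to be equal in both layers is a restriction of that particular codebase, not of the architecture.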