How does "time_in_day_feat = self.T_i_D_emb[history_data[:, :, :, num_feat] * 288] " work? #2
Comments
Thanks for your attention. The first dimension is the target feature (e.g., traffic speed in METR-LA); this feature is used in the input embedding layer. You can refer to here and here. You can use more target features in the input embedding process by modifying the code there. The second dimension is the time-of-day feature, and the third dimension is the day-of-week feature. I think you can generate the index and embeddings according to your data.
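To make the channel layout concrete, here is a hedged sketch of a METR-LA-style input tensor, assuming the common three-channel layout [target speed, time-of-day fraction, day-of-week index] (the exact values and names are illustrative, not taken from the repository):

```python
import numpy as np

# Hypothetical input: (batch, time, nodes, channels) = (16, 12, 207, 3),
# with channels = [speed, time-of-day fraction, day-of-week index].
history_data = np.zeros((16, 12, 207, 3))
history_data[..., 0] = 60.0   # target feature (e.g., speed)
history_data[..., 1] = 0.5    # noon -> 0.5 of the day
history_data[..., 2] = 3.0    # a day-of-week index

target = history_data[..., 0]  # channel fed to the input embedding
# Time-of-day fraction scaled to one of 288 five-minute slots:
tod_slot = (history_data[..., 1] * 288).astype(np.int64)
print(target.shape, tod_slot.shape)  # (16, 12, 207) (16, 12, 207)
```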
Thanks for your quick response! I understood the meaning of the input data, but that's not what was bothering me. The key point of my problem is: why does indexing the (288, 10) embedding tensor with a (16, 12, 207) tensor produce a (16, 12, 207, 10) tensor? The code is in model.py line 114. Thanks again!
In
Understood! That's pretty clear!
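For context, the shape change asked about above comes from advanced (integer-array) indexing: when a 2-D table of shape (288, 10) is indexed with an integer tensor of shape (16, 12, 207), each index is replaced by the 10-dimensional row it selects, so the result has shape (16, 12, 207, 10). A minimal sketch in NumPy (PyTorch's advanced indexing follows the same rule):

```python
import numpy as np

# Embedding table: 288 five-minute slots, 10-dim embedding each.
T_i_D_emb = np.random.randn(288, 10)

# Integer index tensor shaped (batch, time, nodes) = (16, 12, 207).
idx = np.random.randint(0, 288, size=(16, 12, 207))

# Advanced indexing: each scalar index selects a full row of the
# table, so the output gains a trailing embedding dimension of 10.
out = T_i_D_emb[idx]
print(out.shape)  # (16, 12, 207, 10)
```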
Dear Shao,
I tried to reproduce your result with my own dataset. One thing that bothered me is that the dimension of history_data in METR-LA is (batch_size, 12, 207, 3), while the dimension of self.T_i_D_emb is (288, 10). How does the code in my question work, where the input data gets transformed to (batch_size, 12, 207, 10)? I created random tensors in the console but failed to obtain the same result. Could you please give me some insight into it? Here is the reproduction code:
error info:
Traceback (most recent call last):
File "/home/trp-mrta/anaconda3/envs/step_env/lib/python3.9/code.py", line 90, in runcode
exec(code, self.locals)
File "", line 1, in
IndexError: index 368 is out of bounds for dimension 0 with size 288
The code written by you runs smoothly without any problem, but the version above fails.
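For future readers, the IndexError above suggests the time-of-day channel was fed as a raw slot or minute index rather than a fraction of the day. Since the model multiplies the feature by 288 before indexing, the feature must already lie in [0, 1); an index like 368 arises when unnormalized values are scaled. A hedged sketch of the expected normalization (variable names are illustrative):

```python
import numpy as np

num_slots = 288  # five-minute slots per day

# Hypothetical raw timestamps as minute-of-day values. Using these
# directly as indices (or scaling them again by 288) overflows the
# (288, 10) table, e.g. index 368 > 287 -> IndexError.
minute_of_day = np.array([0, 61, 368, 1435])

# Normalize to a fraction of the day in [0, 1), as the model expects.
tod_frac = minute_of_day / 1440.0

# Now frac * 288 is a valid slot index in [0, 288).
slot = (tod_frac * num_slots).astype(np.int64)
print(slot)  # [  0  12  73 287]
```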