code run error #11
Comments
I have the same problem.
@tubu @soloman817 @4575759ww Copy the data from https://drive.google.com/file/d/0B6ZrYxEMNGR-MEd5Ti0tTEJjMTQ/view and https://drive.google.com/file/d/0B6ZrYxEMNGR-Q0YwWWVpVnJ3YmM/view?usp=sharing into the tensor-reinforcement directory.
@deependersingla I downloaded the two files and copied the data into the tensor-reinforcement directory, but I have the same problem.
@deependersingla, I also have the two .pkl files in the directory, yet the error still shows.
I have the same problem; were you able to work through it? @deependersingla
I also have the problem, although I downloaded the two .pkl files and copied them to the directory.
When I run python dqn_model.py in tensor-reinforcement, I get an error (I have created the saved_networks directory, and data.pkl and data_dict.pkl are in tensor-reinforcement):
Traceback (most recent call last):
File "/Users/nicefilm/Documents/code/python/deep-trader/tensor-reinforcement/dqn_model.py", line 11, in <module>
from train_stock import *
File "/Users/nicefilm/Documents/code/python/deep-trader/tensor-reinforcement/train_stock.py", line 23, in <module>
supervised_y_data = episodic_data.make_supervised_data(data, data_dict)
File "/Users/nicefilm/Documents/code/python/deep-trader/tensor-reinforcement/episodic_data.py", line 100, in make_supervised_data
supervised_data.append(episode_supervised_data(episode, data_dict))
File "/Users/nicefilm/Documents/code/python/deep-trader/tensor-reinforcement/episodic_data.py", line 87, in episode_supervised_data
prices.append(data_average_price(data_dict, iteration))
File "/Users/nicefilm/Documents/code/python/deep-trader/tensor-reinforcement/episodic_data.py", line 92, in data_average_price
data = data_dict[list_md5_string_value(data)]
KeyError: '297956b2300474fda50a2a6b1d41a714'
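The KeyError means the code md5-hashes a data window and uses the digest as a lookup key into data_dict, and that digest is not among the keys in the downloaded data_dict.pkl. A likely cause is that the two pickles were generated with a different data version or Python version than the one you run, so the string form of the list (and therefore its hash) differs. The sketch below is a hypothetical re-implementation of a helper like the repo's `list_md5_string_value` (its exact body is an assumption, not taken from the repo) showing how fragile such hash keys are:

```python
import hashlib

def list_md5_string_value(lst):
    # Assumed behavior of the repo's helper: hash the list's str() form.
    # Any change in how the list prints (e.g. 1 vs 1.0, float precision,
    # Python 2 vs 3 repr differences) produces a different digest.
    return hashlib.md5(str(lst).encode("utf-8")).hexdigest()

# The digest depends on the exact textual representation of the data.
key = list_md5_string_value([1.0, 2.0])
lookup = {key: "average price"}

print(key in lookup)                           # same representation: found
print(list_md5_string_value([1, 2]) in lookup)  # '[1, 2]' != '[1.0, 2.0]': KeyError territory
```

A practical check along these lines: load data.pkl and data_dict.pkl yourself, hash one row with the project's own `list_md5_string_value`, and see whether the digest appears in `data_dict.keys()`. If it does not, the data files and the code that reads them are out of sync, and regenerating the pickles with your local environment (or using the exact Python version the author used) should resolve the mismatch.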