Hi,
I'm facing a bit of an issue with newer versions of DJL. My project is based on the Q-learning example at https://github.com/kingyuluk/RL-FlappyBird, but I wanted to run it on a later version of DJL. The problem is that simply specifying, for example, version 0.15.0 in the pom makes memory consumption grow out of control. This does not happen with 0.8.0, which that project used originally.
The only changes I made when selecting a newer version (besides the pom) were the `Tracker` declaration on line 92 and the initializer on line 217 of src/main/java/com/kingyu/rlbird/ai/TrainBird.java. So instead of `Tracker exploreRate = new LinearTracker.Builder()...`, I now have `Tracker exploreRate = LinearTracker.builder()...`, and on line 217 I just added `Parameter.Type.WEIGHT` as a second argument to `optInitializer(...)`.
I don't see why those changes would cause any issues, and I can't find anything else that wouldn't be compatible with 0.15.0. I would greatly appreciate it if someone could help me out here so that I'm not stuck with DJL 0.8.0 and PyTorch 1.6.0.
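In case it helps anyone hitting the same migration, here is a sketch of the two call-site changes described above. The builder arguments are elided, as in the original code, and the import package paths are my assumption of where these classes live in DJL 0.15.0:

```java
import ai.djl.nn.Parameter;
import ai.djl.training.tracker.LinearTracker;
import ai.djl.training.tracker.Tracker;

// Line 92 -- DJL 0.8.0 used a public nested builder class:
//   Tracker exploreRate = new LinearTracker.Builder()...
// DJL 0.15.0 exposes a static factory method instead:
Tracker exploreRate = LinearTracker.builder()...

// Line 217 -- optInitializer(...) now also takes which parameter
// type the initializer applies to:
//   .optInitializer(initializer, Parameter.Type.WEIGHT)
```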
Edit: I should probably add that I'm running on Linux, CPU only, with Java 8.
Edit 2: I looked through the history of the TicTacToe RL example in the repo as well, since the FlappyBird example is clearly built off of that, and I still don't get what the problem is.
Edit 3: I think I found the issue. It seems to be some incorrect usage of NDManagers, so I guess some resources weren't being released.
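For anyone else debugging this kind of leak: DJL's `NDManager` is `AutoCloseable`, and the usual fix is to scope a sub-manager to each training step with try-with-resources so that every `NDArray` created during the step is freed when the step ends, instead of accumulating on a long-lived manager. Below is a minimal sketch of that pattern using a stand-in class (`NDManagerStub` is hypothetical, so the example runs without DJL on the classpath; in real code you would use `ai.djl.ndarray.NDManager` and its `newSubManager()`):

```java
import java.util.ArrayList;
import java.util.List;

// Stand-in for DJL's NDManager: closing a manager releases it and
// everything created under it.
class NDManagerStub implements AutoCloseable {
    private final List<NDManagerStub> children = new ArrayList<>();
    private boolean closed;

    // Mirrors NDManager.newSubManager(): child resources are tracked
    // by the parent until they are closed.
    NDManagerStub newSubManager() {
        NDManagerStub child = new NDManagerStub();
        children.add(child);
        return child;
    }

    boolean isClosed() {
        return closed;
    }

    @Override
    public void close() {
        for (NDManagerStub child : children) {
            child.close();
        }
        closed = true;
    }
}

public class TrainingLoopSketch {
    public static void main(String[] args) {
        try (NDManagerStub root = new NDManagerStub()) {
            for (int step = 0; step < 3; step++) {
                // Allocate per-step arrays from `sub`, not from `root`:
                // everything created here is freed when the block exits,
                // so memory stays flat across iterations.
                try (NDManagerStub sub = root.newSubManager()) {
                    // ... build batch arrays from `sub`, run the update ...
                }
            }
        }
    }
}
```

The leak pattern in the original code is the opposite: arrays created from a manager that lives as long as the whole training run are never released until the run ends.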
Thanks,
J