
Training on top of an existing model #129

Closed
ghost opened this issue Jun 8, 2017 · 8 comments

ghost commented Jun 8, 2017

Hi there,
I am trying to train a new CLSTM model on more than 1,000 text lines, and the full training run would take days. My plan is to train for a couple of hours each day and continue the next day, and so on. I created an arabic-8000.clstm model for testing and added this to the script:

```sh
load=arabic-8000.clstm
start=8000
```

But clstmocrtrain still starts over from 0.
Waiting for your reply

amitdo (Contributor) commented Jun 9, 2017

Please paste the full commands you used to:

  • Train the model
  • Load the model

amitdo (Contributor) commented Jun 9, 2017

Which branch/tag are you using? The option to load a model is only supported in the master branch.
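
If you're not sure, one quick way to check is to look at the checkout you built from. A minimal sketch, assuming your clstm source is a git clone (the `CLSTM_SRC` path is a placeholder; adjust it to where you cloned clstm):

```sh
#!/bin/sh
# The load= option only exists on the master branch of clstm, so check
# which branch/commit your checkout was built from.
CLSTM_SRC=${CLSTM_SRC:-./clstm}   # placeholder: path to your clstm clone
if [ -d "$CLSTM_SRC/.git" ]; then
    git -C "$CLSTM_SRC" rev-parse --abbrev-ref HEAD   # branch name
    git -C "$CLSTM_SRC" log -1 --oneline              # latest commit
else
    echo "no git checkout found at $CLSTM_SRC"
fi
```

If this prints anything other than `master`, rebuilding from the master branch may be needed before `load=` works.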

ghost (Author) commented Jun 9, 2017

@amitdo The training script:

```sh
#!/bin/bash
set -x
set -a   # auto-export all variables so clstmocrtrain can read them

sort -R manifest.txt > /tmp/manifest2.txt
sed 1,100d /tmp/manifest2.txt > train.txt
sed 100q /tmp/manifest2.txt > test.txt

report_every=1000
save_every=1000
maxtrain=50000
target_height=48
dewarp=center
display_every=1000
test_every=1000
hidden=100
lrate=1e-4
save_name=arabic
clstmocrtrain train.txt test.txt
```

And the loading script:

```sh
#!/bin/bash
set -x
set -a   # auto-export all variables so clstmocrtrain can read them

sort -R manifest.txt > /tmp/manifest2.txt
sed 1,100d /tmp/manifest2.txt > train.txt
sed 100q /tmp/manifest2.txt > test.txt

report_every=1000
save_every=1000
maxtrain=50000
target_height=48
dewarp=center
display_every=1000
test_every=1000
hidden=100
lrate=1e-4
save_name=arabic
load=arabic-8000.clstm
start=8000
clstmocrtrain train.txt test.txt
```
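
For reference, the `sort`/`sed` pair above shuffles the manifest and splits it: the first 100 shuffled lines become the test set, the rest the training set. A self-contained illustration with a synthetic manifest (file names are made up; `sort -R` is a GNU coreutils extension):

```sh
#!/bin/sh
# Build a synthetic 500-line manifest, then split it the same way as
# the training script: shuffle, first 100 lines -> test set, the
# remaining 400 -> training set.
seq -f "line-%g.png" 1 500 > manifest.txt
sort -R manifest.txt > /tmp/manifest2.txt
sed 1,100d /tmp/manifest2.txt > train.txt   # drop the first 100 lines
sed 100q /tmp/manifest2.txt > test.txt      # keep only the first 100 lines
wc -l < train.txt   # 400
wc -l < test.txt    # 100
```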

kba (Collaborator) commented Jun 9, 2017

@Christophered Are your "load" and "train" steps the same script?

Also, you can enclose multi-line code in triple backticks (```) in Markdown, like so:

```sh
#!/bin/bash
set -x
set -a

sort -R manifest.txt > /tmp/manifest2.txt
sed 1,100d /tmp/manifest2.txt > train.txt
sed 100q /tmp/manifest2.txt > test.txt

report_every=1000
save_every=1000
maxtrain=50000
target_height=48
dewarp=center
display_every=1000
test_every=1000
hidden=100
lrate=1e-4
save_name=arabic
load=arabic-8000.clstm
start=8000
clstmocrtrain train.txt test.txt
```

ghost (Author) commented Jun 9, 2017

@kba The loading script is identical to the training script except for two extra lines before the final command:

```sh
load=arabic-8000.clstm
start=8000
```

ghost (Author) commented Jun 9, 2017

#129 (comment)
@amitdo, what do you mean? I have the default mainline clstm installed.

ghost (Author) commented Jun 9, 2017

How can I train on top of an existing model, or stop and continue training later?
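
With a master-branch build, the intended pattern is what the scripts above already try: `save_every` writes periodic checkpoints named `save_name-<iteration>.clstm`, and `load=`/`start=` resume from one. A hypothetical helper sketch (the stand-in checkpoint files and all names here are illustrative) that picks the newest checkpoint and derives those two values from its filename:

```sh
#!/bin/sh
# Hypothetical resume helper. With save_name=arabic and save_every=1000,
# clstmocrtrain writes checkpoints such as arabic-1000.clstm,
# arabic-8000.clstm, ... Pick the highest-numbered one and derive the
# load= and start= values from its filename.
touch arabic-1000.clstm arabic-8000.clstm   # stand-ins for earlier runs
latest=$(ls arabic-*.clstm 2>/dev/null | sort -t- -k2 -n | tail -n1)
if [ -n "$latest" ]; then
    load=$latest
    start=${latest#arabic-}   # strip the "arabic-" prefix
    start=${start%.clstm}     # strip the ".clstm" suffix
    echo "load=$load start=$start"
else
    echo "no checkpoint found; training starts from scratch"
fi
```

With the two stand-in files above this selects `arabic-8000.clstm` and sets `start=8000`; exporting `load` and `start` (e.g. under `set -a`) before running `clstmocrtrain` would then resume rather than restart.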

ghost (Author) commented Aug 3, 2017

I was using a separately derived "legacy" clstm version; it doesn't have the save/load options.

@ghost ghost closed this as completed Aug 3, 2017