
Refactor models so they all follow the same interface #3

Closed
6 tasks done
Michael0x2a opened this issue Apr 19, 2017 · 1 comment
Michael0x2a commented Apr 19, 2017

We'll probably follow approximately the same interface as the character n-gram model.

  • Refactor RNN model
  • Refactor bag-of-words model
  • Refactor character n-gram model
  • Make a base class or ABC so the code typechecks
  • Make sure all models can accept multiple classes?
  • Strip out all dataset-specific code
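A shared interface along these lines could be expressed as an abstract base class. This is only a sketch: the method names (`train`, `predict`, `save`, `restore`) are assumptions for illustration, not the repo's actual API.

```python
from abc import ABC, abstractmethod
from typing import List


class Classifier(ABC):
    """Hypothetical common interface for the RNN, bag-of-words,
    and character n-gram models. Method names are illustrative."""

    @abstractmethod
    def train(self, xs: List[str], ys: List[int]) -> None:
        """Train (or continue training) on the given examples."""

    @abstractmethod
    def predict(self, xs: List[str]) -> List[int]:
        """Return one predicted class label per input."""

    @abstractmethod
    def save(self, path: str) -> None:
        """Persist model parameters to disk."""

    @abstractmethod
    def restore(self, path: str) -> None:
        """Reload previously saved parameters."""
```

With an ABC in place, a type checker (and `TypeError` at instantiation time) catches models that drift from the shared interface.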
briankchan (Collaborator) commented

So, I'm doing something weird in the character n-gram model right now: I create a loss op once when the classifier is created, and then again every time new training starts. IIRC, creating the first op had something to do with getting saving or summaries to work properly, but I'm not entirely sure anymore; I can check whether it's really necessary.
It might make more sense to just pass in the training parameters when creating the classifier, so the training and loss ops only need to be created once.
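The "build everything once in the constructor" idea can be sketched without TensorFlow. The class and attribute names below are stand-ins, not the repo's actual code; the point is just that the loss and training "ops" are constructed exactly once, and `train` only reuses them.

```python
class NGramClassifier:
    """Sketch: training hyperparameters go to the constructor, so the
    loss/training ops are built exactly once rather than per training
    run. All names here are illustrative."""

    def __init__(self, learning_rate: float = 0.001):
        self.learning_rate = learning_rate
        # Build the "graph" once, up front.
        self.loss_op = self._build_loss_op()
        self.train_op = self._build_train_op(self.loss_op)

    def _build_loss_op(self):
        # Stand-in for constructing a real TF loss op.
        return ("loss", id(self))

    def _build_train_op(self, loss_op):
        # Stand-in for e.g. an Adam minimize() op over loss_op.
        return ("adam", self.learning_rate, loss_op)

    def train(self, xs, ys):
        # Reuses the already-built ops; nothing is re-created here,
        # so repeated training runs never duplicate graph nodes.
        return self.train_op
```

Because the ops live on the instance, repeated calls to `train` return the same op object instead of rebuilding it.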

Also, I'm pretty sure that the way things work now, saving and then reloading into a new classifier and training further will reset it to the default loss op; and if a new training op is created for each loss op, I don't think Adam's variables will be restored properly (though that probably only affects training time).
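One way to avoid losing Adam's moment estimates across a save/reload cycle is to checkpoint the optimizer state alongside the model weights. This is a toy stdlib sketch of that idea, not the repo's actual saving code; the update rule and all names are made up for illustration.

```python
import copy


class AdamState:
    """Stand-in for Adam's per-variable slot variables (m, v, step)."""

    def __init__(self):
        self.m = {}
        self.v = {}
        self.step = 0


class CheckpointableModel:
    """Sketch: the checkpoint includes *all* variables -- model weights
    and optimizer slots -- so restoring into a fresh model and training
    further does not reset Adam's accumulated state."""

    def __init__(self):
        self.weights = {"w": 0.0}
        self.adam = AdamState()

    def train_step(self):
        # Toy update that touches both the weights and the optimizer
        # state, mimicking one Adam step.
        self.adam.step += 1
        self.adam.m["w"] = 0.9 * self.adam.m.get("w", 0.0) + 0.1
        self.weights["w"] += self.adam.m["w"]

    def save(self):
        # Checkpoint the optimizer state, not just the weights.
        return copy.deepcopy({
            "weights": self.weights,
            "adam": {"m": self.adam.m,
                     "v": self.adam.v,
                     "step": self.adam.step},
        })

    def restore(self, ckpt):
        self.weights = copy.deepcopy(ckpt["weights"])
        self.adam.m = copy.deepcopy(ckpt["adam"]["m"])
        self.adam.v = copy.deepcopy(ckpt["adam"]["v"])
        self.adam.step = ckpt["adam"]["step"]
```

If only `weights` were saved, the restored model would start with zeroed moment estimates, which is exactly the "Adam variables not restored" situation described above.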
