
Omitting the normalisation step in update_PFSMs has no effect on training tests #127

Open
rolyp opened this issue Oct 15, 2020 · 2 comments
rolyp commented Oct 15, 2020

As discussed, this seems to be either a bug in the training code or a consequence of poorly chosen training data. For the training tests to be useful, they must be sensitive to breaking the training algorithm.
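The kind of sensitivity check described above can be sketched as a mutation-style test: run the update with and without the normalisation step and assert that the results differ. This is a minimal toy sketch, not the actual ptype-dmkd API — the names update_pfsm and trained_probs, the update rule, and the numbers are all illustrative assumptions.

```python
# Hypothetical sketch: update_pfsm and trained_probs are illustrative names,
# not the real ptype-dmkd training code.
import numpy as np

def update_pfsm(probs, counts, normalise=True):
    """Toy update of PFSM emission probabilities from observed counts."""
    updated = probs + 0.1 * counts  # toy unnormalised additive update
    if normalise:
        updated = updated / updated.sum()  # renormalise to a distribution
    return updated

def trained_probs(normalise):
    """Run a few toy training iterations and return the final parameters."""
    probs = np.array([0.5, 0.3, 0.2])
    counts = np.array([10.0, 1.0, 1.0])  # toy training-data counts
    for _ in range(5):
        probs = update_pfsm(probs, counts, normalise=normalise)
    return probs

with_norm = trained_probs(normalise=True)
without_norm = trained_probs(normalise=False)

# Sensitivity check: omitting normalisation must change the trained result.
# If this assertion fails, the training data cannot distinguish the two
# code paths, which is exactly the symptom this issue reports.
assert not np.allclose(with_norm, without_norm)
assert abs(with_norm.sum() - 1.0) < 1e-9  # normalised path is a distribution
```

If the real training tests passed such a check on the toy example but not on the real datasets, that would point to the datasets rather than the algorithm, which matches the discussion below.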

tahaceritli commented Oct 15, 2020

See #129 for the changes. Training looks fine on the toy example; however, we still have an issue when testing on the real datasets. The problem may come from the datasets used; I will investigate.

@tahaceritli
See #143 for the inspection of the test datasets. I have summarized the whole inspection in https://github.com/alan-turing-institute/ptype-dmkd/blob/develop/notebooks/train-model.ipynb.
