Same nonsense output for any input when inference #12

Closed
lamhoangtung opened this issue Oct 4, 2018 · 1 comment

Comments

@lamhoangtung
  • I modified your model with more CNN layers and MDLSTM instead of the basic LSTM; training and validation worked perfectly for me.
  • But when I run inference on a single input image, I get the same output for every input. When I feed different images into the inference batch, it works fine.
  • Filling the whole batch with the same image, or with all-white or all-black images, also produces the same output for any input.

Have you ever encountered this weird problem? What could I be missing here? Thanks.

@githubharald
Owner

githubharald commented Oct 4, 2018

This is on purpose.
A batch has a fixed number of elements, so for single-image inference I repeat that image to fill the batch here:
https://github.com/githubharald/SimpleHTR/blob/master/src/main.py#L91
You have to change the code there if you want a different behaviour.
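To illustrate the behaviour described above: a minimal sketch of how a fixed-size inference batch can be filled by repeating one image. The helper name `make_inference_batch` and the image shape are assumptions for illustration, not SimpleHTR's actual code.

```python
import numpy as np

def make_inference_batch(img, batch_size=50):
    # Repeat the single preprocessed image so the batch has a fixed
    # number of elements; every element is therefore identical,
    # which is why each batch slot yields the same recognized text.
    return np.stack([img] * batch_size)

# example: a dummy 32x128 grayscale image, small batch for demonstration
img = np.zeros((32, 128), dtype=np.float32)
batch = make_inference_batch(img, batch_size=4)
print(batch.shape)  # (4, 32, 128)
```

Feeding such a batch to the recognizer naturally returns the same output for all batch elements; only the first result needs to be read.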
