
weird max_training_processes parameter behaviour #1442

Closed · frascuchon opened this issue Oct 2, 2018 · 5 comments · Fixed by #1449

Labels: type:bug 🐛 Inconsistencies or issues which will cause an issue or problem for users or implementors.

Comments

@frascuchon commented Oct 2, 2018

Rasa NLU version: 0.13.5

Operating system (windows, osx, ...): Ubuntu Server 16.04

Issue:
When setting the argument max_training_processes: 1, the server denies training more than one model for a project, returning the error:

{
  "error": "The server can't train more models right now!"
}

But training a model for a new project is still possible.
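
To make the reproduction concrete, here is a rough sketch of the requests involved. The /train endpoint and its project query parameter follow the 0.13 HTTP API as I understand it; the host, port, project names, payload file and content type are assumptions for illustration only.

import threading
import time

import requests

# Assumption: a Rasa NLU 0.13 server running on localhost:5000,
# started with max_training_processes: 1. Project names and the
# training payload are placeholders, not taken from the issue.
TRAIN_URL = "http://localhost:5000/train"
payload = open("demo-rasa.json", "rb").read()  # any valid NLU training payload


def train(project):
    r = requests.post(
        TRAIN_URL,
        params={"project": project},
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    print(project, r.status_code, r.text[:80])


# Kick off a (long-running) training for "my_project" in the background.
threading.Thread(target=train, args=("my_project",)).start()
time.sleep(1)  # give the first request time to reach the server

# While it is still running, a second request for the SAME project is
# rejected with {"error": "The server can't train more models right now!"}
train("my_project")

# ...but a request for a DIFFERENT project is accepted.
train("another_project")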

I really don't know if this is a bug or expected behaviour. If max_training_processes is scoped to a single project, the error message seems to indicate that you cannot train at all.

If it's a global parameter, there is a bug here. In that case, I've located the bug in the source code and can prepare a small pull request for it.
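
For reference, a minimal hypothetical sketch of the two possible interpretations (this is not the actual Rasa NLU code; names and structure are made up):

MAX_TRAINING_PROCESSES = 1

# Per-project interpretation: each project has its own budget of training slots.
trainings_per_project = {}   # project name -> number of trainings in progress

def can_train_per_project(project):
    return trainings_per_project.get(project, 0) < MAX_TRAINING_PROCESSES

# Global interpretation: one shared budget across all projects.
total_trainings = 0          # trainings in progress, regardless of project

def can_train_globally():
    return total_trainings < MAX_TRAINING_PROCESSES

# The per-project check matches the behaviour described above: a second
# training of the same project is rejected while a training for a new
# project is still accepted. With the global check, the second request
# would be rejected no matter which project it targets.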

Thanks in advance

@akelad (Contributor) commented Oct 4, 2018

Thanks for raising this issue; @ricwo will get back to you about it soon.

@wrathagom (Contributor)

@dcalvom FYI, since you were likely the last one to touch this code.

@ricwo (Contributor) commented Oct 5, 2018

@frascuchon Thanks for pointing out this issue; it is indeed a bug. I have just pushed a fix in #1449.

@akelad added the type:bug 🐛 label Oct 5, 2018
@frascuchon (Author)

Cool @ricwo! Do you know in which release this fix will be included?

@ricwo (Contributor) commented Oct 5, 2018

@frascuchon It'll be in the next release, so 0.13.7.

@tmbo closed this as completed in #1449 Oct 5, 2018
znat referenced this issue in botfront/rasa-for-botfront Oct 16, 2018
* 0-13-7: (40 commits)
  preparing next version #92
  set one value
  #1442 travis did not report status -> rebuild
  #1442 make max_training_processes apply globally
  removed livechat.html
  removed livechat
  added custom language example
  prepared next release
  update pushing tags command in readme
  #1437 remove rogue newlines
  #1437 use multiprocessing star method spawn
  changelog
  #1437 check py2 first
  #1437 run tf training in separate thread on py3
  update on language support
  language support
  updated docs and community links
  prepare next release implementing #1425
  annotation too long
  svm supports gamma parameter
  ...