
Remove multinomial sample from ONNX model output #5788

Closed
Houtamelo opened this issue Sep 20, 2022 · 0 comments
Assignees
Labels
request Issue contains a feature request.

Comments

@Houtamelo

On model api version 1, the outputs were the log probabilities of each action, and the ml-agents Unity package would apply a multinomial sample to produce the actions requested from the agent. A user who wanted to access the probabilities directly, instead of relying on the multinomial sample, could do so with moderate difficulty by editing how the ActionBuffers are constructed.

Since model api version 2, the multinomial sample is embedded directly in the ONNX model. A user who wants to access those probabilities would have to:

  • Edit the output of the ONNX model (by removing the multinomial sample at the end) using a tool such as https://github.com/ZhangGe6/onnx-modifier, or edit how the model is generated in the Python package itself.
  • With considerably more difficulty, edit (and bugfix) the source code so that the LegacyDiscreteActionOutputApplier (which was written for api version 1) works with the edited ONNX model; this requires much deeper knowledge of how Barracuda works and how the outputs are laid out in the tensors.

In which scenarios can accessing the probabilities be useful?

  • Evaluating how good the model is: with access to the probabilities, the user can place the agent in pre-defined scenarios and check whether the model predicts the correct action. This is especially useful for discrete actions.
  • Manually editing the probabilities after they have been generated, which gives more control over how the models are used at runtime.
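Once the raw log probabilities are exposed, the multinomial sample the removed node performed can be reproduced (and edited) on the application side. A minimal NumPy sketch, assuming a single discrete action branch; the function name and the action-mask idea are illustrative, not part of the ml-agents API:

```python
import numpy as np

def sample_discrete_action(log_probs, mask=None, rng=None):
    """Softmax the log probabilities, optionally zero out masked actions,
    and draw one action index multinomially."""
    rng = rng or np.random.default_rng()
    # Numerically stable softmax over the action branch.
    probs = np.exp(log_probs - log_probs.max())
    probs /= probs.sum()
    if mask is not None:
        probs = probs * mask   # edit probabilities: zero out disallowed actions
        probs /= probs.sum()   # renormalise so they still sum to 1
    return int(rng.choice(len(probs), p=probs))
```

The same probabilities can also be compared against the expected action (e.g. via `np.argmax`) when evaluating the model in pre-defined scenarios.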
@Houtamelo Houtamelo added the request Issue contains a feature request. label Sep 20, 2022
Projects
None yet
Development

No branches or pull requests

3 participants