
Tests and fixes for ParallelInference #4836

Merged 11 commits into master on Mar 27, 2018


AlexDBlack commented Mar 24, 2018

Fixes: #4813

Specifically, this adds/fixes:

  • Support for different size inputs (time series lengths, CNN sizes) when in batched mode
  • Support for 'pre-batched' inputs (i.e., if the user passes in a minibatch size > 1, they get the same minibatch size back out)
  • Exceptions that occur during net output are now propagated to ParallelInference.output() calls (previously: could block indefinitely)
  • Adds output method overloads that support input mask arrays
  • Fixes some issues (edge cases) with how inputs were batched together and split

Note: with regard to batching different-sized input arrays - currently these are simply processed separately.
This isn't ideal for performance, but it works robustly; given the limited time before the planned release, I'll leave it at that for now.
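One of the fixes above is that exceptions thrown during net output are now propagated to `ParallelInference.output()` callers instead of leaving them blocked. A minimal sketch of that pattern in plain Java (class and method names here are illustrative stand-ins, not DL4J's actual internals): the worker thread hands either a result or the exception to the waiting caller, which rethrows on its own thread.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Illustrative sketch: a worker posts either a result or an exception,
// so the client-side get() either returns or rethrows - it never hangs
// forever waiting for a result that will never arrive.
public class PropagatingObservable {
    // Exactly one item (result or exception) is ever placed on this queue.
    private final BlockingQueue<Object> outcome = new ArrayBlockingQueue<>(1);

    // Called on the worker thread on success.
    public void setResult(Object output) {
        outcome.offer(output);
    }

    // Called on the worker thread on failure.
    public void setException(Exception e) {
        outcome.offer(e);
    }

    // Called on the client thread; blocks until the worker responds,
    // then rethrows any worker-side failure instead of swallowing it.
    public Object get() throws Exception {
        Object o = outcome.take();
        if (o instanceof Exception) {
            throw (Exception) o;
        }
        return o;
    }
}
```

The key design point is that failure and success travel over the same channel, so the caller's blocking wait always terminates.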

@AlexDBlack AlexDBlack requested a review from raver119 Mar 26, 2018

raver119 left a comment:

Looks great.

protected InferenceObservable setInput(@NonNull Observer observer, INDArray... input) {

raver119 commented Mar 27, 2018:


I'd probably just keep this method for cases when there's no inputMask, and set null internally


public void setInput(INDArray... input) {

raver119 commented Mar 27, 2018:


Same as above.
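The suggestion in both comments is the same: keep the mask-free `setInput` overloads for callers that have no input mask, and have them delegate to the mask-aware version with a null mask internally, rather than duplicating logic. A minimal sketch of that delegation pattern (types here are simplified stand-ins for `INDArray` and the observable, not DL4J's actual signatures):

```java
// Illustrative sketch of overload delegation: the no-mask overload
// forwards to the mask-aware one with a null mask, so the batching
// logic lives in exactly one place.
public class InputHolder {
    private Object[] input;
    private Object[] inputMask;

    // Mask-aware version: stores both the inputs and their masks.
    public void setInput(Object[] input, Object[] inputMask) {
        this.input = input;
        this.inputMask = inputMask;
    }

    // Mask-free overload: delegates with a null mask internally.
    public void setInput(Object... input) {
        setInput(input, (Object[]) null);
    }

    public Object[] getInput() {
        return input;
    }

    public Object[] getInputMask() {
        return inputMask;
    }
}
```

This keeps the public API convenient for the common no-mask case while avoiding two diverging copies of the input-handling code.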

AlexDBlack added 2 commits Mar 27, 2018

AlexDBlack merged commit 858a629 into master on Mar 27, 2018

0 of 2 checks passed:

  • codeclimate: 6 issues to fix
  • continuous-integration/jenkins/pr-merge: this commit is being built

AlexDBlack deleted the ab_4813_parallel_inference branch on Mar 27, 2018
