gstkaldinnet2onlinedecoder vs online2-tcp-nnet3-decoder-faster #241
Hi,

I just experimented with online decoding using online2-tcp-nnet3-decoder-faster, which I was previously doing with kaldinnet2onlinedecoder (through kaldi-gstreamer-server). I saw about 3x faster decoding with online2-tcp-nnet3-decoder-faster. I went through the code of both decoders and realized that they work in much the same way. Can you please explain why the latter is faster? Is it a mistake on my side, or something else?

PS: parameters (like beam, lattice beam, and max-active) were kept identical for both decoders.

Best Regards
Umar
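For readers who want to reproduce such a comparison, here is a minimal sketch of launching the TCP decoder with the search parameters pinned explicitly, so both decoders run on equal footing. All model paths, the port, and the parameter values below are placeholders, not the poster's actual setup; check flag availability against the binary's own `--help` output.

```python
# Hypothetical launcher for online2-tcp-nnet3-decoder-faster that pins the
# search parameters the comparison holds constant. Model paths, port, and
# parameter values are placeholders, not the poster's actual configuration.
import subprocess

cmd = [
    "online2-tcp-nnet3-decoder-faster",
    # Options must precede the positional arguments in Kaldi binaries.
    "--samp-freq=16000",
    "--port-num=5050",               # placeholder TCP port
    "--beam=15.0",                   # keep equal to the GStreamer worker's beam
    "--lattice-beam=6.0",            # keep equal lattice beam
    "--max-active=7000",             # keep equal max-active
    "--frame-subsampling-factor=3",  # must match what the chain model was trained with
    "--acoustic-scale=1.0",          # the usual value for chain models
    "--config=conf/online.conf",     # placeholder feature/i-vector config
    "final.mdl",                     # placeholder acoustic model
    "HCLG.fst",                      # placeholder decoding graph
    "words.txt",                     # placeholder word symbol table
]
subprocess.run(cmd, check=True)      # blocks while the server runs
```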
Probably you are using chain models and are missing the attribute …
Yes, I am using a chain model, but here is my worker config:

    use-nnet2: True
    use-vad: False
    post-processor: perl -npe 'BEGIN {use IO::Handle; STDOUT->autoflush(1);} s/(.*)/\1./;'
    logging:

And the client command is this: …
I have tweaked frame-subsampling-factor and, oddly, it has no effect on latency.
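Worth noting: a chain model's frame-subsampling-factor is fixed at training time (commonly 3), and the decode-time flag must match it, so it is not a free latency knob. A minimal sketch of the arithmetic it controls, purely for illustration:

```python
# Illustrative arithmetic only: what frame-subsampling-factor controls.
# It scales the number of acoustic-model forward passes, not the rest of
# the pipeline (feature extraction, graph search, endpointing, I/O).

FRAME_SHIFT_MS = 10  # standard 10 ms frame shift in Kaldi feature extraction

def nnet_evals_per_audio_second(frame_subsampling_factor: int) -> float:
    """Acoustic-model evaluations per second of input audio."""
    frames_per_second = 1000 / FRAME_SHIFT_MS  # 100 frames per second
    return frames_per_second / frame_subsampling_factor

for factor in (1, 3):
    print(f"factor={factor}: {nnet_evals_per_audio_second(factor):.1f} evaluations/s")
# factor=1: 100.0 evaluations/s
# factor=3: ~33.3 evaluations/s
```

Because only the nnet3 forward passes scale with the factor, end-to-end latency that is dominated by graph search, chunking, or network buffering can stay flat even when the factor changes.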
Can you give some numbers -- the actual difference in decoding time that you are seeing? I assume you understand that …
Numbers (in milliseconds): …
Try changing to …
Tried, but no effect. However, the average over multiple experiments gives a difference of ~1 second in latency with …
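To make latency numbers like these comparable, one option is to timestamp the last audio byte sent and the arrival of the final transcript. A rough sketch against online2-tcp-nnet3-decoder-faster, which accepts raw 16-bit mono PCM on a TCP socket and writes hypotheses back as text lines; the host, port, WAV path, and the assumption that the server finalizes on a half-closed connection are all mine, not from the thread:

```python
# Hypothetical end-to-end latency probe. Assumes the server was started with
# --samp-freq matching the WAV file (16 kHz, 16-bit, mono here) and that it
# finalizes the utterance when the client half-closes the connection.
import socket
import time
import wave

HOST, PORT = "localhost", 5050  # placeholder host/port
WAV_PATH = "test.wav"           # placeholder test utterance

with wave.open(WAV_PATH, "rb") as w:
    audio = w.readframes(w.getnframes())

with socket.create_connection((HOST, PORT)) as sock:
    # Send 100 ms chunks (3200 bytes at 16 kHz, 16-bit mono).
    for i in range(0, len(audio), 3200):
        sock.sendall(audio[i:i + 3200])
    sock.shutdown(socket.SHUT_WR)   # signal end of audio
    sent_done = time.monotonic()

    # Drain everything the server writes back until it closes the socket.
    data = b""
    while chunk := sock.recv(4096):
        data += chunk
    latency_ms = (time.monotonic() - sent_done) * 1000

lines = data.decode("utf-8", errors="replace").strip().splitlines()
print("final transcript:", lines[-1] if lines else "<none>")
print(f"latency after last audio byte: {latency_ms:.0f} ms")
```

Running the same probe against both servers with the same utterance gives a like-for-like number, rather than averages taken under different client stacks.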