The beam search decoder for deployment in PR#139 uses a trie as the data structure for prefix search and finite-state transducers (FSTs) for spelling correction, which speed up decoding and lower the WER. With a larger, well-trained acoustic model (compared with the model in #115), the decoder parameters alpha and beta are retuned on the LibriSpeech development set, as shown in the figure below.
alpha: language model weight
beta: word insertion weight
WER: word error rate
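To illustrate the prefix-search side, here is a minimal trie sketch in Python: candidate prefixes that cannot extend to any vocabulary word are pruned early during beam search. The class and method names are hypothetical, not taken from PR#139.

```python
# Minimal trie for vocabulary-constrained prefix search
# (illustrative sketch; not the actual PR#139 implementation).
class TrieNode:
    def __init__(self):
        self.children = {}   # char -> TrieNode
        self.is_word = False

class Trie:
    def __init__(self, words):
        self.root = TrieNode()
        for w in words:
            node = self.root
            for ch in w:
                node = node.children.setdefault(ch, TrieNode())
            node.is_word = True

    def is_prefix(self, prefix):
        """True if at least one vocabulary word starts with `prefix`."""
        node = self.root
        for ch in prefix:
            if ch not in node.children:
                return False
            node = node.children[ch]
        return True

trie = Trie(["speech", "speed", "spell"])
print(trie.is_prefix("spee"))  # True: "speech" and "speed" extend it
print(trie.is_prefix("spo"))   # False: this beam can be pruned
```

During decoding, a beam whose character sequence fails `is_prefix` can be dropped immediately, which is what makes the trie-backed search faster than scoring every prefix against the full vocabulary.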
As usual, the WER is mainly affected by variation in alpha. The optimal parameter pair is (alpha, beta) = (2.15, 0.35), which yields a minimum WER of 7.87% on the LibriSpeech test set and reduces the WER by 0.8% compared to the prototype decoder in Python.
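The role of the two tuned parameters can be sketched with the standard combined scoring rule used in this style of beam search decoder (acoustic log-probability plus an alpha-weighted language-model term plus a beta-weighted word-insertion bonus). The function below is a simplified sketch with hypothetical inputs, not the exact scoring code from PR#139.

```python
import math

def combined_score(am_log_prob, lm_prob, word_count,
                   alpha=2.15, beta=0.35):
    """Beam-search rescoring sketch:
    acoustic log-prob + alpha * LM log-prob + beta * word count.
    alpha/beta default to the tuned values from the figure."""
    return am_log_prob + alpha * math.log(lm_prob) + beta * word_count

# Two hypothetical candidate transcriptions with 3 words each:
# candidate A has slightly worse acoustics but a much more
# probable word sequence under the language model.
score_a = combined_score(am_log_prob=-12.0, lm_prob=0.02, word_count=3)
score_b = combined_score(am_log_prob=-11.5, lm_prob=0.001, word_count=3)
print(score_a > score_b)  # True: the LM term dominates here
```

This also explains why the WER is most sensitive to alpha: it scales the language-model contribution, which varies strongly across candidate transcriptions, while beta only applies a small per-word offset.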
Hello, this issue has not been updated in the past month, so we will close it today. If you still need to follow up after it is closed, feel free to reopen it and we will reply within 24 hours. We apologize for any inconvenience caused by the closure, and thank you for your support of PaddlePaddle!