Hi,
Since the GPU nodes are under maintenance, I need to compile the code with the --use-cuda=no option. However, there were a few compile errors, mostly related to ExpA, so I changed those calls to exp.
By the way, when EESEN enters the training phase, the log shows entries like this:
train-ctc-parallel --report-step=1000 --num-sequence=10 --frame-limit=25000 --learn-rate=0.00004 --momentum=0.9 --verbose=1 'ark,s,cs:copy-feats scp:exp/train_phn_l4_c320/train_local.scp ark:- | add-deltas ark:- ark:- |' 'ark:gunzip -c exp/train_phn_l4_c320/labels.tr.gz|' exp/train_phn_l4_c320/nnet/nnet.iter0 exp/train_phn_l4_c320/nnet/nnet.iter1
copy-feats scp:exp/train_phn_l4_c320/train_local.scp ark:-
add-deltas ark:- ark:-
LOG (train-ctc-parallel:main():train-ctc-parallel.cc:112) TRAINING STARTED
VLOG1 After 1010 sequences (0.695456Hr): Obj(log[Pzx]) = -1e+30 TokenAcc = -36.2847%
VLOG1 After 2020 sequences (1.62006Hr): Obj(log[Pzx]) = -1e+30 TokenAcc = -30.1322%
VLOG1 After 3030 sequences (2.67991Hr): Obj(log[Pzx]) = -1e+30 TokenAcc = -25.7895%
VLOG1 After 4040 sequences (3.84653Hr): Obj(log[Pzx]) = -1e+30 TokenAcc = -23.9529%
VLOG1 After 5050 sequences (5.10758Hr): Obj(log[Pzx]) = -1e+30 TokenAcc = -23.4412%
VLOG1 After 6060 sequences (6.45321Hr): Obj(log[Pzx]) = -1e+30 TokenAcc = -22.1382%
VLOG1 After 7070 sequences (7.87719Hr): Obj(log[Pzx]) = -1e+30 TokenAcc = -20.9931%
VLOG1 After 8080 sequences (9.37634Hr): Obj(log[Pzx]) = -1e+30 TokenAcc = -20.5369%
VLOG1 After 9090 sequences (10.944Hr): Obj(log[Pzx]) = -1e+30 TokenAcc = -20.2233%
Is this normal? What does the negative accuracy mean?
Thank you.
Jinserk
We just checked in fixes to Eesen that should resolve the CPU compilation issue. Please update to the latest code.
Please stick with the original ExpA() function (rather than exp()), which handles some critical extreme cases.
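For intuition, here is a minimal sketch of a guarded exponential in the spirit of ExpA(). The function name and constant values below are illustrative assumptions, not Eesen's actual definitions (see the Eesen source for those). A plain exp() mishandles the log-zero sentinel used in log-space CTC, which is likely why Obj(log[Pzx]) is pinned at -1e+30 in your log:

    #include <cmath>

    // Illustrative constants -- the actual values live in the Eesen source.
    const float kLogZero  = -1e30f;   // sentinel representing log(0)
    const float kExpLimit = 88.7f;    // roughly log(FLT_MAX); exp overflows past this

    // Guarded exponential: maps the log-zero sentinel back to 0 and clamps
    // arguments that would overflow, instead of producing inf or NaN.
    inline float GuardedExp(float x) {
      if (x <= kLogZero)  return 0.0f;
      if (x >= kExpLimit) return 1e30f;
      return std::exp(x);
    }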
CPU-based CTC training is NOT available in Eesen. You have to rely on GPUs for training.
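As for the negative TokenAcc: assuming the token accuracy is reported Kaldi-style as 100 * (1 - (insertions + deletions + substitutions) / reference_tokens), it goes below zero whenever the total edit count exceeds the number of reference tokens, which is typical of a diverged model emitting long runs of spurious tokens. For example, 30 reference tokens against 41 edits gives 100 * (1 - 41/30) ≈ -36.7%.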
Thank you for the reply. I confirmed that the updated code compiles cleanly in CPU-only mode.
By the way, is there a reason the CTC cost function doesn't support CPU-only mode? Is it too slow?