Hi,
In the CNN feature extractor, are all words in a batch assumed to be the same length? If so, the shorter words must have been padded — won't that padding disturb the character-level features?
Yes, a batched implementation is used, so the character sequences within a batch are padded to the same length. A pooling operation follows the CNN, so the padded positions do not affect the final representation of the character sequence.
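To illustrate the idea, here is a minimal NumPy sketch (not the repo's actual code) of a char-level CNN followed by max pooling over time. In this sketch the windows that reach into the padding are explicitly masked out before pooling, which is one common way to guarantee the word feature depends only on real characters; implementations that simply embed the PAD symbol as zeros instead rely on the real-character activations dominating the max. The function name `char_cnn_features` and its signature are illustrative assumptions.

```python
import numpy as np

def char_cnn_features(char_ids, char_len, emb, kernel):
    """Sketch of a char-level CNN feature extractor with max pooling.

    char_ids: (T,) int array, right-padded to the batch's max word length
    char_len: number of real (non-pad) characters in the word
    emb:      (V, D) character embedding table
    kernel:   (K, D, F) filters convolved over windows of K characters
    Returns an (F,) feature vector for the whole word.
    """
    K, D, F = kernel.shape
    x = emb[char_ids]                                  # (T, D)
    T = x.shape[0]
    # Convolve every window of K consecutive characters.
    conv = np.stack([np.einsum('kd,kdf->f', x[t:t + K], kernel)
                     for t in range(T - K + 1)])       # (T-K+1, F)
    # Keep only windows made of real characters, then max-pool over
    # time, so the word-level feature ignores the padded positions.
    valid = max(char_len - K + 1, 1)
    return conv[:valid].max(axis=0)                    # (F,)

# Padding the same word to different lengths yields identical features.
rng = np.random.default_rng(0)
emb = rng.normal(size=(10, 4))          # V=10 chars, D=4 dims
kernel = rng.normal(size=(2, 4, 3))     # K=2 window, F=3 filters
word = [3, 5, 7]                        # three real character ids
f_short = char_cnn_features(np.array(word + [0] * 3), 3, emb, kernel)
f_long = char_cnn_features(np.array(word + [0] * 6), 3, emb, kernel)
assert np.allclose(f_short, f_long)
```

Because the max is taken over the time (character-position) axis, extending the padding only adds windows that are excluded by the mask, so the pooled vector is unchanged.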