The standard transformer block requires all input sequences in a batch to have the same length.
If the examples have unequal lengths, they must be padded to a common length before batching.
JaggedTensor was introduced precisely to represent variable-length examples. Could we support a transformer block that works efficiently on JaggedTensor or KeyedJaggedTensor input directly, instead of first converting it to a dense, padded form?
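For context, a jagged batch can be stored compactly as one flat values buffer plus per-example lengths (the layout JaggedTensor uses), while the standard transformer path forces every example up to the batch-max length. A minimal pure-Python sketch of the conversion this issue would like to avoid (the helper names here are illustrative, not TorchRec API):

```python
# Illustrative sketch only: compact jagged layout vs. padded dense layout.
# The helper names (lengths_to_offsets, jagged_to_padded) are hypothetical,
# not part of the TorchRec JaggedTensor API.

def lengths_to_offsets(lengths):
    """Cumulative offsets into the flat values buffer."""
    offsets = [0]
    for n in lengths:
        offsets.append(offsets[-1] + n)
    return offsets

def jagged_to_padded(values, lengths, pad=0.0):
    """Expand (values, lengths) into a padded 2-D list,
    padding each example to the batch-max length."""
    offsets = lengths_to_offsets(lengths)
    max_len = max(lengths)
    return [
        values[offsets[i]:offsets[i + 1]] + [pad] * (max_len - lengths[i])
        for i in range(len(lengths))
    ]

# Three examples of lengths 3, 1, 2 stored in one flat buffer of 6 values.
values = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
lengths = [3, 1, 2]

padded = jagged_to_padded(values, lengths)
# padded == [[1.0, 2.0, 3.0], [4.0, 0.0, 0.0], [5.0, 6.0, 0.0]]
```

The jagged form stores 6 values; the padded form stores 9, and the overhead grows with the spread of sequence lengths in the batch — which is why a transformer block that consumes the jagged layout directly would be attractive.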