Infinite-length inference is currently supported by stepping through the sequence one token at a time; however, I do not see any support for training this way. The parallel scan applied to each batch does not save the final hidden state and pass it to the following batch, so are there any plans to support training Mamba on context lengths longer than 4096 (or whatever the memory-imposed length cap is)?
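To make the request concrete, here is a minimal NumPy sketch of the missing piece: running the scan chunk by chunk while carrying the final hidden state into the next chunk, which reproduces the full-sequence scan exactly. This uses a plain diagonal linear recurrence `h_t = A * h_{t-1} + x_t` as a hypothetical stand-in for Mamba's selective SSM; the function names and shapes are illustrative, not the library's actual API.

```python
import numpy as np

def scan(x, A, h0):
    """Sequential linear recurrence h_t = A * h_{t-1} + x_t (diagonal A).

    Returns all per-step states and the final state, so the caller
    can thread the final state into the next chunk.
    """
    h = h0.copy()
    states = []
    for t in range(x.shape[0]):
        h = A * h + x[t]
        states.append(h.copy())
    return np.stack(states), h

rng = np.random.default_rng(0)
T, D = 8, 4                        # sequence length, state dimension
x = rng.standard_normal((T, D))
A = rng.uniform(0.5, 0.9, D)       # stable decay per channel
h0 = np.zeros(D)

# Reference: scan over the whole sequence at once.
full, _ = scan(x, A, h0)

# Chunked: scan each chunk, carrying the final hidden state forward.
chunk = 4
h = h0
outs = []
for start in range(0, T, chunk):
    ys, h = scan(x[start:start + chunk], A, h)
    outs.append(ys)
chunked = np.concatenate(outs)

# With state carryover, chunked and full-sequence scans agree exactly.
assert np.allclose(full, chunked)
```

If the parallel scan kernel accepted an initial state and returned the final one (as the sequential loop above does), training on sequences far longer than the per-step memory cap would reduce to this kind of chunked loop, optionally with gradients truncated at chunk boundaries.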