Any plans / workarounds for ORT GenAI inference support for the text component (decoder-only) with an external inputs_embeds input?
(e.g. enabling models exported with the exclude_embed flag to run)
We can make input_ids optional and probably add this feature; it does seem useful. No ETA yet, though. I'll update this issue with more details once I can share them.
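Until that lands, one possible workaround is to bypass the GenAI pipeline and drive the exported decoder directly with plain onnxruntime, feeding the embeddings yourself. The sketch below is a minimal, unofficial example under stated assumptions: the model path, the input/output names (inputs_embeds, attention_mask, logits), and the hidden size are all placeholders, not confirmed parts of any export. Inspect the model with get_inputs() / get_outputs() first. Note also that models exported without the embedding layer usually expose past_key_values inputs as well, which this prompt-only sketch omits and which a full decode loop would have to feed and thread through each step.

```python
# Minimal sketch: run a decoder-only ONNX model with precomputed embeddings
# via plain onnxruntime, skipping ORT GenAI entirely. All names and shapes
# below are assumptions -- verify them against your exported model.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("decoder_model.onnx")  # hypothetical path

# Check the model's actual input/output names before wiring anything up.
print([i.name for i in session.get_inputs()])
print([o.name for o in session.get_outputs()])

# Assumed dimensions; hidden size depends on the model you exported.
batch, seq_len, hidden = 1, 8, 2048
inputs_embeds = np.random.rand(batch, seq_len, hidden).astype(np.float32)
attention_mask = np.ones((batch, seq_len), dtype=np.int64)

# Single forward pass over the prompt embeddings. If the export also takes
# past_key_values inputs, those must be added to this feed as well.
outputs = session.run(
    ["logits"],  # assumed output name
    {"inputs_embeds": inputs_embeds, "attention_mask": attention_mask},
)
next_token = int(outputs[0][0, -1].argmax())  # greedy pick from last position
print(next_token)
```

From here, generation would be a manual loop: embed the new token externally, append it to inputs_embeds (or feed it alone alongside the returned KV cache), and rerun the session, which is exactly the bookkeeping the requested GenAI feature would handle for you.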