Hi community, I just got familiar with LLMs recently, so sorry if my question doesn't make sense.
Since my work requires me to process data locally, is there a way to use replit_lm_tokenizer locally instead of having to set trust_remote_code = True as described in the README? Thanks a lot.
AFAIK this option is required to run the model locally: you are explicitly deciding to trust the code contained in replit_lm_tokenizer. "Remote" is used in the sense that the code wasn't written by you but is part of the replit_lm_tokenizer you downloaded from the remote repository.
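A minimal sketch of what that looks like in practice, assuming the Replit model repository has already been downloaded to a local directory (the path `./replit-code-v1-3b` is just an example): the tokenizer code runs entirely from that local folder, but `trust_remote_code=True` is still needed because the tokenizer class is custom code shipped with the repo rather than a class built into transformers.

```python
from transformers import AutoTokenizer

# Hypothetical local checkout of the model repo, e.g. obtained via
# `git clone` or huggingface_hub.snapshot_download before going offline.
local_path = "./replit-code-v1-3b"

# trust_remote_code=True is required because the tokenizer is implemented
# by custom code (replit_lm_tokenizer.py) inside the repo, not by a built-in
# transformers class. "Remote" refers to the code's origin, not where it
# executes: everything here runs locally from local_path.
tokenizer = AutoTokenizer.from_pretrained(local_path, trust_remote_code=True)

print(tokenizer("def hello():"))
```

Once the repo is on disk, no network access is needed at load time, so data never has to leave the local machine.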