
Using replit_lm_tokenizer locally #30

Closed
Chorleon opened this issue Aug 2, 2023 · 1 comment

Chorleon commented Aug 2, 2023

Hi community, I only got familiar with LLMs recently, so sorry if my question doesn't make sense.

Since my work requires processing data locally, is there a way to use replit_lm_tokenizer locally instead of having to set trust_remote_code=True as described in the README? Thanks a lot.


gjmulder commented Aug 2, 2023

AFAIK this option is required to run the model locally, as you are explicitly deciding to trust the code contained in replit_lm_tokenizer. "remote" is being used in the sense that the code wasn't written by you but is part of the replit_lm_tokenizer you downloaded from the remote repository. In other words, the flag is still needed even when the files sit on your own disk, because the custom tokenizer class is defined by code shipped inside the checkpoint rather than by the transformers library itself.
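For fully offline use, a common pattern is to download the checkpoint once (e.g. `git clone` of the Hub repo) and then point `AutoTokenizer.from_pretrained` at the local directory, still passing `trust_remote_code=True`. A minimal sketch, assuming the `transformers` library is installed and the checkpoint has been downloaded locally (the helper name and local path below are hypothetical):

```python
def load_replit_tokenizer(model_path: str):
    """Load the Replit tokenizer from a local checkout or the Hub.

    trust_remote_code=True is still required even for a local path,
    because the custom tokenizer class is defined by code shipped
    inside the checkpoint itself, not by the transformers library.
    """
    from transformers import AutoTokenizer  # deferred import: needs `transformers` installed

    return AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)
```

Usage would look like `tok = load_replit_tokenizer("./replit-code-v1-3b")` after cloning the model repository, so no network access is needed at load time.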

Chorleon closed this as completed Aug 2, 2023