
Tokenizer class implementation issue #488

Open
liwb1219 opened this issue Apr 18, 2024 · 2 comments
Labels
bug Something isn't working

Comments

@liwb1219

On line 169, `if i + j > tokens_len:`
fails to scan the final n-gram of the sentence.
Should it be changed to `if j > tokens_len:`?
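For context, this kind of guard usually sits inside an n-gram sliding-window scan like the sketch below. This is a hypothetical reconstruction, not the actual code from `tokenizer.py` (the issue does not quote it): the `ngrams` function and the reading of `j` as a window length starting at `i` are assumptions. Under that reading, `i + j > tokens_len` already admits the window ending exactly at `tokens_len`; the reporter's fix would only apply if `j` were instead an absolute end index.

```python
def ngrams(tokens, n=2):
    """Collect all n-grams of length 1..n via a sliding window.

    Hypothetical sketch of the scan discussed above; here `j` is the
    window length measured from position `i` (an assumption -- the
    correct bound check depends on how `j` is defined in the real file).
    """
    tokens_len = len(tokens)
    result = []
    for i in range(tokens_len):
        for j in range(1, n + 1):
            # The guard in question: with `i + j > tokens_len`, a window
            # ending exactly at tokens_len still passes, so the final
            # n-gram is included when `j` is a length offset.
            if i + j > tokens_len:
                continue
            result.append(tokens[i:i + j])
    return result
```

For example, `ngrams(list("abc"), 2)` yields `[['a'], ['a', 'b'], ['b'], ['b', 'c'], ['c']]`, including the trailing bigram `['b', 'c']`.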

@liwb1219 added the bug label on Apr 18, 2024
@shibing624
Owner

Which file?

@liwb1219
Author

pycorrector/pycorrector/utils/tokenizer.py
