
fix bug in tokenize_dataset_rows.py and infer.ipynb #125

Open · wants to merge 1 commit into master
Conversation

@zwkkk commented Apr 3, 2023

In the recently updated version:
1. tokenize_dataset_rows.py has a `[150004] not in list` bug (a minimal repro sketch follows).
2. infer.ipynb has a mask bug.
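A minimal sketch of the first failure mode, assuming the downstream code locates the prompt/response boundary with `list.index` on the `<sop>` token; the id 150004 is from the original THUDM/chatglm-6b checkpoint and the repro input is hypothetical:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)

# encode() with add_special_tokens=False appends neither [gMASK] (150001)
# nor <sop> (150004), so a later boundary lookup on 150004 fails:
ids = tokenizer.encode("你好", add_special_tokens=False)
assert 150004 not in ids
ids.index(150004)  # raises ValueError: 150004 is not in list
```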

@mymusise (Owner) commented Apr 7, 2023

Thanks for the PR~

I'll test it later; if there are no problems I'll merge it.

  add_special_tokens=False)
- input_ids = prompt_ids + target_ids + [config.eos_token_id]
+ input_ids = prompt_ids + [150001, 150004] + target_ids + [150005]
@mymusise (Owner) commented:

Isn't this change equivalent to the original?

tokenizer.encode(..., add_special_tokens=True) == tokenizer.encode(..., add_special_tokens=False) + [150001, 150004]
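A quick check of the claimed equivalence, with the ids again taken from the original THUDM/chatglm-6b checkpoint (they have since changed upstream, as noted below):

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)

text = "你好"
# If the claim holds, add_special_tokens=True only appends [gMASK] (150001)
# and <sop> (150004) after the plain encoding:
assert tokenizer.encode(text, add_special_tokens=True) == \
    tokenizer.encode(text, add_special_tokens=False) + [150001, 150004]
```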

@mymusise (Owner) commented:

Hard-coding magic numbers might not be great, though; I see the official repo has updated the token ids again 🤣
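One way to avoid the magic numbers is to read the ids off the tokenizer itself; a sketch assuming the standard Hugging Face surface (bos_token_id, eos_token_id, convert_tokens_to_ids), which the ChatGLM tokenizer exposes:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)

# Values shown are for the original checkpoint and may change upstream:
gmask_id = tokenizer.convert_tokens_to_ids("[gMASK]")  # 150001 here
bos_id = tokenizer.bos_token_id                        # <sop>, 150004 here
eos_id = tokenizer.eos_token_id                        # <eop>, 150005 here

prompt_ids = tokenizer.encode("你好", add_special_tokens=False)
target_ids = tokenizer.encode("你好!", add_special_tokens=False)
input_ids = prompt_ids + [gmask_id, bos_id] + target_ids + [eos_id]
```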

@mymusise (Owner) commented Apr 7, 2023:

I looked at the official preprocessing code; it seems better to use the build_inputs_with_special_tokens method it provides:

# prompt / target are the raw strings (the argument names were elided in the original comment):
prompt_ids = tokenizer.encode(prompt, add_special_tokens=False)
target_ids = tokenizer.encode(target, add_special_tokens=False)
input_ids = tokenizer.build_inputs_with_special_tokens(prompt_ids, target_ids)
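An end-to-end sketch of that approach (the prompt/target strings are placeholders):

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)

prompt_ids = tokenizer.encode("问:你好", add_special_tokens=False)
target_ids = tokenizer.encode("答:你好!", add_special_tokens=False)

# The checkpoint's own tokenizer inserts its special tokens ([gMASK], <sop>,
# <eop> for this model), so no ids are hard-coded:
input_ids = tokenizer.build_inputs_with_special_tokens(prompt_ids, target_ids)
```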
