Error when loading via RoFormerConfig #6
Comments
Renaming the layers in the ckpt fixes it. The length was fixed at pretraining time, though; not having read the paper, I had assumed it could be changed dynamically inside the model.
Actually no, I tried roformer_chinese_base and it does seem to support loading via RoFormerConfig and changing the length. Could it be that for roformer_chinese_char_base an extra "max_position_embeddings": 512 was written into the config when you converted it to torch?
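If that is the cause, one way to check is simply to drop the suspected key from the config. A minimal sketch (the config contents below are illustrative stand-ins, not the real config.json of roformer_chinese_char_base):

```python
import json

# Stand-in for the contents of config.json (illustrative values only).
config = {"vocab_size": 12000, "max_position_embeddings": 512}

# Remove the suspected extra key so the model no longer pins the length.
config.pop("max_position_embeddings", None)
patched = json.dumps(config, indent=2)
print(patched)
```

The patched JSON can then be written back over config.json before loading.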
You can locally delete embed_positions.weight from pytorch_model.bin yourself:

```python
import torch
from collections import OrderedDict

s = OrderedDict()
state_dict = torch.load("pytorch_model.bin")
for k, v in state_dict.items():
    if "embed_positions" in k:
        continue  # drop the fixed position-embedding weights
    s[k] = v
torch.save(s, "new_pytorch_model.bin", _use_new_zipfile_serialization=False)
```

The resulting new_pytorch_model.bin no longer contains embed_positions.weight, so you can load it with a modified config:

```python
from roformer.modeling_roformer import RoFormerModel, RoFormerConfig

myconfig = RoFormerConfig.from_pretrained('./config.json')
myconfig.max_position_embeddings = 2000
model = RoFormerModel.from_pretrained("./", config=myconfig)
```

Or load the model this way, overriding max_position_embeddings directly:

```python
from roformer.modeling_roformer import RoFormerModel

model = RoFormerModel.from_pretrained("./", max_position_embeddings=2000)
```
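The filtering step above does not depend on torch itself; a minimal sketch of the same logic on a plain dict (placeholder keys and values, no real checkpoint needed) shows what the loop keeps and drops:

```python
from collections import OrderedDict

def drop_positions(state_dict):
    """Return a copy of state_dict without any 'embed_positions' entries."""
    out = OrderedDict()
    for k, v in state_dict.items():
        if "embed_positions" in k:
            continue  # skip the fixed position-embedding weights
        out[k] = v
    return out

# Illustrative stand-in for a loaded state_dict.
fake = OrderedDict([
    ("embeddings.word_embeddings.weight", [0.1]),
    ("encoder.embed_positions.weight", [0.2]),
    ("encoder.layer.0.attention.self.query.weight", [0.3]),
])
new = drop_positions(fake)
print(sorted(new))  # the embed_positions key is gone
```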
Thanks for the explanation!
Thank you very much for open-sourcing this project. When using roformer_chinese_char_base I wanted to enlarge max_position_embeddings, so I tried loading the weights via RoFormerConfig and got this error:

Missing key(s) in state_dict: "embeddings.word_embeddings.weight", "embeddings.token_type_embeddings.weight", "embeddings.LayerNorm.weight", "embeddings.LayerNorm.bias", "encoder.embed_positions.weight", "encoder.layer.0.attention.self.query.weight", "encoder.layer.0.attention.self.query.bias", "encoder.layer.0.attention.self.key.weight", "encoder.layer.0.attention.self.key.bias", "encoder.layer.0.attention.self.value.weight", "encoder.layer.0.attention.self.value.bias", "encoder.layer.0.attention.output.dense.weight", "encoder.layer.0.attention.output.dense.bias", "encoder.layer.0.attention.output.LayerNorm.weight", "encoder.layer.0.attention.output.LayerNorm.bias", "encoder.layer.0.intermediate.dense.weight", "encoder.layer.0.intermediate.dense.bias", "encoder.layer.0.output.dense.weight", "encoder.layer.0.output.dense.bias"

Looking at the layers of RoFormerModel, none of them seem to carry a "roformer" prefix. Do I need to rename the layers inside RoFormerModel?
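As an illustration of the renaming idea raised here (the prefix and keys below are hypothetical, not the project's actual fix), mismatched checkpoint keys can be remapped before loading:

```python
from collections import OrderedDict

def add_prefix(state_dict, prefix="roformer."):
    """Prepend prefix to every key that does not already carry it."""
    return OrderedDict(
        (k if k.startswith(prefix) else prefix + k, v)
        for k, v in state_dict.items()
    )

# Illustrative stand-in for a checkpoint whose keys lack the prefix.
ckpt = OrderedDict([("embeddings.word_embeddings.weight", [0.0])])
renamed = add_prefix(ckpt)
print(list(renamed))
```

The remapped dict can then be passed to the model's load_state_dict in place of the original.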