Addressing Hugging Face-Related Errors During the Fine-Tuning of GLIP #10968
Comments
@CDchenlin Hello, there is a very simple solution. You just need to download the corresponding weights to your local computer, upload them to the server, and finally point the config at the local path:

from transformers import BertConfig, BertModel
from transformers import AutoTokenizer

config = BertConfig.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased", add_pooling_layer=False, config=config)
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

config.save_pretrained("your path/bert-base-uncased")
model.save_pretrained("your path/bert-base-uncased")
tokenizer.save_pretrained("your path/bert-base-uncased")

Then, in the GLIP config, replace the model name with the local path:

# lang_model_name = 'bert-base-uncased'
lang_model_name = 'your path/bert-base-uncased'
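As a quick sanity check (a sketch only; "your path" is a placeholder for whatever directory you saved to), you can confirm the local copy loads without any network access before starting training:

from transformers import AutoTokenizer, BertModel

# Load from the local directory only; this should succeed even with no
# internet connection if the files above were saved correctly.
local_model = BertModel.from_pretrained("your path/bert-base-uncased", add_pooling_layer=False)
local_tokenizer = AutoTokenizer.from_pretrained("your path/bert-base-uncased")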
@CDchenlin If the pretrained weights cannot be loaded, you can simply download them to your local machine, upload them to the server, and finally modify the config to point to the local path.
@hhaAndroid Hello, thank you so much for your assistance. Could you possibly provide more specific advice? I attempted the following modifications to the code here, but they were unsuccessful; the problem remains. Thank you so much for your advice!
@CDchenlin In fact, my server is also unable to connect to the internet, so I used the method mentioned above and modified the configuration here to a local path.
Thank you so much, the first problem is solved; however, a second problem remains:
@CDchenlin This error is usually caused by a failure to read the data. In most cases the metainfo is written incorrectly, so the JSON annotations are read but the dataset ends up empty because the category fields are misconfigured. Did you use custom data?
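For reference, a minimal sketch of how metainfo is typically declared for a custom COCO-format dataset in an MMDetection 3.x config. The class names and paths below are placeholders; the classes must match the "categories" entries in your annotation JSON:

# Hypothetical class names; they must match the categories in the JSON.
metainfo = dict(classes=('class_a', 'class_b'))

train_dataloader = dict(
    dataset=dict(
        type='CocoDataset',
        metainfo=metainfo,
        data_root='data/my_dataset/',        # placeholder path
        ann_file='annotations/train.json',   # placeholder path
        data_prefix=dict(img='images/')))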
Yes, I used custom data. To be honest, I am new to MMDetection. I have read and followed the documentation at this link (Documentation) to prepare my dataset and config file. I would greatly appreciate any additional information regarding this issue, as I found resources related to mmdet>3.0 scarce.
I believe the category information has already been defined in the JSON file, so I omitted it from the config.
@hhaAndroid Thank you for your invaluable advice. I have successfully organized my dataset and started training with the RTMDet model. However, upon switching to the GLIP model with its configuration file, I encountered an error.
I have attempted multiple potential solutions, including switching to the official cat dataset referenced in this notebook. Despite these efforts, the error persists. I would be immensely grateful for any advice or suggestions regarding this issue. Thank you very much for your assistance, and I apologize if my repeated inquiries have caused any inconvenience.
@CDchenlin Your pipeline is missing the text key:

dict(
    type='PackDetInputs',
    meta_keys=('img_id', 'img_path', 'ori_shape', 'img_shape',
               'scale_factor', 'flip', 'flip_direction', 'text',
               'custom_entities'))
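For context, a rough sketch of where this transform sits in a GLIP fine-tuning pipeline; the earlier transforms are only indicative and should follow whatever your base config already uses:

train_pipeline = [
    dict(type='LoadImageFromFile'),
    dict(type='LoadAnnotations', with_bbox=True),
    dict(type='Resize', scale=(1333, 800), keep_ratio=True),
    dict(type='RandomFlip', prob=0.5),
    # The final transform must pack 'text' and 'custom_entities' into the data
    # sample so that the GLIP head can build its language-grounded targets.
    dict(type='PackDetInputs',
         meta_keys=('img_id', 'img_path', 'ori_shape', 'img_shape',
                    'scale_factor', 'flip', 'flip_direction', 'text',
                    'custom_entities'))
]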
Hello,
While fine-tuning the GLIP model on my custom dataset, I encountered the following issue:
I understand that this is a connection error related to Hugging Face. However, since I don’t have administrator privileges on my server, I’m wondering if there are any alternative solutions to this problem. For instance, would it be possible for me to download the weights using my PC and then transfer them to the server?
Despite the fact that I was unable to download the pretrained model, the train.py script continued to execute, resulting in the following error:

I’m curious as to why this occurred. Could it be related to the unsuccessful download of the pretrained model, or could it be an issue with my custom dataset?
Thank you for your assistance.