RuntimeError: Error(s) in loading state_dict for CLAP: Unexpected key(s) in state_dict: "text_branch.embeddings.position_ids" #127
Comments
I get the same error with this code.
versions:
error:
Ah, I see it's related to this: #118
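Since the stale position_ids buffer simply no longer exists in models built with newer transformers releases, another workaround is to filter the key out of the loaded state_dict in memory (or call load_state_dict with strict=False) instead of patching the file on disk. A minimal sketch; the helper name is mine and not part of laion-clap, and the "module." prefix may or may not be present depending on how the checkpoint was saved:

```python
# Hypothetical helper: drop the stale position_ids buffer from a state_dict
# before passing it to load_state_dict. Matching by suffix handles both
# "text_branch.embeddings.position_ids" and the "module."-prefixed variant.
OFFENDING_SUFFIX = "text_branch.embeddings.position_ids"

def strip_position_ids(state_dict):
    """Return a copy of the state_dict without the stale position_ids buffer."""
    return {k: v for k, v in state_dict.items() if not k.endswith(OFFENDING_SUFFIX)}
```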
Here is a quick and dirty script to remove the offending key from the checkpoint:

import argparse

import torch

OFFENDING_KEY = "module.text_branch.embeddings.position_ids"

def main(args):
    # Load the checkpoint from the given path
    checkpoint = torch.load(args.input_checkpoint, map_location="cpu")

    # Extract the state_dict from the checkpoint
    if isinstance(checkpoint, dict) and "state_dict" in checkpoint:
        state_dict = checkpoint["state_dict"]
    else:
        state_dict = checkpoint

    # Delete the specific key from the state_dict
    if OFFENDING_KEY in state_dict:
        del state_dict[OFFENDING_KEY]

    # Save the modified state_dict back to the checkpoint
    if isinstance(checkpoint, dict) and "state_dict" in checkpoint:
        checkpoint["state_dict"] = state_dict

    # Create the output checkpoint filename by replacing the ".pt" suffix with ".patched.pt"
    output_checkpoint_path = args.input_checkpoint.replace(".pt", ".patched.pt")

    # Save the modified checkpoint
    torch.save(checkpoint, output_checkpoint_path)
    print(f"Saved patched checkpoint to {output_checkpoint_path}")

if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Patch a PyTorch checkpoint by removing a specific key.")
    parser.add_argument("input_checkpoint", type=str, help="Path to the input PyTorch checkpoint (.pt) file.")
    try:
        import argcomplete
        argcomplete.autocomplete(parser)
    except ImportError:
        pass
    args = parser.parse_args()
    main(args)
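One caveat on the script above: str.replace swaps the first ".pt" anywhere in the path, so a path like "run.pt_v2/model.pt" gets rewritten in the wrong place. A safer sketch (the helper name is mine) rewrites only the final extension:

```python
import os

def patched_path(input_path):
    # Split off the final extension so only the trailing ".pt" is rewritten,
    # leaving any earlier ".pt" substrings in the path untouched.
    root, ext = os.path.splitext(input_path)
    return f"{root}.patched{ext}"
```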
Then, use the patched checkpoint in place of the original.
This problem is fixed! We pushed laion-clap 1.1.6 to PyPI to fix some bugs: https://pypi.org/project/laion-clap/
I was running the following code in Colab and got the error at model.load_ckpt().
error