TypeError: 'NoneType' object is not subscriptable #75

Open
annabechang opened this issue Dec 8, 2022 · 8 comments
Labels: bug (Something isn't working)

Comments

@annabechang

I am having this error while trying to load the model.

from detoxify import Detoxify

model = Detoxify('original', device="cuda")


---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
Cell In [15], line 3
      1 from detoxify import Detoxify
----> 3 results = Detoxify('original').predict('some text')

File ~/.conda/envs/py/lib/python3.9/site-packages/detoxify/detoxify.py:103, in Detoxify.__init__(self, model_type, checkpoint, device, huggingface_config_path)
    101 def __init__(self, model_type="original", checkpoint=PRETRAINED_MODEL, device="cpu", huggingface_config_path=None):
    102     super().__init__()
--> 103     self.model, self.tokenizer, self.class_names = load_checkpoint(
    104         model_type=model_type,
    105         checkpoint=checkpoint,
    106         device=device,
    107         huggingface_config_path=huggingface_config_path,
    108     )
    109     self.device = device
    110     self.model.to(self.device)

File ~/.conda/envs/py/lib/python3.9/site-packages/detoxify/detoxify.py:56, in load_checkpoint(model_type, checkpoint, device, huggingface_config_path)
     50 change_names = {
     51     "toxic": "toxicity",
     52     "identity_hate": "identity_attack",
     53     "severe_toxic": "severe_toxicity",
     54 }
     55 class_names = [change_names.get(cl, cl) for cl in class_names]
---> 56 model, tokenizer = get_model_and_tokenizer(
     57     **loaded["config"]["arch"]["args"],
     58     state_dict=loaded["state_dict"],
     59     huggingface_config_path=huggingface_config_path,
     60 )
     62 return model, tokenizer, class_names

File ~/.conda/envs/py/lib/python3.9/site-packages/detoxify/detoxify.py:20, in get_model_and_tokenizer(model_type, model_name, tokenizer_name, num_classes, state_dict, huggingface_config_path)
     16 def get_model_and_tokenizer(
     17     model_type, model_name, tokenizer_name, num_classes, state_dict, huggingface_config_path=None
     18 ):
     19     model_class = getattr(transformers, model_name)
---> 20     model = model_class.from_pretrained(
     21         pretrained_model_name_or_path=None,
     22         config=huggingface_config_path or model_type,
     23         num_labels=num_classes,
     24         state_dict=state_dict,
     25         local_files_only=huggingface_config_path is not None,
     26     )
     27     tokenizer = getattr(transformers, tokenizer_name).from_pretrained(
     28         huggingface_config_path or model_type,
     29         local_files_only=huggingface_config_path is not None,
     30         # TODO: may be needed to let it work with Kaggle competition
     31         # model_max_length=512,
     32     )
     34     return model, tokenizer

File ~/.conda/envs/py/lib/python3.9/site-packages/transformers/modeling_utils.py:2379, in PreTrainedModel.from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
   2369     if dtype_orig is not None:
   2370         torch.set_default_dtype(dtype_orig)
   2372     (
   2373         model,
   2374         missing_keys,
   2375         unexpected_keys,
   2376         mismatched_keys,
   2377         offload_index,
   2378         error_msgs,
-> 2379     ) = cls._load_pretrained_model(
   2380         model,
   2381         state_dict,
   2382         loaded_state_dict_keys,  # XXX: rename?
   2383         resolved_archive_file,
   2384         pretrained_model_name_or_path,
   2385         ignore_mismatched_sizes=ignore_mismatched_sizes,
   2386         sharded_metadata=sharded_metadata,
   2387         _fast_init=_fast_init,
   2388         low_cpu_mem_usage=low_cpu_mem_usage,
   2389         device_map=device_map,
   2390         offload_folder=offload_folder,
   2391         offload_state_dict=offload_state_dict,
   2392         dtype=torch_dtype,
   2393         load_in_8bit=load_in_8bit,
   2394     )
   2396 model.is_loaded_in_8bit = load_in_8bit
   2398 # make sure token embedding weights are still tied if needed

File ~/.conda/envs/py/lib/python3.9/site-packages/transformers/modeling_utils.py:2572, in PreTrainedModel._load_pretrained_model(cls, model, state_dict, loaded_keys, resolved_archive_file, pretrained_model_name_or_path, ignore_mismatched_sizes, sharded_metadata, _fast_init, low_cpu_mem_usage, device_map, offload_folder, offload_state_dict, dtype, load_in_8bit)
   2569                 del state_dict[checkpoint_key]
   2570     return mismatched_keys
-> 2572 folder = os.path.sep.join(resolved_archive_file[0].split(os.path.sep)[:-1])
   2573 if device_map is not None and is_safetensors:
   2574     param_device_map = expand_device_map(device_map, original_loaded_keys)

TypeError: 'NoneType' object is not subscriptable

pip install information:

Collecting detoxify
  Downloading detoxify-0.5.0-py3-none-any.whl (12 kB)
Collecting transformers!=4.18.0
  Downloading transformers-4.25.1-py3-none-any.whl (5.8 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 5.8/5.8 MB 75.2 MB/s eta 0:00:00
Collecting torch>=1.7.0
  Downloading torch-1.13.0-cp39-cp39-manylinux1_x86_64.whl (890.2 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 890.2/890.2 MB 3.6 MB/s eta 0:00:00
Collecting sentencepiece>=0.1.94
  Downloading sentencepiece-0.1.97-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.3 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.3/1.3 MB 107.1 MB/s eta 0:00:00
Collecting typing-extensions
  Downloading typing_extensions-4.4.0-py3-none-any.whl (26 kB)
Collecting nvidia-cuda-nvrtc-cu11==11.7.99
  Downloading nvidia_cuda_nvrtc_cu11-11.7.99-2-py3-none-manylinux1_x86_64.whl (21.0 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 21.0/21.0 MB 77.9 MB/s eta 0:00:00
Collecting nvidia-cublas-cu11==11.10.3.66
  Downloading nvidia_cublas_cu11-11.10.3.66-py3-none-manylinux1_x86_64.whl (317.1 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 317.1/317.1 MB 8.9 MB/s eta 0:00:00
Collecting nvidia-cuda-runtime-cu11==11.7.99
  Downloading nvidia_cuda_runtime_cu11-11.7.99-py3-none-manylinux1_x86_64.whl (849 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 849.3/849.3 kB 112.2 MB/s eta 0:00:00
Collecting nvidia-cudnn-cu11==8.5.0.96
  Downloading nvidia_cudnn_cu11-8.5.0.96-2-py3-none-manylinux1_x86_64.whl (557.1 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 557.1/557.1 MB 6.0 MB/s eta 0:00:00
Requirement already satisfied: wheel in /home/annahaz/.conda/envs/py/lib/python3.9/site-packages (from nvidia-cublas-cu11==11.10.3.66->torch>=1.7.0->detoxify) (0.37.1)
Requirement already satisfied: setuptools in /home/annahaz/.conda/envs/py/lib/python3.9/site-packages (from nvidia-cublas-cu11==11.10.3.66->torch>=1.7.0->detoxify) (63.4.1)
Collecting regex!=2019.12.17
  Downloading regex-2022.10.31-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (769 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 770.0/770.0 kB 116.5 MB/s eta 0:00:00
Requirement already satisfied: numpy>=1.17 in /home/annahaz/.conda/envs/py/lib/python3.9/site-packages (from transformers!=4.18.0->detoxify) (1.23.4)
Requirement already satisfied: tqdm>=4.27 in /home/annahaz/.conda/envs/py/lib/python3.9/site-packages (from transformers!=4.18.0->detoxify) (4.64.1)
Requirement already satisfied: pyyaml>=5.1 in /home/annahaz/.conda/envs/py/lib/python3.9/site-packages (from transformers!=4.18.0->detoxify) (6.0)
Collecting filelock
  Downloading filelock-3.8.2-py3-none-any.whl (10 kB)
Requirement already satisfied: packaging>=20.0 in /home/annahaz/.conda/envs/py/lib/python3.9/site-packages (from transformers!=4.18.0->detoxify) (21.3)
Requirement already satisfied: requests in /home/annahaz/.conda/envs/py/lib/python3.9/site-packages (from transformers!=4.18.0->detoxify) (2.28.1)
Collecting huggingface-hub<1.0,>=0.10.0
  Downloading huggingface_hub-0.11.1-py3-none-any.whl (182 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 182.4/182.4 kB 103.0 MB/s eta 0:00:00
Collecting tokenizers!=0.11.3,<0.14,>=0.11.1
  Downloading tokenizers-0.13.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (7.6 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7.6/7.6 MB 33.4 MB/s eta 0:00:00
Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in /home/annahaz/.conda/envs/py/lib/python3.9/site-packages (from packaging>=20.0->transformers!=4.18.0->detoxify) (3.0.9)
Requirement already satisfied: certifi>=2017.4.17 in /home/annahaz/.conda/envs/py/lib/python3.9/site-packages (from requests->transformers!=4.18.0->detoxify) (2022.9.24)
Requirement already satisfied: charset-normalizer<3,>=2 in /home/annahaz/.conda/envs/py/lib/python3.9/site-packages (from requests->transformers!=4.18.0->detoxify) (2.1.1)
Requirement already satisfied: urllib3<1.27,>=1.21.1 in /home/annahaz/.conda/envs/py/lib/python3.9/site-packages (from requests->transformers!=4.18.0->detoxify) (1.26.12)
Requirement already satisfied: idna<4,>=2.5 in /home/annahaz/.conda/envs/py/lib/python3.9/site-packages (from requests->transformers!=4.18.0->detoxify) (3.4)
Installing collected packages: tokenizers, sentencepiece, typing-extensions, regex, nvidia-cuda-runtime-cu11, nvidia-cuda-nvrtc-cu11, nvidia-cublas-cu11, filelock, nvidia-cudnn-cu11, huggingface-hub, transformers, torch, detoxify
Successfully installed detoxify-0.5.0 filelock-3.8.2 huggingface-hub-0.11.1 nvidia-cublas-cu11-11.10.3.66 nvidia-cuda-nvrtc-cu11-11.7.99 nvidia-cuda-runtime-cu11-11.7.99 nvidia-cudnn-cu11-8.5.0.96 regex-2022.10.31 sentencepiece-0.1.97 tokenizers-0.13.2 torch-1.13.0 transformers-4.25.1 typing-extensions-4.4.0

Additional information:
Python 3.9.13 (haa1d7c7_2) on Linux

@robvanvolt

Me too... python 3.8 on Linux. Maybe this is related to the latest release? It worked fine before.

@jamt9000 added the bug (Something isn't working) label on Dec 10, 2022
@user1342

user1342 commented Dec 10, 2022

As a quick fix for this: on version 0.5.0, I was able to get around this error by commenting out line 2572 in modeling_utils.py (see this fork of transformers):

Replace:
folder = os.path.sep.join(resolved_archive_file[0].split(os.path.sep)[:-1])
with
#folder = os.path.sep.join(resolved_archive_file[0].split(os.path.sep)[:-1])

I'm unsure whether this fix will have repercussions later down the line; however, for the time being it seems to be working.
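
If it helps, here is one way to locate the installed file to edit (a minimal sketch, assuming a standard pip install of transformers into the active environment):

import os
import transformers

# modeling_utils.py lives alongside the package's __init__.py
pkg_dir = os.path.dirname(transformers.__file__)
print(os.path.join(pkg_dir, "modeling_utils.py"))  # open this file and comment out the offending line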

@annabechang
Author

(quoting @user1342's workaround above)

Thank you, this works for me as well!

@chrisji

chrisji commented Dec 12, 2022

As a workaround, the v4.22.1 release of transformers appears to work as expected. So if installing via pip:

pip install detoxify
pip install transformers==4.22.1

(Other versions between the latest release and v4.22.1 may also work)
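
A quick sanity check after pinning (this just repeats the snippet from the original report; the checkpoint is downloaded on first use):

from detoxify import Detoxify

# Should load and score without the 'NoneType' error on transformers 4.22.1
results = Detoxify("original").predict("some text")
print(results)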

@laurahanu
Collaborator

Hey all, thanks for raising this and good to see there's a workaround!

We will look into this asap!

@laurahanu
Collaborator

Hey everyone, I've pinned the transformers version to 4.22.1 as a quick fix for now while we investigate this problem further. Everything should hopefully work as expected if you reinstall the package or switch transformers to the pinned version!

@MartinPicc

It is working for me with version 4.24.0; the next version, however, raises this error.
I believe it comes from this specific PR on the transformers library, merged in 4.25.1: huggingface/transformers#20321

However, it's not clear to me whether this is expected behavior on transformers' side or whether they introduced a bug.
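
For anyone who wants to fail fast instead of hitting the traceback, a minimal sketch of a version guard based on the observations above (treating 4.25.0 as the cutoff is an assumption; only 4.24.0 and 4.25.1 were actually tested here):

import transformers
from packaging import version  # packaging is already a transformers dependency

# 4.24.0 is reported to work, 4.25.1 to raise the 'NoneType' error
if version.parse(transformers.__version__) >= version.parse("4.25.0"):
    print(
        f"transformers {transformers.__version__} may hit the 'NoneType' error when loading Detoxify; "
        "consider pinning to 4.24.0 or 4.22.1"
    )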

@MartinPicc

MartinPicc commented Mar 7, 2023

It is apparently a bug on the transformers side, and it should be fixed in the next release by this PR: huggingface/transformers#21542

If you need a solution right now, you can also install transformers directly from their main branch; the PR is already merged.
