
Argument 'bb' has incorrect type #289

Closed
Zhang-O opened this issue Jan 16, 2020 · 6 comments
Zhang-O commented Jan 16, 2020

I trained on my custom data and encountered the following error. I have no idea why the earlier batches ran normally and then it suddenly failed.

CUDA_VISIBLE_DEVICES=7 python train.py --config=yolact_base_config --batch_size=4
Scaling parameters by 0.50 to account for a batch size of 4.
Per-GPU batch size is less than the recommended limit for batch norm. Disabling batch norm.
loading annotations into memory...
Done (t=0.05s)
creating index...
index created!
loading annotations into memory...
Done (t=0.08s)
creating index...
index created!
Initializing weights...
Begin training!

[ 0] 0 || B: 4.888 | C: 11.120 | M: 4.166 | S: 1.502 | T: 21.677 || ETA: 153 days, 1:00:08 || timer: 8.264
[ 0] 10 || B: 4.933 | C: 6.254 | M: 4.378 | S: 1.342 | T: 16.908 || ETA: 40 days, 10:30:16 || timer: 0.314
[ 0] 20 || B: 4.806 | C: 5.413 | M: 4.367 | S: 1.173 | T: 15.759 || ETA: 32 days, 19:32:42 || timer: 3.033
[ 0] 30 || B: 4.594 | C: 4.978 | M: 4.319 | S: 1.059 | T: 14.950 || ETA: 32 days, 1:39:48 || timer: 0.302
[ 0] 40 || B: 4.433 | C: 4.705 | M: 4.300 | S: 0.963 | T: 14.402 || ETA: 31 days, 19:30:51 || timer: 2.198
[ 0] 50 || B: 4.269 | C: 4.466 | M: 4.279 | S: 0.904 | T: 13.919 || ETA: 30 days, 7:44:57 || timer: 0.304
[ 0] 60 || B: 4.120 | C: 4.286 | M: 4.272 | S: 0.855 | T: 13.533 || ETA: 30 days, 7:43:06 || timer: 0.317
[ 0] 70 || B: 4.114 | C: 4.131 | M: 4.253 | S: 0.816 | T: 13.314 || ETA: 30 days, 12:33:05 || timer: 0.312
[ 0] 80 || B: 4.071 | C: 3.974 | M: 4.225 | S: 0.773 | T: 13.042 || ETA: 31 days, 22:09:37 || timer: 0.311
Traceback (most recent call last):
  File "train.py", line 502, in <module>
    train()
  File "train.py", line 268, in train
    for datum in data_loader:
  File "/home/ubuntu/anaconda3/envs/zyl/lib/python3.6/site-packages/torch/utils/data/dataloader.py", line 801, in __next__
    return self._process_data(data)
  File "/home/ubuntu/anaconda3/envs/zyl/lib/python3.6/site-packages/torch/utils/data/dataloader.py", line 846, in _process_data
    data.reraise()
  File "/home/ubuntu/anaconda3/envs/zyl/lib/python3.6/site-packages/torch/_utils.py", line 369, in reraise
    raise self.exc_type(msg)
TypeError: Caught TypeError in DataLoader worker process 2.
Original Traceback (most recent call last):
  File "/home/ubuntu/anaconda3/envs/zyl/lib/python3.6/site-packages/torch/utils/data/_utils/worker.py", line 178, in _worker_loop
    data = fetcher.fetch(index)
  File "/home/ubuntu/anaconda3/envs/zyl/lib/python3.6/site-packages/torch/utils/data/_utils/fetch.py", line 44, in fetch
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/home/ubuntu/anaconda3/envs/zyl/lib/python3.6/site-packages/torch/utils/data/_utils/fetch.py", line 44, in <listcomp>
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/data/zhangyl/yolact-master/data/coco.py", line 94, in __getitem__
    im, gt, masks, h, w, num_crowds = self.pull_item(index)
  File "/data/zhangyl/yolact-master/data/coco.py", line 149, in pull_item
    masks = [self.coco.annToMask(obj).reshape(-1) for obj in target]
  File "/data/zhangyl/yolact-master/data/coco.py", line 149, in <listcomp>
    masks = [self.coco.annToMask(obj).reshape(-1) for obj in target]
  File "/home/ubuntu/anaconda3/envs/zyl/lib/python3.6/site-packages/pycocotools/coco.py", line 431, in annToMask
    rle = self.annToRLE(ann)
  File "/home/ubuntu/anaconda3/envs/zyl/lib/python3.6/site-packages/pycocotools/coco.py", line 416, in annToRLE
    rles = maskUtils.frPyObjects(segm, h, w)
  File "pycocotools/_mask.pyx", line 293, in pycocotools._mask.frPyObjects
TypeError: Argument 'bb' has incorrect type (expected numpy.ndarray, got list)

@VinniaKemala

@Zhang-O Have you solved this problem?
See here: cocodataset/cocoapi#139
It worked for me.

dbolya (Owner) commented Jan 24, 2020

Yeah, one of your gt annotations is throwing this error. Try to identify which one it is (by printing the image id or something every iteration), and then you should be able to apply the fix @VinniaKemala is talking about.
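The suggestion above can be sketched as a single-threaded loop that catches the TypeError per sample. `ToyDataset` here is a made-up stand-in for yolact's `COCODetection` (with item 3 rigged to fail the way a bad annotation does); only the `find_bad_indices` loop is the actual technique:

```python
# Sketch of the debugging loop: iterate the dataset directly (i.e. not
# through worker processes) and catch the TypeError so the offending
# sample index and image id are printed.

class ToyDataset:
    """Hypothetical stand-in for yolact's COCODetection."""

    def __init__(self):
        self.ids = [101, 102, 103, 104, 105]  # pretend COCO image ids

    def __len__(self):
        return len(self.ids)

    def __getitem__(self, index):
        if index == 3:  # simulate the annotation that breaks pycocotools
            raise TypeError("Argument 'bb' has incorrect type")
        return index


def find_bad_indices(dataset):
    """Return the dataset indices whose samples raise TypeError."""
    bad = []
    for idx in range(len(dataset)):
        try:
            dataset[idx]
        except TypeError as exc:
            print(f"index {idx} (image id {dataset.ids[idx]}) failed: {exc}")
            bad.append(idx)
    return bad
```

Running a loop like this over the real dataset object (rather than the DataLoader, so the exception is not re-raised from a worker process) narrows the problem down to concrete image ids.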

Zhang-O (Author) commented Mar 9, 2020

It was because some boxes in my labelme JSON files were rectangles, not polygons. After I changed all the rectangles to polygons, the error disappeared.
Thank you!
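For anyone hitting the same labelme issue, the rectangle-to-polygon conversion can be scripted rather than redone by hand in the GUI. This is a minimal sketch assuming the standard labelme layout, where a rectangle shape stores two opposite corners in `shape["points"]`:

```python
# Sketch: expand a two-point labelme "rectangle" shape into a four-corner
# "polygon" shape, so the exported COCO segmentation has 8 coordinates
# instead of 4 (pycocotools misreads 4-coordinate polygons as bboxes).

def rectangle_to_polygon(shape):
    """Convert a labelme rectangle shape dict to a polygon shape dict in place."""
    if shape.get("shape_type") != "rectangle":
        return shape  # leave polygons and other shape types untouched
    (x1, y1), (x2, y2) = shape["points"]
    shape["shape_type"] = "polygon"
    shape["points"] = [[x1, y1], [x2, y1], [x2, y2], [x1, y2]]
    return shape
```

Applying this to every entry in a labelme file's `"shapes"` list before converting to COCO format should avoid the error at the source.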

Zhang-O closed this as completed Mar 9, 2020
rivered commented Oct 14, 2020

This was a painstaking effort for me. Surely we are not going to convert all rectangles to polygons by hand across a few thousand annotated images, as Zhang-O proposed.

I loaded my erroneous file and searched for the segmentation that was pretending to be a bounding box:

# pycocotools treats a "polygon" with exactly 4 coordinates as a bounding
# box, and then expects an ndarray rather than a list (ndarrays are not
# JSON serialisable), which raises:
# TypeError: Argument 'bb' has incorrect type (expected numpy.ndarray, got list)
import json

# Find the annotations in the JSON that trigger the error
JSON_LOC = "/home/Desktop/annotation.json"

with open(JSON_LOC, "r") as val_json:
    json_object = json.load(val_json)

for i, instance in enumerate(json_object["annotations"]):
    if len(instance["segmentation"][0]) == 4:
        print("instance number", i, "raises error:", instance["segmentation"][0])

This printed:

instance number 1030 raises error: [461, 449, 461, 449]

Now we can simply adjust it, or delete it, or do whatever, and write the file back:

# Replace the offending segmentation with something that does not trigger the error
json_object["annotations"][1030]["segmentation"] = [[461, 449, 462, 449, 461, 449]]

# Write back the altered JSON
with open(JSON_LOC, "w") as val_json:
    json.dump(json_object, val_json)

The error did not show up anymore after this change.

@MichaelMano3

(quoting rivered's workaround above in full)

Hi, I also have this problem, and I want to ask how to use this. I saved this code as a Python file and ran it, but it shows "python3: can't open file 'json':", even though I definitely set the right path for the JSON file.
I also tried to run it in VS Code, and it shows:
json_object["annotations"][1030]["segmentation"] = [[461, 449, 462, 449, 461, 449]]
IndexError: list index out of range

rivered commented Nov 11, 2020

(quoting MichaelMano3's question above)

This solution was not written as an executable Python script, so you likely have to set the JSON file location inside the script rather than pass it on the command line. The IndexError also makes sense: the index 1030 was specific to my file, so this was a custom fix for my particular problem.

You therefore probably want to replace

    print("instance number", i, "raises error:", instance["segmentation"][0])

with

    json_object["annotations"][i]["segmentation"] = [[461, 449, 462, 449, 461, 449]]

and remove this part:

    # Replace the offending segmentation with something that does not trigger the error
    json_object["annotations"][1030]["segmentation"] = [[461, 449, 462, 449, 461, 449]]

It would probably be wiser to simply remove such a segmentation instead of replacing it with an unrelated dummy polygon, or to use the bad segmentation to generate a new valid one.
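That removal alternative can be sketched as follows; `drop_degenerate` is a hypothetical helper that deletes every annotation whose polygon has only 4 coordinates (the ones pycocotools misreads as bounding boxes), rather than patching them one by one:

```python
# Sketch: remove all annotations with degenerate 4-coordinate "polygons"
# from a COCO-style JSON file, then write the file back. Returns how many
# annotations were dropped.
import json


def drop_degenerate(path):
    with open(path, "r") as f:
        data = json.load(f)
    # keep only annotations where every polygon has more than 4 coordinates
    kept = [ann for ann in data["annotations"]
            if all(len(poly) > 4 for poly in ann["segmentation"])]
    removed = len(data["annotations"]) - len(kept)
    data["annotations"] = kept
    with open(path, "w") as f:
        json.dump(data, f)
    return removed
```

Note this assumes every `"segmentation"` entry is a list of polygon lists (not RLE); RLE-encoded annotations would need a separate branch.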
