Detections __getitem__() does not handle boolean index arrays correctly for the data dictionary in the Detections object. #1061
Hi @rolson24! 👋🏻 Good catch! I conducted my own test, and its results confirm your conclusions.

import numpy as np
import supervision as sv
xyxy = np.array([
[100, 100, 110, 110],
[200, 200, 220, 220],
[300, 300, 330, 330],
[400, 400, 440, 440],
[500, 500, 550, 550]
], dtype=float)
confidence = np.array([
0.1, 0.2, 0.3, 0.4, 0.5
], dtype=float)
class_id = np.array([
1, 2, 3, 4, 5
], dtype=int)
letter = [
'a', 'b', 'c', 'd', 'e'
]
detections = sv.Detections(
xyxy=xyxy,
class_id=class_id,
confidence=confidence,
data={
"letter": letter
}
)
detections[[1, 2]]
# Detections(
# xyxy=array([
# [200., 200., 220., 220.],
# [300., 300., 330., 330.]]),
# mask=None,
# confidence=array([0.2, 0.3]),
# class_id=array([2, 3]),
# tracker_id=None,
# data={'letter': ['b', 'c']}
# )
# OK.
detections[np.array([1, 2])]
# Detections(
# xyxy=array([
# [200., 200., 220., 220.],
# [300., 300., 330., 330.]]),
# mask=None,
# confidence=array([0.2, 0.3]),
# class_id=array([2, 3]),
# tracker_id=None,
# data={'letter': ['b', 'c']}
# )
# OK.
detections[np.array([1, 1, 1, 1, 1])]
# Detections(
# xyxy=array([
# [200., 200., 220., 220.],
# [200., 200., 220., 220.],
# [200., 200., 220., 220.],
# [200., 200., 220., 220.],
# [200., 200., 220., 220.]]),
# mask=None,
# confidence=array([0.2, 0.2, 0.2, 0.2, 0.2]),
# class_id=array([2, 2, 2, 2, 2]),
# tracker_id=None,
# data={'letter': ['b', 'b', 'b', 'b', 'b']}
# )
# OK.
detections[np.array([True, True, True, True, True], dtype=bool)]
# Detections(
# xyxy=array([
# [100., 100., 110., 110.],
# [200., 200., 220., 220.],
# [300., 300., 330., 330.],
# [400., 400., 440., 440.],
# [500., 500., 550., 550.]]),
# mask=None,
# confidence=array([0.1, 0.2, 0.3, 0.4, 0.5]),
# class_id=array([1, 2, 3, 4, 5]),
# tracker_id=None,
# data={'letter': ['b', 'b', 'b', 'b', 'b']}
# )
# WRONG DATA.
detections[np.array([False, False, False, False, False], dtype=bool)]
# ---------------------------------------------------------------------------
# ValueError Traceback (most recent call last)
# <ipython-input-26-634699880563> in <cell line: 1>()
# ----> 1 detections[np.array([False, False, False, False, False], dtype=bool)]
#
# 4 frames
# /usr/local/lib/python3.10/dist-packages/supervision/detection/utils.py in validate_data(data, n)
# 724 if isinstance(value, list):
# 725 if len(value) != n:
# --> 726 raise ValueError(f"Length of list for key '{key}' must be {n}")
# 727 elif isinstance(value, np.ndarray):
# 728 if value.ndim == 1 and value.shape[0] != n:
#
# ValueError: Length of list for key 'letter' must be 0
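Both failures above are consistent with the `data` lists being indexed element by element instead of being masked as arrays (an assumption about the implementation, but it reproduces both symptoms exactly): iterating over a boolean NumPy array yields `True`/`False` values, which a plain Python list treats as integer positions 1 and 0.

```python
import numpy as np

letter = ['a', 'b', 'c', 'd', 'e']

all_true = np.array([True, True, True, True, True])
all_false = np.array([False, False, False, False, False])

# Indexing a plain Python list with np.bool_ values treats each one
# as an integer: True -> position 1, False -> position 0.
print([letter[i] for i in all_true])   # ['b', 'b', 'b', 'b', 'b']  <- the WRONG DATA case
print([letter[i] for i in all_false])  # ['a', 'a', 'a', 'a', 'a']  <- length 5, but 0 rows selected

# Converting to an ndarray first applies the boolean mask correctly:
print(np.asarray(letter)[all_true].tolist())   # ['a', 'b', 'c', 'd', 'e']
print(np.asarray(letter)[all_false].tolist())  # []
```

The all-`False` case produces a 5-element list while `xyxy` shrinks to 0 rows, which is exactly the length mismatch `validate_data` reports in the traceback.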
SkalskiP added a commit referencing this issue on Mar 28, 2024: fix Detections.get_data_item() bug (#1061)

It was fixed with #1062. Merging!

Great! Thanks!
Search before asking
Bug
When I tried to use boolean array indexing to get specific detections from a Detections object, the data field sometimes returned the wrong values (every entry became the second value) and sometimes it failed outright. See the output of the code in the minimal reproducible example below:
Environment
Minimal Reproducible Example
https://colab.research.google.com/drive/1C-wlttD-awePYPTCCPmMfIEtME9o3dN7?usp=sharing
Additional
I am pretty sure this is because of how the get_data_item() function in supervision/detection/utils.py handles indexes that are NumPy arrays. I think it's a pretty simple fix, and I can submit a PR.
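One possible shape for such a fix is to route boolean-mask indices through NumPy before converting list values back to lists. This is a hypothetical sketch, not the actual supervision code; `subset_data` and its signature are invented here for illustration:

```python
import numpy as np

def subset_data(data: dict, index) -> dict:
    """Hypothetical sketch: index every value in a Detections-style
    data dict, handling int, slice, list, and ndarray indices,
    including boolean masks."""
    subset = {}
    for key, value in data.items():
        if isinstance(value, np.ndarray):
            subset[key] = value[index]  # NumPy handles masks natively
        elif isinstance(value, list):
            if isinstance(index, np.ndarray) and index.dtype == np.bool_:
                # Apply the boolean mask via NumPy so that empty and
                # full masks both behave, then convert back to a list.
                subset[key] = np.asarray(value, dtype=object)[index].tolist()
            elif isinstance(index, slice):
                subset[key] = value[index]
            elif isinstance(index, (int, np.integer)):
                subset[key] = [value[index]]
            else:
                subset[key] = [value[i] for i in index]
        else:
            raise TypeError(f"Unsupported data value type for key '{key}'")
    return subset
```

For example, `subset_data({"letter": ['a', 'b', 'c', 'd', 'e']}, np.array([True, False, True, False, False]))` yields `{'letter': ['a', 'c']}`, and an all-`False` mask yields an empty list instead of raising.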
Are you willing to submit a PR?