
Convert coco segment to yolo segment #49

Open
AISoltani opened this issue May 12, 2023 · 15 comments
@AISoltani

Hi, how can I convert COCO segments to YOLO segments with this tool?

@ryouchinsa

In the general_json2yolo.py script, pass the directory containing your COCO JSON file as the first argument of convert_coco_json(), then run the script.

if __name__ == '__main__':
    source = 'COCO'

    if source == 'COCO':
        convert_coco_json('../datasets/coco/annotations',  # directory with *.json
                          use_segments=True,
                          cls91to80=True)
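
For reference, each line of a generated YOLO segmentation label file is a class index followed by a flat list of normalized polygon coordinates. A minimal sketch of parsing such a line (the example values are made up):

```python
# One YOLO segmentation label line: "<class> x1 y1 x2 y2 ..."
# with coordinates normalized to [0, 1]; the values below are made up.
line = "0 0.10 0.20 0.55 0.20 0.55 0.80 0.10 0.80"
parts = line.split()
cls = int(parts[0])
coords = list(map(float, parts[1:]))
# Pair the flat list into (x, y) polygon vertices
polygon = list(zip(coords[0::2], coords[1::2]))
print(cls, polygon)
```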

@Lokesh-26

@ryouchinsa hey, I replaced the argument with the json directory. But the generated images and labels folders are empty. Am I missing anything else?

@ryouchinsa

Thanks for writing the issue.
We checked that it works using the general_json2yolo.py script.

Looking at the general_json2yolo.py code, there are several flags.
Converting the COCO bbox format to YOLO bbox format.

use_segments=False,
use_keypoints=False,

Converting the COCO segmentation format to YOLO segmentation format.

use_segments=True,
use_keypoints=False,

Converting the COCO keypoints format to YOLO keypoints format.

use_segments=False,
use_keypoints=True,

To convert the COCO segmentation format to YOLO segmentation format.

if __name__ == '__main__':
    source = 'COCO'

    if source == 'COCO':
        convert_coco_json('../datasets/coco/annotations',  # directory with *.json
                          use_segments=True,
                          use_keypoints=False,
                          cls91to80=False)

This is the folder structure when we run the script.

(screenshot of the generated new_dir folder with images and labels subfolders)

Please let us know your opinion.

@Lokesh-26

Thanks for answering. I am getting the error below. I am just trying to convert without the segmentation features.

Annotations /home/loki/segmentation/datasets/coco/annotations/scene_gt_coco_modal.json: 0%| | 0/100 [01:29<?, ?it/s]
Traceback (most recent call last):
  File "/snap/pycharm-community/350/plugins/python-ce/helpers/pydev/pydevd.py", line 1500, in _exec
    pydev_imports.execfile(file, globals, locals)  # execute the script
  File "/snap/pycharm-community/350/plugins/python-ce/helpers/pydev/_pydev_imps/_pydev_execfile.py", line 18, in execfile
    exec(compile(contents + "\n", file, 'exec'), glob, loc)
  File "/home/loki/segmentation/COCO2YOLO/general_json2yolo.py", line 439, in <module>
    convert_coco_json('/home/loki/segmentation/datasets/coco/annotations',  # directory with *.json
  File "/home/loki/segmentation/COCO2YOLO/general_json2yolo.py", line 319, in convert_coco_json
    with open((fn / f).with_suffix('.txt'), 'a') as file:
FileNotFoundError: [Errno 2] No such file or directory: 'new_dir/labels/scene_gt_coco_modal/rgb/000000.txt'

I do not understand the error, because the .txt file should be generated, not read.

My directory: /home/loki/segmentation/datasets/coco/annotations/scene_gt_coco_modal.json

Also, my COCO JSON uses RLE for the segmentation. Below is my COCO format:

"segmentation": {"counts": [786329, 17, 1183, 36, 1164, 56, 1144, 75, 1125, 94, 1106, 113, 1087, 132, 1068, 151, 1049, 171, 1029, 189, 1011, 209, 991, 228, 972, 247,.......], "size": [1200, 1944]}.

How do I convert these to YOLO format with your script?
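
For context, a counts list like the one above is COCO's uncompressed RLE: alternating run lengths of background and foreground pixels, stored in column-major order. This pure-Python sketch (pycocotools does the equivalent in C) shows what the encoding means, using a tiny made-up mask:

```python
import numpy as np

def decode_uncompressed_rle(counts, size):
    """Decode COCO uncompressed RLE into a binary mask.

    counts alternates run lengths of 0s and 1s (background first),
    laid out in column-major (Fortran) order; size is [height, width].
    """
    h, w = size
    flat = np.zeros(h * w, dtype=np.uint8)
    pos, val = 0, 0
    for run in counts:
        flat[pos:pos + run] = val
        pos += run
        val = 1 - val
    return flat.reshape((h, w), order="F")

# Tiny made-up example: a 3x3 mask with 3 foreground pixels
m = decode_uncompressed_rle([2, 3, 4], [3, 3])
print(m)
```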

@ryouchinsa

In our environment, this COCO RLE format is correctly converted to the YOLO segmentation format.
If possible, could you share your COCO file with us? We can check whether it can be converted.

{
    "annotations": [
    {
        "area": 160373,
        "bbox": [2152, 424, 596, 420],
        "category_id": 1,
        "id": 1,
        "image_id": 1,
        "iscrowd": 0,
        "segmentation":
        {
            "counts": "g__V67jm2e0C9F:K5K5L0N5K3M2O3K4O00001N4K2O2O2N1O0O1M6M1O0O2M4N0O2N2O2L3N102N0O2M3N3N1N1N3O0O3N2M3L3O0O2O002N1N2O2L3O000O2O3M0O2O2L3O000O4M1N10001O0N4M2O1O002N1O0O103K3O00002N001N2O0N4N1N10001O2N1O0O2O2N0N201N2O002N1O000O2O2L201N10001O2N1N10001O0N4N1N101O2N00001N1N3O002M10001O00001N102N000N3O000O2O00002N1O000O102N000N3O0O10001oZMPHbc2P8^\\MPHbc2P8W\\M[Hec2g7l[MQIic2P7o[M[Ilc2f6n[MeImc2b8L4L0N5M1O1O3L102N001O3M1O1O2M2O00003K2000000O10001O0000000000000O10000N2000000O2O000000000000000O3N0000000N20000O2O000000000O0O2OO20M2N3O00OO3O010N2L30O11N1100000ON300O100O100N1N3O10N11N200O0N3O100O1N200O0M4O1M3OOO300O1SOW]MQFlb2o9]]MfEfb2Z:g0000000000O100001O000001O000000000000011N0000000011N000000000010O002N000000001O01O002N000010O0001O002N00001O0011N02N00001O00001O00002N10O000001O00003M0000001O00003M1O3M0020N01021M002OO02M01020N010OO10O02N2O10O00O>B000101N10O002OO01O03M02N10O00002O01N0010O2N000101N010O02N000101N02OO01O000101N11N1O000003NO2N00000000000000001O0000000000000000000000000000000000000000000000000O11O000000000000O10O2O00O1000N3O001O001O0O100000000000000O1000001O00ON3M5M1M3M300O1ZNeZMTJ`e2dN]ZMV73SJee2m5^ZMmIee2S6[ZMlIfe2S6_ZMeIee2[6[ZMeIee2[6^ZMaIce2_6W10O0O12N000010O1O2N107H0TYMbHhf2h7L6H3N0O3M3M0010O2N001O00000M4N1M3K7F8I701G8RLbXM=mg2BTXM9Rh2DPXM9Sh2GoWM5Uh2HlWM4Xh2LhWM4Xh2JjWM3Yh2LiWML_h23aWMJbh25_WMJbh24`WMIdh25]WMJeh25[WMKeh25[WMHih24[WMGih28XWMFkh29UWMDnh2<TWMAoh2<RWMAQi2?oVMARi2<QWMBQi2<PWM@Ti2?mVM^OWi2?lVM_OWi2>jVM_OYi2`0jVM[OYi2e0gVM[OZi2b0fVM_O]i2=eVM_O`i2>aVM@bi2?_VMAbi2=_VMBci2;_VMCdi2<\\VMDdi2;]VMDdi29^VMGei26\\VMHgi26[VMHgi25[VMKfi23[VMLfi24ZVMLii20YVMMWj2BoUM:Sl2O1N3K6J40afk`5",
            "size": [3022, 4666]
        }
    },
    {
        "area": 246140,
        "bbox": [2960, 1336, 602, 784],
        "category_id": 1,
        "id": 2,
        "image_id": 1,
        "iscrowd": 0,
        "segmentation":
        {
            "counts": "\\RTa84Un28I7M0N3N6H6L4M0O2O0M3L7J3O2K4O2O0M5J5O001O0O3N0N3O000O1N3O000O3N0O2O0N20000O101N100N4N000O1iM[MeXMe2Xg2cMbXM^2]g2jM\\XMV2cg2RNTXMP2jg2WNPXMj1Ph2VNPXMj1og2ZNnWMf1Qh2^NlWM`1Vh2aNhWM`1Xh2aNgWM_1Yh2dNbWM^1^h2cN`WM^1`h2bN`WM^1ah2dN\\WM\\1dh2gNYWMY1me2mL`\\Mk1aMY1]e2cNm[M5cNZ1Pe2Lb[MmN[OW1fd2a0g[Mc1Yd2]Ng[Mc1kc2ROn[Mn0jc2o4L4L4N2L310LS1kN;J4L5J5L3N0N4L2N2OO11N200O1L310O1N2O10000O010N200OO20000O01N11N11000O10O01N020O10000O0O2O1000000O1L4M3M3N1N300M3L4L2YOi01O1O002M2O3M1O003M001O1O3M0O2O3M00003N0O1O2N1O002N1O3M1O001O003M1O3M010O2N001O3M1O1O20OO001O3N0O2N1O00003N0O3OOO010O003M2N100O20OO00102M1O2N1O00012OOO3M2O2M3M002N6K1N7K5I8H005L;D7I7I5L6I005M5J2M2N3M4L003N4K3O2L3N0O004L5K3N3L3M2N021N3L1O2N1O0001O03ON0101N1O0002OO2O0O001O002O01N01O2O0O000010O002N00001O00001N1000002N1O0000001O002N1O001O2N1O00001O2N1O001O00003M001O2N00001O1O2N1N100N201N4L1N201N3N001N1M5N1O0O1N200O100O0O200O10000M300O1N20000O1O1YO_bM_@c]2b?`bMY@a]2j?cbMo_O]]2Q`0`0003M1O3M1O1O2N0N3O4L003M1O00003M2N2N1O2N1O00003M1O001O2N001O1O3M00110N0100010O0O010O021N4L3N1N3L4L01204I2O2O2M2M01113J2O2O2M2M00213K0O2O01OO0111OO11N0020M01O40OM102f_OfdM]=X[2bBPeMV=P[2jBPeMV=S[2gBUeMR=mZ2kBbeMh<]Z2WCgeMe<\\Z2VCjeMg<VZ2PBmdMEQ1W>UZ2bAfgMZ>ZX2fAfgMZ>[\\2M100O0O1O2N0010010O0O1O2N0010010O0O001O002O010O0O0nGl_Mj1U`2TNo_Mi1Q`2WNo_Mi1S`2UNn_Ml1Q`2PNV`Mm1j_2QNX`Mn1j_2PNY`Mm1h_2RNY`Mn1f_2RN\\`Ml1d_2TN\\`Ml1e_2SN\\`Mn1e_2oM_`Mm1a_2TNa`Mi1`_2VNa`Mi1a_2TNb`Mj1__2UNb`Mj1^_2VNb`Mj1^_2VNe`Mh1]_2TNh`Mh1Y_2WNh`Mh1X_2VNk`Mi1V_2VNm`Mg1U_2VNm`Mi1S_2WNm`Mi1T_2VNo`Mg1R_2XNQaMe1Q_2YNPaMf1Q_2XNSaMe1m^2[NTaMd1m^2YNXaMd1h^2\\NXaMd1j^2ZNWaMe1j^2YN[aMd1e^2[N[aMe1g^2YNZaMf1g^2XN\\aMf1d^2ZN]aMe1c^2[N]aMe1d^2XN_aMg1d^2UN`aMj1^^2VNbaMj1`^2TNaaMk1`^2TNbaMj1^^2VNcaMi1]^2WNcaMi1^^2UNcaMk1`^2PNeaMm1[^2RNfaMo1Z^2nMiaMQ2Y^2lMkaMQ2V^2nMjaMR2V^2nMjaMR2V^2mMlaMR2W^2hMoaMU2R^2iMoaMV2R^2hMQbMW2P^2gMTbMV2o]2gMQbMW2Q^2iMoaMW2R^2gMPbMX2T^2aMRbM[2R^2bMnaM^2S^2^MSbM_2P^2]MRbMb2n]2]MSbMc2P^2WMTbMg2m]2YMSbMg2n]2XMTbMf2P^2RMUbMk2P^2QMTbMl2m]2PMWbMn2m
]2oLSbMQ3n]2kLVbMT3j]2lLVbMT3m]2hLVbMV3n]2cLWbM[3j]2cLYbMZ3j]2aLZbM^3S^2ULmaMk3\\d2M202N3M1O0O2O2N001M5L2O001O2N001O4K2O3M3M1O003M5K102M3M1O003M4L3M2N3M1O004L7I4L4L3M4L002N7POmTeU3",
            "size": [3022, 4666]
        }
    },
    {
        "area": 128163,
        "bbox": [2758, 14, 606, 681],
        "category_id": 2,
        "id": 3,
        "image_id": 1,
        "iscrowd": 0,
        "segmentation":
        {
            "counts": "`d^n71]n22N001O1O011N001O00000000001O00000020N01O00000000010O00002N00001O0000000000010O2N00001O0002N000001O0000011N001O00000000001O00002N00010O0000000002OO2N000010O00000001O00002N00001O0001O01O00002N02N01O000000010O002N0000001O0000001O00011N001O000002N0002N00010O00000000001O0001O02N02OO00000001O01O00002N00010O000002N0000010O02N0000001O0001O0001O00020N00010O00000000001O01O0002N02OO00000001O01O00002N0000010O000002N0001O2OO0001O000000000010O000002N11N000000000010O002N001O00000000010O00000020N01O000000011N00001O000000000011N02YSMYOck2g0]TMYOck2T2Eg0YO<E9F9G9G00;E=C3L4F;E;L0H;@b0D;E:F9J7M0N6E>E7I5L6K3O0L7F;I4K7I5aNaEj^Me:Sa2[Em^Me:m`2fEl^M]:h`2WFh^MT:Sa2g1K3N4L2O0M3M6L2N1N5L2O0L4H;I5L3L4L400L6F8L5K4M3L400L4K6L3O3K3O100M4K4M3N2M3O100M3L4M3O1M3O100M3M3O100O1N200O1O1N2O1O1N200O1O0O2OO20O010M3O0O2OJRdMj]OO0Q\\2Sb0ocMm]OV\\2Tb0jcMl]OV\\2Sb013M00MmcMn]OT\\2oa0kcMP^OY\\2oa02200N0K7M111O0N21N2NN311L1SdMc]Ol[2\\b03MQdMg]OQ\\2Yb001L4N3N3LN500LVdMd]Oi[2\\b0XdMc]Og[2\\b061N2OO110N0O22LM44K1110K3OSdMe]Ol[2[b0UdMe]Oj[2[b030110FQdMP^OT\\2Pb0lcMn]OU\\2Rb0121I64KNicMP^OX\\2Pb03M3J52O00L400GZcMg^Oi\\2Za0VcMf^Oj\\2Va0WcMh^OOO01k\\2Wa0VcMi^OQ]2Ua05MibMm^OW]2o`0ibMR_O[]2l`04L6K7EUbMh_Oo]2W`07L2M5K2O2D`aMb@j^2\\?TaMc@n^2Y1YaMW<IdBV_2n0WaM]<CdBX_2h0^aM`<^OeBW_2b0caMb<ROnBb_2`0\\aMb<TOlBa_29eaMf<[`2B^^M`C1Oca2_<\\^MbC1Ofa2[<Z^MeCna2V<8L2Jh]MUD\\b2j;8H6M4L3M1N104L4L3M4K2M5M004K5L4K6K4J6L009Fb0_O9F;F<Ao0TO00k0TOb2\\M111N10O200ON20N20O010010M100O111N10O20M2OO0200N12O10M3L031O0O0O210O0M03OO20N0210O0M02010N02100M0O03000N02100NOO030N20N0300ON01O0200N02100MO030O010N030O0NO3N20N03O10M00O301M3M3O100M2L5OOO3O0O200L5K4O1M5K3O100M4_O`0J6G<E;E802FTXog3",
            "size": [3022, 4666]
        }
    },
    {
        "area": 46773,
        "bbox": [3569, 1962, 356, 173],
        "category_id": 3,
        "id": 4,
        "image_id": 1,
        "iscrowd": 0,
        "segmentation":
        {
            "counts": "ZeYY:4Vn27H7H9M0L8D=F8I8F9I7M0K7N3M1O2M2O1O002L4N2N1N103M001O1O3M001O3M00012M11OO2O0O00102M021M10O0002N2O1N11OO2N010O3N0O021M0001OO2O000N2000000O1000001O0000002M100000000N3O000O1000000000001N100000000N4N000O2O0000000000O10O1000000N20N200000O1000O1000000000O10000N2000000O01000000000000N20O1000000000O100000000000N20000000O1000000000000000000001O00000000000O100000000000000000001O0N200000000000002N000O100000000000N200000000O1000000000000O10000N110000000O1000000000001N10000N2000000O10001N1M300L4I9N00000000001N10000N2000000O10002N0000000O2O00000N2000000O100000001O000O102N00000N20001N100000000000000O10001O0N200O3N00000O1M301O0O1N20001N1O3N1M2O10001N1N4M101O000O1M4O002M1N201N100O2M202N000O2L30001N1N202N1N1O1N3O0O102NVmUT2",
            "size": [3022, 4666]
        }
    }],
    "categories": [
    {
        "id": 1,
        "name": "anemonefish"
    },
    {
        "id": 2,
        "name": "blue tang"
    },
    {
        "id": 3,
        "name": "unknown"
    },
    {
        "id": 4,
        "name": "object973"
    }],
    "images": [
    {
        "file_name": "wembley-S3Vq97p3gSk-unsplash.jpg",
        "height": 3022,
        "id": 1,
        "width": 4666
    }]
}

@Lokesh-26

scene_gt_coco_modal.json
I have attached my JSON file. As far as I can see, it has the same format.

@ryouchinsa

Thanks for attaching the JSON file.
Using general_json2yolo.py, we could convert it to the YOLO segmentation format.

We updated the script so that the subdirectory prefix is stripped from file_name.

h, w, f = img['height'], img['width'], img['file_name']
f = f.split('/')[-1]
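
An alternative to stripping the subdirectory (a sketch, not what the script actually does) would be to create the missing parent directories before opening the label file, which also avoids the FileNotFoundError above:

```python
from pathlib import Path

# Hypothetical label path whose parent directory ("rgb/") may not exist yet
label_path = Path("new_dir/labels/scene_gt_coco_modal/rgb/000000.txt")
# Create any missing parent directories, then append as the script does
label_path.parent.mkdir(parents=True, exist_ok=True)
with open(label_path, "a") as f:
    f.write("0 0.5 0.5 0.2 0.2\n")
```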

We also updated the script so that uncompressed RLE is converted to compressed RLE before decoding.

from pycocotools import mask

def rle2polygon(segmentation):
    # Uncompressed RLE stores "counts" as a list; convert it to
    # compressed RLE before decoding to a binary mask
    if isinstance(segmentation["counts"], list):
        segmentation = mask.frPyObjects(segmentation, *segmentation["size"])
    m = mask.decode(segmentation)

Please let us know your opinion.

@Lokesh-26

It worked perfectly :) Thank you very much

@AISoltani
Author

@ryouchinsa my dear friend,

That's nice! Thank you very much for your great support in solving this problem.

@josh-001

josh-001 commented Nov 3, 2023

kpt_shape: [4, 3]
flip_idx: [0, 2, 1, 3]
Annotations /home/bhanu/JSON2YOLO/pose-1.json: 0%| | 0/13 [00:00<?, ?it/s]
Traceback (most recent call last):
  File "j2y.py", line 439, in <module>
    convert_coco_json('/home/bhanu/JSON2YOLO',  # directory with *.json
  File "j2y.py", line 289, in convert_coco_json
    box[[0, 2]] /= w  # normalize x
IndexError: index 0 is out of bounds for axis 0 with size 0

I am getting this error. Is this because of the bounding boxes? While annotating, I labeled only the keypoints, not the bboxes.

@josh-001

josh-001 commented Nov 4, 2023

bboxpose-1.json
This is my JSON file. I am getting this error:

kpt_shape: [4, 3]
flip_idx: [0, 2, 1, 3]
Annotations /home/bhanu/JSON2YOLO/bboxpose-1.json: 0%| | 0/9 [00:00<?, ?it/s]
Traceback (most recent call last):
  File "j2y.py", line 439, in <module>
    convert_coco_json('/home/bhanu/JSON2YOLO',  # directory with *.json
  File "j2y.py", line 289, in convert_coco_json
    box[[0, 2]] /= w  # normalize x
IndexError: index 0 is out of bounds for axis 0 with size 0

This happens even after I drew the bounding boxes. Where should I make the changes?
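
For what it's worth, the IndexError can be reproduced in isolation: an annotation with "bbox": [] becomes a zero-length NumPy array, so the fancy indexing box[[0, 2]] has nothing to index. A minimal sketch:

```python
import numpy as np

# An annotation with "bbox": [] yields an empty array,
# so box[[0, 2]] fails exactly as in the traceback above.
box = np.array([], dtype=np.float64)
w = 640  # hypothetical image width
try:
    box[[0, 2]] /= w  # normalize x, as in convert_coco_json
except IndexError as e:
    print("IndexError:", e)
```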

@josh-001

josh-001 commented Nov 4, 2023

scene_gt_coco_modal.json I have attached my Json file. As far as I have seen. It has the same format.

Which tool did you use for annotation, @Lokesh-26?

@Lokesh-26


Hey @josh-001 sorry for the late reply. I used the annotation tool from Bop_toolkit

@glenn-jocher
Member

Got it! Using the Bop_toolkit annotation tool could explain the issue you're encountering. In your JSON file, the problem appears to be the bounding box annotations: they may not be formatted correctly, which causes errors during conversion.

To address this, review the bounding box annotations in your JSON file and verify that the coordinates and dimensions are accurately represented; that should allow the current script to convert them successfully.

If you run into further difficulties, feel free to share additional details or the relevant parts of your JSON file.

@ryouchinsa

Hi @josh-001, sorry for the late reply.
I checked your COCO file and found how to fix the problem.
The error occurs because your keypoints annotation has an empty bbox array.

{"id":1,"datatorch_id":"1c054b51-d581-4a9f-b506-132695bf6101","image_id":3,"category_id":1,"keypoints":[892.73390036452,371.53098420413124,1,741.8712029161603,369.97569866342644,1,1046.7071688942892,411.96840826245443,1,894.2891859052248,407.3025516403402,1],"segmentation":[],"area":0,"bbox":[],"iscrowd":0,"metadata":{}},

We added bbox_from_keypoints() and updated the general_json2yolo.py script.
The bbox array is calculated from the keypoints (x, y) values.
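
A sketch of what a helper like bbox_from_keypoints() might do (the actual implementation in general_json2yolo.py may differ): derive a COCO-style [x, y, width, height] box from the keypoint (x, y, visibility) triplets.

```python
def bbox_from_keypoints(keypoints):
    # keypoints is a flat COCO list: [x1, y1, v1, x2, y2, v2, ...]
    xs = keypoints[0::3]
    ys = keypoints[1::3]
    x_min, y_min = min(xs), min(ys)
    # COCO bbox format: [x, y, width, height]
    return [x_min, y_min, max(xs) - x_min, max(ys) - y_min]

print(bbox_from_keypoints([10, 20, 1, 30, 5, 1, 25, 40, 1]))
# → [10, 5, 20, 35]
```

A real implementation would likely also skip keypoints with visibility 0, since COCO stores those as (0, 0) and they would drag the box to the origin.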

Converting the COCO keypoints format to YOLO keypoints format.

use_segments=False,
use_keypoints=True,

Looking at your COCO file, there is also a polygon annotation.
Ideally this bbox array should be written to the keypoints annotation, too.
Which annotation tool are you using?

{"id":23,"datatorch_id":"8d9ce0df-9951-4b68-9a43-dd0dbb1142f1","image_id":3,"category_id":1,"segmentation":[[723.6,335.8,1080.9,391.6,1070.1,446.5,721.8,390.7]],"area":19720.26,"bbox":[721.8,335.8,359.10000000000014,110.69999999999999],"iscrowd":0,"metadata":{}},

