[Fix] Fix UT to be compatible with pytorch 1.6 #8707

Merged
merged 12 commits into open-mmlab:dev-3.x on Sep 5, 2022

Conversation

jbwang1997
Collaborator

Thanks for your contribution; we appreciate it a lot. Following these instructions will make your pull request healthier and help it get feedback more easily. If you do not understand some items, don't worry: just open the pull request and ask the maintainers for help.

Motivation

Please describe the motivation for this PR and the goal you want to achieve with it.

Modification

Please briefly describe the modifications made in this PR.

BC-breaking (Optional)

Does the modification introduce changes that break backward compatibility for downstream repos?
If so, please describe how it breaks compatibility and how downstream projects should modify their code to stay compatible with this PR.

Use cases (Optional)

If this PR introduces a new feature, it is better to list some use cases here, and update the documentation.

Checklist

  1. Pre-commit or other linting tools are used to fix potential lint issues.
  2. The modification is covered by complete unit tests. If not, please add more unit tests to ensure correctness.
  3. If the modification has a potential influence on downstream projects, this PR should be tested with downstream projects, like MMDet or MMCls.
  4. The documentation has been modified accordingly, like docstrings or example tutorials.
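
For context on the kind of change the title describes, a common way to keep unit tests compatible with an older PyTorch release such as 1.6 is to gate version-sensitive tests behind a skip condition. The sketch below is only illustrative; the test name and skip reason are hypothetical and not taken from this PR.

```python
# Illustrative only: gating a test on the installed PyTorch version.
# The test name and the guarded feature are hypothetical, not from this PR.
import pytest
import torch
from packaging import version

TORCH_VERSION = version.parse(torch.__version__.split('+')[0])


@pytest.mark.skipif(
    TORCH_VERSION < version.parse('1.7.0'),
    reason='this code path relies on an API that PyTorch 1.6 does not provide')
def test_feature_requiring_new_torch():
    # Placeholder body; the real tests live under mmdetection's tests/ directory.
    assert torch.ones(2).sum().item() == 2.0
```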

ZwwWayne merged commit e3c0b69 into open-mmlab:dev-3.x on Sep 5, 2022
ZwwWayne pushed a commit that referenced this pull request Sep 26, 2022
* Update

* Update

* force reinstall pycocotools

* Fix build_cuda

* docker install git

* Update

* comment other job to speedup process

* update

* uncomment

* Update

* Update

* Add comments for --force-reinstall
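
Several of these commit messages refer to CI setup steps ("force reinstall pycocotools", "Add comments for --force-reinstall"). The actual changes live in the workflow files; purely as a hedged illustration, a force reinstall driven from Python could look like the sketch below (the helper is hypothetical, not code from this PR).

```python
# Hypothetical helper mirroring the "force reinstall pycocotools" idea.
# pip's --force-reinstall flag reinstalls the package even if a (possibly
# incompatible, pre-built) copy is already present in the environment.
import subprocess
import sys


def force_reinstall(package: str) -> None:
    """Reinstall `package` with pip, ignoring any existing installation."""
    subprocess.check_call(
        [sys.executable, '-m', 'pip', 'install', '--force-reinstall', package])


if __name__ == '__main__':
    force_reinstall('pycocotools')
```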
ZwwWayne pushed a commit that referenced this pull request Oct 8, 2022
* [Fix] Fix UT to be compatible with pytorch 1.6 (#8707)

* Update

* Update

* force reinstall pycocotools

* Fix build_cuda

* docker install git

* Update

* comment other job to speedup process

* update

* uncomment

* Update

* Update

* Add comments for --force-reinstall

* [WIP] Add CrowdHumanDataset

* [WIP] Add CrowdHumanDataset

* [WIP] Add CrowdHumanDataset

* [WIP] Add CrowdHumanDataset

* [WIP] Add CrowdHumanDataset

* [WIP] Add CrowdHumanDataset

* [WIP] Add CrowdHumanDataset

* Add CrowdHumanDataset

* [WIP] Add CrowdHumanDataset

* [Feature] Add CrowdHumanDataset and Metric

* [Feature] Add CrowdHumanDataset and Metric

* [Feature] Add CrowdHumanDataset and Metric

* [Feature] Add CrowdHumanDataset and Metric

* [Feature] Add CrowdHumanDataset and Metric

* [Feature] Add CrowdHumanDataset and Metric

* [Feature] Add CrowdHumanDataset and Metric

* [Feature] Add CrowdHumanDataset and Metric

* [Feature] Add CrowdHumanDataset and Metric

* [Feature] Add CrowdHumanDataset and Metric

* [Feature] Add CrowdHumanDataset and Metric

* [WIP] Add CrowdHumanDataset

* [WIP] Add CrowdHumanDataset

* [WIP] Add CrowdHumanDataset

* [WIP] Add CrowdHumanDataset

* [WIP] Add CrowdHumanDataset

* [WIP] Add CrowdHumanDataset

* [WIP] Add CrowdHumanDataset

* Add CrowdHumanDataset

* [WIP] Add CrowdHumanDataset

* [Feature] Add CrowdHumanDataset and Metric

* [Feature] Add CrowdHumanDataset and Metric

* [Feature] Add CrowdHumanDataset and Metric

* [Feature] Add CrowdHumanDataset and Metric

* [Feature] Add CrowdHumanDataset and Metric

* [Feature] Add CrowdHumanDataset and Metric

* [Feature] Add CrowdHumanDataset and Metric

* [Feature] Add CrowdHumanDataset and Metric

* [Feature] Add CrowdHumanDataset and Metric

* [Feature] Add CrowdHumanDataset and Metric

* [Feature] Add CrowdHumanDataset and Metric

* [Feature] Add CrowdHumanDataset and Metric

* [Feature] Add CrowdHumanDataset and Metric

Co-authored-by: jbwang1997 <jbwang1997@gmail.com>
ZwwWayne pushed a commit that referenced this pull request Oct 20, 2022
* [Fix] Fix UT to be compatible with pytorch 1.6 (#8707)

* Update

* Update

* force reinstall pycocotools

* Fix build_cuda

* docker install git

* Update

* comment other job to speedup process

* update

* uncomment

* Update

* Update

* Add comments for --force-reinstall

* [Refactor] Refactor anchor head and base head with boxlist (#8625)

* Refactor anchor head

* Update

* Update

* Update

* Add a series of boxes tools

* Fix box type to support n x box_dim boxes

* revert box type changes

* Add docstring

* refactor retina_head

* Update

* Update

* Fix comments

* modify docstring of coder and ioucalculator

* Replace with_boxlist with use_box_type

* fix: fix config of detr-r18

* fix: modified import of MSDeformAttn in PixelDecoder of Mask2Former

* feat: add TransformerDetector as the base detector of DETR-like detectors

* refactor: refactor modules and configs of DETR

* refactor: refactor DETR-related modules in transformer.py

* refactor: refactor DETR-related modules in transformer.py

* fix: add type comments in detr.py

* correct trainloop in detr_r50 config

* fix: modify the parent class of DETRHead to BaseModule

* refactor: refactor modules and configs of Deformable DETR

* fix: modify the usage of num_query

* fix: modify the usage of num_query in configs

* refactor: replace input_proj of detr with ChannelMapper neck

* refactor: delete multi_apply in DETRHead.forward()

* Update detr_r18_8xb2-500e_coco.py

using channel mapper for r18

* change the name of detection_transfomer.py to base_detr.py

* refactor: modify construct binary masks section of forward_pretransformer

* refactor: utilize abstractmethod

* update ABCmeta to make sure reload class TransformerDetector

* some annotation

* some annotation

* some annotation

* refactor: delete _init_transformer in detectors

* refactor: modify args of deformable detr

* refactor: modify about super().__init__()

* Update detr_head.py

Remove the multi feat lvl in function 'predict_by_feat'

* Update detr.py

update init_weights

* some annotation for head

* to make sure the head args the same as detector

* to make sure the head args the same as detector

* some bug

* fix: fix bugs of num_pred in DeformableDETRHead

* add kwargs to transformer

* support MLP and sineembed position

* detele positional encodeing

* delete useless postnorm

* Revert "add kwargs to transformer"

This reverts commit a265c1a.

* Update detr_head.py

Update type and shape of args

* Update detr_head.py

fix args docstring in predict_by_feat

* Update base_detr.py

Update docstring for forward_pretransformer

* Update deformable_detr.py

Fix docstring

* to support conditional detr with reload forward_transformer

* fix: update config files of Two-stage and Box-refine

* replace all bs with batch_size in detr-related files

* update deformable.py and transformer.py

* update docstring in base_detr

* update docstring in base_detr, detr

* doc refine

* Revert "doc refine"

This reverts commit b69da4f.

* doc refine

* doc refine

* update doc of base_detr, detr, and layers/transformer

* fix doc in base_detr

* add origin repo link

* add origin repo link

* refine doc

* refine doc

* refine doc

* refine doc

* refine doc

* refine doc

* refine doc

* refine doc

* doc: add doc of the first edition of Deformable DETR

* batch_size to bs

* refine doc

* refine doc

* feat: add config comments of specific module

* refactor: refactor base DETR class TransformerDetector

* fix: fix wrong return typehint of forward_encoder in TransformerDetector

* refactor: refactor DETR

* refactor: refactor Deformable DETR

* refactor: refactor forward_encoder and pre_decoder

* fix: fix bugs of new edition

* refactor: small modifications

* fix: move get_reference_points to deformable_encoder

* refactor: merge init_&inter_reference to references in Deformable DETR

* modify docstring of get_valid_ratio in Deformable DETR

* add some docstring

* doc: add docstring of deformable_detr.py

* doc: add docstring of deformable_detr_head.py

* doc: modify docstring of deformable detr

* doc: add docstring of deformable_detr_head.py

* doc: modify docstring of deformable detr

* doc: add docstring of base_detr.py

* doc: refine docstring of base_detr.py

* doc: refine docstring of base_detr.py

* a little change of MLP

* a little change of MLP

* a little change of MLP

* a little change of MLP

* refine config

* refine config

* refine config

* refine doc string for detr

* little refine doc string for detr.py

* tiny modification

* doc: refine docstring of detr.py

* tiny modifications to resolve the conversations

* DETRHead.predict() draft

* tiny modifications to resolve conversations

* refactor: modify arg names and forward strategies of bbox_head

* tiny modifications to resolve the conversations

* support MLP

* fix docsting of function pre_decoder

* fix docsting of function pre_decoder

* fix docstring

* modifications for resolving conversations

* refactor: eradicate key_padding_mask args

* refactor: eradicate key_padding_mask args

* fix: fix bug of deformable detr and resolve some conversations

* refactor: rename base class with DetectionTransformer and other modifications

* fix: fix config of detr

* fix the bug of init

* fix: fix init_weight of DETR and Deformable DETR

* resolve conflict

* fix auto-merge bug

* fix pre-commit bug

* refactor: move the position of encoder and decoder

* delete Transformer in ci test

* delete Transformer in ci test

Co-authored-by: jbwang1997 <jbwang1997@gmail.com>
Co-authored-by: KeiChiTse <xqz20@mails.tsinghua.edu.cn>
Co-authored-by: LYMDLUT <70597027+LYMDLUT@users.noreply.github.com>
Co-authored-by: lym <letusgo126@126.com>
Co-authored-by: Kei-Chi Tse <109070650+KeiChiTse@users.noreply.github.com>
jshilong pushed a commit that referenced this pull request Jan 19, 2023
yumion pushed a commit to yumion/mmdetection that referenced this pull request Jan 31, 2024