feat(CenterPoint): Add finetuning to CenterPoint by freezing certain layers #30

Merged
KSeangTan merged 4 commits into main from feat/add_centerpoint_finetuning
Apr 7, 2025
Conversation


@KSeangTan KSeangTan commented Apr 4, 2025

Summary

Add a feature to freeze parameters of each layer in CenterPoint.
For example, users can specify:

pts_backbone=dict(
    type="SECOND",
    in_channels=32,
    out_channels=[64, 128, 256],
    layer_nums=[3, 5, 5],
    layer_strides=[1, 2, 2],
    norm_cfg=dict(type="BN", eps=1e-3, momentum=0.01),
    conv_cfg=dict(type="Conv2d", bias=False),
    frozen_stages=[0, 1, 2],
),

to freeze all stages of the backbone during fine-tuning.
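A minimal sketch of what freezing the listed stages could look like in PyTorch (the helper name `freeze_stages` and the toy backbone below are illustrative assumptions, not the actual code in this PR):

```python
import torch.nn as nn


def freeze_stages(blocks: nn.ModuleList, frozen_stages: list) -> None:
    """Freeze the selected stages: put them in eval mode (so BatchNorm
    running statistics stop updating) and disable gradient updates."""
    for idx in frozen_stages:
        stage = blocks[idx]
        stage.eval()
        for param in stage.parameters():
            param.requires_grad = False


# Toy 3-stage backbone standing in for SECOND's stages (layer_nums=[3, 5, 5]).
blocks = nn.ModuleList(
    nn.Sequential(nn.Conv2d(4, 4, 3, padding=1), nn.BatchNorm2d(4), nn.ReLU())
    for _ in range(3)
)

# frozen_stages=[0, 1, 2] freezes every stage, as in the config above.
freeze_stages(blocks, frozen_stages=[0, 1, 2])
```

After this call, an optimizer built with `filter(lambda p: p.requires_grad, model.parameters())` would skip the frozen stages entirely. Note that a real implementation would also need to re-apply `.eval()` on the frozen stages inside the model's `train()` override, since `model.train()` otherwise flips BatchNorm layers back to training mode.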

@KSeangTan KSeangTan requested a review from scepter914 April 4, 2025 07:13
@KSeangTan KSeangTan self-assigned this Apr 4, 2025

@scepter914 scepter914 left a comment


LGTM 👍

Let's try various fine-tuning experiments, and if any issues arise, we can fix or revert them as needed.

@KSeangTan KSeangTan merged commit 5c449ec into main Apr 7, 2025
2 checks passed
@scepter914 scepter914 deleted the feat/add_centerpoint_finetuning branch June 13, 2025 04:59