
Problem with frozen_keys in 02_finetune_new_observation_action.py #107

BUAAZhangHaonan opened this issue Jun 4, 2024 · 0 comments
I found a small problem in examples/02_finetune_new_observation_action.py:

frozen_keys.append("BlockTransformer_0")

By printing param_partitions inside freeze_weights, I found that this pattern does not actually freeze the parameters of BlockTransformer_0. I think it should be written as:

frozen_keys.append("octo_transformer.BlockTransformer_0.*")
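To illustrate why the full dotted prefix matters, here is a minimal sketch of the kind of matching I assume freeze_weights performs (joining each parameter path with "." and testing the frozen_keys patterns with re.fullmatch); the paths and the partition helper below are hypothetical, not Octo's actual API:

import re

# Hypothetical flattened parameter paths, joined with "." the way I assume
# freeze_weights builds param_partitions.
param_paths = [
    "octo_transformer.BlockTransformer_0.Transformer_0.encoderblock_0.kernel",
    "octo_transformer.observation_tokenizers_proprio.kernel",
    "heads_action.diffusion_model.kernel",
]

def partition(paths, frozen_keys):
    # A path is "frozen" if any pattern fully matches it (my assumption
    # about the matching rule), otherwise it stays "trainable".
    return {
        p: "frozen" if any(re.fullmatch(k, p) for k in frozen_keys) else "trainable"
        for p in paths
    }

# "BlockTransformer_0" alone never fully matches a dotted path, so nothing gets frozen:
print(partition(param_paths, ["BlockTransformer_0"]))

# "octo_transformer.BlockTransformer_0.*" covers the whole subtree, so those
# parameters are actually marked frozen:
print(partition(param_paths, [r"octo_transformer.BlockTransformer_0.*"]))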

In addition, I found that none of the freezing configurations performs as well as training from scratch (the default *hf_model*, of course, still needs to stay frozen).
