
Picking right layers #106

Closed
patryk-at-pieces opened this issue Aug 1, 2022 · 3 comments
@patryk-at-pieces
Hey, I'm having some trouble interpreting this output:

LAYER NAME                      #PARAMS      RATIO       MEM(MB)
--model:                    222,882,048    100.00%        850.23
  --shared             
    --weight:                24,652,800     11.06%         94.04
  --encoder:                 84,954,240     38.12%        324.07
    --embed_tokens     
      --weight(shared):               0      0.00%          0.00
    --block:                 84,953,472     38.12%        324.07
      --0:                    7,079,808      3.18%         27.01
      --1:                    7,079,424      3.18%         27.01
      --2:                    7,079,424      3.18%         27.01
      --3:                    7,079,424      3.18%         27.01
      --4:                    7,079,424      3.18%         27.01
      --5:                    7,079,424      3.18%         27.01
      --6:                    7,079,424      3.18%         27.01
      --7:                    7,079,424      3.18%         27.01
      --8:                    7,079,424      3.18%         27.01
      --9:                    7,079,424      3.18%         27.01
      --10:                   7,079,424      3.18%         27.01
      --11:                   7,079,424      3.18%         27.01
    --final_layer_norm 
      --weight:                     768      0.00%          0.00
  --decoder:                113,275,008     50.82%        432.11
    --embed_tokens     
      --weight(shared):               0      0.00%          0.00
    --block:                113,274,240     50.82%        432.11
      --0:                    9,439,872      4.24%         36.01
      --1:                    9,439,488      4.24%         36.01
      --2:                    9,439,488      4.24%         36.01
      --3:                    9,439,488      4.24%         36.01
      --4:                    9,439,488      4.24%         36.01
      --5:                    9,439,488      4.24%         36.01
      --6:                    9,439,488      4.24%         36.01
      --7:                    9,439,488      4.24%         36.01
      --8:                    9,439,488      4.24%         36.01
      --9:                    9,439,488      4.24%         36.01
      --10:                   9,439,488      4.24%         36.01
      --11:                   9,439,488      4.24%         36.01
    --final_layer_norm 
      --weight:                     768      0.00%          0.00
  --lm_head            
    --weight(shared):                 0      0.00%          0.00

I want to distill this model down to 2 encoder layers and 2 decoder layers (for example, I could use the first and last layers of both the encoder and the decoder).

I can't really tell how to set the intermediate_matches parameter in DistillationConfig. In my opinion, it would be much easier if these layers were numbered in the last column.
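For reference, the two per-stack block numberings in the table above can be collapsed into a single flat numbering (encoder blocks first, then decoder blocks). A minimal sketch, assuming that convention and the 12+12 layout shown above; `global_index` is a hypothetical helper, not part of the library:

```python
# Hypothetical helper (not part of DistillationConfig): flatten the
# per-stack block numbers into one index, assuming encoder blocks map
# to 0-11 and decoder blocks to 12-23.
N_ENCODER_BLOCKS = 12

def global_index(stack, block):
    """Return the flat layer index for an encoder or decoder block."""
    if stack == "encoder":
        return block
    if stack == "decoder":
        return N_ENCODER_BLOCKS + block
    raise ValueError(f"unknown stack: {stack!r}")
```

Under this assumption, `global_index("decoder", 11)` gives 23, i.e. the last decoder block of the 12-layer teacher.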

Aside from that feature request, could you please help me out with setting these intermediate_matches?
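For what it's worth, here is one way such a config might be written for a first-and-last-layer match — a sketch rather than a verified answer. It assumes layer_T/layer_S use a flat encoder-then-decoder numbering, with the 2-layer student's decoder blocks starting at index 2; `build_matches` is a hypothetical helper:

```python
# Sketch only: build intermediate_matches pairing the teacher's first and
# last blocks with the student's two blocks, per stack. The flat index
# convention (encoder then decoder) is an assumption; double-check it
# against how your adaptor reports hidden states.

def build_matches(teacher_layers, student_layers, offset_T=0, offset_S=0):
    """Pair teacher and student hidden states with an MSE loss."""
    return [
        {"layer_T": t + offset_T, "layer_S": s + offset_S,
         "feature": "hidden", "loss": "hidden_mse", "weight": 1}
        for t, s in zip(teacher_layers, student_layers)
    ]

intermediate_matches = (
    # encoder: teacher blocks 0 and 11 -> student blocks 0 and 1
    build_matches([0, 11], [0, 1])
    # decoder: same pairing, shifted past the encoder indices
    + build_matches([0, 11], [0, 1], offset_T=12, offset_S=2)
)
```

The resulting list has four entries, with the last one pairing teacher layer 23 with student layer 3.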

@patryk-at-pieces (Author)

This is my distilled model:

LAYER NAME                      #PARAMS      RATIO       MEM(MB)
--model:                     31,136,256    100.00%        118.78
  --shared             
    --weight:                16,449,536     52.83%         62.75
  --encoder:                  6,294,272     20.22%         24.01
    --embed_tokens     
      --weight(shared):               0      0.00%          0.00
    --block:                  6,293,760     20.21%         24.01
      --0:                    3,147,008     10.11%         12.00
      --1:                    3,146,752     10.11%         12.00
    --final_layer_norm 
      --weight:                     512      0.00%          0.00
  --decoder:                  8,392,448     26.95%         32.01
    --embed_tokens     
      --weight(shared):               0      0.00%          0.00
    --block:                  8,391,936     26.95%         32.01
      --0:                    4,196,096     13.48%         16.01
      --1:                    4,195,840     13.48%         16.01
    --final_layer_norm 
      --weight:                     512      0.00%          0.00
  --lm_head            
    --weight(shared):                 0      0.00%          0.00

And here are my intermediate_matches:

intermediate_matches=[
    # encoder
    {"layer_T": 0, "layer_S": 0, "feature": "hidden", "loss": "hidden_mse", "weight": 1},
    {"layer_T": 11, "layer_S": 1, "feature": "hidden", "loss": "hidden_mse", "weight": 1},
    # decoder
    {"layer_T": 12, "layer_S": 2, "feature": "hidden", "loss": "hidden_mse", "weight": 1},
    {"layer_T": 23, "layer_S": 4, "feature": "hidden", "loss": "hidden_mse", "weight": 1},
],

Does it look fine?

stale bot commented Aug 12, 2022

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

stale bot added the stale label Aug 12, 2022

stale bot commented Sep 20, 2022

Closing the issue, since no updates observed. Feel free to re-open if you need any further assistance.

stale bot closed this as completed Sep 20, 2022