support fused weights for export_model #8554
0.00% of diff hit (target 80.00%)
Annotations
codecov / codecov/patch: the following added lines in paddlenlp/experimental/transformers/llama/modeling.py were not covered by tests:
L50, L477, L479, L482, L489-L490, L500-L501, L504, L507, L510, L525-L526, L530, L533-L534, L556, L580, L598, L622, L1292