
support fused weights for export_model #8554

Merged: 1 commit (ff61d4a) merged on Jun 6, 2024
Codecov / codecov/patch — failed Jun 5, 2024 in 1s

0.00% of diff hit (target 80.00%)

View this Pull Request on Codecov

Annotations

All warnings were raised by codecov/patch against paddlenlp/experimental/transformers/llama/modeling.py. The following added lines were not covered by tests:

- L50
- L477
- L479
- L482
- L489-L490
- L500-L501
- L504
- L507
- L510
- L525-L526
- L530
- L533-L534
- L556
- L580
- L598
- L622
- L1292