fix: fixing the pooling error for non-llama models for biencoder training#1645

Merged
rnyak merged 5 commits into main from rny/encoder_refactor
Apr 8, 2026

Conversation

@rnyak
Collaborator

@rnyak rnyak commented Apr 1, 2026

What does this PR do ?

When I train the model "Qwen/Qwen3-1.7B" as a bi_encoder, I get the following error: TypeError: Qwen3Model.__init__() got an unexpected keyword argument 'pooling'. This happens for Qwen (and would happen for any backbone loaded via AutoModel.from_pretrained) because pooling is not a valid argument for that HF model.

Details: In the yaml file we set the pooling param, which is meant to be propagated to the pooling operation executed in class BiEncoderModel(nn.Module). BiEncoderModel.build calls build_encoder_backbone(..., pooling=pooling, **hf_kwargs), but build_encoder_backbone has no pooling parameter. Because of that, Python treated pooling as part of **hf_kwargs, and it was passed along to the Qwen model instead of being handled separately.
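A minimal, self-contained sketch of this failure mode (hypothetical names standing in for the actual NeMo code, with FakeBackbone playing the role of an HF model whose __init__ rejects unknown kwargs, as Qwen3Model does):

```python
class FakeBackbone:
    # Stands in for an HF backbone: its __init__ raises TypeError on
    # keyword arguments it does not declare, just like Qwen3Model.
    def __init__(self, hidden_size=16):
        self.hidden_size = hidden_size


def build_encoder_backbone(model_name, **hf_kwargs):
    # No explicit 'pooling' parameter here, so a pooling=... argument from
    # the caller silently lands in hf_kwargs and is forwarded to the backbone.
    return FakeBackbone(**hf_kwargs)


try:
    build_encoder_backbone("Qwen/Qwen3-1.7B", pooling="mean")
except TypeError as e:
    print(e)  # ... got an unexpected keyword argument 'pooling'
```

The TypeError fires inside the backbone's constructor, which is why the traceback points at Qwen3Model.__init__ rather than at the builder that actually mishandled the argument.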

This PR adds an explicit pooling: Optional[str] = None parameter to the build_encoder_backbone function, so that pooling is no longer absorbed into **hf_kwargs.
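Sketched against the same hypothetical names as above (not the actual NeMo signature), the fix captures pooling explicitly so only genuine HF kwargs reach the backbone:

```python
from typing import Optional


class FakeBackbone:
    # Stands in for an HF backbone that rejects unknown kwargs.
    def __init__(self, hidden_size=16):
        self.hidden_size = hidden_size


def build_encoder_backbone(model_name, pooling: Optional[str] = None, **hf_kwargs):
    # 'pooling' is now a named parameter, so it is consumed here and never
    # forwarded to the backbone via **hf_kwargs.
    backbone = FakeBackbone(**hf_kwargs)
    return backbone, pooling


backbone, pooling = build_encoder_backbone("Qwen/Qwen3-1.7B", pooling="mean")
print(pooling)  # mean
```

With the explicit parameter, the caller's BiEncoderModel.build invocation is unchanged; the pooling value is simply routed to the pooling logic instead of the HF constructor.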

Before your PR is "Ready for review"

Pre checks:

  • Make sure you read and followed Contributor guidelines
  • Did you write any new necessary tests?
  • Did you add or update any necessary documentation?

@copy-pr-bot

copy-pr-bot bot commented Apr 1, 2026

This pull request requires additional validation before any workflows can run on NVIDIA's runners.

Pull request vetters can view their responsibilities here.

Contributors can view more details about this message here.

@rnyak rnyak changed the title Draft: fix pooling error for non-llama models Fix pooling error for non-llama models for biencoder training Apr 1, 2026
@rnyak rnyak marked this pull request as draft April 1, 2026 15:53
@rnyak rnyak changed the title Fix pooling error for non-llama models for biencoder training fix: fixing the pooling error for non-llama models for biencoder training Apr 1, 2026
@akoumpa
Contributor

akoumpa commented Apr 4, 2026

/ok to test 18eb4ff

@rnyak rnyak self-assigned this Apr 8, 2026
@rnyak rnyak marked this pull request as ready for review April 8, 2026 15:16
Collaborator

@adil-a adil-a left a comment


LGTM

@adil-a
Collaborator

adil-a commented Apr 8, 2026

/ok to test a538f86

@adil-a adil-a added the r0.4.0 Auto-cherrypick to release branch. Apply before merge; cherrypick happens after merge. label Apr 8, 2026
@rnyak rnyak merged commit 4458806 into main Apr 8, 2026
60 checks passed
@rnyak rnyak deleted the rny/encoder_refactor branch April 8, 2026 18:11
svcnvidia-nemo-ci pushed a commit that referenced this pull request Apr 8, 2026
…ning (#1645)

* fix pooling errow for non-llama models

* small fix in the biencoder yaml

* set add_eos_token to false

Signed-off-by: NeMo Bot <nemo-bot@nvidia.com>
akoumpa pushed a commit that referenced this pull request Apr 8, 2026
fix: fixing the pooling error for non-llama models for biencoder training (#1645)

* fix pooling errow for non-llama models

* small fix in the biencoder yaml

* set add_eos_token to false

Signed-off-by: NeMo Bot <nemo-bot@nvidia.com>
Co-authored-by: rnyak <16246900+rnyak@users.noreply.github.com>
akoumpa pushed a commit that referenced this pull request Apr 10, 2026
…ning (#1645)

* fix pooling errow for non-llama models

* small fix in the biencoder yaml

* set add_eos_token to false
Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment

Labels

r0.4.0 Auto-cherrypick to release branch. Apply before merge; cherrypick happens after merge.

3 participants