This repository was archived by the owner on Jun 3, 2025. It is now read-only.

Optimizers not using specified arguments in the Transformers integration #217

@eldarkurtic

Description

Describe the bug
Optimizers in integrations/transformers are always initialized with their default arguments. Any user-defined arguments (for example, values passed on the command line and parsed with argparse) are silently discarded.
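
A minimal sketch of the suspected pattern (all names here are illustrative, not the actual integration code): the optimizer is constructed with PyTorch defaults instead of forwarding the parsed training arguments.

```python
import argparse

import torch

parser = argparse.ArgumentParser()
parser.add_argument("--learning_rate", type=float, default=5e-5)
args = parser.parse_args()

model = torch.nn.Linear(10, 2)

# Suspected buggy behavior: the parsed value is never forwarded,
# so AdamW silently falls back to its default lr of 1e-3.
optimizer = torch.optim.AdamW(model.parameters())

# Expected behavior: forward the user-specified value.
# optimizer = torch.optim.AdamW(model.parameters(), lr=args.learning_rate)
```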

Expected behavior
Users should be able to set the optimizer's arguments, for example the learning rate.

To Reproduce
Run any integrations/transformers example Python script and compare the optimizer parameters you specify with the actual optimizer's parameters (for example, --learning_rate X).
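
Continuing the hypothetical sketch above, a quick check that would fail under the reported behavior (this assumes access to the `optimizer` object the integration builds and to the parsed `args`):

```python
# After launching a script with e.g. --learning_rate 5e-5:
actual_lr = optimizer.param_groups[0]["lr"]
assert actual_lr == args.learning_rate, (
    f"optimizer lr is {actual_lr}, but --learning_rate was {args.learning_rate}"
)
```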


Labels: bug (Something isn't working)
