
Fix broken dependencies and deprecated APIs in Classification Transformers #194

Open

Ganglet wants to merge 1 commit into ML4SCI:main from Ganglet:fix-dependencies-and-broken-apis

Conversation


Ganglet commented Mar 20, 2026

Fixes #191

What this PR does

Fixes four issues that prevent new contributors from running the Classification
Transformers project out of the box.

Changes

1. Added requirements.txt
No dependency file existed, so running train.py crashes with one
ModuleNotFoundError after another for wandb, einops, timm, and torchmetrics.
Added requirements.txt with minimum version bounds for all 9 dependencies.
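The PR description names only four of the nine packages, so a sketch of the file can only cover those (the version bounds below are illustrative, not taken from the PR):

```text
# Illustrative lower bounds; the actual file covers all 9 dependencies
torch>=2.0
wandb>=0.15
einops>=0.6
timm>=0.9
torchmetrics>=0.11
```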

2. Fixed broken torchmetrics API (eval.py:59-61)
torchmetrics >= 0.11 requires a task= argument. Added task="multiclass"
to the accuracy_fn and auroc_fn calls.

3. Fixed deprecated torch.has_mps (utils.py:32)
Replaced with torch.backends.mps.is_built().
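The replacement check can be sketched as a standard device-selection snippet (the surrounding cuda/cpu fallback is an assumption about what utils.py does, not quoted from it):

```python
import torch

# torch.has_mps was deprecated and later removed; instead, check that MPS
# support is compiled in and actually usable at runtime.
if torch.backends.mps.is_built() and torch.backends.mps.is_available():
    device = torch.device("mps")
elif torch.cuda.is_available():
    device = torch.device("cuda")
else:
    device = torch.device("cpu")
print(device.type)
```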

4. Removed verbose=True from CosineAnnealingWarmRestarts (train.py)
The verbose parameter was removed in newer PyTorch versions, so using the
--decay_lr flag crashed with a TypeError.
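The fix is simply to drop the argument; a minimal sketch of the corrected construction (the model, learning rate, and T_0/T_mult values are placeholders, not taken from train.py):

```python
import torch

model = torch.nn.Linear(4, 2)
opt = torch.optim.SGD(model.parameters(), lr=0.1)

# verbose= was removed from scheduler constructors, so it is dropped
# rather than replaced with anything.
sched = torch.optim.lr_scheduler.CosineAnnealingWarmRestarts(opt, T_0=10, T_mult=2)

for _ in range(5):
    opt.step()
    sched.step()
lr = opt.param_groups[0]["lr"]  # annealed below the initial 0.1
```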

Testing

  • Verified pip install -r requirements.txt installs all dependencies cleanly
  • Verified train.py --help runs without errors
  • Verified eval.py imports and metric calls work with torchmetrics >= 0.11

Commit: Fix broken dependencies and deprecated APIs in Classification Transformers

Fixes ML4SCI#191:
- Add requirements.txt with pinned dependencies
- Fix torchmetrics calls to include task='multiclass'
- Replace deprecated torch.has_mps with torch.backends.mps.is_built()
- Remove verbose= param from CosineAnnealingWarmRestarts


Development

Successfully merging this pull request may close these issues.

[Bug] DeepLense_Classification_Transformers: Missing dependencies, broken APIs, and deprecated calls prevent running
