Pin Transformers to 4.31.0 #3569

Merged
merged 3 commits into master from pin_transformers on Aug 31, 2023
Conversation

@arnavgarg1 (Contributor) commented Aug 31, 2023

It seems like the latest transformers package has a bug. This can be fixed either by using transformers master or by downgrading to 4.31.0, which is stable and works correctly.

This is the error we run into with 4.32.1:

site-packages/ludwig/api.py:1510, in LudwigModel.preprocess(self, dataset, training_set, validation_set, test_set, training_set_metadata, data_format, skip_save_processed_input, random_seed, **kwargs)
   1508     return PreprocessedDataset(proc_training_set, proc_validation_set, proc_test_set, training_set_metadata)
   1509 except Exception as e:
-> 1510     raise RuntimeError(f"Caught exception during model preprocessing: {str(e)}") from e
   1511 finally:
   1512     for callback in self.callbacks:
RuntimeError: Caught exception during model preprocessing: local variable 'tokens' referenced before assignment

Closes: #3568
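
For anyone who hits this locally before the pin lands, the workaround described above can be applied directly with pip. This is a minimal sketch assuming a standard pip environment, not part of the PR itself:

    # Downgrade to the last known-good release
    pip install "transformers==4.31.0"

    # Or install transformers from source (master), where the bug is fixed
    pip install "git+https://github.com/huggingface/transformers.git"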

@tgaddair (Collaborator) left a comment

Can you file an issue with HF transformers? It might be an API issue on our end, but either way we probably need them to weigh in.

Also, please add a TODO to the requirements file to revert this once the issue is resolved.

requirements.txt (review thread, resolved)
@tgaddair (Collaborator) left a comment

Please fix lower bound.
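
To make both review comments concrete, a possible shape for the pinned requirement is sketched below. The exact lower bound and TODO wording in the merged requirements.txt are not shown in this thread, so treat the specific bounds as illustrative:

    # TODO: revert this pin once the upstream transformers bug is fixed
    # (transformers 4.32.1 fails Ludwig preprocessing with
    #  "local variable 'tokens' referenced before assignment"; see #3568)
    transformers>=4.31.0,<4.32.0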

@github-actions bot commented Aug 31, 2023

Unit Test Results

6 files ±0    6 suites ±0    1h 34m 21s ⏱️ +4m 43s
2 826 tests ±0    2 790 ✔️ +2    12 💤 ±0    24 ❌ -2
2 869 runs ±0    2 824 ✔️ +2    21 💤 ±0    24 ❌ -2

For more details on these failures, see this check.

Results for commit 20a532f. ± Comparison against base commit 63f4924.

♻️ This comment has been updated with latest results.

@arnavgarg1 (Contributor, Author)

@tgaddair seems like the issue was flagged and fixed on transformers master: huggingface/transformers#25805

@arnavgarg1 (Contributor, Author)

Associated Ludwig issue: #3571

@arnavgarg1 arnavgarg1 merged commit cd32fb8 into master Aug 31, 2023
13 of 16 checks passed
@arnavgarg1 arnavgarg1 deleted the pin_transformers branch August 31, 2023 22:47