[Multimodal] Update version bounds #2818
Conversation
For version bounds, we can adopt the guideline to always use:
# Packages that have lower and upper bounds
PACKAGE_NAME>=LOWER_BOUND,<UPPER_BOUND
# Packages that must be upgraded if a lower version is installed
PACKAGE_NAME>=LOWER_BOUND
# Most common packages
PACKAGE_NAME
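As a hedged illustration of the three patterns above (the package names and bounds below are made up for the example, not autogluon's actual dependency list), an `install_requires` fragment might look like this, validated with the `packaging` library:

```python
from packaging.requirements import Requirement

# Illustrative only -- NOT the real autogluon dependency list, just one
# example of each of the three bounding patterns from the guideline.
install_requires = [
    "numpy>=1.21,<1.27",  # lower and upper bounds
    "requests>=2.25",     # lower bound only: forces upgrades of old installs
    "tqdm",               # most common packages: left unpinned
]

# Sanity-check that every entry parses as a valid PEP 508 requirement.
for spec in install_requires:
    req = Requirement(spec)
    print(req.name, str(req.specifier) or "(no bound)")
```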
  "sentencepiece>=0.1.95,<0.2.0",
  f"autogluon.core[raytune]=={version}",
  f"autogluon.features=={version}",
  f"autogluon.common=={version}",
- "pytorch-metric-learning>=1.3.0,<1.4.0",
+ "pytorch-metric-learning>=1.3.0,<2.0",
@zhiqiangdon I noticed that pytorch-metric-learning released version 2.0. Do we need to consider updating the version?
I checked, and <2 would be enough to get things working. It's not ideal and should be lifted to allow v2, but it's less urgent than lifting the cap from <1.4.
pytorch-metric-learning seems okay to upgrade.
@sxjscience Regarding this, how about the following:
"numpy",  # version range defined in `core/_setup_utils.py`
"pandas",  # version range defined in `core/_setup_utils.py`
"torchvision>=foo,<bar",
...
I think not everyone on the team is familiar with the fact that we actually bound the version ranges of those packages there.
- "jinja2>=3.0.3,<3.1",
  "openmim>0.1.5,<0.4.0",
  "defusedxml>=0.7.1,<0.7.2",
+ "jinja2>=3.0.3,<3.2",
jinja2 needs to be <3.1 or it will error; at least it did in my PR before I dropped to <3.1.
The error may have been from the requirements_docs dependencies though, so maybe <3.2 is ok for the package dependencies. Let's keep it <3.2 and see if it passes.
Okay, let's see if the CI + document build can pass.
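As a side sketch (not part of this PR), the `packaging` library makes it easy to check which jinja2 releases each of the two proposed caps would admit:

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version

# The two jinja2 bounds discussed above.
old_bound = SpecifierSet(">=3.0.3,<3.1")
new_bound = SpecifierSet(">=3.0.3,<3.2")

# jinja2 3.1.x is excluded by the old cap but admitted by the new one.
print(Version("3.1.2") in old_bound)  # False
print(Version("3.1.2") in new_bound)  # True
```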
  "pytorch_lightning>=1.8.0,<1.10.0",
- "text-unidecode<=1.3",
+ "text-unidecode<1.4",
  "torchmetrics>=0.8.0,<0.9.0",
Latest version for this is 0.11.1, could the cap be lifted?
@zhiqiangdon Can we loosen the cap of torchmetrics?
We can't loosen the cap of torchmetrics for now. We need to make some changes in our data preprocessing to loosen the cap.
"jsonschema<4.18",
"seqeval<1.2.3",
"evaluate<0.4.0",
"accelerate>=0.9,<0.17",
"timm<0.7.0",
"torch>=1.9,<1.14",
"torchvision<0.15.0",
Side note: what happened to the torchtext dependency? 🤔
We managed to remove it by upgrading lightning... See #2739
@@ -235,7 +235,7 @@ def collate_fn(self, text_column_names: Optional[List] = None) -> Dict:

  def build_one_token_sequence(
      self,
-     text_tokens: Dict[str, NDArray[(Any,), np.int32]],
+     text_tokens: Dict[str, NDArray],
@zhiqiangdon @cheungdaven I changed the typehints to be just NDArray due to the API change of the latest nptyping.
NDArray should be fine, but the shape and dtype info are no longer available.
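One hedged workaround, sketched under the assumption that only the dtype matters here (this is not what the PR does, and `total_tokens` below is a toy stand-in, not the project's real method): numpy's own typing module can keep the dtype in the annotation, though shape info is still lost.

```python
from typing import Dict

import numpy as np
import numpy.typing as npt

# npt.NDArray[np.int32] documents the expected dtype in the signature;
# shape info is still lost, and nothing is enforced at runtime.
def total_tokens(text_tokens: Dict[str, npt.NDArray[np.int32]]) -> int:
    # Toy stand-in for build_one_token_sequence: count tokens per column.
    return sum(arr.size for arr in text_tokens.values())

tokens = {"text": np.array([101, 2023, 102], dtype=np.int32)}
print(total_tokens(tokens))  # 3
```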
Job PR-2818-e939343 is done.
Job PR-2818-6a538d9 is done.
LGTM!
Issue #, if available:
#612
Description of changes:
Update the version bounds in multimodal dependencies.
By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.