
[Multimodal] Update version bounds #2818

Merged
sxjscience merged 4 commits into autogluon:master on Feb 3, 2023

Conversation

sxjscience (Collaborator):

Issue #, if available:

#612

Description of changes:

Update the version bounds in multimodal dependencies.

By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.

@sxjscience sxjscience changed the title [Multimodal]Update setup.py [Multimodal] Update version bounds Feb 2, 2023
sxjscience (Collaborator, Author):

For version bounds, we can adopt the guideline of always using one of the following patterns (a concrete sketch follows the block):

# Packages that have both lower and upper bounds.
PACKAGE_NAME>=LOWER_BOUND,<UPPER_BOUND

# Packages that must be upgraded if an older version is installed
PACKAGE_NAME>=LOWER_BOUND

# Most common packages
PACKAGE_NAME
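
A minimal sketch of how these three patterns could look in a module's setup.py install_requires; the package names and bounds below are illustrative only, not the exact pins changed in this PR:

# sketch_setup.py -- illustrative only, not AutoGluon's real bounds
from setuptools import setup

install_requires = [
    # Lower and upper bound: stay inside a known-good range.
    "sentencepiece>=0.1.95,<0.2.0",
    # Lower bound only: anything older is known to be broken.
    "pytorch-metric-learning>=1.3.0",
    # No bound: any released version is expected to work.
    "defusedxml",
]

setup(
    name="example-package",
    version="0.0.1",
    install_requires=install_requires,
)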

"sentencepiece>=0.1.95,<0.2.0",
f"autogluon.core[raytune]=={version}",
f"autogluon.features=={version}",
f"autogluon.common=={version}",
"pytorch-metric-learning>=1.3.0,<1.4.0",
"pytorch-metric-learning>=1.3.0,<2.0",
Collaborator Author:

@zhiqiangdon I noticed that pytorch-metric-learning released version 2.0. Do we need to consider updating the version bound?

Reply:

I checked and <2 would be enough to get things working. It's not ideal and should be lifted to allow v2, but it's less urgent than lifting the cap from <1.4.

Contributor:

pytorch-metric-learning seems okay to upgrade.

Innixma (Contributor) commented Feb 2, 2023:

@sxjscience Regarding this, how about the following:

"numpy",  # version range defined in `core/_setup_utils.py`
"pandas",  # version range defined in `core/_setup_utils.py`
"torchvision>=foo,<bar",

...

I think not everyone on the team is aware that we actually version-bound packages like numpy and pandas, since it isn't obvious from setup.py without looking closer. The in-line comments might help make this clear (a sketch of how the centralized ranges could work follows below).
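
A hedged sketch of the centralized-bounds idea those comments point at; the DEPENDENT_PACKAGES table, the helper name, and the ranges are assumptions for illustration, not necessarily what core/_setup_utils.py actually defines:

# Illustrative shared version table (hypothetical, not the real core/_setup_utils.py)
DEPENDENT_PACKAGES = {
    "numpy": ">=1.21,<1.27",
    "pandas": ">=1.4.1,<1.6",
}

def apply_version_ranges(packages):
    # Attach the centrally defined range to each bare package name.
    return [name + DEPENDENT_PACKAGES.get(name, "") for name in packages]

# In a module's setup.py, bare names then pick up the shared bounds:
install_requires = apply_version_ranges(["numpy", "pandas"])
# -> ["numpy>=1.21,<1.27", "pandas>=1.4.1,<1.6"]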

"jinja2>=3.0.3,<3.1",
"openmim>0.1.5,<0.4.0",
"defusedxml>=0.7.1,<0.7.2",
"jinja2>=3.0.3,<3.2",
Contributor:

jinja2 needs to be <3.1 or it will error; at least it did in my PR before I dropped it to <3.1.

Innixma (Contributor), Feb 2, 2023:

The error may have been from the requirements_docs dependencies, though, so maybe <3.2 is okay for the package dependencies. Let's keep it at <3.2 and see if it passes.

Collaborator Author:

Okay, let's see if the CI + document build can pass.

@Innixma Innixma added this to the 0.7 Release milestone Feb 2, 2023
@Innixma Innixma added the module: multimodal, dependency (Related to dependency packages), and priority: 0 (Maximum priority) labels Feb 2, 2023
multimodal/setup.py (outdated review thread, resolved)
"pytorch_lightning>=1.8.0,<1.10.0",
"text-unidecode<=1.3",
"text-unidecode<1.4",
"torchmetrics>=0.8.0,<0.9.0",

Reviewer comment:

The latest version of torchmetrics is 0.11.1; could the cap be lifted?

sxjscience (Collaborator, Author), Feb 3, 2023:

@zhiqiangdon Can we loosen the cap on torchmetrics?

Contributor:

We can't loosen the torchmetrics cap for now. We need to make some changes in our data preprocessing before we can loosen it.

"jsonschema<4.18",
"seqeval<1.2.3",
"evaluate<0.4.0",
"accelerate>=0.9,<0.17",
"timm<0.7.0",
"torch>=1.9,<1.14",
"torchvision<0.15.0",

Reviewer comment:

Side note: what happened to the torchtext dependency? 🤔

Collaborator Author:

We managed to remove it by upgrading lightning... See #2739

@@ -235,7 +235,7 @@ def collate_fn(self, text_column_names: Optional[List] = None) -> Dict:

def build_one_token_sequence(
self,
text_tokens: Dict[str, NDArray[(Any,), np.int32]],
text_tokens: Dict[str, NDArray],
Collaborator Author:

@zhiqiangdon @cheungdaven I changed the type hints to plain NDArray due to the API change in the latest nptyping.

Contributor:

NDArray should be fine, but the shape and dtype info are no longer available.
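
For reference, a hedged sketch of how shape and dtype information could still be expressed under the nptyping 2.x API; the Shape expression, dtype choice, and function name here are illustrative assumptions rather than the actual collator code:

# Assumes nptyping>=2.0; the old 1.x style was NDArray[(Any,), np.int32].
from typing import Dict
from nptyping import NDArray, Shape, Int32

# A 1-D int32 token array per text column, with shape/dtype kept in the hint.
TokenArray = NDArray[Shape["*"], Int32]

def build_one_token_sequence_sketch(text_tokens: Dict[str, TokenArray]) -> None:
    # Placeholder body; only the annotation style matters here.
    ...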

github-actions bot commented Feb 3, 2023:

Job PR-2818-e939343 is done.
Docs are uploaded to http://autogluon-staging.s3-website-us-west-2.amazonaws.com/PR-2818/e939343/index.html

github-actions bot commented Feb 3, 2023:

Job PR-2818-6a538d9 is done.
Docs are uploaded to http://autogluon-staging.s3-website-us-west-2.amazonaws.com/PR-2818/6a538d9/index.html

zhiqiangdon (Contributor) left a comment:

LGTM!

@sxjscience sxjscience merged commit 8032b49 into autogluon:master Feb 3, 2023