Fix loading from HF GCP cache #5321

Merged 2 commits into main from fix-hf-gcp on Dec 1, 2022
Conversation

lhoestq (Member) commented Dec 1, 2022

As reported in https://discuss.huggingface.co/t/error-loading-wikipedia-dataset/26599/4, it's not possible to download a cached version of Wikipedia from the HF GCP cache.

I fixed it and added an integration test (it runs in about 10 seconds).
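
For context, a minimal sketch of the user flow that was broken (the config name below is only an example): loading a Wikipedia dump whose preprocessed Arrow files are mirrored on the HF GCP bucket, so they can be downloaded ready-made instead of being rebuilt locally with Apache Beam.

    from datasets import load_dataset

    # Loading a preprocessed Wikipedia dump. Before this fix, datasets
    # failed to find the prepared files on the HF GCP bucket and fell
    # back to requiring a Beam runner (MissingBeamOptions) instead of
    # simply downloading them.
    ds = load_dataset("wikipedia", "20220301.en")
    print(ds["train"][0]["title"])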

HuggingFaceDocBuilderDev commented Dec 1, 2022

The documentation is not available anymore as the PR was closed or merged.

albertvillanova (Member) left a comment

Thanks for the fix.

Do you know why this stopped working?

# Poison the build path: the data must come from the HF GCP cache,
# so _download_and_prepare should never be called. If it is, the
# None assignment turns the call into a loud TypeError.
builder_instance._download_and_prepare = None
builder_instance.download_and_prepare()
ds = builder_instance.as_dataset()
assert ds is not None
albertvillanova (Member) commented

Are you sure this is a regression test?

Before your fix, this test did not raise the reported MissingBeamOptions error, but a TypeError:

TypeError: 'NoneType' object is not callable

because of builder_instance._download_and_prepare = None in your test.

lhoestq (Member, Author) replied

It should not call builder_instance._download_and_prepare, since it should download the already-prepared files from the HF GCP cache instead.

If it raises a TypeError, it means it tried to build the dataset normally instead of using the HF cache.
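
To illustrate why this works as a regression test, here is a minimal, self-contained sketch of the "poison the slow path" pattern. All names are invented for illustration; in the actual test it is the builder's _download_and_prepare that is set to None.

    # Sketch only: class and method names are hypothetical.
    class Builder:
        def download_and_prepare(self):
            if self._cached():          # fast path: reuse prepared files
                return
            self._build_from_scratch()  # slow path we want to prove is skipped

        def _cached(self):
            return True  # pretend the HF GCP cache hit succeeds

        def _build_from_scratch(self):
            raise RuntimeError("should never run")

    builder = Builder()
    # Replacing the slow path with None turns any accidental call into
    # a loud TypeError ('NoneType' object is not callable).
    builder._build_from_scratch = None
    builder.download_and_prepare()  # passes only if the cached path is taken

So a TypeError in the test is itself the failure signal: it proves the code took the build path instead of the cache path.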

lhoestq (Member, Author) commented Dec 1, 2022

> Do you know why this stopped working?

It comes from the changes in https://github.com/huggingface/datasets/pull/5107/files#diff-355ae5c229f95f86895404b72378ecd6e966c41cbeebb674af6fe6e9611bc126

albertvillanova (Member) left a comment

Thanks for the fix.

lhoestq merged commit 0a067b4 into main on Dec 1, 2022
lhoestq deleted the fix-hf-gcp branch on Dec 1, 2022 at 16:07