Pass encoding dimensions to SequenceCombiner #3321

Merged
7 commits merged into ludwig-ai:master on Apr 7, 2023

Conversation

@RXminuS (Contributor) commented Apr 4, 2023

If this argument isn't passed along, the `attention` reducer doesn't work for me.

Note that although this fixes it for me, I suspect there are lots of other places where the same change would have to be applied. Maybe someone more experienced could chip in on whether this is indeed an acceptable solution and on other things to consider?
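To illustrate why the dimension has to be forwarded, here is a minimal PyTorch-style sketch (an assumption for illustration, not Ludwig's actual SequenceCombiner or reducer code): an attention reducer builds a learned projection whose shape depends on the encoder's output size, so whatever constructs the reducer must pass that size along.

```python
# Minimal sketch (not Ludwig's actual code): an attention-style sequence
# reducer needs the encoder's output (hidden) dimension at construction
# time, because its learned scoring layer is shaped by that dimension.
# If the combiner builds the reducer without forwarding the encoding size,
# the projection ends up with the wrong shape and fails at runtime.
import torch
import torch.nn as nn


class AttentionReducer(nn.Module):
    def __init__(self, encoding_size: int):
        super().__init__()
        # The attention scorer's weights depend on the encoder output size.
        self.scorer = nn.Linear(encoding_size, 1)

    def forward(self, hidden: torch.Tensor) -> torch.Tensor:
        # hidden: [batch, seq_len, encoding_size]
        scores = torch.softmax(self.scorer(hidden), dim=1)  # [batch, seq_len, 1]
        return (scores * hidden).sum(dim=1)                 # [batch, encoding_size]


# The combiner must therefore pass the encoder's output dimension along:
encoder_output = torch.randn(2, 10, 256)  # batch=2, seq_len=10, dim=256
reducer = AttentionReducer(encoding_size=encoder_output.shape[-1])
reduced = reducer(encoder_output)         # shape: [2, 256]
```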
github-actions bot commented Apr 4, 2023

Unit Test Results

4 files ±0 · 4 suites ±0 · ⏱️ 41m 42s (+12m 59s)
33 tests ±0 · 27 ✔️ −3 · 6 💤 +3 · 0 ±0
66 runs +13 · 54 ✔️ +8 · 12 💤 +5 · 0 ±0

Results for commit cc7b02b. Comparison against base commit e49eae9.

This pull request skips 3 tests.
tests.regression_tests.benchmark.test_model_performance ‑ test_performance[ames_housing.ecd.yaml]
tests.regression_tests.benchmark.test_model_performance ‑ test_performance[ames_housing.gbm.yaml]
tests.regression_tests.benchmark.test_model_performance ‑ test_performance[mercedes_benz_greener.gbm.yaml]


@justinxzhao (Collaborator) commented:
Hi @RXminuS,

The change looks reasonable to me!

We'd probably want to make the same change to all of the encoders in text_encoders that have reducers, but doing this for the Auto encoder first makes sense.

Not totally sure why all of the tests are failing in your PR though. I'll re-run the CI.

@RXminuS (Contributor, Author) commented Apr 5, 2023

Cool, happy to expand the PR to the other encoders once the tests pass 👍 Thanks for reviewing!

@tgaddair (Collaborator) commented Apr 6, 2023

Thanks for the contribution @RXminuS!

Most of these test failures look like issues with some recently added tests that use S3. So the fix on our side will simply be to exclude those tests when running from a fork. cc @abidwael

The only one I'm not sure on is tests/ludwig/profiling/test_metrics.py::test_get_column_profile_attributes. Seems unrelated to either this PR or credentials, so @justinxzhao would you be able to take a quick look to see what's going on?

@justinxzhao (Collaborator) commented:
@tgaddair, thanks for the pointers.

I've removed the tests/ludwig/profiling/test_metrics.py::test_get_column_profile_attributes test, so rebasing this PR (which I've gone ahead and done) should fix that failure.

@justinxzhao (Collaborator) commented:
I've also gone ahead and applied the private_param and private_test annotations, which should hopefully take care of the other failing tests that depend on specific credentials.
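For context, the pattern being described usually looks something like the sketch below. This is a hypothetical illustration, not Ludwig's actual private_param/private_test helpers: a pytest marker that skips credential-dependent tests when the required secrets are absent, as is the case for CI runs from forks.

```python
# Hypothetical illustration of the "private test" pattern discussed above;
# the names and environment variable below are assumptions, not Ludwig's
# actual helpers.
import os

import pytest

# Secrets such as S3 credentials are not exposed to CI runs from forks,
# so tests that need them are skipped unless the credential is present.
requires_s3_credentials = pytest.mark.skipif(
    not os.environ.get("AWS_ACCESS_KEY_ID"),
    reason="S3 credentials are unavailable (e.g. PR from a fork)",
)


@requires_s3_credentials
def test_remote_dataset_roundtrip():
    ...  # exercises code paths that read from / write to S3
```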

@tgaddair marked this pull request as ready for review on April 7, 2023 at 16:29.
@tgaddair (Collaborator) left a review comment:

LGTM! Thanks for the contribution :)

We should do a follow-up to address other occurrences of this issue.

@tgaddair merged commit d67371e into ludwig-ai:master on Apr 7, 2023.
8 checks passed
@w4nderlust (Collaborator) commented:
Thanks for the contribution, Rik!
