
Conversation

@danieldk
Contributor

Description

This PR applies a number of fixes that make our encoder models compatible with TorchScript again. The changes fall into the following categories (illustrated in the sketch after the list):

  • Remove reliance on global state.
  • Remove use of types that are not available in TorchScript.
  • Make type widening/narrowing visible to the TorchScript JIT compiler, so that type inference does not fail.
  • Index ModuleList only with constants, since variable indices are not supported.
  • Replace **kwargs, which TorchScript does not support, with explicit arguments.
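
As an illustration of these constraints, here is a minimal, hypothetical module (not code from this PR) that compiles under `torch.jit.script`:

```python
from typing import Optional

import torch
from torch import Tensor, nn


class Encoder(nn.Module):
    def __init__(self, n_layers: int, width: int):
        super().__init__()
        self.layers = nn.ModuleList(
            [nn.Linear(width, width) for _ in range(n_layers)]
        )

    # Explicit arguments instead of **kwargs, which TorchScript rejects.
    def forward(self, x: Tensor, mask: Optional[Tensor] = None) -> Tensor:
        # Narrowing Optional[Tensor] to Tensor must be spelled out like
        # this, or the TorchScript JIT compiler fails type inference.
        if mask is not None:
            x = x * mask
        # ModuleList can only be indexed with constants, so iterate
        # rather than indexing with a loop variable.
        for layer in self.layers:
            x = layer(x)
        return x


scripted = torch.jit.script(Encoder(n_layers=2, width=8))
out = scripted(torch.randn(1, 8))
```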

Also add tests to prevent similar regressions in the future.
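
A regression test of this kind could look like the following minimal sketch (the model and test name are placeholders, not the PR's actual tests):

```python
import torch
from torch import Tensor, nn


class TinyModel(nn.Module):
    def forward(self, x: Tensor) -> Tensor:
        return x * 2.0


def test_model_scriptable():
    # Scripting must succeed and agree with eager execution.
    model = TinyModel()
    scripted = torch.jit.script(model)
    x = torch.randn(2, 4)
    torch.testing.assert_close(scripted(x), model(x))
```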

Fixes #126

Types of change

Bugfixes.

Checklist

  • I confirm that I have the right to submit this contribution under the project's MIT license.
  • I ran the tests, and all new and existing tests passed.
  • My changes don't require a change to the documentation, or if they do, I've added all required information.

So remove our custom workaround for macOS 13.2; this is fixed in 13.3.
- Ensure TorchScript type inference works.
- We can't reference global variables, including errors.
- Dataclasses do not work well.
- We need an __init__ that can be found in source (not synthesized).
- The tuple type only works fully specified (not bare Tuple or Tuple[int, ...]); see the sketch below.
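
A minimal sketch of the last two constraints; the names are illustrative, not taken from the commits:

```python
from typing import Tuple

import torch
from torch import Tensor


@torch.jit.script
class AttentionConfig:
    # A plain class with an explicit __init__ replaces a dataclass,
    # whose synthesized __init__ the TorchScript compiler cannot find
    # in source.
    def __init__(self, n_heads: int, dropout: float):
        self.n_heads = n_heads
        self.dropout = dropout


@torch.jit.script
def split_heads(x: Tensor, dims: Tuple[int, int]) -> Tensor:
    # The tuple type must be fully specified: Tuple[int, int] compiles,
    # but bare Tuple or Tuple[int, ...] is rejected.
    n_heads, head_width = dims
    return x.reshape(x.size(0), x.size(1), n_heads, head_width)


y = split_heads(torch.randn(2, 4, 16), (4, 4))
```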
…on#128)"

This reverts commit 68a355a.

The functionality introduced in this PR uses global state to detect
whether `scaled_dot_product_attention` is available and to check
whether the user wants to use it. However, we cannot rely on global
state in TorchScript.
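
For context, a hedged sketch (with hypothetical names, not the reverted code) of the difference: a module-level flag is global state that a scripted `forward` cannot read, whereas an attribute set in `__init__` is visible to the compiler:

```python
import torch
import torch.nn.functional as F
from torch import Tensor, nn


class SelfAttention(nn.Module):
    # The availability check runs once in __init__ and is stored on the
    # instance, where the TorchScript compiler can see it; a module-level
    # flag would be invisible global state. Since TorchScript compiles
    # both branches of forward, this sketch assumes a torch version
    # (>= 2.0) where scaled_dot_product_attention exists.
    def __init__(self, use_sdpa: bool = True):
        super().__init__()
        self.use_sdpa = use_sdpa and hasattr(
            F, "scaled_dot_product_attention"
        )

    def forward(self, q: Tensor, k: Tensor, v: Tensor) -> Tensor:
        if self.use_sdpa:
            return F.scaled_dot_product_attention(q, k, v)
        attn = (q @ k.transpose(-2, -1)) / (float(q.size(-1)) ** 0.5)
        return attn.softmax(dim=-1) @ v


scripted = torch.jit.script(SelfAttention())
q = k = v = torch.randn(1, 2, 4, 8)
out = scripted(q, k, v)
```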
@danieldk added the type/bug (Type: Bug) and feat/model (Feature: models) labels on Apr 17, 2023
@shadeMe merged commit 9808fa0 into explosion:main on Apr 18, 2023
@danieldk deleted the bugfix/torchscript branch on August 2, 2023

Development

Successfully merging this pull request may close these issues.

Dictionary update length error
