
Add onnx support for FNet #802

Open · wants to merge 4 commits into main
Conversation

@hazrulakmal commented Feb 21, 2023

What does this PR do?

Fixes #555

Add support for FNet, an architecture similar to BERT.

@fxmarty, @chainyo & @michaelbenayoun, could you give me a few pointers as to how to test the integration locally?

I tried to check the integration by following this doc, but I'm not quite sure why I keep getting a ModuleNotFoundError: No module named 'onnx' error despite setting up my development environment as follows:

  1. create a virtual environment
  2. install the dev packages in the virtualenv by running pip install -e ".[dev]"

Before submitting

  • Did you write any new necessary tests?

@regisss (Contributor) commented Feb 21, 2023

Hi @hazrulakmal! Thanks for adding support for FNet 🔥
The error you got states that the onnx package is not installed in your environment, which makes sense since the [dev] extra only installs packages used for testing and formatting the code. You should install Optimum with pip install .[dev,exporters].
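A quick way to confirm which export dependencies the active environment actually has (a minimal sketch; that the exporters extra pulls in onnx and onnxruntime is an assumption here):

import importlib.util

# Check whether the ONNX export dependencies are importable
for pkg in ("onnx", "onnxruntime"):
    found = importlib.util.find_spec(pkg) is not None
    print(f"{pkg}: {'installed' if found else 'missing'}")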

@hazrulakmal hazrulakmal marked this pull request as draft February 21, 2023 20:13
@hazrulakmal hazrulakmal marked this pull request as ready for review February 21, 2023 21:40
@fxmarty (Contributor) commented Feb 22, 2023

Would even do pip install -e .[dev,exporters].

To test only your addition: pytest tests/exporters/onnx/test_*.py -k "fnet" -s --exitfirst

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint.

@hazrulakmal (Author) commented Feb 22, 2023

It seems that the FNet integration does not pass some of the tests. I think it may be because exporting the operator aten::fft_fftn (the Fourier transform used in FNet) to ONNX opset version 11 is not supported, as mentioned in this doc. Does that mean FNet is not ONNX-convertible yet?

@regisss (Contributor) commented Feb 22, 2023

It seems that the FNet integration does not pass some of the tests. I think it may be because exporting the operator aten::fft_fftn (the Fourier transform used in FNet) to ONNX opset version 11 is not supported, as mentioned in this doc. Does that mean FNet is not ONNX-convertible yet?

What error message do you get? Maybe you just need to specify a higher opset version in your model config (see an example here).
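For illustration, such an override could look like the following (a minimal sketch patterned after the model configs in optimum.exporters.onnx.model_configs; using BertOnnxConfig as the base class for FNet is an assumption):

from optimum.exporters.onnx.model_configs import BertOnnxConfig

class FNetOnnxConfig(BertOnnxConfig):
    # ONNX only gained a DFT operator in opset 17, so raise the default
    # (assumption: the rest of the config can be inherited unchanged)
    DEFAULT_ONNX_OPSET = 17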

@hazrulakmal (Author) commented Feb 26, 2023

Thanks! Based on this onnx documentation, the Fourier-transform-related operators DFT and STFT have been supported since opset version 17, so I bumped the opset version to 17 as you suggested, but I still receive the error message below:

torch.onnx.errors.UnsupportedOperatorError: Exporting the operator 'aten::fft_fftn' to ONNX opset version 17 is not supported. Please feel free to request support or submit a pull request on PyTorch GitHub: https://github.com/pytorch/pytorch/issue

I also bumped it to 18 and 19, but to no avail, and received Unsupported ONNX opset version: 18. The onnx version used in development is 1.13.0.
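For reference, the failure can be reproduced outside Optimum with a bare torch.onnx export (a minimal sketch; the module below only mimics FNet's Fourier mixing, and the tensor shape is illustrative):

import torch

class FourierMixing(torch.nn.Module):
    def forward(self, hidden_states):
        # FNet mixes tokens with an n-dimensional FFT and keeps the real part
        return torch.fft.fftn(hidden_states, dim=(-2, -1)).real

model = FourierMixing()
dummy = torch.randn(1, 8, 16)  # (batch, sequence, hidden)
# Expected to raise torch.onnx.errors.UnsupportedOperatorError for aten::fft_fftn
torch.onnx.export(model, dummy, "fourier.onnx", opset_version=17)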

I checked the conversion using the following block of code, as suggested in this doc:

from pathlib import Path

import onnx
from transformers import AutoModel

from optimum.exporters import TasksManager
from optimum.exporters.onnx import export, validate_model_outputs

print(onnx.__version__)

fnet = "hf-internal-testing/tiny-random-FNetModel"
base_model = AutoModel.from_pretrained(fnet)

onnx_path = Path("model.onnx")
onnx_config_constructor = TasksManager.get_exporter_config_constructor("onnx", base_model)
onnx_config = onnx_config_constructor(base_model.config)

# Export the PyTorch model to ONNX
onnx_inputs, onnx_outputs = export(base_model, onnx_config, onnx_path, onnx_config.DEFAULT_ONNX_OPSET)

# Check that the exported model is well formed
onnx_model = onnx.load("model.onnx")
onnx.checker.check_model(onnx_model)

# Compare ONNX Runtime outputs against the reference model outputs
validate_model_outputs(
    onnx_config, base_model, onnx_path, onnx_outputs, onnx_config.ATOL_FOR_VALIDATION
)

On top of that, I also ran pytest tests/exporters/onnx/test_exporters_onnx_cli.py -k "fnet" -s --exitfirst and received the error message below. The error message is a bit vague to me, which prevents me from exploring further. Could anyone help narrow down where I should focus in order to resolve the problem?

OnnxCLIExportTestCase::test_exporters_cli_pytorch_144_fnet_default - subprocess.CalledProcessError: Command 'python3 -m optimum.exporters.onnx --model hf-internal-testing/tiny-random-FNetModel --task default C:\Users\HAZRUL~1\AppData\Local\Temp\tmpu5qegdnv' returned non-zero exit status 1.

@fxmarty (Contributor) commented Feb 27, 2023

@hazrulakmal Thank you for investigating! Can you try pytest tests/exporters/onnx/test_exporters_onnx_cli.py -k "fnet" -s --exitfirst --full-trace? For now, test_exporters_onnx_cli.py launches subprocesses, and getting only the last frame of the trace is indeed not very informative.
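Alternatively, the failing command from the CalledProcessError above can be re-run directly to capture its full stderr (a minimal sketch; the output directory name is made up):

import subprocess

# Command copied from the error message above; output dir is illustrative
cmd = [
    "python3", "-m", "optimum.exporters.onnx",
    "--model", "hf-internal-testing/tiny-random-FNetModel",
    "--task", "default",
    "fnet_onnx_out",
]
result = subprocess.run(cmd, capture_output=True, text=True)
print(result.stdout)
print(result.stderr)  # the full traceback ends up here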

Reading pytorch/pytorch#81075, I would also assume that torch.fft.fftn should be supported with opset 17. Is that the op used in the FNet implementation? According to this comment, STFT is not supported yet: pytorch/pytorch#81075 (comment)
