Running ORTModule with other EPs from ORT #78
Comments
Hi @chethanpk, can you please post the output of the configure step?
@natke
Hi @natke, did you get a chance to take a look at this?
Hi @chethanpk, we currently do not have support for running …
@baijumeswani I will try it on Linux and let you know.
@baijumeswani I tried it on Linux and I was able to complete the training without it erroring out at ORTModule. However, it was not using the oneDNN EP; it was using the default CPU EP.
Thanks @chethanpk for reporting this. On further investigation, it would appear that we currently only have support for the CUDA and ROCm execution providers through …
Just a minor update: supporting other EPs in ORTModule is on our to-do list, but we don't have a deadline for it.
Is there any update on this? I am currently working around it by forcing the DNNL EP to be used by default and building the wheel with the DNNL EP, but we need a proper solution so that anyone else can directly build and use it.
Hi @chethanpk, I'm the PM for this package. Can you reach out to me at nakersha@microsoft.com so we can have a conversation about your use case?
Closing this issue now. Please re-open the issue in case we can provide more assistance through this channel. |
I am building a new wheel with the oneDNN EP using ONNX Runtime training. After that is installed, I install torch_ort and then run the configure step, but it does not seem to work (I get the same error asking me to run configure again). From the instructions, I see that there is no recipe for this combination. Is this possible, or is there any other way for me to build a custom wheel and use it to train a BERT model with oneDNN and ORT?
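For reference, the build-from-source flow described above can be sketched as follows. This is an assumption-laden sketch, not an official recipe: the individual flags (`--enable_training`, `--use_dnnl`, `--build_wheel`) are real ONNX Runtime `build.sh` options, but whether this particular combination produces a working training wheel is exactly what this issue is asking about, and the wheel path will vary by platform and version.

```shell
# Hedged sketch of the workflow from the issue; the flag combination is
# unverified, and the wheel filename below is illustrative only.
git clone --recursive https://github.com/microsoft/onnxruntime
cd onnxruntime
./build.sh --config Release --enable_training --use_dnnl \
    --build_wheel --parallel

# Install the resulting wheel, then torch_ort, then run its configure step.
pip install build/Linux/Release/dist/onnxruntime_training-*.whl
pip install torch_ort
python -m torch_ort.configure
```

As the thread notes, even when this builds cleanly on Linux, ORTModule may silently fall back to the default CPU EP rather than using oneDNN.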