### Bug description

A PyTorch Lightning model converted to TorchScript and saved in an environment running PyTorch Lightning 2.2 cannot be loaded in any environment that does not also have PyTorch Lightning 2.2, whether that is a Python environment or one in another language such as C++. This is a complete blocker: PTL 2.2 cannot be installed in a C++ environment, so we cannot serve our models.

Temporary workaround: downgrade to PyTorch Lightning 2.0, which does not have this problem.

TorchScript (JIT) models are supposed to be fully package-independent. In Python we could simply run PTL 2.2 everywhere, but that is not an option for C++ deployment, which leaves that use case completely broken.
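The package-independence claim can be checked directly: a TorchScript archive is a zip file that embeds its own serialized Python source, which is also where Lightning's `parsing.py` ends up. A minimal sketch with plain PyTorch (the module here is illustrative, not the reported model):

```python
import io
import zipfile

import torch


class AddOne(torch.nn.Module):
    # Any scriptable module works; this one is just illustrative.
    def forward(self, x):
        return x + 1


# Script the module and save it to an in-memory TorchScript archive.
buf = io.BytesIO()
torch.jit.save(torch.jit.script(AddOne()), buf)
buf.seek(0)

# The archive carries its own serialized source under code/, which is
# what normally makes it loadable without the original Python packages.
entries = zipfile.ZipFile(buf).namelist()
code_files = [n for n in entries if "code/" in n]
print(code_files)
```

When a `LightningModule` is scripted under PTL 2.2, the serialized copy of `pytorch_lightning/utilities/parsing.py` inside this `code/` directory is what fails to parse on load.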
### Error messages and logs
```
2024-04-05 00:54:30.593 class AttributeDict:
2024-04-05 00:54:30.593 Serialized File "code/torch/pytorch_lightning/utilities/parsing.py", line 2
2024-04-05 00:54:30.593 expected indent but found 'newline' here:
2024-04-05 00:54:30.593 Exception in loading model z5yixmej
```
### Environment
This occurs across environments.
### More info
_No response_
### What version are you seeing the problem on?
v2.2
### How to reproduce the bug

In an environment running PTL 2.2:

In any environment not running PTL 2.2 (plain Python, for example):