compact packaging - all inside namespace #149
Comments
I don't think we should do this: ...
If anyone can explain to me the issue with the ...
Well, you can still have just a wrapper in the root (almost no code duplication).
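A minimal sketch of what such a root-level wrapper could look like, assuming the packaged entry point is named lightning_transformers.train.main (a hypothetical name for illustration):

    # train.py at the repository root: a thin wrapper around the packaged
    # entry point, so almost no code is duplicated.
    # `lightning_transformers.train.main` is an assumed name for illustration.
    from lightning_transformers.train import main

    if __name__ == "__main__":
        main()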
Well, here you could use the configs shipped in the package, which would be effectively frozen, but you are still free to use any config from your local/custom path.
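For example, assuming Hydra's standard --config-dir flag (which adds a local directory to the config search path), overriding the packaged configs could look like:

    python -m lightning_transformers.train --config-dir ./my_configs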
In the case that you install the package, the ...
@Borda but couldn't you simply put them in the exclude argument of find_packages in setup.py?
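For reference, a minimal sketch of what that exclusion could look like in setup.py (the excluded directory names are assumptions for illustration):

    from setuptools import find_packages, setup

    setup(
        name="lightning-transformers",
        # Keep test and config directories (names assumed here) out of the wheel.
        packages=find_packages(exclude=["tests", "tests.*", "conf", "conf.*"]),
    )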
Yes, you can also exclude them in the manifest, but that is not the point. You as a user want to have the configs, right? You shouldn't always need to clone the repo; just install it with pip and use it...
This is already what we have: https://github.com/PyTorchLightning/lightning-transformers/blob/master/train.py
But then you would have to copy the train.py code and change the Hydra path to the config, which is why we want people to use our config directory.
https://github.com/PyTorchLightning/internal-dev/issues/132
Yes for the tests (see the issue above), but not for the configs, as we want to distribute them.
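A minimal sketch of how the configs could be shipped inside the wheel via package_data in setup.py (the "conf" subdirectory layout is an assumption for illustration):

    from setuptools import find_packages, setup

    setup(
        name="lightning-transformers",
        packages=find_packages(),
        # Ship the YAML configs inside the package so a plain `pip install`
        # is enough; the "conf" layout below is assumed for illustration.
        package_data={"lightning_transformers": ["conf/*.yaml", "conf/*/*.yaml"]},
    )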
What if, instead of YAML configs, we ship dataclass configs as part of the package? Then the location doesn't matter so much, since they can be imported.
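A minimal sketch of such a dataclass ("structured") config registered with Hydra's ConfigStore; the config fields here are hypothetical, not the project's actual schema:

    from dataclasses import dataclass

    from hydra.core.config_store import ConfigStore


    @dataclass
    class TrainerConfig:
        # Hypothetical fields for illustration only.
        max_epochs: int = 3
        gpus: int = 0


    cs = ConfigStore.instance()
    # Registered configs are selectable by name; no YAML files on disk
    # are required, so their location in the package doesn't matter.
    cs.store(name="trainer", node=TrainerConfig)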
Hi @carmocca @Borda @SeanNaren, Gianluca here. I've made a few contributions to PL and wanted to give my opinion on the issue. I think having everything inside the package could be really useful for end users. Since users usually install packages from PyPI, we cannot rely on the repository to distribute training scripts and configurations. I think having everything inside a single package, instead of having "lightning-transformers-train" and so on, would be cleaner for users.
I don't think this is supported by Hydra. And even if it were, we want to facilitate users modifying/extending these configs and ...
Why not? This is exactly the point of the ...
Are you saying you prefer:

    python -m lightning_transformers.train

over:

    python train.py
This is totally fine, but note that these two options are not mutually exclusive.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions. |
🚀 Feature
Refactor to make the package more compact: move the configs and the train script inside the package.
Motivation
Be able to call it from anywhere as:
python -m lightning_transformers.train --some args
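A minimal sketch of what the packaged lightning_transformers/train.py entry point could look like; the config path/name and the body are assumptions, not the project's actual code:

    import hydra
    from omegaconf import DictConfig, OmegaConf


    @hydra.main(config_path="conf", config_name="config")
    def main(cfg: DictConfig) -> None:
        # The real training logic would go here; printing the resolved
        # config serves as a stand-in.
        print(OmegaConf.to_yaml(cfg))


    if __name__ == "__main__":
        main()

With the module living inside the package, python -m lightning_transformers.train resolves from any working directory.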
Pitch
Easier to use from anywhere
Reduce collisions with configs (if any) from other packages
Alternatives
Additional context