
Package has 0 dependencies #124

Closed
dinjazelena opened this issue Sep 27, 2023 · 6 comments

@dinjazelena
Contributor

Hey,

When you publish to PyPI, only the main dependencies are installed (the ones under [tool.poetry.dependencies]). Currently, when I pip install quinn, it does not install any libraries:

(screenshot: pip output showing quinn installed with no dependencies)

Right now you have all of your libraries under dev-dependencies, which is installable only by Poetry, for development purposes. Note that dev-dependencies will be deprecated, so you must migrate to Poetry groups, such as a linting group, a testing group, and a docs group.

But the most important thing is that all libraries the quinn package depends on must be under the main dependencies, where right now you have only the Python version specified:

(screenshot: [tool.poetry.dependencies] containing only the Python version)

I can open a PR and fix all of this.

@dinjazelena dinjazelena changed the title Python as only main dependency Package has 0 dependencies Sep 27, 2023
@SemyonSinchenko
Collaborator

Quinn was designed for use on PySpark clusters, where all the Spark binaries and the PySpark bindings are already installed. So for me this is the right behavior, because I do not want to install another version of PySpark on my cluster.

For me it is not an issue. @MrPowers what do you think?

@MrPowers
Owner

The only dependency that we would consider changing from a dev-dependency => a regular "dependency" would be pyspark. Dependencies like pyspark and chispa are intentionally marked as dev-dependencies.

I think @dinjazelena is right that technically speaking, pyspark should be considered a regular dependency (not a dev-dependency).

We want pip install quinn to work in a variety of execution contexts: Databricks notebooks, EMR notebooks, locally, etc. If I recall correctly, including pyspark as a dependency causes problems when pip install quinn is run in those execution contexts. Feel free to correct me if I'm recalling incorrectly.

@SemyonSinchenko
Collaborator

We can make pyspark an optional (extra) dependency, so that, for example, running pip install quinn[pyspark] will install pyspark. But we will still be able to just run pip install quinn, which simplifies dependency resolution and also does not break managed Spark environments like Databricks. It can be done via Poetry extras, I guess.
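A minimal sketch of what this could look like in pyproject.toml, assuming Poetry's extras mechanism (the version constraint is only illustrative, not a recommendation):

```toml
[tool.poetry.dependencies]
python = "^3.8"
# Optional: installed only when the "pyspark" extra is requested,
# e.g. `pip install "quinn[pyspark]"`.
pyspark = { version = ">=3.0", optional = true }

[tool.poetry.extras]
pyspark = ["pyspark"]
```

A plain pip install quinn would then skip pyspark entirely, leaving managed environments like Databricks untouched.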

@dinjazelena
Contributor Author

Ah, so you expect this to be installed into environments where Spark is already set up, like DBR. Then yeah, as long as you test against all possible Spark versions, good to go.

Still, dev-dependencies will be deprecated soon:

(screenshot: Poetry documentation noting the deprecation of dev-dependencies)

Ideally you should have:

- development group: pyspark, or, as Sem mentioned, as an extra installable with pip install quinn[pyspark], for example
- testing group: pytest, chispa, pytest-describe
- linting group: black, ruff, mypy
- docs group: the mkdocs packages

The whole dev project is then installable with:
poetry install --with=development,testing,linting,docs

This also helps in CI: for linting you only need the linting group, for docs you only need the docs group, and so on.
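A sketch of the corresponding group sections in pyproject.toml (the package versions are placeholders, not pinned recommendations):

```toml
[tool.poetry.group.testing.dependencies]
pytest = "*"
chispa = "*"
pytest-describe = "*"

[tool.poetry.group.linting.dependencies]
black = "*"
ruff = "*"
mypy = "*"

[tool.poetry.group.docs.dependencies]
mkdocs = "*"
```

A CI lint job could then run poetry install --only linting and skip everything else.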

Would you like to change it in this way, with a Makefile providing a set of common commands?
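For example, a hypothetical Makefile along these lines (the target names and commands are just a sketch):

```makefile
# Install everything needed for development.
install:
	poetry install --with=development,testing,linting,docs

# Run the test suite.
test:
	poetry run pytest

# Run the linters only.
lint:
	poetry run ruff check .
	poetry run mypy .
```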

@SemyonSinchenko
Collaborator

I like this idea!

@MrPowers
Owner

@dinjazelena - sounds like a good idea, can you please submit a PR?
