
Support deploy environments (deploy_env) #576

Open · hmemcpy opened this issue Aug 8, 2018 · 11 comments

@hmemcpy commented Aug 8, 2018

(subsumes #560, expands on bazelbuild/bazel#1402)

Bazel has an internal feature called deploy_env for java_binary, which performs classpath subtraction for specified targets (see the comment by Ulf there).

We tested this feature (by cherry-picking the commit that exposes it on java_binary), and it suits our needs for defining different deployment targets with different dependencies (something we can't achieve with neverlink).

Since other people seem to find it useful as well, I would like to propose adding support for this to scala_binary. The classpath subtraction can be done in Skylark using the code we already wrote to support the deployment scenarios explained in #560.
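A rough sketch of what the proposed usage could look like (the deploy_env attribute on scala_binary is hypothetical here, mirroring java_binary's attribute of the same name; target names are made up):

scala_binary(
    name = "my_app",
    srcs = ["Main.scala"],
    main_class = "com.example.Main",
    deps = [":provided_dep"],  # still needed at compile time
    # Hypothetical: everything on :deploy_environment's classpath would be
    # subtracted from my_app_deploy.jar, as with java_binary's deploy_env.
    deploy_env = [":deploy_environment"],
)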

WDYT?
cc @ittaiz

@johnynek commented Aug 8, 2018

Do we need to do anything in rules_scala? Why not use java_binary?

We have talked about getting rid of scala_binary, or making it only a macro that wraps java_binary. Does this ticket inform that discussion?

@samschlegel commented

It looks like support for deploy_env in java_binary has finally landed in master: bazelbuild/bazel@a92347e

@johnynek commented

Did that make it into 0.22?

@samschlegel commented

Doesn't look like it

@thundergolfer commented

For anyone else arriving here, deploy_env is on track to be released with 0.23 in late Feb 2019. bazelbuild/bazel#6495 (comment)

@stijndehaes commented

Any updates on this? It would be useful for building fat jars for Spark applications. I don't want to include Spark every time, since that dependency is big and already exists on the Spark cluster itself.

@johnynek commented

Did you try using deploy_env with java_binary?

You should be able to use java_binary with scala_library.
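A minimal sketch of that combination (target names are made up, and the load path assumes the rules_scala of that era); since the "java sandwich", java_binary can take Scala targets as dependencies:

load("@io_bazel_rules_scala//scala:scala.bzl", "scala_library")

scala_library(
    name = "job_lib",
    srcs = ["Job.scala"],
)

# A plain java_binary over the Scala library, so java_binary's
# deploy_env attribute is available without any rules_scala change.
java_binary(
    name = "job",
    main_class = "com.example.Job",
    runtime_deps = [":job_lib"],
)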

@johnynek commented

You could also use jar_jar to zap the classes you don't want from the deploy jar:

https://github.com/johnynek/bazel_jar_jar
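A sketch of that alternative, assuming the jar_jar rule and load path from the bazel_jar_jar README (the zap rule syntax comes from the underlying jarjar tool):

load("@com_github_johnynek_bazel_jar_jar//:jar_jar.bzl", "jar_jar")

# Rewrites an existing deploy jar, dropping the zapped classes.
jar_jar(
    name = "job_shaded",
    input_jar = ":job_deploy.jar",
    rules = "shade_rules",  # a text file with lines like: zap org.apache.spark.**
)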

@ittaiz commented Jul 20, 2019 via email

@thundergolfer commented

> What about using java_binary?

This is what we do, and it works. The main_class can be 'fake', and you put the provided deps in runtime_deps. You can then use this target like so:

deploy_env = [
    "//tools/build/spark-cluster-runtime",
]
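For completeness, a sketch of what that environment target might look like (the Maven label and placeholder main_class are made up):

# tools/build/spark-cluster-runtime/BUILD
# This target is never executed; main_class is a placeholder, and
# runtime_deps enumerate what the Spark cluster already provides.
java_binary(
    name = "spark-cluster-runtime",
    main_class = "not.a.real.Main",
    runtime_deps = ["@maven//:org_apache_spark_spark_core"],
)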

@johnynek commented

The reason we made scala_binary is that it predates the “java sandwich”, so java rules could not depend on scala rules. That motivation is now gone.

gergelyfabian pushed a commit to gergelyfabian/rules_scala that referenced this issue May 31, 2022