Sharing code between google cloud modules #773
Related to this:
@ennru would you be open to a shared (internal?) module google-common that is used by fcm/storage/pubsub? Otherwise we should close this.
I think a shared module sounds good. It will still have to be published as a jar, but all of the classes in that jar can be inside the …
#2451 copied in code from another Google connector. A shared module would make sense to avoid duplicating code, but on the other hand users would need to use the exact same Alpakka version for the different connectors, which has not been required so far.
Do we need that? Can't we use sbt magic to copy the module's files into the given submodules at compile time? There are pros and cons to this solution, but that way we could write and test the code as a module without needing to publish it, and every submodule would use its own copy of the implementation.
What seems very valuable would be to extract the HTTP code that happens behind the scenes, along with the re-authentication/token refresh, into a library or even an HTTP connector (one that handles the token refresh), so that if you cannot find a connector for a GCP component you can easily roll one out on your own.
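The token-refresh idea above could be sketched roughly as follows. This is a minimal illustration, not the actual Alpakka implementation; the names `AccessToken`, `GoogleSession`, and `fetchToken` are assumptions made up for this sketch.

```scala
import java.time.Instant
import scala.concurrent.{ExecutionContext, Future}

final case class AccessToken(value: String, expiresAt: Instant)

// Hypothetical session holder: callers ask for a token and the refresh
// happens behind the scenes whenever the cached one is close to expiry.
final class GoogleSession(fetchToken: () => Future[AccessToken])(
    implicit ec: ExecutionContext) {

  @volatile private var cached: Option[AccessToken] = None

  def getToken(): Future[AccessToken] = cached match {
    // Still valid for at least another minute: reuse it.
    case Some(t) if t.expiresAt.isAfter(Instant.now.plusSeconds(60)) =>
      Future.successful(t)
    // Missing or about to expire: fetch a fresh one and cache it.
    case _ =>
      fetchToken().map { t => cached = Some(t); t }
  }
}
```

A shared library along these lines would let every Google connector wrap its requests with `getToken()` instead of re-implementing the OAuth dance.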
I get it, but as @ennru said, at this point the GCP-authenticator version would force you to use the same version across your Google-based Alpakka modules (which could be good or bad). If we decide that this is a problem, and also decide that rolling out a GCP-auth lib is not in the scope of the Alpakka codebase, we have a second way to share the code between modules: we can write an sbt task that copies the compiled "shared" code into the given Google-based Alpakka libs. In that case we don't need to roll out a lib, because we hack everything together at compile time. If you start a new Google-based Alpakka module, you just add the precompile-copy step to the sbt build. I'm not convinced that the sbt hack is the way to go, but I also see that rolling out a lib could be problematic in some cases. Also, I think the GCP-auth code would not be an easy connector; the backpressure and "waiting" branches could cause trouble.
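The "compile-time copy" idea above can be expressed without a custom task by letting each connector compile the shared sources directly. A build.sbt sketch, assuming a made-up `google-shared-sources` directory at the repository root (the project names are illustrative too):

```scala
// Each Google connector adds the shared source directory to its own
// compilation, so no separate artifact is ever published.
lazy val googleSharedSources = Seq(
  Compile / unmanagedSourceDirectories +=
    (ThisBuild / baseDirectory).value / "google-shared-sources"
)

lazy val googleCloudPubSub = project
  .settings(googleSharedSources)

lazy val googleCloudStorage = project
  .settings(googleSharedSources)
```

The trade-off is exactly the one discussed here: each module gets its own copy of the classes, so there is no cross-module version coupling, but the same bytecode is duplicated in every jar.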
Thank you @tg44 and @gkatzioura for your ideas around this. A separate library to plug in the authentication seems attractive, especially if it improves on what we have today.
I've thought about this a bit and I think adding a new sub-project to Alpakka to host this shared code is good enough, even if it may slightly complicate the release process. The sub-project would be built and released along with Alpakka itself. We'll have to take care that the shared code works across all Google connectors every time it's updated.
If @tg44 or @armanbilge are interested, feel free to reference this issue and put up a PR, but let's tackle it after the BigQuery connector PR is merged.
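The sub-project approach proposed above could look roughly like this in build.sbt. A sketch only; the module and artifact names here are assumptions, not the actual Alpakka build definition:

```scala
// A shared google-common module, built and released along with Alpakka,
// that the individual Google connectors depend on.
lazy val googleCommon = project
  .in(file("google-common"))
  .settings(name := "akka-stream-alpakka-google-common")

lazy val googleCloudPubSub = project
  .in(file("google-cloud-pub-sub"))
  .dependsOn(googleCommon)

lazy val googleCloudBigQuery = project
  .in(file("google-cloud-bigquery"))
  .dependsOn(googleCommon)
```

With `dependsOn`, the shared code is published as one jar and every connector resolves against it, which is why the connectors then need matching Alpakka versions.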
Implemented in #2613 |
Something to handle after #764 and #650 have been merged.
This would unify configuration / session handling for all Google Cloud related modules. For Amazon we use their SDK, but @tg44 pointed out that it would be a bad idea to use the Google Cloud SDK.
We should still keep the possibility to configure credentials manually, for example if you want to use multiple service accounts.
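The multiple-service-accounts case could be covered by layering a shared default over per-instance overrides in the configuration. A hypothetical HOCON sketch; the key names below are assumptions, not the actual Alpakka settings:

```hocon
# Shared default credentials used by all Google connectors.
alpakka.google.credentials {
  project-id = "my-default-project"
  service-account-file = "/etc/gcp/default-account.json"
}

# A manually configured second account for one specific connector instance.
my-app.second-storage-account {
  project-id = "other-project"
  service-account-file = "/etc/gcp/other-account.json"
}
```

Each connector would read the shared block by default and accept an explicit config path when a different account is needed.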
Previous discussion:
#650 (comment)