
Binary dependencies are now downloaded and copied to Carthage/Checkout folder #2532

Closed
wants to merge 4 commits

Conversation

KoCMoHaBTa
Contributor

Upon carthage checkout, binary dependencies are now downloaded and copied to the Carthage/Checkouts folder.
This also applies to carthage update --no-build and carthage bootstrap --no-build.

This PR is an implementation of #2319.
The implementation requires that binary dependencies are downloaded prior to copying them, so it might also be a partial fix for #2449.
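
(For illustration, a minimal sketch of the behavior this PR targets, using real Cartfile binary syntax but a hypothetical spec URL and dependency name:)

```sh
# Cartfile with a binary-only dependency (hypothetical URL and name):
cat > Cartfile <<'EOF'
binary "https://my.domain.example/specs/MyFramework.json" ~> 1.0
EOF

# Resolve and check out without building:
carthage bootstrap --no-build

# Under this PR, the downloaded archive contents are expected to appear under:
#   Carthage/Checkouts/MyFramework/...
```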

@tmspzz
Member

tmspzz commented Jul 27, 2018

Thanks for the PR! 🌟

Can you add a bats test to check that binaries are indeed downloaded and we don't break behavior later?

@KoCMoHaBTa
Contributor Author

Sure, I'll look into this and update the PR.

@tmspzz
Member

tmspzz commented Jul 27, 2018

@KoCMoHaBTa if you feel that another type of test is more appropriate, add that instead; that's also OK.

@KoCMoHaBTa
Contributor Author

@blender Also, if any other test cases come to mind, let me know.

@KoCMoHaBTa
Contributor Author

@blender The PR is updated with tests that validate that the downloaded binary dependencies are copied to the Carthage/Checkouts folder.
The tests cover the newly developed behavior during the checkout and update commands.
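
(For illustration, a minimal sketch of the kind of bats test described here, with a hypothetical spec URL and dependency name:)

```bash
#!/usr/bin/env bats
# Sketch: verify that a binary-only dependency's archive contents land in
# Carthage/Checkouts after a checkout without building.

setup() {
  cd "$BATS_TMPDIR"
  rm -rf binary-checkout-test
  mkdir binary-checkout-test && cd binary-checkout-test
  # Hypothetical binary project spec; a real test would point at a fixture.
  echo 'binary "https://my.domain.example/specs/MyFramework.json" ~> 1.0' > Cartfile
}

@test "checkout copies binary archive contents into Carthage/Checkouts" {
  run carthage bootstrap --no-build
  [ "$status" -eq 0 ]
  [ -d "Carthage/Checkouts/MyFramework" ]
}
```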

Member

@tmspzz left a comment

LGTM

@jdhealy jdhealy self-requested a review July 29, 2018 19:02
@KoCMoHaBTa
Contributor Author

@jdhealy any updates on the review?

Member

@jdhealy left a comment

Finding a place for binary dependencies’ archives to be accessible for scripting somewhere that’s not the temporary directory, I’m on board with… but, I guess I’m not sure why that place would be a Carthage/Checkouts/… directory?

Historically, it’s only been checkouts of git repositories having a home in that directory. To that effect, there are a number of preconditions about the state of that directory at various points after side-effectual code has run.

I started out by looking into #2041 which uses the contents of Carthage/Checkouts/… directories and their Cartfiles for carthage build dependency graph resolution — whether it handled the case of binary-only dependencies, and it thankfully does.

To a larger degree, I was looking into this PR’s deletion of Carthage/Checkouts/$(dep.name /* a Dependency.binary */) — whether a binary-only dependency gets ruled-out when alongside a non-binary-only dependency with the same name, and here’s the trouble: they can be in the graph alongside each other.

Under this PR, when a binary-only dependency enacts the deletion of a previously checked out directory for a non-binary-only dependency with the same name, further assumptions about the state of what is assumed to still be on-disk are invalidated and danger is threatened.

All of the above has me wanting the directory for this behavior to be somewhere other than a Carthage/Checkouts/… directory.

@jdhealy
Member

jdhealy commented Aug 9, 2018

Sorry for taking so long in writing up the review…

Appreciate the work you’ve done on this, and especially the tests 👏

I’m not sure where I’d want that directory (for binary dependencies’ archives to be accessible for scripting) to be located. There probably is a high bar (from developers not wanting to add to .gitignores) in adding another Carthage subdirectory for this, unfortunately.

Maybe there’s an alternative mechanism for hooking a script onto the binary-only archive download?

@KoCMoHaBTa
Contributor Author

How about creating a subdirectory called Binary within the Carthage/Checkouts/ directory, e.g. Carthage/Checkouts/Binary, and placing the binaries there?
Would that solve the issue of having a non-binary and a binary dependency with the same name?

@KoCMoHaBTa
Contributor Author

And it wouldn't bother developers with another directory :)

@tmspzz
Member

tmspzz commented Aug 9, 2018

To a larger degree, I was looking into this PR’s deletion of Carthage/Checkouts/$(dep.name /* a Dependency.binary */) — whether a binary-only dependency gets ruled-out when alongside a non-binary-only dependency with the same name, and here’s the trouble: they can be in the graph alongside each other.

They can, but Carthage doesn't support namespacing, and neither do the imports in a project if two frameworks have the same name.

@jdhealy
Member

jdhealy commented Aug 9, 2018

To a larger degree, I was looking into this PR’s deletion of Carthage/Checkouts/$(dep.name /* a Dependency.binary */) — whether a binary-only dependency gets ruled-out when alongside a non-binary-only dependency with the same name, and here’s the trouble: they can be in the graph alongside each other.

They can, but Carthage doesn't support namespacing, and neither do the imports in a project if two frameworks have the same name.

This isn’t really about sensical users, but more about potential malicious actors.

(And a slight concern about Carthage contributors accidentally inferring too much from disk-state, rather than drawing info from the dependency graph as intended.)

@jdhealy
Member

jdhealy commented Aug 9, 2018

@KoCMoHaBTa, if I could ask for specifics — for your workflow, what in a compressed archive is needed, and needed to achieve what goal?

Many people manage dependencies like jspahrsummers/xcconfigs (a repository without an Xcode project) through Carthage, and even some use Git LFS through Carthage for repositories with larger files.

Would it be possible to have the needed files from the compressed archive hosted in a git repository instead?
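
(For illustration, the kind of alternative suggested here, using real Cartfile syntax and the jspahrsummers/xcconfigs repository mentioned above:)

```sh
# A source (non-binary) dependency whose repository carries only files and no
# Xcode project; its contents are checked out under Carthage/Checkouts/:
echo 'github "jspahrsummers/xcconfigs"' >> Cartfile
carthage update xcconfigs --no-build
ls Carthage/Checkouts/xcconfigs
```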

@KoCMoHaBTa
Contributor Author

My use case is resolving dependencies on prebuilt 3rd-party frameworks, e.g. Firebase and Facebook, which provide an organized view of their zip contents based on what you need, and which also provide resources outside of their frameworks as bundles.

Due to the way the build step currently works - it validates frameworks and copies them - we are unable to access resources and other related files, like module maps and header files, that are provided within the archive but ignored by Carthage.
Also the framework files are shuffled together, so we lose the organization provided by the framework vendor.

Uploading the archives into a repository and using the non-binary approach would work for most cases, however it means that we have to manually download newer versions and re-upload them all the time. We are also limited in size - take Firebase for example: their archive is ~500 MB and contains multiple frameworks and resources.

Having the archive contents inside the checkout folder solves all of these issues, because updating the URL of the dependency to a newer version would do the trick. It would also finally give companies that provide closed-source frameworks an alternative to CocoaPods.

In my workflow, I only check out the resolved dependencies, integrate them into my projects directly from the checkout folder, and let the project build the dependencies during its build phase.

@KoCMoHaBTa
Contributor Author

@jdhealy How should we proceed?

@jdhealy
Member

jdhealy commented Aug 16, 2018

In my workflow, I only check out the resolved dependencies, integrate them into my projects directly from the checkout folder, and let the project build the dependencies during its build phase.

Yeah, totally, there’s a deficiency — no way to specify to ignore GitHub-release binaries and skip building via xcodebuild, but still get binary-only archive extractions. That’s #2449.

take Firebase for example: their archive is ~500 MB and contains multiple frameworks and resources.

Firebase, in particular, now has a more granular (as marked, experimental) approach using multiple binary-only archives.

Having a single archive is convenient, until it isn’t (large download size, less explicit interdependency statements, conflated versioning without semantics, no granularity to suit workflows as needed).

As far as modulemaps and header files in Firebase, I’m not familiar with that aspect of the new approach, but does it remove the need to have them manually acted upon?

Also the framework files are shuffled together

Not quite sure what you mean by this?


Ideally, people would respond fluidly to the addition of a Carthage/Resources-from-Archives/… set of subdirectories (or even a Carthage/Checkouts/Resources-from-Archives), but it might prove too unexpected for people. I haven’t heard enough feedback to know for sure; if people have additional use cases, please post in this thread and let me know.

Until other feedback comes in, I’d rather hold off on this feature and have people use the existing ways of having Carthage handle pared-down-to-their-essentials per-repository dependencies.¹

¹ The documentation to help people achieve this is slightly lacking; @KoCMoHaBTa, if you’d like to ask questions about it, or help contribute additions, that would be wonderful 👏

@KoCMoHaBTa
Contributor Author

Hey @jdhealy, thanks for the input; let me add a few more things.
Sorry for the huge post, but hopefully it will shed more light on the topic and highlight the problems I'm trying to solve.

Let's take for example Firebase, Facebook and Crashlytics - very common closed-source frameworks that I use in some of my projects.
Ideally, my goal is to be able to specify binary dependencies for closed-source frameworks and manage them with Carthage. The problem is that they usually don't have any particular support for Carthage.

The good thing is that they don't need to - all of them have direct download links (with versions) for the contents, and thanks to Carthage's ability to specify a relative path for binary dependencies, anyone can very easily add a new definition for a binary dependency within their project and use Carthage to resolve it.

The bad thing with most of them is that they distribute static libraries wrapped as frameworks, and because of that they can't embed their resources and have to distribute them separately.

In my workflow, I only check out the resolved dependencies, integrate them into my projects directly from the checkout folder, and let the project build the dependencies during its build phase.

Considering that, at this point, binary dependencies are not delivered during checkout, we could work around this by using a second Carthage setup within the same project that is responsible only for the binary dependencies.
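
(A sketch of that workaround, with a hypothetical directory name and spec URL:)

```sh
# Keep a second, binary-only Carthage setup in a subdirectory of the project;
# its build step downloads the prebuilt frameworks into its own Carthage/Build/.
mkdir -p BinaryDependencies && cd BinaryDependencies
echo 'binary "https://my.domain.example/specs/MyFramework.json" ~> 1.0' > Cartfile
carthage update --platform iOS
ls Carthage/Build/iOS
```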

In the way Carthage works right now, it requires that you use the binary build phase in order to get access to the frameworks within the archives. This leads to the following problems:
1. you don't have access to the resources distributed within the archive
2. you end up with shuffled frameworks


Firebase, in particular, now has a more granular (as marked, experimental) approach using multiple binary-only archives.

I wasn't aware of this and will definitely try it out. However, as I look into it, I see a ridiculous way to work around the problem with the resources (problem 1 above): they embed the resources within the framework wrapper in order for them to be delivered, but they still require you to manually add them to the Copy Bundle Resources build phase of your target. This is an obvious attempt to work around Carthage's ignorance of binary resources (problem 1 above).

At least it's good that Google is trying something - it's definitely better than nothing, but it does not solve the problem at all, and other vendors, like Facebook, don't care yet.

As far as modulemaps and header files in Firebase, I’m not familiar with that aspect of the new approach, but does it remove the need to have them manually acted upon?

This is not part of the new approach - it is part of Firebase's current official approach: download the archive from their official guide, under the section Integrate without CocoaPods. There is a README.md inside which explains the details, but in short there is a top-level header and a module map file that you need to link into your project in order for everything to work correctly. Due to problem 1 above, you literally can't do it.
On top of that, Carthage can't handle every kind of framework, and for some reason it fails to resolve Firebase even when using the binary build phase:
[screenshot: firebase_error]

IMO, binary dependencies should be delivered as they are, without any assumptions, but that's a different discussion.


Also the framework files are shuffled together
Not quite sure what you mean by this?

Consider that you are resolving Crashlytics and Facebook at the same time, using carthage update.
You will end up with all frameworks from all dependencies put into a single folder:
[screenshot: shuffling]

This leads to the need to spend at least 5 minutes every time you need to integrate a binary dependency in order to find the new stuff.
On top of that, it is riskier in case there are two framework files with the same name from two different dependencies.

Distributors of binary dependencies usually organize them in a way that eases navigation through their contents:
[screenshot: facebook-organized]
In addition, I've highlighted the missing resource bundles.


whether a binary-only dependency gets ruled-out when alongside a non-binary-only dependency with the same name, and here’s the trouble: they can be in the graph alongside each other.

Regarding this - well, even now you can have two GitHub dependencies with the same name from two different users, and only one of them wins.

[screenshot: duplicate_cartfile]
[screenshot: duplicate_error]

IMO, this is a totally separate issue that should be resolved on its own at a different time.


And finally, regarding the additional folders related to the binary checkout process: IMO these are just additions and will not break anything that currently exists. Most people will probably not even be aware that they exist initially, which I don't see as an issue at this time.

Of course they should be documented, like everything else, and I will contribute to that in the appropriate way, based on the decision we take in the end.

@jdhealy
Member

jdhealy commented Aug 17, 2018

Of course they should be documented, like everything else, and I will contribute to that in the appropriate way, based on the decision we take in the end.

Oh, sorry; miscommunication on this: I was referring, entirely separately from the features this PR proposes, to the lack of documentation on the feature of xcodeproj-less git-repository-based dependencies that contain configuration files or resource bundles (which we Carthage maintainers should begin to document).

a ridiculous way to work around the problem with the resources (problem 1 above): they embed the resources within the framework wrapper in order for them to be delivered, but they still require you to manually add them to the Copy Bundle Resources build phase of your target. This is an obvious attempt to work around Carthage's ignorance of binary resources (problem 1 above).

At least it's good that Google is trying something - it's definitely better than nothing

Yeah, I appreciate the aspect of a more granular approach, but it clearly lacks intuitiveness in dealing with resource bundles. Carthage should have a more structured solution to improve that. Obviously, this is not just Carthage-centric — inherent to static libraries, not being able to bundle resource dependencies (on the filesystem) is an issue.

regarding the additional folders related to the binary checkout process: IMO these are just additions and will not break anything that currently exists. Most people will probably not even be aware that they exist initially

We would, for sure, get a lot of feedback from Carthage users aggravated about having to add a line to their .gitignores, or questioning whether it should be opt-in.

This leads to the need to spend at least 5 minutes every time you need to integrate a binary dependency in order to find the new stuff.

Sidenote: carthage build takes as arguments a list of dependency names (which can absolutely all be binary-only), so a sequence of those might make that workflow easier. Kind of a partial solution, but more so:

The indication for when depended-upon new stuff gets added is Semantic Versioning — that’s the covenant we require from vendors for Carthage and needs to hold true throughout binary-only and source-available dependencies.

If the items in the compressed archives have interdependencies on each other (say a static framework depending upon a resource bundle), there’s no structured representation of that. It’s conflating an (unrepresented) dependency graph as a single leaf-node dependency.

@jdhealy
Member

jdhealy commented Aug 17, 2018

Oh, most of all @KoCMoHaBTa, thanks for the additional feedback — clarified a lot, especially the need for additional features around static frameworks and resource bundles.

Greatly appreciated 👏

@KoCMoHaBTa
Contributor Author

We would, for sure, get a lot of feedback from Carthage users aggravated about having to add a line to their .gitignores, or questioning whether it should be opt-in.

@jdhealy How about triggering this feature explicitly with an additional flag, e.g. --copy-binaries? This way there would be no additional folder to unexpectedly bug existing users.
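
(How the proposed opt-in might look; note that --copy-binaries is a hypothetical flag suggested in this thread, not an existing Carthage option:)

```sh
# Opt in explicitly; without the flag, behavior stays exactly as today.
carthage bootstrap --no-build --copy-binaries
```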

@KoCMoHaBTa
Contributor Author

This way, whoever wants to opt in and use it will be aware of the additional folder.

@jdhealy
Member

jdhealy commented Aug 17, 2018

We would, for sure, get a lot of feedback from Carthage users aggravated about having to add a line to their .gitignores, or questioning whether it should be opt-in.

@jdhealy How about triggering this feature explicitly with an additional flag, e.g. --copy-binaries? This way there would be no additional folder to unexpectedly bug existing users.
This way, whoever wants to opt in and use it will be aware of the additional folder.

That is, to be clear, a more minor point than the rest of the above.

@KoCMoHaBTa
Contributor Author

OK, I've gotten a little confused already, so to the point: what changes are needed for this feature to be accepted?

@ikesyo
Member

ikesyo commented Sep 11, 2018

ping @jdhealy

@stale

stale bot commented Oct 11, 2018

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@stale stale bot added the stale label Oct 11, 2018
@stale stale bot closed this Oct 18, 2018
@ikesyo
Member

ikesyo commented Oct 19, 2018

Reopening.

ping @jdhealy

@ikesyo ikesyo reopened this Oct 19, 2018
@stale stale bot removed the stale label Oct 19, 2018
@stale

stale bot commented Nov 18, 2018

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@stale stale bot added the stale label Nov 18, 2018
@mdiep mdiep removed the stale label Nov 19, 2018
@stale

stale bot commented Dec 19, 2018

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@rdingman

rdingman commented Apr 18, 2020

cc: @jdhealy

Any chance this PR could be resurrected? I'd find this particularly useful for FirebaseCrashlytics, which distributes some binaries (non-framework) for uploading dSYMs to their service. Check out any of the zip files referenced in:

https://dl.google.com/dl/firebase/ios/carthage/FirebaseCrashlyticsBinary.json
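
(For context, the file referenced above is a Carthage binary project spec: a JSON object mapping semantic versions to zip URLs. A minimal sketch of the format, with made-up versions and URLs:)

```sh
cat <<'EOF'
{
  "1.0.0": "https://my.domain.example/releases/MyFramework-1.0.0.zip",
  "1.1.0": "https://my.domain.example/releases/MyFramework-1.1.0.zip"
}
EOF
```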
