Request: Download first, compile after #2000
Comments
I'm not sure how feasible this would be with our current setup, as we use the status of the downloading command to determine if a rebuild is needed in the first place
If it were C#, I'd download, decide whether rebuilding is needed, and store the decision in a variable. Once all downloads were finished (I assume about 15 minutes), I'd use that group of variables to rebuild as needed (which takes hours). Looking at the Bash code, it seems the function in charge of downloading is
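A minimal Bash sketch of that two-phase idea. The function names `download_pkg` and `compile_pkg` are placeholders for the suite's real download and build steps, and the package list is illustrative:

```shell
#!/bin/bash
# Placeholder stubs -- the suite's real download/compile logic would go here.
download_pkg() { echo "fetching $1"; }   # exit status would signal "source changed"
compile_pkg()  { echo "building $1"; }

pkgs=(fribidi libass x264)
declare -A needs_rebuild

# Phase 1: download everything first (minutes), remembering what changed.
for p in "${pkgs[@]}"; do
    if download_pkg "$p"; then
        needs_rebuild[$p]=1
    fi
done

# Phase 2: compile only what changed (hours), with the network out of the picture.
for p in "${pkgs[@]}"; do
    [[ ${needs_rebuild[$p]-} ]] && compile_pkg "$p"
done
```

The point is that the "rebuild needed?" decision is captured in `needs_rebuild` during phase 1, so the status of the download command is still what drives the rebuild, just recorded instead of acted on immediately.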
bruh
^ I take that as a "yes". 😊
On somewhat similar lines, what would also be helpful is if the script did not automatically abort upon failure to build an ffmpeg dependency. It could work as follows:
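The concrete steps proposed here did not survive this copy of the thread, but one way such a continue-on-failure mode could look in Bash is sketched below. `build_dep` is a stand-in for the suite's per-package build step, and the failing package is simulated:

```shell
#!/bin/bash
# Sketch: keep going when one dependency fails, then report at the end.
build_dep() { [[ $1 == libass ]] && return 1; return 0; }  # pretend libass breaks

failed=()
for dep in fribidi libass harfbuzz; do
    if ! build_dep "$dep"; then
        echo "WARN: $dep failed to build, continuing" >&2
        failed+=("$dep")
    fi
done

if ((${#failed[@]})); then
    echo "Finished with failures: ${failed[*]}"
else
    echo "All dependencies built."
fi
```

An overnight run would then produce a summary of what broke instead of stopping at the first failure.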
@GyanD With this approach, you can resume a failed package compilation with only a ten-second delay, as opposed to hours. Failures are also less likely, because there are fewer moving parts. Of course, you'd eventually want to update the package; you issue an update command on your own schedule. It updates the artifacts, then rebuilds as needed.
That takes care of one scenario: a dep can't be downloaded. But the most common scenario is not lack of access to the dep repo, but that an updated dep source can't be built or linked successfully with ffmpeg.
Why? ("Can't be built" is a bit too broad. Can you think of possible reasons besides download?)
Breaking change. Currently, libass is broken due to underspecified dependencies of its own in its .pc file. This kind of breakage is a semi-regular occurrence.
Well, that's not a problem with vcpkg, because like I said, once you successfully build an artifact, like
But I think you might still have a problem with this, because you only publish static builds. Shared builds don't have this problem: as long as the EXE can find the DLL and the function signatures match, the entire product works.
So, it keeps older artifacts around?
Yes. It doesn't update or discard any artifact unless you say so. In addition, it has three caches: Downloads, Packages, and Buildtrees. You can reuse or delete all three if you are so inclined. I think
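For reference, those three caches live as plain directories under the vcpkg checkout, so inspecting or clearing them is ordinary file management. The `$VCPKG_ROOT` default below is an assumption about where the checkout lives:

```shell
#!/bin/bash
# The three vcpkg caches are plain directories under the vcpkg root.
VCPKG_ROOT=${VCPKG_ROOT:-$HOME/vcpkg}

# See how much space each cache occupies (directories may not all exist yet).
du -sh "$VCPKG_ROOT"/{downloads,packages,buildtrees} 2>/dev/null

# To reclaim space you can delete them; vcpkg recreates them as needed:
# rm -rf "$VCPKG_ROOT"/buildtrees   # intermediate build output, usually largest
# rm -rf "$VCPKG_ROOT"/downloads    # source archives and tools, re-fetched on demand
```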
You could use Windows File History on the local32/local64 directories to keep a "backup" of previous versions of a library, and in case of failures to build, you can usually override the check for updated versions, at least for git/svn-using libraries.
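One reading of "override the check for updated versions" for a git-based dependency is to park the local clone on the last commit that built successfully, so there is nothing "newer" to pull. The sketch below demonstrates the idea on a throwaway repository; with the suite you would run the same `checkout --detach` inside its real clone (the `libass-demo` repo and commit messages here are purely illustrative):

```shell
#!/bin/bash
# Demo of pinning a git-based dependency to a known-good revision.
repo=$(mktemp -d)/libass-demo
git init -q "$repo"
git -C "$repo" -c user.email=you@example.com -c user.name=demo \
    commit -q --allow-empty -m "known-good build"
good=$(git -C "$repo" rev-parse HEAD)          # remember the good revision
git -C "$repo" -c user.email=you@example.com -c user.name=demo \
    commit -q --allow-empty -m "breaking update"

# Park HEAD on the known-good commit; an update check that compares HEAD
# against upstream then finds nothing new, and the old source rebuilds as-is.
git -C "$repo" checkout -q --detach "$good"
git -C "$repo" log -1 --format=%s    # -> "known-good build"
```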
I have problems with the "override the check for updated versions" portion. I don't know how. Would you care to explain? As for keeping a backup copy of |
(Shrugs) 🤷‍♀️
Hi. 😀
Before I begin, I'd like to thank you for your script. It made things a lot easier. 🙏
Now, while the script is very useful, it takes hours to run. During this time, it goes over a list of remote repos, pulls the first item, compiles it, then repeats the same for each subsequent item. The pulling takes a few minutes in total; compilation takes three to five hours. A simple blip on the Internet breaks the whole process. On many occasions, I've left it to run overnight, only to wake up and see an error message indicating a problem on the remote side that is not the fault of your script. Maybe the remote repo was in maintenance mode and the script tried to reach it during that small window.
But I think a small change in the script could make life easier, one that I myself can't make. What if the script pulled all the remote repos first, before beginning the compilation process? The user could run it, wait 15 minutes while the remote repos are pulled, then leave it unattended for the next five hours. If one of the pulls failed, the user could retry a few minutes later and salvage hours of work.
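The request could even absorb transient blips without user intervention by retrying each pull a few times before giving up, and by failing fast before any compilation has started. A sketch of that front-loaded fetch phase, where `fetch_one` stands in for the suite's real clone/update step and the repo URLs are placeholders:

```shell
#!/bin/bash
# Sketch: fetch every source up front, retrying transient network failures,
# and only reach the hours-long compile phase once all pulls have succeeded.
fetch_one() { echo "pulled $1"; }   # stand-in for the real clone/update step

repos=(https://example.com/a.git https://example.com/b.git)  # placeholders

for url in "${repos[@]}"; do
    ok=
    for attempt in 1 2 3; do
        if fetch_one "$url"; then ok=1; break; fi
        echo "retry $attempt for $url after a short pause" >&2
        sleep 5
    done
    if [[ -z $ok ]]; then
        echo "ERROR: could not fetch $url; fix the network and rerun" >&2
        exit 1      # fail fast *before* any compilation has started
    fi
done

echo "all sources fetched; starting compile phase"
```

A failed pull then costs minutes rather than invalidating an overnight build.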