ocpn-plugins.xml update process broken #130
I agree Jon, as the file gets bigger it is going to take longer and longer to check all the XML and URLs. It is also very manual: downloading the metadata, adding it to the plugins/metadata folder, building the new file, checking all the XML and URLs, and then making a PR. I was thinking today about making a script or batch file that downloads the correct XML files from Cloudsmith to a default directory. This should be possible using wget and some kind of search in Cloudsmith. Since I am now using the OpenCPN organization repository, and since everything is "public", I can use "https://cloudsmith.io/~opencpn/repos/" and add the plugin info
This produces a list of 15 XML files. Next they have to be downloaded to some convenient location to be added to the plugins/metadata folder. I think the checking of XML files is problematic because I still do not get a list of files that have bad URLs or are malformed. It requires manually cutting and pasting batches of XML files into a temp folder and running the check again until the culprit is isolated. This is not efficient and is a great waste of time. Consequently I do not intend to do this very often!
Jon...
The key here is removing your local copy of ocpn-plugins.xml before pulling. So, no conflicts.
I agree that there could be better tooling for this process. But I would not call the current process "broken". Just sometimes inconvenient.
Scripting the Cloudsmith download using wget in a loop, forming the file names from the known git commit SHA suffix, is one way, and it works well for me.
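A minimal sketch of that wget loop follows. The plugin name, version, SHA, base URL, and target list are all invented for illustration; the real Cloudsmith slugs and file naming will differ:

```shell
#!/bin/sh
# Sketch of a wget loop that forms file names from the known git commit
# SHA suffix. PLUGIN, VERSION, SHA, BASE and TARGETS are assumptions,
# not the project's real values.
BASE="https://dl.cloudsmith.io/public/opencpn/myplugin/raw/files"
PLUGIN="myplugin"
VERSION="1.0.0"
SHA="2c25b1b"     # short commit SHA used as the version suffix
TARGETS="ubuntu-x86_64-18.04 debian-x86_64-10 flatpak-x86_64-18.08"

for t in $TARGETS; do
    file="${PLUGIN}-${VERSION}+${SHA}.${t}.xml"
    echo "${BASE}/${file}"                    # dry run: print the URL
    # wget -q -P metadata "${BASE}/${file}"   # real download
done
```

Keeping the loop in dry-run mode at first makes it easy to eyeball the generated URLs before fetching anything.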
You obviously don't use it like a plugin developer (hardly surprising, since you control this). From the developer's point of view it is an exercise in frustration. The real problem is that pulls frequently invalidate other pulls. I put up a pull request against master, it passed OK and got the green light. However, you accepted another pull before mine, which invalidated my pull, which you then asked me to fix up. This suggests that the whole update process for ocpn-plugins.xml is very fragile and needs fixing. One possibility is that adding a main.yml file to the '.github/workflows' directory with something like this:
to cause the generation of the ocpn-plugins.xml file may be a solution.
Jon... |
We could clearly improve this. But, I remind you, simultaneous commits involving the same file on GitHub will always generate a conflict. It is the nature of the collaborative beast.
Nearly every time I do a pull request I get the issue with the XML file. When I put it up it is OK; by the time you accept it, it is not. I know about conflicts on files, but why are we using a process that almost guarantees them? As for a pull request, I have only just found that you can run scripts based on events on GitHub, and I only looked in order to try and find a better way to do what is being done. I am not an expert in any of this, just a time-poor user. I am trying to investigate this, but I really don't have the time at the moment. I did suggest a way, months ago, of using 'pointer' content in the XML file so that it could find files based on ones maintained by the developer. I think it is sort of working for alpha changes, but it is not implemented for other levels of code. That is fine, but the proponent of this whole process needs to do more thinking and provide workable solutions. I have only made minor changes to my plugins, but have done a large amount of work trying to get this process working, and this is just another part of the process which is OK for alpha testing but not suitable for production.
Jon/Rick...
Call this script cl-download.sh and make it executable. It will download all of the required XMLs into the "metadata" directory in one command. Then you are all set to rebuild ocpn-plugins.xml and create a PR. Dave
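Dave's script itself did not survive in this copy of the thread; the sketch below is only a guess at its general shape, with every name (plugins.txt, the base URL, the target list) invented for illustration:

```shell
#!/bin/sh
# Hypothetical reconstruction of a cl-download.sh-style helper: read
# "name version sha" triples and fetch every metadata XML in one pass.
set -e
BASE="https://dl.cloudsmith.io/public/opencpn"   # assumed base URL

# Sample input so the sketch runs standalone; in real use this file
# would be maintained by hand, one plugin per line.
cat > plugins.txt <<'EOF'
myplugin 1.0.0 2c25b1b
EOF

mkdir -p metadata
while read -r name version sha; do
    for t in ubuntu-x86_64-18.04 debian-x86_64-10 flatpak-x86_64-18.08; do
        url="${BASE}/${name}/raw/files/${name}-${version}+${sha}.${t}.xml"
        echo "would fetch: $url"   # replace echo with: wget -q -P metadata "$url"
    done
done < plugins.txt
```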
I have created a pull request for a new download_xml.sh file. It will find and download all XML files for a plugin from a Cloudsmith repository based on the plugin name, the version, the Cloudsmith owner, and the Cloudsmith level. This is just a first pass to help with downloads, and it does require lynx to be installed. I have only done initial testing on an Ubuntu machine.
This looks like a solution. #131 How big is lynx, Jon? Is it for Windows too? I get embedded software...? Couldn't we use Python? I have Python already. I can run bash through Git-Gui to run the script. Boy, is this arcane. Jon's script: download_xml.sh
I recall Kees made a script for downloading CS radar files. Hakan may know... I have written to him.
Lynx is a web browser that can be run from the command line. It is small, ~1.4 MB. I have tried to use wget and curl, but I cannot seem to get them to do the query needed to find the files for download. So, at the moment, this command only runs on Linux after installing the lynx package. I will see if I can find another way to get the information from the web, but...
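For what it's worth, Cloudsmith also exposes a JSON packages API, which might remove the lynx dependency. The untested sketch below shows the idea; the repository slug, the query string, and the `cdn_url` field name are assumptions, not verified against the real repositories:

```shell
#!/bin/sh
# Untested sketch of a lynx-free query: ask the Cloudsmith packages API
# for matching packages, then pull download URLs out of the JSON.
# OWNER, REPO and the "cdn_url" field name are assumptions here.
OWNER="opencpn"
REPO="myplugin"
API="https://api.cloudsmith.io/v1/packages/${OWNER}/${REPO}/"

echo "query: ${API}?query=filename:.xml"
# Real call (needs network, and possibly an API key for rate limits):
#   curl -s "${API}?query=filename:.xml" \
#       | grep -o '"cdn_url": *"[^"]*"' \
#       | sed 's/.*"\(https[^"]*\)".*/\1/' \
#       | xargs -r -n1 wget -q -P metadata
```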
I have just created a GitHub workflow action that will recreate the ocpn-plugins.xml file on each successful merge. Users will then only need to add/remove versions of their plugins and push those changes to this repository. When the pull request is merged successfully, the ocpn-plugins.xml file will be recreated. Hopefully this should alleviate the issue of conflicting pull requests.
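As a rough illustration of what such a workflow step might execute: the real repository may well use a dedicated generator tool, but a naive version just wraps the per-plugin metadata fragments in a root element. The sample fragment below is invented so the sketch runs standalone:

```shell
#!/bin/sh
# Naive sketch of a rebuild step: concatenate the per-plugin metadata
# files into ocpn-plugins.xml. The demo fragment is fabricated; the
# project's actual generator may do much more (validation, sorting).
set -e
mkdir -p metadata
printf '<plugin><name>demo</name></plugin>\n' > metadata/demo.xml

{
    printf '<?xml version="1.0" encoding="utf-8"?>\n<plugins>\n'
    cat metadata/*.xml
    printf '</plugins>\n'
} > ocpn-plugins.xml
# In CI, the action would then commit the regenerated file:
#   git add ocpn-plugins.xml
#   git commit -m "Rebuild ocpn-plugins.xml" && git push
```

Because the file is rebuilt after the merge, no contributor ever commits ocpn-plugins.xml directly, which is what removes the conflict.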
That makes good sense to me!
Hakan, about Kees' script: I looked and didn't find it. I guess I dreamt it.
The basic problem here is that each plugin developer holds a copy of all files in metadata/, right? And while I work on one plugin, other devs might change the metadata/ files, leading to merge conflicts. In normal cases we should not have two plugin developers working on the same files in metadata/; there is basically one dev for each plugin, right? IMHO, getting this to work smoothly is about applying a standard git workflow.

The first thing to recognize is that ocpn-plugins.xml is a derived file: we can always re-construct it from the files in metadata/. Such files should not be part of the git repo. So a first step would be to remove ocpn-plugins.xml from the repo and add it to .gitignore. This would effectively stop all conflicts in that file, for sure.

Next is the workflow. A plugin dev could always work like this:
As long as each dev only changes her own plugins, there are no conflicts in this flow. OTOH, if two devs change the same file in metadata/, that is a real conflict which has to be resolved. But in the end this is a workflow issue; it shouldn't really happen. To summarize:
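The flow Alec describes can be tried end-to-end in a throwaway repository. The sketch below fabricates two plugins' metadata files to show that a rebase over someone else's merged change is conflict-free when the changed files differ; all file and branch names are invented for the demonstration:

```shell
#!/bin/sh
# Self-contained demo of the rebase workflow in a scratch repo.
set -e
work=$(mktemp -d)
cd "$work"
git init -q demo && cd demo
git config user.email dev@example.com
git config user.name dev
base=$(git symbolic-ref --short HEAD)   # master or main, depending on git

mkdir metadata
echo "v1" > metadata/other-plugin.xml
git add . && git commit -qm "initial metadata"

git checkout -qb my-plugin-update       # the dev's feature branch
echo "mine" > metadata/my-plugin.xml
git add . && git commit -qm "my-plugin: new metadata"

git checkout -q "$base"                 # meanwhile, upstream moves on
echo "v2" > metadata/other-plugin.xml
git commit -qam "other-plugin: update"

git checkout -q my-plugin-update
git rebase -q "$base"                   # different files: no conflict
git log --oneline
```

After the rebase, the branch contains both the upstream change and the local one, with no manual re-downloading of anything.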
Alec, your point about two developers interfering with each other has occurred to me, but I have not hit it in real life yet. Jon has encouraged us to use a different process which utilizes a personal branch, say "rg-master", created from upstream master; we then commit and push that.
Well, this whole thread is about the process being broken due to conflicts... Using your own branch does not really change anything in this respect. The thing is, this downloading of metadata files, with or without support scripts, could and should be avoided. A simple 'git rebase' does the same thing, but is much more efficient and reliable. You could apply the workflow above on a separate branch; it's a no-brainer. Likewise, there are variants if you intend to merge to the Beta/Alpha branches. In any case there is no need to download stuff or to use any scripts; git is fully capable of handling this, as well as much more complicated cases.
OK, my bad: of course you need to download the metadata files which have been updated; this is how your process works.
I am not that familiar with git rebase, and I assume "origin" is your personal remote repository (like github.com/rgleason). The difference is that you clone and delete a local repository (master branch) versus creating and deleting a branch.
You only clone once, at project start. The repo should then be used for as long as the project exists. A branch can be created at any point, but should not be deleted until the PR is merged upstream. The 'rebase' operation effectively makes your work apply on top of whatever changes have been made in the upstream repo. Jon's two remotes, upstream and origin, are sound (and standard). I tried to keep the description here simple.
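The one-time setup this implies, with the conventional remote names, might look like the sketch below. The URLs are placeholders (a personal fork and the shared repo), and `git init` stands in for the one-time clone so the sketch needs no network:

```shell
#!/bin/sh
# One-time setup sketch: clone once, then keep two remotes forever.
# URLs are placeholders; git init stands in for the initial clone here.
set -e
d=$(mktemp -d) && cd "$d"
git init -q work && cd work
git remote add origin   'https://github.com/yourname/plugins.git'   # personal fork
git remote add upstream 'https://github.com/OpenCPN/plugins.git'    # shared repo
git remote -v
# From then on, one branch per PR; refresh a stale branch with:
#   git fetch upstream && git rebase upstream/master
```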
Drop ocpn-plugins.xml from git repo (#130).
Can this be closed now?
Yes |
Jon, can you please close this now? I can't. Have linked to it in OpenCPN/OpenCPN#3183 |
The process for updating ocpn-plugins.xml was dodgy at best when it was first introduced, but is now becoming unworkable. As more plugins move to the new Plugin Manager process, the ability to maintain this file will decrease to the point where most people will give up trying.
Each commit/pull request for an update is based on the 'current' master, but if two or more pull requests are waiting to be merged, the first will succeed and the others will fail with conflicts in the ocpn-plugins.xml file. This requires the owners of the other pull requests to close their current pull request, delete their branch completely, pull the latest version of master, reapply all their additions and deletions of files, recreate the ocpn-plugins.xml file, create a new pull request, and then hope that no-one else is doing updates at the same time, or they will have to redo the whole process again.
Each update currently requires downloading 14 XML files from Cloudsmith for each plugin as individual downloads (there is no 'mass' download capability). This rapidly gets frustrating in the extreme if the above process has to be followed multiple times. I know I can keep copies around, but it is a very manual process. There MUST be a better way of building the ocpn-plugins.xml file after the merge of a pull request that adds/deletes XML files, such that it is always consistent.