
ocpn-plugins.xml update process broken #130

Open
jongough opened this issue Jun 13, 2020 · 26 comments
@jongough
Contributor

The process for updating the ocpn-plugins.xml was dodgy at best when it was first introduced, but is now becoming unworkable. As more plugins move to the new Plugin Manager process the ability to maintain this file will decrease to the point where most people will give up trying.

Each commit/pull request for an update is based on the 'current' master, but if two or more pull requests are waiting to be merged, the first will succeed and the others will fail with conflicts in the ocpn-plugins.xml file. This forces the owners of the other pull requests to close their current pull request, delete their branch completely, pull the latest version of master, reapply all their additions and deletions of files, recreate the ocpn-plugins.xml file, create a new pull request, and then hope that no-one else is doing updates at the same time, or they will have to redo the whole process again.

Each update currently requires downloading 14 XML files from Cloudsmith for each plugin as individual downloads (there is no 'mass' download capability). This rapidly becomes frustrating in the extreme if the above process has to be followed multiple times. I know I can keep local copies of things, but it is a very manual process. There MUST be a better way of building the ocpn-plugins.xml file after the merge of a pull request that adds/deletes XML files, such that it is always consistent.

@rgleason
Contributor

I agree Jon; as the file gets bigger, it is going to take longer and longer to check all the XML and URLs.

It is also very manual: downloading the metadata, adding it to the plugins/metadata folder, building the new file, checking all the XML and URLs, and then making a PR.

I was thinking today about making a script or batch file that downloads the correct xml files from cloudsmith to a default directory. This is possible using wget I think and using a search somehow in cloudsmith.

Since I am now using the OpenCPN organization repository, and since everything is "public", I can use "https://cloudsmith.io/~opencpn/repos/" and append the plugin info.

This produces a list of 15 XML files. Next they have to be downloaded to some convenient location to be added to the plugins/metadata folder.

I think the checking of XML files is problematic because I still do not get a list of files that have bad URLs or are malformed. It requires manually cutting and pasting batches of XML files into a temp folder and running the check again until the culprit is isolated. This is not efficient and is a great waste of time.

Consequently I do not intend to do this very often!
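For what it's worth, the isolate-the-culprit step could probably be automated. A sketch (assuming xmllint and curl are installed, and that the download links live in <tarball-url> elements, which should be checked against the actual metadata):

```shell
#!/bin/bash
# Sketch: list malformed metadata files and dead download links.
# Assumes xmllint (libxml2-utils) and curl are available, and that the
# download URL element is named <tarball-url> (check against real files).
for f in metadata/*.xml; do
  # --noout parses the file and prints nothing unless there is an error
  xmllint --noout "$f" 2>/dev/null || echo "MALFORMED: $f"
done
# Extract the URLs and probe each one with a HEAD request
grep -ho '<tarball-url>[^<]*</tarball-url>' metadata/*.xml \
  | sed 's/<[^>]*>//g' \
  | sort -u \
  | while read -r url; do
      # -I sends a HEAD request; -f makes curl fail on HTTP errors
      curl -sfIL "$url" -o /dev/null || echo "BAD URL: $url"
    done
```

This prints one line per problem file or URL instead of requiring manual bisection.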

@bdbcat
Member

bdbcat commented Jun 13, 2020

Jon...
Did you read and try this?

It is always safe to delete the ocpn-plugins.xml file from your local copy and regenerate it from a freshly pulled GitHub master, with your metadata updated. Probably much quicker than manually fixing the conflicts. Like this:

rm ocpn-plugins.xml
git pull origin master
# update your plugin metadata, by whatever means you use
tools/ocpn-metadata generate --userdir metadata --destfile ocpn-plugins.xml --force
# commit this state
git push origin master

The key here is removing your local copy of ocpn-plugins.xml before pulling. So, no conflicts.
Dave

@bdbcat
Member

bdbcat commented Jun 13, 2020

I agree that there could be better tooling for this process. But I would not call the current process "broken", just sometimes inconvenient.

@bdbcat
Member

bdbcat commented Jun 13, 2020

Scripting the Cloudsmith download using wget in a loop, and forming the file names from the known git commit SHA suffix is one way, and works well for me.
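The loop Dave describes might be sketched like this (the repository path and naming pattern follow the fuller script he posts later in this thread; the platform list is abbreviated here):

```shell
#!/bin/bash
# Sketch: form each metadata URL from the plugin name ($1) and the
# version-plus-commit-SHA suffix ($2), then fetch it in a loop.
REPO="https://dl.cloudsmith.io/public/david-register/ocpn-plugins-unstable/raw/names"
for p in 'darwin-10.13.6' 'ubuntu-x86_64-18.04' 'msvc-10.0.14393'; do
  url="$REPO/$1-${p}-metadata/versions/$2/$1-plugin-${p}.xml"
  echo "$url"
  # wget -c "$url" -O "metadata/$1-plugin-${p}.xml"   # uncomment to download
done
```

Invoked as, e.g., `./urls.sh oernc 1.2.2.0.d91f87a`, it prints one URL per platform.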

@jongough
Contributor Author

You obviously don't use it like a plugin developer (hardly surprising, since you control this). From the developer point of view it is an exercise in frustration.

The real problem is that pulls frequently invalidate other pulls. I put up a pull request against master, and it passed and got the green light. However, you accepted another pull before mine, which invalidated mine, and you then asked me to fix it up. This suggests that the whole update process for ocpn-plugins.xml is very fragile and needs fixing. One possibility may be to add a main.yml file to the '.github/workflows' directory containing something like this:

# This is a basic workflow to help you get started with Actions
name: CI
# Controls when the action will run. Triggers the workflow on push or pull request
# events but only for the master branch
on:
  push:
    branches: [ master ]
  pull_request:
    branches: [ master ]

# A workflow run is made up of one or more jobs that can run sequentially or in parallel
jobs:
  # This workflow contains a single job called "build"
  build:
    # The type of runner that the job will run on
    runs-on: ubuntu-latest

    # Steps represent a sequence of tasks that will be executed as part of the job
    steps:
    # Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it
    - uses: actions/checkout@v2

    # Runs a single command using the runners shell
    - name: Run a one-line script
      run: echo Hello, world!

    # Runs a set of commands using the runners shell
    - name: Run a multi-line script
      run: |
        echo Add other actions to build,
        echo test, and deploy your project.

modified so that it regenerates the ocpn-plugins.xml file, as a possible solution.
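For instance, the two placeholder steps could be replaced with something along these lines (a sketch only; the step names and the commit/push handling are assumptions, and tools/ocpn-metadata is the generator mentioned elsewhere in this thread):

```yaml
    # Hypothetical replacement for the placeholder steps above
    - name: Regenerate ocpn-plugins.xml
      run: |
        rm -f ocpn-plugins.xml
        tools/ocpn-metadata generate --userdir metadata \
            --destfile ocpn-plugins.xml --force

    # Only commit on a push (i.e. after a merge), not on pull requests
    - name: Commit the regenerated file
      if: github.event_name == 'push'
      run: |
        git config user.name github-actions
        git config user.email github-actions@github.com
        git add ocpn-plugins.xml
        git commit -m "Regenerate ocpn-plugins.xml" || echo "No changes"
        git push
```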

@bdbcat
Copy link
Member

bdbcat commented Jun 13, 2020

Jon...
PRs always welcome....

@bdbcat
Copy link
Member

bdbcat commented Jun 13, 2020

Jon...
"You obviously don't use it like a plugin developer"
I maintain 3 plugins. And, sure enough, if I do it wrong, then I pay the price in time to redo it all. So I try to push the correct keys.

@bdbcat
Copy link
Member

bdbcat commented Jun 13, 2020

We could clearly improve this. But, I remind you, simultaneous commits involving the same file on GitHub will always generate conflicts. It is the nature of the collaborative beast.

@jongough
Contributor Author

jongough commented Jun 13, 2020

Nearly every time I do a pull request I get the issue with the xml file. When I put it up it is OK; by the time you accept it, it is not. I know about conflicts on files, but why are we using a process that almost guarantees them?

As for a pull request, I have only just found out that you can run scripts based on events on GitHub, and I only looked in order to try to find a better way to do what is being done. I am not an expert in any of this, just a time-poor user. I am trying to investigate this, but I really don't have the time at the moment.

I did suggest a way, months ago, of using 'pointer' content in the xml file so that it could find files based on ones maintained by the developer. I think it is sort of working for alpha changes, but it is not implemented for other levels of code. That is fine, but the proponent of this whole process needs to do more thinking and provide workable solutions. I have only made minor changes to my plugins but have done a large amount of work trying to get this process working, and this is just another part of the process which is OK for alpha testing but not suitable for production.

@bdbcat
Member

bdbcat commented Jun 13, 2020

Jon/Rick...
Make life simpler with a script to download all of the XMLs from Cloudsmith.
Adapt this to your directory structure and Cloudsmith repo name:

#!/bin/bash
# Download the per-platform metadata XMLs for a plugin from Cloudsmith.
# $1 = plugin name, $2 = version (including the commit-SHA suffix)
Platforms=('darwin-10.13.6' 'ubuntu-x86_64-16.04' 'ubuntu-x86_64-18.04' 'msvc-10.0.14393' 'mingw-10' 'debian-x86_64-10' 'raspbian-armhf-10' 'raspbian-armhf-9.4' 'flatpak-18.08' 'fedora-x86_64-29' 'ubuntu-gtk3-x86_64-18.04' 'ubuntu-gtk3-x86_64-20.04' 'ubuntu-arm64-18.04')

REPO="https://dl.cloudsmith.io/public/david-register/ocpn-plugins-unstable/raw/names/"

for u in "${Platforms[@]}"
do
  METADATA="$1-${u}-metadata/versions/$2/$1-plugin-${u}.xml"
  URL="$REPO$METADATA"
  echo "$URL"

  rm -f "metadata/$1-plugin-${u}.tmp"
  wget --progress=bar:force:noscroll -c "$URL" -O "metadata/$1-plugin-${u}.tmp"
  filesize=$(stat -c%s "metadata/$1-plugin-${u}.tmp")
  echo "Size of metadata/$1-plugin-${u}.tmp = $filesize bytes."
  # A very small file is an error page, not real metadata; discard it.
  if (( filesize > 600 )); then
    mv "metadata/$1-plugin-${u}.tmp" "metadata/$1-plugin-${u}.xml"
  else
    rm -f "metadata/$1-plugin-${u}.tmp"
  fi
done


Call this script cl-download.sh and make it executable.
Usage example (for oernc_pi, latest build):
$ cd plugins
$ ./cl-download.sh oernc 1.2.2.0.d91f87a

This will download all of the required XMLs into the "metadata" directory in one command. Then all is set to rebuild ocpn-plugins.xml and create a PR.

Dave

@jongough
Contributor Author

jongough commented Jun 13, 2020

I have created a pull request for a new download_xml.sh file. This will find and download all xml files for a plugin from a cloudsmith repository based on the plugin name, the version, the cloudsmith owner and the cloudsmith level. So a command like:
./download_xml.sh testplugin_pi 1.0.114 jon-gough alpha
will download all available xml files that match the name and version from my testplugin_pi alpha repository. This works for me, but others may need a fifth parameter to specify the repository prefix if it is different from the plugin name.

This is just a first pass to help with downloads, and it does require lynx to be installed. I have only done initial testing on an Ubuntu machine.

@rgleason
Contributor

rgleason commented Jun 13, 2020

This looks like a solution. #131

How big is lynx, Jon? Is it for Windows too? I get "embedded software"...? Couldn't we use Python? I have Python already. I can run bash through Git-Gui to run the script. Boy, is this arcane.

Jon's script

download-xml.sh


#!/bin/bash

if [ "$#" -ne "4" ]; then
	echo "Incorrect invocation: Should be download_xml.sh plugin_name plugin_version cloudsmith_user cloudsmith_level"
	echo "where:"
	echo "   plugin_name is the name of the plugin, i.e. testplugin_pi"
	echo "   plugin_version is the version number, i.e. 1.0.114.0"
	echo "   cloudsmith_user is the user name associated with the cloudsmith repository, i.e. jon-gough"
	echo "   cloudsmith_level is the level of the repository and is one of: prod, beta, alpha"
	echo ""
	echo "   Full command should look like:"
	echo "      download_xml.sh testplugin_pi 1.0.114.0 jon-gough alpha"
	exit
fi

REPO="https://cloudsmith.io/~$3/repos/"
echo "Issuing command: lynx -dump $REPO$1-$4/packages/?q=*$2*.xml | awk '{print \$2}' | grep '\.xml\$' | grep $1-$2"
my_array=()
echo "Show current files that match criteria"
ls -la metadata/$1-$2*xml
echo "Deleting current files that match criteria"
rm -f metadata/$1-$2*xml
echo "Finding files on remote cloudsmith repository"
while IFS= read -r line; do
    my_array+=( "$line" )
done < <( lynx -dump "$REPO$1-$4/packages/?q=*$2*.xml" | awk '{print $2}' | grep '\.xml$' | grep "$1-$2" )

echo "Downloading files found that match criteria"
for URL in "${my_array[@]}"
do
  echo "$URL"
  wget --progress=bar:force:noscroll -c "$URL" -P metadata
done
echo "Files downloaded"
ls -la metadata/$1-$2*xml

@rgleason
Contributor

I recall Kees made a script for downloading CS radar files. Hakan may know... I have written to him.

@jongough
Contributor Author

Lynx is a text-based web browser that can be run from the command line. It is small, ~1.4MB. I have tried to use wget and curl, but I cannot seem to get them to perform the query needed to find the files for download. So, at the moment, this command only runs on Linux after installing the lynx package. I will see if I can find another way to get the information from the web, but.....
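If lynx turns out to be a problem, the `lynx -dump` step could perhaps be replaced with curl plus some text filtering, since all that is needed is the list of .xml links on the packages page. A sketch (untested against the live Cloudsmith pages; the query URL follows the download_xml.sh script):

```shell
#!/bin/bash
# Sketch: list the .xml download links from a Cloudsmith packages page
# using curl instead of lynx.
# $1 = plugin name, $2 = version, $3 = cloudsmith user, $4 = level
PAGE="https://cloudsmith.io/~$3/repos/$1-$4/packages/?q=*$2*.xml"
curl -s "$PAGE" \
  | grep -o 'https://[^"]*\.xml' \
  | grep "$1-$2" \
  | sort -u
```

The grep pattern pulls every quoted https URL ending in .xml out of the page source, so it depends only on curl and standard tools.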

@jongough
Contributor Author

I have just created a git workflow action that will recreate the ocpn-plugins.xml file on each successful merge. Users will then only need to add/remove versions of their plugins and push them to this repository. When the pull request is merged successfully, the ocpn-plugins.xml file will be recreated.

Hopefully this will alleviate the issue of conflicting pull requests.

@rgleason
Contributor

That makes good sense to me!

@rgleason
Contributor

Hakan, about Kees' script:

    Rick..
    I haven't been enough involved in the plugin download procedure to understand what you want.
    All things for radar_pi are here: https://github.com/opencpn-radar-pi/radar_pi/tree/ci
    Is there anything you can use?
    Håkan

I looked, didn't find the script. I guess I dreamt it.

@leamas
Contributor

leamas commented Oct 14, 2020

The basic problem here is that each plugin developer holds a copy of all files in metadata/, right? And while I work on one plugin, other devs might change the metadata/ files, leading to merge conflicts.

In normal cases we should not have two plugin developers working on the same files in metadata/; there is basically one dev for each plugin, right? IMHO, getting this to work smoothly is about applying a standard git workflow.

The first thing to recognize is that ocpn-plugins.xml is a derived file; we can always reconstruct it from the files in metadata/. Such files should not be part of the git repo. So a first step would be to remove ocpn-plugins.xml from the repo and add it to .gitignore. This would effectively stop all conflicts in that file, for sure.
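That removal step could look like this (a sketch; run at the top of the plugins repo):

```shell
#!/bin/bash
# Sketch: untrack the derived file and ignore it from now on.
git rm --cached ocpn-plugins.xml        # removes it from git, keeps the file
echo 'ocpn-plugins.xml' >> .gitignore
git add .gitignore
git commit -m "Drop derived ocpn-plugins.xml; regenerate from metadata/"
```

`--cached` is the key detail: the working copy stays on disk, so a locally generated ocpn-plugins.xml still works, but git stops tracking (and conflicting on) it.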

Next is the workflow. A plugin dev could always work like this:

    $ git clone https://github.com/OpenCPN/plugins
    $ cd plugins

    -->  do work. During this time the metadata/ files in the repository will change as other
           devs merge changes to metadata. When it's time to make a PR:

    $ git remote update origin
    $ git rebase  origin/master
    --> This will always work without conflicts as long as each dev works on her own plugins.
    $ git push origin master:my-pr-branch

    -->  make a PR. If merged at once, no problem. If other PRs come in between, the PR
           must be rebased before it can be merged:

    $ git remote update origin 
    $ git rebase origin/master
    $ git push -f origin master:my-pr-branch

As long as each dev only changes her own plugins, there are no conflicts in this flow. OTOH, if two devs change the same file in metadata/, that is a real conflict which has to be resolved. But in the end this is a workflow issue; it shouldn't really happen.

To summarize:

  • We should remove ocpn-plugins.xml from the git repo, thus resolving the conflicts in this file once and for all.
  • There should be an understanding of which dev changes which files in metadata/.
  • There is no need for new tools; git is designed for this kind of workflow.

leamas pushed a commit to leamas/plugins that referenced this issue Oct 14, 2020
@rgleason
Contributor

Alec, your point about two developers interfering with each other has occurred to me, but not in real life yet. Jon has encouraged us to use a different process which utilizes a personal branch, say "rg-master", created from upstream master, and then to commit and push that.
https://github.com/rgleason/GPS-Odometer/blob/master/Read-Build.md#use-a-manual-process-not-using-smartgit (down at "Use a manual process").

@leamas
Contributor

leamas commented Oct 14, 2020

Well, this whole thread is about that the process is broken due to conflicts...

Using your own branch does not really change anything in this respect. The thing is that this downloading of metadata files, with or without support scripts, could and should be avoided. A simple 'git rebase' does the same thing, but is much more efficient and reliable.

You could apply the workflow above on a separate branch; it's a no-brainer. Likewise, there are variants if you intend to merge to the Beta/Alpha branches. In any case there is no need to download stuff or to use any scripts; git is fully capable of handling this, as well as much more complicated cases.

@leamas
Contributor

leamas commented Oct 14, 2020

OK, my bad: of course you need to download the metadata files which have been updated; this is how your process works.

@rgleason
Contributor

I am not that familiar with git rebase, and I assume "origin" is your personal remote repository (like github.com/rgleason).
I can try this alternative process at some point, and once I am comfortable with it I will add it to the Readme-Build.md.
I agree that using your own branch does not change much; Jon wanted that done because we were having problems when the master branch was used.

The difference is that you clone and delete a local repository (master branch) versus creating and deleting a branch.

@leamas
Contributor

leamas commented Oct 14, 2020

You only clone once, at project start. The repo should then be used for as long as the project exists.

A branch can be created at any point, but should not be deleted until the PR is merged upstream. The 'rebase' operation effectively makes it work against whatever changes have been done in the upstream repo.

Jon's two remotes upstream and origin are sound (and standard). I tried to keep the description here simple.
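For reference, that two-remote setup could be sketched like this ("origin" is your fork; the rgleason fork URL is an assumption for illustration):

```shell
#!/bin/bash
# Sketch: clone your fork as "origin", add the shared repo as "upstream",
# then replay your work on top of the latest upstream before making a PR.
git clone https://github.com/rgleason/plugins
cd plugins
git remote add upstream https://github.com/OpenCPN/plugins
# When it's time to make a PR:
git fetch upstream
git rebase upstream/master
git push -f origin master:my-pr-branch
```

The rebase replaces the download-and-redo cycle described earlier: git replays local commits onto whatever metadata/ changes other devs have merged in the meantime.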

bdbcat added a commit that referenced this issue Oct 14, 2020
Drop ocpn-plugins.xml from git repo (#130).
@rgleason
Contributor

Can this be closed now?

@bdbcat
Member

bdbcat commented Aug 26, 2021

Yes

@rgleason
Contributor

rgleason commented May 2, 2023

Jon, can you please close this now? I can't. Have linked to it in OpenCPN/OpenCPN#3183
