
Move wiki to github pages #1826

Merged
merged 14 commits on Mar 28, 2021

Conversation

Member

@wernerfred wernerfred commented Feb 22, 2021

Description

The decision was made to pick MkDocs. See the current version of the docs on the gh-pages branch of the PR:
https://wernerfred.github.io/docker-mailserver/

Open tasks (detailed description in this comment):

Will be done in further PRs by @polarathene and me:

  • (Assigned: @polarathene ) Add custom CSS required to fix a UX bug that upstream is unwilling to resolve.
  • (Assigned: @polarathene ) Document relevant information related to this PR feature
  • (Assigned: @wernerfred ) Improved linting for docs content. markdownlint or similar

Done:

  • (Assigned: @wernerfred ) Updating references from content external to the docs so they point at the docs (README.md, DockerHub).
  • (Assigned: @polarathene ) Final interactive rebase before we merge the PR. Trimming out redundant/minor commits.
  • (Assigned: @wernerfred ) The GitHub Wiki will have all pages removed except the main page, which will temporarily remain for some duration of time to point existing users to the new docs site.
  • (Assigned: @polarathene and @wernerfred ) Add version with the versions.json approach and custom workflow

Fixes #1825

Type of change

  • New feature (non-breaking change which adds functionality)
  • Improvement (non-breaking change that does improve existing functionality)
  • This change requires a documentation update

Checklist:

  • My code follows the style guidelines of this project
  • I have performed a self-review of my own code
  • I have commented my code, particularly in hard-to-understand areas
  • I have made corresponding changes to the documentation (README.md or ENVIRONMENT.md or the Wiki)
  • If necessary I have added tests that prove my fix is effective or that my feature works
  • New and existing unit tests pass locally with my changes

@wernerfred wernerfred added priority/low kind/improvement Improve an existing feature, configuration file or the documentation area/documentation labels Feb 22, 2021
@wernerfred wernerfred self-assigned this Feb 22, 2021
@wernerfred
Member Author

@polarathene you should have collaborator permissions on the PR repository

@polarathene
Member

Seems we created a similar PR at roughly the same time 😛

I've got the docs migrated and have documented the process to do so. There's an additional branch I have up with a Github Action figured out for deploying to Github Pages, as requested :)

I could switch to MkDocs if you like, but I think Docusaurus seems like a pretty good option; we can weigh up the choices. I see you created an issue about this, so I'll take a read of it over there and continue the discussion on the related issue.

.github/workflows/documentation.yml (outdated review threads, resolved)
@polarathene
Member

Just to clarify, docs from the wiki will be migrated to the repo separately from this PR, correct? And the PR will rebase onto that when ready.

My day has just started, so I'll be focusing on tackling some concerns and how to handle those. Once I've resolved that, I'll have a better idea of project structure and will commit the docs to that structure (which we can all agree on first of course) and PR that through.

Afterwards, I'll collab with you on this :)

@wernerfred
Member Author

wernerfred commented Feb 24, 2021

Currently my plan was to have a structure like the one you proposed:

├── docs
│   ├── content
│   │   └── home.md
│   └── mkdocs.yml

The current wiki then needs to live inside docs/content. (content can be renamed of course if necessary)

I was thinking of you pushing the wiki with your commands that keep the history onto this PR. This way we can build and see the generated sites already during development. But we can of course also take a different approach.
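For future reference, a history-preserving migration along those lines could be sketched like this. The paths, remote names, and use of git-filter-repo are my assumptions for illustration, not necessarily the exact commands used for this PR:

```shell
# Sketch only: assumes git-filter-repo is installed and the wiki remote
# URL follows GitHub's usual <repo>.wiki.git convention.

# 1. Clone the wiki as a standalone repository:
git clone https://github.com/docker-mailserver/docker-mailserver.wiki.git wiki
cd wiki

# 2. Rewrite its history so every file lives under docs/content/:
git filter-repo --to-subdirectory-filter docs/content

# 3. Merge the rewritten history into the working branch of the main repo:
cd ../docker-mailserver
git remote add wiki ../wiki
git fetch wiki
git merge --allow-unrelated-histories wiki/master
```

With this kind of approach, git blame on the migrated files still points back to the original wiki commits.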

My day ends now and I will think overnight on how to proceed further. I'm looking forward to read your thoughts tomorrow :)

@polarathene
Member

polarathene commented Feb 24, 2021

The current wiki then needs to live inside docs/content. (content can be renamed of course if necessary)

That location seems fine. I'll chime in when I've completed my investigation of adding extensions. We'll likely want to keep setup easy for contributors, thus a separate repo to maintain the Dockerfile and push to DockerHub is probably going to be best. The docs and any other files that can be mounted via --volume can remain in this repo of course.

If using Docker to bundle our extensions and maintain an image for it on DockerHub is going to happen, we may just want to adopt that for the Github Actions workflow too? (simplifies maintenance and keeps contributor and CI docs workflow more consistent)

I was thinking of you pushing the wiki with your commands that keep the history onto this PR. This way we can build and see the generated sites already during development. But we can of course also take a different approach.

I should be done with my investigation by the end of the day. mermaid.js and awesome-pages are the two I want to verify setup for.

  • awesome-pages would change how we approach the nav config for this, thus it's good to get that out of the way early.
  • mermaid.js is complicated with quite a few gotchas. From what I understand, without the $10/month subscription to the MkDocs Material Insiders edition, besides losing some minor perks, config setup is a lot more work: there's a separate extension with its own list of issues and setup to be aware of, and another integration that details its own config compatibility concerns (superfences, which MkDocs Material bundles by default).
  • Some other extensions can require providing custom CSS to actually render correctly (eg ProgressBar, but that's not officially supported by MkDocs, nor is that particular extension something we need afaik). I'll be looking at a few others if time permits, such as versioning docs support via mike which is a nice to have but not essential.

In the meantime, feel free to experiment on a different branch/repo like you and I have been doing, or even locally; the squidfunk/mkdocs-material docker image is very easy to set up locally and play with:

# Runs `mkdocs serve` by default, but you can pass in a command such as `build` and it'll forward that to `mkdocs` to run within the container:
docker run --rm -it -p 8000:8000 -v ${PWD}:/docs squidfunk/mkdocs-material

# eg: docker run --rm -it -p 8000:8000 -v ${PWD}:/docs squidfunk/mkdocs-material build --strict

I'd rather the large 488-commit history not be mixed into this PR's commit history for review. You can use an interactive rebase or squash/remove commits if necessary to clean up the commit history. I can assist with this when the time comes, if needed and you prefer I handle it :)

@polarathene
Member

My day ends now and I will think overnight on how to proceed further.

And now my day ends :)

We'll need a Dockerfile. It can start off in this repo if you like, in the docs/ directory discussed here beside mkdocs.yml, or we can have it in a separate repo; up to you, and whether you think we should publish it as an image for contributors to pull or just build locally.

I'll convert history of the wiki files to this location tomorrow, along with folder hierarchy matching the current wiki.

@wernerfred
Member Author

Sadly I was busy the whole day, so I can only provide further thoughts and no actual work.

One question: Do we really need the extension you mentioned? Imho it is way easier to use the official unmodified image for the workflow as well as for local testing by contributors. It also avoids the maintenance burden of a separate repo/image for the modified one.
What do you think of starting with the plain mkdocs-material and seeing how far we get? If we see that we need those plugins, we can still create a separate docker image for that.
Don't get me wrong - I admire your energy and work, but the goal of this whole project is simplicity. And I would like to keep this even in the docs.

So my proposal:

  1. Finish your PR and move the wiki content to /docs/content
  2. We rebase this PR to have the docs included
  3. We work out a proper mkdocs.yml and define a proper workflow/action
  4. See the result/deployment
  5. Tweak the result until we are satisfied (move pages / rename / rework / ...)
  6. Decide whether we need further plugins/integrations
  7. Write proper documentation on how to use/contribute to the new wiki solution

Tomorrow (Friday) I will be very busy, but over the weekend I will respond to your thoughts and start working.

@polarathene
Member

One question: Do we really need the extension you mentioned? Imho it is way easier to use the official unmodified image for the workflow as well as for local testing by contributors. It also avoids the maintenance burden of a separate repo/image for the modified one.

awesome-pages isn't necessary, but it came up in the discussion via @casperklein . We don't necessarily need it and could specify every single doc in the mkdocs.yml nav section, but it does seem like a nice-to-have.

mermaid.js isn't strictly necessary either; we have a github wiki solution atm that relies on a third-party site returning an image, but the quality is not as good, and if theme/styling were important, we'd lack that for this content too. Although, we could probably save the SVG content and just serve that instead.

If there are any other extensions worth adding, we'd need to do this anyway, so it might be nice to have in place early. I don't mind doing the work for that. As it is, we'd still use the upstream Docker image; we'd just have a minimal Dockerfile that extends it with a single RUN line/layer to add/install the extensions.

That can either be distributed via DockerHub or another registry (Github has its own one, I think, and there should be an existing Github Action for deploying to it; I haven't looked into how this project presently manages pushing to DockerHub). Build/deployment can be automated all the same; I can have the upstream image monitored and let CI/CD take care of maintenance on that front.

Alternatively, we just keep a Dockerfile in docs/ here, and the contributor who already has docker runs docker build -t mkdocs-docker-mailserver . before the docker run command; barely any additional work on their part, and lower maintenance and setup for us.

  • If going that route, the Github workflow can build that same docker image,
  • or pull the existing upstream one and just run exec commands (or an entrypoint change that installs the extensions),
  • or, without docker, use native shell commands like you had set up (but this would be out of sync with the contributors' workflow).

I don't personally see it as much of a complication/overhead to include the extensions. If you want to avoid them and their value, that is fine, but should you need extensions in future, someone else may need to figure out how to set them up; as it stands right now, mermaid.js can be confusing due to the various implementations to choose from and compare, with their different setup steps and gotchas. If I contribute this, that decision has been taken care of and evaluated, and I'll clearly document it in the new docs contributor section regarding our image. My availability for the project in the future will likely be limited.

If versioning docs is of value to you, this is also something I was planning to investigate and set up, so that future breaking changes with major versions of the image could be matched to their docs more easily, without sifting through git history directly. The docs would have a drop-down or URL change for the version and you'd be good, AFAIK.

If the docker-mailserver organization would not like to manage the image, I can do so with my own account. I don't expect it to require much maintenance once it's set up.

What do you think of starting with the plain mkdocs-material and seeing how far we get? If we see that we need those plugins, we can still create a separate docker image for that.

They're not necessary; they just add additional value. Flow charts can be better presented, and the docs nav is easier to manage (though that loses some relevance if we hard-code the entire nav from the get-go).

We can avoid the extensions for now if you prefer. I was mostly exploring them to make sure I was aware of the requirements, should we desire them at some point, and how that might impact the structure of the docs/ directory for the mkdocs-material image.

Don't get me wrong - I admire your energy and work but the goal of this whole project is simplicity. And I would like to keep this even in the docs.

That's fine. Things often seem simple to me, but I do find myself biting off more than I can chew at times :)

Personally it doesn't seem like too much of a complication. I'm not sure how comfortable you are with Docker or the others, nor web development.

# Would ideally be version pinned instead of using `:latest`
FROM squidfunk/mkdocs-material:latest

RUN pip install --no-cache-dir \
  mkdocs-awesome-pages-plugin \
  mkdocs-mermaid2-plugin

# mermaid2 is only needed if using that plugin
# Alternative mermaid.js approach can volume mount or copy some JS
# `mkdocs.yml` would set `theme.custom_dir: theme`, and `extra_javascript: theme/assets/javascripts/some-script.js`
# COPY build-files/some-script.js theme/assets/javascripts/some-script.js

# Regular image entrypoint is `mkdocs` binary, `docker run` commands get appended to that
# Otherwise default command for it is `serve --dev-addr=0.0.0.0:8000`

# docker build -t docker-mailserver-mkdocs .
# docker run --rm -it -p 8000:8000 -v ${PWD}:/docs docker-mailserver-mkdocs
# alternatively, override the upstream ENTRYPOINT and CMD without needing a custom image:
# docker run --rm -it -p 8000:8000 -v ${PWD}:/docs --entrypoint /bin/sh squidfunk/mkdocs-material -c "pip install mkdocs-awesome-pages-plugin && mkdocs serve"

It's basically two lines, and we probably don't need that mermaid2 plugin, nor awesome-pages if we don't use it. I included some alternatives and other related info in those comments in case it's useful. Then it's just some tweaks to mkdocs.yml to use either extension.

You can also run a build locally of the docs, it'll output to ./site by default. If you want to test this for any differences against mkdocs serve, like if we had deployed to Github Pages, you don't have to wait on CI and pushing changes, just spin up a web server container to serve the static generated site like so:

docker run --rm -it -p 8001:80 -v ${PWD}/site:/usr/share/caddy/ caddy

Caddy is a simple yet great web server. The default config of that image is setup for local dev and you can access it at http://localhost:8001/<nav url name> (eg: http://localhost:8001/Understanding-the-ports). There's probably little value of serving the static site locally for you if you've already got mkdocs-material locally doing mkdocs serve.

@polarathene
Member

polarathene commented Feb 26, 2021

Docs are ready for merging on my other PR; once that is done and this PR rebases onto it, it should build fine. I have a branch already available for a live demo here (future readers: this demo will be removed after the PR). I've got some commits to cherry-pick over once the rebase happens that will update the docs further.

.github/workflows/documentation.yml (outdated review threads, resolved)
@wernerfred
Member Author

wernerfred commented Feb 26, 2021

I haven't looked into how this project presently manages pushing to DockerHub

Pushing to ghcr is only a matter of logging in to GitHub and pushing to the registry as usual, with the image name prefixed by ghcr.io/
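Spelled out, that amounts to roughly the following; the image name and token variable here are placeholders, not the project's actual setup:

```shell
# Authenticate against the GitHub Container Registry:
echo "$GITHUB_TOKEN" | docker login ghcr.io -u USERNAME --password-stdin

# Tag the locally built image with the ghcr.io/ prefix and push it:
docker tag mkdocs-docker-mailserver ghcr.io/docker-mailserver/mkdocs:latest
docker push ghcr.io/docker-mailserver/mkdocs:latest
```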

If docker-mailserver organization would not like to manage the image, I can do so with my own account. I don't plan to require much maintenance for it once it's setup.

I would like to ask @aendeavor for his opinion. Imho we have 3 options:

  1. Maintain and publish the image in a 3rd-party repo (e.g. by @polarathene)
  2. Maintain and publish the image in docker-mailserver in a separate repository
  3. Don't publish an image and only keep the Dockerfile in /docs, using docker build during CI and locally

I'm not sure how comfortable you are with Docker

I think there is not much I don't know about Docker and containers. But of course that might not be true for everyone.

I have a branch already available for live demo here.

Really nice. I really like the layout. Thanks for your effort and time!

@polarathene
Member

Presently we're not in need of a custom docker image, since I hard-coded the nav fully anyway. Github Wiki contributors sometimes had to modify the sidebar menu file, just as they now would the nav entry in mkdocs.yml.

There is a feature request for awesome-pages to support a mode that detects docs which aren't specified in the nav when no ... (spread/rest operator) exists in the nav to indicate where implicit pages should go. Otherwise such pages won't show in the built sidebar unless the contributor updates mkdocs.yml, or unless the current nav were made more relaxed for awesome-pages.

The only other extension possibly worth having atm is one to show the last git revision (commit) date for a doc page, but this isn't a major need. Probably still worth discussing the options if we ever need a custom Docker image. awesome-pages is a mere ~300KB of added weight, and was in mkdocs-material prior to version 6.0, but I've not been able to convince the maintainer to re-include it.
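For context, enabling such a plugin would only be a small mkdocs.yml change; a hypothetical sketch (the plugin name and options here are from memory and would need verifying):

```yaml
# Requires e.g.: pip install mkdocs-git-revision-date-localized-plugin
plugins:
  - search # re-declare the default when overriding `plugins`
  - git-revision-date-localized:
      type: date # show a plain "last updated" date per page
```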

@georglauterbach
Member

I would like to ask @aendeavor for his opinion. Imho we have 3 options:

1. Maintain and publish the image in a 3rd-party repo (e.g. by @polarathene)

2. Maintain and publish the image in docker-mailserver in a separate repository

3. Don't publish an image and only keep the Dockerfile in `/docs`, using `docker build` during CI and locally

I haven't read through the details here, therefore I will keep my answer rather short: if it was up to me, I would pick 2) or 3). It's not that I think @polarathene isn't suited; I just feel like this organization is the right place to put everything related to this project. I mean, that's its sole purpose. This way, we can make sure everything is in one place and better maintainable.

Ultimately, I would leave it up to you two to decide, since you're the driving forces, but now you've heard my opinion :)

@wernerfred I can grant you more privileges if you need them; just reach back to me.

@wernerfred
Member Author

  • I'll be looking at a few others if time permits, such as versioning docs support via mike which is a nice to have but not essential.

I would appreciate mike very much! As far as I can see it is included in mkdocs-material by default and is not a sponsors-only feature. So we should definitely start using mike imho.

I had another option in mind: the Traefik project builds documentation during PRs and publishes it (might be using mike) with a pr# label/version, so maintainers can look at the generated docs without needing to clone and build/serve locally. We could look into whether that is feasible with mike too. Would be nice. One thing actually might speak against it: we definitely do not want to have the PR versions in the official docs version dropdown. But maybe you have an idea to solve this. Otherwise we can postpone it, as it has low priority.
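For reference, mike's day-to-day usage is only a handful of commands; a rough sketch (the version labels here are made up for illustration):

```shell
pip install mike

# Build the current docs and publish them to the gh-pages branch
# under the version label "1.0", also aliased as "latest":
mike deploy --push --update-aliases 1.0 latest

# Make "latest" the version users land on by default:
mike set-default --push latest

# Show which versions have been published:
mike list
```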

@wernerfred
Member Author

I really like your changes so far 👍🏻
Is there anything we can do yet without the *.md files?

Next steps will be:

  1. merge your PR that moves docs to /docs/content
  2. publish to gh-pages of this PR/repo and look for problems/display errors
  3. Modify layout / rearrange sections if necessary
  4. Update all wiki references to new docs references

@polarathene
Member

So we should definitely start using mike imho.

From what I've read, you are advised not to retrofit mike but to start with it. I will look into it once the docs are merged and I've cherry-picked over my changes thus far.

I had another option in mind: the Traefik project builds documentation during PRs and publishes it (might be using mike) with a pr# label/version, so maintainers can look at the generated docs without needing to clone and build/serve locally.

I'll have to take a look, but I don't think mike is used for that by Traefik. They're probably deploying a staging preview, which Netlify and Vercel make rather easy; I'm not sure whether Github Pages supports such a thing.

Someone with authority would need to connect the organization to a service like Vercel for this. If so, I can probably set something up and use the "transfer" feature to provide a working setup; otherwise I can relay instructions.

One thing actually might speak against it: we definitely do not want to have the PR versions in the official docs version dropdown. But maybe you have an idea to solve this. Otherwise we can postpone it, as it has low priority.

It's low priority for me atm over other doc tasks.

My understanding is that with a service like Vercel, you get branch deploys/previews, aka a staging deploy. mike and the version drop-down would be a separate thing. I didn't have luck getting Vercel and Docusaurus working together when I changed to Github Pages, as the base URL changed; I think that was an issue specific to Docusaurus though.

So rather than the github.io URL, you'd get vercel.app or similar with a random temporary subdomain just for PR purposes. I'd also need to look into it only triggering when docs are actually touched in a PR.


merge your PR that moves docs to /docs/content

Looking forward to this, but I want to ensure proper history is retained if possible :)

As in, I should be able to git blame a doc and see previous edits, not all linked to the PRs list of commits.

publish to gh-pages of this PR/repo and look for problems/display errors

👍

I know presently that:

  • Headings are a bit inconsistent or invalid (affects ToC generation on the top right, and typography).
    • The doc page title H1 logic is a bit messy. I think the default is to use the file name, then for:
      • The nav, it'll also accept an H1 in the MD file, or give priority to a label assigned in the mkdocs.yml config nav section. Otherwise I think it will also use the front-matter title if set.
      • The document header seems to use the nav label (at least with the Material theme), or an H1 if provided; both have priority over the front-matter title.
      • The page's tab/window title gives priority to the nav label, or, if available, the front-matter title takes the highest priority here.
  • Some code fences are missing a language for syntax highlighting. Some are specified but not recognized (I can fix that, and did so for yml to yaml).
  • Any usage of <details> has some compatibility issues and needs to use the ??? collapsible admonition syntax (with space-indented subsequent lines) instead.
  • Lists presently don't render correctly if they don't start on a new line. Github allows this, and there is a way to support it with MkDocs, but it requires adding a plugin, which means we'd need a custom Docker image; otherwise we fix the lists and enforce it via a markdownlint rule.
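To illustrate the <details> point above, the conversion to a Material-style collapsible admonition (via the pymdownx.details extension) looks roughly like this:

```markdown
<!-- GitHub-flavored markdown (works in the wiki, not in MkDocs): -->
<details>
<summary>Example</summary>

Hidden content here.
</details>

<!-- MkDocs Material collapsible admonition equivalent: -->
??? example "Example"

    Hidden content here (subsequent lines need a 4-space indent).
```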

Modify layout / rearrange sections if necessary

I've not given much thought to this at present, other than that I had to remove the duplicate FAQ link (one was under Config near Debugging, another under Tutorials), as mkdocs didn't seem to like this: unless the document was duplicated, it would just redirect to the other one (which in my current setup broke nav for the Tutorial section).

Update all wiki references to new docs references

I've been doing this already on my end. I've added front-matter metadata to each md file in one commit, and am presently going over the links for another commit to cherry-pick over.

Additionally, some URLs are provided without markdown syntax, so they did not create a clickable link. I've fixed that with the MagicLink plugin (but we need to make sure all headings start with a space, which probably requires an MD rule enforced by the linter workflow). There's also some wiki link syntax used, as Github Wiki supported that; I'm addressing those when I spot them (but leaving the raw URLs as-is with MagicLink for now).
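As an aside, MagicLink is enabled through markdown_extensions in mkdocs.yml; a hypothetical snippet (the options shown are from memory and worth double-checking against the PyMdown Extensions docs):

```yaml
markdown_extensions:
  - pymdownx.magiclink:
      # Auto-link bare URLs without displaying the protocol prefix:
      hide_protocol: true
      # Shorten GitHub issue/PR/user references for this repo:
      repo_url_shortener: true
      user: docker-mailserver
      repo: docker-mailserver
```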

After that we're good for collab, although we should probably take ownership of some tasks or sections/pages while we work on them to avoid duplicate work :)

@polarathene
Member

polarathene commented Feb 27, 2021

Is there anything we can do yet without the *.md files?

There's the workflows being tweaked here, and any plugin-related stuff. If you're not familiar with the MkDocs Material docs or PyMdown Extensions, you could go have a read through those and learn about the features.

I forked my docs PR to work on a branch locally and for Github Pages workflow testing for this purpose. You could investigate mike, but seems like we'll have the docs merged upstream soon enough.

Markdown linting rules should probably be addressed, as well as a link checker.


The build uses --strict, which will raise an error on invalid local document links (but these need to be specific files, or *.md only, AFAIK); external URLs are ignored. It's possible that users might try to link via the GH-Pages URL too (which may match the folder hierarchy, but replaces .md with /). Additionally, if links start with a root-level path (eg /config/setup.sh.md instead of config/setup.sh.md or ../config/setup.sh.md, as links are relative to the document, not the base path), these won't be caught either, and will be invalid with the .md suffix, which is no longer retained on the deployed site.

I don't know how either link checker (build time or post-deploy) handles non-existent files like setup.sh; there might be exceptions for which extensions are treated as files (such as images), while others may just be treated as a directory/path. Additionally, if doing a link check on the built site, we could run it either against Github Pages, or within CI by spinning up a web server to test against.

This helps address a concern about Github rate limiting, which only allows around 1k requests within an hour (we probably won't hit that, but if we're testing this on pull requests with someone frequently pushing commits on an open PR, it might be possible), as we can limit the internal URL requests at least.

For external URLs, another approach is to detect only new/modified URLs to check, eg by utilizing a diff action, but this complicates the workflow a bit. Otherwise a scheduled daily workflow could check the current site deployment, and any invalid URLs it notices could be reported via an automated issue to notify us.
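A scheduled external-link check along those lines could look roughly like this; the specific action and its options here are assumptions for illustration, not a vetted setup:

```yaml
name: 'docs: scheduled link check'

on:
  schedule:
    - cron: '0 4 * * *' # once a day

jobs:
  linkcheck:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Check external links in the docs sources
        uses: lycheeverse/lychee-action@v1
        with:
          args: --verbose 'docs/content/**/*.md'
```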

I've done a little research into this, as you can tell, and I'm aware of a few Github Actions / tools we could leverage, depending on what sort of coverage we would like for maintaining working URLs in the docs. As I update existing ones, I notice not only the many tomav/docker-mailserver links, but also some pointing into the direct commit history of files (which is technically good, since the referenced line would likely have changed long ago, but should probably be updated to a newer commit reference), and some wiki links referring to pages that don't exist (renamed) or sections that no longer exist (future edits).

EDIT: Just adding a note that --strict doesn't pick up on section links if the file referenced is valid but the section is not (eg aliases.md#notes instead of accounts.md#notes)


@wernerfred
Member Author

wernerfred commented Feb 28, 2021

#1827 is merged. git blame shows every change, but can you explain to me why the history only shows the last commit? Just curious.

EDIT: now there are a few more shown. Maybe the GitHub GUI needs some time to index that properly?

@polarathene
Member

can you explain to me why the history only shows the last commit? Just curious.

A merge commit signifies where the merge occurred in the target branch, AFAIK. Without it, git would have to rewrite the entire commit history, which would be bad for an open-source project with many collaborators, where force pushing to master is risky.

A rebase merge can avoid needing the merge commit, as it places all the PR's commits onto the target branch above all other existing commits. This can be handy sometimes, but rebasing is more commonly done while working on a PR.

A squash and merge is kind of a mix, where all commits are bundled into a single commit and merged into the target branch. This is usually the desirable choice, since an entire feature is represented as a single commit on the target branch, and contributors don't have to adhere to conventional commits if upstream uses them; their commit history is only relevant prior to merge, where the maintainer can ensure master commits adhere to the conventional commit style if they prefer.
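The three strategies can be reproduced in a throwaway repository; a self-contained sketch (all repo, branch, and commit names are made up for illustration):

```shell
set -e
tmp="$(mktemp -d)"
cd "$tmp"
git init -q -b main demo && cd demo
git config user.email demo@example.com
git config user.name demo
git commit -q --allow-empty -m 'initial'

# Simulate a PR branch with one commit:
git switch -qc feature
git commit -q --allow-empty -m 'feature work'
git switch -q main

# 1. Merge commit: keeps the branch point plus an extra merge commit.
git merge --no-ff -m 'Merge branch feature' feature

# 2. A rebase merge would instead replay the feature commits on top of
#    main (run on the feature branch): git rebase main
# 3. A squash merge would stage everything as one commit:
#    git merge --squash feature && git commit -m 'feature (squashed)'

git log --oneline --graph
```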

Perhaps a visual would help?

[screenshot: commit graph]

If we follow the pink commit tree down, we'll find more recent commits related to it:

[screenshot: commit list]

If I click one of those commits and then a file that was modified, I can view the history for the file:

[screenshot: file history view]

Or it's git blame view:

[screenshot: git blame view]

EDIT: now there are a few more shown. Maybe github gui needs some time to index that properly?

Quite possibly. Github also has limited history tracking if a file is moved/renamed. A desktop GUI like the one I've shown above (GitKraken) doesn't have that issue, and is fast since it only has to serve me, rather than optimize for serving as many users at a time as Github does.

@polarathene
Member

Safe to rebase this branch onto master?

@wernerfred
Member Author

wernerfred commented Mar 25, 2021

If you still have a local copy of your master branch, you should be able to verify that I've not sneaked anything in with the rebase + force-push 😅

Omg no, that is not my intention. I just wanted a bit more time to go over the gh-pages build preview once again and look for problems :)

I marked the PR as ready. I would like to have a review from @docker-mailserver/maintainers and then merge it myself.
@polarathene a merge commit or a rebase?

I will see if the wiki is lockable. I will also update the top-level description accordingly.

EDIT:

I kinda locked the wiki right now:
[screenshot]

@wernerfred wernerfred requested a review from a team March 25, 2021 07:30
@@ -1,6 +1,6 @@
 # Contributing
 
-This project is Open Source. That means that you can contribute on enhancements, bug fixing or improving the documentation in the [Wiki](https://github.com/docker-mailserver/docker-mailserver/wiki).
+This project is Open Source. That means that you can contribute on enhancements, bug fixing or improving the [documentation](https://docker-mailserver.github.io/docker-mailserver/edge).
Member

Does the edge here mean that the source of the pages would be a branch called edge? Wasn't it only the tag on the Docker Hub that is called edge? The branch that this pr points to however is still master.

Member

edge just reflects the state of the docs on the master branch. It's equivalent to master/latest, which is what edge-tagged docker images are meant to reflect, AFAIK?

It can be whatever label the maintainers prefer; I assume edge was chosen to match the Docker edge tag, since the project itself is Docker-focused and the docs are for those releases?

When a new version is tagged, it'll build the docs from their state in the tagged commit and that will be selectable from a drop-down menu, allowing us to retain old docs as breaking changes and deprecations are introduced.
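As a rough illustration of the versioned-docs approach described above, the deploy workflow can maintain a versions.json file that the drop-down menu reads from. The schema below is an assumption (modeled on the common mike/mkdocs-material format), not copied from this PR:

```shell
# Hypothetical versions.json feeding the docs version drop-down.
# Field names are assumptions (mike-style), not taken from this PR.
cat > versions.json <<'EOF'
[
  { "version": "edge", "title": "edge", "aliases": [] },
  { "version": "v9.0", "title": "v9.0", "aliases": ["latest"] }
]
EOF

# Sanity-check that it parses and list the selectable versions
# (python3 is used here only as a portable JSON reader):
python3 -c 'import json; print([v["version"] for v in json.load(open("versions.json"))])'
```

Each tagged release would append one entry, while the edge entry keeps tracking master.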

@polarathene
Member

@polarathene a merge commit or a rebase?

I don't think it matters either way? Just don't squash merge thanks! 😄

AFAIK, the main difference is where the commits are placed in the target branch history.

  • Rebase will just place all these commits together on top of the current master branch commit.
  • Merge commit places a separate merge commit instead; you can visualize the commit the branch was based off, and the merge commit links back to the actual PR, which a rebase merge lacks.

[screenshot: Screenshot_20210325_224451]

Which is less visible in the GitHub history view, but you can see how the commits are listed interleaved by time:

[screenshot: Screenshot_20210325_224626]

Despite being a series of commits in a single PR:

[screenshot: Screenshot_20210325_224810]

I assume that's a merge commit. As the commits for this PR are rebased, it seems they'll all be batched together either way. A merge commit might be better given the above context?

@georglauterbach georglauterbach mentioned this pull request Mar 25, 2021
fi
}

_update-versions-json
Member

This script looks fine, but I would like to see one of two things changed:

  1. Remove the function completely as it is not really needed
  2. Use
function _update_versions_json
{
...

Really a minor thing, but I figured why not :D

Member

My original version had a few functions, but that seemed unnecessary. Probably best to remove the function wrapper. I can commit that in this PR or in a follow-up; we mostly just want to get this merged due to the sheer size of it and the discussion thread becoming more difficult to manage.

There's still more to do for the docs, but the migration of the wiki is in good shape.
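To make the two options concrete, here is a minimal sketch of both styles under discussion (the echo body is a placeholder; the real script's contents are not reproduced here):

```shell
# Option 2 from the suggestion above: wrap the logic in a named function.
# Note: the `function` keyword spelling is a bash-ism; the portable POSIX
# form shown here is `name() { ... }`.
_update_versions_json() {
  echo 'updating versions.json'
}
_update_versions_json

# Option 1: drop the wrapper entirely and run the commands at the top level
# of the script, which is what was ultimately preferred for a script this short.
echo 'updating versions.json'
```

Both print the same line; the only difference is whether a single-use function adds anything for readers of the script.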

Member

@georglauterbach georglauterbach left a comment

Absolutely huge PR. I'm certain there's always some more tweaking and stuff possible, but this really LGTM. 👍🏼

@wernerfred With this and #1866 merged, I think it's safe to consider releasing v9.1.0. Technically, only minor updates happened, but overall this was a huge step for the project. I leave the honors of releasing to you :D

@polarathene
Member

I think it's safe to consider releasing v9.1.0.

This PR doesn't need to be part of a tagged release. I don't think it would have any issues with #1866 being merged earlier; we should be able to rebase over that without any troubling conflicts.

Alternatively the version tag trigger for deploying version docs could be disabled if needed.

@georglauterbach
Member

I think it's safe to consider releasing v9.1.0.

This PR doesn't need to be part of a tagged release. I don't think it would have any issues with #1866 being merged earlier; we should be able to rebase over that without any troubling conflicts.

Alternatively the version tag trigger for deploying version docs could be disabled if needed.

Alright. Then we can go ahead and merge #1866 and release v9.0.2. The changes since v9.0.1, without this, do not justify a new minor version, rather a patch (speaking in SemVer terms). But I'd leave it up to you @wernerfred :)

@casperklein
Member

Alright. Then we can go ahead and merge #1866 and release v9.0.2. The changes since v9.0.1, without this, do not justify a new minor version, rather a patch (speaking in SemVer terms). But I'd leave it up to you @wernerfred :)

It should be v9.1.0 😉 => MAJOR.MINOR.BUGFIX

@georglauterbach
Member

Alright. Then we can go ahead and merge #1866 and release v9.0.2. The changes since v9.0.1, without this, do not justify a new minor version, rather a patch (speaking in SemVer terms). But I'd leave it up to you @wernerfred :)

It should be v9.1.0 😉 => MAJOR.MINOR.BUGFIX

Ooops. Always thought the third number indicates patches :D

@casperklein
Member

Isn't it the same patch/bugfix?

@georglauterbach
Member

Isn't it the same patch/bugfix?

Quite the possibility... 😂

@polarathene
Member

polarathene commented Mar 25, 2021

It might be easier to remember the SemVer difference by thinking about how these docs will publish once merged:

  • Minor version bump for new functionality, where the docs would benefit from an update before the tag, which will create the docs version and archive it.
  • Patch version bump if you fix a bug or anything similar. Docs will rebuild for that version (albeit from master, unless you maintained a branch from older commits). That is, edge builds/updates whenever docs on master are updated, and a separate build/deploy happens for tagged commits, using the docs at that point in commit history.

Eventually ENVIRONMENT.md would live in the docs, and would have more value tracking those changes along with the relevant versions that introduce new vars (or remove existing ones). So the amavis PR would be a minor version release AFAIK.
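The MAJOR.MINOR.PATCH split under discussion can be shown mechanically with plain POSIX parameter expansion (the version string below is just an arbitrary example):

```shell
# Split a semver string into its three components.
version='9.1.0'
major=${version%%.*}   # strip everything after the first dot
rest=${version#*.}    # drop the major part, leaving minor.patch
minor=${rest%%.*}
patch=${rest#*.}      # the third number: the patch/bugfix component
echo "major=$major minor=$minor patch=$patch"
```

New functionality bumps `minor`; a pure fix bumps `patch` (the "bugfix" number mentioned above).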

@wernerfred wernerfred merged commit 666de3e into docker-mailserver:master Mar 28, 2021
@georglauterbach
Member

@all-contributors add @wernerfred for doc and maintenance

@allcontributors
Contributor

@georglauterbach

This project's configuration file has malformed JSON: .all-contributorsrc. Error:: Unexpected token : in JSON at position 916

@georglauterbach
Member

@wernerfred the JSON config seems malformed? I thought this was fixed.

@polarathene
Member

polarathene commented Jul 8, 2021

There's a { missing at the end of the contributors array, after the georglauterbach entry, for the hnws entry.

Here is the commit that messed it up: 542a92b

The previous commit did it properly by appending to the end of the object array, unlike the broken one, which performed an insert incorrectly.

@polarathene
Member

polarathene commented Jul 8, 2021

@georglauterbach @wernerfred looking at the PR times (and after checking the CLI tool source), it looks like the bug is due to having simultaneous PRs open/triggered. Merging from a common ancestor is why one did not get appended like the rest?

Nope, it seems that @wernerfred did attempt to rebase the hnws merge into it, but the diff/commit for the georglauterbach entry was intended as an append, I guess, thus the breakage?

Having a JSON lint check on the file, to prevent merging when the JSON is invalid, should avoid that accident in future :)
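A minimal sketch of such a lint gate, using python3's json module as the linter (jq would do equally well); the sample file below deliberately mimics the kind of error that slipped in:

```shell
# Write a deliberately broken .all-contributorsrc-style file: the object's
# opening brace before "login" is missing, like the merge-conflict mishap.
printf '%s' '{ "contributors": [ "login": "hnws" } ] }' > sample.json

# The lint gate: refuse the merge when the file is not valid JSON.
if python3 -m json.tool sample.json > /dev/null 2>&1; then
  echo 'valid JSON'
else
  echo 'invalid JSON - would block the merge'
fi
```

Wired into CI, the failing branch of that check would stop a malformed config from reaching master.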

@wernerfred
Member Author

wernerfred commented Jul 9, 2021

Seems like I messed it up, yes. @polarathene you are completely right: there were two PRs at the same time, and I tried to resolve the merge conflict but missed a {.

I think a JSON lint isn't worth it for this file alone. I will not attempt to edit the file manually again but will only use the bot instead. That should be enough for now.

I am a bit disappointed that the CLI will fix those issues by itself, but the bot knows only one command.

I'll provide a PR with the missing {.

@wernerfred wernerfred mentioned this pull request Jul 9, 2021
@georglauterbach
Member

@all-contributors please add @wernerfred for doc and maintenance

@allcontributors
Contributor

@georglauterbach

I've put up a pull request to add @wernerfred! 🎉

@wernerfred
Member Author

@all-contributors add @polarathene for maintenance, doc, security, question

@allcontributors
Contributor

@wernerfred

I've put up a pull request to add @polarathene! 🎉

Labels
area/ci area/documentation kind/improvement Improve an existing feature, configuration file or the documentation priority/medium

Successfully merging this pull request may close these issues.

[FR] Move from wiki to github pages
6 participants