Move wiki to github pages #1826
Conversation
@polarathene you should have collaborator permissions on the PR repository
Seems we created a similar PR at roughly the same time 😛 I've got the docs migrated and documented the process to do so. There's an additional branch I have up with a GitHub Action figured out for deploying to GitHub Pages as requested :) I could switch to MkDocs if you like, but Docusaurus seems like a pretty good option; we can weigh up the choices. I see you created an issue about this, so I'll take a read of it over there and continue discussion on the related issue.
Just to clarify: docs from the wiki will be migrated to the repo separately from this PR, correct? And the PR will rebase onto that when ready. My day has just started, so I'll be focusing on tackling some concerns and how to handle those. Once I've resolved that, I'll have a better idea of project structure and will commit the docs to that structure (which we can all agree on first, of course) and PR that through. Afterwards, I'll collab with you on this :)
Currently my plan was to have a structure like you proposed:
The current wiki then needs to live inside that structure. I was thinking of you pushing the wiki, with your commands that keep the history, onto this PR. This way we can build and see the generated sites already during development. But we can of course also take a different approach. My day ends now and I will think overnight on how to proceed further. I'm looking forward to reading your thoughts tomorrow :)
That location seems fine. I'll chime in when I've completed my investigation of adding extensions. We'll likely want to keep setup easy for contributors, thus a separate repo to maintain the Dockerfile and push to DockerHub is probably going to be best. The docs and any other files can be mounted via a volume. If using Docker to bundle our extensions and maintain an image for it on DockerHub is going to happen, we may just want to adopt that for the GitHub Actions workflow too? (simplifies maintenance and keeps contributor and CI docs workflows more consistent)
I should be done with my investigation by the end of the day.
In the meantime, feel free to experiment on a different branch/repo like you and I have been doing, or even locally:

```shell
# Runs `mkdocs serve` by default, but you can pass in a command such as `build`
# and it'll forward that to `mkdocs` to run within the container:
docker run --rm -it -p 8000:8000 -v ${PWD}:/docs squidfunk/mkdocs-material

# eg:
docker run --rm -it -p 8000:8000 -v ${PWD}:/docs squidfunk/mkdocs-material build --strict
```

I'd rather the large 488 commit history not be mixed into this PR's commit history for review. You can use an interactive rebase or squash/remove commits if necessary to clean up commit history. I can assist with this when the time comes if needed and you prefer I handle it :)
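The "keep the history" migration discussed above can be sketched with plain git. This is a self-contained scratch demo (the `wiki`/`project` repo names and the `docs/content` path are hypothetical stand-ins, not the project's actual layout): `--allow-unrelated-histories` joins the two histories with one merge commit, and a later `git mv` keeps each page traceable via `git log --follow`.

```shell
set -e
# Scratch demo: a stand-in "wiki" repo and "project" repo (both hypothetical)
dir=$(mktemp -d); cd "$dir"

git init -q wiki && cd wiki
git config user.email demo@example.com && git config user.name demo
echo '# Home' > Home.md
git add . && git commit -qm 'wiki: add Home page'
cd ..

git init -q project && cd project
git config user.email demo@example.com && git config user.name demo
echo '# Project' > README.md
git add . && git commit -qm 'initial commit'

# Fetch the wiki's HEAD and join the unrelated histories with one merge commit:
git fetch -q ../wiki
git merge -q --allow-unrelated-histories -m 'docs: import wiki with history' FETCH_HEAD

# Relocate the pages; `git log --follow` still traces them back to the wiki commits:
mkdir -p docs/content
git mv Home.md docs/content/
git commit -qm 'docs: move wiki pages under docs/content'
git log --follow --oneline -- docs/content/Home.md
```

The same approach scales to the real wiki: the merge brings over every commit unchanged, so nothing needs squashing until the PR review stage.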
And now my day ends :) We'll need a Dockerfile; it can start off in this repo if you like. I'll convert the history of the wiki files to this location tomorrow, along with a folder hierarchy matching the current wiki.
Sadly I was busy the whole day, so I can only provide further thoughts and no doings. One question: do we really need the extensions you mentioned? Imho it is way easier to use the official unmodified image for the workflow as well as for local testing for contributors. It also takes away the maintenance time of a separate repo/image for the modified one. So my proposal:
Tomorrow (Friday) I will be very busy, but I will respond/start working on your thoughts over the weekend.
If there are any other extensions worth adding, we'd need to do this anyway, so it might be nice to have in place early. I don't mind doing the work for that. As it is, we'd still use the upstream Docker image; we'd just have a minimal Dockerfile that extends it with a single instruction. That can either be distributed via DockerHub or another registry (GitHub has its own one I think, and there should be an existing GitHub Action for deploying such; I haven't looked into how this project presently manages pushing to DockerHub). Build/deployment can be automated all the same; I can have the upstream image monitored and just have the CI/CD take care of maintenance on that front. Alternatively we just keep a Dockerfile in the repo.
I don't see it as much of a complication / overhead to include the extensions personally. If you want to avoid them and their value that is fine, but should you need extensions in future, someone else may need to figure out how to set these up, as things stand right now. If versioning docs is of value to you, this is also something I was planning to investigate and set up, so that with future breaking changes it would be easier to match docs to major versions of the image without sifting through git history directly. The docs would have a drop-down or URL change for the version and you'd be good AFAIK.
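For reference, if versioning via `mike` were adopted, the Material theme's version drop-down is driven by a small `mkdocs.yml` addition (a sketch based on the Material theme's documented config key, not something set up in this PR yet):

```yaml
extra:
  version:
    provider: mike
```

With that in place, each deployed version (and its aliases like `latest` or `edge`) becomes selectable from the drop-down, while old versions remain served from the `gh-pages` branch.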
They're not necessary, they just add additional value. Flow charts can be better presented, and docs nav is easier to manage (this loses some relevance if we hard-code the entire nav from the get-go though). We can avoid the extensions for now if you prefer; I was mostly exploring them to make sure that I was aware of requirements should we desire them at some point, and how that might impact the structure of the docs.
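For concreteness, enabling the two plugins under discussion is a few lines in `mkdocs.yml` (a sketch; plugin entry names correspond to the PyPI packages mentioned):

```yaml
plugins:
  - search          # re-enable the built-in search when overriding `plugins`
  - awesome-pages   # from mkdocs-awesome-pages-plugin
  - mermaid2        # from mkdocs-mermaid2-plugin
```

Note that declaring `plugins` at all replaces the default list, so `search` has to be re-added explicitly.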
That's fine. Things often seem simple to me, but I do find myself biting off more than I can chew at times :) Personally it doesn't seem like too much of a complication. I'm not sure how comfortable you are with Docker or the others, nor web development.

```dockerfile
# Would ideally be version pinned instead of using `:latest`
FROM squidfunk/mkdocs-material:latest

# mermaid2 is only needed if using that plugin
RUN pip install --no-cache-dir \
    mkdocs-awesome-pages-plugin \
    mkdocs-mermaid2-plugin

# Alternative mermaid.js approach can volume mount or copy some JS
# `mkdocs.yml` would set `theme.custom_dir: theme`, and `extra_javascript: theme/assets/javascripts/some-script.js`
# COPY build-files/some-script.js theme/assets/javascripts/some-script.js
```

```shell
# Regular image entrypoint is the `mkdocs` binary, `docker run` commands get appended to that.
# Otherwise the default command for it is `serve --dev-addr=0.0.0.0:8000`
docker build -t docker-mailserver-mkdocs .
docker run --rm -it -p 8000:8000 -v ${PWD}:/docs docker-mailserver-mkdocs

# Alternatively, override the upstream ENTRYPOINT and CMD without needing a custom image:
docker run --rm -it -p 8000:8000 -v ${PWD}:/docs --entrypoint /bin/sh squidfunk/mkdocs-material -c "pip install mkdocs-awesome-pages-plugin && mkdocs serve"
```

It's basically two lines, and we probably don't need much more than that. You can also run a build of the docs locally; it'll output the static site to a directory.
Caddy is a simple yet great web server. The default config of that image is set up for local dev, and you can access it in the browser locally.
Docs are ready for merging on my other PR; once that is done and this PR rebases onto that, it should build fine. I have a branch already available for a live demo here (future readers: this demo will be removed after the PR). I've got some commits to cherry-pick over once the rebase happens that will update the docs further.
Pushing to ghcr is only a matter of logging in to GitHub and pushing to the registry as usual, with the image name prefixed by `ghcr.io/`.
I would like to ask @aendeavor for his opinion. Imho we have 3 options:
I think there are not many things I don't know about Docker and containers. But of course that might not hold for everyone.
Really nice. I really like the layout. Thanks for your effort and time!
Presently we're not in need of a custom Docker image, since I hard-coded the nav fully anyway. GitHub Wiki contributors had to modify the sidebar menu file sometimes, just like they would here. The only other extension possibly worth having atm is one to show the last git revision (commit) date for a doc page, but this isn't a major need. Probably still worth discussing the options if we ever need a custom Docker image.
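Hard-coding the nav amounts to maintaining a block like this in `mkdocs.yml` (the page paths below are made up for illustration; they are not the project's actual file names):

```yaml
nav:
  - Home: index.md
  - Config:
      - Debugging: config/debugging.md
  - Tutorials:
      - Basic Installation: tutorials/basic-installation.md
  - FAQ: faq.md
```

Any page not listed here simply won't appear in the sidebar, which is what makes a plugin like `awesome-pages` less relevant once the full nav is spelled out.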
I haven't read through the details here, therefore I will keep my answer rather short: if it was up to me, I would pick 2) or 3). It's not that I think @polarathene isn't suited, I just feel like this organization is the right place to put everything related to this project. I mean, that's its sole purpose. This way, we can make sure everything is in one place and better maintainable. Ultimately, I would leave it up to you two to decide since you're the driving forces, but now you've heard my opinion :) @wernerfred I can grant you more privileges if you need them - just reach back to me.
I would appreciate that. I had another option in mind: the traefik project builds documentation during PRs and publishes it (might be using mike) with a pr# label/version. So maintainers can look at the generated docs without the need to clone and build/serve locally. We could look at whether it is feasible with mike too; would be nice. One thing actually might speak against it: we definitely do not want to have the PR versions in the official docs version dropdown. But maybe you have an idea to solve this. Otherwise we will put this on the back burner, as it has low priority.
I really like your changes so far 👍🏻 Next steps will be:
From what I've read, you are advised not to retrofit it onto an existing deployment.
I'll have to take a look, but I don't think so. Someone with authority would need to connect the organization to a service like Vercel for this; if so, I can probably set something up and use the "transfer" feature to provide a working setup, otherwise I can relay instructions.
It's low priority for me atm over other doc tasks. My understanding is that with a service like Vercel, they have branch deploys/previews, aka a staging deploy.
Looking forward to this, but I want to ensure proper history is retained if possible :) As in, I should be able to follow each file's history back through the original wiki commits.
👍 I know presently that:
I've not given much thought to this at present, other than that I had to remove the duplicate FAQ link (one is under Config near Debugging, and another under Tutorials), as mkdocs didn't seem to like this unless the document was duplicated; it would just redirect to the other one (which in my current setup broke nav for the Tutorial section).
I've been doing this already on my end. I've added front-matter metadata to each md file in one commit, and am presently going over the links for another commit to cherry-pick over. Additionally, some URLs are provided without markdown syntax, so they did not create a clickable link; I've fixed that as well. After that we're good for collab, although we should probably take ownership of some tasks or sections/pages while we work on them to avoid duplicate work :)
There's the workflows being tweaked here, and any plugin related stuff. If you're not familiar with the MkDocs Material docs or PyMarkdown Extensions, you could go have a read through those and learn about the features. I forked my docs PR to work on a branch locally and for GitHub Pages workflow testing for this purpose.

Markdown linting rules should probably be addressed, as well as a link checker. The build can use `--strict`. I don't know how either link checker (build time or post-deploy) handles non-existent files.

This helps address a concern about GitHub rate limiting, which only allows around 1k requests within an hour (probably won't hit that, but if we're testing this on pull requests with someone frequently pushing commits on an open PR it might be possible), as we can limit the internal URL requests at least. For external URLs another approach is to detect only new/modified URLs to check, e.g. by utilizing a diff action, but this complicates the workflow a bit. Otherwise a scheduled daily workflow could check the current site deployment, and any invalid URLs it notices could be reported via an automated issue to notify us.

I've done a little research into this as you can tell, and I'm aware of a few GitHub Actions / tools we could leverage, depending on what sort of coverage we would like to ensure for the maintenance of working URLs in the docs. As I update existing ones, I notice more than a few that need fixing.
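As a rough idea of splitting internal from external links before picking a checker, here is a self-contained grep sketch (the sample file, directory, and patterns are illustrative only, not the project's real docs):

```shell
set -e
dir=$(mktemp -d)
cat > "$dir/example.md" <<'EOF'
See [the FAQ](./faq.md) and [upstream docs](https://example.com/docs).
Bare URL: https://example.com/issues
EOF

# External URLs: check these sparingly (e.g. a scheduled job) to respect rate limits
grep -rhoE 'https?://[^") ]+' "$dir" --include='*.md' | sort -u

# Internal relative links: cheap to validate against the file tree on every PR
grep -rhoE '\]\(\.[^)]*\)' "$dir" --include='*.md' | sed -E 's/^\]\(//; s/\)$//'
```

Internal links can then be checked with simple file-existence tests, reserving network requests for the external list.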
#1827 is merged. git blame shows every change, but can you explain to me why the history only shows the last commit? Just curious. EDIT: now there are a few more shown. Maybe the GitHub GUI needs some time to index that properly?
A merge commit signifies where the merge occurred in the target branch AFAIK. Without it, git would have to rewrite the entire commit history, which would be bad for an open-source project where there are many collaborators and force pushing to master is risky.

A rebase merge can avoid the merge commit being needed, as it pushes all commits onto the target branch above all other existing commits. This can be handy sometimes, but usually rebasing is a more common task to perform while working on a PR.

A squash and merge is kind of a mix, where all commits are bundled into a single commit and merged into the target branch. This is usually the desirable choice, since an entire feature is represented as a single commit on the target branch, and contributors don't have to adhere to conventional commits if upstream uses that; their commit history is only relevant prior to merge, where the maintainer can ensure master commits adhere to conventional commit style if they prefer.

Perhaps a visual would help? If we follow the pink commit tree down, we'll find more recent commits related to it. If I click one of those commits and then a file that was modified, I can view the history for the file.
Quite possibly. GitHub also has limited history tracking if the file is moved/renamed. A desktop GUI like I've shown above (GitKraken) doesn't have that issue and is fast, since it just has to serve me and doesn't need to optimize for serving as many users at a time as GitHub does.
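The merge-commit behaviour described above can be reproduced in a throwaway repo (all names below are scratch values for the demo):

```shell
set -e
dir=$(mktemp -d); cd "$dir"
git init -q repo && cd repo
git config user.email demo@example.com && git config user.name demo

echo base > file.txt && git add . && git commit -qm 'initial'

git checkout -qb feature
echo change >> file.txt && git commit -qam 'feature work'

# Return to the original branch and let it diverge:
git checkout -q -
echo other > other.txt && git add . && git commit -qm 'mainline work'

# The branches diverged, so merging records a merge commit tying both histories together:
git merge -q --no-ff feature -m 'merge feature'
git log --merges --oneline
```

A rebase of `feature` before merging would instead place its commits linearly on top, and `--no-ff` is what guarantees the merge commit even when a fast-forward would have been possible.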
Safe to rebase this branch onto master?
Omg no, that is not my intention. I just wanted a bit more time to go over the gh-pages build preview once again and look for problems :) I marked the PR as ready. I would like to have a review from @docker-mailserver/maintainers and then merge it myself. I will see if the wiki is lockable. I will also update the top-level description accordingly.
```diff
@@ -1,6 +1,6 @@
 # Contributing

-This project is Open Source. That means that you can contribute on enhancements, bug fixing or improving the documentation in the [Wiki](https://github.com/docker-mailserver/docker-mailserver/wiki).
+This project is Open Source. That means that you can contribute on enhancements, bug fixing or improving the [documentation](https://docker-mailserver.github.io/docker-mailserver/edge).
```
Does the `edge` here mean that the source of the pages would be a branch called `edge`? Wasn't it only the tag on Docker Hub that is called `edge`? The branch that this PR points to, however, is still `master`.
`edge` just reflects the state of the docs on the `master` branch. It's equivalent to `master`/`latest`, which is what `edge`-tagged docker images are meant to reflect AFAIK?

It can be whatever label maintainers would prefer; I assume `edge` was chosen to match the docker `edge` tag, since the project itself is docker focused and the docs are for those releases?
When a new version is tagged, it'll build the docs from their state in the tagged commit and that will be selectable from a drop-down menu, allowing us to retain old docs as breaking changes and deprecations are introduced.
I don't think it matters either way? Just don't squash merge, thanks! 😄 AFAIK, the main difference is where the commits are placed in the target branch history.
Which is less visible in the GitHub history view, but you can see how the commits are listed interleaved by time, despite being a series of commits in a single PR. I assume that's a merge commit. As the commits for this PR are rebased, it seems they'll all be batched together either way. A merge commit might be better given the above context?
```shell
fi
}

_update-versions-json
```
This script looks fine, but I would like to see either one of two things changed:

- Remove the function completely as it is not really needed
- Use:

```shell
function _update_versions_json
{
  ...
}
```

Really a minor thing, but I figured why not :D
My original version had a few functions, but that seemed unnecessary. Probably best to remove the function wrapper. I can commit that in this PR or in a follow-up; we mostly just want to get this merged due to the sheer size of it and the discussion thread becoming more difficult to manage.
There's still more to do for the docs, but the migration of the wiki is in good shape.
Absolutely huge PR. I'm certain there's always some more tweaking and stuff possible, but this really LGTM. 👍🏼
@wernerfred With this and #1866 merged, I think it's safe to consider releasing v9.1.0. Technically, only minor updates happened, but overall a huge step for this project was done. I leave the honors of releasing to you :D
This PR doesn't need to be part of a tagged release. I don't think it would have any issues with #1866 being merged earlier; we should be able to rebase over that without any troubling conflicts. Alternatively, the version-tag trigger for deploying versioned docs could be disabled if needed.
Alright. Then we can go ahead, merge #1866, and release.
It should be v9.1.0 😉 => MAJOR.MINOR.BUGFIX
Ooops. Always thought the third number indicates patches :D
Isn't it the same thing, patch/bugfix?
Quite the possibility... 😂
Might be easier to remember the difference for semver when thinking about how these docs will publish once merged.
@all-contributors add @wernerfred for doc and maintenance
This project's configuration file has malformed JSON: .all-contributorsrc. Error: Unexpected token : in JSON at position 916
@wernerfred the JSON config seems malformed? I thought this was fixed.
There's a syntax error. Here is the commit that messed it up: 542a92b. A previous commit did it properly by appending to the end of the object array, unlike the broken one, which did an insert incorrectly.
@georglauterbach @wernerfred looking at the PR times (and after checking the CLI tool source), it looks like the bug is due to having simultaneous PRs open/triggered. Merging from a common ancestor is why one did not get appended like the rest? Nope, seems that @wernerfred did attempt to rebase the branch. Having a JSON lint check on the file to prevent merging when the JSON is invalid should avoid that accident in future :)
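Such a lint gate can be as small as one `python -m json.tool` invocation in CI. A sketch (the file content below is a made-up stand-in, not the project's real `.all-contributorsrc`):

```shell
set -e
dir=$(mktemp -d)
printf '{ "contributors": [ { "login": "wernerfred" } ] }' > "$dir/.all-contributorsrc"

# Exit non-zero (failing the CI job) when the file is not valid JSON:
if python3 -m json.tool "$dir/.all-contributorsrc" > /dev/null 2>&1; then
  echo 'JSON OK'
else
  echo 'JSON malformed' >&2
  exit 1
fi
```

On a valid file this prints `JSON OK`; a missing brace like the one in 542a92b would make the check exit non-zero and block the merge.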
Seems like I messed it up, yes. @polarathene you are completely right, there were two PRs at the same time and I tried to resolve the merge conflict but missed a `{`. I think a JSON lint isn't worth it for only this file - I will not attempt to edit the file manually again but use the bot only instead; that should be enough for now. I am a bit disappointed that the CLI can fix those issues by itself but the bot knows only one command. I'll provide a PR with the missing `{`.
@all-contributors please add @wernerfred for doc and maintenance

I've put up a pull request to add @wernerfred! 🎉

@all-contributors add @polarathene for maintenance, doc, security, question

I've put up a pull request to add @polarathene! 🎉
Description
Decision was made to pick mkdocs. See the current version of docs on gh-pages of the PR:
https://wernerfred.github.io/docker-mailserver/
Open tasks (detailed description in this comment):
Will be done in a further PR by @polarathene and me:
Done:

- `versions.json` approach and custom workflow

Fixes #1825
Type of change
Checklist: