Delete old Markdown file when editing the EXPORT_FILE_NAME #34
One idea could be to maintain a "database" of the posts that have been exported from
If we go with this idea, we could also store a hash of the subtree/post contents, to detect changes to any of the posts, and let "export all subtrees" identify and export only the changed ones.
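As a rough sketch of that idea (everything here is hypothetical and not part of ox-hugo — the database file name, the post identifiers, and the functions are all made up for illustration), the "database" could be a simple mapping from a post's identifier to a hash of its contents, consulted on each "export all subtrees" run:

```python
import hashlib
import json
from pathlib import Path

# Hypothetical sketch: persist a {post-id: content-hash} map and use it
# to find which posts changed since the last "export all subtrees".

DB_PATH = Path("ox-hugo-hashes.json")  # hypothetical database file

def content_hash(text):
    """Hash a post's contents (SHA-1 is plenty for change detection)."""
    return hashlib.sha1(text.encode("utf-8")).hexdigest()

def load_db():
    """Load the saved {post-id: hash} map, or an empty one on first run."""
    return json.loads(DB_PATH.read_text()) if DB_PATH.exists() else {}

def changed_posts(posts, db):
    """posts: {post-id: contents}. Return ids whose hash differs from db."""
    return [pid for pid, text in posts.items()
            if db.get(pid) != content_hash(text)]

def update_db(posts):
    """Rewrite the database from the current posts after a full export."""
    db = {pid: content_hash(text) for pid, text in posts.items()}
    DB_PATH.write_text(json.dumps(db))
    return db
```

On an "export all" run, only the posts returned by `changed_posts` would need re-exporting, with the database rewritten afterwards.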
Would you like to implement this? I also don't know what the performance impact would be for big blogs if a hash has to be calculated for dozens or hundreds of posts of, say, 2000 words each, on every export. A poor man's solution would be to rely on git diff to see which posts changed, and export just those :) Git diff also helps catch unintended text changes in older posts.
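On the performance worry: a rough back-of-the-envelope check (synthetic data, not real blog posts) suggests that hashing even hundreds of ~2000-word posts is cheap:

```python
import hashlib
import time

# Synthetic check: hash 200 fake posts of ~2000 words each and time it.
posts = ["word " * 2000 for _ in range(200)]

start = time.perf_counter()
digests = [hashlib.sha1(p.encode("utf-8")).hexdigest() for p in posts]
elapsed = time.perf_counter() - start

print(f"hashed {len(digests)} posts in {elapsed * 1000:.1f} ms")
```

On any recent machine this finishes in a few milliseconds, so the hashing itself is unlikely to be the bottleneck; extracting the subtree contents from the Org buffer would probably dominate.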
About the implementation specific to deleting old Markdown files, here's my thought: Each time a post is exported, this one property should be saved to the subtree:
Before first export
After first export
(assuming the extension to be always
.. and after the export, the
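If I understand the flow right, the delete-on-rename step could look something like the following sketch (Python rather than Emacs Lisp, and the function and parameter names are hypothetical): the previously exported path is read from the saved subtree property, and if the new export lands at a different path, the stale Markdown file is removed.

```python
from pathlib import Path

def export_post(content_dir, old_export_path, new_file_name, body):
    """Write the post to its new file; if the saved property points at a
    different (old) file, delete that stale Markdown file.

    old_export_path: value of the hypothetical saved subtree property,
    or None before the first export. Returns the new path, which would
    be stored back into the property after the export.
    """
    new_path = Path(content_dir) / f"{new_file_name}.md"
    new_path.write_text(body)
    if old_export_path is not None:
        old_path = Path(old_export_path)
        if old_path != new_path and old_path.exists():
            old_path.unlink()  # EXPORT_FILE_NAME changed: remove stale file
    return str(new_path)
```

A rename then leaves exactly one Markdown file behind: re-exporting with a new `EXPORT_FILE_NAME` writes the new file and deletes the one recorded in the property.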
I would like to try using the after save hook that you have provided with
This seems like a reasonable way to go about it.
No, I haven't been working on this, and so have tagged as a wishlist item. Would you like to work on this?
By "this", I mean creating a hash of all the subtrees and detecting which subtree contents changed and which didn't.
It would be a great feature to implement, but I could use some help. This feature could also live as a separate package, either in the ox-hugo repo
I'd like to work on this issue, but currently I'm focused on
Hmm... I now feel the problem is that an unexpected post could be exposed on the Hugo website.
If a user configures
But the point is when we should check whether unexpected files have been placed in
Should we have to check them all even when exporting a single post, introducing potentially heavy calculations based on a hash database?
Yes, I understand the issue. The way I avoid it is by carefully reviewing the git diffs when committing. This of course doesn't help if one is not using a git-based flow for site deployment, and is instead copying the files from content, public, etc. directly via rsync or the like.