Wiki.zip is too small #5

Open

yu-tang opened this issue May 11, 2020 · 6 comments


yu-tang commented May 11, 2020

Today I downloaded a wiki.zip file for offline viewing via the Main page.
I extracted it and got just 1,149 files. Most of the topics are missing, and the hyperlinks are broken everywhere. It seems something is wrong with the wiki archiver.


nagolove commented Jul 9, 2020

> Today I downloaded a wiki.zip file for offline viewing via the Main page.
> I extracted it and got just 1,149 files. Most of the topics are missing, and the hyperlinks are broken everywhere. It seems something is wrong with the wiki archiver.

You could try a tool like httrack. The resulting database would be more than 1.8 GB.
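
A minimal sketch of such an httrack run, driven from Python via subprocess (the start URL, output directory, and `+` filter pattern are assumptions, not a tested configuration):

```python
# Hypothetical httrack invocation for mirroring the wiki; assumes httrack is
# installed and on PATH. URL, output directory, and filter are examples only.
import subprocess

subprocess.run(
    [
        "httrack",
        "https://love2d.org/wiki/Main_Page",  # page to start crawling from
        "-O", "love2d-wiki-mirror",           # output directory for the mirror
        "+*love2d.org/wiki/*",                # only follow links inside the wiki
        "-v",                                 # verbose progress output
    ],
    check=True,  # raise if httrack exits with an error
)
```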

slime73 transferred this issue from love2d/love on Oct 17, 2021
Member

slime73 commented Jan 19, 2022

If it's not trivial to fix the download, maybe we can delete the link or replace it with a link to something like https://love2d-community.github.io/love-api/ (while mentioning it's unofficial)?

My wiki account doesn't have permission to edit the home page which has the broken link.


atmaranto commented Jan 19, 2022

@slime73's suggestion would be best, in my opinion. As a casual user of Love2D, I was somewhat frustrated by the incomplete dump of the wiki.

Collaborator

TheLinx commented Jan 19, 2022

You're right; it's been deleted for now.

Sadly, there's no good tool for archiving a wiki that I'm aware of.
I'm not talking about XML dumps or whatever MediaWiki offers; I mean rendered HTML that is actually useful to users.

I was using some random mw2html.py (looking at it now, it's from 2005 and written for some ancient Python version), but that's not an option now.
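
For what it's worth, the MediaWiki API can return rendered HTML directly through action=parse, which is roughly the useful-for-users output mw2html.py produced. A rough sketch, assuming the wiki's api.php lives at the URL below (adjust for the real installation); note it does not rewrite inter-page links for offline use, which is the hard part:

```python
# Sketch: dump rendered HTML for every page through the MediaWiki API.
# API_URL is an assumption -- point it at the wiki's real api.php.
# Inter-page links in the saved HTML are NOT rewritten for offline viewing.
import json
import pathlib
import urllib.parse
import urllib.request

API_URL = "https://love2d.org/w/api.php"  # assumed endpoint
OUT_DIR = pathlib.Path("wiki-dump")
OUT_DIR.mkdir(exist_ok=True)

def api(**params):
    """Call the MediaWiki API and decode the JSON response."""
    params["format"] = "json"
    url = API_URL + "?" + urllib.parse.urlencode(params)
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

# Page through list=allpages, fetching each page's rendered HTML as we go.
query = {"action": "query", "list": "allpages", "aplimit": "50"}
while True:
    result = api(**query)
    for page in result["query"]["allpages"]:
        title = page["title"]
        parsed = api(action="parse", page=title, prop="text")
        html = parsed["parse"]["text"]["*"]  # rendered page body
        out_name = title.replace("/", "_").replace(" ", "_") + ".html"
        (OUT_DIR / out_name).write_text(html, encoding="utf-8")
    if "continue" not in result:
        break
    query["apcontinue"] = result["continue"]["apcontinue"]
```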


atmaranto commented Jan 20, 2022

This is coming from someone who has only run minor web servers (and never MediaWiki), but would wget offer a potential alternative? I'll admit it would be quite an inefficient solution, since it's meant to scrape remote sites, but I've had some success with it in creating static copies of websites in the past.
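
If it helps, a mirroring run along those lines might look like the following (wrapped in Python to keep the examples in one language; the flags are standard wget options, and the start URL is just a guess):

```python
# Sketch of a wget-based static mirror; all flags are standard wget options.
# The start URL is an assumption about where the crawl should begin.
import subprocess

subprocess.run(
    [
        "wget",
        "--mirror",            # recursive download with timestamping
        "--convert-links",     # rewrite links so pages work offline
        "--adjust-extension",  # save pages with .html extensions
        "--page-requisites",   # also fetch the CSS/images each page needs
        "--no-parent",         # never ascend above the starting directory
        "https://love2d.org/wiki/Main_Page",
    ],
    check=True,
)
```

The --convert-links pass is what would address the broken-hyperlink problem from the original report.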

Collaborator

TheLinx commented Jan 20, 2022

It was the first option I tried when we were making the wiki backup, but it had some issues that I've since forgotten. It's been a while.
