MWoffliner is a tool for making a local offline HTML snapshot of any online MediaWiki instance. It goes through all articles (or a selection, if specified) and creates the corresponding ZIM file. It has mainly been tested against Wikimedia projects such as Wikipedia and Wiktionary, but it should also work with any recent MediaWiki.

Read on to learn more about MWoffliner development.




  • Scrape with or without image thumbnails
  • Scrape with or without audio/video multimedia content
  • S3 cache (optional)
  • Image size optimiser
  • Scrape all articles in given namespaces, or only those in a title list
  • Specify additional/non-main namespaces to scrape

Run mwoffliner --help to get all the possible options.


  • *NIX Operating System (GNU/Linux, macOS, ...)
  • NodeJS
  • Redis
  • Libzim (On GNU/Linux & macOS we automatically download it)
  • Various build tools that are probably already installed on your machine (libjpeg, gcc)

... and an online MediaWiki instance with its API available.
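MWoffliner talks to the wiki through the MediaWiki Action API, so a quick sanity check is to confirm that a siteinfo query can be built against the target wiki. A minimal sketch (the /w/api.php path is the MediaWiki default and an assumption here; some wikis mount the API elsewhere):

```javascript
// Build the standard MediaWiki siteinfo query URL for a given wiki base URL.
// NOTE: '/w/api.php' is the default API mount point, not guaranteed for every wiki.
function siteinfoUrl(mwUrl) {
  const api = new URL('/w/api.php', mwUrl);
  api.searchParams.set('action', 'query');
  api.searchParams.set('meta', 'siteinfo');
  api.searchParams.set('format', 'json');
  return api.toString();
}

// Example: point it at English Wikipedia.
console.log(siteinfoUrl('https://en.wikipedia.org'));
```

Fetching that URL should return JSON describing the wiki; if it does not, MWoffliner will not be able to scrape it either.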

See Environment setup hints to know more about how to install them.


To install MWoffliner globally:

npm i -g mwoffliner

You might need to run this command with sudo, depending on how your npm is configured.

Then to run it:

mwoffliner --help

To use MWoffliner with an S3 cache, provide an S3 URL (endpoint, bucket and credentials in one string) via the --optimisationCacheUrl option.
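Such an S3 URL packs the endpoint, bucket, and credentials into a single string. The exact query-parameter names below (keyId, secretAccessKey) are illustrative assumptions, not a documented contract; a sketch of pulling the pieces apart with Node's built-in URL parser:

```javascript
// Split an S3 cache URL of the form
//   https://s3.example.com/bucket?keyId=KEY&secretAccessKey=SECRET
// into its parts. The query-parameter names are illustrative assumptions.
function parseS3Url(s3Url) {
  const url = new URL(s3Url);
  return {
    endpoint: url.origin,
    bucket: url.pathname.replace(/^\//, ''),
    keyId: url.searchParams.get('keyId'),
    secretAccessKey: url.searchParams.get('secretAccessKey'),
  };
}

const cache = parseS3Url('https://s3.example.com/my-bucket?keyId=KEY&secretAccessKey=SECRET');
console.log(cache.endpoint, cache.bucket); // https://s3.example.com my-bucket
```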

MWoffliner also provides an API and can therefore be used as a NodeJS library. Here is a stub example:

const mwoffliner = require('mwoffliner');
const parameters = {
    mwUrl: "",
    adminEmail: "",
    verbose: true,
    format: "nopic",
    articleList: "./articleList"
};
mwoffliner.execute(parameters); // returns a Promise
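Before handing a parameter object to execute(), it can help to sanity-check it first. The helper below is hypothetical (not part of the MWoffliner API), and assumes mwUrl and adminEmail are the mandatory options:

```javascript
// Hypothetical pre-flight check for an MWoffliner parameter object.
// Assumption: mwUrl and adminEmail are the mandatory options.
function checkParameters(parameters) {
  const missing = ['mwUrl', 'adminEmail'].filter((key) => !parameters[key]);
  if (missing.length > 0) {
    throw new Error(`Missing mandatory parameter(s): ${missing.join(', ')}`);
  }
  return parameters;
}

const parameters = checkParameters({
  mwUrl: 'https://es.wikipedia.org',
  adminEmail: 'foo@bar.net',
  format: 'nopic',
});
console.log('parameters look sane');
```

A check like this fails fast with a readable message, instead of surfacing as a mid-scrape error from the library.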


Complementary information about MWoffliner:

  • MediaWiki software is used by tens of thousands of wikis, the most famous being the Wikimedia ones, including Wikipedia.
  • MediaWiki is a PHP wiki runtime engine.
  • Wikitext is the name of the markup language that MediaWiki uses.
  • MediaWiki includes a parser that turns wikitext into the HTML pages displayed in your browser.
  • There is another wikitext parser, called Parsoid, implemented in JavaScript/NodeJS. MWoffliner uses Parsoid.
  • Parsoid is planned to eventually become the main parser for MediaWiki.
  • MWoffliner calls Parsoid and then post-processes the results for offline format.
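The post-processing step is, conceptually, HTML rewriting. The toy sketch below is not MWoffliner's actual implementation; it only illustrates one kind of transformation an offline post-processor performs, namely dropping script elements, since offline readers do not execute them:

```javascript
// Illustrative only: strip <script> elements from HTML, one of the kinds of
// cleanup an offline post-processor applies to Parsoid output.
// (MWoffliner's real pipeline is more involved; this is a toy sketch.)
function stripScripts(html) {
  return html.replace(/<script\b[\s\S]*?<\/script>/gi, '');
}

const input = '<p>Hello</p><script>alert(1)</script><p>World</p>';
console.log(stripScripts(input)); // <p>Hello</p><p>World</p>
```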

Environment setup hints


macOS

Install NodeJS:

curl -o- | bash && \
source ~/.bashrc && \
nvm install stable && \
node --version

Install Redis:

brew install redis

GNU/Linux - Debian based distributions

Install NodeJS:

curl -o- | bash && \
source ~/.bashrc && \
nvm install stable && \
node --version

Install Redis:

sudo apt-get install redis-server


GPLv3 or later, see LICENSE for more details.
