About This Repository

This repository contains the Python 3 scripts used by the Nordic Museum to upload images to Wikimedia Commons. It is based on lokal-profil/upload_batches.

This is a work in progress started in autumn 2017 and ongoing throughout 2018. For more details, contact Aron Ambrosiani or Alicia Fagerving. The remaining work is listed as Issues.

Installation

To run the scripts you will have to install BatchUploadTools and pywikibot using `pip install -r requirements.txt`.

Note: You might have to add the `--process-dependency-links` flag to the above command if you are running a different version of pywikibot from the required one.

User Account

The script must be run from a Wikimedia Commons account with the `upload_by_url` user right. On Wikimedia Commons this is limited to users with one of the `image-reviewer`, `bot`, `gwtoolset` or `sysop` flags. Apply for bot rights.


Settings

Every upload batch relies on two settings files:

  • Batch-specific settings.
  • Institution-specific settings. The name of this file has to correspond to the institution code in the DigitaltMuseum system, e.g. `S-NM.json` for Nordiska Museet.
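As a rough illustration only (the keys below are hypothetical and do not reflect the actual schema; see the settings directory for real examples), an institution settings file might look something like:

```json
{
  "institution_code": "S-NM",
  "institution_name": "Nordiska Museet",
  "commons_template": "Nordiska museet image",
  "default_license": "cc-by-4.0"
}
```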

Some of the settings can be provided via command-line parameters (use `-help` to see the available ones), but most of them have to be stated in the appropriate settings file. See the settings directory for examples.

Command-line values take precedence over those provided by the settings file.

The following settings cannot use the default options:

Institutions with existing settings and templates already added:


Workflow

The basic workflow is the following:

  1. Modify `settings.json` to fit your project.
  2. If it doesn't exist yet, create an institution settings file (see above).
  3. Create a pywikibot configuration file with the bot username.
  4. Create a pywikibot password file with the bot username & password (generate a bot password first).

(On Wikimedia Commons, prepare the needed templates and apply for bot rights, including `upload_by_url`, if you don't already have them.)

The following commands are run from the root folder of your installation:

  1. Run `python importer/ -api_key:yourDiMuAPIkey` to scrape info from the DiMu API and generate a "harvest file". Example output (note: if the harvest breaks, check the harvest_log_file to find the last UUID in the list). If you want to re-harvest from the local cache, add the flag `-cache:True`.
  2. Run `python importer/` to read the harvest file and generate mapping files for Wikimedia Commons.
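If a harvest breaks partway, the log can be scanned for the last processed UUID; a minimal sketch of that check (the helper and the log format are assumptions, not part of the actual scripts):

```python
import re

# Matches the 8-4-4-4-12 hex layout used by DiMu object UUIDs.
UUID_RE = re.compile(
    r"[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}", re.I
)


def last_uuid(log_lines):
    """Return the last UUID mentioned in the harvest log, or None."""
    last = None
    for line in log_lines:
        found = UUID_RE.findall(line)
        if found:
            last = found[-1]
    return last
```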

Upload mappings to Wikimedia Commons

  1. Upload the generated mapping files in the `/connections` folder to Wikimedia Commons. Example: location of the Nordic Museum mappings
  2. Perform the mapping in the mapping tables.

After uploading the mappings to Wikimedia Commons, the following commands are run from the root folder of your installation:

  1. Run `python importer/ -batch_settings:settings/settings.json -in_file:dimu_harvest_data.json -base_name:nm_output -update_mappings:True` to pull the harvest file and mappings and prepare the batch file. Example output
  2. Run `python importer/ -type:URL -in_path:nm_output.json` to perform the actual batch upload. `-cutoff:X` limits the number of files uploaded to X (this overrides the settings).
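The effect of `-cutoff:X` can be pictured as a simple slice over the batch (an illustration only, assuming list-like batch data; the real upload logic lives in BatchUploadTools):

```python
def apply_cutoff(batch, cutoff=None):
    """Limit the batch to the first `cutoff` entries; None means no limit."""
    if cutoff is None:
        return list(batch)
    return list(batch)[:cutoff]
```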

