
Evernote to Omnivore CLI

Principle

Processes .enex files and tries to inject all specified entries into Omnivore, with the following logic :

  • if the source URL is still accessible online (the program checks this, see the sketch after this list), then a "save URL" command will be issued (and Omnivore will collect the content itself on the server side)
  • if the source URL is NOT found in the .enex file OR if it is NOT online anymore, then a "save Page" command will be triggered with the content saved in the Evernote note
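
The actual check used by the project is not shown in this README; the following is only a minimal sketch, in Go, of what such an accessibility test could look like (the helper name isStillOnline and the 10 second timeout are assumptions, not the project's code) :

package main

import (
	"fmt"
	"net/http"
	"time"
)

// isStillOnline reports whether the source URL still answers with a
// successful HTTP status. A short timeout keeps dead hosts from
// blocking the migration.
func isStillOnline(url string) bool {
	client := &http.Client{Timeout: 10 * time.Second}
	resp, err := client.Get(url)
	if err != nil {
		return false
	}
	defer resp.Body.Close()
	return resp.StatusCode >= 200 && resp.StatusCode < 400
}

func main() {
	fmt.Println(isStillOnline("https://blog.omnivore.app/p/contributing-to-omnivore"))
}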

All successfully processed entries are recorded in a .cache file, so that they are not processed again if you relaunch the command. You can manually edit / add / delete entries in that file (or wipe the whole file if you want to start over : already existing entries should not create duplicates inside Omnivore, thanks to a "consistent UUID" built from the URL inside the program)
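
The exact derivation of that consistent UUID is not detailed here; the sketch below is only one plausible approach, assuming an MD5 hash of the URL rendered in the usual 8-4-4-4-12 UUID layout :

package main

import (
	"crypto/md5"
	"fmt"
)

// consistentUUID derives a deterministic, UUID-shaped identifier from a URL,
// so that re-importing the same URL always yields the same clientRequestId
// and Omnivore can deduplicate it. (Illustrative only; the real derivation
// in the program may differ.)
func consistentUUID(url string) string {
	sum := md5.Sum([]byte(url)) // 16 bytes, the same length as a UUID
	return fmt.Sprintf("%x-%x-%x-%x-%x", sum[0:4], sum[4:6], sum[6:8], sum[8:10], sum[10:16])
}

func main() {
	fmt.Println(consistentUUID("https://blog.omnivore.app/p/contributing-to-omnivore"))
}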

Usage

Available commands :

evernote-enex-to-omnivore

  Flags: 
       --version         Displays the program version string.
    -h --help            Displays help with available flag, subcommand, and positional value parameters.
    -a --api             OMNIVORE APIKey
    -u --url             OMNIVORE Graphql HTTP endpoint / URL (optional) (default: https://api-prod.omnivore.app/api/graphql)
    -i --input           Input files, comma separated (like '-i file1.enex,file2.enex')
    -p --preview         Activate preview mode (optional)
    -r --resume-from     ID (hash) of the last valid URL : only the following URL will be processed (optional)
    -s --skip            IDs (hash) to be skipped (like '-s ID1,ID2') (optional)
    -c --count           Number of items to process (optional, default -1) (default: -1)
    -fa --force-article   Force saving as articles
    -fu --force-url       Force one specific URL

Example of a regular execution :

go run . --input resources/Mes\ notes.0.enex -c 50 --api XXXXXXXXXXXXXXXXXXX

Example of an execution by skipping some IDs (note : you can also add them in the .cache local file, same folder) :

go run . --input resources/Mes\ notes.0.enex --skip "33613733-6465-3435-6565-393639393233,63343733-3538-6266-6461-636431383537" -c 50 --api XXXXXXXXXXXXXXXXXXX

Example of an execution forcing a specific URL again (in order to reprocess it, here forcing it as an article) :

go run . --input resources/Mes\ notes.0.enex -c 1 --force-article --force-url "http://remote-url-not-available-online-anymore.com/" --api XXXXXXXXXXXXXXXXXXX

Notes :

  • --skip UUID1,UUID2 is useful if there are some entries causing trouble (content too big, for example)
  • --resume-from <UUID> allows skipping all the entries BEFORE the entry provided with that parameter

Disclaimer

This is a one-shot program that I fully used on my side with great success, but probably not a very polished one :

  • code is readable but not really great ...
  • no binaries are provided at this time (you need to run it with go)
  • no guarantees that it will work for you - it worked fine on my side to migrate ~4000 saved web pages, but you may encounter different corner cases in your own situation (= please do iterative tests, as stated below)
  • it's not designed to be fast (no multi-threading, no optimizations, ...), but, well, you are probably going to use it only once

Procedure (how to use this CLI)

  1. Create .enex files (for example, under Windows, through the official Evernote desktop client installed from the MS Store - select a notebook, use the ... button and select the Export notebook entry - note : it is advised to split them into 2 GB files, as proposed by the Evernote client during the export)
  2. (under Linux) git clone this project : git clone https://github.com/SR-G/evernote-enex-to-omnivore.git
  3. cd into it : cd evernote-enex-to-omnivore
  4. Put all your .enex files in that folder (for example in a resources/ subfolder)
  5. Prepare the go package : go mod tidy
  6. Create your API key inside Omnivore, per the Omnivore documentation : https://docs.omnivore.app/integrations/api.html (chapter "Getting an API token")
  7. Launch the tool once, in preview mode, with only a subset of entries (10 here), to see how it's behaving : go run . --input resources/Mes\ notes.0.enex --api XXXXXXXXXXXXXXXXXXX -c 10 --preview
  8. Check the results in the logs ...
  9. If everything is fine, launch without the "preview" mode : go run . --input resources/Mes\ notes.0.enex --api XXXXXXXXXXXXXXXXXXX -c 10
  10. Check in Omnivore that you see your entries (as "Archived" entries)
  11. Launch the tool again (see also the chapter below about the rate limiter)

Example of execution

Only [50] items will be processed before stopping
Files to be processed : resources/Mes notes.0.enex
OMNIVORE URL [https://api-prod.omnivore.app/api/graphql], OMNIVORE APIKey [XXXXXXXXXXXXXXXXXXXXXXXXXXX]
URLs IDs to be skipped : 33613733-6465-3435-6565-393639393233, 63343733-3538-6266-6461-636431383537
Starting to process file [resources/Mes notes.0.enex]

36626237-6364-6230-3838-666364316333 | SKIPPED (skipped ID from .cache previous file) | https://...

(...)
34386562-3763-3964-6635-393735313131 | IMPORT/Evernote | https://www.lemonde.fr/les-decodeurs/article/2020/09/04/...
  > [INFO] url [https://www.lemonde.fr/les-decodeurs/article/2020/09/04/...] still accessible, will save as URL
  > [INFO] Correctly saved to Omnivore :  {"data":{"saveUrl":{"url":"...","clientRequestId":"34386562-3763-3964-6635-393735313131"}}}

Stopping, as [50] entries have been processed

===================================================
Total number of items processed : 50
Total number of items processed as URL : 47
Total number of items processed as Article : 3
Total number of errors while saving as URL : 0
Total number of errors while saving as Article : 0

Rate limiter

Keep in mind there is a rate limiter on Omnivore, allowing at most 100 requests per minute. You have to handle this yourself.

Solution #1

Add a small delay inside evernote-enex-to-omnivore.go (like a 1 second sleep)
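
A minimal sketch of such a delay, with stand-in data instead of the real .enex entries and a plain print instead of the actual Omnivore call :

package main

import (
	"fmt"
	"time"
)

func main() {
	entries := []string{"url-1", "url-2", "url-3"} // stand-ins for the parsed .enex entries

	for _, entry := range entries {
		fmt.Println("saving", entry) // the real code would call the Omnivore API here
		time.Sleep(1 * time.Second)  // ~60 requests per minute, under the 100/minute limit
	}
}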

Solution #2

Just batch the command

while [[ true ]] ; do
  go run . ... -c 90 
  sleep 60
done

And break the script once everything has been processed

Solution #3

There is now a 50 second delay inside the script (every 100 entries), so you can also launch it in "infinite mode" (-c 2000, -c -1, or not defining that parameter at all) and it should be OK (provided Omnivore doesn't change the rate limiter values over time - be careful about that).
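
A sketch of what that counter-based pause could look like (the modulo check, loop and messages are illustrative, not the actual code of the script) :

package main

import (
	"fmt"
	"time"
)

func main() {
	for i := 1; i <= 250; i++ { // stand-in for the real entry loop
		fmt.Println("processing entry", i) // the real code would save the entry here
		if i%100 == 0 {
			fmt.Println("pausing 50 seconds to stay under the rate limiter ...")
			time.Sleep(50 * time.Second)
		}
	}
}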

Links

Documentation

Relevant extract from the Omnivore GraphQL schema :

  enum ArticleSavingRequestStatus {
    PROCESSING
    SUCCEEDED
    FAILED
    DELETED
    ARCHIVED
  }

  type Label {
    id: ID!
    name: String!
    color: String!
    description: String
    createdAt: Date
    position: Int
    internal: Boolean
  }

  input CreateLabelInput {
    name: String! @sanitize(maxLength: 64)
    color: String @sanitize(pattern: "^#([A-Fa-f0-9]{6}|[A-Fa-f0-9]{3})$")
    description: String @sanitize(maxLength: 100)
  }

  input SaveUrlInput {
    url: String!
    source: String!
    clientRequestId: ID!
    state: ArticleSavingRequestStatus
    labels: [CreateLabelInput!]
    locale: String
    timezone: String
    savedAt: Date
    publishedAt: Date
  }

Example of a command-line curl command :

curl -X POST -d '{ "query": "mutation SaveUrl($input: SaveUrlInput!) { saveUrl(input: $input) { ... on SaveSuccess { url clientRequestId } ... on SaveError { errorCodes message } } }", "variables": { "input": { "clientRequestId": "85282635-4DF4-4BFC-A3D4-B3A004E57067", "source": "api", "url": "https://blog.omnivore.app/p/contributing-to-omnivore" }} }' -H 'content-type: application/json' -H 'authorization: <your api key>' https://api-prod.omnivore.app/api/graphql
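
The same SaveUrl mutation can also be issued from Go with only the standard library; the sketch below simply mirrors the curl example above (same endpoint, headers and payload) :

package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"io"
	"net/http"
)

func main() {
	// Build the same GraphQL payload as in the curl example.
	payload := map[string]interface{}{
		"query": "mutation SaveUrl($input: SaveUrlInput!) { saveUrl(input: $input) { ... on SaveSuccess { url clientRequestId } ... on SaveError { errorCodes message } } }",
		"variables": map[string]interface{}{
			"input": map[string]string{
				"clientRequestId": "85282635-4DF4-4BFC-A3D4-B3A004E57067",
				"source":          "api",
				"url":             "https://blog.omnivore.app/p/contributing-to-omnivore",
			},
		},
	}
	body, _ := json.Marshal(payload)

	req, _ := http.NewRequest("POST", "https://api-prod.omnivore.app/api/graphql", bytes.NewReader(body))
	req.Header.Set("content-type", "application/json")
	req.Header.Set("authorization", "<your api key>")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	answer, _ := io.ReadAll(resp.Body)
	fmt.Println(string(answer))
}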

Example of a full Label description :

{id: "1a548f08-70e4-11ee-a5dd-37c570b5ca9f", name: "IMPORT/Evernote", color: "#00D084",…}
