When developing the converter I hit the wiki thousands of times a day. Each run of the program makes over 2000 network requests. These requests are slow, waste bandwidth, and appear to trigger throttling. One workaround I use between runs is a hard-coded list of monsters to avoid the network requests entirely. A better solution would be to cache the wiki pages locally, next to the executable, so they are easy to manage and clear.
Expectations:
Cache TW pages in plain text
Store cache files local to executable in a folder
Skip network request when file is found in local cache
Optional: a cache TTL, so the tool goes back to the network when a cached file is over 24 hours old
This improvement is a nice-to-have that mainly helps developers. For end users who use TW as an input source, there is no reason to run the tool more than once a day; even once a week is enough in most cases, since the data on the wiki is updated slowly.