Conversation
Code LGTM.
I'd consider not tying yourself to global mutable state for the cache though...
> However, `setCacheStrategy` is provided to allow you to integrate your own caching solutions. The function expects an options argument with two keys:
>
> - `get` should accept a single argument, the key, and return a Promise resolving to a cached value. If no cached value is found, the Promise should resolve to `null`.
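As a sketch, the documented contract could be satisfied by a minimal in-memory strategy like the one below. Note that only `get` is documented in the excerpt above; the `set(key, value)` method and its signature are my assumption about the second key, not confirmed by the source.

```javascript
// Minimal in-memory cache strategy sketch for `setCacheStrategy`.
// Only `get` is documented above; `set(key, value)` is assumed here.
const store = new Map();

const mapStrategy = {
  // Must return a Promise resolving to the cached value,
  // or to `null` when no cached value exists for the key.
  get(key) {
    return Promise.resolve(store.has(key) ? store.get(key) : null);
  },
  // Assumed counterpart: persist the value, resolve when done.
  set(key, value) {
    store.set(key, value);
    return Promise.resolve();
  }
};

// rapscallion.setCacheStrategy(mapStrategy);  // per the docs excerpt above
```

A Redis- or memcached-backed strategy would have the same shape, with the Promise resolving once the round-trip to the external store completes.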
Are there any legitimate cases for ever wanting to cache a `val = null`?
Fortunately, no. Due to the internal representation of completed sequences, the cached values will be of type `Array<String|Integer>`. This wasn't always the case, but I moved in that direction a while back to keep things easily serializable.
I should clarify the expected shape of the cache values in the docs, however! Thanks for pointing it out.
```js
const asyncStrategy = require("./strategies/async");

let cacheStrategy = defaultStrategy();
```
Would there ever be a scenario where two rapscallion instances need differing cache strategies?
You've got a global mutable singleton, which ties your hands a bit versus creating a cache object to initialize and pass to renderers...
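The suggested alternative might look something like the sketch below. This is entirely hypothetical: neither `RenderCache` nor a per-renderer cache option exists in rapscallion; it only illustrates the shape of the proposal.

```javascript
// Hypothetical sketch of the reviewer's suggestion: a cache object
// instantiated per renderer instead of a module-level singleton.
class RenderCache {
  constructor(strategy) {
    this.strategy = strategy;
  }
  get(key) {
    return this.strategy.get(key);
  }
  set(key, value) {
    return this.strategy.set(key, value);
  }
}

// Two renderers could then use fully independent backends:
const memoryA = new Map();
const memoryB = new Map();

const cacheA = new RenderCache({
  get: key => Promise.resolve(memoryA.has(key) ? memoryA.get(key) : null),
  set: (key, value) => Promise.resolve(void memoryA.set(key, value))
});
const cacheB = new RenderCache({
  get: key => Promise.resolve(memoryB.has(key) ? memoryB.get(key) : null),
  set: (key, value) => Promise.resolve(void memoryB.set(key, value))
});

// render(<App />, { cache: cacheA });  // hypothetical API, not real
```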
```js
const { SequenceCache } = require("../sequence-cache");

const cache = Object.create(null);
```
Again, maybe consider instantiable cache objects for renderers instead of global mutable singletons.
Both of these are really good points...
To be honest, I'm sure there are cases where a single process could potentially utilize differing cache strategies. I attempted that at first, but it made everything much more complex and required intermediate object instantiation that slowed down traversal.
So although it's a bit icky, this seemed a reasonable first implementation. It may be that somebody comes up with a clever way of improving the flexibility of the implementation here, or I might revisit it at some point. One thing I've considered is instantiating several values on the `Renderer` itself, and then passing that through the traversal. That might be cleaner, and would be a good place to start if the implementation in this PR is determined to be inadequate.
However, this does seem "good enough" in a lot of ways, since most people will probably utilize a single cache backend for a given process. And if someone really needs to fork their behavior, they could do so in the context of the cache strategy that they register. They wouldn't have access to every piece of information, but they would have access to the `cacheKey` and the sequence buffer.
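For example, a single registered strategy could fork its behavior by inspecting the key, as described above. The `"hot:"`/`"cold:"` prefix convention and the `set` method below are my own invention for illustration, not part of rapscallion:

```javascript
// Sketch: forking cache behavior inside one registered strategy.
// Key-prefix routing is a hypothetical convention, and `set` is an
// assumed counterpart to the documented `get`.
const hot = new Map();   // stand-in for, e.g., an in-process LRU
const cold = new Map();  // stand-in for, e.g., a Redis client

function backendFor(key) {
  return key.startsWith("hot:") ? hot : cold;
}

const forkingStrategy = {
  get(key) {
    const store = backendFor(key);
    return Promise.resolve(store.has(key) ? store.get(key) : null);
  },
  set(key, value) {
    backendFor(key).set(key, value);
    return Promise.resolve();
  }
};
```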
Captured potential for enhancement and out-of-band conversation here: #37
…ise is resolved.
Force-pushed from 4395b39 to ed5ce35.
This adds support for external caching solutions like Redis or memcached.
It does add some complexity, which is a shame, but it was necessary to support the ad-hoc sequence interruptions that enabled third-party integrations. And, on the plus side, the external API is very clean and easy to understand. For clarity and to provide a starting place, a Redis example has been provided in the README.
Closes #22.
Closes #25.