This repository has been archived by the owner on Feb 19, 2022. It is now read-only.

Improved component caching #36

Merged
merged 7 commits into from Feb 15, 2017

Conversation

divmain
Contributor

@divmain divmain commented Feb 15, 2017

This adds support for external caching solutions like Redis or memcached.

It does add some complexity, which is a shame, but it was necessary to support the ad-hoc sequence interruptions that enabled third-party integrations. And, on the plus side, the external API is very clean and easy to understand. For clarity and to provide a starting place, a Redis example has been provided in the README.

Closes #22.
Closes #25.

Member

@ryan-roemer ryan-roemer left a comment


Code LGTM.

I'd consider not tying yourself to global mutable state for the cache though...


However, `setCacheStrategy` is provided to allow you to integrate your own caching solutions. The function expects an options argument with two keys:

- `get` should accept a single argument, the key, and return a Promise resolving to a cached value. If no cached value is found, the Promise should resolve to `null`.
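A minimal in-memory strategy satisfying this contract might look like the sketch below. The `set` counterpart is an assumption here (the excerpt above only shows `get`), and the registration call is commented out since it requires rapscallion itself; a Redis-backed version like the one described in the README would be analogous.

```js
// Hypothetical in-memory backing store; a Redis or memcached client
// would fill the same role in production.
const store = new Map();

const strategy = {
  // Resolve to the cached value, or `null` on a cache miss,
  // per the contract described above.
  get: key => Promise.resolve(store.has(key) ? store.get(key) : null),
  // Assumed counterpart (not shown in the excerpt): persist the value.
  set: (key, value) => {
    store.set(key, value);
    return Promise.resolve();
  }
};

// The strategy would then be registered with rapscallion:
// require("rapscallion").setCacheStrategy(strategy);
```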
Member


Are there any legitimate cases for ever wanting to cache a value of `null`?

Contributor Author


Fortunately, no. Due to the internal representation of completed sequences, the cached values will be of type `Array<String|Integer>`. This wasn't always the case, but I moved in that direction a while back to keep things easily serializable.

Contributor Author


I should clarify the expected shape of the cache values in the docs, however! Thanks for pointing it out.

```js
const asyncStrategy = require("./strategies/async");
// ...
let cacheStrategy = defaultStrategy();
```
Member


Would there ever be a scenario for having two rapscallion instances which would have differing cache strategies?

You've got a global mutable singleton, which ties your hands a bit vs. maybe creating a cache object to initialize and pass to renderers...

```js
const { SequenceCache } = require("../sequence-cache");
// ...
const cache = Object.create(null);
```
Member


Again, maybe consider instantiable cache objects for renderers instead of global mutable singletons.

Contributor Author


Both of these are really good points...

To be honest, I'm sure there are cases where a single process could utilize differing cache strategies. I attempted that at first, but it made everything much more complex and required intermediate object instantiation that slowed down traversal.

So although it's a bit icky, this seemed a reasonable first implementation. It might be that somebody can come up with a clever way of improving the flexibility of the implementation here, or I might revisit it at some point. One thing I've considered is instantiating several values on the Renderer itself and then passing that through the traversal. That might be cleaner, and would be a good place to start if the implementation in this PR turns out to be inadequate.

However, this does seem "good enough" in a lot of ways, since most people will probably utilize a single cache backend for a given process. And, if someone really needs to fork their behavior, they could do so in the context of the cache strategy that they register. They wouldn't have access to every piece of information, but they would have access to the cacheKey and the sequence buffer.
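For illustration, forking behavior within a single registered strategy might look like the sketch below. The key-prefix routing is a hypothetical convention invented for this example, not anything the PR defines:

```js
// Hypothetical: route to different backends based on a cacheKey prefix,
// all within the single strategy registered via setCacheStrategy.
const hotStore = new Map();   // stand-in for an in-process cache
const coldStore = new Map();  // stand-in for e.g. a Redis client

const pickStore = key => (key.startsWith("hot:") ? hotStore : coldStore);

const forkedStrategy = {
  get: key => {
    const store = pickStore(key);
    return Promise.resolve(store.has(key) ? store.get(key) : null);
  },
  set: (key, value) => {
    pickStore(key).set(key, value);
    return Promise.resolve();
  }
};
```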

Contributor Author

@divmain divmain Feb 15, 2017


Captured potential for enhancement and out-of-band conversation here: #37

@divmain divmain merged commit b3e3524 into master Feb 15, 2017
@divmain divmain deleted the improved-component-caching branch February 15, 2017 05:58