
Feature: implement caching mechanism #6267

Closed
lwansbrough opened this issue Dec 13, 2013 · 27 comments
@lwansbrough
Contributor

Hey @particlebanana et al., I'm thinking it would be very useful to provide a mechanism for specifying caching methods. Personally, I think this functionality belongs within Waterline because of how important the generation of a cached object's key is. I considered doing this outside of Waterline, but then I wouldn't have access to which properties were being populated, etc., in order to build a corresponding key. Here's what I'm thinking:

Specifying a cache for the query would look something like this:

```
User.findOne(123).populate('roles').cache({
    get: function(key) {
        // call to memcached/redis/etc here
    },
    set: function(key, value) {
        // call to memcached/redis/etc here
    }
}).exec(function(err, user) { console.log(user); });
```

The key would be a hashed representation of the normalized criteria/joins/options, etc., and the value would of course be the model. `remove` and `update` handlers could also be specified for `User.update` and `User.remove` queries.

I will probably end up doing this anyway, but I wanted to request comments in case you've got a better solution.

Cheers

@vanetix
Member

vanetix commented Dec 16, 2013

I like the idea, but I would be curious as to what @particlebanana thinks. Adding a cache layer isn't necessarily needed, in my opinion, since it can easily be done above the ORM in an application. This is just my two cents on the matter.

@particlebanana
Contributor

I've thought about adding caching and it's on the roadmap somewhere after associations and transactions are wrapped up. I think it could belong in the ORM if it's a simple lightweight cache layer. If you need anything more complicated it should probably be at a layer above the ORM. Because we have this concept of adapters it should be some kind of pluggable interface where any adapter can be used as a cache adapter.

I would probably prefer you not have to specify a getter/setter for the cache object. You should be able to just specify a cache adapter somewhere, and all find queries will use it unless otherwise specified. Behind the scenes it could work the way you described, using the hash of the options as a key, but the user-facing API shouldn't change.

I'd gladly accept some ideas on what the API would look like and how it would work behind the scenes. If anyone wants to try and tackle it, that would be awesome. I think we should all agree on what it would look like before any code is written, though.

@lwansbrough
Contributor Author

This is why I thought it would be good to ask. :) I think what you've proposed is a good idea (global cache settings), however I could see some cases where it wouldn't be ideal: you may want to handle caching differently for different models, so it would be necessary to have per-model configurations available for things like cache expiration. I'm very much in favour of the adapter approach as long as it doesn't limit verbosity too much - caching is about optimization and as such should be tuneable without too much trouble.


@mikermcneil
Member

@lwansbrough still interested in jumping in on this? It's something we'll be able to help on after v0.10 is stable (built-in transactions support will probably come first, though). But in the meantime, we could ideate on it and even get started.

@lwansbrough
Contributor Author

I may be able to contribute some time to this, sure.

@luislobo
Contributor

luislobo commented Apr 2, 2014

Having long experience with Yii, I can tell you how it works in that framework.
In Yii you have a default cache setting, a dummy one that does just nothing. You can then configure several cache adapters. All have a simple interface (get, set, flushall, and the like). You have memcache, redis, etc.

We could allow a config/cache.js with several cache adapters, one of them being the default, and we could use something like what you said, but simpler:
```
User.findOne(123).populate('roles').cache(60000).exec(function(err, user) { console.log(user); });
```

That would cache to the default adapter and expire after 60000 milliseconds.
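
A hypothetical config/cache.js along those lines might look like the following; none of these keys are real Sails or Waterline settings, the shape is purely illustrative:

```javascript
// Hypothetical config/cache.js sketch (illustrative keys only).
module.exports.cache = {
  default: 'redis',   // adapter used when .cache() names no adapter
  ttl: 60000,         // default expiry in milliseconds
  adapters: {
    redis: { host: '127.0.0.1', port: 6379 },
    dummy: {}         // no-op adapter, like Yii's default
  }
};
```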

Yii also has cache dependencies, which allow you to add several dependencies on when to expire the cache, based on DB, memory, expressions, etc.

Here is Yii's way of doing that: http://www.yiiframework.com/doc/guide/1.1/en/caching.overview

Especially, check 2. Query Caching, here: http://www.yiiframework.com/doc/guide/1.1/en/caching.data

@tjwebb
Contributor

tjwebb commented Aug 25, 2014

Caching could be done at the policy layer instead of globally. In the policies config, requests could be routed through a "cache" policy, and customized per-controller if needed.
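
Sails policies are Express-style middleware, so a request-level cache policy could be sketched like this; the in-memory Map, the key scheme, and the `res.json` wrapping are illustrative assumptions, not an existing Sails feature:

```javascript
// Hypothetical "cache" policy sketch: serve repeat GETs from memory.
const store = new Map();

const cachePolicy = function (req, res, next) {
  const key = req.method + ' ' + req.url;
  if (req.method === 'GET' && store.has(key)) {
    return res.json(store.get(key));                 // cache hit: skip the controller
  }
  const original = res.json.bind(res);
  res.json = function (body) {
    if (req.method === 'GET') store.set(key, body);  // cache on the way out
    return original(body);
  };
  return next();
};

module.exports = cachePolicy;
```

A real version would also need expiry and per-user keys when responses depend on the session.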

@rudeluv

rudeluv commented Oct 17, 2014

@tjwebb do you have any code up for this setup? I'm looking to throw a cache layer in front of my models, and this sounds like a good route.

@lwansbrough
Contributor Author

I would really recommend not implementing your cache at the request level. If you're going to do that, you may as well use the cache built into nginx or something similar, unless you've got some sort of authentication system in place. Even then, putting the caching interface at the request level inside the application seems like a somewhat bad idea. If you're in need of a quick fix, sure. Caching should be implemented at the data-manager or ORM-query level.

@rudeluv

rudeluv commented Oct 17, 2014

@lwansbrough currently I'm implementing it at the controller level using a service, but it does seem a little cumbersome. I don't have quite enough experience to extend Waterline to include a cache layer at the ORM level.

@lwansbrough
Contributor Author

Neither do I. :( It'd be one of the bigger features to develop, I think. I've been wanting to take a look at this for a while, but I've been too busy with the actual work that needs to be done to worry about the framework itself.

@tjwebb
Contributor

tjwebb commented Oct 17, 2014

unless you've got some sort of authentication system in place.

Yea, I love nginx cache, but add session-based auth into the mix and it might not make sense. Sails already supports redis as a session cache; I think leveraging this and extending it into a model cache is a sensible approach.

Manual caching is just sometimes necessary, and it's better to implement it as a framework feature than as ad-hoc bandaids whenever the need arises. Postgres, for example, relies heavily on the OS page cache, meaning that you're sharing that cache with every other application on the system.

@devinivy

Been thinking about it. Here's my brain-dump on the topic.

I think this project would be high-value for the amount of time it will take to implement. I'd suggest starting by implementing a write-around cache, probably at the collection level. Junction tables can inherit caching from their dominant collection. One option is to create the idea of a "cache adapter" which implements certain methods that enable caching when composed with another adapter, i.e. taking advantage of the already-existing-and-sort-of-working `adapter: ['mongo', 'redis']` syntax in the collection definition. One last idea is to let the adapter for the persistent data store optionally determine the key used in the cache.
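
As a rough sketch of that write-around composition (the `storeAdapter` and `cacheAdapter` objects and their methods are assumptions, not the real Waterline adapter interface):

```javascript
// Write-around sketch: reads populate the cache; writes bypass it and
// invalidate so the next read refills from the persistent store.
function withCache(storeAdapter, cacheAdapter) {
  return {
    async find(key, criteria) {
      const hit = await cacheAdapter.get(key);
      if (hit !== undefined) return hit;             // served from cache
      const result = await storeAdapter.find(criteria);
      await cacheAdapter.set(key, result);           // populate on read
      return result;
    },
    async update(criteria, values) {
      await cacheAdapter.clear();                    // coarse invalidation
      return storeAdapter.update(criteria, values);
    }
  };
}
```

A finer-grained scheme would invalidate only the keys a write touches; clearing everything is just the simplest correct behaviour.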

A look at sequelize's solution (in progress):
They're going with a more general "plugin" route, which seems to be an extension of their hooks: sequelize/sequelize#803
Here's the redis cache layer for sequelize: https://github.com/rfink/sequelize-redis-cache
It only hooks into reads, so it's write-around caching.

Perhaps we could use https://www.npmjs.com/package/json-stable-stringify to help create a deterministic hash of find options, and cache using that as a key.

@tjwebb
Contributor

tjwebb commented Jan 23, 2015

I'd be open to using redis as an optional caching layer: you should take a look at waterline-cursor, which is the module I think would benefit most from this.

@devinivy

Where do you see waterline-cursor benefiting from this? Doesn't that module do the in-memory joins? (Perhaps I'm misunderstanding something!) I'm thinking about query/result caching. I guess it could benefit waterline-cursor in the sense that waterline-cursor may not have to be used as often!

@tjwebb
Contributor

tjwebb commented Jan 23, 2015

Yea, e.g. what if you wanted to do a join via waterline-cursor of non-trivial size? It'd be nice not to rely only on RAM.

I like the general idea of a key/value adapter spec.

@dmarcelino changed the title from "Might fork for caching - thoughts?" to "Feature: implement caching mechanism" on May 20, 2015
@stuk88

stuk88 commented Jun 13, 2015

```
User.findOne(123).populate('roles').cache(60000).exec(function(err, user) { console.log(user); });
```

would be awesome

@ronenteva

+1 for @stuk88's idea

@vidia

vidia commented Aug 26, 2015

+1 as well to adding a cache.

@bojidar-bg

It won't help much in pushing things forward, but still... 👍

@sailsbot

Thanks for posting, @lwansbrough. I'm a repo bot-- nice to meet you!

It has been 30 days since there have been any updates or new comments on this page. If this issue has been resolved, feel free to disregard the rest of this message. On the other hand, if you are still waiting on a patch, please:

  • review our contribution guide to make sure this submission meets our criteria (only verified bugs with documented features, please; no questions, commentary, or bug reports about undocumented features or unofficial plugins)
  • create a new issue with the latest information, including updated version details with error messages, failing tests, and a link back to the original issue. This allows GitHub to automatically create a back-reference for future visitors arriving from search engines.

Thanks so much for your help!

@devinivy devinivy reopened this Sep 28, 2015

@luislobo
Contributor

Can anyone reopen this one? It's a feature issue... I guess feature issues should never be auto-closed?

@armellarcier
Contributor

+1 please reopen

@particlebanana
Contributor

This is linked to on the Roadmap. We are trying to keep issues as only issues and track features using the Roadmap. Closing this doesn't mean it's no longer valid. Caching is very cool and something we will keep in mind as features progress.

@atiertant

This is now available in offshore 0.0.4; documentation can be found here.

@johnabrams7 johnabrams7 transferred this issue from balderdashy/waterline May 20, 2019
@vikas-0

vikas-0 commented Sep 28, 2019

this is now available in offshore 0.0.4, documentation could be found here

How do I integrate this into a Sails installation?
