FAQ list #25

Closed

yoonghm opened this issue Sep 4, 2018 · 1 comment

yoonghm commented Sep 4, 2018

Hi, is there a forum or FAQ list where new users could ask questions or find the answers they are looking for?

I could not wait for such a forum or FAQ, so I would like to find out a few things before deciding on a design for a new project. I would like to use Elixir, Ecto, Phoenix, and Nebulex. However, I would like to know the difference between a local and a distributed cache. Does a distributed cache refer to the caches found in several running copies of Nebulex on different hardware boxes? What are the benefits compared to increasing the local cache memory?

cabol (Owner) commented Sep 5, 2018

Hi @yoonghm

> Hi, is there a forum or FAQ list where new users could ask questions or find the answers they are looking for?

Unfortunately not, or at least not yet, but it sounds like a good idea, so I'll open an issue to create a FAQ page or something similar. In the meantime, feel free to open an issue, ping me, or start a discussion on the Elixir forum.

> Does a distributed cache refer to the caches found in several running copies of Nebulex on different hardware boxes?

Yes, a distributed cache means setting up and running the cache across multiple nodes in a cluster (they might be logical or physical nodes). There are several ways to do it, or rather, different topologies for distributed caching, such as Replicated, Partitioned, Near, and so on. This is where Nebulex comes in: it lets you set up all of these topologies very easily; it is just a matter of configuration, as the sketch below shows.
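For instance, here is a minimal sketch of what "topology as configuration" might look like. It assumes a recent Nebulex version (v2.x), where the adapter is passed to `use Nebulex.Cache`; the module and app names (`MyApp.Cache`, `:my_app`) are placeholders.

```elixir
# lib/my_app/cache.ex
defmodule MyApp.Cache do
  # Swapping the adapter is what changes the topology:
  #   Nebulex.Adapters.Local       -> in-process (local) cache
  #   Nebulex.Adapters.Partitioned -> data partitioned across the cluster
  #   Nebulex.Adapters.Replicated  -> every node holds a full copy
  use Nebulex.Cache,
    otp_app: :my_app,
    adapter: Nebulex.Adapters.Partitioned
end
```

The calling code stays the same regardless of the adapter, e.g. `MyApp.Cache.put("key", value)` and `MyApp.Cache.get("key")`.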

> What are the benefits compared to increasing the local cache memory?

If you are going to deploy your app on a single node, a local or in-process cache is more than enough, and yes, you can increase your memory to get more capacity. A local cache gives the best performance, since there is no network latency as in a distributed cache; even so, a distributed cache is still much better than not caching at all. On the other hand, if you deploy your app on multiple servers/nodes behind a load balancer, you will end up with as many caches as application instances, each with a different state, which results in inconsistency. Besides, in the worst case, all cache instances will store almost the same data, which means that if the maximum expected size for the cache is X bytes, you will end up consuming N times that size, where N is the number of cache instances. Now, one of the most interesting parts of Nebulex is that it allows you to set up different topologies (check out the examples), and you can take advantage of both local and distributed caching by setting up a Near-Cache topology (see the sketch below).
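As a rough illustration, a Near-Cache setup could be sketched with Nebulex's multilevel adapter roughly as follows. This is an assumption-laden example: the level module names and the commented `:levels` configuration key are placeholders you would adjust to the Nebulex version you actually use.

```elixir
# A near-cache pairs a fast local (L1) cache in front of a
# distributed (L2) cache, so hot keys are served in-process.
defmodule MyApp.NearCache do
  use Nebulex.Cache,
    otp_app: :my_app,
    adapter: Nebulex.Adapters.Multilevel

  # L1: in-process cache, lowest latency
  defmodule L1 do
    use Nebulex.Cache,
      otp_app: :my_app,
      adapter: Nebulex.Adapters.Local
  end

  # L2: partitioned cache shared by all nodes in the cluster
  defmodule L2 do
    use Nebulex.Cache,
      otp_app: :my_app,
      adapter: Nebulex.Adapters.Partitioned
  end
end

# config/config.exs -- the :levels option name is an assumption;
# check the Nebulex docs for the exact keys in your version.
# config :my_app, MyApp.NearCache,
#   levels: [
#     {MyApp.NearCache.L1, []},
#     {MyApp.NearCache.L2, []}
#   ]
```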

Summing up: distributed or local cache, the decision depends on the kind of system or app you want to build, how you plan to deploy it (distributed or not), and the amount of data you are going to store; if we are talking about large data, a distributed cache is probably the way to go. The best thing is that Nebulex is not only a cache library, it is more like a caching framework, where you can get the best of both distributed and local caching, different distributed topologies, adapters (built-in, Redis, Memcached, etc.), and so on.

Additionally, you can take a look at this post: In-Process Caching vs. Distributed Caching.

I hope this has been helpful; if not, don't hesitate to reach out to me again :)

cabol closed this as completed Nov 24, 2018