perf: cache our arrays as json files #303

Closed
pascalchevrel opened this Issue Jun 27, 2014 · 2 comments

Member

pascalchevrel commented Jun 27, 2014

After some experimenting and talking to Nicolas Pierron at the Paris office (he works on IonMonkey performance), I found an easy performance gain to implement on the site: caching our big arrays of translations per repo as JSON files on first use, then loading the JSON file instead of including the PHP file. The reason is that JSON is a strict and simple format to parse, so decoding it is less work for the engine than including a PHP file that contains just an array: a PHP file can contain any piece of code, therefore the engine doesn't know what to expect.
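
To make the idea concrete, here is a minimal sketch of such a cache-on-first-use helper. It is not the actual site code: the paths, the function name and the assumption that the per-repo PHP file returns its translation array are all hypothetical.

```php
<?php

/**
 * Hypothetical sketch: return the translation array for a repo/locale pair,
 * building a JSON cache on first use and reading it back on later requests.
 */
function getTranslations($repo, $locale, $cache_dir = '/tmp/cache')
{
    $source_file = "/path/to/TMX/{$repo}/{$locale}/cache_{$locale}.php"; // hypothetical layout
    $cache_file  = "{$cache_dir}/{$repo}_{$locale}.json";

    // Rebuild the cache if it is missing or older than the PHP source file
    if (! is_file($cache_file) || filemtime($cache_file) < filemtime($source_file)) {
        $tmx = include $source_file; // slow path: assumes the file returns the array
        file_put_contents($cache_file, json_encode($tmx), LOCK_EX);

        return $tmx;
    }

    // Fast path: decode the JSON dump instead of parsing/compiling PHP
    return json_decode(file_get_contents($cache_file), true);
}
```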

Here is the before/after result on our most intensive view, the entity view (an entity search across all locales for Firefox Desktop):

Before:
Memory peak: 18874368 (17.25MB)
Elapsed time (s): 5.7142

After:
Memory peak: 13369344 (12.57MB)
Elapsed time (s): 1.5571

We get performance and memory gains on all the views working with strings; here are the numbers for the main search view:

Before:
Memory peak: 22020096 (19.57MB)
Elapsed time (s): 0.1894

After:
Memory peak: 20709376 (19.09MB)
Elapsed time (s): 0.0837

On my local server, if I simulate 500 requests to the API in batches of 100 simultaneous requests, I get 12 requests/s before the patch and 29 after.

@pascalchevrel pascalchevrel self-assigned this Jun 27, 2014


Member

pascalchevrel commented Jun 27, 2014

On our real server, for the same API call:
Before:
Requests per second: 22.85 #/sec
Time per request: 4377.272 ms
After:
Requests per second: 38.30 #/sec
Time per request: 2610.636 ms


Member

pascalchevrel commented Sep 29, 2014

For the record, we are not storing the data as JSON but as serialized PHP data (serialize()/unserialize()), which gave better results than storing it as JSON.
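
For illustration, this is roughly what the serialized variant looks like, again as a hypothetical sketch with made-up paths and names rather than the real code:

```php
<?php

/**
 * Hypothetical sketch: same cache-on-first-use flow as the JSON version,
 * but the on-disk format is PHP's native serialization.
 */
function getTranslationsSerialized($repo, $locale, $cache_dir = '/tmp/cache')
{
    $source_file = "/path/to/TMX/{$repo}/{$locale}/cache_{$locale}.php"; // hypothetical layout
    $cache_file  = "{$cache_dir}/{$repo}_{$locale}.cache";

    if (! is_file($cache_file) || filemtime($cache_file) < filemtime($source_file)) {
        $tmx = include $source_file; // assumes the file returns the array
        file_put_contents($cache_file, serialize($tmx), LOCK_EX);

        return $tmx;
    }

    // unserialize() rebuilds the native PHP array directly
    return unserialize(file_get_contents($cache_file));
}
```

A plausible explanation for the difference is that unserialize() restores native PHP arrays directly, while json_decode() has to validate UTF-8 and map JSON types back onto PHP ones.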
