perf: cache our arrays as json files #303
After some experimenting and talking to Nicolas Pierron at the Paris office (he works on IonMonkey performance), I found an easy performance gain for the site: cache our big per-repo translation arrays as JSON files on first use, and read the JSON file on later requests instead of including the PHP file. The reason is that JSON is a strict, simple format to parse, so decoding it is less work for the engine than including a PHP file that holds nothing but an array: a PHP file can contain arbitrary code, so the engine doesn't know what to expect.
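The idea can be sketched roughly like this (a minimal illustration, not the actual patch; the function name, file naming scheme, and cache directory are hypothetical):

```php
<?php
// Hypothetical helper: load a translation array from a PHP file that
// returns an array, caching it as JSON on first use so that later
// requests go through json_decode() instead of include().
function getTranslations($php_file, $cache_dir)
{
    $json_file = $cache_dir . '/' . basename($php_file, '.php') . '.json';

    // Build (or rebuild) the cache if it is missing or stale.
    if (!is_file($json_file) || filemtime($json_file) < filemtime($php_file)) {
        $strings = include $php_file; // the PHP file just returns a big array
        file_put_contents($json_file, json_encode($strings));

        return $strings;
    }

    // Fast path: parsing JSON is cheaper than executing a PHP file.
    return json_decode(file_get_contents($json_file), true);
}
```

The `filemtime()` comparison keeps the cache transparent: editing the source PHP file invalidates the JSON copy, so callers never need to clear the cache by hand.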
Here are the before/after results for our most intensive view (entity search across all locales for Firefox Desktop):
We get performance and memory gains on all the views that work with strings; here is the main search view:
On my local server, simulating 500 requests to the API in batches of 100 simultaneous requests, I get 12 requests/s before the patch and 29 after.
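A load test matching those numbers can be reproduced with ApacheBench (the endpoint URL below is a placeholder, not the real API route):

```shell
# 500 requests total, 100 concurrent; URL is hypothetical
ab -n 500 -c 100 "http://localhost/api/"
```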