Improve Initialization Time #29
Comments
I haven't done any profiling work, so I haven't attempted to optimize startup speed. At 3 MB, I think it is going to be difficult to make it much more efficient. Are you triggering a micro service in multiple places? The best I can recommend at this point is to launch a single micro service once and then use the emitter interface to send multiple requests to it. That way you only endure the startup pain once per launch. Are you using […]? I will put this into my backlog of things to investigate.
Thanks for the reply. I am currently using the liquidserver to serve the JS file from raw resources, not the network. I haven't tested what kind of delay network calls would add, but I'm currently only planning to use the local resource. It's a pretty big file, though: it was 3 MB minified and about 7 MB before that (around 200,000 lines). I am also only triggering the micro service once, so it's just the initial 2.5-second added delay.

I was actually able to reduce the dependencies and get the file size down to 500 KB minified. This reduced the delay to about 1.2 seconds total until the READY event. I also noticed that minified vs. non-minified didn't make a huge difference in the delay (~100 ms or so), even though the size was about half.

I was wondering if there is a way to remove the SQLite dependency and all the file system code that runs at the beginning, and just set up the node environment bare (not sure if this is technically possible)? That could help reduce the initial 500 ms delay I see before the onStart of the micro service. Thanks again,
You can remove these things, but it would require rebuilding the library. How much do you really require node? If it is raw JavaScript (and you don't need networking and other node built-ins), then you can just use the JavaScript engine directly.
I could probably get around using node if I polyfill some of the APIs. With the JavaScript engine, I wouldn't be able to use the event-based API though, correct? I'll look into how much time it would save to use the engine directly.

One thing I was wondering: is there a way to prevent having to go through the setup every time the user opens the app? Would it be possible to persist the filesystem and avoid having to do this setup every time, instead doing it just the first time? It would be great to be able to persist the id and just look up the micro service if it's already initialized. Not sure if that is possible? Specifically talking about these steps:
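The polyfill route mentioned above can be sketched as follows. These are hypothetical helpers, not part of LiquidCore or any library in this thread: which node built-ins actually need stubbing depends entirely on what the bundle calls, and the synchronous stand-ins below are only adequate for simple cases.

```javascript
// Sketch: install minimal stand-ins for node globals a bundle may expect
// when running on a bare JS engine with no node environment.
function makeStubs() {
  return {
    // A bare engine has no event loop; running callbacks synchronously is a
    // crude stand-in that breaks real async code.
    setTimeout: function (fn) { fn(); return 0; },
    process: { env: {}, nextTick: function (fn) { fn(); } },
    console: { log: function () {}, error: function () {} },
  };
}

// Install only the stubs that are actually missing from the target scope,
// so a real implementation is never overwritten.
function installMissing(scope, stubs) {
  const installed = [];
  for (const name of Object.keys(stubs)) {
    if (typeof scope[name] === 'undefined') {
      scope[name] = stubs[name];
      installed.push(name);
    }
  }
  return installed;
}
```

A script like this would be evaluated in the bare context before the main bundle, with the engine's global object passed as `scope`.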
FYI, I was able to get the code running using the JavaScript engine directly. This brought the initialization time down to ~800 ms, much better than the ~1300 ms with a micro service. Here is a breakdown of the ~800 ms: Creating JSContext: 202 ms
Another update: I decided to try out AndroidJSCore, the library you deprecated in favor of this one, and found some interesting results. It is much faster than LiquidCore in terms of initialization. Here are the same times for AndroidJSCore on the same device, with the same code: Creating JSContext: 46 ms. I guess JavaScriptCore can initialize and process the JavaScript code faster than V8? Another thing to note: I also compared LiquidCore to J2V8 and saw roughly the same times. The recording can be seen below. Creating JSContext: 205 ms
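Per-stage numbers like "Creating JSContext: 202 ms" can be collected with a small stopwatch that records the elapsed time between named marks. A minimal sketch; the class and stage names are illustrative and not part of any library discussed here:

```javascript
// Minimal per-stage stopwatch for startup profiling.
class StageTimer {
  constructor() {
    this.timings = {};        // stage name -> elapsed milliseconds
    this.last = Date.now();
  }
  // Record the time elapsed since the previous mark under `name`.
  mark(name) {
    const now = Date.now();
    this.timings[name] = now - this.last;
    this.last = now;
    return this.timings[name];
  }
}
```

Usage would be a call like `timer.mark('Creating JSContext')` immediately after each stage completes, then dumping `timer.timings` once READY fires.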
@ericwlange It looks like the main way to reduce the context creation time would be to use the V8 snapshots feature. Have you looked into this at all? Also, it looks like there were some significant performance improvements released for V8. More details here: https://v8project.blogspot.com/2017/05/launching-ignition-and-turbofan.html. These changes were released in Node.js version 8.3.0. Do you have any plans to update?
I haven't looked at the snapshots feature yet. Thanks for sending it. I did watch the Chrome Dev Summit livestream, and the V8 team mentioned that the newest version is 22% faster and that the node community has been integrating it. I will let it settle a bit and then upgrade the version of node. In the meantime, I will look into snapshots.
Quick update on this. I just checked in a massive update, which upgrades to node 8.9.3 and enables default snapshots. I haven't profiled it yet, but loading a […] I will also expose an API to create/use a custom snapshot so that your giant JS file can be snapshotted as well. This will only work on pure JavaScript (i.e., not in a node […])
Startup is significantly faster:

1. Node upgraded to version 8.9.3 with V8 improvements
2. V8 default snapshots enabled for quick startup
3. All filesystem setup happens in a single JavaScript synchronization block
4. /home/node_modules only copies the first time a new version of the library is installed

Also sort-of fixed issues with 32- vs. 64-bit builds. At the moment the debug library is always used (with symbols); still need to make that happen only on debug builds.
This has been resolved in Release 0.4.0.
Hello,
Thanks for developing such an awesome library! I had one question about initialization time, specifically the time it takes from calling start on a micro service to getting the onStart callback, and then getting my "READY" callback after my JS file has loaded. On average, from the time I call start on the micro service to the time I get the onStart callback, I see about 500 ms of overhead. Then from onStart to the time I get my "READY" callback at the end of my JS file (which is fairly large at about 3 MB), it's about another 2 seconds. I was wondering if anyone else has experienced anything like this and found a way to reduce this overhead.
Thanks,
Tyler