
Slow performance #49

Closed
VarunAgw opened this issue Dec 30, 2015 · 15 comments

Comments

@VarunAgw
Copy link

I've recently installed this, but it feels a bit slow.

I did time tldr ls

It says:

real    0m0.477s
user    0m0.405s
sys     0m0.082s

It would be better if it could work faster. 500 ms is a bit too high.

@igorshubovych
Copy link
Collaborator

Which version of NodeJS do you have?
How fast do you think it should work?

@VarunAgw
Copy link
Author

I checked it under v0.10.x and v4.x (both Ubuntu). I don't know exactly, but I feel 50 ms would be good enough.

@igorshubovych
Copy link
Collaborator

It runs for 310-390 ms on my MacBook Air / OSX.
Empty test.js runs for 80ms.
Gulp with no Gulpfile (so it does nothing) runs for 290-360ms.

So I would say it is hard to make it significantly faster. I may try to exclude some modules, but I doubt I can reduce it even to 200 ms.

If the speed is critical for you I would recommend using either C++ or Go client:
https://github.com/tldr-pages/tldr#clients

The NodeJS client's goals were different:

  • reference client implementation
  • speed of development

@rprieto
Copy link
Contributor

rprieto commented Dec 30, 2015

Agreed with @igorshubovych. It would take a lot of profiling to bring it to ~200ms, not sure it's the top priority unfortunately.

But it does raise another question: having a lot of clients is nice for encouraging open source and generating lots of ideas, but it's very odd to have to

  • direct people to certain clients if they want certain features, at the detriment of other features
  • always ask which client people are using

Let's have a chat on Gitter about ways we can make this work better.

@VarunAgw
Copy link
Author

Hi, I've created a simple script which somewhat solves this problem (by acting as a cache).

function tldr () {
    mkdir -p /tmp/tldr_cache

    # Flush the cache on request
    if [ "$1" = "--flush-cache" ]; then
        rm -f /tmp/tldr_cache/*
        return
    fi

    # Pass any other flag straight through to the real client
    if [ "${1:0:2}" = "--" ]; then
        command tldr "$@"
        if [ "$1" = "--help" ]; then
            echo -e "To flush the custom cache:\n"
            echo -e "    $ tldr --flush-cache"
        fi
        return
    fi

    # Serve the page from the cache if present; otherwise fetch and cache it
    if [ -f "/tmp/tldr_cache/$1" ]; then
        cat "/tmp/tldr_cache/$1"
    else
        script -q -c "tldr $1" "/tmp/tldr_cache/$1"
        sed '1d' "/tmp/tldr_cache/$1" > "/tmp/tldr_cache/$1.tmp"
        mv "/tmp/tldr_cache/$1.tmp" "/tmp/tldr_cache/$1"
    fi
}

@VarunAgw
Copy link
Author

@rprieto I personally don't like the concept of multiple clients. I tried several tldr-pages clients today; each of them was either lacking something or slow. So instead of 10, I'd prefer a single good client.

This client has color coding, and now with caching I am good to go!

@waldyrious
Copy link
Member

That's an interesting point: we could provide basic specs expected from clients, such as the ability to color tokens, perform search, caching, work on dark or light backgrounds, and so on. That way we get the benefit of clients specialized on e.g. speed, or working on specific platforms or environments, without losing on the functionality side.

@leostera
Copy link
Contributor

@VarunAgw you seem to be a smart fella, why don't you have a crack at the node-client and try to make it a bit faster?

@waldyrious 👍

@amitzur
Copy link

amitzur commented Dec 31, 2015

I did a bit of profiling, and it seems like the bulk of the time comes from require("request"): it takes ~250 ms on my MacBook Pro.

@rprieto
Copy link
Contributor

rprieto commented Dec 31, 2015

Thanks for that @amitzur! Might be worth checking if we can use raw require('http') instead.

Just to confirm, if you comment out require('request') do we incur the loading time somewhere else, or does it really save 250ms? (should not crash if you have a local cache)
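One low-risk way to avoid paying that cost on cached runs is lazy loading: only require the heavy module when a network call actually happens. A minimal sketch (the helper name getHttp is illustrative, not from the tldr-node-client code, and http stands in for whatever HTTP module is chosen):

```javascript
// Lazy loading sketch: the module is not loaded at startup, so commands
// that are served entirely from the local cache never pay its require() cost.
let http = null; // deferred; stays null until the first network request

function getHttp() {
  if (!http) {
    http = require('http'); // loaded on first use, then reused
  }
  return http;
}
```

With this pattern the startup penalty moves from every invocation to only those that actually hit the network.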

@amitzur
Copy link

amitzur commented Dec 31, 2015

It does take off ~50% on my machine: from 500 ms to 250 ms on average. There are other things I didn't track down that make up the remaining 250 ms.

@amitzur
Copy link

amitzur commented Dec 31, 2015

require("unzip2") also takes a few dozen ms.

@igorshubovych
Copy link
Collaborator

Maybe we can use node-fetch instead of request.

@igorshubovych
Copy link
Collaborator

Please update to 1.4.0.
Thanks for the idea and the investigation!

@VarunAgw
Copy link
Author

Thanks! That's a huge improvement.
