
Importance of caching algorithms #8083

Open
CaedenPH opened this issue Jan 6, 2023 · 3 comments
Labels
enhancement This PR modified some existing files

Comments

@CaedenPH
Contributor

CaedenPH commented Jan 6, 2023

Feature description

This is an algorithms-based repository that focuses not only on time complexity but also on readability and understandability. That said, certain algorithms are benchmarked for speed comparisons, and implementing a caching system can vastly improve their speed.

For example, the recursive Fibonacci function found here
https://github.com/TheAlgorithms/Python/blob/master/maths/fibonacci.py#L63

averages about 6.0 ms when timed, on my machine at least. However, wrapping it with functools.lru_cache speeds it up to roughly 0.0 ms, matching the other algorithms.

The example with caching:

from functools import lru_cache


def fib_recursive(n: int) -> list[int]:
    # Memoize the recursive helper so each term is computed only once.
    @lru_cache(maxsize=None)
    def fib_recursive_term(i: int) -> int:
        if i < 0:
            raise ValueError("n is negative")
        if i < 2:
            return i
        return fib_recursive_term(i - 1) + fib_recursive_term(i - 2)

    if n < 0:
        raise ValueError("n is negative")
    return [fib_recursive_term(i) for i in range(n + 1)]
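To make the effect of the cache visible, lru_cache exposes cache_info(), which reports how many recursive calls were answered from the cache versus computed fresh. A minimal standalone sketch (not the repository's code; the function name here is illustrative):

```python
from functools import lru_cache


@lru_cache(maxsize=None)
def fib(i: int) -> int:
    """Naive recursive Fibonacci, memoized by lru_cache."""
    if i < 0:
        raise ValueError("i is negative")
    if i < 2:
        return i
    return fib(i - 1) + fib(i - 2)


print(fib(30))           # 832040
print(fib.cache_info())  # 31 misses: each of fib(0)..fib(30) computed once
```

Without the cache, fib(30) would make over a million recursive calls; with it, every subproblem is computed exactly once.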
@CaedenPH CaedenPH added the enhancement This PR modified some existing files label Jan 6, 2023
@CaedenPH
Contributor Author

CaedenPH commented Jan 6, 2023

@cclauss Maybe it could take the form of adding another function with caching implemented, mentioning it in the README, or adding notes about caching to the docstrings. What is your opinion?

@cclauss
Member

cclauss commented Jan 7, 2023

I love the notion of side-by-side implementations without cache and with cache. The first would be easier to read / understand and the second would be more performant. We would need timeit or similar benchmarks to prove the advantage of caching. This sounds like a great way to learn the what, why, and how of caching.
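A minimal sketch of what such a side-by-side pair with a timeit benchmark could look like (function names and the iteration count are illustrative, not a committed design):

```python
import timeit
from functools import lru_cache


def fib_no_cache(i: int) -> int:
    """Readable but exponential: recomputes the same subproblems repeatedly."""
    if i < 2:
        return i
    return fib_no_cache(i - 1) + fib_no_cache(i - 2)


@lru_cache(maxsize=None)
def fib_with_cache(i: int) -> int:
    """Identical recursion, but each subproblem is computed only once."""
    if i < 2:
        return i
    return fib_with_cache(i - 1) + fib_with_cache(i - 2)


if __name__ == "__main__":
    # number=1 because the uncached version is slow; a real benchmark
    # would average over more runs.
    slow = timeit.timeit(lambda: fib_no_cache(25), number=1)
    fib_with_cache.cache_clear()  # measure from a cold cache
    fast = timeit.timeit(lambda: fib_with_cache(25), number=1)
    print(f"no cache:   {slow:.6f}s")
    print(f"with cache: {fast:.6f}s")
```

Keeping both versions in the same file would let readers see the readability/performance trade-off and rerun the benchmark themselves.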

@CaedenPH
Contributor Author

CaedenPH commented Jan 7, 2023

I love the notion of side-by-side implementations without cache and with cache. The first would be easier to read / understand and the second would be more performant. We would need timeit or similar benchmarks to prove the advantage of caching. This sounds like a great way to learn the what, why, and how of caching.

Awesome, I will make some pull requests implementing this
