
Debugger experiments #11441

Merged
merged 62 commits into from Mar 8, 2024

Conversation

@kubouch (Contributor) commented Dec 29, 2023

Description

This PR adds a new evaluator path with callbacks to a mutable trait object implementing a Debugger trait. The trait object can do anything, e.g., profiling, code coverage, step debugging. Currently, entering/leaving a block and a pipeline element are marked with callbacks, but more callbacks can be added as necessary. Not all callbacks need to be used by all debuggers; unused ones are simply empty calls. A simple profiler is implemented as a proof of concept.

The debugging support is implemented by making the eval_xxx() functions generic over whether we're debugging or not. This has zero computational overhead, but makes the binary slightly larger (see benchmarks below). The eval_xxx() variants called from commands (like eval_block_with_early_return() in each) are chosen via dynamic dispatch, for two reasons: to avoid growing the binary by duplicating the code of many commands, and because it isn't possible anyway: adding a generic parameter to the Command trait's methods would make the trait no longer object-safe.

In the future, I hope it will be possible to allow plugin callbacks such that users would be able to implement their profiler plugins instead of having to recompile Nushell. DAP would also be interesting to explore.

Try help debug profile.

Screenshots

Basic output:
[screenshot: profiler_new]

To profile with more granularity, increase the profiler depth (you'll see that repeated is-windows calls take a large chunk of total time, making it a good candidate for optimizing):
[screenshot: profiler_new_m3]

Benchmarks

Binary size

Binary size increase vs. main: +40360 bytes. (Both built with --release --features=extra,dataframe.)

Time

```nushell
# bench_debug.nu
use std bench

let test = {
    1..100
    | each {
        ls | each {|row| $row.name | str length }
    }
    | flatten
    | math avg
}

print 'debug:'
let res2 = bench { debug profile $test } --pretty
print $res2
```
```nushell
# bench_nodebug.nu
use std bench

let test = {
    1..100
    | each {
        ls | each {|row| $row.name | str length }
    }
    | flatten
    | math avg
}

print 'no debug:'
let res1 = bench { do $test } --pretty
print $res1
```

cargo run --release -- bench_debug.nu is consistently 1-2 ms slower than cargo run --release -- bench_nodebug.nu due to the collection overhead plus gathering the report. This is expected. When collecting more data, the overhead is correspondingly higher.

Between cargo run --release -- bench_nodebug.nu and nu bench_nodebug.nu, I didn't measure any difference. Both benchmarks report times between 97 and 103 ms randomly, with neither consistently higher than the other. This suggests that, at least in this particular case, there is no runtime overhead when not running any debugger.

API changes

This PR adds a generic parameter to all eval_xxx functions that forces you to specify whether you use the debugger. You can resolve it in two ways:

  • Use a provided helper that will figure it out for you. If you wanted to use eval_block(&engine_state, ...), call let eval_block = get_eval_block(&engine_state); eval_block(&engine_state, ...)
  • If you know you're in an evaluation path that doesn't need debugger support, call eval_block::<WithoutDebug>(&engine_state, ...) (this is the case for hooks, for example).

I tried to add more explanation in the docstring of debugger_trait.rs.

TODO

  • Better profiler output to reduce spam of iterative commands like each
  • Resolve TODO: DEBUG comments
  • Resolve unwraps
  • Add doc comments
  • Add usage and extra usage for debug profile, explaining all columns

User-Facing Changes

Hopefully none.

Tests + Formatting

After Submitting

@fdncred (Collaborator) commented Feb 26, 2024

Is there a way to move forward with this, or should we leave it until after the next release?

@kubouch (Contributor, Author) commented Feb 26, 2024

It's almost ready, I just wanted to make the profiler a bit more useful.

@stormasm (Contributor) commented:

@kubouch great news !
I really hope this gets landed before the next release 😄

@kubouch kubouch added the pr:api-change This PR should be mentioned in #api-updates channel on Discord label Mar 6, 2024
@kubouch kubouch marked this pull request as ready for review March 6, 2024 21:58
@kubouch kubouch added this to the v0.92.0 milestone Mar 7, 2024
@kubouch kubouch added pr:release-note-mention Addition/Improvement to be mentioned in the release notes pr:commands This PR changes our commands in some way pr:screenshot This PR has a screenshot that could go to release notes labels Mar 7, 2024
@kubouch kubouch merged commit 14d1c67 into nushell:main Mar 8, 2024
19 checks passed
@kubouch kubouch deleted the debugger-test branch March 8, 2024 18:21
@devyn (Contributor) commented Mar 15, 2024

Hi @kubouch, I just noticed something that I'm not totally sure is right or wrong:

In crates/nu-engine/src/eval.rs, eval_block(), there are early returns on failure, e.g.:

```rust
if failed {
    // External command failed.
    // Don't return `Err(ShellError)`, so nushell won't show an extra error message.
    return Ok(output);
}
```

but I notice that in these cases, D::leave_block() is not called. Was that intentional?

@kubouch (Contributor, Author) commented Mar 15, 2024

I might have missed that; it's not intentional.
