This repository was archived by the owner on Mar 7, 2019. It is now read-only.

Conversation

calavera
Contributor

This change makes it possible to analyze historic data without having to run pprof in real time. It's also useful when you can't connect go-torch to a production server but the server has pprof installed.

It adds a `load` flag to the CLI. The flag takes the path to a file of raw data. Go-torch tries to run the profiler as usual if the flag is empty.

PS: Thanks for this project, it has already saved me lots of time.

Signed-off-by: David Calavera david.calavera@gmail.com

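A minimal sketch of the behavior described above, assuming a hypothetical standalone `main` that wires a `-load` flag the same way; the profile URL is only an example and this is not the PR's actual code:

```go
package main

import (
	"flag"
	"fmt"
	"os"
	"os/exec"
)

func main() {
	load := flag.String("load", "", "path to saved raw pprof output; run pprof live when empty")
	flag.Parse()

	var raw []byte
	var err error
	if *load != "" {
		// Historic data: read a raw pprof dump that was saved earlier.
		raw, err = os.ReadFile(*load)
	} else {
		// No file given: fall back to profiling a live server, roughly
		// what go-torch does by default (the URL is only an example).
		raw, err = exec.Command("go", "tool", "pprof", "-raw",
			"http://localhost:8080/debug/pprof/profile").Output()
	}
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Printf("collected %d bytes of raw profile data\n", len(raw))
}
```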
@sandlerben
Contributor

Does this differ from the `binaryinput`/`binaryname` flags?

@prashantv
Contributor

This is slightly different, since it's the raw text format rather than the binary pprof file. The raw output contains the sample data as well as the symbols, while the binary only contains the sample data.

However, I'm wondering whether it's possible to use the standard binary formats rather than adding another option.

@calavera Is it possible to use `-proto` instead of `-raw` when calling pprof and pass the result in via `binaryinput`?
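For context, a rough sketch of the two capture paths being compared here, assuming a standard net/http/pprof endpoint; the `capture` helper, file names, and URL are illustrative only, and the mapping of `-proto` output to `binaryinput` is the suggestion above, not verified go-torch behavior:

```go
package main

import (
	"fmt"
	"os"
	"os/exec"
)

// capture runs `go tool pprof` with one output-format flag (-raw or -proto)
// against url and writes whatever pprof prints to stdout into path.
func capture(format, url, path string) error {
	out, err := exec.Command("go", "tool", "pprof", format, url).Output()
	if err != nil {
		return err
	}
	return os.WriteFile(path, out, 0o644)
}

func main() {
	url := "http://localhost:8080/debug/pprof/profile" // example endpoint only

	// -raw: text dump with samples plus symbols; the format this PR's
	// load flag would read back later.
	if err := capture("-raw", url, "cpu.raw"); err != nil {
		fmt.Fprintln(os.Stderr, "raw capture failed:", err)
	}

	// -proto: binary profile; the form suggested above for binaryinput.
	if err := capture("-proto", url, "cpu.pb.gz"); err != nil {
		fmt.Fprintln(os.Stderr, "proto capture failed:", err)
	}
}
```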

@calavera
Contributor Author

I guess it is. I admit this is a bit of a hack to inspect several historic files that I was keeping around.

I think it would be useful to have more documentation about how to use this program with pre-generated data, but this PR doesn't solve that problem, so I'm closing it.

calavera closed this Nov 30, 2015
calavera deleted the load_flag branch Nov 30, 2015