
LogGraph #32

Merged
merged 19 commits into PhilipVinc:master on Jun 28, 2019

Conversation

@shashikdm (Contributor) commented on May 28, 2019

Started implementation of log_graph. I am aiming to support Flux models. As discussed earlier in Slack, I. am using TrackedArrays to build the graph as follows.
2. Take dummy input along with the model (Chain type)
3. Apply the model to dummy input layer by layer with each layer creating a subgraph
4. Within each layer, nodes are created from the output to input by finding all the Tracked variables recursively
5. size of output of each node is also determined in the previous step
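
A rough sketch of the traversal in steps 1 and 2 (purely illustrative; trace_sizes is a hypothetical helper written against the 2019-era Flux API, and it only records output sizes rather than building the actual subgraphs):

using Flux

function trace_sizes(model::Chain, dummy_input)
    x = dummy_input
    sizes = Tuple{String,Tuple}[]
    for (i, layer) in enumerate(model.layers)
        x = layer(x)                      # apply one layer; one subgraph per layer
        push!(sizes, (string(nameof(typeof(layer)), "_", i), size(x)))
    end
    return sizes                          # output size per layer, as in step 4
end

trace_sizes(Chain(Dense(4, 3, relu), Dense(3, 2)), rand(Float32, 4))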

@oxinabox (Contributor) commented on May 28, 2019

My feeling is we should support some intermediate format.
Some data type / collection of types that just captures the minimal structure of the graph that TensorBoard can display.

Then we should make that work, and merge the PR once it does.
After that we can think about writing an adaptor that converts Flux models to that format, as a separate PR.

It might be nice to look at https://github.com/JuliaPlots/GraphRecipes.jl
and at https://github.com/JuliaGraphs/GraphPlot.jl
for inspiration.
Though those are a bit tied to how LightGraphs thinks of graphs.
Which is not a bad thing, but it does tend to focus on a very mathematical way of looking at them.
(They are not far removed from adjacency matrices.)
For a more abstract perspective, look at https://github.com/Keno/AbstractTrees.jl

@shashikdm (Contributor, Author) commented on May 29, 2019

@oxinabox As you proposed in Slack, a function asgraph that creates an AbstractGraph from a Flux model would be nice and more general. I believe converting an AbstractGraph to a TensorBoard graph is feasible, but how feasible is it to write asgraph for the Flux library?
In my mind, this is precisely what I want to do:

  • Call the model on some sample data.
  • Inspect every line (source or IR (CodeInfo)). This should work recursively.

In short, I want to do what you suggested with uncompressed_ast, but while executing, because uncompressed_ast does not give any information about what happens inside Flux.applychain.
I tried @code_lowered model(data), but I cannot go inside recursively. How can I inspect code recursively and get information like @code_lowered does?

julia> IR = @code_lowered model(data)
CodeInfo(
33 1 ─ %1 = (Base.getproperty)(c, :layers)                                                                                                │
   │   %2 = (Flux.applychain)(%1, x)                                                                                                      │
   └──      return %2                                                                                                                     │
)
@PhilipVinc (Owner) commented on Jun 3, 2019

Exposing some intermediate representation is a good idea, and we can take inspiration from the relevant protobuffers, namely Graph, NodeDef, FunctionDef, Tensor and TensorShape (I might be missing some). (Given that the IR will later be lowered to those protobuffers, we might even directly expose them, providing some smarter constructors, but I'm not sure that would be a good idea.)

For the structure, I would roll a custom IRGraph <: AbstractDiGraph (or forgo the dependency altogether, as we don't really need it). IRGraph is a directed graph that holds nodes of some type (function, tensor, nested IRGraph, ...). IRGraph should also hold some special nodes (the slot names in the AST).
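
One possible layout for such a type, forgoing the LightGraphs dependency as suggested (the field and type names here are illustrative assumptions, not an agreed API):

struct IRNode
    name::String
    op::Symbol                     # e.g. :call, :getproperty, :constant
    payload::Any                   # a function, a tensor, or a nested IRGraph
end

struct IRGraph
    nodes::Vector{IRNode}
    edges::Vector{Pair{Int,Int}}   # directed edges between node indices
    slots::Vector{Symbol}          # the special slot-name nodes from the AST
end

Lowering something like this to the Graph/NodeDef protobuffers would then be a mechanical traversal.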

@PhilipVinc (Owner) commented on Jun 3, 2019

@shashikdm Provided the code is type-stable, I think that if you inspect the CodeInfo object returned by InteractiveUtils.code_lowered(c, Base.typesof(b)) you should be able to extract the function names and argument types, so that you can call it recursively.

If the code is type-unstable, I believe you can't avoid actually running the type-unstable part of the code and calling Base.typesof on its output.
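
A rough sketch of the type-stable case (it only lists the call expressions one level down; actually recursing would require resolving each callee from its GlobalRef and computing its argument types):

function list_calls(f, args...)
    ci = first(code_lowered(f, Base.typesof(args...)))   # lowered CodeInfo for this call
    for ex in ci.code
        if ex isa Expr && ex.head === :call
            println(ex.args[1])                          # the callee, usually a GlobalRef
        end
    end
end

list_calls(hypot, 3.0, 4.0)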

@shashikdm (Contributor, Author) commented on Jun 10, 2019

I have implemented the bare bones of log_graph, which can log arbitrary Julia graphs. It takes optional arguments nodelabel, nodeop, nodedevice and nodevalue to determine each node's name, op, device and attributes respectively, with appropriate defaults.
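
For reference, a hypothetical call based on that description; the argument order, keyword names and value types below are assumptions and may not match the merged signature exactly:

using TensorBoardLogger, LightGraphs

g = SimpleDiGraph(3)
add_edge!(g, 1, 3); add_edge!(g, 2, 3)        # W and b feed into the output node

logger = TBLogger("runs/graph_demo")          # hypothetical log directory
log_graph(logger, "demo", g;
          nodelabel = ["W", "b", "out"],
          nodeop    = ["Variable", "Variable", "Add"],
          nodevalue = Dict(1 => rand(3, 3), 2 => rand(3)))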


shashikdm and others added some commits Jun 11, 2019

Update src/Loggers/LogGraph.jl
Co-Authored-By: Mathieu Besançon <mathieu.besancon@gmail.com>
@PhilipVinc (Owner) commented on Jun 14, 2019

What about the subgraphs (I think that's what they are called) in TensorBoard?

I'll give it a look this weekend, and probably merge it if you think it's ready.

@shashikdm (Contributor, Author) commented on Jun 14, 2019

What about the subgraphs (I think that's what they are called) in TensorBoard?

That depends on nodelabel: subgraph nodes should be named hierarchically using "/",
e.g.

nodelabel = ["Dense/W", "Dense/b", "Dense/X"]

will form a collapsible subgraph named "Dense".
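
Under the same assumed call shape as in the earlier sketch, that would look like:

using TensorBoardLogger, LightGraphs

g = SimpleDiGraph(3); add_edge!(g, 1, 3); add_edge!(g, 2, 3)
log_graph(TBLogger("runs/subgraph_demo"), "dense", g;
          nodelabel = ["Dense/W", "Dense/b", "Dense/X"])   # shared "Dense/" prefix => one collapsible box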

@shashikdm (Contributor, Author) commented on Jun 14, 2019

It's not ready yet. The values are not showing up in the attributes; I'll look into that this weekend.

shashikdm added some commits Jun 15, 2019

@shashikdm changed the title from "[WIP] LogGraph" to "LogGraph" on Jun 16, 2019

@shashikdm (Contributor, Author) commented on Jun 16, 2019

@PhilipVinc You can review and merge this.
log_graph

  • converts an AbstractGraph to TensorBoard's GraphDef.
  • accepts optional arguments for names, ops, and values.

The next task is to convert a model (Flux or Turing) to an AbstractGraph and then use log_graph on it. I'm handling this in a separate module, TraceGraph.jl.
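
As a purely illustrative example of what that conversion could look like for a plain Chain (not the TraceGraph.jl approach; chain_to_graph is a made-up helper): one node per layer plus an input node, connected by feed-forward edges.

using Flux, LightGraphs

function chain_to_graph(model::Chain)
    n = length(model.layers)
    g = SimpleDiGraph(n + 1)                  # node 1 is the input
    for i in 1:n
        add_edge!(g, i, i + 1)                # input -> layer 1 -> layer 2 -> ...
    end
    labels = ["input"; [string(nameof(typeof(l)), "_", i) for (i, l) in enumerate(model.layers)]]
    return g, labels                          # ready to pass to log_graph with nodelabel = labels
end

chain_to_graph(Chain(Dense(4, 3, relu), Dense(3, 2)))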


shashikdm added some commits Jun 17, 2019

- change assert statements
- change getdatatype -> jltype2tf
- change @error -> throw

@PhilipVinc merged commit 5a30c54 into PhilipVinc:master on Jun 28, 2019

1 check passed: continuous-integration/travis-ci/pr (The Travis CI build passed)