
Visualizing the net-structure when training the net #10

Closed
Jacoobr opened this issue Nov 23, 2018 · 1 comment

Jacoobr commented Nov 23, 2018

Hi @AlexMa011, thanks for your great work. When I use the `make_dot` function from the graphviz package to visualize the net structure, I get a weird result.
(1) First, I added the following lines to the train.py script:

    r = net(x,x1,x2,x3)
    g = make_dot(r, params=dict(net.named_parameters()))
    g.render('graph',view=False)
    result = r.contiguous().view(-1, 28*28+3)
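As an aside (not from the original thread), the file that `make_dot` writes is plain Graphviz DOT text: each `grad_fn` in the backward graph becomes a box node, each parameter becomes a lightblue box labeled with its name and shape, and edges point from inputs to the ops that consume them. A minimal stdlib-only sketch of that output format, with made-up node ids and names, looks like this:

```python
# Hypothetical sketch of the DOT text make_dot emits; no torch/torchviz
# needed. Node ids stand in for the id() values seen in the real dump.

def to_dot(edges, labels, params):
    """edges: (src, dst) id pairs; labels: id -> grad_fn name;
    params: id -> 'name (shape)' strings drawn as lightblue boxes."""
    lines = ["digraph {",
             "  node [shape=box style=filled fontsize=12]"]
    for nid, label in labels.items():
        lines.append(f"  {nid} [label={label}]")
    for nid, label in params.items():
        lines.append(f'  {nid} [label="{label}" fillcolor=lightblue]')
    for src, dst in edges:
        lines.append(f"  {src} -> {dst}")
    lines.append("}")
    return "\n".join(lines)

# Toy backward graph: a parameter feeding an Addmm, whose result is viewed.
labels = {1: "ViewBackward", 2: "AddmmBackward"}
params = {3: "linear.weight (10, 5)"}
edges = [(2, 1), (3, 2)]
dot = to_dot(edges, labels, params)
print(dot)
```

This is only an illustration of the file format, not the actual torchviz implementation.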

When I run train.py, the program stops at the line `g = make_dot(r, params=dict(net.named_parameters()))`, so I killed the process with Ctrl+C. A dot-format file called "graph" was still produced; here is its content:

digraph { graph [size="12,12"] node [align=left fontsize=12 height=0.2 ranksep=0.1 shape=box style=filled] 140010753187920 [label=GatherBackward] 140010753109776 -> 140010753187920 140010753109776 [label=ViewBackward] 140010753242896 -> 140010753109776 140010753242896 [label=ThAddmmBackward] 140010753243024 -> 140010753242896 140010753243024 [label=ExpandBackward] 140010753190480 -> 140010753243024 140010753190480 [label=BroadcastBackward] 140010753243280 -> 140010753190480 140010753243280 [label="module.model1.0.weight (64, 3, 3, 3)" fillcolor=lightblue] 140010753243152 -> 140010753190480 140010753243152 [label="module.model1.0.bias (64)" fillcolor=lightblue] 140010753243344 -> 140010753190480 140010753243344 [label="module.model1.1.weight (64)" fillcolor=lightblue] 140010753243408 -> 140010753190480 140010753243408 [label="module.model1.1.bias (64)" fillcolor=lightblue] 140010753243472 -> 140010753190480 140010753243472 [label="module.model1.3.weight (64, 64, 3, 3)" fillcolor=lightblue] 140010753243536 -> 140010753190480 140010753243536 [label="module.model1.3.bias (64)" fillcolor=lightblue] 140010753243600 -> 140010753190480 140010753243600 [label="module.model1.4.weight (64)" fillcolor=lightblue] 140010753243664 -> 140010753190480 140010753243664 [label="module.model1.4.bias (64)" fillcolor=lightblue] 140010753243728 -> 140010753190480 140010753243728 [label="module.model1.7.weight (128, 64, 3, 3)" fillcolor=lightblue] 140010753243792 -> 140010753190480 140010753243792 [label="module.model1.7.bias (128)" fillcolor=lightblue] 140010753243856 -> 140010753190480 140010753243856 [label="module.model1.8.weight (128)" fillcolor=lightblue] 140010753243920 -> 140010753190480 140010753243920 [label="module.model1.8.bias (128)" fillcolor=lightblue] 140010753243984 -> 140010753190480 140010753243984 [label="module.model1.10.weight (128, 128, 3, 3)" fillcolor=lightblue] 140010753244048 -> 140010753190480 140010753244048 [label="module.model1.10.bias (128)" 
fillcolor=lightblue] 140010753244112 -> 140010753190480 140010753244112 [label="module.model1.11.weight (128)" fillcolor=lightblue] 140010753244176 -> 140010753190480 140010753244176 [label="module.model1.11.bias (128)" fillcolor=lightblue] 140010753244240 -> 140010753190480 140010753244240 [label="module.model2.0.weight (256, 128, 3, 3)" fillcolor=lightblue] 140010753244304 -> 140010753190480 140010753244304 [label="module.model2.0.bias (256)" fillcolor=lightblue] 140010753244368 -> 140010753190480 140010753244368 [label="module.model2.1.weight (256)" fillcolor=lightblue] 140010753244432 -> 140010753190480 140010753244432 [label="module.model2.1.bias (256)" fillcolor=lightblue] 140010753244496 -> 140010753190480 140010753244496 [label="module.model2.3.weight (256, 256, 3, 3)" fillcolor=lightblue] 140010753244560 -> 140010753190480 140010753244560 [label="module.model2.3.bias (256)" fillcolor=lightblue] 140010753244624 -> 140010753190480 140010753244624 [label="module.model2.4.weight (256)" fillcolor=lightblue] 140010753244688 -> 140010753190480 140010753244688 [label="module.model2.4.bias (256)" fillcolor=lightblue] 140010753244752 -> 140010753190480 140010753244752 [label="module.model2.6.weight (256, 256, 3, 3)" fillcolor=lightblue] 140010753244816 -> 140010753190480 140010753244816 [label="module.model2.6.bias (256)" fillcolor=lightblue] 140010753244880 -> 140010753190480 140010753244880 [label="module.model2.7.weight (256)" fillcolor=lightblue] 140010753244944 -> 140010753190480 140010753244944 [label="module.model2.7.bias (256)" fillcolor=lightblue] 140010753245008 -> 140010753190480 140010753245008 [label="module.model3.0.weight (512, 256, 3, 3)" fillcolor=lightblue] 140010753245072 -> 140010753190480 140010753245072 [label="module.model3.0.bias (512)" fillcolor=lightblue] 140010753245136 -> 140010753190480 140010753245136 [label="module.model3.1.weight (512)" fillcolor=lightblue] 140010753286224 -> 140010753190480 140010753286224 
[label="module.model3.1.bias (512)" fillcolor=lightblue] 140010753286288 -> 140010753190480 140010753286288 [label="module.model3.3.weight (512, 512, 3, 3)" fillcolor=lightblue] 140010753286352 -> 140010753190480 140010753286352 [label="module.model3.3.bias (512)" fillcolor=lightblue] 140010753286416 -> 140010753190480 140010753286416 [label="module.model3.4.weight (512)" fillcolor=lightblue] 140010753286480 -> 140010753190480 140010753286480 [label="module.model3.4.bias (512)" fillcolor=lightblue] 140010753286544 -> 140010753190480 140010753286544 [label="module.model3.6.weight (512, 512, 3, 3)" fillcolor=lightblue] 140010753286608 -> 140010753190480 140010753286608 [label="module.model3.6.bias (512)" fillcolor=lightblue] 140010753286672 -> 140010753190480 140010753286672 [label="module.model3.7.weight (512)" fillcolor=lightblue] 140010753286736 -> 140010753190480 140010753286736 [label="module.model3.7.bias (512)" fillcolor=lightblue] 140010753286800 -> 140010753190480 140010753286800 [label="module.model4.1.weight (512, 512, 3, 3)" fillcolor=lightblue] 140010753286864 -> 140010753190480 140010753286864 [label="module.model4.1.bias (512)" fillcolor=lightblue] 140010753286928 -> 140010753190480 140010753286928 [label="module.model4.2.weight (512)" fillcolor=lightblue] 140010753286992 -> 140010753190480 140010753286992 [label="module.model4.2.bias (512)" fillcolor=lightblue] 140010753287056 -> 140010753190480 140010753287056 [label="module.model4.4.weight (512, 512, 3, 3)" fillcolor=lightblue] 140010753287120 -> 140010753190480 140010753287120 [label="module.model4.4.bias (512)" fillcolor=lightblue] 140010753287184 -> 140010753190480 140010753287184 [label="module.model4.5.weight (512)" fillcolor=lightblue] 140010753287248 -> 140010753190480 140010753287248 [label="module.model4.5.bias (512)" fillcolor=lightblue] 140010753287312 -> 140010753190480 140010753287312 [label="module.model4.7.weight (512, 512, 3, 3)" fillcolor=lightblue] 140010753287376 -> 
140010753190480 140010753287376 [label="module.model4.7.bias (512)" fillcolor=lightblue] 140010753287440 -> 140010753190480 140010753287440 [label="module.model4.8.weight (512)" fillcolor=lightblue] 140010753287504 -> 140010753190480 140010753287504 [label="module.model4.8.bias (512)" fillcolor=lightblue] 140010753287568 -> 140010753190480 140010753287568 [label="module.convlayer1.0.weight (128, 128, 3, 3)" fillcolor=lightblue] 140010753287632 -> 140010753190480 140010753287632 [label="module.convlayer1.0.bias (128)" fillcolor=lightblue] 140010753287696 -> 140010753190480 140010753287696 [label="module.convlayer1.2.weight (128)" fillcolor=lightblue] 140010753287760 -> 140010753190480 140010753287760 [label="module.convlayer1.2.bias (128)" fillcolor=lightblue] 140010753287824 -> 140010753190480 140010753287824 [label="module.convlayer2.0.weight (128, 256, 3, 3)" fillcolor=lightblue] 140010753287888 -> 140010753190480 140010753287888 [label="module.convlayer2.0.bias (128)" fillcolor=lightblue] 140010753287952 -> 140010753190480 140010753287952 [label="module.convlayer2.2.weight (128)" fillcolor=lightblue] 140010753288016 -> 140010753190480 140010753288016 [label="module.convlayer2.2.bias (128)" fillcolor=lightblue] 140010753288080 -> 140010753190480 140010753288080 [label="module.convlayer3.0.weight (128, 512, 3, 3)" fillcolor=lightblue] 140010753288144 -> 140010753190480 140010753288144 [label="module.convlayer3.0.bias (128)" fillcolor=lightblue] 140010753288208 -> 140010753190480 140010753288208 [label="module.convlayer3.2.weight (128)" fillcolor=lightblue] 140010753288272 -> 140010753190480 140010753288272 [label="module.convlayer3.2.bias (128)" fillcolor=lightblue] 140010753288336 -> 140010753190480 140010753288336 [label="module.convlayer4.0.weight (128, 512, 3, 3)" fillcolor=lightblue] 140010753288400 -> 140010753190480 140010753288400 [label="module.convlayer4.0.bias (128)" fillcolor=lightblue] 140010753288464 -> 140010753190480 140010753288464 
[label="module.convlayer4.2.weight (128)" fillcolor=lightblue] 140010753288528 -> 140010753190480 140010753288528 [label="module.convlayer4.2.bias (128)" fillcolor=lightblue] 140010753288592 -> 140010753190480 140010753288592 [label="module.convlayer5.0.weight (128, 512, 3, 3)" fillcolor=lightblue] 140010753288656 -> 140010753190480 140010753288656 [label="module.convlayer5.0.bias (128)" fillcolor=lightblue] 140010753288720 -> 140010753190480 140010753288720 [label="module.convlayer5.2.weight (128)" fillcolor=lightblue] 140010753288784 -> 140010753190480 140010753288784 [label="module.convlayer5.2.bias (128)" fillcolor=lightblue] 140010753288848 -> 140010753190480 140010753288848 [label="module.linear2.weight (787, 1568)" fillcolor=lightblue] 140010753288912 -> 140010753190480 140010753288912 [label="module.linear2.bias (787)" fillcolor=lightblue] 140010753288976 -> 140010753190480 140010753288976 [label="module.lstmlayer.weight_ih_l0 (6272, 7846)" fillcolor=lightblue] 140010753289040 -> 140010753190480 140010753289040 [label="module.lstmlayer.weight_hh_l0 (6272, 1568)" fillcolor=lightblue] 140010753289104 -> 140010753190480 140010753289104 [label="module.lstmlayer.bias_ih_l0 (6272)" fillcolor=lightblue] 140010753289168 -> 140010753190480 140010753289168 [label="module.lstmlayer.bias_hh_l0 (6272)" fillcolor=lightblue] 140010753289232 -> 140010753190480 ...... }
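As an aside (not part of the original thread), a dump in this format can be sanity-checked with a few lines of stdlib Python before rendering it: the lightblue nodes are the parameter tensors, and the `->` lines are the graph edges. A hedged sketch, run here on a tiny hypothetical excerpt:

```python
import re

# Hypothetical sanity check for a make_dot-style DOT dump: extract the
# lightblue parameter nodes and the edges with two regexes.
sample = '''digraph {
140010753243280 [label="module.model1.0.weight (64, 3, 3, 3)" fillcolor=lightblue]
140010753243280 -> 140010753190480
140010753190480 [label=BroadcastBackward]
}'''

param_nodes = re.findall(r'label="([^"]+)" fillcolor=lightblue', sample)
edges = re.findall(r'(\d+) -> (\d+)', sample)
print(param_nodes)  # parameter name + shape strings
print(len(edges))
```

Applied to the full dump above, this would show whether the graph really contains one parameter node per entry in `net.named_parameters()` and how many edges were written before the process was interrupted.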
Finally, I used the command `dot -Tpdf graph -o graph_.pdf` to convert the dot file to a PDF for easier viewing, and got a weird result:

[screenshot: rendered graph]

I don't know what is wrong with my code, and I would appreciate your help or any suggestions for visualizing the net structure of polygon-rnn. Thank you again!

@Jacoobr Jacoobr changed the title Visualizing the net-structure when train the net Visualizing the net-structure when training the net Nov 30, 2018
AlexMa011 (Owner) commented:

Sorry for the late reply; I have been quite busy recently. The polygon-rnn model itself is quite simple, but its forward process is complicated, so it may be difficult for such tools to draw the graph. I will try the method you mentioned, but I'm afraid it won't work.
