Hello, after reading the code I have some questions about the details of the actor_network. I would be very grateful if you could answer them :)
In the node network, why does the input vector of the network have three dimensions instead of the usual two ([batch_size, number_of_features])? What were the considerations behind designing the input shape this way?
In the node network, the input features combine the output of the GCN with "node_inputs". Why is "node_inputs" merged in as well? Isn't the GCN output already a higher-level representation of it?
Notice that different jobs may have different numbers of nodes (computation stages). We can't simply merge features from two jobs into a 2D matrix because of the size mismatch. However, the 3D input is reshaped to 2D in the pipeline (after the message-passing step in the graph neural network). Keep in mind that we need to keep track of which row in the 2D feature matrix corresponds to which job in the 3D input. If you are interested, we implement the reshape operation (for sparse features) in https://github.com/hongzimao/decima-sim/blob/c010dd74ff4b7566bd0ac989c90a32cfbc630d84/sparse_op.py (you can trace how these functions are used in the feature-processing pipeline).
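To make the idea concrete, here is a minimal sketch (not Decima's actual sparse-op code; the function name `flatten_jobs` is hypothetical) of flattening per-job node features into one 2D matrix while keeping a job-id vector so each row can be traced back to its source job:

```python
import numpy as np

def flatten_jobs(node_feats_list):
    """Flatten a list of per-job feature matrices into one 2D matrix.

    node_feats_list: list of [num_nodes_i, num_feats] arrays, one per job,
    where num_nodes_i may differ across jobs.

    Returns a [total_nodes, num_feats] matrix plus a job-id vector that
    records which job each row came from.
    """
    flat = np.concatenate(node_feats_list, axis=0)
    job_ids = np.concatenate([
        np.full(len(f), i) for i, f in enumerate(node_feats_list)
    ])
    return flat, job_ids

# Two jobs with 3 and 2 nodes (stages), 4 features per node
job_a = np.ones((3, 4))
job_b = np.zeros((2, 4))
flat, job_ids = flatten_jobs([job_a, job_b])
print(flat.shape)  # (5, 4)
print(job_ids)     # [0 0 0 1 1]
```

The job-id vector is what lets later stages (e.g., per-job aggregation) group the flattened rows back into their original jobs.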
We include the original features because the global-level aggregation may find them more straightforward to use (e.g., summing the total work of all nodes). Feel free to leave out the merge and see whether it affects performance. It would be interesting to know.
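For illustration, the merge described above amounts to a simple concatenation along the feature axis (a hedged sketch with made-up dimensions, not the repo's exact code):

```python
import numpy as np

num_nodes, raw_dim, gcn_dim = 5, 4, 8

# Raw per-node features ("node_inputs") and the GCN's per-node embeddings
node_inputs = np.random.rand(num_nodes, raw_dim)
gcn_outputs = np.random.rand(num_nodes, gcn_dim)

# Concatenate so downstream layers see both the raw features and the
# high-level GCN representation; aggregates like total work can then be
# computed directly from the raw columns.
merged = np.concatenate([node_inputs, gcn_outputs], axis=1)
print(merged.shape)  # (5, 12)
```

Dropping the `node_inputs` half of the concatenation is exactly the ablation suggested above.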