Because pipe instances are sometimes stateful, their behaviour within a data loop can be difficult to predict and counterintuitive. For instance, consider a loop built around the range filter out[0]:
graph.v.loop { |v| v.out[0] }.while { :loop_and_emit }
It would make sense for this loop to behave as if a fresh out[0] fragment were applied at each depth, but in fact it does not, because the range filter is not reset on each pass through the loop. The looped elements are mixed with the starting elements and fed back through the same out[0] pipe fragment, which allows only the first element through and then stops emitting.
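The starvation effect can be reproduced in plain Ruby without Pacer. In this sketch (illustrative names, not Pacer's API) the range filter is modelled as a stateful "first element only" pipe: sharing one instance across loop passes emits a single element ever, while a fresh instance per depth behaves the way intuition expects.

```ruby
# A stateful "first element only" filter, modelling out[0]'s range pipe:
# once it has emitted one element, it never emits again.
class FirstOnlyFilter
  def initialize
    @emitted = false
  end

  def filter(elements)
    return [] if @emitted || elements.empty?
    @emitted = true
    [elements.first]
  end
end

# A tiny adjacency map standing in for a graph's out() traversal.
adjacency = { a: [:b, :c], b: [:d], c: [], d: [] }
out = ->(v) { adjacency.fetch(v, []) }

# Feedback loop through ONE shared filter instance: it fires once for
# the starting elements and then blocks every later depth.
shared = FirstOnlyFilter.new
queue = [:a]
emitted_shared = []
3.times do
  queue = shared.filter(queue.flat_map(&out))
  emitted_shared.concat(queue)
end

# A FRESH filter per depth emits one element at each level instead.
queue = [:a]
emitted_fresh = []
3.times do
  queue = FirstOnlyFilter.new.filter(queue.flat_map(&out))
  emitted_fresh.concat(queue)
end

emitted_shared  # => [:b]      only one element ever gets through
emitted_fresh   # => [:b, :d]  one element per depth until exhaustion
```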
Another problem with this approach is that the order of iteration is uncontrollable: all emitted elements are mixed together in a single queue, where they compete with the elements initially fed into the loop.
Instead, it would make more sense to dynamically generate a pipe fragment for each depth of recursion within the loop, so the same definition would produce numerous out[0] pipes, each with its own queue that can be iterated in either depth-first or breadth-first order.
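A hypothetical sketch of what per-depth queues would buy: once each depth has its own frontier, the loop is free to drain those frontiers breadth-first or depth-first. The graph and helper names below are invented for illustration and are not Pacer's API.

```ruby
adjacency = { a: [:b, :c], b: [:d], c: [:e], d: [], e: [] }
out_all = ->(v) { adjacency.fetch(v, []) }

# Breadth-first: drain the whole queue at one depth before moving on,
# as if each depth's pipe fragment were run to completion in turn.
def bfs(start, max_depth, step)
  order = []
  queue = start
  max_depth.times do
    queue = queue.flat_map(&step)
    order.concat(queue)
  end
  order
end

# Depth-first: follow each element all the way down before its siblings.
def dfs(v, max_depth, step, order = [])
  return order if max_depth.zero?
  step.call(v).each do |w|
    order << w
    dfs(w, max_depth - 1, step, order)
  end
  order
end

bfs([:a], 2, out_all)  # => [:b, :c, :d, :e]
dfs(:a, 2, out_all)    # => [:b, :d, :c, :e]
```

The two orders visit the same elements; only the scheduling of the per-depth queues differs.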
Because the block generating the pipe fragment would be called once for each depth, it seems sensible to feed that block some context, such as its current depth and the previous pipeline fragments. That would also make it possible to do things like aggregation.
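As an illustration of the context idea, a hypothetical Context struct (invented here, not Pacer's API) carrying the current depth and a shared accumulator lets the block aggregate a value, here a count of elements, at each level:

```ruby
# Hypothetical context passed to the fragment-building block once per depth.
Context = Struct.new(:depth, :previous_counts)

adjacency = { a: [:b, :c], b: [:d], c: [], d: [] }

# The block is called with a fresh context at each depth, so it can vary
# behaviour by depth and record per-depth aggregates as a side effect.
build_fragment = lambda do |ctx|
  ->(vs) do
    nexts = vs.flat_map { |v| adjacency.fetch(v, []) }
    ctx.previous_counts << nexts.size # per-depth aggregation
    nexts
  end
end

counts = []
frontier = [:a]
3.times do |depth|
  frontier = build_fragment.call(Context.new(depth + 1, counts)).call(frontier)
end

counts  # => [2, 1, 0]  elements reached at depths 1, 2 and 3
```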