Unresolved conversations (5)
@ricardoV94 (Jan 14, 2023)
Todo:
- Check we are not missing performance by not having explicit sequences.
- When there are multiple sequences, PyTensor defines n_steps as the length of the shortest sequence. JAX should be able to handle this, but if not we could consider not allowing sequences/n_steps with different lengths in the PyTensor scan. Then we could pass a single shape as n_steps after asserting they are all the same?
pytensor/link/jax/dispatch/loop.py
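The shortest-sequence rule discussed above can be sketched in plain Python (the helper name is illustrative, not PyTensor's API):

```python
import numpy as np

def scan_over_sequences(fn, *sequences):
    # PyTensor-style rule: n_steps is the length of the shortest
    # sequence, so longer sequences are silently truncated.
    n_steps = min(len(s) for s in sequences)
    return np.array([fn(*(s[t] for s in sequences)) for t in range(n_steps)])

a = np.array([1, 2, 3, 4])
b = np.array([10, 20, 30])
out = scan_over_sequences(lambda x, y: x + y, a, b)  # 3 steps, not 4
```

`jax.lax.scan`, by contrast, requires all scanned arrays to share the same leading dimension, which is why the comment floats asserting equal lengths and passing a single n_steps.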
@ricardoV94 (Jan 13, 2023)
This gives an annoying "Supervisor feature missing" warning... gotta clean that up
pytensor/link/jax/dispatch/loop.py
@ferrine (Jan 13, 2023)
Why not graph_replace or using memo for FunctionGraph(memo={symbolic_idx: idx}) ([here](https://github.com/pymc-devs/pytensor/pull/191/files#diff-1a63681fc5a000e63de6f9215e28d43c9e8066fc0a976b5da24499a671cadd11R121))?
pytensor/loop/basic.py
Reply from Ricardo Vieira (ricardoV94)
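For context, the memo mechanism replaces nodes by reusing pre-seeded entries during graph cloning. A toy sketch of that idea in plain Python (the `Node` class and `clone_with_memo` are illustrative stand-ins, not PyTensor's actual `FunctionGraph`/`graph_replace` API):

```python
class Node:
    """Toy graph node: a variable/constant, or an operation over inputs."""
    def __init__(self, name, inputs=()):
        self.name = name
        self.inputs = tuple(inputs)

def clone_with_memo(node, memo):
    """Clone a graph; nodes already present in `memo` are substituted as-is.

    Seeding memo with {old: new} therefore performs a replacement while
    cloning, which is the idea behind FunctionGraph(..., memo=...).
    """
    if node in memo:
        return memo[node]
    new = Node(node.name, [clone_with_memo(i, memo) for i in node.inputs])
    memo[node] = new
    return new

# Build `x + 1`, then swap `x` for `idx` without rebuilding by hand.
x = Node("x")
expr = Node("add", [x, Node("1")])
idx = Node("idx")
replaced = clone_with_memo(expr, {x: idx})
```

The original graph is untouched; only the clone refers to the replacement.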
@ferrine (Jan 13, 2023)
What about subclassing Scan into:
- `Map(Scan)`
- `Reduce(Scan)`
- `Filter(Scan)`

It will be easier to dispatch into optimized implementations
pytensor/loop/basic.py
Reply from Ricardo Vieira (ricardoV94)
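Subclassing as suggested plays well with type-based dispatch: the most specific registered implementation wins, and unregistered subclasses fall back to the generic Scan path. A minimal sketch with `functools.singledispatch` (class and function names are illustrative, not PyTensor's dispatch machinery):

```python
from functools import singledispatch

class Scan:
    """Generic looping op (sketch)."""

class Map(Scan):
    """Elementwise loop producing one output per step."""

class Reduce(Scan):
    """Loop that only keeps the final accumulated value."""

@singledispatch
def lower(op):
    # Fallback: generic (slow) scan lowering.
    return "generic-scan"

@lower.register
def _(op: Map):
    # A backend could lower Map to a vectorized primitive instead.
    return "optimized-map"
```

Here `lower(Map())` picks the specialized path, while `lower(Reduce())` falls back to the generic Scan lowering until someone registers a better one.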
@ricardoV94 (Jan 10, 2023)
TODO: Add mixin `HasInnerGraph` so that we can see the inner graph in debug_print
pytensor/loop/op.py
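The mixin idea amounts to giving debug printing a uniform way to recurse into nested graphs. A toy sketch (names mirror the comment, but this is not PyTensor's actual `HasInnerGraph` API):

```python
class HasInnerGraph:
    """Mixin: ops carrying a nested graph expose it via `inner_outputs`."""
    @property
    def inner_outputs(self):
        raise NotImplementedError

class Add:
    """Plain op with no inner graph."""
    def __init__(self, *inputs):
        self.inputs = inputs

class Loop(HasInnerGraph):
    """Op wrapping an inner graph, like the new loop op."""
    def __init__(self, inner):
        self._inner = inner
        self.inputs = ()
    @property
    def inner_outputs(self):
        return [self._inner]

def debug_lines(op, indent=0):
    """Collect an indented debug-print tree, recursing into inner graphs."""
    lines = [" " * indent + type(op).__name__]
    for child in getattr(op, "inputs", ()):
        lines += debug_lines(child, indent + 1)
    if isinstance(op, HasInnerGraph):
        for inner in op.inner_outputs:
            lines += debug_lines(inner, indent + 1)
    return lines

loop = Loop(Add())
```

Any debug printer can then check `isinstance(op, HasInnerGraph)` instead of special-casing each looping op.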
Resolved conversations (1)
@ricardoV94 (Jan 11, 2023)
Hmm, I can't do this, or else it won't be possible to reference the outer graph (e.g., to request gradients wrt non-sequences), as is done by the `jacobian` function
Outdated
pytensor/loop/basic.py
Replies from Ricardo Vieira (ricardoV94) and Adrian Seyboldt (aseyboldt)