Commit e3c02af
Fix parameter gradients not embedded after params moved earlier
The commit 47a33fc moved params computation earlier in Tensor.op,
but this broke the assumption that t.params was empty when building
backprop. The condition `not (Set.mem t.params ti)` now correctly
skipped parameter backprop, but it also skipped adding their gradient
nodes to embedded_nodes, causing "context lacks node x.grad" errors.
Fix: still add parameter gradients to embedded_nodes when skipping
their backprop code.
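The shape of the fix can be sketched as follows. This is a minimal illustration, not the project's actual API: the names `node`, `build_backprop`, `backprop`, and `embedded` are hypothetical, modeled on the commit message.

```ocaml
(* Hypothetical sketch of the fix, assuming a single pass over graph
   nodes that produces both the backprop code list and the set of
   nodes embedded in the context. *)
module IntSet = Set.Make (Int)

type node = { id : int }

let build_backprop ~params nodes =
  List.fold_left
    (fun (backprop, embedded) t ->
      if IntSet.mem t.id params then
        (* Parameter: emit no backprop code, but still embed its
           gradient node so a later lookup cannot fail with
           "context lacks node x.grad". The bug was skipping
           parameters for *both* outputs. *)
        (backprop, IntSet.add t.id embedded)
      else
        (* Non-parameter: emit backprop code and embed the node. *)
        (t.id :: backprop, IntSet.add t.id embedded))
    ([], IntSet.empty) nodes
```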
Also adds a zero2hero_1of7_exec standalone test for easier debugging.
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>
1 parent 510a567 · commit e3c02af
File tree
- tensor
- test/operations
4 files changed: +450 −1 lines changed