
@willGraham01 willGraham01 commented Sep 3, 2025

Related to #75

Refactors the stochastic gradient descent method into a standalone function (inside a soon-to-be-populated solvers module) that can be called across the codebase.

This has the bonus of making the two_normal_example a bit nicer looking, since we no longer have to set up the optimisation loop manually.
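For illustration, here is a minimal sketch of what such a standalone solver might look like, assuming JAX. The function name, signature, and parameters below are assumptions for the example, not the actual API introduced by this PR, and a plain full-gradient descent loop stands in for the stochastic variant.

```python
# A minimal sketch of a standalone gradient-descent solver, assuming JAX.
# The name, signature, and defaults here are hypothetical, not this PR's API.
import jax
import jax.numpy as jnp


def stochastic_gradient_descent(
    objective,           # scalar-valued function of a PyTree of parameters
    initial_params,      # PyTree of initial parameter values
    learning_rate=1e-2,
    max_iterations=1000,
    tolerance=1e-6,
):
    """Minimise `objective`, stopping once the gradient's l2-norm
    drops below `tolerance`."""
    grad_fn = jax.grad(objective)
    params = initial_params
    for _ in range(max_iterations):
        grads = grad_fn(params)
        # l2-norm taken across every leaf of the gradient PyTree.
        grad_norm = jnp.sqrt(
            sum(jnp.sum(leaf**2) for leaf in jax.tree_util.tree_leaves(grads))
        )
        if grad_norm < tolerance:
            break
        # One descent step applied leaf-wise across the PyTree.
        params = jax.tree_util.tree_map(
            lambda p, g: p - learning_rate * g, params, grads
        )
    return params
```

Callers then only supply an objective and an initial PyTree, e.g. `stochastic_gradient_descent(lambda p: jnp.sum(p["x"] ** 2), {"x": jnp.ones(3)})`, rather than writing the loop themselves.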

Other additions include:

  • Some basic testing of the new method.
  • An implementation of the $l^2$-norm on PyTree objects, so that we can take the norm of a gradient without worrying about the container its arguments were passed in as (which in turn dictates the format of the returned gradients). A sketch of the idea follows this list.
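As a rough sketch of the PyTree norm idea, assuming JAX's tree utilities (the name `pytree_l2_norm` is hypothetical; the actual name and implementation in this PR may differ):

```python
# A minimal sketch of an l2-norm over an arbitrary PyTree, assuming JAX.
# The name `pytree_l2_norm` is hypothetical.
import jax
import jax.numpy as jnp


def pytree_l2_norm(tree):
    """Square root of the sum of squares of every element of every leaf,
    regardless of the tree's container structure (dict, tuple, list, ...)."""
    leaves = jax.tree_util.tree_leaves(tree)
    return jnp.sqrt(sum(jnp.sum(jnp.square(leaf)) for leaf in leaves))


# The norm is container-agnostic: these two calls give the same value (5.0).
assert jnp.isclose(
    pytree_l2_norm({"a": jnp.array([3.0]), "b": jnp.array([4.0])}),
    pytree_l2_norm((jnp.array([3.0, 4.0]),)),
)
```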

The main motivation for this refactor is to let us apply SGD within the Quadratic Penalty Method, which is one of the focuses of #75.

@willGraham01 willGraham01 enabled auto-merge (squash) September 3, 2025 14:50
@willGraham01 willGraham01 merged commit 5d324f5 into main Sep 3, 2025
5 checks passed
@willGraham01 willGraham01 deleted the wgraham/refactor-sgd-method branch September 3, 2025 14:55
