[UnitaryHack] Added Rotosolve optimizer #93
Conversation
I've left a bunch of (nit-picky) comments. In general, this looks great already! Good work!
I'll test the functionality of it and will then come back to you.
        self.update_hyper_params()
        self.zero_grad()

    def update_hyper_params(self) -> None:
no need to define this method
    def state_dict(self) -> dict[str, Any]:
        """Return optimizer states as dictionary.

        Returns
        -------
        dict
            A dictionary containing the current state of the optimizer.

        """
        raise NotImplementedError
return empty dict
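A hedged sketch of the suggested change (class name taken from the PR title; this is not the PR's code): since Rotosolve keeps no internal state, the method can return an empty dict instead of raising NotImplementedError.

```python
from typing import Any


class RotosolveOptimizer:  # name from the PR title; minimal sketch only
    def state_dict(self) -> dict[str, Any]:
        """Return optimizer states as dictionary.

        Returns
        -------
        dict
            A dictionary containing the current state of the optimizer.

        """
        # Rotosolve is stateless, so there is nothing to snapshot.
        return {}
```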
    def load_state_dict(self, state_dict: Mapping[str, Any]) -> None:
        """Load state of the optimizer from the state dictionary.

        Parameters
        ----------
        state_dict : dict
            A dictionary containing a snapshot of the optimizer state.

        """
        raise NotImplementedError
just do a pass here
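A hedged sketch of this suggestion as well (class name taken from the PR title; not the PR's code): because Rotosolve is stateless, loading a state dictionary is a no-op.

```python
from collections.abc import Mapping
from typing import Any


class RotosolveOptimizer:  # name from the PR title; minimal sketch only
    def load_state_dict(self, state_dict: Mapping[str, Any]) -> None:
        """Load state of the optimizer from the state dictionary.

        Parameters
        ----------
        state_dict : dict
            A dictionary containing a snapshot of the optimizer state.

        """
        pass  # Rotosolve is stateless, so there is nothing to restore
```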
"""
RotosolveOptimizer
=============
add some = (in reStructuredText the underline must be at least as long as the heading text)
lambeq/training/__init__.py
Outdated
    'SPSAOptimizer',
    'RotosolveOptimizer',
re-order
lambeq/__init__.py
Outdated
    'SPSAOptimizer',
    'RotosolveOptimizer',
re-order (Rotosolve before SPSA)
lambeq/__init__.py
Outdated
from lambeq.training import (Checkpoint, Dataset, Optimizer, SPSAOptimizer,
                             Model, NumpyModel, PennyLaneModel, PytorchModel,
                             QuantumModel, TketModel, Trainer, PytorchTrainer,
                             RotosolveOptimizer, Model, NumpyModel,
re-order
        self.model.weights = self.gradient
        self.model.weights = self.project(self.model.weights)

        self.update_hyper_params()
no need to call this here
One note: also have a look at the tests, there are some linting issues.
Hi @Shiro-Raven. Please check the initial GitHub issue. I've created an example notebook to test your optimiser.
@Shiro-Raven Hi, please use the notebook provided by @Thommy257 in the issue to check your optimiser, so we can proceed and assign the task to you. Note that currently one test is failing, but it's not relevant to your PR -- we'll have a look at it at some point.
@Shiro-Raven we've fixed the test error, so if you merge those changes into your PR all the tests should now pass. |
@Shiro-Raven Please address any remaining comments by tomorrow 13/6, last day of the hackathon. |
Just to clarify here, hackers have until June 20th to fully finalize contributions to be eligible for a unitaryHACK bounty. The June 13th deadline was for submitting pull requests. |
@natestemen Noted, thanks. |
This PR adds the Rotosolve optimizer from the paper by Ostaszewski et al. to Lambeq.
Addresses #85.
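For context on what the optimizer does: because the expectation value of a parametrised quantum circuit is sinusoidal in each rotation angle, Rotosolve sets every parameter directly to the analytic minimiser of its one-dimensional restriction, using three cost evaluations per parameter and no gradients. A minimal sketch of one sweep, assuming only that each parameter enters the cost sinusoidally (this is illustrative, not the PR's implementation):

```python
import math


def rotosolve_step(cost, params):
    """One Rotosolve sweep (Ostaszewski et al.): update each parameter
    to the closed-form minimum of the sinusoidal cost restriction
    a*sin(theta + b) + c. `cost` maps a parameter list to a scalar."""
    params = list(params)
    for d in range(len(params)):
        phi = 0.0  # reference angle; any value works

        def shifted(theta_d):
            trial = params.copy()
            trial[d] = theta_d
            return cost(trial)

        m0 = shifted(phi)
        m_plus = shifted(phi + math.pi / 2)
        m_minus = shifted(phi - math.pi / 2)
        # argmin of a*sin(theta + b) + c, recovered from three evaluations
        theta_star = phi - math.pi / 2 - math.atan2(
            2 * m0 - m_plus - m_minus, m_plus - m_minus)
        # wrap into (-pi, pi] for numerical hygiene
        params[d] = (theta_star + math.pi) % (2 * math.pi) - math.pi
    return params
```

On a cost that is a sum of independent sinusoids, a single sweep reaches the global minimum; on a real circuit the parameters couple, so several sweeps are needed.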