[UnitaryHack] Added Rotosolve optimizer #93

Closed · wants to merge 3 commits

Conversation

@Shiro-Raven (Contributor) commented on May 26, 2023

This PR adds the Rotosolve optimizer from the paper by Ostaszewski et al. to lambeq.

Addresses #85.
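For context, Rotosolve exploits the fact that for rotation-like gates the cost is sinusoidal in each parameter, so each parameter can be jumped straight to its closed-form minimiser using three evaluations. A minimal sketch of one sweep, assuming a scalar `objective` callable (an illustrative free-function form, not lambeq's actual optimizer interface):

```python
import numpy as np

def rotosolve_sweep(objective, params):
    """One Rotosolve sweep (Ostaszewski et al.): along each parameter
    axis the cost is A*sin(theta + B) + C, so three evaluations pin
    down the sinusoid and its argmin. `objective` maps a parameter
    vector to a scalar loss; this standalone form is an assumption
    for illustration, not lambeq's API.
    """
    params = np.asarray(params, dtype=float).copy()
    for d in range(len(params)):
        theta = params[d]

        def evaluate(angle):
            shifted = params.copy()
            shifted[d] = angle
            return objective(shifted)

        m0 = evaluate(theta)
        m_plus = evaluate(theta + np.pi / 2)
        m_minus = evaluate(theta - np.pi / 2)
        # Closed-form argmin of the sinusoid along this axis.
        params[d] = theta - np.pi / 2 - np.arctan2(
            2 * m0 - m_plus - m_minus, m_plus - m_minus)
        # Wrap into [-pi, pi) to keep angles bounded.
        params[d] = (params[d] + np.pi) % (2 * np.pi) - np.pi
    return params
```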

@Thommy257 (Contributor) left a comment:


I've left a bunch of (nit-picky) comments. In general, this looks great already! Good work!

I'll test the functionality and then come back to you.

self.update_hyper_params()
self.zero_grad()

def update_hyper_params(self) -> None:

no need to define this method

Comment on lines 151 to 160
def state_dict(self) -> dict[str, Any]:
"""Return optimizer states as dictionary.

Returns
-------
dict
A dictionary containing the current state of the optimizer.

"""
raise NotImplementedError

return empty dict
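i.e. something like this (a sketch of the suggested fix; Rotosolve is stateless, so there is nothing to snapshot):

```python
from typing import Any

def state_dict(self) -> dict[str, Any]:
    """Return optimizer states as dictionary."""
    return {}  # Rotosolve keeps no internal state between steps.
```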

Comment on lines 162 to 171
def load_state_dict(self, state_dict: Mapping[str, Any]) -> None:
"""Load state of the optimizer from the state dictionary.

Parameters
----------
state_dict : dict
A dictionary containing a snapshot of the optimizer state.

"""
raise NotImplementedError

just do a pass here
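And correspondingly (sketch):

```python
from collections.abc import Mapping
from typing import Any

def load_state_dict(self, state_dict: Mapping[str, Any]) -> None:
    """Load state of the optimizer from the state dictionary."""
    pass  # Nothing to restore for a stateless optimizer.
```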


"""
RotosolveOptimizer
=============

add some `=` (the underline should be at least as long as the title)

Comment on lines 26 to 27
'SPSAOptimizer',
'RotosolveOptimizer',

re-order

Comment on lines 87 to 88
'SPSAOptimizer',
'RotosolveOptimizer',

re-order (Rotosolve before SPSA)

Comment on lines 129 to 130
from lambeq.training import (Checkpoint, Dataset, Optimizer, SPSAOptimizer,
                             Model, NumpyModel, PennyLaneModel, PytorchModel,
                             QuantumModel, TketModel, Trainer, PytorchTrainer,
                             RotosolveOptimizer, Model, NumpyModel,

re-order

self.model.weights = self.gradient
self.model.weights = self.project(self.model.weights)

self.update_hyper_params()

no need to call this here

@Thommy257 (Contributor) commented:

One note: model(diagrams) returns the circuit result for each diagram. You will also need to take the loss function into account; otherwise, it won't be possible to calculate the updated theta (see the sketch after this comment).

Also have a look at the tests, there are some linting issues.
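Concretely, the quantity Rotosolve fits its sinusoid to is the scalar loss, not the raw circuit outputs, so each evaluation needs something like the following. This is a hypothetical helper; the names `model`, `diagrams`, `targets` and `loss_fn` are assumptions for illustration:

```python
def make_objective(model, diagrams, targets, loss_fn):
    """Build the scalar objective that Rotosolve minimises.

    Hypothetical helper: model(diagrams) alone returns per-diagram
    circuit results, so the loss function is needed to reduce them
    to the single number the sinusoidal fit requires.
    """
    def objective(weights):
        model.weights = weights
        predictions = model(diagrams)  # circuit result per diagram
        return loss_fn(predictions, targets)
    return objective
```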

@Thommy257 (Contributor) commented:

Hi @Shiro-Raven. Please check the initial GitHub issue. I've created an example notebook to test your optimiser.

@dimkart (Contributor) commented on Jun 7, 2023

@Shiro-Raven Hi, please use the notebook provided by @Thommy257 in the issue to check your optimiser, so we can proceed and assign the task to you.

Note that currently one test is failing, but it's not relevant to your PR -- we'll have a look at it at some point.

@le-big-mac (Collaborator) commented:

@Shiro-Raven we've fixed the test error, so if you merge those changes into your PR all the tests should now pass.

@dimkart (Contributor) commented on Jun 12, 2023

@Shiro-Raven Please address any remaining comments by tomorrow 13/6, last day of the hackathon.

@natestemen commented:

Just to clarify here, hackers have until June 20th to fully finalize contributions to be eligible for a unitaryHACK bounty. The June 13th deadline was for submitting pull requests.

@dimkart (Contributor) commented on Jun 18, 2023

> Just to clarify here, hackers have until June 20th to fully finalize contributions to be eligible for a unitaryHACK bounty. The June 13th deadline was for submitting pull requests.

@natestemen Noted, thanks.

@ianyfan closed this in d9471b2 on Jul 28, 2023.