
[Templates] Rewrite layer templates as operations #1163

Merged (26 commits merged into master on Apr 9, 2021)
Conversation

@mariaschuld (author) commented on Mar 25, 2021:

Porting layer templates to operations.

The changes are:

  • docstrings improved
  • broadcast unpacked in every decomposition
  • shape method added and mentioned in the docstring (see the sketch after this list)
  • tests moved to a separate file and largely rewritten
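
For illustration, a minimal sketch of how the new shape method is meant to be used; StronglyEntanglingLayers and the sizes here are just example choices, not code taken from this PR's diff:

```python
import pennylane as qml
from pennylane import numpy as np

# shape() reports the expected dimensions of the weight tensor, so initial
# weights can be drawn without consulting the docstring for the layout.
shape = qml.templates.StronglyEntanglingLayers.shape(n_layers=2, n_wires=3)
weights = np.random.random(size=shape)

dev = qml.device("default.qubit", wires=3)

@qml.qnode(dev)
def circuit(weights):
    qml.templates.StronglyEntanglingLayers(weights, wires=range(3))
    return qml.expval(qml.PauliZ(0))

print(circuit(weights))
```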

There are two outstanding issues:

  • Since CVNeuralNetLayers depends on Interferometer, I thought I would quickly port that template as well, but there was a problem with the existing gate! I have commented out the problematic test case for now, and another one is failing.
  • @Thenerdstation is fixing a problem with JAX differentiation in ParticleConservingU1.

Otherwise this is ready to review. Sorry, I know it is a nightmare because so much code moved, but a sanity check of the tests, the structure, and the docstrings would be great!

@github-actions commented:

Hello. You may have forgotten to update the changelog!
Please edit .github/CHANGELOG.md with:

  • A one-to-two sentence description of the change. You may include a small working example for new features.
  • A link back to this PR.
  • Your name (or GitHub username) in the contributors section.

@mariaschuld changed the title from "[Templates] [WIP] Rewrite layer templates as operations" to "[Templates] Rewrite layer templates as operations" on Mar 25, 2021
return qml.expval(qml.X(0))


# def circuit_decomposed(*weights):
@mariaschuld (author):

This has to be uncommented once the interferometer issue is solved

Reviewer:

Reminder to uncomment!

@mariaschuld (author):

Ah, thanks, I forgot!

),
(np.array([1, 1]), np.array([0.0 + 0.0j, 0.0 + 0.0j, 0.0 + 0.0j, 1.0 + 0.0j])),
],
)
@mariaschuld (author):

Todo: As far as I know, we discourage test cases whose expected values were simply produced by running the code and recording the output... better to have logical sanity checks?

@ixfoduap (reviewer) left a comment:

Having a hard time reviewing this PR with so many changes. Will get back to it with a clearer mind

Resolved thread: pennylane/templates/layers/cv_neural_net.py
broadcast(unitary=Displacement, pattern="single", wires=wires, parameters=a_and_phi_a)
@staticmethod
def shape(n_layers, n_wires):
r"""Returns a list of shapes for the 11 parameter tensors.
Reviewer:

😁 😁
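
As a side note, a minimal sketch of how the shape method quoted above could be used; the layer and wire counts are hypothetical:

```python
import pennylane as qml
from pennylane import numpy as np

# CVNeuralNetLayers.shape returns one shape per weight tensor (11 in total),
# so initial weights can be drawn in a simple comprehension.
shapes = qml.templates.CVNeuralNetLayers.shape(n_layers=2, n_wires=4)
weights = [np.random.random(size=s) for s in shapes]
print(len(weights))  # 11 weight tensors
```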

Resolved thread: tests/templates/test_layers/test_cv_neural_net.py
@codecov bot commented on Apr 6, 2021:

Codecov Report

Merging #1163 (ace1680) into master (779f9ec) will increase coverage by 0.00%.
The diff coverage is 100.00%.


@@           Coverage Diff           @@
##           master    #1163   +/-   ##
=======================================
  Coverage   98.11%   98.12%           
=======================================
  Files         145      145           
  Lines       10943    10971   +28     
=======================================
+ Hits        10737    10765   +28     
  Misses        206      206           
Impacted Files Coverage Δ
pennylane/templates/layers/basic_entangler.py 100.00% <ø> (ø)
pennylane/templates/layers/cv_neural_net.py 100.00% <100.00%> (ø)
...nnylane/templates/layers/particle_conserving_u1.py 100.00% <100.00%> (ø)
...nnylane/templates/layers/particle_conserving_u2.py 100.00% <100.00%> (ø)
pennylane/templates/layers/random.py 100.00% <100.00%> (ø)
...ennylane/templates/layers/simplified_two_design.py 100.00% <100.00%> (ø)
pennylane/templates/layers/strongly_entangling.py 100.00% <100.00%> (ø)


Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 779f9ec...ace1680.

@mariaschuld (author):

@ixfoduap and @Thenerdstation I know this is a boring PR, but would you mind having a general look? :)

Resolved thread: pennylane/templates/layers/basic_entangler.py
Comment on lines +205 to +230
def expand(self):

    if self.seed is not None:
        np.random.seed(self.seed)

    shape = qml.math.shape(self.parameters[0])

    with qml.tape.QuantumTape() as tape:

        for l in range(self.n_layers):

            i = 0
            while i < shape[1]:
                if np.random.random() > self.ratio_imprimitive:
                    # apply a random rotation gate to a random wire
                    gate = np.random.choice(self.rotations)
                    rnd_wire = self.wires.select_random(1)
                    gate(self.parameters[0][l, i], wires=rnd_wire)
                    i += 1
                else:
                    # apply the entangler to two random wires
                    if len(self.wires) > 1:
                        rnd_wires = self.wires.select_random(2)
                        self.imprimitive(wires=rnd_wires)

    return tape
Reviewer:

Do we care at all that expanding the tape will give different circuits each time?

@mariaschuld (author):

That's the idea of this template... to avoid it, the user must provide a seed.
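
For context, a minimal sketch of the seeded usage described here; the weights, wire count, and seed value are made up:

```python
import pennylane as qml
from pennylane import numpy as np

dev = qml.device("default.qubit", wires=2)
weights = np.random.random(size=(1, 3))

@qml.qnode(dev)
def circuit(weights):
    # Passing a seed makes the randomly drawn gate sequence reproducible,
    # so repeated expansions of the template yield the same circuit.
    qml.templates.RandomLayers(weights, wires=range(2), seed=42)
    return qml.expval(qml.PauliZ(0))

print(circuit(weights))
```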

strongly_entangling_layer(
weights=weights[l], wires=wires, r=ranges[l], imprimitive=imprimitive
)
return n_layers, n_wires, 3
Reviewer:

Suggested change:
-        return n_layers, n_wires, 3
+        return (n_layers, n_wires, 3)

Doesn't matter at all, but it makes it clearer to me that this should be treated as a tuple and not as three separate return values.

@mariaschuld (author):

CodeFactor complains about the unnecessary brackets though...?

Comment on lines -79 to -90
QUBIT_DIFFABLE_NONDIFFABLE = [(qml.templates.AmplitudeEmbedding,
{'features': [1 / 2, 1 / 2, 1 / 2, 1 / 2]},
{'wires': [0, 1], 'normalize': False},
2),
(qml.templates.AmplitudeEmbedding,
{'features': [1 / 2, 1 / 2, 1 / 2, 1 / 2]},
{'wires': [0, 1], 'normalize': True},
2),
(qml.templates.BasisEmbedding,
{},
{'wires': [0, 1], 'features': [1, 0]},
2),
Reviewer:

Any reason you removed most of these?

@mariaschuld (author):

I was a bit inconsistent here... the test_integration file will be deleted once all refactor PRs are merged, because the old tests are superseded. I first deleted things from this file in each PR, then realised I can just leave it and delete it at the end. So I'll take care of this!


assert type(tape.operations[0]) == rotation
assert type(tape.operations[1]) == rotation
assert rotation in [type(gate) for gate in queue]
Reviewer:

This doesn't check that all gates are rotations, just that a rotation exists in that list.

Can you instead do assert all(type(gate) == rotation for gate in queue)?

@mariaschuld (author):

But the queue does not only contain rotations... I basically just want to check that the custom gate type is used here; where it is used is tested in the first test.
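
A self-contained sketch of the check being discussed, using BasicEntanglerLayers with a custom rotation as a stand-in example (not necessarily the exact test code):

```python
import pennylane as qml
import numpy as np

# The expanded queue mixes rotation and entangling gates, so the test only
# asserts that the requested custom rotation type appears somewhere in it.
weights = np.random.random(size=(1, 2))
op = qml.templates.BasicEntanglerLayers(weights, wires=range(2), rotation=qml.RZ)
queue = op.expand().operations
assert qml.RZ in [type(gate) for gate in queue]
```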

Resolved thread: tests/templates/test_layers/test_basic_entangler.py

res = circuit(weights)
res2 = circuit2(weights)
assert qml.math.allclose(res, res2, atol=tol, rtol=0)
Reviewer:

Suggested change:
-    assert qml.math.allclose(res, res2, atol=tol, rtol=0)
+    np.testing.assert_allclose(res, res2, atol=tol, rtol=0)

This works with TF/Torch/JAX tensor types too!

@mariaschuld (author):

I just checked: it works here, but not when a tensor has a grad attached... so I will not change it everywhere.
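
A small illustration of the failure mode alluded to here, assuming Torch is installed; the values are arbitrary:

```python
import numpy as np
import torch

# A tensor that still carries gradient information cannot be converted to
# NumPy directly, so the suggested np.testing.assert_allclose raises here.
a = torch.tensor([0.1, 0.2], requires_grad=True)
b = np.array([0.1, 0.2])

try:
    np.testing.assert_allclose(a, b, atol=1e-7, rtol=0)
except (RuntimeError, TypeError) as err:
    print("np.testing.assert_allclose failed:", err)

# Detaching first sidesteps the problem, but that is exactly the extra step
# avoided by comparing with qml.math.allclose instead.
np.testing.assert_allclose(a.detach().numpy(), b, atol=1e-7, rtol=0)
```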

circuit()
circuit2()

assert np.allclose(dev.state, dev2.state, atol=tol, rtol=0)
Reviewer:

Can we instead have the circuits return qml.state() and check that output instead of relying on dev.state? In the next quarter, we're going to start enforcing that devices be stateless objects/functions, so this will be one of the main things that will be deprecated.

@mariaschuld (author):

Sorry, I should have left a comment about this (I did so in the other PRs): custom wire labels do not work with qml.state(). So when dev.state gets deprecated, this test is a good motivation to also fix that limitation. I'd suggest leaving it like this for now?
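
For context, a sketch of the comparison pattern in question; BasicEntanglerLayers, the weights, and the wire labels are placeholders rather than the exact test code:

```python
import pennylane as qml
import numpy as np

weights = np.random.random(size=(1, 3))

# Two devices, one with default integer wires and one with custom labels;
# the final states are compared through dev.state because returning
# qml.state() does not work with custom wire labels at the moment.
dev = qml.device("default.qubit", wires=3)
dev2 = qml.device("default.qubit", wires=["z", "a", "k"])

@qml.qnode(dev)
def circuit():
    qml.templates.BasicEntanglerLayers(weights, wires=range(3))
    return qml.expval(qml.Identity(0))

@qml.qnode(dev2)
def circuit2():
    qml.templates.BasicEntanglerLayers(weights, wires=["z", "a", "k"])
    return qml.expval(qml.Identity("z"))

circuit()
circuit2()
assert np.allclose(dev.state, dev2.state, atol=1e-7, rtol=0)
```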

Resolved thread: tests/templates/test_layers/test_simplified_twodesign.py
@ixfoduap (reviewer) left a comment:

I have to admit that my head hurts trying to review this PR. I'm requesting changes to ensure that Chase's comments are addressed and that the commented-out section of the code is restored once the interferometer issue is resolved.

Otherwise I'm willing to approve so as not to slow down the process, but I do so without confidence in the correctness of the changes.

Resolved thread: pennylane/templates/layers/basic_entangler.py
return qml.expval(qml.X(0))


# def circuit_decomposed(*weights):
Reviewer:

Reminder to uncomment!

Co-authored-by: Chase Roberts <chase@xanadu.ai>
@mariaschuld (author):

Thanks so much for the reviews, I know they are hard, but you already picked up a few crucial oversights!

Ready for final check!

@ixfoduap (reviewer) left a comment:

YOLO approve

@mariaschuld mariaschuld merged commit 5645fd2 into master Apr 9, 2021
@mariaschuld mariaschuld deleted the rewrite-layers branch April 9, 2021 07:02