
Fix qnn module tutorial for TorchLayer #247

Merged 5 commits into dev from fix_torch_qnn_tutorial on Apr 14, 2021
Conversation

trbromley (Collaborator) commented:

Currently the tutorial is failing due to the issue highlighted here: PennyLaneAI/pennylane#1210.

As a quick fix in this PR, we swap out BasicEntanglerLayers for its explicit decomposition.
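
(For context, a minimal sketch of what such an explicit decomposition looks like, assuming the tutorial's two-qubit circuit and the default RX rotations that `BasicEntanglerLayers` applies; the helper name `entangler_layers` is only for illustration and is not part of the PR.)

```python
import pennylane as qml

n_qubits = 2  # the tutorial's two-qubit circuit

def entangler_layers(weights):
    # Gates that qml.templates.BasicEntanglerLayers(weights, wires=range(n_qubits))
    # would apply with its default RX rotations, written out by hand.
    for weights_layer in weights:      # one row of parameters per layer
        qml.RX(weights_layer[0], wires=0)
        qml.RX(weights_layer[1], wires=1)
        qml.CNOT(wires=[0, 1])         # on two wires the CNOT ring reduces to a single gate
```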

albi3ro (Contributor) left a comment:

Hopefully the bug gets fixed soon, and I'm glad it got caught :)

I suggested adding a comment, but that's not particularly critical. Approve.

```diff
@@ -86,7 +86,10 @@
 @qml.qnode(dev)
 def qnode(inputs, weights):
     qml.templates.AngleEmbedding(inputs, wires=range(n_qubits))
-    qml.templates.BasicEntanglerLayers(weights, wires=range(n_qubits))
+    for weights_layer in weights:
```
albi3ro (Contributor) commented on this line:
Suggested change:

```diff
-    for weights_layer in weights:
+    # explicit `qml.templates.BasicEntanglerLayers` until a bug is fixed
+    for weights_layer in weights:
```

Maybe just add a comment here noting that there is usually a better way of doing this.

trbromley (Collaborator, Author) replied:
🤔 I see your point, although I wonder if it may confuse the reader: unless they also look at the Keras tutorial, they may not be expecting to see BasicEntanglerLayers.

trbromley changed the base branch from master to dev on April 14, 2021 at 11:39
trbromley requested a review from albi3ro on April 14, 2021 at 14:00
Comment on lines +88 to +90:

```python
    # Embedding
    qml.RX(inputs[0], wires=0)
    qml.RX(inputs[1], wires=1)
```
trbromley (Collaborator, Author) commented:
@albi3ro, sorry to re-request your review! We additionally have to change this, as the bug is also present in AngleEmbedding.
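
(For context, with both templates written out as explicit gates, the QNode can still be converted to a Torch layer as the tutorial does. The sketch below uses assumed values for the device, the number of layers, and the measurement, since none of these appear in this conversation.)

```python
import pennylane as qml

n_qubits = 2
n_layers = 6  # illustrative value; the tutorial defines its own
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def qnode(inputs, weights):
    # Embedding: explicit RX rotations in place of qml.templates.AngleEmbedding
    qml.RX(inputs[0], wires=0)
    qml.RX(inputs[1], wires=1)

    # Entangling layers: explicit gates in place of qml.templates.BasicEntanglerLayers
    for weights_layer in weights:
        qml.RX(weights_layer[0], wires=0)
        qml.RX(weights_layer[1], wires=1)
        qml.CNOT(wires=[0, 1])

    # Measurement assumed here: PauliZ expectation value on each wire
    return [qml.expval(qml.PauliZ(wires=i)) for i in range(n_qubits)]

# Declare the shape of "weights" so TorchLayer can create the trainable parameters
weight_shapes = {"weights": (n_layers, n_qubits)}
qlayer = qml.qnn.TorchLayer(qnode, weight_shapes)
```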

albi3ro (Contributor) left a comment:

I don't see any more templates. Hopefully this time it's good to go :)

josh146 merged commit 1866e7e into dev on Apr 14, 2021
josh146 deleted the fix_torch_qnn_tutorial branch on April 14, 2021 at 15:38
trbromley added a commit that referenced this pull request Apr 16, 2021
trbromley added a commit that referenced this pull request Apr 19, 2021
* Revert "Fix qnn module tutorial for TorchLayer (#247)"

This reverts commit 1866e7e.

* Update config.yml

Co-authored-by: Josh Izaac <josh146@gmail.com>