
[unitaryhack] Initializing respective weights by different method for TorchLayer #2678

Merged (30 commits, Jun 10, 2022)

Conversation

@amitjansc (Contributor) commented Jun 8, 2022

Before submitting

Please complete the following checklist when submitting a PR:

  • All new features must include a unit test.
    If you've fixed a bug or added code that should be tested, add a test to the
    test directory!

  • All new functions and code must be clearly commented and documented.
    If you do make documentation changes, make sure that the docs build and
    render correctly by running make docs.

    Note: Running make docs raises the following error: You must configure the bibtex_bibfiles setting

  • Ensure that the test suite passes, by running make test.

  • Add a new entry to the doc/releases/changelog-dev.md file, summarizing the
    change, and including a link back to the PR.

  • The PennyLane source code conforms to
    PEP8 standards.
    We check all of our code against Pylint.
    To lint modified files, simply pip install pylint, and then
    run pylint pennylane/path/to/file.py.

When all the above are checked, delete everything above the dashed
line and fill in the pull request template.


Context:

The init_method argument of TorchLayer's constructor has been redefined. It now accepts two different types:

  1. A torch.nn.init method, used to initialize all weights.
  2. A dictionary containing a torch.nn.init method or a torch.Tensor for each weight.

Depending on the init_method used, the newly defined _init_weights method initializes and registers all the weights.

This PR also fixes a small bug where the user could define weights with a negative shape, which would silently be assigned to a scalar Tensor.
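For illustration, here is a minimal sketch of the first form, with a single torch.nn.init method applied to every weight (the circuit and shapes here are hypothetical, not taken from this PR):

import torch
import pennylane as qml

n_qubits = 2
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def qnode(inputs, weights):
    qml.templates.AngleEmbedding(inputs, wires=range(n_qubits))
    qml.templates.BasicEntanglerLayers(weights, wires=range(n_qubits))
    return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]

# A single init method is used to initialize every weight in weight_shapes.
weight_shapes = {"weights": (3, n_qubits)}
qlayer = qml.qnn.TorchLayer(qnode, weight_shapes, init_method=torch.nn.init.normal_)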

Description of the Change:

  • pennylane/qnn/torch.py:

    • Moved weight initialization inside _init_weights():

      This method defines a helper (init_weight) that returns an initialized weight according to the given init_method, and then uses this helper to initialize all the weights defined in weight_shapes.

    • Changed the if condition used when building weight_shapes, so that only shapes equal to 0 or 1 are cast to an empty tuple (see the sketch after this list).

    • Style changes (a2ba68e): merged nested if conditions; converted a for loop into a list comprehension.

    • Added missing QNode type.

  • tests/qnn/test_qnn_torch.py: Added tests with all possible types of init_method arguments.
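For illustration, the shape handling described in the second sub-bullet above could be sketched as a standalone helper (normalize_shape is a hypothetical name, not the PR's actual code):

def normalize_shape(size):
    # Non-iterable shapes equal to 0 or 1 become an empty tuple (a 0-dim,
    # scalar tensor). Other non-iterable sizes are kept as-is, so a negative
    # shape now fails loudly in torch instead of silently becoming a scalar.
    if isinstance(size, int):
        return () if size in (0, 1) else (size,)
    return tuple(size)

assert normalize_shape(1) == ()       # scalar weight, e.g. weight_3 below
assert normalize_shape(3) == (3,)     # e.g. weights_2 below
assert normalize_shape((1,)) == (1,)  # explicit one-element shapes are kept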

Benefits:

In TorchLayer, we can now specify a per-weight initialization method. Consequently, it is possible to initialize, for example, multiple entangling layers in one QNode with different initialization methods.

Possible Drawbacks:

If init_method is a dictionary, it must contain the same keys as the weight_shapes dictionary; if not, a KeyError is raised. I can add a small check to raise a more self-explanatory error if needed.
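Such a check could look roughly like the following sketch (illustrative only, not code from this PR):

def validate_init_method(init_method, weight_shapes):
    # Fail early with a descriptive message instead of a bare KeyError
    # when an initializer is missing for one of the declared weights.
    if isinstance(init_method, dict):
        missing = set(weight_shapes) - set(init_method)
        if missing:
            raise KeyError(f"init_method is missing entries for: {sorted(missing)}")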

torch.Tensor cannot be used in the input/output type annotations of the TorchLayer class methods, because it crashes the tests that run without torch.
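A common workaround (a suggestion on my part, not what this PR does) is to confine the torch import to type-checking time and use string annotations:

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Seen only by static type checkers and never imported at runtime,
    # so tests that run without torch installed are unaffected.
    import torch

def init_weight(weight_shape) -> "torch.Tensor":  # hypothetical signature
    ...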

Related GitHub Issues:

Issue #898.

@amitjansc amitjansc marked this pull request as ready for review June 8, 2022 22:25
@antalszava (Contributor) commented

Hi @amitjansc, thank you for this PR! 🙂 🎉 Let us know if we should go for a review. 👍

codecov bot commented Jun 9, 2022

Codecov Report

Merging #2678 (220258c) into master (6f8db28) will increase coverage by 0.00%.
The diff coverage is 100.00%.

@@           Coverage Diff           @@
##           master    #2678   +/-   ##
=======================================
  Coverage   99.61%   99.61%           
=======================================
  Files         251      251           
  Lines       20667    20675    +8     
=======================================
+ Hits        20587    20595    +8     
  Misses         80       80           
Impacted Files           Coverage  Δ
pennylane/qnn/torch.py   100.00% <100.00%> (ø)

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data

@amitjansc (Author) commented Jun 9, 2022

Hi @antalszava,

I added a small commit checking the size of the given Tensor in the init_method. If all workflow checks are successful, the PR can be reviewed.

Please find below a working example where I initialize the weights using different methods and/or specific Tensor values:

import torch
import pennylane as qml

n_qubits = 1
dev = qml.device("default.qubit", wires=2)  # two wires: the circuit below also acts on wire 1


@qml.qnode(dev)
def qnode(inputs, weights_0, weights_1, weights_2, weight_3, weight_4):
    qml.templates.AngleEmbedding(inputs, wires=range(n_qubits))
    qml.templates.StronglyEntanglingLayers(weights_0, wires=range(n_qubits))
    qml.templates.BasicEntanglerLayers(weights_1, wires=range(n_qubits))
    qml.Rot(*weights_2, wires=0)
    qml.RY(weight_3, wires=1)
    qml.RZ(weight_4, wires=1)
    qml.CNOT(wires=[0, 1])
    return qml.expval(qml.PauliZ(0)), qml.expval(qml.PauliZ(1))


weight_shapes = {
    "weights_0": (3, n_qubits, 3),
    "weights_1": (3, n_qubits),
    "weights_2": 3,
    "weight_3": 1,
    "weight_4": (1,),
}

init_method = {
    "weights_0": torch.nn.init.normal_,
    "weights_1": torch.nn.init.uniform,
    "weights_2": torch.tensor([1., 2., 3.]),
    "weight_3": torch.tensor(1.),  # scalar when shape is not an iterable and is <= 1
    "weight_4": torch.tensor([1.]),
}

qlayer = qml.qnn.TorchLayer(qnode, weight_shapes=weight_shapes, init_method=init_method)
print(qlayer.qnode_weights)

Prints:

{'weights_0': Parameter containing:
tensor([[[-2.4687, -0.1301, -0.2780]],

        [[ 0.5446,  0.4844,  2.5817]],

        [[-0.1818, -1.5043, -2.4209]]], requires_grad=True), 'weights_1': Parameter containing:
tensor([[0.0694],
        [0.2602],
        [0.9930]], requires_grad=True), 'weights_2': Parameter containing:
tensor([1., 2., 3.], requires_grad=True), 'weight_3': Parameter containing:
tensor(1., requires_grad=True), 'weight_4': Parameter containing:
tensor([1.], requires_grad=True)}
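As a quick sanity check on the qlayer built above, every entry, including the ones seeded from explicit tensors, is registered as a trainable parameter:

for name, param in qlayer.qnode_weights.items():
    # each entry is a torch.nn.Parameter with gradients enabled
    assert isinstance(param, torch.nn.Parameter) and param.requires_grad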

@antalszava antalszava self-requested a review June 9, 2022 19:24
@antalszava (Contributor) left a comment:

Looks good @amitjansc! 🎉 Thank you for the contribution. 🙂

Just a couple of minor comments, the addition looks great otherwise. :)

Inline review comments on pennylane/qnn/torch.py and tests/qnn/test_qnn_torch.py (all resolved).
@amitjansc (Author) commented

All comments have been addressed, @antalszava. 🎉

Before merging, note that I could not run make docs, because I get the following error message (which I don't know how to solve 😞): You must configure the bibtex_bibfiles setting.

@antalszava antalszava merged commit 6c81c07 into PennyLaneAI:master Jun 10, 2022
@antalszava antalszava changed the title Initializing respective weights by different method for TorchLayer [unitaryhack] Initializing respective weights by different method for TorchLayer Jun 22, 2022
@antalszava (Contributor) commented

Hi @amitjansc, as you've made your contribution during Unitary HACK you are eligible for swag. 🙂 To claim it you should be signed up for UnitaryHACK here and add [unitaryhack] in the title of the PR.

Let me know if you have further questions (cc @CatalinaAlbornoz).

@antalszava antalszava added the unitaryhack-accepted This contribution has been accepted for a UnitaryHack issue label Jun 22, 2022
@amitjansc (Author) commented Jun 25, 2022

> Hi @amitjansc, as you've made your contribution during Unitary HACK you are eligible for swag. 🙂 To claim it you should be signed up for UnitaryHACK here and add [unitaryhack] in the title of the PR.
>
> Let me know if you have further questions (cc @CatalinaAlbornoz).

Hi @antalszava. Great, thank you! I just signed up for UnitaryHACK. 👍

What exactly is a swag package? 😆

@amitjansc amitjansc changed the title [unitaryhack] Initializing respective weights by different method for TorchLayer [unitaryHACK] Initializing respective weights by different method for TorchLayer Jun 25, 2022
@amitjansc amitjansc changed the title [unitaryHACK] Initializing respective weights by different method for TorchLayer [unitaryhack] Initializing respective weights by different method for TorchLayer Jun 25, 2022