
Quantum Dropout implementation #4929

Open
fran-scala opened this issue Dec 8, 2023 · 7 comments
Labels
enhancement ✨ New feature or request

Comments

@fran-scala

Feature details

Hi PennyLane developers! I'm opening this feature request to propose the implementation of the novel Quantum Dropout technique in PennyLane. This feature request is motivated by the findings and insights presented in our recent paper titled "A General Approach to Dropout in Quantum Neural Networks" (arXiv:2310.04120), which will soon be published in Advanced Quantum Technologies. In our research, we explored the application of dropout techniques in Quantum Neural Networks (QNNs) to mitigate potential overfitting issues arising from the overparametrization of quantum circuits.

Given the rising interest in overparametrized QNNs from the QML community, regularization techniques will be vital. The proposed quantum dropout technique aims to provide PennyLane users with a mechanism to reduce overfitting, enhance generalization, and guide model training in QML applications.

Quantum Dropout Scheme

The dropout technique for QNNs follows a few simple steps (a code sketch follows the list below):

  1. While designing your ansatz, choose which gates will undergo dropout during the training procedure.
  2. Fix a probability of dropout, i.e. the probability of removing one of the gates chosen in step 1. This can be defined locally, layerwise or globally.
    Then at each training iteration:
  3. Sample which gates have to be removed according to the probability defined in step 2.
  4. Build the corresponding QNN.
  5. Perform the parameter update with the chosen optimizer.
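
Below is a minimal sketch of how these steps could fit together using PennyLane's autodifferentiable NumPy interface. The circuit structure, helper names (`qnn`, `sample_mask`), and hyperparameters are illustrative assumptions, not an existing API or the paper's exact code.

```python
# Minimal sketch of the scheme above (illustrative only, not an existing API):
# rotation gates selected for dropout receive angle 0 when "dropped", which
# makes them act as the identity.
import numpy as onp                # vanilla NumPy for mask sampling
import pennylane as qml
from pennylane import numpy as np  # differentiable NumPy

n_wires, n_layers = 3, 4
dev = qml.device("default.qubit", wires=n_wires)

@qml.qnode(dev)
def qnn(params, drop_mask, x):
    qml.AngleEmbedding(x, wires=range(n_wires))
    for layer in range(n_layers):
        for w in range(n_wires):
            # Steps 3-4: a dropped gate (mask entry 0) gets angle 0 -> identity
            qml.RY(params[layer, w] * drop_mask[layer, w], wires=w)
        for w in range(n_wires - 1):
            qml.CNOT(wires=[w, w + 1])
    return qml.expval(qml.PauliZ(0))

def sample_mask(p_drop, shape):
    # Steps 2-3: each selected gate is removed with probability p_drop
    return np.array(onp.random.rand(*shape) > p_drop, dtype=float, requires_grad=False)

params = np.random.uniform(-np.pi, np.pi, size=(n_layers, n_wires), requires_grad=True)
x = np.array([0.1, 0.2, 0.3], requires_grad=False)
opt = qml.GradientDescentOptimizer(stepsize=0.1)
p_drop = 0.2

for _ in range(10):
    mask = sample_mask(p_drop, (n_layers, n_wires))       # step 3
    params = opt.step(lambda p: qnn(p, mask, x), params)  # steps 4-5
```

Multiplying each parameter by its mask entry keeps the cost differentiable while making dropped gates act as the identity.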

Implementation

Rough implementation for the paper

For the extensive numerical analysis conducted in our paper, we employed PennyLane combined with JAX's JIT compilation.
Dropout essentially consists of randomly substituting a quantum gate with the identity. For parametrized gates, we reduced this to conditionally setting the parameter to 0, while for the CNOT gate we had to define a custom parametrized gate.
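
For illustration (this is not the paper's actual code), the parameter-zeroing trick can be written so that it stays compatible with jax.jit, which cannot trace Python if statements on traced values; the circuit and names below are assumptions.

```python
# Illustration of the parameter-zeroing trick under jax.jit (not the paper's
# code): the "gate or identity" choice becomes an arithmetic select on the
# angle, so no Python branching on traced values is needed.
import jax
import jax.numpy as jnp
import pennylane as qml

dev = qml.device("default.qubit", wires=2)

@qml.qnode(dev, interface="jax")
def circuit(theta, keep):
    # keep is 0.0 or 1.0: when 0, RX(0) is the identity (the gate is "dropped")
    qml.RX(jnp.where(keep > 0, theta, 0.0), wires=0)
    qml.CNOT(wires=[0, 1])  # dropping a CNOT requires a custom parametrized gate instead
    return qml.expval(qml.PauliZ(1))

fast_circuit = jax.jit(circuit)
print(fast_circuit(jnp.array(0.3), jnp.array(1.0)))  # gate kept
print(fast_circuit(jnp.array(0.3), jnp.array(0.0)))  # gate dropped
```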

Suggested implementation

We suggest implementing this technique by adding a new parameter to all the gate functions that represents the dropout probability (with a default value of 0). Then, according to this probability, the compute_matrix method returns either the gate matrix or the identity matrix (if dropout is applied).

This will also need to work in combination with JAX, which does not handle Python conditional statements well under JIT compilation.
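
As a rough, hypothetical sketch of the suggested behaviour (this is not an existing PennyLane API), the matrix selection could be expressed as an arithmetic blend rather than a Python branch, which keeps it JAX/JIT friendly:

```python
# Hypothetical sketch of the suggested compute_matrix behaviour (NOT an
# existing PennyLane API): return the gate matrix with probability 1 - p_drop,
# otherwise the identity, using an arithmetic select so it stays jit-friendly.
import jax
import jax.numpy as jnp

def maybe_dropped_matrix(gate_matrix, p_drop, key):
    """Return gate_matrix with probability 1 - p_drop, else the identity."""
    dim = gate_matrix.shape[0]
    keep = (jax.random.uniform(key) >= p_drop).astype(gate_matrix.dtype)
    # keep == 1 -> original matrix, keep == 0 -> identity (no Python branch)
    return keep * gate_matrix + (1.0 - keep) * jnp.eye(dim, dtype=gate_matrix.dtype)

# Example with the matrix of RX(theta)
theta = 0.3
rx = jnp.array([[jnp.cos(theta / 2), -1j * jnp.sin(theta / 2)],
                [-1j * jnp.sin(theta / 2), jnp.cos(theta / 2)]])
print(maybe_dropped_matrix(rx, p_drop=0.5, key=jax.random.PRNGKey(0)))
```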

How important would you say this feature is?

1: Not important. Would be nice to have.

Additional information

Conclusion

You can find more details about the technique in our preprint "A General Approach to Dropout in Quantum Neural Networks" (arXiv:2310.04120). Do not hesitate to contact me at francesco.scala@ateneopv.it if you need some insights or discussion.

We believe that incorporating this technique into PennyLane would not only enhance the library's capabilities but also contribute to the advancement of QML research, empowering researchers and practitioners. We also hope that it will become a standard practice for training deep QNNs, just like its classical counterpart.

@fran-scala fran-scala added the enhancement ✨ New feature or request label Dec 8, 2023
@isaacdevlugt
Contributor

Hey @fran-scala, thanks for the suggestion! We'll get back to you next week when we have a chance to digest the suggestions completely, but would this be something that you would like to contribute to PennyLane by opening a PR? Let us know!

@fran-scala
Author

Hey @fran-scala, thanks for the suggestion! We'll get back to you next week when we have a chance to digest the suggestions completely, but would this be something that you would like to contribute to PennyLane by opening a PR? Let us know!

Yes, I would be delighted to contribute! However, I think I will still need some support to design the implementation correctly.

@josh146
Member

josh146 commented Dec 11, 2023

Hi @fran-scala! Before designing an implementation, another option could be to write a demonstration on your paper for https://pennylane.ai/qml/demonstrations/. That is, a Jupyter notebook that showcases the quantum dropout implementation, using code to demonstrate while explaining the concepts.

The nice thing about this is it allows us to understand the implementation better (helping with future design questions), while giving a spot for us to market your research/share it with the community :)

@fran-scala
Author

Hi @fran-scala! Before designing an implementation, another option could be to write a demonstration on your paper for https://pennylane.ai/qml/demonstrations/. That is, a Jupyter notebook that showcases the quantum dropout implementation, using code to demonstrate while explaining the concepts.

The nice thing about this is it allows us to understand the implementation better (helping with future design questions), while giving a spot for us to market your research/share it with the community :)

Hi @josh146! Thanks for your proposal. The idea of making a demonstration notebook to showcase our technique sounds great, and it should actually be pretty easy to do from our code. Do I have to notify you here when I submit the demo?

@josh146
Member

josh146 commented Dec 11, 2023

Do I have to notify you here when I submit the demo?

@fran-scala nope!

To submit the demo, a pull request simply needs to be made against the https://github.com/pennylaneai/qml repository. Check out the README for more details on how to add a demo 🙂

A couple of notes:

  • Our website doesn't use notebooks as the demo format; instead, demos are Python scripts with RestructuredText comments. However, if you would like to start off with a notebook, you can convert your notebook to a Python script using this converter script

  • Feel free to open a WIP (work-in-progress) PR against that repo, even if your demo is still a draft! This way we can help with any technical issues/answer any questions you might have. In addition, within each PR we have a GitHub action that will build a preview of your demo, making it convenient to check if everything is rendering correctly.

Let me know if you have any questions!

@fran-scala
Author

Do I have to notify you here when I submit the demo?

@fran-scala nope!

To submit the demo, a pull request simply needs to be made against the https://github.com/pennylaneai/qml repository. Check out the README for more details on how to add a demo 🙂

A couple of notes:

  • Our website doesn't use notebooks as the demo format; instead, demos are Python scripts with RestructuredText comments. However, if you would like to start off with a notebook, you can convert your notebook to a Python script using this converter script
  • Feel free to open a WIP (work-in-progress) PR against that repo, even if your demo is still a draft! This way we can help with any technical issues/answer any questions you might have. In addition, within each PR we have a GitHub action that will build a preview of your demo, making it convenient to check if everything is rendering correctly.

Let me know if you have any questions!

Hi! I made a pull request in the qml repository as you suggested. Let me know if it works.

@isaacdevlugt
Contributor

Hey @fran-scala! It looks awesome! We might be a bit delayed getting to it because of the holidays, but I assure you that we will take a look at it as soon as we can. Let's move the conversation over to your PR that you made 😄

KetpuntoG added a commit to PennyLaneAI/qml that referenced this issue Mar 13, 2024
Hi! This demo request refers to [Issue
#4929](PennyLaneAI/pennylane#4929) for
PennyLane, about implementing dropout for Quantum Neural Networks
directly in PennyLane.

------------------------------------------------------------------------------------------------------------

**Title:** 

Dropout for Quantum Neural Networks

**Summary:** 

In this demo we show how to exploit the quantum version of the dropout technique to avoid overfitting in deep Quantum Neural Networks (QNNs). What follows is based on the paper “A General Approach to Dropout in Quantum Neural Networks” by F. Scala et al.

**Relevant references:**

- [1] Scala, F., Ceschini, A., Panella, M., & Gerace, D. (2023). *A General Approach to Dropout in Quantum Neural Networks*. [Adv. Quantum Technol., 2300220](https://onlinelibrary.wiley.com/doi/full/10.1002/qute.202300220)
- [2] Kiani, B. T., Lloyd, S., & Maity, R. (2020). *Learning Unitaries by Gradient Descent*. [arXiv:2001.11897](https://arxiv.org/abs/2001.11897)
- [3] Larocca, M., Ju, N., García-Martín, D., Coles, P. J., & Cerezo, M. (2023). *Theory of overparametrization in quantum neural networks*. [Nat. Comp. Science, 3, 542–551](http://dx.doi.org/10.1038/s43588-023-00467-6)

**Possible Drawbacks:**

In this demo, to show the effectiveness of the technique, dropout is
implemented by randomly setting some of the optimized parameters to 0 at
each iteration. Actual dropout should be implemented by substituting a
certain gate with the identity gate.

**Related GitHub Issues:**

PennyLane [Issue
#4929](PennyLaneAI/pennylane#4929)

----
If you are writing a demonstration, please answer these questions to
facilitate the marketing process.

* GOALS — Why are we working on this now?

We would like to implement dropout for QNNs directly in PennyLane,
referring to paper [1].


* AUDIENCE — Who is this for?

This technique (hence this demo) is for all people interested in Quantum
Machine Learning. Both researchers and enthusiasts may benefit by
learning from this demo how to avoid overfitting when using
overparametrized QNNs. We strongly believe that it will become a
standard for QML just like its classical counterpart in ML.


* KEYWORDS — What words should be included in the marketing post?

"Quantum Neural Networks","QNN", "overfitting", "dropout",
"regularization"

* Which of the following types of documentation is most similar to your
file?
(more details
[here](https://www.notion.so/xanaduai/Different-kinds-of-documentation-69200645fe59442991c71f9e7d8a77f8))
    
- [ ] Tutorial
- [x] Demo
- [ ] How-to

---------

Co-authored-by: Guillermo Alonso-Linaje <65235481+KetpuntoG@users.noreply.github.com>
Co-authored-by: Josh Izaac <josh146@gmail.com>
Co-authored-by: Isaac De Vlugt <34751083+isaacdevlugt@users.noreply.github.com>