Add module documentation for ops #2730

Merged
merged 92 commits into from
Aug 9, 2022

Changes from 88 commits

Commits (92 commits)
30650a6
Update phase decomp test (#2697)
antalszava Jun 13, 2022
333541b
Update interfaces.rst (#2698)
antalszava Jun 13, 2022
b524f25
Allow templates to be decomposed (#2704)
eddddddy Jun 14, 2022
cb85618
Deprecate `qml.ExpvalCost` (#2571)
Qottmann Jun 14, 2022
d0cb387
Update JAX jit forward mode forward evaluation (#2700)
antalszava Jun 14, 2022
76e3fa3
Improve ising gates documentation (#2711)
rmoyard Jun 15, 2022
8c39317
Support classical fisher gradients when using Autograd (#2688)
josh146 Jun 15, 2022
34dc494
Support classical Fisher gradients when using TF and torch (#2710)
eddddddy Jun 15, 2022
9ba938b
Remove `hardware` argument in `qml.qinfo.quantum_fisher` (#2695)
Qottmann Jun 15, 2022
96da6bd
Add `qinfo` measurements in supported configurations docs (#2712)
eddddddy Jun 15, 2022
e5a3ac3
Use access_state (#2719)
antalszava Jun 15, 2022
c036f79
Update docs v0.24 (#2724)
antalszava Jun 16, 2022
1d45127
Move summary to start (#2727)
eddddddy Jun 16, 2022
ddaa9d2
add ops module section
albi3ro Jun 16, 2022
b5080a8
only include functions and op_math
albi3ro Jun 16, 2022
1bb564d
added weights initialization example (#2735)
Qottmann Jun 16, 2022
4886673
Merge branch 'v0.24.0-rc0' into ops-module-docs
Jaybsoni Jun 16, 2022
40469ec
Wires not updated for a hamiltonian with in-place addition (#2738)
Jaybsoni Jun 16, 2022
cf69743
Merge branch 'v0.24.0-rc0' into ops-module-docs
Jaybsoni Jun 16, 2022
14737ea
try other way
albi3ro Jun 16, 2022
ef91cb2
try other way
albi3ro Jun 16, 2022
bf39ac0
Cleanup docs (#2736)
eddddddy Jun 16, 2022
a756e54
Documentation changes for `batch_partial` (#2737)
eddddddy Jun 17, 2022
64e5fb6
Merge branch 'v0.24.0-rc0' into ops-module-docs
albi3ro Jun 17, 2022
30d59c0
Fix sphinx class/method links (#2729)
dime10 Jun 17, 2022
666cae5
Merge branch 'v0.24.0-rc0' into ops-module-docs
rmoyard Jun 17, 2022
9dc5663
trying something else
albi3ro Jun 17, 2022
0778b2e
Fix rendering of matrix rep of ECR (#2741)
Jaybsoni Jun 17, 2022
e0e001d
Merge branch 'v0.24.0-rc0' into ops-module-docs
Jaybsoni Jun 17, 2022
9a07e07
Fix doc quantum information and ising (#2732)
rmoyard Jun 17, 2022
8100968
hope
albi3ro Jun 17, 2022
86d5a56
Merge branch 'ops-module-docs' of https://github.com/PennyLaneAI/penn…
albi3ro Jun 17, 2022
f798b58
Merge branch 'v0.24.0-rc0' into ops-module-docs
albi3ro Jun 17, 2022
3cafe56
Add quantum info measurements to introduction doc page (#2734)
albi3ro Jun 17, 2022
a21d846
maybe this time
albi3ro Jun 17, 2022
c8e1dda
probably not going to work but worth a try
albi3ro Jun 17, 2022
a6f69c1
Merge branch 'v0.24.0-rc0' into ops-module-docs
albi3ro Jun 17, 2022
58e96e0
Apply suggestions from code review
albi3ro Jun 17, 2022
7c9ede8
update with master
albi3ro Jul 26, 2022
dc7422f
Merge branch 'master' into ops-module-docs
albi3ro Jul 26, 2022
c78a42a
updates
albi3ro Jul 26, 2022
9740775
merge something
albi3ro Jul 26, 2022
de50fb5
Merge branch 'ops-module-docs' of https://github.com/PennyLaneAI/penn…
albi3ro Jul 26, 2022
142196b
update docstring
albi3ro Jul 26, 2022
d6df57f
line length problem
albi3ro Jul 26, 2022
29f70aa
Merge branch 'master' into ops-module-docs
albi3ro Jul 26, 2022
63bee53
Merge branch 'master' into ops-module-docs
albi3ro Aug 2, 2022
5a73282
fix controlled signature
albi3ro Aug 2, 2022
450915b
fix s_prod
albi3ro Aug 2, 2022
219914f
add control class notes
albi3ro Aug 2, 2022
157aca1
Merge branch 'master' into ops-module-docs
Jaybsoni Aug 2, 2022
a5988d3
update docs for Sprod and Sum
Jaybsoni Aug 3, 2022
c38a046
Merge branch 'master' into ops-module-docs
Jaybsoni Aug 3, 2022
cbd69d4
typo
Jaybsoni Aug 3, 2022
81f6aa6
Merge branch 'ops-module-docs' of https://github.com/PennyLaneAI/penn…
Jaybsoni Aug 3, 2022
12edb92
small fix
Jaybsoni Aug 3, 2022
4da67a2
fixing docs
Jaybsoni Aug 3, 2022
40fbbba
added pragma no-cover to signature property
Jaybsoni Aug 3, 2022
d29d10c
Merge branch 'master' into ops-module-docs
Jaybsoni Aug 3, 2022
76fd92e
Merge branch 'master' into ops-module-docs
Jaybsoni Aug 3, 2022
f0902ba
Apply suggestions from code review
Jaybsoni Aug 3, 2022
9b6427e
clean up
Jaybsoni Aug 3, 2022
419dcae
Merge branch 'master' into ops-module-docs
Jaybsoni Aug 3, 2022
6981376
fix sprod demo
Jaybsoni Aug 4, 2022
21fa5a4
Merge branch 'ops-module-docs' of https://github.com/PennyLaneAI/penn…
Jaybsoni Aug 4, 2022
408eedf
Merge branch 'master' into ops-module-docs
Jaybsoni Aug 4, 2022
929b7b1
add info to quick start operations guide
Jaybsoni Aug 4, 2022
ff42631
Merge branch 'ops-module-docs' of https://github.com/PennyLaneAI/penn…
Jaybsoni Aug 4, 2022
e83c454
typo
Jaybsoni Aug 4, 2022
f41a7ba
added example
Jaybsoni Aug 4, 2022
cb07d5f
Merge branch 'master' into ops-module-docs
Jaybsoni Aug 4, 2022
be4b551
Merge branch 'master' into ops-module-docs
Jaybsoni Aug 4, 2022
7ea822b
re-link top level imports to prevent duplication of docstring pages g…
Jaybsoni Aug 4, 2022
f098547
Merge branch 'ops-module-docs' of https://github.com/PennyLaneAI/penn…
Jaybsoni Aug 4, 2022
fa5983c
typo
Jaybsoni Aug 5, 2022
06e8efc
Merge branch 'master' into ops-module-docs
Jaybsoni Aug 5, 2022
ff5341b
change constructor imports to toplevel
Jaybsoni Aug 5, 2022
8336df7
Merge branch 'ops-module-docs' of https://github.com/PennyLaneAI/penn…
Jaybsoni Aug 5, 2022
a8b7826
Merge branch 'master' into ops-module-docs
Jaybsoni Aug 5, 2022
a3efeb4
Apply suggestions from code review
Jaybsoni Aug 5, 2022
dcfc891
add equal to ops.functions docstring
albi3ro Aug 5, 2022
8c2ce94
Merge branch 'master' into ops-module-docs
Jaybsoni Aug 5, 2022
043f7de
Merge branch 'v0.25.0-rc0' into ops-module-docs
albi3ro Aug 8, 2022
8f3cd2a
Update doc/introduction/operations.rst
Jaybsoni Aug 8, 2022
ff476b8
Update pennylane/ops/op_math/controlled_class.py
Jaybsoni Aug 8, 2022
a989c9a
Update pennylane/ops/op_math/sum.py
Jaybsoni Aug 8, 2022
3ee42af
Merge branch 'v0.25.0-rc0' into ops-module-docs
Jaybsoni Aug 8, 2022
b590e2a
Merge branch 'v0.25.0-rc0' into ops-module-docs
Jaybsoni Aug 8, 2022
34dd5af
lint
Jaybsoni Aug 8, 2022
287fafa
Merge branch 'v0.25.0-rc0' into ops-module-docs
Jaybsoni Aug 8, 2022
d71f15f
Merge branch 'v0.25.0-rc0' into ops-module-docs
Jaybsoni Aug 9, 2022
9d1aa71
Merge branch 'v0.25.0-rc0' into ops-module-docs
Jaybsoni Aug 9, 2022
4 changes: 4 additions & 0 deletions doc/code/qml_ops_op_math.rst
@@ -0,0 +1,4 @@
qml.ops.op_math
===============

.. automodule:: pennylane.ops.op_math
1 change: 1 addition & 0 deletions doc/index.rst
@@ -183,6 +183,7 @@ PennyLane is **free** and **open source**, released under the Apache License, Ve
code/qml_grouping
code/qml_kernels
code/qml_math
code/qml_ops_op_math
code/qml_qinfo
code/qml_numpy
code/qml_qaoa
30 changes: 25 additions & 5 deletions doc/introduction/operations.rst
@@ -42,25 +42,45 @@ Operator functions
------------------

Various functions and transforms are available for manipulating operators,
and extracting information.
and extracting information. These can be broken down into two main categories:

Operator to Operator functions
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

.. autosummary::

~pennylane.adjoint
~pennylane.ctrl
~pennylane.cond
~pennylane.op_sum
~pennylane.prod
~pennylane.s_prod
~pennylane.generator

These operator functions act on operators to produce new operators.

>>> op = qml.prod(qml.PauliX(0), qml.PauliZ(1))
>>> op = qml.op_sum(qml.Hadamard(0), op)
>>> qml.s_prod(1.2, op)
1.2*(Hadamard(wires=[0]) + PauliX(wires=[0]) @ PauliZ(wires=[1]))
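
``adjoint`` and ``ctrl`` follow the same pattern and can also be applied to
operation types and quantum functions; a rough sketch (the ``subcircuit`` below
is purely illustrative):

.. code-block:: python

    import pennylane as qml

    def subcircuit(theta, wires):
        # an illustrative sub-circuit
        qml.RX(theta, wires=wires)
        qml.Hadamard(wires=wires)

    dev = qml.device("default.qubit", wires=2)

    @qml.qnode(dev)
    def circuit(theta):
        qml.adjoint(qml.RX)(theta, wires=0)              # inverse of RX
        qml.ctrl(subcircuit, control=1)(theta, wires=0)  # subcircuit controlled on wire 1
        return qml.expval(qml.PauliZ(0))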

Operator to Other functions
^^^^^^^^^^^^^^^^^^^^^^^^^^^

.. autosummary::

~pennylane.matrix
~pennylane.eigvals
~pennylane.generator

All operator functions can be used on instantiated operators,
These operator functions act on operators and return other data types.
All operator functions can be used on instantiated operators.

>>> op = qml.RX(0.54, wires=0)
>>> qml.matrix(op)
[[0.9637709+0.j 0. -0.26673144j]
[0. -0.26673144j 0.9637709+0.j ]]

Operator functions can also be used in a functional form:
Some operator functions can also be used in a functional form:

>>> x = torch.tensor(0.6, requires_grad=True)
>>> matrix_fn = qml.matrix(qml.RX)
@@ -75,7 +95,7 @@ In the functional form, they are usually differentiable with respect to gate arg
>>> x.grad
tensor(-0.5910)

Some operator transform can also act on multiple operators, by passing
Some operator transforms can also act on multiple operators, by passing
quantum functions, QNodes or tapes:

>>> def circuit(theta):
11 changes: 11 additions & 0 deletions pennylane/ops/functions/__init__.py
@@ -13,6 +13,17 @@
# limitations under the License.
"""
This module contains functions that act on operators and tapes.
Comment on lines 13 to 15

Member:
Do you think it might be confusing having separate ops/functions and ops/op_math going forward? There might be edge cases where it's not clear to a developer which file to add the feature to.

Contributor Author:
It's already confusing. The simplify function could go in either.

Member:
Agree - should we just merge them? Have a single place for generic functions of operators?

Contributor Author:
We tend to group things in PennyLane by what type of object they are: functions, classes, transforms, etc. We should instead start grouping things by dependencies, interactions, and responsibilities. But there's no good, clear-cut answer on where to put things, especially in a language as flexible as Python.

Member:
Yep for sure, it is not an easy problem to solve. My short-term thinking was to avoid developer overhead that we already see with operations vs. templates ("should I add this to operations or templates?").

Contributor:
Re-working the docs might be a good epic to add in the future since I imagine this will require moving multiple functions and classes around.


.. currentmodule:: pennylane

.. autosummary::
:toctree: api

~eigvals
~generator
~matrix
~equal

"""
from .eigvals import eigvals
from .equal import equal
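
As a quick illustration of these functions (a rough sketch; exact output
formatting may vary between versions):

>>> import pennylane as qml
>>> qml.equal(qml.RX(0.3, wires=0), qml.RX(0.3, wires=0))
True
>>> qml.equal(qml.RX(0.3, wires=0), qml.RX(0.7, wires=0))
False
>>> qml.eigvals(qml.PauliZ(0))
array([ 1, -1])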
28 changes: 28 additions & 0 deletions pennylane/ops/op_math/__init__.py
@@ -14,10 +14,38 @@
"""
This module contains classes and functions for Operator arithmetic.

Constructor Functions
~~~~~~~~~~~~~~~~~~~~~

.. currentmodule:: pennylane

.. autosummary::
:toctree: api

~adjoint
~ctrl
~op_sum
~prod
~s_prod

Symbolic Classes
~~~~~~~~~~~~~~~~

.. currentmodule:: pennylane.ops.op_math

.. autosummary::
:toctree: api

~Adjoint
~ControlledOperation
~Controlled
~ControlledOp
~Pow
~Prod
~Sum
~SProd
~SymbolicOp

"""

from .adjoint_class import Adjoint
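
The constructor functions return instances of the corresponding symbolic classes;
a rough sketch of that relationship:

>>> import pennylane as qml
>>> type(qml.s_prod(2.0, qml.PauliX(0))).__name__
'SProd'
>>> type(qml.op_sum(qml.PauliX(0), qml.PauliZ(1))).__name__
'Sum'
>>> type(qml.prod(qml.PauliX(0), qml.PauliZ(1))).__name__
'Prod'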
9 changes: 9 additions & 0 deletions pennylane/ops/op_math/control.py
@@ -83,6 +83,13 @@ class ControlledOperation(Operation):
control_wires: A wire or set of wires.
control_values: An int or list of ints indicating the values each control wire should
take.

.. note::
Currently, the :func:`~.ctrl` transform uses this class ``ControlledOperation``. This class
wraps an entire :class:`pennylane.tape.QuantumTape`, and it is rarely supported for native device
execution. See :class:`pennylane.ops.op_math.Controlled` for a more versatile controlled operation
that wraps a single target ``Operator``.

"""

grad_method = None
@@ -197,6 +204,8 @@ def ctrl(fn, control, control_values=None):
function: A new function that applies the controlled equivalent of ``fn``. The returned
function takes the same input arguments as ``fn``.

.. seealso:: :class:`~.ControlledOperation`.

**Example**

.. code-block:: python3
29 changes: 29 additions & 0 deletions pennylane/ops/op_math/controlled_class.py
@@ -17,6 +17,8 @@

import warnings

from inspect import signature

import numpy as np
from scipy import sparse

@@ -41,6 +43,17 @@ class Controlled(SymbolicOp):
length as ``control_wires``. Defaults to ``True`` for all control wires.
work_wires (Any): Any auxiliary wires that can be used in the decomposition

.. note::
This class, ``Controlled``, denotes a controlled version of any individual operation.
:class:`~.ControlledOp` adds :class:`~.Operation` specific methods and properties to the
more general ``Controlled`` class.

The :class:`~.ControlledOperation` currently constructed by the :func:`~.ctrl` transform wraps
an entire tape and does not provide as many representations and attributes as ``Controlled``,
but :class:`~.ControlledOperation` does decompose.

.. seealso:: :class:`~.ControlledOp` and :class:`~.ControlledOperation`

**Example**

>>> base = qml.RX(1.234, 1)
@@ -99,6 +112,20 @@ class Controlled(SymbolicOp):

"""

# pylint: disable=no-self-argument
@operation.classproperty
def __signature__(cls): # pragma: no cover
# this method is defined so inspect.signature returns __init__ signature
# instead of __new__ signature
# See PEP 362

# use __init__ signature instead of __new__ signature
sig = signature(cls.__init__)
# get rid of self from signature
new_parameters = tuple(sig.parameters.values())[1:]
new_sig = sig.replace(parameters=new_parameters)
return new_sig
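
# Rough usage sketch: with the classproperty above, ``inspect.signature``
# reports the parameters of ``Controlled.__init__`` rather than those of
# ``__new__``, e.g.
#
#     >>> import inspect
#     >>> inspect.signature(Controlled)  # shows the parameters of Controlled.__init__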

# pylint: disable=unused-argument
def __new__(cls, base, *_, **__):
"""If base is an ``Operation``, then the a ``ControlledOp`` should be used instead."""
@@ -332,6 +359,8 @@ class ControlledOp(Controlled, operation.Operation):

When we no longer rely on certain functionality through ``Operation``, we can get rid of this
class.

.. seealso:: :class:`~.Controlled`
"""

def __new__(cls, *_, **__):
2 changes: 1 addition & 1 deletion pennylane/ops/op_math/prod.py
@@ -87,7 +87,7 @@ class Prod(Operator):

.. note::
When a Prod operator is applied in a circuit, its factors are applied in the reverse order.
(i.e ``Prod(op1, op2)`` corresponds to :math:`\hat{op}_{1} \dot \hat{op}_{2}` which indicates
(i.e ``Prod(op1, op2)`` corresponds to :math:`\hat{op}_{1}\dot\hat{op}_{2}` which indicates
first applying :math:`\hat{op}_{2}` then :math:`\hat{op}_{1}` in the circuit. We can see this
in the decomposition of the operator.
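
A rough sketch of that ordering (assuming the standard ``decomposition`` behaviour;
the exact output formatting may differ):

>>> op = qml.prod(qml.PauliX(0), qml.PauliZ(0))
>>> op.decomposition()  # op2 is applied first, then op1
[PauliZ(wires=[0]), PauliX(wires=[0])]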

34 changes: 29 additions & 5 deletions pennylane/ops/op_math/sprod.py
@@ -24,21 +24,20 @@

def s_prod(scalar, operator, do_queue=True, id=None):
r"""Construct an operator which is the scalar product of the
given scalar and operator provided.
given scalar and operator provided.

Args:
scalar (float or complex): the scale factor being multiplied to the operator.
operator (~.operation.Operator): the operator which will get scaled.

Keyword Args:
do_queue (bool): determines if the scalar product operator will be queued
(currently not supported). Default is True.
do_queue (bool): determines if the scalar product operator will be queued. Default is True.
id (str or None): id for the scalar product operator. Default is None.

Returns:
~ops.op_math.SProd: the operator representing the scalar product.
~ops.op_math.SProd: The operator representing the scalar product.

..seealso:: :class:`~.ops.op_math.SProd`
.. seealso:: :class:`~.ops.op_math.SProd` and :class:`~.ops.op_math.SymbolicOp`

**Example**

@@ -65,6 +64,11 @@ class SProd(SymbolicOp):
(currently not supported). Default is True.
id (str or None): id for the scalar product operator. Default is None.

.. note::
Currently, this operator cannot be queued in a circuit as an operation, only measured terminally.

.. seealso:: :func:`~.ops.op_math.s_prod`

**Example**

>>> sprod_op = SProd(1.23, qml.PauliX(0))
@@ -75,6 +79,26 @@ class SProd(SymbolicOp):
[1.23, 0. ]])
>>> sprod_op.terms()
([1.23], [PauliX(wires=[0])])

.. details::
:title: Usage Details

The SProd operation can also be measured inside a qnode as an observable.
If the circuit is parameterized, then we can also differentiate through the observable.

.. code-block:: python

dev = qml.device("default.qubit", wires=1)

@qml.qnode(dev, diff_method="best")
def circuit(scalar, theta):
qml.RX(theta, wires=0)
return qml.expval(qml.s_prod(scalar, qml.Hadamard(wires=0)))

>>> scalar, theta = (1.2, 3.4)
>>> qml.grad(circuit, argnum=[0,1])(scalar, theta)
(array(-0.68362956), array(0.21683382))

"""
_name = "SProd"

41 changes: 33 additions & 8 deletions pennylane/ops/op_math/sum.py
@@ -29,17 +29,17 @@ def op_sum(*summands, do_queue=True, id=None):
r"""Construct an operator which is the sum of the given operators.

Args:
*summands (tuple[~.operation.Operator]): the operators we want to sum together.
summands (tuple[~.operation.Operator]): the operators we want to sum together.

Keyword Args:
do_queue (bool): determines if the sum operator will be queued (currently not supported).
Default is True.
id (str or None): id for the sum operator. Default is None.
id (str or None): id for the Sum operator. Default is None.

Returns:
~ops.op_math.Sum: the operator representing the sum of summands.
~ops.op_math.Sum: The operator representing the sum of summands.

..seealso:: :class:`~.ops.op_math.Sum`
.. seealso:: :class:`~.ops.op_math.Sum`

**Example**

Expand All @@ -57,7 +57,7 @@ def _sum(mats_gen, dtype=None, cast_like=None):
r"""Private method to compute the sum of matrices.

Args:
mats_gen (Generator): a python generator which produces the matricies which
mats_gen (Generator): a python generator which produces the matrices which
will be summed together.

Keyword Args:
@@ -85,10 +85,14 @@ class Sum(Operator):
summands (tuple[~.operation.Operator]): a tuple of operators which will be summed together.

Keyword Args:
do_queue (bool): determines if the sum operator will be queued (currently not supported).
Default is True.
do_queue (bool): determines if the sum operator will be queued. Default is True.
id (str or None): id for the sum operator. Default is None.

.. note::
Currently, this operator cannot be queued in a circuit as an operation, only measured terminally.

.. seealso:: :func:`~.ops.op_math.op_sum`

**Example**

>>> summed_op = Sum(qml.PauliX(0), qml.PauliZ(0))
@@ -116,7 +120,28 @@ class Sum(Operator):
1.81677345+0.57695852j, 0. +0.j ],
[0. +0.j , 0. +0.j ,
0. +0.j , 1.81677345+0.57695852j]])
"""

The Sum operation can also be measured inside a qnode as an observable.
If the circuit is parameterized, then we can also differentiate through the
sum observable.

.. code-block:: python

sum_op = Sum(qml.PauliX(0), qml.PauliZ(1))
dev = qml.device("default.qubit", wires=2)

@qml.qnode(dev, diff_method="best")
def circuit(weights):
qml.RX(weights[0], wires=0)
qml.RY(weights[1], wires=1)
qml.CNOT(wires=[0, 1])
qml.RX(weights[2], wires=1)
return qml.expval(sum_op)

>>> weights = qnp.array([0.1, 0.2, 0.3], requires_grad=True)
>>> qml.grad(circuit)(weights)
tensor([-0.09347337, -0.18884787, -0.28818254], requires_grad=True)
"""

_eigs = {} # cache eigen vectors and values like in qml.Hermitian
