\documentclass[solutions.tex]{subfiles}
\xtitle
\begin{document}
\maketitle
\begin{exercise} 1) Show that
$(\Delta A)^2 = \avg{\bar{A}^2}$ and
$(\Delta B)^2 = \avg{\bar{B}^2}$ \\
2) Show that $[\bar{A}, \bar{B}] = [A, B]$ \\
3) Using these relations, show that
\[
\Delta A\ \Delta B \geq \frac12\left|\bra{\Psi}[A, B]\ket{\Psi}\right|
\]
\end{exercise}
As usual, let's first recall the context: $A$ and $B$ are
two observables. We defined the expectation value of an observable
$C$ with eigenvalues labelled as $c$ to be:
\[
\avg{C} := \bra{\Psi}C\ket{\Psi} = \sum_c cP(c)
\]
We construct from $C$ a new observable $\bar{C}$:
\[
\bar{C} := C-\avg{C}I
\]
Where the identity $I$ is sometimes implicit. The eigenvalues
of $\bar{C}$ are denoted $\bar{c}$ and can be expressed in terms
of $C$'s eigenvalues, denoted $c$:
\[
\bar{c} = c-\avg{C}
\]
From there, assuming a ``well-behaved'' probability
distribution $P$, we defined the square of the
\textit{standard deviation}, i.e.\ the squared uncertainty
of $C$, by:
\[
(\Delta C)^2 := \sum_c \bar{c}^2P(c)
\]
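For concreteness, here is a side example borrowing the spin-$\frac12$
observable $\sigma_x$ and the ``up'' state $\ket{u}$ from earlier
lectures (it is not needed for the exercise): measuring $\sigma_x$
in $\ket{u}$ gives $P(+1) = P(-1) = \frac12$, hence:
\[
\avg{\sigma_x} = \tfrac12(+1) + \tfrac12(-1) = 0;\qquad
(\Delta\sigma_x)^2 = \tfrac12(1-0)^2 + \tfrac12(-1-0)^2 = 1
\]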
\hrr
Let's first quickly prove that $\bar{c} = c-\avg{C}$ are
indeed the eigenvalues of $\bar{C} = C-\avg{C}I$. Consider
an eigenvalue $c$ of $C$, with associated eigenvector $\ket{c}$.
It follows that:
\begin{equation*}\begin{aligned}
~ && C\ket{c} &=&& c\ket{c} \\
\Leftrightarrow &&
C\ket{c} - \avg{C}\ket{c} &=&& c\ket{c} - \avg{C}\ket{c} \\
\Leftrightarrow &&
(C -\avg{C}I)\ket{c} &=&& (c - \avg{C})\ket{c} \\
\Leftrightarrow &&
\bar{C}\ket{c} &=&& (c - \avg{C})\ket{c} \\
\end{aligned}\end{equation*}
Meaning, $\ket{c}$ is still an eigenvector of $\bar{C}$, but
now associated to the eigenvalue $c-\avg{C}$. The $\ket{c}$
still form an orthonormal basis of the state space, so there
are no other eigenvalues (there can't be more linearly independent
eigenvectors than the dimension of the surrounding state space). $\qed$
\hrr
Similarly, we can prove that the $c^2$ are the eigenvalues of
$C^2$, for an observable $C$: again start from an eigenvalue
$c$ of $C$, associated to an eigenvector $\ket{c}$:
\[
C\ket{c} = c\ket{c}
\Rightarrow C(C\ket{c}) = C(c\ket{c})
\Leftrightarrow C^2\ket{c} = c(\underbrace{C\ket{c}}_{c\ket{c}})
\Leftrightarrow C^2\ket{c} = c^2\ket{c} \qed
\]
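As a quick sanity check, again with the spin-$\frac12$ matrix
$\sigma_x$ as a side example (not part of the exercise): $\sigma_x$
has eigenvalues $\pm 1$, and its square indeed has the single
eigenvalue $(\pm 1)^2 = 1$:
\[
\sigma_x^2 = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}^2
= \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} = I
\]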
\hrr
1) We'll prove the fact for an arbitrary observable $C$: it'll
naturally hold for both $A$ and $B$.
\begin{equation*}\begin{aligned}
(\Delta C)^2 &:=&& \sum_c \bar{c}^2P(c) && ~ \\
~ &=&& \sum_c (c-\avg{C})^2P(c) && \text{(definition of $\bar{c}$)} \\
~ &=&& \bra{\Psi}\bar{C}^2\ket{\Psi} =: \avg{\bar{C}^2} && \text{(the $\bar{c}^2$ are $\bar{C}^2$'s eigenvalues, with the same $\ket{c}$, hence the same $P(c)$) \qed} \\
\end{aligned}\end{equation*}
\hrr
2) This is an elementary calculation:
\begin{equation*}\begin{aligned}
[\bar{A}, \bar{B}] &:=&& \bar{A}\bar{B} - \bar{B}\bar{A}
&& \text{(commutator's definition)} \\
~ &=&& (A-\avg{A}I)(B-\avg{B}I) - (B-\avg{B}I)(A-\avg{A}I)
&& \text{(definition of $\bar{C}$)} \\
~ &=&& \Big(AB - \avg{A}B -\avg{B}A + \avg{A}\avg{B}I\Big)
- \Big(BA -\avg{B}A - \avg{A}B + \avg{B}\avg{A}I\Big) && ~ \\
~ &=&& AB - BA && ~ \\
~ &=:&& [A, B] && \text{(commutator's definition)} \qed \\
\end{aligned}\end{equation*}
Remember that $\avg{A}$ and $\avg{B}$ are real numbers: they commute
with each other and with the operators, which is why the cross terms
cancel pairwise.
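For instance, a side check with the spin-$\frac12$ operators (not
part of the exercise): in the ``right'' state $\ket{r}$ we have
$\avg{\sigma_x} = 1$ and $\avg{\sigma_y} = 0$, so the shifted
operators are $\bar{\sigma}_x = \sigma_x - I$ and
$\bar{\sigma}_y = \sigma_y$, and indeed:
\[
[\bar{\sigma}_x, \bar{\sigma}_y] = [\sigma_x - I, \sigma_y]
= [\sigma_x, \sigma_y] - \underbrace{[I, \sigma_y]}_{=0}
= [\sigma_x, \sigma_y]
\]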
\hrr
3) This is now just about following the reasoning preceding
the exercise in the book, as suggested by the authors, by
replacing $A$ and $B$ with $\bar{A}$ and $\bar{B}$. \\
So let:
\[
\ket{X} = \bar{A}\ket{\Psi} = (A-\avg{A}I)\ket{\Psi};\qquad
\ket{Y} = i\bar{B}\ket{\Psi}= i(B-\avg{B}I)\ket{\Psi}
\]
Recall the general form of Cauchy-Schwarz for a complex vector
space\footnote{I'm sticking to the authors' terminology and notations.}:
\[
2|X||Y| \geq |\braket{X}{Y} + \braket{Y}{X}|
\]
Where the norm is defined from the inner product:
\[
|X| = \sqrt{\braket{X}{X}}
\]
Injecting our two vectors into this Cauchy-Schwarz inequality, and
using the hermiticity of $\bar{A}$ and $\bar{B}$ (e.g.\
$\braket{X}{Y} = i\bra{\Psi}\bar{A}\bar{B}\ket{\Psi}$), yields:
\begin{equation*}\begin{aligned}
2\sqrt{\avg{\bar{A}^2}\avg{\bar{B}^2}} &\geq&&
|i(\bra{\Psi}\bar{A}\bar{B}\ket{\Psi}-\bra{\Psi}\bar{B}\bar{A}\ket{\Psi})|
& ~ \\
~ &=&&
|\bra{\Psi}[\bar{A},\bar{B}]\ket{\Psi}|
& \text{(commutator definition, $|i| = 1$)} \\
~ &=&&
|\bra{\Psi}[A,B]\ket{\Psi}| &
(\text{from 2), }[\bar{A},\bar{B}] = [A, B]) \\
\end{aligned}\end{equation*}
But from 1), we know that
\[
2\sqrt{\avg{\bar{A}^2}\avg{\bar{B}^2}} = 2\sqrt{(\Delta A)^2(\Delta B)^2}
= 2\Delta A\Delta B
\]
Note that the $\sqrt{.}$ can be removed ``safely'': each $(\Delta C)^2$ is
defined as a sum of nonnegative terms, so $\Delta A, \Delta B \geq 0$ and
no absolute values are necessary. \\
Putting the two together and dividing by $2$ yields the expected
\textit{general uncertainty principle}:
\[
\boxed{
\Delta A\Delta B \geq \frac12\left|\bra{\Psi}[A,B]\ket{\Psi}\right|
} \qed
\]
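As a closing sanity check, again borrowing the spin-$\frac12$
operators as a side example (not part of the exercise): take
$A = \sigma_x$, $B = \sigma_y$ and $\ket{\Psi} = \ket{u}$. Then
$\avg{\sigma_x} = \avg{\sigma_y} = 0$ and
$\sigma_x^2 = \sigma_y^2 = I$, so
$\Delta\sigma_x = \Delta\sigma_y = 1$, while
$[\sigma_x, \sigma_y] = 2i\sigma_z$ and $\bra{u}\sigma_z\ket{u} = 1$:
\[
\Delta\sigma_x\,\Delta\sigma_y = 1
\geq \frac12\left|\bra{u}2i\sigma_z\ket{u}\right| = \frac12|2i| = 1
\]
The inequality is saturated in this case.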
\end{document}