
Secret Key Agreement

One of the only methods of encrypting a message from Alice to Bob such that no third party (Eve) can possibly decrypt it is the one-time pad. This technique requires that Alice and Bob share a secret sequence of bits, S. Alice encrypts by computing the exclusive-or of S with the plaintext, P, to produce the ciphertext, C: C = S ⊕ P. Bob can then decrypt by XORing again: P = S ⊕ C.
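As a quick illustration of the mechanics (the byte-level framing here is our own, not part of dit), encryption and decryption are the same XOR operation:

# Minimal one-time pad sketch: the pad S must be uniformly random,
# at least as long as the message, and never reused.
import secrets

def xor_bytes(a: bytes, b: bytes) -> bytes:
    """Element-wise exclusive-or of two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

plaintext = b"attack at dawn"
pad = secrets.token_bytes(len(plaintext))  # the shared secret S

ciphertext = xor_bytes(pad, plaintext)     # C = S xor P
recovered = xor_bytes(pad, ciphertext)     # P = S xor C
assert recovered == plaintext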

In order to pull this off, Alice and Bob need to construct S out of some sort of joint randomness, p(x, y, z), and public communication, V, which is assumed to have perfect fidelity. The maximum rate at which S can be constructed is the secret key agreement rate.

Background

Given N IID copies of a joint distribution governed by p(x, y, z), let $X^N$ denote the random variables observed by Alice, $Y^N$ denote the random variables observed by Bob, and $Z^N$ denote the random variables observed by Eve. Furthermore, let $S[X : Y || Z]$ be the maximum rate R such that, for any ϵ > 0, there exist an N > 0, public communication V, and functions f and g satisfying:

$$\begin{aligned} S_X &= f(X^N, V) \\ S_Y &= g(Y^N, V) \\ p(S_X \neq S_Y \neq S) &\leq \epsilon \\ \I{S : V Z^N} &\leq \epsilon \\ \frac{1}{N} \H{S} &\geq R - \epsilon \end{aligned}$$

Intuitively, this means there exists some procedure such that, for every N observations, Alice and Bob can publicly converse and then each construct a string of roughly NR bits; these strings agree almost surely and are almost surely independent of everything Eve has access to. Such an S is known as a secret key.

Lower Bounds

Lower Intrinsic Mutual Information

The first lower bound on the secret key agreement rate is known in dit as the ``lower_intrinsic_mutual_information``, and is given by:

$$\I{X : Y \uparrow Z} = \max\{ \I{X : Y} - \I{X : Z}, \I{X : Y} - \I{Y : Z}, 0 \}$$
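Since each term is an ordinary mutual information, this bound can be evaluated by hand. A small sketch using dit's ``coinformation`` (which reduces to the mutual information for two variables) on one of the example distributions introduced below:

# Lower intrinsic mutual information, assembled from pairwise
# mutual informations.
from dit.example_dists.intrinsic import intrinsic_1
from dit.multivariate import coinformation as I

d = intrinsic_1
i_xy = I(d, [[0], [1]])  # I[X : Y]
i_xz = I(d, [[0], [2]])  # I[X : Z]
i_yz = I(d, [[1], [2]])  # I[Y : Z]

lower = max(i_xy - i_xz, i_xy - i_yz, 0)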

Secrecy Capacity

Next is the secrecy capacity:

$$\I{X : Y \uparrow\uparrow Z} = \max \begin{cases} \displaystyle \max_{U - X - YZ} \I{U : Y} - \I{U : Z} \\ \displaystyle \max_{U - Y - XZ} \I{U : X} - \I{U : Z} \end{cases}$$

This gives the secret key agreement rate when communication is not allowed.

Necessary Intrinsic Mutual Information

A tighter bound is given by the ``necessary_intrinsic_mutual_information`` [gohari2017achieving]:

$$\I{X : Y \uparrow\uparrow\uparrow Z} = \max \begin{cases} \displaystyle \max_{V - U - X - YZ} \I{U : Y | V} - \I{U : Z | V} \\ \displaystyle \max_{V - U - Y - XZ} \I{U : X | V} - \I{U : Z | V} \end{cases}$$

This quantity is actually equal to the secret key agreement rate when communication is limited to being unidirectional.
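In dit this quantity is computed by numerical optimization over the auxiliary variables U and V. A usage sketch, assuming the same ``(dist, rvs, crvs)`` calling convention as the other measures on this page; the optimization is non-convex, so results may vary between runs:

# Hedged usage sketch; calling convention assumed to match the
# other secret key agreement measures in dit.
from dit.multivariate.secret_key_agreement import necessary_intrinsic_mutual_information
from dit.example_dists.intrinsic import intrinsic_1

nimi = necessary_intrinsic_mutual_information(intrinsic_1, [[0], [1]], [2])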

Upper Bounds

Upper Intrinsic Mutual Information

The secret key agreement rate is trivially upper bounded by:

$$\min\{ \I{X : Y}, \I{X : Y | Z} \}$$
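Both terms are plain (conditional) mutual informations and are therefore cheap to compute directly, e.g.:

# The trivial upper bound, term by term, via coinformation.
from dit.example_dists.intrinsic import intrinsic_1
from dit.multivariate import coinformation as I

d = intrinsic_1
trivial = min(I(d, [[0], [1]]),       # I[X : Y]
              I(d, [[0], [1]], [2]))  # I[X : Y | Z]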

Intrinsic Mutual Information

The ``intrinsic_mutual_information`` [maurer1997intrinsic] is defined as:

$$\I{X : Y \downarrow Z} = \min_{p(\overline{z} | z)} \I{X : Y | \overline{Z}}$$

It is straightforward to see that $p(\overline{z} | z)$ being a constant achieves $\I{X : Y}$, and $p(\overline{z} | z)$ being the identity achieves $\I{X : Y | Z}$.
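Since the constant and identity channels are both feasible points of the minimization, the intrinsic mutual information can never exceed the trivial bound above; a quick sanity check:

# The channel minimization includes the constant and identity
# channels, so the intrinsic value is bounded by both endpoints.
from dit.multivariate.secret_key_agreement import intrinsic_mutual_information
from dit.multivariate import coinformation as I
from dit.example_dists.intrinsic import intrinsic_1

d = intrinsic_1
imi = intrinsic_mutual_information(d, [[0], [1]], [2])
assert imi <= min(I(d, [[0], [1]]), I(d, [[0], [1]], [2])) + 1e-6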

Reduced Intrinsic Mutual Information

This bound can be improved, producing the ``reduced_intrinsic_mutual_information`` [renner2003new]:

$$\I{X : Y \downarrow\downarrow Z} = \min_{U} \I{X : Y \downarrow ZU} + \H{U}$$

This bound improves upon the Intrinsic Mutual Information when a small amount of information, U, can result in a larger decrease in the amount of information shared between X and Y given Z and U.
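A usage sketch, again assuming the shared calling convention (the extra minimization over U makes this measure considerably more expensive than the intrinsic mutual information):

# Hedged usage sketch of the reduced intrinsic mutual information.
from dit.multivariate.secret_key_agreement import reduced_intrinsic_mutual_information
from dit.example_dists.intrinsic import intrinsic_2

rimi = reduced_intrinsic_mutual_information(intrinsic_2, [[0], [1]], [2])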

Minimal Intrinsic Mutual Information

The reduced intrinsic mutual information can be further tightened, producing the ``minimal_intrinsic_total_correlation`` [gohari2017comments]:

$$\I{X : Y \downarrow\downarrow\downarrow Z} = \min_{U} \I{X : Y | U} + \I{XY : U | Z}$$

All Together Now

Taken together, we see the following structure:

$$\begin{aligned} &\min\{ \I{X : Y}, \I{X : Y | Z} \} \\ &\quad \geq \I{X : Y \downarrow Z} \\ &\quad\quad \geq \I{X : Y \downarrow\downarrow Z} \\ &\quad\quad\quad \geq \I{X : Y \downarrow\downarrow\downarrow Z} \\ &\quad\quad\quad\quad \geq S[X : Y || Z] \\ &\quad\quad\quad\quad\quad \geq \I{X : Y \uparrow\uparrow\uparrow Z} \\ &\quad\quad\quad\quad\quad\quad \geq \I{X : Y \uparrow\uparrow Z} \\ &\quad\quad\quad\quad\quad\quad\quad \geq \I{X : Y \uparrow Z} \\ &\quad\quad\quad\quad\quad\quad\quad\quad \geq 0 \end{aligned}$$

Generalizations

Most of the above bounds have straightforward multivariate generalizations. These are not necessarily bounds on the multiparty secret key agreement rate. For example, one could compute the ``minimal_intrinsic_dual_total_correlation``:

$$\B{X_0 : \ldots : X_n \downarrow\downarrow\downarrow Z} = \min_{U} \B{X_0 : \ldots : X_n | U} + \I{X_0, \ldots, X_n : U | Z}$$
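A sketch of such a multivariate call on a toy four-variable distribution of our own construction (three parties sharing a uniform bit, with an uninformative eavesdropper); the ``bounds`` argument, used as in the final example below, caps the cardinality of U and the value 2 is our choice:

# Hedged sketch: three observers (variables 0-2) and one
# eavesdropper (variable 3) who sees a constant.
from dit import Distribution
from dit.multivariate.secret_key_agreement import minimal_intrinsic_dual_total_correlation

d = Distribution(['0000', '1110'], [1 / 2, 1 / 2])
b = minimal_intrinsic_dual_total_correlation(d, [[0], [1], [2]], [3], bounds=(2,))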

Examples

Let us consider a few examples:

In [1]: from dit.multivariate.secret_key_agreement import *

In [2]: from dit.example_dists.intrinsic import intrinsic_1, intrinsic_2, intrinsic_3

First, we consider the distribution intrinsic_1:

In [3]: print(intrinsic_1)
Class:          Distribution
Alphabet:       ('0', '1', '2', '3') for all rvs
Base:           linear
Outcome Class:  str
Outcome Length: 3
RV Names:       None

x     p(x)
000   1/8
011   1/8
101   1/8
110   1/8
222   1/4
333   1/4

With upper bounds:

In [4]: upper_intrinsic_mutual_information(intrinsic_1, [[0], [1]], [2])
Out[4]: 0.5

We see that the trivial upper bound is 0.5, because without conditioning on Z, X and Y can agree when they observe either a 2 or a 3, which results in $\I{X : Y} = 0.5$. Given Z, however, that information is no longer private. But, given Z, a conditional dependence is induced between X and Y: Eve knows that if she observes a 0 then X and Y agree, and if she observes a 1 they disagree. This results in $\I{X : Y | Z} = 0.5$. In either case, however, X and Y cannot agree upon a secret key: in the first case the eavesdropper knows their correlation, while in the second they are actually independent.

The ``intrinsic_mutual_information``, however, can detect this:

In [5]: intrinsic_mutual_information(intrinsic_1, [[0], [1]], [2])
Out[5]: 0.0

Next, let's consider the distribution intrinsic_2:

In [7]: print(intrinsic_2)
Class:          Distribution
Alphabet:       (('0', '1', '2', '3'), ('0', '1', '2', '3'), ('0', '1'))
Base:           linear
Outcome Class:  str
Outcome Length: 3
RV Names:       None

x     p(x)
000   1/8
011   1/8
101   1/8
110   1/8
220   1/4
331   1/4

In this case, Z can no longer distinguish between the case where X and Y can agree on a secret bit and the case where they cannot, because she cannot determine whether they are in the 01 regime or the 23 regime:

In [8]: intrinsic_mutual_information(intrinsic_2, [[0], [1]], [2])
Out[8]: 1.5

This seems to imply that X and Y can adopt a scheme such as: if they observe either a 0 or a 1, write down 0; and if they observe either a 2 or a 3, write down that value. This has a weakness, however: what if Z were able to distinguish the two regimes? This would cost her one bit, but would reduce the secrecy of X and Y to nil. Thus, the secret key agreement rate is actually only 1 bit:

In [9]: minimal_intrinsic_mutual_information(intrinsic_2, [[0], [1]], [2], bounds=(3,))
Out[9]: 1.0