
Commit
fix math
astonzhang committed Nov 5, 2019
1 parent 7e7a390 commit e0a4b36
Showing 1 changed file with 3 additions and 3 deletions.
6 changes: 3 additions & 3 deletions chapter_preliminaries/reduction-norm.md
@@ -351,7 +351,7 @@ In fact, the Euclidean distance is a norm:
specifically it is the $\ell_2$ norm.
Suppose that the elements in the $n$-dimensional vector
$\mathbf{x}$ are $x_1, \ldots, x_n$.
-The *$\ell_2$ norm* of $\mathbf{x}$ is the square root of the sum of the squares of the vector elements:
+The $\ell_2$ *norm* of $\mathbf{x}$ is the square root of the sum of the squares of the vector elements:

$$\|\mathbf{x}\|_2 = \sqrt{\sum_{i=1}^n x_i^2},$$
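As a quick sanity check of the $\ell_2$ formula above, it can be computed by hand and compared against `np.linalg.norm`, which the chapter uses. This is a minimal sketch; the example vector `u` is an assumption, not part of the diff.

```python
import numpy as np

# Hypothetical example vector; `u` is assumed for illustration only.
u = np.array([3.0, -4.0])

# l2 norm: square root of the sum of squared elements.
l2 = np.sqrt(np.sum(u ** 2))

# Should agree with NumPy's built-in norm used in the chapter.
print(l2, np.linalg.norm(u))  # both print 5.0
```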

@@ -364,7 +364,7 @@ np.linalg.norm(u)

In deep learning, we work more often
with the squared $\ell_2$ norm.
-You will also frequently encounter the $\ell_1$ norm,
+You will also frequently encounter the $\ell_1$ *norm*,
which is expressed as the sum of the absolute values of the vector elements:

$$\|\mathbf{x}\|_1 = \sum_{i=1}^n \left|x_i \right|.$$
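The $\ell_1$ sum-of-absolute-values formula can likewise be checked directly; a vector with mixed signs makes the role of the absolute value visible. The vector `u` here is again an assumed example.

```python
import numpy as np

# Hypothetical vector with mixed signs to exercise the absolute values.
u = np.array([1.0, -2.0, 3.0])

# l1 norm: sum of absolute values of the elements.
l1 = np.abs(u).sum()
print(l1)  # prints 6.0
```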
@@ -379,7 +379,7 @@ np.abs(u).sum()
```

Both the $\ell_2$ norm and the $\ell_1$ norm
-are special cases of the more general $\ell_p$ norm:
+are special cases of the more general $\ell_p$ *norm*:

$$\|\mathbf{x}\|_p = \left(\sum_{i=1}^n \left|x_i \right|^p \right)^{1/p}.$$
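The general $\ell_p$ formula can be sketched as a small helper, with $p=2$ and $p=1$ recovering the two special cases above. The function name `lp_norm` and the vector `u` are assumptions for illustration.

```python
import numpy as np

# Hypothetical example vector; p is the free parameter of the l_p norm.
u = np.array([3.0, -4.0])

def lp_norm(x, p):
    """Compute the l_p norm: (sum_i |x_i|^p)^(1/p)."""
    return (np.abs(x) ** p).sum() ** (1.0 / p)

# p=2 and p=1 recover the l2 and l1 norms discussed above.
print(lp_norm(u, 2))  # prints 5.0
print(lp_norm(u, 1))  # prints 7.0
```

`np.linalg.norm(u, ord=p)` gives the same values for these choices of `p`.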

