
Commit 5f78fdf: Update week5.do.txt
1 parent (52b9bda)

File tree: 1 file changed (+9, -1)

doc/src/week5/week5.do.txt

Lines changed: 9 additions & 1 deletion
@@ -310,21 +310,29 @@ just Shannon entropy, before we move over to a quantum mechanical way
 to define the entropy based on the density matrices discussed earlier.
 
 We define a set of random variables $X=\{x_0,x_1,\dots,x_{n-1}\}$ with probability for an outcome $x\in X$ given by $p_X(x)$, the
-information entropy is defined as
+classical information entropy is defined as
 !bt
 \[
 S=-\sum_{x\in X}p_X(x)\log_2{p_X(x)}.
 \]
 !et
+Why this expression? What does it mean?
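As a quick numerical illustration of the Shannon entropy formula above, a minimal sketch in Python (the function name `shannon_entropy` and the use of NumPy are assumptions, not part of the commit):

```python
import numpy as np

def shannon_entropy(p):
    """Return S = -sum_x p(x) log2 p(x) over a discrete distribution.

    Zero-probability outcomes are dropped, following the convention
    that 0 * log2(0) = 0.
    """
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# A fair coin is maximally uncertain and carries exactly one bit:
print(shannon_entropy([0.5, 0.5]))  # -> 1.0
# A certain outcome carries no information (entropy zero).
print(shannon_entropy([1.0, 0.0]))
```

The fair-coin value of one bit is the maximum for a two-outcome distribution, which matches the intuition that entropy measures uncertainty about the outcome.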
 
+!split
+===== Mathematics of entropy =====
+What is the basic idea of the entropy as it is used in information theory (we leave out the standard description from statistical physics here)?
+
+We want to have a measure of unlikelihood. Consider a simple binary system with two outcomes, true and false, where true occurs with probability $p$ and false with $1-p$. Since $p$ represents a probability
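The binary system described above has the well-known two-outcome entropy $H(p)=-p\log_2 p-(1-p)\log_2(1-p)$; a short sketch (function name `binary_entropy` is an assumption, not from the commit):

```python
import numpy as np

def binary_entropy(p):
    """Entropy of a two-outcome (true/false) system with P(true) = p."""
    if p in (0.0, 1.0):
        return 0.0  # a certain outcome is not unlikely at all
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

print(binary_entropy(0.5))   # -> 1.0, the maximum: both outcomes equally likely
print(binary_entropy(0.99))  # small: the outcome is almost certain
```

The function peaks at $p=1/2$ and vanishes at $p=0$ and $p=1$, so it behaves exactly as a measure of how unpredictable the binary outcome is.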
 !split
 ===== Von Neumann entropy =====
 
+Using the density matrix $\rho$, we define the quantum mechanical equivalent of the classical entropy as
 !bt
 \[
 S=-\mathrm{Tr}[\rho\log_2{\rho}].
 \]
 !et
+This is the so-called von Neumann entropy. How did we arrive at this expression?
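Since $\rho$ is Hermitian, $-\mathrm{Tr}[\rho\log_2\rho]$ reduces to the Shannon entropy of the eigenvalues of $\rho$, which gives a simple way to evaluate it numerically. A sketch (the helper name `von_neumann_entropy` and the eigenvalue cutoff are assumptions):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S = -Tr[rho log2 rho], computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)   # rho is Hermitian
    evals = evals[evals > 1e-12]      # 0 * log2(0) -> 0 by convention
    return -np.sum(evals * np.log2(evals))

# A pure state has zero entropy:
pure = np.array([[1.0, 0.0],
                 [0.0, 0.0]])
print(von_neumann_entropy(pure))

# The maximally mixed single-qubit state has one bit of entropy:
mixed = 0.5 * np.eye(2)
print(von_neumann_entropy(mixed))  # -> 1.0
```

The two limits mirror the classical case: a pure state is the quantum analogue of a certain outcome, while the maximally mixed state plays the role of the fair coin.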