The following is a lecture series that introduces the basic theory of deep learning.
What to know about vector calculus?
Vectors are represented in lowercase boldface font, as in $\mathbf{x} = [x_1, x_2, \dots, x_n]^\top$.
- The above example defines a Euclidean vector, which is a 1-D array of $n$ real numbers ($x_i \in \mathbb{R}$) organized into a column or a row.
- A column vector can be transposed ($\mathbf{x}^\top$) into a row vector.
import numpy as np
seq = np.arange(1, 10) # 1D array
x = seq.reshape(-1, 1) # column vector
x_transposed = x.transpose() # row vector
print("Column vector:", x, "Row vector:", x_transposed, sep="\n")

Matrices are represented in uppercase boldface font, as in $\mathbf{W}$:
- The above defines a Euclidean matrix, which is a 2-D array of $m \times n$ real numbers organized into a table with $m$ rows and $n$ columns.
- Transposing a matrix turns its rows (columns) into columns (rows).
W = np.arange(1, 10).reshape(3, -1) # 3-by-3 matrix
print("W:", W, "W^T:", W.transpose(), sep="\n")

The matrix-vector product $\mathbf{W}\mathbf{x}$ is written with the @ operator:

W = np.arange(1, 10).reshape(3, -1) # 3-by-3 matrix
x = np.arange(1, 4).reshape(-1, 1)
Wx = W @ x # matrix-vector product
print("W:", W, "x:", x, "Wx:", Wx, sep="\n")

What to know about probability theory?
- $p(y \mid \mathbf{x})$ is the probability mass function (pmf) of $y$ conditioned on $\mathbf{x}$, and
- $p(\mathbf{x})$ is the (multivariate) probability density function (pdf) of $\mathbf{x}$.
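As a minimal numerical sketch of these two objects (using a toy three-outcome pmf and a 1-D standard normal pdf as assumed examples, not distributions from the lecture), we can verify the defining properties: a pmf sums to 1 and a pdf integrates to 1.

```python
import numpy as np

# Toy pmf p(y | x) over three outcomes for a fixed x (assumed example):
# a valid pmf is non-negative and sums to 1.
pmf = np.array([0.2, 0.5, 0.3])
print("pmf sums to:", pmf.sum())

# 1-D standard normal pdf (assumed example): a valid pdf is
# non-negative and integrates to 1; we check with a Riemann sum.
grid = np.linspace(-8.0, 8.0, 100001)
dx = grid[1] - grid[0]
pdf = np.exp(-grid**2 / 2) / np.sqrt(2 * np.pi)
print("pdf integrates to:", (pdf * dx).sum())
```

The grid endpoints ($\pm 8$ standard deviations) are wide enough that the truncated tails contribute negligibly to the integral.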
For any function $f(\mathbf{x})$ of $\mathbf{x}$, the expectations are: