  • Machine Learning math background
    AI/Machine Learning 2025. 2. 13. 17:10

    Scalars

    A scalar is a single numerical value, such as an integer or a real number. Scalars are the building blocks for more complex mathematical objects like vectors and matrices.

     

    Vectors

     

    def. A vector can be defined in two ways:

    • As a one-dimensional array of scalars.
    • As a directed line segment with both magnitude and direction.

    Key Calculations with Vectors

    1. Scalar Multiplication:
      • Multiplying a vector by a scalar stretches or shrinks the vector.
    2. Inner Product (Dot Product):
      • Calculates the similarity between two vectors.
      • Formula: $$\mathbf{u} \cdot \mathbf{v} = \sum_{i=1}^{n} u_i v_i$$
    3. Outer Product:
      • Results in a matrix when two vectors are multiplied.
    4. Magnitude (Euclidean Norm):
      • The length of a vector, defined as $$\|\mathbf{v}\|_2 = \sqrt{\sum_{i=1}^{n} v_i^2}$$.
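
    The four operations above can be checked numerically. A minimal NumPy sketch (the vectors u and v are arbitrary example values):

    import numpy as np

    u = np.array([1.0, 2.0, 3.0])
    v = np.array([4.0, 5.0, 6.0])

    # 1. Scalar multiplication stretches/shrinks the vector
    print(2.0 * u)            # [2. 4. 6.]

    # 2. Inner (dot) product: sum of element-wise products
    print(np.dot(u, v))       # 4 + 10 + 18 = 32.0

    # 3. Outer product of two vectors gives a matrix
    print(np.outer(u, v))     # shape (3, 3)

    # 4. Magnitude (Euclidean norm)
    print(np.linalg.norm(u))  # sqrt(1 + 4 + 9) ≈ 3.742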

    P-Norm

    The p-norm generalizes the vector norm: $$\|\mathbf{x}\|_p = \left(\sum_{i=1}^{n} |x_i|^p\right)^{1/p}$$

    • p = 1: Manhattan norm (sum of absolute values).
    • p = 2: Euclidean norm (usual distance measure).
    • p→∞: Maximum absolute value of elements.
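
    NumPy's norm function covers these three cases through its ord argument; a small sketch with an arbitrary vector:

    import numpy as np

    x = np.array([3.0, -4.0, 0.0])

    print(np.linalg.norm(x, ord=1))       # p = 1 (Manhattan): |3| + |-4| + |0| = 7
    print(np.linalg.norm(x, ord=2))       # p = 2 (Euclidean): sqrt(9 + 16) = 5
    print(np.linalg.norm(x, ord=np.inf))  # p -> inf: max(|x_i|) = 4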

    Basis Vectors

    Basis vectors form the foundation of a vector space:

    • A set of linearly independent vectors $$S = \{u_1, u_2, \dots, u_n\}$$
    • None of these vectors can be expressed as a linear combination of the others.
    • A basis spans the entire vector space.
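
    One way to check these properties numerically is to stack the candidate vectors into a matrix and compare its rank to the dimension of the space; a minimal sketch with made-up vectors:

    import numpy as np

    # Candidate basis for R^3: each row is one vector
    S = np.array([[1.0, 0.0, 0.0],
                  [1.0, 1.0, 0.0],
                  [1.0, 1.0, 1.0]])

    # Linearly independent (and therefore a basis of R^3) iff the rank equals 3
    print(np.linalg.matrix_rank(S) == 3)  # True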

    QR Decomposition

    This method finds an orthonormal basis spanning the column space of a matrix by factoring it as $$A = QR$$, where Q has orthonormal columns and R is upper triangular. It uses the:

    • Gram-Schmidt Orthogonalization process to transform columns into orthonormal basis vectors.
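
    NumPy's np.linalg.qr computes the same kind of factorization (internally via Householder reflections rather than classical Gram-Schmidt); a small sketch with an arbitrary matrix A:

    import numpy as np

    A = np.array([[1.0, 1.0],
                  [1.0, 0.0],
                  [0.0, 1.0]])

    # Q has orthonormal columns spanning the column space of A; R is upper triangular
    Q, R = np.linalg.qr(A)

    print(np.allclose(Q.T @ Q, np.eye(2)))  # True: columns of Q are orthonormal
    print(np.allclose(Q @ R, A))            # True: A is recovered as Q R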

    Matrices

    Definition

    A matrix is a two-dimensional array of scalars or, equivalently, a one-dimensional array of vectors.

    Tensors

    Matrices are a specific type of tensor, but tensors generalize to multiple dimensions:

    • 0D tensor: Scalar.
    • 1D tensor: Vector.
    • 2D tensor: Matrix.
    • kD tensor: Tensor of order k.
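
    In NumPy terms, the order of a tensor corresponds to the array's ndim; a minimal sketch:

    import numpy as np

    scalar  = np.array(3.0)                  # 0D tensor
    vector  = np.array([1.0, 2.0])           # 1D tensor
    matrix  = np.array([[1.0, 2.0],
                        [3.0, 4.0]])         # 2D tensor
    tensor3 = np.zeros((2, 3, 4))            # order-3 tensor

    print(scalar.ndim, vector.ndim, matrix.ndim, tensor3.ndim)  # 0 1 2 3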

    Matrix Multiplication

    • Matrix multiplication involves the inner product of rows from one matrix with columns from another.
    • It is only defined when the number of columns in the first matrix equals the number of rows in the second.
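
    A quick check of both rules (the matrices here are arbitrary):

    import numpy as np

    A = np.arange(6).reshape(2, 3)    # shape (2, 3)
    B = np.arange(12).reshape(3, 4)   # shape (3, 4)

    # Defined because A has 3 columns and B has 3 rows; the result is (2, 4)
    C = A @ B
    print(C.shape)  # (2, 4)

    # Each entry is the inner product of a row of A with a column of B
    print(C[0, 0] == np.dot(A[0, :], B[:, 0]))  # True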

    Trace

    The trace of a square matrix is the sum of its diagonal elements.
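
    For example, with NumPy (an arbitrary 2x2 matrix):

    import numpy as np

    A = np.array([[1.0, 2.0],
                  [3.0, 4.0]])

    print(np.trace(A))         # 1 + 4 = 5.0
    print(np.sum(np.diag(A)))  # same value, summing the diagonal explicitly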

    Rank

    The rank of a matrix is the number of linearly independent rows or columns. For square matrices:

    • Full Rank: All rows (or columns) are linearly independent.
    • Full rank implies the matrix is:
      • Invertible (it has an inverse).
      • Non-Singular (its determinant is non-zero).
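
    A small NumPy sketch contrasting a full-rank matrix with a rank-deficient one (both matrices are arbitrary examples):

    import numpy as np

    full_rank = np.array([[2.0, 1.0],
                          [1.0, 3.0]])
    deficient = np.array([[1.0, 2.0],
                          [2.0, 4.0]])  # second row = 2 * first row

    print(np.linalg.matrix_rank(full_rank))  # 2 -> full rank
    print(np.linalg.matrix_rank(deficient))  # 1 -> not full rank

    # Full rank <=> invertible <=> non-zero determinant
    print(np.linalg.det(full_rank))  # 5.0 (non-zero, so invertible)
    print(np.linalg.det(deficient))  # ~0.0 (singular, up to floating point)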

    Identity Matrix

    An identity matrix is a square matrix with 1s on the diagonal and 0s elsewhere:

    $$I_3 = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}$$

    It serves as the multiplicative identity in matrix algebra.
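
    A quick check of the multiplicative-identity property (A is an arbitrary 3x3 matrix):

    import numpy as np

    I = np.eye(3)                                # 3x3 identity matrix
    A = np.arange(9, dtype=float).reshape(3, 3)

    print(np.allclose(A @ I, A))  # True
    print(np.allclose(I @ A, A))  # True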

     

    Determinant

    One can use the determinant to check whether a matrix is invertible: a square matrix is invertible if and only if its determinant is non-zero.
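
    A minimal sketch of this check in NumPy (A is an arbitrary invertible 2x2 matrix):

    import numpy as np

    A = np.array([[4.0, 7.0],
                  [2.0, 6.0]])

    d = np.linalg.det(A)
    print(d)  # 4*6 - 7*2 = 10.0, non-zero -> invertible

    if not np.isclose(d, 0.0):
        A_inv = np.linalg.inv(A)
        print(np.allclose(A @ A_inv, np.eye(2)))  # True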

     

    Eigenvalue decomposition

    • Background: a matrix can be viewed as a linear transformation.
    • More specifically, a linear transformation can be decomposed into a sequence of operations: rotate, scale, and rotate.
    • Eigenvalue Decomposition: for a square matrix A, find non-zero vectors v that satisfy $$Av = \lambda v$$ (where $$\lambda$$ is the eigenvalue).

    • Given a linear transformation A, any point on an eigenvector v stays in the same direction, while its magnitude is multiplied by the corresponding eigenvalue.
    • When the matrix A is viewed as a linear transformation, a non-zero vector whose image under A is a scalar multiple of itself is called an eigenvector, and that scalar is called the eigenvalue.
    • Uses
      • Eigen Decomposition: $$A = P \Lambda P^{-1}$$, where the columns of P are the eigenvectors and $$\Lambda$$ is the diagonal matrix of eigenvalues (see the sketch after this list)
      • PCA: eigendecomposition of the data covariance matrix yields the principal components
      • SVD: $$A = U D V^T$$
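
    A minimal NumPy sketch (using an arbitrary symmetric 2x2 matrix) that verifies both the defining relation Av = λv and the reconstruction A = PΛP^-1:

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])

    # lam holds the eigenvalues; the columns of P are the eigenvectors
    lam, P = np.linalg.eig(A)

    # Each eigenvector is only scaled by A: A v = lambda v
    for i in range(len(lam)):
        v = P[:, i]
        print(np.allclose(A @ v, lam[i] * v))  # True

    # Reconstruction: A = P diag(lambda) P^{-1}
    print(np.allclose(P @ np.diag(lam) @ np.linalg.inv(P), A))  # True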

     
