
Kevin Li

  • CS 189
  • Discussion

    Wednesdays 3-4pm in Etcheverry 3113

    • Email: kevintli@
    • Anonymous feedback form
    • Welcome survey - please fill this out if you plan on attending my discussions and/or if you'd like to receive occasional recap emails!
  • Section Materials

  • Discussion 12: Principal Components Analysis (PCA)

    Rayleigh quotients and their connection to the spectral norm and related optimization problems. Derivations of PCA from three perspectives: Gaussian MLE, maximizing variance, and minimizing projection error. The relationship between the SVD and PCA.


    > Slides

    > Written work
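
    If you want to poke at this numerically, here's a minimal numpy sketch (mine, not from the posted materials; the toy data is random) of PCA via the SVD, with a sanity check that the squared singular values match the covariance eigenvalues:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 5))   # toy data: 100 samples, 5 features
    Xc = X - X.mean(axis=0)         # center each feature

    # The right singular vectors of the centered data matrix are the
    # eigenvectors of the sample covariance (1/n) Xc^T Xc.
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:2]             # top-2 principal directions
    scores = Xc @ components.T      # coordinates of each point in that basis

    # Sanity check: singular values squared over n = covariance eigenvalues.
    eigvals = np.linalg.eigvalsh(Xc.T @ Xc / len(Xc))[::-1]
    assert np.allclose(s**2 / len(Xc), eigvals)
    ```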

  • Discussion 11: Neural Networks

    Neural network basics: feature/representation learning, universal function approximation, motivations for backprop, and how to derive gradients for functions involving matrices and batch dimensions.


    > Slides

    > Written work
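
    To make the "gradients with batch dimensions" part concrete, here's a small illustrative example (mine, not from the worksheet): backprop by hand through one linear layer plus a ReLU, verified against central differences.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(8, 3))   # batch of 8 inputs with 3 features
    W = rng.normal(size=(3, 2))   # one linear layer's weights
    Y = rng.normal(size=(8, 2))   # targets

    def loss(W):
        return np.mean((np.maximum(X @ W, 0.0) - Y) ** 2)

    # Backprop by hand: chain rule through the mean, the ReLU, and the matmul.
    H = np.maximum(X @ W, 0.0)
    dH = 2.0 * (H - Y) / H.size   # d(mean squared error) / dH
    dZ = dH * (X @ W > 0)         # ReLU passes gradient only where it was active
    dW = X.T @ dZ                 # matmul rule; this also sums over the batch

    # Numerical check with central differences.
    eps, num = 1e-6, np.zeros_like(W)
    for i in range(W.shape[0]):
        for j in range(W.shape[1]):
            E = np.zeros_like(W)
            E[i, j] = eps
            num[i, j] = (loss(W + E) - loss(W - E)) / (2 * eps)
    assert np.allclose(dW, num, atol=1e-7)
    ```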

  • Discussion 10: Kernel Methods

    Kernel methods and their two main motivations: enabling efficient computation with high-dimensional features, and allowing custom notions of similarity between data points. Conditions for a kernel function to be valid.


    > Slides

    > Written work
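
    A tiny illustration of both points (my own toy example): the quadratic kernel (1 + x·z)² computes an inner product of explicit degree-2 features without ever forming them, and since any Gram matrix of real features is PSD, the validity condition is easy to check numerically.

    ```python
    import numpy as np

    def phi(x):
        # Explicit degree-2 feature map whose inner product is (1 + x @ z) ** 2.
        x1, x2 = x
        return np.array([1.0,
                         np.sqrt(2) * x1, np.sqrt(2) * x2,
                         x1 ** 2, x2 ** 2,
                         np.sqrt(2) * x1 * x2])

    rng = np.random.default_rng(0)
    x, z = rng.normal(size=2), rng.normal(size=2)

    # Kernel trick: evaluate in 2-D what phi computes in 6-D.
    assert np.isclose((1 + x @ z) ** 2, phi(x) @ phi(z))

    # Validity: a Mercer kernel's Gram matrix must be PSD on any data set.
    Xs = rng.normal(size=(20, 2))
    K = (1 + Xs @ Xs.T) ** 2
    assert np.linalg.eigvalsh(K).min() > -1e-9
    ```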

  • Discussion 9: Decision Trees & Random Forests

    Decision tree foundations: entropy, information gain, and strictly concave cost functions. Motivation behind random forests.


    > Slides

    > Written work
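
    For reference, entropy and information gain in a few lines of numpy (illustrative code, not the worksheet's; the labels and split are made up):

    ```python
    import numpy as np

    def entropy(y):
        # Shannon entropy of a label array, in bits.
        _, counts = np.unique(y, return_counts=True)
        p = counts / counts.sum()
        return -(p * np.log2(p)).sum()

    def information_gain(y, mask):
        # Entropy reduction from splitting labels y by a boolean mask.
        n = len(y)
        h_children = (mask.sum() / n) * entropy(y[mask]) \
                   + ((~mask).sum() / n) * entropy(y[~mask])
        return entropy(y) - h_children

    y = np.array([0, 0, 0, 1, 1, 1, 1, 1])
    mask = np.array([True, True, True, True,
                     False, False, False, False])
    # ~0.549 bits: entropy drops from 0.954 to a weighted 0.406.
    print(information_gain(y, mask))
    ```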

  • Discussion 7: Midterm Review

    Miscellaneous practice problems covering logistic regression; squared vs. logistic vs. hinge losses; LDA/QDA; and gradient descent and convexity.


    > Written work
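
    One way to internalize the loss comparison is to evaluate all three as functions of the margin m = y·f(x). A quick sketch (mine; it assumes labels in {-1, +1}, so squared loss (y - f)² becomes (1 - m)²):

    ```python
    import numpy as np

    # Squared, logistic, and hinge loss as functions of the margin m = y * f(x).
    squared  = lambda m: (1 - m) ** 2
    logistic = lambda m: np.log(1 + np.exp(-m))
    hinge    = lambda m: np.maximum(0.0, 1 - m)

    for m in [-2.0, 0.0, 1.0, 3.0]:
        print(f"m = {m:+.1f}:  squared = {squared(m):6.3f}  "
              f"logistic = {logistic(m):5.3f}  hinge = {hinge(m):5.3f}")
    # Squared loss keeps growing for m > 1 (it penalizes being "too correct");
    # hinge is exactly zero past the margin; logistic never quite reaches zero.
    ```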

  • Discussion 6: Least Squares & Least Norm

    Least-squares linear regression and the motivation for the minimum-norm solution when infinitely many solutions exist. The SVD, the Moore-Penrose pseudoinverse, and its application to the min-norm least-squares problem.


    > Slides

    > Written work
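
    Here's a compact numerical companion (my own toy system, not from the materials): the min-norm solution of a wide system built from the SVD, checked against np.linalg.pinv, plus a demonstration that adding a null-space direction only increases the norm.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.normal(size=(3, 5))   # wide: Ax = b has infinitely many solutions
    b = rng.normal(size=3)

    # Min-norm solution from the full SVD: invert only the nonzero singular values.
    U, s, Vt = np.linalg.svd(A)                # U: 3x3, s: (3,), Vt: 5x5
    x_min = Vt[:3].T @ ((U.T @ b) / s)

    assert np.allclose(A @ x_min, b)                  # it solves the system
    assert np.allclose(x_min, np.linalg.pinv(A) @ b)  # matches the pseudoinverse

    # Adding any null-space direction still solves Ax = b but has larger norm.
    x_other = x_min + Vt[4]                    # Vt[3:] spans A's null space
    assert np.allclose(A @ x_other, b)
    assert np.linalg.norm(x_other) > np.linalg.norm(x_min)
    ```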

  • Discussion 5: Anisotropic Gaussians, Transformations, Quadratic Forms

    Overview of anisotropic Gaussians, including properties of the covariance matrix and the elliptical isocontours of the quadratic form. Change of basis as a way to understand various data transformations (sphering, whitening, etc.).


    > Slides

    > Written work
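
    A small sketch (mine; the covariance below is made up) of whitening via the eigendecomposition of the sample covariance, sometimes called ZCA whitening:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    Sigma = np.array([[3.0, 1.2],
                      [1.2, 1.0]])             # an anisotropic covariance
    L = np.linalg.cholesky(Sigma)
    X = rng.normal(size=(10000, 2)) @ L.T      # X ~ N(0, Sigma), one row per sample

    # Whitening: with Sigma_hat = V diag(lam) V^T, the change of basis
    # W = V diag(lam^-1/2) V^T sends the sample covariance to the identity.
    lam, V = np.linalg.eigh(np.cov(X.T))
    W = V @ np.diag(lam ** -0.5) @ V.T
    Z = X @ W.T

    print(np.cov(Z.T))   # close to the 2x2 identity
    ```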

  • Discussion 4: Generative Models, GDA, MLE

    Review of Bayes Decision Theory and MLE, and their applications to generative modeling. Gaussian Discriminant Analysis (QDA/LDA) as a special case of generative models.


    > Slides

    > Written work
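
    To see the pipeline end to end, here's an illustrative toy version (mine; the data, priors, and test point are made up): fit each class by Gaussian MLE, then classify with unnormalized log-posteriors, as in QDA.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Two Gaussian classes with equal priors.
    X0 = rng.normal(loc=[-2.0, 0.0], size=(200, 2))
    X1 = rng.normal(loc=[+2.0, 0.0], size=(200, 2))

    def mle_fit(X):
        mu = X.mean(axis=0)
        Sigma = (X - mu).T @ (X - mu) / len(X)   # MLE divides by n, not n - 1
        return mu, Sigma

    def score(x, mu, Sigma, prior):
        # log p(x | class) + log prior, dropping constants shared across classes.
        diff = x - mu
        return (-0.5 * np.log(np.linalg.det(Sigma))
                - 0.5 * diff @ np.linalg.solve(Sigma, diff)
                + np.log(prior))

    params = [mle_fit(X0), mle_fit(X1)]
    x_new = np.array([1.5, 0.3])
    print("predicted class:",
          int(np.argmax([score(x_new, mu, S, 0.5) for mu, S in params])))
    ```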


  • Discussion 3: Soft-Margin SVMs, Decision Theory

    Soft-margin SVMs, hinge loss, and interesting variants of SVMs for outlier detection. Deriving posterior class probabilities using Bayes' Rule.


    > Slides

    > Written work
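
    As a minimal sketch (mine; I omit the bias term since the toy data is symmetric about the origin), the soft-margin SVM can be written as regularized hinge loss and trained by subgradient descent:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Toy 2-D data: two Gaussian blobs with labels in {-1, +1}.
    X = np.vstack([rng.normal(loc=[-1, -1], size=(50, 2)),
                   rng.normal(loc=[+1, +1], size=(50, 2))])
    y = np.array([-1] * 50 + [+1] * 50)

    # min_w  0.5 * ||w||^2 + C * sum_i max(0, 1 - y_i * (w @ x_i))
    w, C, lr = np.zeros(2), 1.0, 1e-3
    for _ in range(2000):
        margins = y * (X @ w)
        active = margins < 1                       # margin violators
        # Subgradient: w from the regularizer, -y_i x_i for each violator.
        grad = w - C * (y[active][:, None] * X[active]).sum(axis=0)
        w -= lr * grad

    print("w =", w, "| training accuracy:", np.mean(np.sign(X @ w) == y))
    ```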


  • Discussion 2: Math Prep

    Review of math concepts that are useful in machine learning: linear algebra, probability, and vector calculus (especially taking derivatives of matrix/vector functions).


    > Slides

    > Written work
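
    A habit worth building for the vector-calculus part: check matrix derivatives numerically. For example (my own snippet), the gradient of f(x) = xᵀAx is (A + Aᵀ)x:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.normal(size=(4, 4))   # deliberately not symmetric
    x = rng.normal(size=4)

    analytic = (A + A.T) @ x      # claimed gradient of f(x) = x^T A x

    # Central-difference check, one coordinate direction at a time.
    eps = 1e-6
    numeric = np.array([
        ((x + eps * e) @ A @ (x + eps * e) - (x - eps * e) @ A @ (x - eps * e))
        / (2 * eps)
        for e in np.eye(4)
    ])
    assert np.allclose(analytic, numeric, atol=1e-6)
    ```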

  • Discussion 1: Intro & SVMs [recording]

    Review of vectors, projection, hyperplanes, and the distance formula. Intro to hard-margin SVMs, including motivation and formulation of the optimization problem.


    > Slides + written work


    Additional resources

    Understanding the SVM formulation
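
    The distance formula from this discussion in code form (toy numbers mine): the distance from x₀ to the hyperplane {x : wᵀx + b = 0} is |wᵀx₀ + b| / ‖w‖.

    ```python
    import numpy as np

    # Distance from x0 to the hyperplane {x : w @ x + b = 0}.
    w = np.array([3.0, 4.0])
    b = -5.0
    x0 = np.array([2.0, 1.0])

    dist = abs(w @ x0 + b) / np.linalg.norm(w)
    print(dist)   # |3*2 + 4*1 - 5| / ||(3, 4)|| = 5 / 5 = 1.0
    ```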
