Apr 27, 2024  
2023-2024 Undergraduate Academic Catalog-June Update [ARCHIVED CATALOG]

CSC 4601 - Theory of Machine Learning

2 lecture hours, 2 lab hours, 3 credits
Course Description
This course provides a broad introduction to machine learning: the study and construction of algorithms that can learn from and make predictions on data. Such algorithms build a model from example inputs in order to make data-driven predictions or decisions, rather than following strictly static program instructions. Topic categories include decision boundaries, optimization, and both supervised and unsupervised methods. Students will apply the theory by implementing and evaluating machine learning algorithms in hands-on, tutorial-oriented laboratory exercises. (prereq: MTH 2130, MTH 2340, CSC 2621) (quarter system prereq: CS 2300, MA 383, MA 2323)
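As an illustrative sketch only (not course-provided code), the idea of "building a model from example inputs rather than following static program instructions" can be shown with a minimal 1-nearest-neighbor classifier, one of the prediction algorithms named in the learning outcomes. The data and function names here are invented for illustration.

```python
# Illustrative sketch: a 1-nearest-neighbor classifier that "learns"
# from labeled example inputs rather than hand-coded rules.
import math

def nearest_neighbor_predict(train, x):
    """Predict the label of x as the label of the closest training point."""
    closest = min(train, key=lambda p: math.dist(p[0], x))
    return closest[1]

# Toy labeled data: (features, label)
train = [((0.0, 0.0), "A"), ((0.1, 0.2), "A"),
         ((1.0, 1.0), "B"), ((0.9, 1.1), "B")]

print(nearest_neighbor_predict(train, (0.2, 0.1)))  # -> A
print(nearest_neighbor_predict(train, (0.8, 0.9)))  # -> B
```

Changing the training examples changes the predictions, with no change to the program logic; that data-driven behavior is what distinguishes a learned model from static instructions.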
Course Learning Outcomes
Upon successful completion of this course, the student will be able to:
  • Explain and analyze machine learning models and prediction algorithms (e.g., k-nearest neighbors, logistic regression, SVMs, decision trees)
  • Apply, manipulate, and interpret linear algebra and calculus concepts within the context of machine learning
  • Implement and validate code for a mathematical model or algorithm given as an equation
  • Explain the geometric and algebraic interpretations of linear and non-linear decision boundaries and their relationship to error metrics (e.g., accuracy)
  • Explain core concepts of learning theory, e.g., learnability, bias, variance, overfitting, the curse of dimensionality, and splitting data for evaluating model predictions
  • Estimate and plot decision boundaries described by trained models
  • Derive, visualize, apply, and interpret loss functions and associated derivatives to assess the quality of model predictions and fit to data
  • Describe and implement model training in terms of naïve and gradient-driven search problems over parameter space
  • Compare and contrast technical requirements and computational efficiency of two unconstrained optimization algorithms (e.g., gradient descent, Newton’s method)

Prerequisites by Topic
  • Linear algebra
  • Derivative calculus
  • Software development in Python

Coordinator
Dr. John Bukowy
