2024-2025 Graduate Academic Catalog - June

CSC 5601 - Theory of Machine Learning

4 lecture hours 0 lab hours 4 credits
Course Description
This course provides a broad introduction to machine learning. The theory of machine learning concerns the study and construction of algorithms that can learn from and make predictions on data. Such algorithms operate by building a model from example inputs in order to make data-driven predictions or decisions, rather than following strictly static program instructions. Topic categories include decision boundaries, optimization, and both supervised and unsupervised methods. Students will apply the theory by implementing and evaluating machine learning algorithms in hands-on, tutorial-oriented laboratory exercises (a brief illustrative sketch follows the prerequisite notes).
Prereq: ((MTH 2130 and MTH 2340) or MTH 5810) and (CSC 2621 or CSC 5610), or instructor consent
Note: This course is open to qualified undergraduate students.
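
For context, the prediction rule of one of the simplest models covered in this course, k-nearest neighbors, can be sketched in a few lines of NumPy. This is a minimal illustration only; the dataset and the choice of k below are arbitrary assumptions, not course materials.

```python
# Minimal k-nearest-neighbors classifier: the "model" is just the stored
# training data, and prediction is a majority vote among the k closest points.
import numpy as np

def knn_predict(X_train, y_train, X_query, k=3):
    """Predict a class label for each query point by majority vote
    among its k nearest training points (Euclidean distance)."""
    preds = []
    for x in X_query:
        dists = np.linalg.norm(X_train - x, axis=1)   # distance to every training point
        nearest = np.argsort(dists)[:k]               # indices of the k closest points
        votes = y_train[nearest]
        preds.append(np.bincount(votes).argmax())     # majority class among neighbors
    return np.array(preds)

# Tiny illustrative dataset: two clusters labeled 0 and 1.
X_train = np.array([[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [0.9, 1.1]])
y_train = np.array([0, 0, 1, 1])
print(knn_predict(X_train, y_train, np.array([[0.1, 0.0], [1.0, 0.9]])))  # -> [0 1]
```
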
Course Learning Outcomes
Upon successful completion of this course, the student will be able to:
  • Explain and analyze machine learning models and prediction algorithms (e.g., k-nearest neighbors, logistic regression, SVMs, decision trees) in a precise manner  
  • Apply, manipulate, and interpret linear algebra and calculus concepts with confidence in the context of machine learning
  • Implement and validate code for a mathematical model or algorithm given as an equation  
  • Explain the geometric and algebraic interpretations of linear and non-linear decision boundaries and their relationship to error metrics (e.g., accuracy) 
  • Understand the concepts of learning theory, i.e., what is learnable, bias, variance, overfitting, the curse of dimensionality, and splitting data for evaluation of model predictions
  • Estimate and plot decision boundaries described by trained models 
  • Encode non-numerical variables in tabular data as numerical features (one common encoding is sketched after this list)
  • Describe implications of feature representations with respect to linear separability   
  • Apply approaches for engineering and evaluating new features from existing data   
  • Derive, visualize, apply, and interpret loss functions and associated derivatives to assess the quality of model predictions and fit to data  
  • Describe and implement model training in terms of naïve and gradient-driven search problems over parameter space (a minimal sketch follows this list)
  • Compare and contrast technical requirements and computational efficiency of two unconstrained optimization algorithms (e.g., gradient descent, Newton’s method)  
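
To make the loss-function and optimization outcomes concrete, the sketch below writes the average logistic (cross-entropy) loss, its gradient, and a naive fixed-step gradient descent loop in plain NumPy. The toy dataset, learning rate, and iteration count are illustrative assumptions; Newton's method would differ by scaling each step with second-derivative (curvature) information rather than a fixed step size.

```python
# Illustrative sketch: logistic-regression training as a gradient-driven
# search over parameter space, minimizing the average cross-entropy loss.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logistic_loss(w, X, y):
    """Average cross-entropy loss of a linear model with weights w."""
    p = sigmoid(X @ w)
    eps = 1e-12                                  # avoid log(0)
    return -np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))

def gradient(w, X, y):
    """Gradient of the average loss with respect to w."""
    return X.T @ (sigmoid(X @ w) - y) / len(y)

# Toy 1-D data with a bias column prepended; values are arbitrary examples.
X = np.array([[1.0, -2.0], [1.0, -1.0], [1.0, 1.0], [1.0, 2.0]])
y = np.array([0.0, 0.0, 1.0, 1.0])

w = np.zeros(2)
for step in range(500):                          # naive fixed-step gradient descent
    w -= 0.5 * gradient(w, X, y)

print(logistic_loss(w, X, y))                    # loss decreases as w is refined
print(sigmoid(X @ w) > 0.5)                      # decision boundary at p = 0.5
```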

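The feature-encoding outcomes can be illustrated in the same spirit. The sketch below one-hot encodes a hypothetical categorical column with NumPy, turning a non-numerical variable into 0/1 indicator features that a linear model can consume; the column values are made-up examples. Whether the resulting feature space is linearly separable depends on the data, which is the kind of implication the outcomes above ask students to describe.

```python
# Illustrative sketch: encoding a non-numerical (categorical) column of a
# tabular dataset as numerical one-hot features.
import numpy as np

def one_hot(values):
    """Map each distinct category to its own 0/1 indicator column."""
    categories = sorted(set(values))                       # stable column order
    index = {c: i for i, c in enumerate(categories)}
    encoded = np.zeros((len(values), len(categories)))
    for row, v in enumerate(values):
        encoded[row, index[v]] = 1.0
    return categories, encoded

colors = ["red", "green", "red", "blue"]
names, features = one_hot(colors)
print(names)      # ['blue', 'green', 'red']
print(features)   # each row has a single 1 in the column for its category
```
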
Prerequisites by Topic
  • Linear algebra
  • Python
  • NumPy
  • Functional programming

Coordinator
Dr. John Bukowy


