AID302 Optimization for Data Science


Course Code: AID302
Course Title: Optimization for Data Science
Weekly Hours* (T / P): 3 / 2
ECTS: 6
Weekly Class Schedule:
Prerequisite: None
It is a prerequisite to: None
Lecturer: Currently not available
Office Hours / Room / Phone:
E-mail:
Assistant:
Assistant E-mail:
Course Objectives This course provides a comprehensive exploration of mathematical optimization and its applications in data science. Students will receive an introduction to the fundamental principles of mathematical optimization, including line search methods, gradient-based methods, Newton's method, Hessian-based methods, and derivative-free optimization. The course also covers techniques for solving linear and nonlinear optimization problems, with a specific focus on least squares techniques. Throughout the course, students will gain hands-on experience in applying optimization algorithms to real-world data science applications. By the end of the course, students will have a solid understanding of mathematical optimization and its practical relevance in data-driven decision-making.
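For illustration, a minimal Python sketch of one such algorithm: gradient descent applied to a least squares problem min_x ||Ax - b||^2. The data and names here are illustrative only and not part of the course material; NumPy is assumed to be available.

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 3))        # synthetic data matrix
    b = A @ np.array([1.0, -2.0, 0.5])      # targets generated from a known model
    x = np.zeros(3)                         # starting point
    L = 2 * np.linalg.norm(A, 2) ** 2       # Lipschitz constant of the gradient
    step = 1.0 / L                          # fixed step size 1/L

    for _ in range(500):
        grad = 2 * A.T @ (A @ x - b)        # gradient of ||Ax - b||^2
        x = x - step * grad                 # steepest descent update

    print(x)                                # approaches the true coefficients [1, -2, 0.5]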
Textbook TBA
Additional Literature
Learning Outcomes After successful completion of the course, the student will be able to:
  1. transform real-world problems into optimization problems.
  2. identify and construct convex, nonconvex, and structured optimization problems.
  3. adapt optimization algorithms to address data science-related problems.
  4. employ optimization algorithms in the implementation of data science projects.
Teaching Methods
Teaching Method Delivery Notes
WEEK TOPIC REFERENCE
Week 1 Introduction: mathematical optimization; representative models and applications in machine learning, data science, and operations research
Week 2 Mathematical preliminaries: global and local optimizers, convexity, gradients and subgradients, optimality conditions, convergence rates
Week 3 Steepest descent method and its convergence analysis in the general case, the convex case and the strongly convex case.
Week 4 Modelling: least squares, matrix completion
Week 5 Modelling: sparse inverse covariance estimation, sparse principal components, sparse plus low rank matrix decomposition
Week 6 Modelling: support vector machines, logistic regression, deep learning
Week 7 Exam
Week 8 First-order methods: gradient and coordinate descent, Frank-Wolfe
Week 9 First-order methods: subgradient and mirror descent, stochastic and incremental gradient methods
Week 10 Second-order methods (Newton and quasi-Newton methods)
Week 11 Non-convexity (local convergence, provable global convergence, cone programming, convex relaxations)
Week 12 Min-max optimization (extragradient methods)
Week 13 Data Science applications
Week 14 Project Presentations
Week 15
Assessment Methods and Criteria
Evaluation Tool Quantity Weight (%) Alignment with LOs
Final Exam 1 30
Semester Evaluation Components
Midterm 1 25
Quizzes 3 15
Term project and presentation 1 15
Lab assignments 7 15
***     ECTS Credit Calculation     ***
Activity Hours Weeks Student Workload Hours
Lecture hours 3 14 42
Active labs 2 14 28
In-term exam study 10 1 10
Term project/presentation 2 12 24
Assignments 3 7 21
Home study 1 14 14
Final exam study 11 1 11
Total Workload Hours = 150
*T = Teaching, P = Practice
ECTS Credit = 6
Course Academic Quality Assurance: Semester Student Survey
Last Update Date: 01/09/2023
