Course Code | Course Title | Weekly Hours* (T / P) | ECTS | Weekly Class Schedule
AID302 | Optimization for Data Science | 3 / 2 | 6 |
Prerequisite | None
It is a prerequisite to |
Lecturer |
Office Hours / Room / Phone |
E-mail |
Assistant |
Assistant E-mail |
Course Objectives |
This course provides a comprehensive exploration of mathematical optimization and its applications in data science. It introduces the fundamental principles of mathematical optimization, including line search methods, gradient-based methods, Newton's method and other Hessian-based methods, and derivative-free optimization. The course also covers techniques for solving linear and nonlinear optimization problems, with a particular focus on least squares. Throughout the course, students gain hands-on experience applying optimization algorithms to real-world data science problems. By the end of the course, students will have a solid understanding of mathematical optimization and its practical relevance to data-driven decision-making. |
Textbook | TBA
Additional Literature |
Learning Outcomes |
After successful completion of the course, the student will be able to: |
- transform real-world problems into optimization problems.
- identify and construct convex, nonconvex, and structured optimization problems.
- adapt optimization algorithms to address data science-related problems.
- employ optimization algorithms in the implementation of data science projects.
|
Teaching Methods |
Teaching Method Delivery |
Teaching Method Delivery Notes |
WEEK | TOPIC | REFERENCE
Week 1 | Introduction: mathematical optimization; representative models and applications in machine learning, data science, and operations research |
Week 2 | Mathematical preliminaries: global and local optimizers, convexity, gradients and subgradients, optimality conditions, convergence rates |
Week 3 | Steepest descent method and its convergence analysis in the general, convex, and strongly convex cases (a code sketch follows this schedule) |
Week 4 | Modelling: least squares, matrix completion (a code sketch follows this schedule) |
Week 5 | Modelling: sparse inverse covariance estimation, sparse principal components, sparse plus low-rank matrix decomposition |
Week 6 | Modelling: support vector machines, logistic regression, deep learning |
Week 7 | Exam |
Week 8 | First-order methods: gradient and coordinate descent, Frank-Wolfe |
Week 9 | First-order methods: subgradient and mirror descent, stochastic and incremental gradient methods |
Week 10 | Second-order methods: Newton and quasi-Newton methods |
Week 11 | Non-convexity: local convergence, provable global convergence, cone programming, convex relaxations |
Week 12 | Min-max optimization: extragradient methods |
Week 13 | Data science applications |
Week 14 | Project presentations |
Week 15 | |
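As a companion to the Week 3 topic, the following is a minimal sketch of steepest descent on a strongly convex quadratic with a fixed 1/L step size; the matrix, vector, and iteration count are illustrative assumptions, not course materials.

```python
import numpy as np

# Steepest descent on the strongly convex quadratic f(x) = 0.5*x^T A x - b^T x;
# the unique minimizer solves A x = b. A fixed step size 1/L, with L the
# largest eigenvalue of A, guarantees convergence for this problem class.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])          # symmetric positive definite (assumed example)
b = np.array([1.0, -1.0])

L = np.linalg.eigvalsh(A).max()     # Lipschitz constant of the gradient
x = np.zeros(2)
for _ in range(200):
    grad = A @ x - b                # gradient of f at the current iterate
    x = x - grad / L                # steepest descent step

print("iterate:", x)
print("exact minimizer:", np.linalg.solve(A, b))
```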
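For the Week 4 least squares topic, a minimal sketch that fits a linear model to synthetic data with NumPy's lstsq; the dimensions, noise level, and coefficients are illustrative assumptions.

```python
import numpy as np

# Linear least squares: minimize ||X w - y||^2 over w. The minimizer satisfies
# the normal equations X^T X w = X^T y; np.linalg.lstsq solves them stably via SVD.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))            # synthetic design matrix (assumed data)
w_true = np.array([2.0, -1.0, 0.5])
y = X @ w_true + 0.01 * rng.normal(size=50)

w_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print("recovered:", w_hat)              # close to w_true up to the noise level
```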