Course Information
As of 2024/04/19

General Information
Course title (Japanese): データ解析最適化論
Course title (English): Advanced Topics in Data Analysis Optimization
Code:
Academic year: 2017
Year offered: All years
Semester(s) offered: Second semester
Faculty offering the course: Master's and doctoral programs
Teaching method: Lecture
Credits: 2
Category: Graduate specialized courses - Specialized Subjects II
Cluster/Department: Department of Computer and Network Engineering (情報・ネットワーク工学専攻)
Lecturer(s): 笠井 裕之 (Hiroyuki Kasai)
Office: East Bldg. 2, Room 611 (東2-611)
E-mail: Assoc. Prof. Kasai <kasai@is.uec.ac.jp>
Course website: http://www.kasailab.com/lecture/data_optimization
Last update: 2017/03/14 08:50:57
Update status: Open to the public
Course Description
Themes and goals (up to 2,000 characters)
This lecture addresses the fundamentals and algorithms of optimization theory, one of the core technologies of machine learning. The focus is on nonlinear programming. Note that topics related to combinatorial optimization, non-differentiable optimization, discrete optimization, and linear programming are not covered.

Students learn the techniques and theory of machine learning (data optimization methods) for data analysis.
Prerequisites (up to 1,000 characters)
Basic linear algebra and calculus.
Recommended prerequisites and preparation (up to 1,000 characters)
None in particular.
Course textbooks and materials (up to 1,000 characters)
None in particular.
Course outline and weekly schedule (up to 2,000 characters)
The lectures follow the outline below; the content will be adjusted as appropriate according to students' level of understanding.

1: Introduction

- Big and high-dimensional data analysis and its issues
- Linear and logistic regression
- Basics of optimization techniques

2: Mathematical preliminaries

- The spaces R^n and R^{m×n}
- Inner product and norms
- Eigenvalues and eigenvectors
- Basic topological concepts


3: Optimality conditions for unconstrained optimization

- Global and local optima
- Classification of matrices
- First/second order optimality conditions
- Quadratic functions

4: Least squares

- Overdetermined systems
- Data fitting
- Regularized least squares
- De-noising

5: Gradient method 1

- Descent direction methods
- Gradient method
- Condition number
- Diagonal scaling

6: Gradient method 2

- Line search (exact, backtracking, Wolfe conditions, etc.)
- Convergence analysis

7: Newton's method

- Standard Newton's method
- Damped Newton's method
- Cholesky-factorization-based Newton's method

8: Convex sets and functions

- Definition and examples of convex sets
- First/second-order characterizations of convex functions

9: Convex optimization 1

- Stationarity
- Orthogonal projection
- Gradient projection method

10: Convex optimization 2

- Convergence analysis

11: Optimality conditions for linearly constrained problems

- Separation and alternative theorems (Farkas' lemma)
- KKT conditions
- Orthogonal regression

12: KKT conditions

- Fritz John theorem
- KKT conditions for inequality/equality constrained problems
- KKT conditions for convex optimization problems

13: Duality

- Motivations
- Definition
- Weak/strong duality in convex case
- Examples (LP, QP, orthogonal projection, Chebyshev center, sum of norms, denoising, etc.)

14: Advanced topics 1

- Stochastic optimization (SGD, SAG, SVRG, etc.)

15: Advanced topics 2

- Proximal (Stochastic) optimization methods
- ADMM
- Optimization on Riemannian manifolds
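
To give a concrete flavor of the algorithms listed above, the following is a minimal Python sketch (an illustration only, not course material; the course's own simulation tasks are in MATLAB) of the gradient method with a backtracking (Armijo) line search from lectures 5-6, applied to the least-squares problem of lecture 4. All function names and parameter values here are assumptions made for the example.

    # Minimal sketch: gradient method with backtracking (Armijo) line search
    # for minimize f(x) = 0.5 * ||A x - b||^2 (an overdetermined least-squares problem).
    import numpy as np

    def backtracking(f, g, x, d, alpha=1.0, beta=0.5, sigma=1e-4):
        """Shrink alpha until the Armijo sufficient-decrease condition holds."""
        while f(x + alpha * d) > f(x) + sigma * alpha * g.dot(d):
            alpha *= beta
        return alpha

    def gradient_method(f, grad_f, x0, tol=1e-6, max_iter=1000):
        """Steepest descent with d = -grad f(x); stop when the gradient is small."""
        x = x0
        for _ in range(max_iter):
            g = grad_f(x)
            if np.linalg.norm(g) < tol:
                break
            d = -g                                   # descent direction
            x = x + backtracking(f, g, x, d) * d     # Armijo step size
        return x

    rng = np.random.default_rng(0)
    A, b = rng.standard_normal((50, 5)), rng.standard_normal(50)
    f = lambda x: 0.5 * np.linalg.norm(A @ x - b) ** 2
    grad_f = lambda x: A.T @ (A @ x - b)
    x_hat = gradient_method(f, grad_f, np.zeros(5))
    print(np.allclose(x_hat, np.linalg.lstsq(A, b, rcond=None)[0], atol=1e-4))

With a line search of this kind, the speed of the method is governed by the condition number discussed in lecture 5, which motivates the scaled and preconditioned variants listed in the keywords.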
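
In the same spirit, here is a rough sketch of stochastic gradient descent (SGD), one of the advanced topics of lecture 14, on the same least-squares objective written as a finite sum over the rows of A. The diminishing step-size schedule and data sizes are arbitrary illustrative choices, not the lecture's prescription.

    # Minimal sketch: SGD with one randomly chosen sample per update and a
    # diminishing step size, for f(x) = (1/m) * sum_i 0.5 * (a_i^T x - b_i)^2.
    import numpy as np

    def sgd(A, b, x0, n_epochs=50, step0=0.1):
        m = A.shape[0]
        x = x0.copy()
        rng = np.random.default_rng(1)
        t = 0
        for _ in range(n_epochs):
            for i in rng.permutation(m):             # shuffle samples each epoch
                t += 1
                g_i = (A[i] @ x - b[i]) * A[i]       # gradient of the i-th term
                x -= step0 / (1.0 + 0.01 * t) * g_i  # diminishing step size
        return x

    rng = np.random.default_rng(0)
    A = rng.standard_normal((200, 5))
    x_true = rng.standard_normal(5)
    b = A @ x_true + 0.01 * rng.standard_normal(200)  # noisy linear measurements
    print(np.linalg.norm(sgd(A, b, np.zeros(5)) - x_true))  # small estimation error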
Course content utilizing practical experience (including details of the practical experience)
Preparation and review outside class (up to 1,000 characters)
Students are strongly encouraged to study related materials, such as books and papers, as appropriate.
Evaluation and grading (including minimum achievement criteria) (up to 1,000 characters)
- Evaluation method
Midterm report: 50%
Final report: 50%

- Evaluation criteria
How deeply students understand the fundamentals and algorithms of optimization theory.
Office hours: course consultation (up to 1,000 characters)
Contact the lecturer by e-mail to make an appointment.
Message for students (up to 1,000 characters)
- Lectures can be given in English if at least one non-Japanese student attends.
- Lectures are given on the blackboard.
- MATLAB simulation tasks are provided to deepen students' understanding.
Others
Students who are interested in machine learning, pattern recognition, and big data analysis are welcome.
Keywords
Optimization problem, nonlinear programming, gradient, Hessian, convex set/function, optimality conditions, iterative gradient descent fundamentals, line search methods (backtracking, Armijo condition, Wolfe condition), steepest descent, Newton's method, quasi-Newton methods, conjugate gradient, scaled/preconditioned descent methods, stochastic gradient descent