Syllabus Reference

Course Overview
As of 2017/11/25

Basic Course Information
Course Title: データ解析最適化論
Course Title (English): Advanced Topics in Data Analysis Optimization
Academic Year: 2017    Eligible Years: All years
Semester: Second semester    Programs: Master's and doctoral programs
Course Format: Lecture    Credits: 2
Course Category: Graduate specialized courses - Specialized Subjects II
Department/Program: Department of Computer and Network Engineering
Instructor: 笠井 裕之 (Hiroyuki Kasai)
Office: East Bldg. 2, Room 611
Public E-mail: Assoc. Prof. Kasai <kasai@is.uec.ac.jp>
Course Web Page: http://www.kasailab.com/lecture/data_optimization
Last Updated: 2017/03/14 08:50:57    Status: Published
Lecture Information
Subject and Learning Objectives
This course addresses the fundamentals and algorithms of optimization theory, one of the core technologies of machine learning, with a focus on nonlinear programming. Students learn the techniques and theory of optimization methods for machine-learning-based data analysis. Note that combinatorial optimization, non-differentiable optimization, discrete optimization, and linear programming are not covered.
Prerequisite Courses
Basic linear algebra and calculus
Recommended Prior Courses
None in particular
Textbooks: None in particular
Course Content and Plan
Lectures follow the outline below, with the content adjusted as needed to match students' level of understanding.

1:Introduction

- Big and high-dimensional data analysis and its issues
- Linear and logistic regression
- Basics of optimization techniques
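
To make the regression topics above concrete, here is a minimal sketch of logistic regression trained with plain gradient descent. This is illustrative only and not part of the official course materials (the course's own simulation tasks use Matlab); the data is synthetic and the step size is an arbitrary choice.

```python
import numpy as np

def logistic_grad(w, X, y):
    """Gradient of the averaged logistic loss
    f(w) = (1/n) * sum_i log(1 + exp(-y_i * x_i^T w)),  y_i in {-1, +1}."""
    z = y * (X @ w)
    return -(X.T @ (y / (1.0 + np.exp(z)))) / len(y)

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 2))                          # synthetic features
w_true = np.array([2.0, -1.0])
y = np.sign(X @ w_true + 0.1 * rng.standard_normal(200))   # noisy labels

w = np.zeros(2)
for _ in range(500):                        # plain gradient descent
    w -= 0.5 * logistic_grad(w, X, y)
print(w / np.linalg.norm(w))                # direction close to w_true's
```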

2: Mathematical preliminaries

- The spaces R^n and R^{m×n}
- Inner product and norms
- Eigenvalues and eigenvectors
- Basic topological concepts
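
These preliminaries map directly onto standard numerical routines; a small NumPy sketch with made-up vectors and matrices, for illustration only:

```python
import numpy as np

x = np.array([1.0, 2.0, 2.0])
y = np.array([3.0, 0.0, 4.0])

print(np.dot(x, y))                 # inner product <x, y> = 11
print(np.linalg.norm(x))            # Euclidean norm ||x||_2 = 3
print(np.linalg.norm(x, 1))         # l1 norm = 5
print(np.linalg.norm(x, np.inf))    # l-infinity norm = 2

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])          # symmetric matrix in R^{2x2}
w, V = np.linalg.eigh(A)            # eigenvalues and eigenvectors
print(w)                            # [1. 3.]
print(np.linalg.norm(A, 'fro'))     # Frobenius norm on R^{m x n}
```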


3: Optimality conditions for unconstrained optimization

- Global and local optima
- Classification of matrices
- First/second order optimality conditions
- Quadratic functions
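
For a quadratic f(x) = ½ xᵀAx + bᵀx with symmetric A, the optimality conditions can be checked directly: the gradient Ax + b vanishes at the stationary point, and positive eigenvalues of the Hessian A certify a strict minimum. A minimal sketch with example data:

```python
import numpy as np

# Quadratic f(x) = 0.5 * x^T A x + b^T x with symmetric A (example data).
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([-1.0, 1.0])

# First-order condition: grad f(x*) = A x* + b = 0.
x_star = np.linalg.solve(A, -b)

# Second-order condition: Hessian A positive definite => strict minimum.
eigvals = np.linalg.eigvalsh(A)
print("stationary point:", x_star)
print("Hessian eigenvalues:", eigvals)
print("strict minimum" if np.all(eigvals > 0) else "not a strict minimum")
```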

4: Least squares

- Overdetermined systems
- Data fitting
- Regularized least squares
- De-noising
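
The least-squares topics above reduce to a few lines of linear algebra. A minimal sketch of ordinary and regularized least squares on a synthetic overdetermined system (illustrative only; the regularization weight lam is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 3))    # 50 equations, 3 unknowns (overdetermined)
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true + 0.1 * rng.standard_normal(50)    # noisy right-hand side

# Ordinary least squares: min ||Ax - b||^2.
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)

# Regularized (ridge) least squares: min ||Ax - b||^2 + lam * ||x||^2,
# closed form via the normal equations (A^T A + lam * I) x = A^T b.
lam = 0.1
x_reg = np.linalg.solve(A.T @ A + lam * np.eye(3), A.T @ b)

print("least squares :", x_ls)
print("regularized LS:", x_reg)
```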

5: Gradient method 1

- Descent direction methods
- Gradient method
- Condition number
- Diagonal scaling

6: Gradient method 2

- Line search (exact, backtracking, Wolfe conditions, etc.)
- Convergence analysis
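
Units 5 and 6 combine into one compact algorithm: a step along the negative gradient, with the step size chosen by backtracking until a sufficient-decrease (Armijo) condition holds. A minimal sketch with illustrative parameter values; the ill-conditioned quadratic makes the role of the condition number visible:

```python
import numpy as np

def gradient_descent(f, grad, x0, alpha0=1.0, beta=0.5, sigma=1e-4,
                     tol=1e-6, max_iter=1000):
    """Gradient descent with backtracking line search (Armijo condition)."""
    x = x0.astype(float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:          # stop when the gradient is small
            break
        t = alpha0
        # Backtrack: shrink t until f decreases by at least sigma*t*||g||^2.
        while f(x - t * g) > f(x) - sigma * t * (g @ g):
            t *= beta
        x = x - t * g
    return x

# Example: ill-conditioned quadratic f(x) = 0.5 * x^T diag(1, 10) x;
# the condition number of 10 slows plain gradient descent.
D = np.array([1.0, 10.0])
f = lambda x: 0.5 * x @ (D * x)
grad = lambda x: D * x
print(gradient_descent(f, grad, np.array([5.0, 5.0])))   # -> near [0, 0]
```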

7: Newton's method

- Standard Newton's method
- Damped Newton's method
- Cholesky factorization based Newton's method
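
A sketch tying the three bullets together: the Newton system is solved with a Cholesky factorization (which doubles as a positive-definiteness check on the Hessian), and the unit step is damped by Armijo backtracking. Illustrative only; the test function and parameters are made up:

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

def damped_newton(f, grad, hess, x0, sigma=1e-4, beta=0.5,
                  tol=1e-8, max_iter=50):
    """Damped Newton's method with a Cholesky-based linear solve."""
    x = x0.astype(float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        c, low = cho_factor(hess(x))      # raises if Hessian is not PD
        d = cho_solve((c, low), -g)       # Newton direction: H d = -g
        t = 1.0                           # start from the full Newton step
        while f(x + t * d) > f(x) + sigma * t * (g @ d):   # Armijo damping
            t *= beta
        x = x + t * d
    return x

# Example: f(x) = x1^4 + x2^2 (smooth, minimum at the origin).
f = lambda x: x[0]**4 + x[1]**2
grad = lambda x: np.array([4 * x[0]**3, 2 * x[1]])
hess = lambda x: np.array([[12 * x[0]**2, 0.0], [0.0, 2.0]])
print(damped_newton(f, grad, hess, np.array([2.0, 3.0])))  # -> near [0, 0]
```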

8: Convex sets and functions

- Definition and examples of convex sets
- First/second order characterizations of convex functions

9: Convex optimization 1

- Stationarity
- Orthogonal projection
- Gradient projection method
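
The three bullets compose into the gradient projection method: take a gradient step, then project orthogonally back onto the feasible convex set; a fixed point of this map is a stationary point. A minimal sketch over the unit Euclidean ball, with an illustrative step size and data:

```python
import numpy as np

def projected_gradient(grad, proj, x0, step=0.1, max_iter=500, tol=1e-8):
    """Gradient projection method: gradient step, then orthogonal
    projection back onto the feasible convex set."""
    x = proj(x0.astype(float))
    for _ in range(max_iter):
        x_new = proj(x - step * grad(x))
        if np.linalg.norm(x_new - x) < tol:   # fixed point reached
            break
        x = x_new
    return x

# Example: minimize ||x - c||^2 over the unit Euclidean ball.
c = np.array([2.0, 1.0])
grad = lambda x: 2 * (x - c)
proj = lambda x: x / max(1.0, np.linalg.norm(x))   # projection onto the ball
print(projected_gradient(grad, proj, np.zeros(2))) # -> c / ||c||
```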

10: Convex optimization 2

- Convergence analysis

11: Optimality conditions for linearly constrained problems

- Separation and alternative theorems (Farkas' lemma)
- KKT conditions
- Orthogonal regression

12: KKT conditions

- Fritz John theorem
- KKT conditions for inequality/equality constrained problem
- KKT conditions for convex optimization problem
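
For reference, the KKT conditions covered in this unit, stated for the standard problem min f(x) subject to g_i(x) ≤ 0 (i = 1, ..., m) and h_j(x) = 0 (j = 1, ..., p); this is the textbook formulation, not a transcription of the lecture notes:

```latex
\begin{aligned}
&\nabla f(x^\ast) + \sum_{i=1}^{m} \lambda_i \nabla g_i(x^\ast)
   + \sum_{j=1}^{p} \mu_j \nabla h_j(x^\ast) = 0
   && \text{(stationarity)}\\
&g_i(x^\ast) \le 0, \qquad h_j(x^\ast) = 0
   && \text{(primal feasibility)}\\
&\lambda_i \ge 0
   && \text{(dual feasibility)}\\
&\lambda_i \, g_i(x^\ast) = 0
   && \text{(complementary slackness)}
\end{aligned}
```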

13: Duality

- Motivations
- Definition
- Weak/strong duality in convex case
- Examples (LP, QP, orthogonal projection, Chebyshev center, sum of norms, denoising, etc.)
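
As a canonical instance of the LP example listed above, the standard primal-dual pair; weak duality (bᵀy ≤ cᵀx for any feasible pair) follows directly from the two constraint sets. Again a textbook formulation, not a transcription of the lecture notes:

```latex
\begin{aligned}
\text{(P)} \quad & \min_{x}\ c^{\top} x && \text{s.t. } Ax \ge b,\ x \ge 0\\
\text{(D)} \quad & \max_{y}\ b^{\top} y && \text{s.t. } A^{\top} y \le c,\ y \ge 0
\end{aligned}
```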

14: Advanced topics 1

- Stochastic optimization (SGD, SAG, SVRG, etc.)
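
Plain SGD, the starting point for the variance-reduced variants (SAG, SVRG) named above, shown on a least-squares objective: each step uses the gradient of a single randomly chosen sample. A minimal sketch with an illustrative learning rate and synthetic data:

```python
import numpy as np

def sgd_least_squares(A, b, lr=0.01, epochs=50, seed=0):
    """Plain SGD on f(x) = (1/2n) * sum_i (a_i^T x - b_i)^2; the
    per-sample gradient is (a_i^T x - b_i) * a_i."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):     # one pass over shuffled samples
            x -= lr * (A[i] @ x - b[i]) * A[i]
    return x

rng = np.random.default_rng(1)
A = rng.standard_normal((200, 5))
x_true = rng.standard_normal(5)
b = A @ x_true + 0.01 * rng.standard_normal(200)
print(np.linalg.norm(sgd_least_squares(A, b) - x_true))  # should be small
```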

15: Advanced topics 2

- Proximal (stochastic) optimization methods
- ADMM
- Optimization on Riemannian manifolds
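
Of the advanced topics above, the proximal gradient method is the easiest to sketch: for the lasso, the proximal operator of the ℓ1 term is soft-thresholding, giving the ISTA iteration. Illustrative only; lam and the data are made up:

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def ista(A, b, lam, step=None, max_iter=500):
    """Proximal gradient (ISTA) for min_x 0.5*||Ax - b||^2 + lam*||x||_1."""
    if step is None:
        # 1/L with L = largest eigenvalue of A^T A, the Lipschitz
        # constant of the smooth part's gradient.
        step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(max_iter):
        x = soft_threshold(x - step * A.T @ (A @ x - b), step * lam)
    return x

rng = np.random.default_rng(2)
A = rng.standard_normal((100, 20))
x_true = np.zeros(20)
x_true[[3, 7]] = [1.5, -2.0]               # sparse ground truth
b = A @ x_true + 0.01 * rng.standard_normal(100)
print(np.round(ista(A, b, lam=0.5), 2))    # recovers the sparse pattern
```
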
Study Outside Class (Preparation and Review)
Students are strongly encouraged to study related materials such as books and papers.
Grading Method and Criteria (Including Minimum Standards)
- Evaluation method
Midterm report: 50%
Final report: 50%

- Evaluation criteria
The depth of students' understanding of the fundamentals and algorithms of optimization theory.
Office Hours / Course Consultation
Contact the lecturer by e-mail to make an appointment.
Message to Students
- Lectures may be given in English if one or more non-Japanese students attend.
- The blackboard is used in lectures.
- Matlab simulation tasks are provided to deepen students' understanding.
Other: Students interested in machine learning, pattern recognition, and big data analysis are welcome.
Keywords: optimization problems, nonlinear programming, gradient, Hessian, convex sets/functions, optimality conditions, fundamentals of iterative gradient descent, line search methods (backtracking, Armijo condition, Wolfe conditions), steepest descent, Newton's method, quasi-Newton methods, conjugate gradient, scaled/preconditioned descent methods, stochastic gradient descent