講義概要/Course Information
As of 2024/05/03

科目基礎情報/General Information
授業科目名/Course title (Japanese): 応用ネットワーキング論2
英文授業科目名/Course title (English): Network Applications 2
開講年度/Academic year: AY 2019
開講年次/Year offered: All years
開講学期/Semester(s) offered: Second semester
開講コース・課程/Faculty offering the course: Master's and doctoral programs
授業の方法/Teaching method: Lecture
単位数/Credits: 2
科目区分/Category: Elective
開講類・専攻/Cluster/Department: Department of Information Network Systems
担当教員名/Lecturer(s): 笠井 裕之 (Hiroyuki Kasai)
居室/Office: 東2-611
公開E-mail/e-mail: hiroyuki.kasai@waseda.jp
授業関連Webページ/Course website: http://www.kasailab.com/lecture
更新日/Last update: 2019/10/07 07:38:02
更新状況/Update status: 公開中 (now open to public)
講義情報/Course Description
講義の狙い、目標/Aims and objectives
This lecture addresses the fundamentals and algorithms of optimization theory, one of the core technologies of machine learning. The focus is on nonlinear programming. Note that topics related to combinatorial optimization, non-differentiable optimization, discrete optimization, and linear programming are not covered.

In the lectures, students learn the techniques and theory of machine learning (data optimization methods) for data analysis.
内容/Contents
Lectures follow the outline below; the content may be adjusted according to students' level of understanding.

1:Introduction

- Big and high-dimensional data analysis and its issues
- Linear and logistic regression
- Optimization technique basis

2: Mathematical preliminaries

- The spaces R^n and R^{m×n}
- Inner product and norms
- Eigenvalues and eigenvectors
- Basic topological concepts


3: Optimality conditions for unconstrained optimization

- Global and local optima
- Classification of matrices
- First/second order optimality conditions
- Quadratic functions

4: Least squares

- Overdetermined systems
- Data fitting
- Regularized least squares
- De-noising
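
The regularized least-squares problem listed above admits a closed-form solution via the normal equations. A minimal sketch (in Python rather than the Matlab used for the course tasks; all names are illustrative):

```python
import numpy as np

def ridge(A, b, lam):
    """Solve min_x ||Ax - b||^2 + lam * ||x||^2 via the normal equations
    (A^T A + lam * I) x = A^T b."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

# Overdetermined system: more equations (5) than unknowns (2).
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 2))
x_true = np.array([1.0, -2.0])
b = A @ x_true                     # noiseless, consistent right-hand side
x_hat = ridge(A, b, lam=1e-8)      # tiny lam: close to ordinary least squares
```

With a small regularization weight the solution recovers the generating coefficients; larger values of `lam` trade data fit for a smaller-norm solution, which is the mechanism behind the de-noising application.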

5: Gradient method 1

- Descent direction methods
- Gradient method
- Condition number
- Diagonal scaling
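
The basic gradient method above can be sketched in a few lines; on a quadratic f(x) = ½ xᵀQx, its convergence speed is governed by the condition number of Q (Python sketch, illustrative names):

```python
import numpy as np

def gradient_descent(grad, x0, step, iters):
    """Fixed-step gradient method: x_{k+1} = x_k - step * grad(x_k)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - step * grad(x)
    return x

# f(x) = 0.5 * x^T Q x with Q = diag(1, 10); condition number kappa = 10.
Q = np.diag([1.0, 10.0])
grad = lambda x: Q @ x
# step = 1/L with L = lambda_max(Q) = 10 guarantees convergence.
x_star = gradient_descent(grad, x0=[5.0, 5.0], step=0.1, iters=500)
```

The badly scaled coordinate forces a small step, so the well-scaled coordinate converges slowly; diagonal scaling (premultiplying by diag(Q)^{-1}) would equalize the two rates.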

6: Gradient method 2

- Line search (exact, backtracking, Wolfe conditions, etc.)
- Convergence analysis
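
Backtracking line search, one of the strategies listed above, repeatedly shrinks the step until a sufficient-decrease (Armijo) condition holds. A minimal Python sketch (illustrative names):

```python
import numpy as np

def backtracking(f, grad_fx, x, d, alpha0=1.0, beta=0.5, sigma=1e-4):
    """Backtracking line search: shrink alpha until the Armijo condition
    f(x + alpha*d) <= f(x) + sigma * alpha * <grad_fx, d> holds."""
    alpha = alpha0
    fx = f(x)
    while f(x + alpha * d) > fx + sigma * alpha * (grad_fx @ d):
        alpha *= beta
    return alpha

f = lambda x: float(x @ x)   # f(x) = ||x||^2
x = np.array([3.0, 4.0])
g = 2 * x                    # gradient of f at x
d = -g                       # steepest-descent direction
alpha = backtracking(f, g, x, d)
```

Since d is a descent direction, the loop terminates with a step that strictly decreases f; the Wolfe conditions additionally bound the curvature along d.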

7: Newton's method

- Standard Newton's method
- Damped Newton's method
- Cholesky factorization based Newton's method
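
The standard and damped Newton iterations above differ only in the step size applied to the Newton direction. A minimal Python sketch (illustrative names; the Cholesky-based variant, which factors the Hessian to check positive definiteness, is not shown):

```python
import numpy as np

def newton(grad, hess, x0, iters=20, t=1.0):
    """(Damped) Newton's method: x_{k+1} = x_k - t * H(x_k)^{-1} grad(x_k).
    t = 1 gives the standard method; t < 1 is a simple damped variant."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - t * np.linalg.solve(hess(x), grad(x))
    return x

# Minimize f(x) = x1^4 + x2^2; the minimizer is the origin.
grad = lambda x: np.array([4 * x[0] ** 3, 2 * x[1]])
hess = lambda x: np.array([[12 * x[0] ** 2, 0.0], [0.0, 2.0]])
x_star = newton(grad, hess, x0=[1.0, 1.0])
```

The quadratic coordinate is solved in one step, while the quartic coordinate contracts by a factor 2/3 per iteration, illustrating that Newton's fast local convergence depends on the Hessian near the solution.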

8: Convex sets and functions

- Definition and examples of convex sets
- First/second order characterizations of convex functions

9: Convex optimization 1

- Stationarity
- Orthogonal projection
- Gradient projection method
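
The gradient projection method above alternates a gradient step with an orthogonal projection onto the feasible set. A minimal Python sketch for a box constraint, where the projection is a coordinate-wise clip (illustrative names):

```python
import numpy as np

def gradient_projection(grad, proj, x0, step, iters):
    """Gradient projection: x_{k+1} = P_C(x_k - step * grad(x_k))."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = proj(x - step * grad(x))
    return x

# Minimize ||x - c||^2 over the box [0, 1]^2, with c outside the box.
c = np.array([2.0, -1.0])
grad = lambda x: 2 * (x - c)
proj = lambda x: np.clip(x, 0.0, 1.0)   # orthogonal projection onto the box
x_star = gradient_projection(grad, proj, x0=[0.5, 0.5], step=0.25, iters=50)
```

The iterates converge to the projection of c onto the box, (1, 0), a stationary point of the constrained problem.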

10: Convex optimization 2

- Convergence analysis

11: Optimality conditions for linearly constrained problems

- Separation and alternative theorems (Farkas' lemma)
- KKT conditions
- Orthogonal regression

12: KKT conditions

- Fritz John theorem
- KKT conditions for inequality/equality constrained problem
- KKT conditions for convex optimization problem

13: Duality

- Motivations
- Definition
- Weak/strong duality in convex case
- Examples (LP, QP, orthogonal projection, Chebyshev center, sum of norms, de-noising, etc.)

14: Advanced topics 1

- Stochastic optimization (SGD, SAG, SVRG, etc.)
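
Stochastic gradient descent, the first of the methods above, replaces the full gradient with the gradient of a single randomly chosen term. A minimal Python sketch on a least-squares objective (illustrative names; SAG and SVRG are variance-reduced variants not shown):

```python
import numpy as np

def sgd(A, b, step, epochs, seed=0):
    """SGD for min_x (1/m) * sum_i (a_i^T x - b_i)^2: each update uses the
    gradient of one randomly chosen term instead of the full sum."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(epochs):
        for i in rng.permutation(m):           # one pass over shuffled data
            x -= step * 2 * (A[i] @ x - b[i]) * A[i]
    return x

rng = np.random.default_rng(1)
A = rng.standard_normal((100, 3))
x_true = np.array([1.0, -1.0, 2.0])
b = A @ x_true                                  # noiseless, consistent targets
x_hat = sgd(A, b, step=0.01, epochs=200)
```

Each update costs O(n) instead of O(mn), which is the appeal for big-data problems; on this noiseless, consistent system even a constant step converges to the true coefficients.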

15: Advanced topics 2

- Proximal (Stochastic) optimization methods
- ADMM
- Optimization on Riemannian manifolds
教科書、参考書/Textbooks and references: None in particular.
予備知識/Prerequisites: Basic linear algebra and calculus.
演習/Exercises: Several report assignments are given during the course.
成績評価方法および評価基準/Evaluation method and criteria

- Evaluation method
Midterm report: 50%
Final report: 50%

- Evaluation criteria
How deeply students understand the fundamentals and algorithms of optimization theory.
その他
/Others
- Students who are interested in machine learning, pattern recognition, and big data analysis are welcome.

- It is recommended to contact the lecturer by e-mail if you have any questions.
- Lectures can be given in English if at least one non-Japanese student attends.
- A blackboard is used.
- Matlab simulation tasks are assigned to students to deepen their understanding.
キーワード
/Keywords
Optimization problems, nonlinear programming, gradient, Hessian, convex sets/functions, optimality conditions, iterative gradient descent fundamentals, line search methods (backtracking, Armijo condition, Wolfe condition), steepest descent, Newton's method, quasi-Newton methods, conjugate gradient, scaled/preconditioned descent methods, stochastic gradient descent