2019 NCTU Institute of Computer Science Machine Learning class
teachers: 洪瑞鴻、邱維辰
notice:
If you find an error in any assignment, feel free to open an issue or a pull request to fix it :)
written on 2020/04/15
logistic regression:
- For both gradient descent and Newton's method, the learning rate has to be tuned (different tasks need different learning rates); see the sketch below.
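A minimal sketch of the two update rules, assuming a NumPy implementation with a design matrix `X` and 0/1 labels `y` (the function names, default values, and singular-Hessian fallback are illustrative, not the original homework code):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_descent(X, y, lr=0.01, iters=1000):
    """Logistic regression via gradient descent; lr must be tuned per task."""
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        grad = X.T @ (sigmoid(X @ w) - y)        # gradient of the negative log-likelihood
        w -= lr * grad
    return w

def newton_method(X, y, lr=1.0, iters=20):
    """Logistic regression via Newton's method; a step size still helps stability."""
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        p = sigmoid(X @ w)
        grad = X.T @ (p - y)
        H = X.T @ (X * (p * (1 - p))[:, None])   # Hessian of the negative log-likelihood
        try:
            step = np.linalg.solve(H, grad)
        except np.linalg.LinAlgError:
            step = grad                          # fall back to a plain gradient step
        w -= lr * step
    return w
```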
EM algorithm:
- log likelihood
    - Accumulate the likelihood with addition (in log space); multiplying the raw probabilities underflows to an extremely small value (0.x to the 784th power).
    - When normalizing the posterior, every entry is negative, so the original maximum becomes the minimum after a naive normalization; use negative normalization to fix this.
    - Vectorizing the E-step matters a lot: 10 minutes -> 5 seconds (see the sketch after this list).
    - Watch out for cache misses in the for loops.
    - Accuracy is about 28%.
- likelihood
    - The likelihood is computed by multiplication.
    - The E-step can also be vectorized.
    - The prior probabilities of a few classes easily collapse to 0.
    - Accuracy is about 4x%.
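A minimal sketch of a vectorized, log-space E-step for a Bernoulli mixture on binarized MNIST, assuming images `X` of shape (N, 784) with 0/1 pixels, per-class pixel probabilities `P` of shape (10, 784), and mixing weights `pi` of shape (10,) (all names and shapes are illustrative, not the original code). The clipping also guards against the zero-probability classes mentioned for the plain-likelihood variant.

```python
import numpy as np

def e_step(X, P, pi, eps=1e-10):
    """Return the (N, 10) responsibility matrix for a Bernoulli mixture model."""
    P = np.clip(P, eps, 1 - eps)              # avoid log(0) and probabilities collapsing to 0
    # Per-image, per-class log-likelihood as two matrix products instead of
    # Python loops over N images and 784 pixels.
    log_like = X @ np.log(P).T + (1 - X) @ np.log(1 - P).T
    log_post = np.log(pi) + log_like          # unnormalized log posterior; every entry is negative
    # Normalize in log space: subtracting the per-row maximum keeps the largest
    # entry largest after exponentiation and avoids underflow.
    log_post -= log_post.max(axis=1, keepdims=True)
    resp = np.exp(log_post)
    resp /= resp.sum(axis=1, keepdims=True)   # rows sum to 1
    return resp
```

Replacing the per-pixel Python loops with these two matrix products is what turns a roughly 10-minute E-step into a few seconds.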
GP reference: http://krasserm.github.io/2018/03/19/gaussian-processes/
libsvm reference: https://github.com/cjlin1/libsvm
eigenvalue & eigenvector: https://drive.google.com/drive/folders/1vpMZ8n42cQJmU54vu38uNZ542-dA6MJb?usp=sharing
Fisherfaces reference: https://www.bytefish.de/blog/fisherfaces/