주제 : "Regularization"

Date: 1/15


Presenter: 이수진


Description: An overview of regularization techniques (data augmentation, ensemble, weight decay, batch normalization, dropout)
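
For reference, a minimal sketch of how three of the listed techniques typically appear in PyTorch; the model shape, p=0.5, and weight_decay=1e-4 are illustrative placeholder choices, not material from the talk:

import torch
import torch.nn as nn

# Batch normalization and dropout are layers inside the model.
model = nn.Sequential(
    nn.Linear(784, 256),
    nn.BatchNorm1d(256),   # batch normalization: normalize each mini-batch's pre-activations
    nn.ReLU(),
    nn.Dropout(p=0.5),     # dropout: randomly zero half of the activations during training
    nn.Linear(256, 10),
)

# Weight decay is an L2 penalty on the weights, applied through the optimizer.
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, weight_decay=1e-4)

model.train()                    # dropout and batch-norm batch statistics active
x = torch.randn(32, 784)         # dummy mini-batch of 32 examples
y = torch.randint(0, 10, (32,))  # dummy labels
loss = nn.functional.cross_entropy(model(x), y)
optimizer.zero_grad()
loss.backward()
optimizer.step()

model.eval()  # at inference, dropout is disabled and batch norm uses running statistics

Data augmentation and ensembling operate outside the model itself (transforming training inputs, or averaging several trained models), so they are not shown in this sketch.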


File link: https://koreaoffice-my.sharepoint.com/:b:/g/personal/lee_sujin_korea_edu/EVNtfzwV_dlPlaOzaD8YDPcBdUx_wM6Q7FyIvIjqatNSIQ?e=aDgX0o




References:


[1] https://swalloow.github.io/bagging-boosting


[2] https://light-tree.tistory.com/139


[3] https://shuuki4.wordpress.com/2016/01/13/batch-normalization-%EC%84%A4%EB%AA%85-%EB%B0%8F-%EA%B5%AC%ED%98%84/


[4] https://luvimperfection.tistory.com/105


[5] http://blog.naver.com/PostView.nhn?blogId=laonple&logNo=220818841217


[6] https://nmhkahn.github.io/CNN-Practice


[7] https://m.blog.naver.com/PostView.nhn?blogId=laonple&logNo=220662317927&proxyReferer=https%3A%2F%2Fwww.google.com%2F


[8] https://nittaku.tistory.com/272


[9] 오일석 (Il-Seok Oh), 기계학습 (Machine Learning), 한빛아카데미 (Hanbit Academy)


[10] Sergey Ioffe and Christian Szegedy, "Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift," ICML'15: Proceedings of the 32nd International Conference on Machine Learning, Vol. 37, July 2015, pp. 448–456


[11] Nitish Srivastava, Geoffrey Hinton, Alex Krizhevsky, Ilya Sutskever, and Ruslan Salakhutdinov, "Dropout: A Simple Way to Prevent Neural Networks from Overfitting," Journal of Machine Learning Research, Vol. 15, June 2014, pp. 1929–1958


[12] Andrew Ng, Batch Normalization (YouTube): https://youtu.be/nUUqwaxLnWs


[13] Andrew Ng, Dropout (YouTube): https://youtu.be/D8PJAL-MZv8