Neural Network Study Notes

Common activation functions
http://www.36dsj.com/archives/40455
The ReLU activation function
http://www.cnblogs.com/neopenx/p/4453161.html
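A minimal sketch of ReLU and its gradient, for reference alongside the links above (function names are mine, not from the linked articles):

```python
import numpy as np

def relu(x):
    # element-wise max(0, x)
    return np.maximum(0.0, x)

def relu_grad(x):
    # derivative: 1 where x > 0, else 0 (undefined at 0; 0 by convention)
    return (x > 0).astype(x.dtype)
```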
Greedy layer-wise pretraining
http://blog.csdn.net/dcxhun3/article/details/48131745
Faster than Momentum: Nesterov Accelerated Gradient demystified
https://zhuanlan.zhihu.com/p/22810533
Detailed introduction to training algorithms
http://sebastianruder.com/optimizing-gradient-descent/index.html#whichoptimizertochoose

What on earth is weight decay normalization?
https://www.zhihu.com/question/24529483

RMSprop and other training algorithms explained
http://mooc.guokr.com/note/9711/
The RMSprop algorithm
http://climin.readthedocs.io/en/latest/rmsprop.html
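A hedged sketch of the RMSprop update rule described in the climin docs: keep a moving average of squared gradients and divide the step by its square root (default hyperparameters here are illustrative, not prescriptive):

```python
import numpy as np

def rmsprop_step(w, grad, avg_sq, lr=0.01, decay=0.9, eps=1e-8):
    # running average of squared gradients
    avg_sq = decay * avg_sq + (1 - decay) * grad ** 2
    # scale the step by the root of that average (eps avoids division by zero)
    w = w - lr * grad / (np.sqrt(avg_sq) + eps)
    return w, avg_sq
```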

Differences between L1 and L2 as Loss Function and Regularization
http://www.chioka.in/differences-between-l1-and-l2-as-loss-function-and-regularization/
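To make the L1-vs-L2 contrast in the article above concrete, a small sketch of the two penalties and their gradients (function names are my own):

```python
import numpy as np

def l1_penalty(w, lam):
    return lam * np.sum(np.abs(w))   # constant-rate shrinkage -> sparsity

def l2_penalty(w, lam):
    return lam * np.sum(w ** 2)      # quadratic penalty -> small weights

def l1_grad(w, lam):
    return lam * np.sign(w)          # same magnitude everywhere, can hit exact zeros

def l2_grad(w, lam):
    return 2 * lam * w               # shrinks proportionally to w, rarely exactly zero
```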
Custom loss and activation functions
http://spaces.ac.cn/archives/4293/
Novel weight-update rules
http://forum.ai100.com.cn/blog/thread/ml-2015-09-22-3889831303790401/
https://www.dengfanxin.cn/?m=201610

Newton's method and gradient descent
https://www.zhihu.com/question/19723347?sort=created&page=1
Backpropagation explained in detail (part 1)
https://zhuanlan.zhihu.com/p/25081671?utm_source=tuicool&utm_medium=referral
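A tiny worked backpropagation example to go with the article above: one sigmoid hidden layer, squared-error loss, one gradient step. Shapes and variable names are my own, not taken from the post:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))           # 4 samples, 3 features
y = rng.normal(size=(4, 1))
W1 = rng.normal(size=(3, 5)) * 0.1
W2 = rng.normal(size=(5, 1)) * 0.1

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss(W1, W2):
    return np.mean((sigmoid(x @ W1) @ W2 - y) ** 2)

# forward pass
h = sigmoid(x @ W1)
err = h @ W2 - y                      # residual, dL/dpred up to 2/N

# backward pass: chain rule, layer by layer
dW2 = h.T @ (2 * err) / len(x)
dh = (2 * err) @ W2.T / len(x)
dW1 = x.T @ (dh * h * (1 - h))        # sigmoid'(z) = h * (1 - h)

before = loss(W1, W2)
after = loss(W1 - 0.1 * dW1, W2 - 0.1 * dW2)
```

A small step along the negative gradients should reduce the loss, which is a quick sanity check that the backward pass is consistent with the forward pass.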
Five major algorithms for training neural networks
http://blog.csdn.net/starzhou/article/details/52918119

Optimization methods in deep learning: stochastic gradient descent, limited-memory BFGS, conjugate gradient
http://m.blog.csdn.net/article/details?id=51735709

Natural Neural Networks
https://arxiv.org/abs/1507.00210

Synthetic gradients
https://www.jiqizhixin.com/articles/8d06d921-ede0-49bf-b84e-6bf652d10b1f
Eliminating the forward and backward passes
https://zhuanlan.zhihu.com/p/22143664
Let Computers Learn to Learn
https://zhuanlan.zhihu.com/p/21362413?refer=intelligentunit

The idea behind Dropout: discard some "genes" to gain adaptability to the unknown
http://36kr.com/p/5044681.html

Batch Normalization
http://blog.csdn.net/meanme/article/details/48679785
http://blog.csdn.net/happynear/article/details/44238541

Why does Batch Normalization work so well in deep learning?
https://www.zhihu.com/question/38102762

The Batch Normalization paper
https://arxiv.org/pdf/1502.03167v3.pdf
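A minimal sketch of the training-time batch-norm transform from the paper above: normalize each feature over the batch, then scale and shift. Here gamma and beta are fixed scalars for illustration; in practice they are learnable per-feature parameters:

```python
import numpy as np

def batch_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    # normalize each feature over the batch dimension
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    # then scale and shift (learnable in a real layer)
    return gamma * x_hat + beta
```

At inference time the paper replaces the batch statistics with running averages collected during training.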

Neural network tutorial
http://deeplearning.stanford.edu/wiki/index.php/UFLDL_Tutorial

gographviz, for drawing neural network graphs
https://github.com/awalterschulze/gographviz

playground: interactive neural network demo
http://playground.tensorflow.org/

Softmax resources
http://blog.csdn.net/celerychen2009/article/details/9014797
https://www.zhihu.com/question/23765351
https://www.zybuluo.com/frank-shaw/note/143260
https://wenjun.blog.ustc.edu.cn/logistic-sigmoid-function.html
The softmax function and its derivative, explained in detail
https://zhuanlan.zhihu.com/p/25723112
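A numerically stable softmax and its Jacobian, matching the derivative d s_i / d z_j = s_i (δ_ij - s_j) derived in the post above (subtracting the max does not change the result, since softmax is shift-invariant):

```python
import numpy as np

def softmax(z):
    z = z - z.max()                   # stability: avoid overflow in exp
    e = np.exp(z)
    return e / e.sum()

def softmax_jacobian(z):
    s = softmax(z)
    # d s_i / d z_j = s_i * (delta_ij - s_j)
    return np.diag(s) - np.outer(s, s)
```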

Binary neural networks
http://www.tuicool.com/articles/eu6VBzf


Published: 2017-10-12 10:55:52

Original link (please keep when reposting): http://www.multisilicon.com/blog/a24276522.html
