Machine Learning Techniques Notes: 2-12 Neural Network


If every layer applied only a linear transformation, the whole network would still be a linear model, so there would be no point in stacking layers; a single linear model would do. Hence linear transformations are generally not used between layers.

If a step (sign) function is used in the middle, the resulting 0/1-style error is hard to optimize (an NP-hard problem), so it is generally not used either.

An S-shaped function in the middle is the common choice.

The S function used here (tanh) relates to θ(s), the sigmoid used in logistic regression, by tanh(s) = 2θ(2s) − 1: a scaled and shifted version of the same curve.
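The identity above can be checked numerically; a minimal sketch:

```python
import math

def logistic(s):
    # theta(s) = 1 / (1 + e^{-s}), the sigmoid from logistic regression
    return 1.0 / (1.0 + math.exp(-s))

# tanh(s) = 2 * theta(2s) - 1: a scaled and shifted logistic sigmoid
for s in [-2.0, -0.5, 0.0, 0.5, 2.0]:
    assert abs(math.tanh(s) - (2.0 * logistic(2.0 * s) - 1.0)) < 1e-12
```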


Each layer takes the previous layer's output as its input; its own transformed output in turn becomes the next layer's input.

s_3^(2): the score (pre-transformation input) of the 3rd neuron in layer 2; in general s_j^(ℓ) denotes the score of neuron j in layer ℓ.
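The layer-by-layer computation can be sketched as a forward pass. This is a minimal illustration, not the course's exact code: it assumes a 3-5-1 network with tanh at every layer (the regression version in the lecture keeps the last layer's score untransformed), and the weights are hypothetical small random values.

```python
import math
import random

random.seed(0)

def forward(x, weights):
    """Forward pass: each layer's tanh-transformed output becomes the
    next layer's input. weights[l] has (d_prev + 1) rows and d_curr
    columns, row 0 holding the bias weights for the constant x_0 = 1."""
    for W in weights:
        x = [1.0] + x                          # prepend the bias term x_0 = 1
        scores = [sum(xi * W[i][j] for i, xi in enumerate(x))
                  for j in range(len(W[0]))]   # s_j = sum_i w_ij * x_i
        x = [math.tanh(s) for s in scores]     # tanh transformation
    return x

# hypothetical small random weights for a 3-5-1 network
w1 = [[random.uniform(-0.1, 0.1) for _ in range(5)] for _ in range(4)]
w2 = [[random.uniform(-0.1, 0.1)] for _ in range(6)]
out = forward([0.5, -1.0, 2.0], [w1, w2])
assert len(out) == 1 and -1.0 < out[0] < 1.0
```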


Each neuron takes the inner product of the previous layer's output x with its own weight vector w, which measures how parallel x and w are; the score is largest when they are parallel, so each neuron acts as a pattern detector for its weight vector.


Weight counts (for a 3-5-1 network):

First layer: (3+1) weights × 5 neurons = 20

Second layer: (5+1) × 1 = 6

The +1 in each layer accounts for the bias input x_0 = 1.
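The counting rule above, (d_prev + 1) × d_curr per layer, can be written out as a quick check:

```python
# Weight count for a 3-5-1 network: each layer has (d_prev + 1) * d_curr
# weights, the +1 accounting for the bias input x_0 = 1.
layers = [3, 5, 1]
counts = [(layers[l] + 1) * layers[l + 1] for l in range(len(layers) - 1)]
assert counts == [20, 6]   # first layer: (3+1)*5 = 20, second: (5+1)*1 = 6
assert sum(counts) == 26
```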


e_n: the error made on a single example. We study how adjusting each weight affects e_n (the stochastic-gradient view).


error: the squared difference between the target output and the output of the last-layer neuron.

At first glance, the error seems to depend only on the last layer's weights; the dependence on earlier layers' weights comes through the chain rule.

In w_ij^(ℓ): i indexes the connected neuron in the previous layer (ℓ−1), j indexes the neuron in the current layer ℓ.

δ^(L): the sensitivity of the error to the last layer's score, δ_j^(ℓ) = ∂e_n/∂s_j^(ℓ); the deltas of earlier layers are then obtained from it by backpropagation.
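The backward recursion for the deltas can be sketched as follows. This is a minimal illustration under the notes' setup, e_n = (y_n − s^(L))² with a linear output layer; the score and weight values in the example are hypothetical.

```python
import math

def backprop_deltas(y, scores, weights):
    """Compute delta_j^(l) = d e_n / d s_j^(l) for each layer, assuming
    e_n = (y - s^(L))^2 with a linear output layer. scores[l] holds the
    scores s_j of layer l+1; weights[l] maps layer l outputs (plus the
    bias in row 0) to layer l+1 scores. The gradient for each weight is
    then d e_n / d w_ij^(l) = delta_j^(l) * x_i^(l-1)."""
    L = len(scores)
    deltas = [None] * L
    # output layer: delta^(L) = -2 * (y - s^(L))
    deltas[-1] = [-2.0 * (y - scores[-1][0])]
    # hidden layers: delta_i^(l) = sum_j delta_j^(l+1) * w_ij * tanh'(s_i)
    for l in range(L - 2, -1, -1):
        W = weights[l + 1]
        deltas[l] = [
            sum(d * W[i + 1][j] for j, d in enumerate(deltas[l + 1]))
            * (1.0 - math.tanh(s) ** 2)        # tanh'(s) = 1 - tanh(s)^2
            for i, s in enumerate(scores[l])
        ]
    return deltas

# tiny example: 2 hidden scores, 1 output score, hypothetical weights
scores = [[0.3, -0.4], [0.7]]
w2 = [[0.1], [0.2], [-0.3]]       # (2+1) x 1, row 0 is the bias row
d = backprop_deltas(y=1.0, scores=scores, weights=[None, w2])
assert d[1] == [-2.0 * (1.0 - 0.7)]
assert d[0][0] < 0 < d[0][1]      # signs follow the connecting weights
```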


Different starting points descend into different valleys: gradient descent finds only a local minimum, and the global optimum is hard to reach.

large weights: they push tanh into its saturation region (the flat part), where gradients are tiny; hence initialize with small random weights.


Fewer iterations also means a smaller effective VC dimension; the best effective VC dimension lies somewhere in the middle of training.

The way to stop in the middle: early stopping, using a validation set to decide when to stop.
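Early stopping can be sketched as: track the validation error along the optimization path and return the best weights seen, not the final ones. The callables `step` and `val_error` below are hypothetical placeholders for one gradient-descent update and a held-out validation evaluation.

```python
def train_with_early_stopping(init_w, step, val_error, max_iters=1000):
    """Early stopping sketch: remember the weights with the lowest
    validation error seen so far and return them instead of the final
    iterate. step(w) performs one update; val_error(w) evaluates w on
    a held-out validation set (both hypothetical callables)."""
    w = init_w
    best_w, best_err = w, val_error(w)
    for _ in range(max_iters):
        w = step(w)
        err = val_error(w)
        if err < best_err:          # best point on the path so far
            best_w, best_err = w, err
    return best_w

# toy 1-D check: descending past the validation optimum at w = 2
best = train_with_early_stopping(
    5.0, lambda w: w - 0.5, lambda w: (w - 2.0) ** 2, max_iters=10)
assert best == 2.0                  # stops at the best point, not the last
```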
