
零基础入门深度学习(3) - 神经网络和反向传播算法 (Neural Networks and Backpropagation, vectorized programming section): why is the output layer's delta computed twice? #46

Open
ljfxmu opened this issue Mar 31, 2019 · 0 comments


ljfxmu commented Mar 31, 2019

def calc_gradient(self, label):
    delta = self.layers[-1].activator.backward(
        self.layers[-1].output
    ) * (label - self.layers[-1].output)
    # When computing the gradient, the output layer's delta is computed
    # first (above), and that delta is then used to compute the hidden
    # layers' deltas.

    # But the loop below applies the hidden-layer delta formula to every
    # layer, so doesn't its first iteration recompute the output layer's
    # delta a second time? Does the layer traversal start from the output
    # layer?
    for layer in self.layers[::-1]:
        layer.backward(delta)
        delta = layer.delta
    return delta
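
For reference, the traversal order itself is easy to check: self.layers[::-1] is a reversed copy of the layer list, so the loop does start at the output layer. A minimal sketch of that ordering, using hypothetical placeholder names rather than the tutorial's actual layer objects:

layers = ['hidden1', 'hidden2', 'output']  # hypothetical stand-ins for layer objects
for layer in layers[::-1]:
    print(layer)
# Prints: output, hidden2, hidden1 -- so the first backward() call in the
# loop above receives the delta just computed for the output layer.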
