
Question about backpropagation in the fully connected network #57

Open
ume-technology opened this issue Nov 3, 2020 · 2 comments

@ume-technology

Hello developers: I've recently been revisiting the backpropagation process. In the implementation code for backpropagation in the fully connected neural network (class NetWork):

```python
def calc_gradient(self, label):
    delta = self.layers[-1].activator.backward(
        self.layers[-1].output) * (label - self.layers[-1].output)
    for layer in self.layers[::-1]:
        layer.backward(delta)
        delta = layer.delta
    return delta
```

Reading this alongside the article, I don't understand what the first line of this gradient computation is calculating. Backpropagation is supposed to start from the loss function, but this method doesn't start from the loss value; it starts from the activation output of the output-layer neurons. Is that correct? If so, please explain it to me. Many thanks for your help.
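
For reference, a sketch of where that first line comes from, assuming the quadratic loss and sigmoid activator that the snippet's form suggests. The chain rule does start from the loss; its derivative with respect to the output is exactly the (label - output) factor:

```latex
% Output-layer delta, assuming quadratic loss E = (1/2)(t - y)^2
% and sigmoid output y = \sigma(net). These assumptions match the
% snippet's form but are not confirmed by this thread alone.
\[
\frac{\partial E}{\partial y} = -(t - y),
\qquad
\frac{\partial y}{\partial \mathrm{net}} = \sigma'(\mathrm{net}) = y\,(1 - y)
\]
\[
\delta = -\frac{\partial E}{\partial \mathrm{net}}
       = -\frac{\partial E}{\partial y}\,\frac{\partial y}{\partial \mathrm{net}}
       = (t - y)\,y\,(1 - y)
\]
```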

@ume-technology
Author

ume-technology commented Nov 3, 2020

I don't understand why, during backpropagation, the first line of the method above is computed as

self.layers[-1].activator.backward(self.layers[-1].output) * (label - self.layers[-1].output)

i.e. why exactly these quantities appear in the parentheses and get multiplied.

@huangzhenghui

delta = y * (1 - y) * (t - y), where activator.backward(self.layers[-1].output) computes y * (1 - y), i.e. the sigmoid derivative expressed in terms of the output y, and (label - self.layers[-1].output) is the (t - y) factor. The loss derivative is folded into that second factor.
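
To make this concrete, here is a minimal numerical check. The SigmoidActivator below is a stand-in written for this sketch, not necessarily the repo's exact class; it only assumes that backward takes the layer output y and returns y * (1 - y):

```python
import numpy as np

class SigmoidActivator:
    # Stand-in activator for this sketch: forward computes y = sigmoid(x);
    # backward takes the layer *output* y and returns the derivative
    # expressed in terms of y, i.e. y * (1 - y).
    def forward(self, x):
        return 1.0 / (1.0 + np.exp(-x))

    def backward(self, output):
        return output * (1.0 - output)

activator = SigmoidActivator()
net = np.array([0.5, -1.2, 2.0])   # pre-activation values of the output layer
y = activator.forward(net)         # layer output
t = np.array([1.0, 0.0, 1.0])      # training label

# First line of calc_gradient: delta = y * (1 - y) * (t - y)
delta = activator.backward(y) * (t - y)

# The same quantity from the loss: E = 0.5 * sum((t - y)^2), delta = -dE/dnet.
# Central finite differences approximate dE/dnet numerically.
eps = 1e-6
loss = lambda n: 0.5 * np.sum((t - activator.forward(n)) ** 2)
grad = np.array([(loss(net + eps * e) - loss(net - eps * e)) / (2 * eps)
                 for e in np.eye(len(net))])

print(np.allclose(delta, -grad))   # True: the first line is -dE/dnet
```

The check confirms that the first line of calc_gradient equals -dE/dnet for a quadratic loss, so the computation does start from the loss function; the loss's derivative just simplifies to (label - output).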
