
Why is the output nonlinear after applying ReLU? #95

Open
any86 opened this issue Jun 13, 2022 · 0 comments

any86 (Owner) commented Jun 13, 2022

At first I didn't understand why the articles I read all said that summing multiple ReLUs can produce multiple line segments that fit the data.

The part I mainly didn't get was this phrase: "summing multiple ReLUs".

After reading a few more articles it suddenly seemed to click. For example, if layers 1 and 2 both use ReLU, then what comes out of layer 1 is relu(x1), so at layer 2 it becomes y2 = W * relu(x1). Because W is a tensor (a weight matrix), the multiplication is an inner product, and that inner product is exactly where the sum of ReLUs shows up.
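To make that concrete, here is a minimal NumPy sketch (all the numbers below are invented for illustration): applying a weight vector to relu(h) via an inner product is literally a weighted sum of individual ReLU terms.

```python
import numpy as np

def relu(x):
    return np.maximum(0, x)

# hypothetical pre-activations coming out of layer 1 for one input
h = np.array([0.7, -1.2, 2.3])
# hypothetical weight vector of the second layer
w = np.array([0.5, -0.8, 1.1])

# the second layer computes an inner product with relu(h) ...
y2 = w @ relu(h)

# ... which expands into a sum of ReLU terms:
# y2 = 0.5*relu(0.7) + (-0.8)*relu(-1.2) + 1.1*relu(2.3)
y2_expanded = sum(w[i] * relu(h[i]) for i in range(len(h)))

assert np.isclose(y2, y2_expanded)
print(y2)  # 2.88
```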

I'm just starting out and there is a lot I don't understand yet. This is my personal take, with no guarantee of correctness; treat it as a note to myself.
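As a second sketch of the claim in the first paragraph: a weighted sum of shifted ReLUs is piecewise linear, because each relu(x - b) contributes a "kink" at x = b and is a straight line on either side of it. The weights and kink positions below are again invented for illustration.

```python
import numpy as np

def relu(x):
    return np.maximum(0, x)

# f(x) = sum_i w_i * relu(x - b_i): piecewise linear, kinks at the b_i
weights = np.array([1.0, -2.0, 1.5])   # invented
offsets = np.array([-1.0, 0.0, 1.0])   # kink locations, invented

def f(x):
    return sum(w * relu(x - b) for w, b in zip(weights, offsets))

# the slope only changes at the kinks: measure it on each interval
for x0, x1 in [(-3, -2), (-0.8, -0.2), (0.2, 0.8), (2, 3)]:
    slope = (f(x1) - f(x0)) / (x1 - x0)
    print(f"slope on [{x0}, {x1}]: {slope:.2f}")
# prints 0.00, 1.00, -1.00, 0.50 -> a different linear piece per interval
```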


If you buy the explanation above, this teacher's walkthrough should then make everything clear: https://www.bilibili.com/video/BV1Wv411K7o5?
