add flow.randn #5736
Conversation
VertexC commented on Aug 5, 2021
```python
super().__init__()
# TODO: make shape process as a util
assert size is not None, "shape must not be None!"
assert isinstance(
```
Actually, these checks are unnecessary: the functional interface raises an error directly when the types do not match.
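To illustrate the trade-off being discussed (a hypothetical sketch, not the actual OneFlow code; both function names are made up for illustration): an explicit assert surfaces a targeted, user-facing message, while relying on the downstream call produces a generic `TypeError` from deep inside the implementation.

```python
def randn_with_check(size):
    # Explicit validation: fails early with a targeted message.
    assert size is not None, "shape must not be None!"
    assert isinstance(size, (int, tuple)), "shape must be an int or a tuple!"
    return "tensor of shape %r" % (size,)

def randn_without_check(size):
    # No validation: a bad argument fails later with a generic error.
    return "tensor of shape (%s,)" % ", ".join(str(int(s)) for s in size)

print(randn_with_check((2, 3)))
try:
    randn_without_check(None)  # iterating None raises a generic TypeError
except TypeError as e:
    print("TypeError:", e)
```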
Wouldn't this kind of explicit check give the user a friendlier error message?
Random ops are stateful because they carry a generator, but is there really a use case for creating a Randn module and then producing tensors from that module?

```python
m = flow.Randn(...)
tensor_1 = m()
tensor_2 = m()
```

If not, exposing only the flow.randn interface should be enough; a module shouldn't be needed.
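The module-versus-function question can be sketched with the standard library (`random.Random` stands in for OneFlow's generator; this `Randn` class is a hypothetical illustration, not the class from this PR):

```python
import random

class Randn:
    """Hypothetical module form: holds a stateful generator."""

    def __init__(self, *size, generator=None):
        self.size = size
        self.generator = generator or random.Random()

    def __call__(self):
        # Each call advances the generator's state, so repeated
        # calls draw different samples.
        n = 1
        for s in self.size:
            n *= s
        return [self.generator.gauss(0.0, 1.0) for _ in range(n)]

m = Randn(2, 3)
tensor_1 = m()
tensor_2 = m()
# The two draws differ because the shared generator is stateful.
print(tensor_1 != tensor_2)
```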
If we fully align with PyTorch, no module is needed. When this interface is called without a generator, there are two ways to handle it: (1) use the global generator, but then sbp = broadcast certainly cannot be supported; (2) create a new generator each time. If that new generator's seed never changes, every call will produce the same values; if the seed does change, how do we guarantee that the seeds agree across multi-client processes when sbp is broadcast?

As for the issue you raised about a Randn module being executed multiple times: that need naturally exists. For example, if I build a network with a Randn module and then iterate many times, won't the module be executed many times?
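The seed trade-off above can be demonstrated with `random.Random` as a stand-in generator: re-creating a generator with an unchanging seed repeats the same "random" values on every call, while a seed shared across processes keeps independent clients in lockstep, which is exactly what sbp = broadcast would require.

```python
import random

def randn_fresh_generator(n, seed):
    # Option (2) with an unchanging seed: a brand-new generator per call.
    g = random.Random(seed)
    return [g.gauss(0.0, 1.0) for _ in range(n)]

# Same seed every call -> identical "random" tensors each time.
a = randn_fresh_generator(4, seed=42)
b = randn_fresh_generator(4, seed=42)
print(a == b)  # True: every call repeats the same values

# Conversely, two "clients" that share one seed stay in lockstep,
# which is what broadcast sbp needs across multi-client processes.
client_0 = randn_fresh_generator(4, seed=7)
client_1 = randn_fresh_generator(4, seed=7)
print(client_0 == client_1)  # True
```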