
DL4J: Add something equivalent to Keras' Lambda layer using SameDiff #5431

Closed
AlexDBlack opened this issue Jun 2, 2018 · 3 comments
Labels: DL4J (General DeepLearning4j issues), Enhancement (New features and other enhancements), SameDiff (Autodiff related issues)

Comments

@AlexDBlack (Contributor)

As per title: add a simple way to define a no-parameter layer in DL4J from a single function, using SameDiff, analogous to Keras' Lambda layer.

@AlexDBlack added the Enhancement and DL4J labels Jun 2, 2018
@raver119 (Contributor) commented Jun 2, 2018

We already have this issue, and it's more a matter of JIT magic than a DL4J layer.

We're almost ready to provide a simple version of this for the CPU backend, but the GPU backend will involve a more low-level solution.

@AlexDBlack (Contributor, Author)

> We already have this issue, and it's more a matter of JIT magic than a DL4J layer.

I'm not following. What I want is a simple one-method way to define a no-param layer in DL4J, using SameDiff.
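
For illustration, here's a minimal sketch of what such an API might look like. The SameDiffLambdaLayer base class and defineLayer method below are assumptions made for the sake of the example, not a confirmed existing DL4J API:

```java
import org.nd4j.autodiff.samediff.SDVariable;
import org.nd4j.autodiff.samediff.SameDiff;

// Hypothetical base class: the user overrides a single method that defines the
// layer's forward pass as a SameDiff graph; the layer has no trainable parameters.
public class TimesTwoLayer extends SameDiffLambdaLayer {
    @Override
    public SDVariable defineLayer(SameDiff sd, SDVariable layerInput) {
        // A no-param, element-wise transformation of the layer input
        return layerInput.mul(2.0);
    }
}
```

A layer like this could then be dropped into a network configuration like any other layer, with the backward pass derived automatically from the SameDiff graph rather than hand-written.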

@maxpumperla added the SameDiff label Jun 15, 2018

lock bot commented Sep 21, 2018

This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.

lock bot locked and limited the conversation to collaborators Sep 21, 2018
3 participants