
Reworked Layers Phase 1 #334

Open · wants to merge 1 commit into base: master
Conversation

JimClarke5
Contributor

This is the first set of model layers, fairly simple ones without back propagation.

I have also revamped TestSession, GraphTestSession, and EagerTestSession to use lambdas for the operations common to the Graph and Eager modes. These will need to be updated for #272, "Tensor.dataToString", when that is merged.
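As a rough illustration of sharing test logic between the two modes via a lambda, here is a minimal, self-contained Java sketch. The TestSession, GraphTestSession, and EagerTestSession classes below are simplified placeholders, not the actual classes in this PR; only the general idea (one lambda body run under either mode) is taken from the description above.

```java
import java.util.function.Consumer;

// Simplified placeholder for a test-session base class. The real classes in
// tensorflow/java wrap Graph or Eager execution; here "mode" is just a tag.
abstract class TestSession {
    abstract String mode();

    // Run the same test body (a lambda) under whichever mode this session
    // represents, so Graph and Eager tests can share one set of assertions.
    void run(Consumer<TestSession> body) {
        body.accept(this);
    }
}

final class GraphTestSession extends TestSession {
    @Override String mode() { return "graph"; }
}

final class EagerTestSession extends TestSession {
    @Override String mode() { return "eager"; }
}
```

With this shape, a test can be written once as a lambda and executed against both session types in a loop.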

This PR supersedes #65, "Adding layers (based on keras) supporting multiple outputs".

This PR depends on #191, "TensorFormat enum". I have temporarily created a copy of TensorFormat until #191 is merged.

The layers framework supports multiple inputs and outputs, but some layers are restricted to a single input/output by design (e.g. Dense).

This revision moves the Ops parameter out of the constructors, adds a new init(Ops) method, and adds an Ops parameter to the call method [e.g. call(tf, ...)]. See issue #202.

SequentialLayersTest chains layers together, as in the Sequential model.
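As a rough sketch of the pattern described above (Ops supplied via init(Ops) and call(tf, ...) rather than via the constructor, with layers chained Sequential-style), here is a minimal, self-contained Java example. The Ops class below is a local stand-in for org.tensorflow.op.Ops, and Layer, UpperCase, Bracket, and Sequential are simplified placeholders, not the actual classes in this PR.

```java
// Local stand-in for org.tensorflow.op.Ops (placeholder, not the real class).
final class Ops {
    final String scopeName;
    Ops(String scopeName) { this.scopeName = scopeName; }
}

// Simplified layer base class: no Ops in the constructor; the Ops instance
// arrives later via init(Ops) and is also passed on every call(tf, ...).
abstract class Layer {
    private Ops tf;

    protected Layer() {}                        // configuration only, no Ops needed

    public void init(Ops tf) { this.tf = tf; }  // late binding (cf. issue #202)

    protected Ops getTF() { return tf; }

    public abstract String call(Ops tf, String input);
}

// Toy layer: uppercases its input (stands in for a real op).
final class UpperCase extends Layer {
    @Override public String call(Ops tf, String input) { return input.toUpperCase(); }
}

// Toy layer: wraps its input in brackets.
final class Bracket extends Layer {
    @Override public String call(Ops tf, String input) { return "[" + input + "]"; }
}

// Sequential-style chaining, in the spirit of SequentialLayersTest.
final class Sequential {
    private final java.util.List<Layer> layers = new java.util.ArrayList<>();

    Sequential add(Layer l) { layers.add(l); return this; }

    String call(Ops tf, String input) {
        String out = input;
        for (Layer l : layers) {
            l.init(tf);               // bind Ops outside the constructor
            out = l.call(tf, out);    // pass tf explicitly on each call
        }
        return out;
    }
}
```

The key point of the design is that a layer can be constructed with only its configuration, and the same instance can later be bound to whichever Ops (Graph or Eager) the model is built with.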

…CTORs to init(Ops) and added call(Ops, ...).
@JimClarke5
Contributor Author

@karllessard the ops.pb conflict is because I moved the nn.raw ops to nn, and moved the higher-level nn ops to framework. This is a generated file, so I am not sure how to resolve the conflict.

@karllessard
Collaborator

karllessard commented Jul 6, 2021

@deansher , any chance you can take a look at this work?

BTW @JimClarke5, you'll need to rebase it. As for ops.pb, moving nn.raw to nn should normally have no impact, since that change is at the API definition level, which is not part of ops.pb. Maybe just reset that file to the original version on master and discard it from your PR.

@deansher
Contributor

deansher commented Jul 9, 2021

Thank you, @karllessard, for inviting me to review. Lately I have been exploring other aspects of deep learning, which doesn't leave me time to participate in tensorflow/java for now.

@karllessard
Collaborator

@JimClarke5 there are a lot of conflicts in this PR; I think it's because we merged a few of your other PRs before it. Can you please rebase so I can start taking a look at it?
