
RandomHeight Preprocessing Layer #7483

Merged: 30 commits merged into tensorflow:master on Mar 23, 2023

Conversation

@RWallie (Contributor) commented Mar 15, 2023

RandomHeight Preprocessing Layer
Co-authored-by:
Natalie Umanzor (@numanzor) umanzor2949@gmail.com
Silvia Kocsis (@Silvia42) silvia.kocsis@gmail.com
Ryan Wallace (@RWallie) ryanwallace1396@gmail.com



RWallie and others added 23 commits February 6, 2023 09:35
…ctins for preprocessing layers

Co-authored-by: Silvia Kocsis <silvia.kocsis@gmail.com>
Co-authored-by: Natalie Umanzor <umanzor2949@gmail.com>
Co-authored-by: Silvia Kocsis <silvia.kocsis@gmail.com>
Co-authored-by: Natalie Umanzor <umanzor2949@gmail.com>
…d handles randomization of width

Co-authored-by: Silvia Kocsis <silvia.kocsis@gmail.com>
Co-authored-by: Natalie Umanzor <umanzor2949@gmail.com>
Co-authored-by: Silvia Kocsis <silvia.kocsis@gmail.com>
Co-authored-by: Natalie Umanzor <umanzor2949@gmail.com>
Co-authored-by: Silvia Kocsis <silvia.kocsis@gmail.com>
Co-authored-by: Natalie Umanzor <umanzor2949@gmail.com>
Co-authored-by: Natalie Umanzor <umanzor2949@gmail.com>
Co-authored-by: Ryan Wallace <ryanwallace1396@gmail.com>
…ss, got rid of rng_types as all random distribution functions seem to be stateless

Co-authored-by: Natalie Umanzor <umanzor2949@gmail.com>
Co-authored-by: Ryan Wallace <ryanwallace1396@gmail.com>
…nd stores random distribution functions as properties

Co-authored-by: Natalie Umanzor <umanzor2949@gmail.com>
Co-authored-by: Ryan Wallace <ryanwallace1396@gmail.com>
…mLayer

Co-authored-by: Natalie Umanzor <umanzor2949@gmail.com>
Co-authored-by: Ryan Wallace <ryanwallace1396@gmail.com>
Co-authored-by: Silvia Kocsis <silvia.kocsis@gmail.com>
Co-authored-by: Natalie Umanzor <umanzor2949@gmail.com>
Co-authored-by: Silvia Kocsis <silvia.kocsis@gmail.com>
Co-authored-by: Natalie Umanzor <umanzor2949@gmail.com>
Co-authored-by: Natalie Umanzor <umanzor2949@gmail.com>
Co-authored-by: Ryan Wallace <ryanwallace1396@gmail.com>
Co-authored-by: Natalie Umanzor <umanzor2949@gmail.com>
Co-authored-by: Ryan Wallace <ryanwallace1396@gmail.com>
Co-authored-by: Natalie Umanzor <umanzor2949@gmail.com>
Co-authored-by: Ryan Wallace <ryanwallace1396@gmail.com>
…, and added test for RandomSeed

Co-authored-by: Natalie Umanzor <umanzor2949@gmail.com>
Co-authored-by: Ryan Wallace <ryanwallace1396@gmail.com>
Co-authored-by: Natalie Umanzor <umanzor2949@gmail.com>
Co-authored-by: Ryan Wallace <ryanwallace1396@gmail.com>
Co-authored-by: Natalie Umanzor <umanzor2949@gmail.com>
Co-authored-by: Ryan Wallace <ryanwallace1396@gmail.com>
@mattsoulanille self-requested a review March 15, 2023 18:46
@mattsoulanille (Member) left a comment
This looks good. I just have one change to better match Keras's behavior. Thanks!

override computeOutputShape(inputShape: Shape|Shape[]): Shape|Shape[] {
  inputShape = getExactlyOneShape(inputShape);
  const numChannels = inputShape[2];
  return [this.adjustedHeight, this.imgWidth, numChannels];
mattsoulanille (Member) commented:

The Keras implementation sets the height index to None.

>>> random_height = keras.layers.RandomHeight([0.5, 0.7])
>>> random_height.compute_output_shape([1, 224, 224, 3])
TensorShape([1, None, 224, 3])

I think this is to indicate that the height of a generic tensor that this layer could create is unknown. I don't think it should be specific to the last tensor the layer created. The equivalent to None in a shape in TFJS is -1, so the above code, if written in TFJS, should return [1, -1, 224, 3]. For your code, you can just return [-1, this.imgWidth, numChannels].

This should also let you remove the this.adjustedHeight variable from the class and make it local to the call function.
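
Concretely, a minimal sketch of what the updated method could look like after this change (reusing the Shape, getExactlyOneShape, and imgWidth names from the snippet above; not necessarily the exact final diff):

override computeOutputShape(inputShape: Shape|Shape[]): Shape|Shape[] {
  inputShape = getExactlyOneShape(inputShape);
  const numChannels = inputShape[2];
  // The height is randomized on every call, so report it as unknown (-1),
  // mirroring Keras returning None for that dimension.
  return [-1, this.imgWidth, numChannels];
}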

mattsoulanille (Member) commented:

I think I missed this in my review of RandomWidth. Could you add the change there as well? Thanks!

Comment on lines 129 to 130
this.adjustedHeight = this.heightFactor.dataSync()[0] * imgHeight;
this.adjustedHeight = Math.round(this.adjustedHeight);

Suggested change
this.adjustedHeight = this.heightFactor.dataSync()[0] * imgHeight;
this.adjustedHeight = Math.round(this.adjustedHeight);
let adjustedHeight = this.heightFactor.dataSync()[0] * imgHeight;
adjustedHeight = Math.round(adjustedHeight);


const size:[number, number] = [this.adjustedHeight, this.imgWidth];

Suggested change
const size:[number, number] = [this.adjustedHeight, this.imgWidth];
const size:[number, number] = [adjustedHeight, this.imgWidth];

Silvia42 and others added 3 commits March 17, 2023 13:27
…OutputShape.

Co-authored-by: Natalie Umanzor <umanzor2949@gmail.com>
Co-authored-by: Ryan Wallace <ryanwallace1396@gmail.com>
@RWallie (Contributor, Author) commented Mar 17, 2023

Hey @mattsoulanille! Made the changes in RandomHeight and RandomWidth. The adjusted dimension is now local to the call method, and computeOutputShape returns -1 for that dimension, as you suggested, to better match Keras's behavior. Thank you!
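
For anyone skimming the thread, the resulting pattern inside call() looks roughly like the sketch below (assuming the heightFactor, imgHeight, and imgWidth names from the snippets above; the resize call is illustrative, not the exact merged code):

// adjustedHeight is now a local inside call(), not a class field.
const adjustedHeight = Math.round(this.heightFactor.dataSync()[0] * imgHeight);
const size: [number, number] = [adjustedHeight, this.imgWidth];
// The input is then resized to `size` (e.g. with tf.image.resizeBilinear),
// while computeOutputShape reports the height dimension as -1 (unknown).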

@mattsoulanille (Member) left a comment

Looks great! LGTM!

@mattsoulanille merged commit 6723b22 into tensorflow:master Mar 23, 2023