Conversation
nice :)
if (!util.arraysEqual(config.dims.slice().sort(), expectedSortedIndices)) {
  throw new Error(
      'Invalid permutation `dims`: ' + JSON.stringify(config.dims) +
      ' `dims` must contain consecutive integers starting from 1.');
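For context, here is a self-contained sketch of the check under discussion. The `arraysEqual` helper below is a stand-in for tfjs-layers' internal `util.arraysEqual`, and `expectedSortedIndices` is assumed to be `[1, 2, ..., rank]`; this is an illustration, not the actual library code:

```typescript
// Hypothetical stand-in for tfjs-layers' internal util.arraysEqual.
function arraysEqual(a: number[], b: number[]): boolean {
  return a.length === b.length && a.every((v, i) => v === b[i]);
}

// Validate that `dims` is a permutation of [1, 2, ..., dims.length],
// i.e. consecutive integers starting from 1 (the batch dim, index 0,
// is excluded from the permutation).
function validatePermuteDims(dims: number[]): void {
  const expectedSortedIndices = dims.map((_, i) => i + 1);
  // Numeric comparator added here; the snippet above relies on the
  // default sort, which is fine for small single-digit indices.
  const sorted = dims.slice().sort((a, b) => a - b);
  if (!arraysEqual(sorted, expectedSortedIndices)) {
    throw new Error(
        'Invalid permutation `dims`: ' + JSON.stringify(dims) +
        ' `dims` must contain consecutive integers starting from 1.');
  }
}

validatePermuteDims([2, 1, 3]);  // OK: a permutation of [1, 2, 3].
// validatePermuteDims([1, 4, 3]) would throw: 4 is out of range.
```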
Nice error string :)
Maybe add the expected last index as well? Or how many were expected (I guess that's the same number anyway).
I was surprised that the batch dim is always skipped. May want to add something to that effect as well: "(The batch dim is always unchanged by 'Permute')"
@@ -375,33 +373,33 @@ describeMathCPUAndGPU('Activation Layer: Tensor', () => {

   it('linear', () => {
     const x = mul(scalar(10), ones(inputShape));
-    const layer = new Activation({activation: 'linear'});
+    const layer = tfl.layers.activation({activation: 'linear'});
Thanks for fixing these
Thanks for the review.
Reviewable status: complete! 1 of 1 approvals obtained (waiting on @caisq and @davidsoergel)
src/layers/core.ts, line 670 at r1 (raw file):
Previously, bileschi (Stanley Bileschi) wrote…
I was surprised that the batch dim is always skipped. may want to add something to that effect as well. "(The batch dim is always unchanged by 'Permute')"
Re. your first comment above, it's not easy to guess what the user really wants when the error happens. For example, say the user provides dims: [1, 4, 3]. How do we know if the user really wants [1, 3, 2] or [1, 2, 4, 3]? What's stated in the error message is the best we can guess.
I was a little surprised by that as well. But on second thought, this keras convention makes sense:
- It is exceedingly rare that someone wants to permute the batch dimension with other dimensions.
- It makes sense to use the natural dimension index, i.e., the first non-batch dimension is referred to as 1. The alternative (referring to the first non-batch dimension as 0) would be very confusing.
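To illustrate that convention, here is a minimal, hypothetical sketch (not the actual tfjs-layers implementation) of how 1-based `dims` reorder only the non-batch axes of a full input shape:

```typescript
// Hypothetical sketch of Permute's output-shape rule: the batch dim
// (index 0 of the full shape) is untouched; dims[i] is the 1-based
// index of the input axis that moves to output position i + 1.
// Because dims are 1-based and axis 0 is the batch dim, dims[i]
// indexes directly into the full input shape.
function permuteShape(inputShape: number[], dims: number[]): number[] {
  const outputShape = inputShape.slice();
  dims.forEach((dim, i) => {
    outputShape[i + 1] = inputShape[dim];
  });
  return outputShape;
}

// Batch size 32 is unchanged; dims [2, 1] swap the two non-batch axes.
permuteShape([32, 10, 64], [2, 1]);  // → [32, 64, 10]
```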
src/layers/core_test.ts, line 376 at r1 (raw file):
Previously, bileschi (Stanley Bileschi) wrote…
Thanks for fixing these
Ack.
FEATURE
Fixes: tensorflow/tfjs#577