Conversation

@vguerra (Contributor) commented Oct 6, 2019

  • Inverse trigonometric functions.
  • Exponents and logarithms.
  • Hyperbolic functions.
  • Error functions (erf, erfc).
  • Free generic functions: sqrt, fma.

Partially resolves TF-812.
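
For context, each of these derivatives follows the same VJP (vector-Jacobian product) shape: return the original result together with a pullback closure that scales the incoming cotangent. A minimal sketch of that shape for exp, assuming the branch's @usableFromInline convention (illustrative only, not the PR's exact code):

@usableFromInline
func _vjpExp(_ x: Float) -> (Float, (Float) -> Float) {
  let value = exp(x)
  // d/dx exp(x) = exp(x), so the pullback just scales v by the computed value.
  return (value, { v in v * value })
}

At x = 2 this yields e * e ≈ 7.3890561, which is what the gradient test below asserts.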

% for T in ['Float', 'Float80']:
MathTests.test("gradient_${T}") {
expectEqualWithTolerance(7.3890560989306502274, gradient(at: 2.0 as ${T}, in: exp), ulps:16)
expectEqualWithTolerance(2.772588722239781145, gradient(at: 2.0 as ${T}, in: exp2), ulps:16)
Contributor:

Suggested change
expectEqualWithTolerance(2.772588722239781145, gradient(at: 2.0 as ${T}, in: exp2), ulps:16)
expectEqualWithTolerance(2.772588722239781145, gradient(at: 2.0 as ${T}, in: exp2), ulps: 16)

Same for all other occurrences.

Contributor Author:

fixed in c92e34b


@usableFromInline
func _vjpLog2(_ x: ${T}) -> (${T}, (${T}) -> ${T}) {
return (log2(x), { v in v / (${T}(M_LN2) * x)})
Contributor:

Indent by 2.

Contributor Author:

fixed in c92e34b

@vguerra (Contributor Author) commented Oct 7, 2019

I pushed 62c019b, which implements derivatives for sqrt and fma. Thanks @rxwei for your explanation on the forums.
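
For reference: fma(x, y, z) = x * y + z, so its partials are (y, x, 1) and the pullback is { v in (v * y, v * x, v) }; for sqrt, d/dx sqrt(x) = 1 / (2 * sqrt(x)). A quick sanity check using the gradient(at:in:) API exercised in the tests below (hypothetical snippet, not part of the PR):

// Partials of fma at (4, 5, 6) are (y, x, 1) = (5, 4, 1).
let fmaGrad = gradient(at: 4.0 as Double, 5.0 as Double, 6.0 as Double,
                       in: { x, y, z in fma(x, y, z) })
// fmaGrad == (5.0, 4.0, 1.0)

// Derivative of sqrt at 2 is 1 / (2 * sqrt(2)) ≈ 0.35355339.
let sqrtGrad = gradient(at: 2.0 as Double, in: { sqrt($0) })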

}

@_transparent
@differentiable(
Contributor:

Suggested change
@differentiable(
// SWIFT_ENABLE_TENSORFLOW
@differentiable(

Contributor Author:

done in 27ca6a2

}

@_transparent
@differentiable(
Contributor:

Suggested change
@differentiable(
// SWIFT_ENABLE_TENSORFLOW
@differentiable(

Contributor Author:

done in 27ca6a2

@rxwei (Contributor) commented Oct 7, 2019

@swift-ci please test tensorflow

2 similar comments
@rxwei (Contributor) commented Oct 7, 2019

@swift-ci please test tensorflow

@rxwei (Contributor) commented Oct 7, 2019

@swift-ci please test tensorflow

_ z: T
) -> (T, (T) -> (T, T, T)) where T == T.TangentVector {
return (fma(x, y, z),
{v in return (v * y, v * x, v)})
Contributor:

  • Move this to the end of the previous line.
  • Add a space after each { and before each }. Same for all other occurrences.
  • Omit return when a closure has a single expression.
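
Applied to the fragment above, the return would read: return (fma(x, y, z), { v in (v * y, v * x, v) }).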

Contributor Author:

done in 6b1ebf2

_ x: T
) -> (T, (T) -> T) where T == T.TangentVector {
let value = x.squareRoot()
return (value, {v in (1 / 2) * ( 1 / value) * v})
Contributor:

Suggested change
return (value, {v in (1 / 2) * ( 1 / value) * v})
return (value, { v in v / (2 * value) })
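(Mathematically equivalent, since (1/2) * (1/value) * v == v / (2 * value); the suggested form just folds the constants.)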

Contributor Author:

done in 6b1ebf2


@usableFromInline
func _vjpLog10(_ x: ${T}) -> (${T}, (${T}) -> ${T}) {
return (log10(x), { v in v * ${T}(M_LOG10E) / x})
Contributor:

Suggested change
return (log10(x), { v in v * ${T}(M_LOG10E) / x})
return (log10(x), { v in v * ${T}(M_LOG10E) / x })

Contributor Author:

done in 6b1ebf2


@usableFromInline
func _vjpLog2(_ x: ${T}) -> (${T}, (${T}) -> ${T}) {
return (log2(x), { v in v / (${T}(M_LN2) * x)})
Contributor:

Suggested change
return (log2(x), { v in v / (${T}(M_LN2) * x)})
return (log2(x), { v in v / (${T}(M_LN2) * x) })

Contributor Author:

done in 6b1ebf2


@usableFromInline
func _vjpLog1p(_ x: ${T}) -> (${T}, (${T}) -> ${T}) {
return (log1p(x), { v in v / ( x + 1) })
Contributor:

Suggested change
return (log1p(x), { v in v / ( x + 1) })
return (log1p(x), { v in v / (x + 1) })

Contributor Author:

done in 6b1ebf2

expectEqualWithTolerance(0.020666985354092053575, gradient(at: 2.0 as ${T}, in: erf), ulps: 16)
expectEqualWithTolerance(-0.020666985354092053575, gradient(at: 2.0 as ${T}, in: erfc), ulps: 16)
expectEqualWithTolerance(0.35355339059327376222, gradient(at: 2.0 as ${T}, in: {x in sqrt(x)}), ulps: 16)
let fma_grad = gradient(at: 4.0 as ${T}, 5.0 as ${T}, 6.0 as ${T}, in: {x, y, z in fma(x, y, z)})
Contributor:

Suggested change
let fma_grad = gradient(at: 4.0 as ${T}, 5.0 as ${T}, 6.0 as ${T}, in: {x, y, z in fma(x, y, z)})
let fmaGrad = gradient(at: 4.0 as ${T}, 5.0 as ${T}, 6.0 as ${T}, in: { x, y, z in fma(x, y, z) })

Contributor Author:

done in 6b1ebf2

expectEqualWithTolerance(1.3333333333333333334, gradient(at: 0.5 as ${T}, in: atanh), ulps: 16)
expectEqualWithTolerance(0.020666985354092053575, gradient(at: 2.0 as ${T}, in: erf), ulps: 16)
expectEqualWithTolerance(-0.020666985354092053575, gradient(at: 2.0 as ${T}, in: erfc), ulps: 16)
expectEqualWithTolerance(0.35355339059327376222, gradient(at: 2.0 as ${T}, in: {x in sqrt(x)}), ulps: 16)
Contributor:

Suggested change
expectEqualWithTolerance(0.35355339059327376222, gradient(at: 2.0 as ${T}, in: {x in sqrt(x)}), ulps: 16)
expectEqualWithTolerance(0.35355339059327376222, gradient(at: 2.0 as ${T}, in: { sqrt($0) }), ulps: 16)

Contributor Author:

done in 6b1ebf2

expectEqualWithTolerance(0.020666985354092053575, gradient(at: 2.0 as ${T}, in: erf), ulps: 16)
expectEqualWithTolerance(-0.020666985354092053575, gradient(at: 2.0 as ${T}, in: erfc), ulps: 16)
expectEqualWithTolerance(0.35355339059327376222, gradient(at: 2.0 as ${T}, in: {x in sqrt(x)}), ulps: 16)
let fma_grad = gradient(at: 4.0 as ${T}, 5.0 as ${T}, 6.0 as ${T}, in: {x, y, z in fma(x, y, z)})
Contributor:

Always use camel case.

Suggested change
let fma_grad = gradient(at: 4.0 as ${T}, 5.0 as ${T}, 6.0 as ${T}, in: {x, y, z in fma(x, y, z)})
let fmaGrad = gradient(at: 4.0 as ${T}, 5.0 as ${T}, 6.0 as ${T}, in: {x, y, z in fma(x, y, z)})

Contributor Author:

done in 6b1ebf2

@rxwei requested a review from stephentyrone on October 7, 2019
@rxwei (Contributor) commented Oct 7, 2019

@stephentyrone Could you help review the usage of math constants in these derivatives?
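
For reference, the identities involved: d/dx log10(x) = log10(e) / x (hence M_LOG10E / x) and d/dx log2(x) = 1 / (x * ln 2) (hence 1 / (M_LN2 * x)). A hypothetical finite-difference cross-check of the constants (not part of the PR):

import Foundation

let h = 1e-6
// log2 at x = 2: analytic value 1 / (2 * M_LN2) ≈ 0.7213475.
let log2Analytic = 1.0 / (2.0 * M_LN2)
let log2Numeric = (log2(2.0 + h) - log2(2.0 - h)) / (2 * h)
// log10 at x = 2: analytic value M_LOG10E / 2 ≈ 0.2171472.
let log10Analytic = M_LOG10E / 2.0
let log10Numeric = (log10(2.0 + h) - log10(2.0 - h)) / (2 * h)
print(log2Analytic, log2Numeric, log10Analytic, log10Numeric)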

@rxwei changed the title from "Defines derivatives for remaning tgmath math functions." to "Defines derivatives for remaining tgmath math functions." on Oct 7, 2019
@rxwei (Contributor) commented Oct 8, 2019

@swift-ci please test tensorflow

1 similar comment
@rxwei (Contributor) commented Oct 8, 2019

@swift-ci please test tensorflow

@rxwei merged commit c76bde9 into swiftlang:tensorflow on Oct 9, 2019
@vguerra deleted the TF-812 branch on October 30, 2019