
Decimal scaling only occurs on op vs create #399

Closed
briandbecker opened this issue Apr 11, 2019 · 3 comments · Fixed by #439
Labels: broken (Product is broken and must be fixed), component/navigator
Milestone: Maintenance

Comments

@briandbecker

Disclaimer: I just started working with DAML in the last few days, after its open source release, so perhaps I'm missing something.

When working with the IOU sample contract in the Getting Started guide, I am able to split Alice's 100 IOU into an IOU of 100 and one of 0.00000000005. The very small value of 0.00000000005 should fall outside the scale of Decimal defined in the documentation (scale 10); however, it is not rejected: the result appears to be a valid IOU of 100 and a second of 0.00000000005.

Steps to reproduce:

  1. Go through the quickstart guide; after running the IOU project, start Navigator.
  2. Log in to the Navigator application as Alice.
  3. Select the 100 IOU and perform a split operation.
  4. Provide 0.00000000005 as the split amount.
  5. You now have two IOUs: one of 100 and one of 0.00000000005.
  6. Do another split the same way; you end up with three IOUs: 1 x 100 and 2 x 0.00000000005.

  7. Do a merge on the two 0.00000000005 IOUs, which results in an IOU of 0.0000000001, now within the documented scale.
  8. This can then be merged back into the 100 IOU, which results in a single IOU of 100.0000000001.

Solutions:

  1. Always perform an op on user-provided values. For example, update the split code of the IOU contract (https://github.com/digital-asset/daml/blob/05e691f55852fbb207f0e43cf23bb95b95866ba3/docs/source/getting-started/quickstart/template-root/daml/Iou.daml) so that an op is forced (less than ideal, because it is counterintuitive and would have to be applied broadly):
    splitCid <- create this with amount = splitAmount + 0.0

  2. I'm going way outside my comfort zone here, but Decimal appears to be defined in https://github.com/digital-asset/daml/blob/05e691f55852fbb207f0e43cf23bb95b95866ba3/daml-lf/data/src/main/scala/com/digitalasset/daml/lf/data/Decimal.scala, where you can see that the scale (line 36) is only applied during the check op. The check should instead be applied during construction; otherwise contracts that never perform an op on user-provided values, but simply store them, can end up holding BigDecimal values that exceed the valid scale (a rough sketch of this idea follows below).
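
To make option 2 concrete, here is a minimal sketch in Scala over plain java.math.BigDecimal of what a construction-time scale check could look like. The object and function names are hypothetical and the rejection behavior is an assumption for illustration; this is not the actual daml-lf API.

    import java.math.{BigDecimal => JBigDecimal}

    // Hypothetical sketch: validate the scale when a user-provided value is
    // constructed, instead of relying on a later arithmetic op to round it.
    object ConstructorCheckSketch {
      val maxScale = 10 // documented scale of Decimal

      def fromUserInput(s: String): Either[String, JBigDecimal] = {
        val d = new JBigDecimal(s)
        if (d.scale > maxScale)
          Left(s"more than $maxScale digits after the decimal point: $s")
        else
          Right(d.setScale(maxScale)) // widening the scale never rounds
      }

      def main(args: Array[String]): Unit = {
        println(fromUserInput("0.00000000005")) // Left(...): rejected up front
        println(fromUserInput("100"))           // Right(100.0000000000)
      }
    }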

@francesco-da (Contributor)

@briandbecker thanks for the high-quality bug report. We will take a look at this soon.

francesco-da added this to the Maintenance milestone Apr 11, 2019
francesco-da added the "broken" (Product is broken and must be fixed) label Apr 12, 2019
francesco-da added commits that referenced this issue Apr 12, 2019
francesco-da added a commit that referenced this issue Apr 14, 2019
* fix various conversion functions from string to Decimal

Fixes #399.

This fixes a critical bug -- since:

* The DAML-LF spec specifies that the scale of `Decimal` is 10 --
    that is, there are at most 10 digits past the decimal point:
    <https://github.com/digital-asset/daml/blob/79bbf5c794530ed7e29e61fcc2d98221cdb221e3/daml-lf/spec/value.rst#field-decimal>.
* However, the code converting from the string representation that
    we get on the wire was _not_ performing this check. This is due
    to two reasons:

    - `Decimal.check` is a function that checks whether a given
        `Decimal` is within the upper and lower bounds of what the
        DAML-LF spec specifies, but crucially it does _not_ check that
        the scale is not exceeded:
        <https://github.com/digital-asset/daml/blob/79bbf5c794530ed7e29e61fcc2d98221cdb221e3/daml-lf/data/src/main/scala/com/digitalasset/daml/lf/data/Decimal.scala#L31>.
        This behavior is correct in some cases (more on that later),
        but not in others. Crucially, `Decimal.fromString`, which was
        supposed to check if a decimal string literal is valid, used
        this function, which means that it accepted string literals
        containing numbers out of the required scale, rounding them to
        make them fit within the scale. This function has been renamed
        to `Decimal.checkWithinBoundsAndRound`, and a new function
        `Decimal.checkWithinBoundsAndWithinScale` has been added, which
        fails if the number provided has data not within the scale.
        `Decimal.fromString` now uses
        `Decimal.checkWithinBoundsAndWithinScale`.

    - `ApiToLfEngine.parseDecimal` assumed that `Decimal.fromString` _did_
        fail when passed numbers that were in any way invalid, and
        moreover did _not_ use the result of `Decimal.fromString`, but rather
        re-parsed the string into an unconstrained `BigDecimal`:
        <https://github.com/digital-asset/daml/blob/79bbf5c794530ed7e29e61fcc2d98221cdb221e3/ledger/ledger-api-common/src/main/scala/com/digitalasset/platform/participant/util/ApiToLfEngine.scala#L96>.
        The reason for the code to work this way is that in the past
        it was responsible for converting decimal strings both for the
        current engine but also for an older version of the engine which
        handled decimals of a different type. Both issues have been fixed.

* Therefore, `Decimal`s with scale exceeding the specified scale
    made it into the engine, and contracts could be created containing
    this invalid value.

* Once on the ledger, these bad numbers can be used to produce extremely
    surprising results, due to how `Decimal` operations are
    implemented. Generally speaking, all operations on `Decimal`
    first compute the result and then run the output through
    `Decimal.checkWithinBoundsAndRound`. The reason for this behavior
    is that we specify multiplication and division as rounding their
    output. Consider the case where the bad value 0.00000000005 made
    it to the engine, and is then added to 100. The full-precision
    result will be 100.00000000005, which after rounding becomes 100.
    Therefore, on a ledger where such invalid values exist, it is not
    the case that `y > 0 ==> x + y != x`, and so on (see the sketch after
    this commit message).

Thanks a bunch to @briandbecker for the excellent bug report.

* fix failing test using too much precision
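
To illustrate the distinction the commit message above draws, here is a minimal sketch in Scala over plain java.math.BigDecimal. The function names mirror the ones mentioned in the message, but the bodies, the rounding mode, and the bound used are assumptions for illustration, not the actual daml-lf implementation.

    import java.math.{BigDecimal => JBigDecimal, RoundingMode}

    object DecimalCheckSketch {
      val maxScale = 10
      // Assumed upper bound, for illustration only: 28 integer digits, 10 fractional.
      val maxValue = new JBigDecimal("1E28").subtract(new JBigDecimal("1E-10"))

      // Round-then-check: what the spec asks of operation *results*, but the
      // wrong thing to run on values arriving over the wire.
      def checkWithinBoundsAndRound(x: JBigDecimal): Option[JBigDecimal] = {
        val r = x.setScale(maxScale, RoundingMode.HALF_EVEN) // rounding mode assumed
        if (r.abs.compareTo(maxValue) <= 0) Some(r) else None
      }

      // Scale-strict check: reject anything that already exceeds scale 10.
      def checkWithinBoundsAndWithinScale(x: JBigDecimal): Option[JBigDecimal] =
        if (x.scale <= maxScale && x.abs.compareTo(maxValue) <= 0)
          Some(x.setScale(maxScale))
        else None

      def main(args: Array[String]): Unit = {
        val bad = new JBigDecimal("0.00000000005")
        println(checkWithinBoundsAndRound(bad))       // Some(0E-10): silently rounded away
        println(checkWithinBoundsAndWithinScale(bad)) // None: rejected, as fromString now does
        // If the bad value has already made it onto the ledger, addition loses it:
        println(checkWithinBoundsAndRound(new JBigDecimal("100").add(bad)))
        // Some(100.0000000000): x + y == x even though y > 0
      }
    }
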
@bitonic (Contributor) commented Apr 15, 2019

@briandbecker thanks again for reporting this. If you're OK with it, we will include you in the "Acknowledgements" section of https://digitalasset.com/security/.

@briandbecker (Author)

@bitonic I am ok with it and appreciate the recognition.
