SR-11514: Decimal.init(_ value: Double) sometimes yields incorrect value
Decimal precision behaves oddly for the number 456.789. When you initialize a Double with 456.789, it appears to hold that number exactly, but converting it with Decimal(456.789) produces 456.78899999999998976, whereas Decimal(string: "456.789")! produces the correct value 456.789.
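A minimal Swift snippet reproducing the discrepancy described above (the values in the comments are those observed in the report; exact output may vary by platform and Foundation version):

```swift
import Foundation

let d = 456.789                              // Double literal
let fromDouble = Decimal(d)                  // lossy: goes through the binary Double value
let fromString = Decimal(string: "456.789")! // exact: parsed directly from the decimal digits

print(fromDouble)                // reporter observed: 456.78899999999998976
print(fromString)                // 456.789
print(fromDouble == fromString)  // false: the two Decimal values differ
```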
I also tried Objective-C, and it works correctly there.
This is mostly expected. `456.789 as Double` has the exact value 456.788999999999987267074175179004669189453125, which is not exactly representable as a `Decimal`, so it gets rounded to a representable `Decimal` value.
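You can inspect the exact value of the nearest Double to 456.789 by formatting it with enough fractional digits (a quick sketch; `String(format:)` is provided by Foundation, and 42 digits suffice to show the full exact expansion):

```swift
import Foundation

// Print the exact decimal expansion of the Double nearest to 456.789.
print(String(format: "%.42f", 456.789))
// 456.788999999999987267074175179004669189453125
```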
That second rounding appears to be slightly off (that is a Foundation bug), but even if Foundation rounded correctly, the result would not be 456.789; it would be something like 456.78899999999998726707417517900466919 instead.