Double converted to Decimal returns a number with trailing fractional digits

Hello there!

I'm using Swift 5, you can check the code below in online playground: https://swiftfiddle.com/x3xli3a3avby7pf6tzuqwaebfa

In the example below, a Double is converted to a Decimal, but instead of returning 0.94 it returns a different number:

let x:Double = 0.94
print(Decimal(x))
> 0.9399999999999997952 // expected: 0.94

At the same time, if I try the same thing with numbers like 0.93 or 0.95, the Decimal conversion returns exactly those numbers.
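For context: 0.94 has no exact binary representation, so the Double already stores the nearest representable value, and Decimal(x) converts that approximation faithfully. A minimal sketch showing that parsing the digits directly, via Foundation's failable Decimal(string:) initializer, skips the Double round trip:

```swift
import Foundation

// 0.94 cannot be represented exactly in binary, so the Double
// already holds the nearest representable value; Decimal(_: Double)
// then converts that approximation faithfully.
let x: Double = 0.94
print(Decimal(x))            // 0.9399999999999997952

// Parsing the decimal digits directly avoids the Double round trip.
let d = Decimal(string: "0.94")!
print(d)                     // 0.94
```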

The problem this creates is that when a similar operation is done with a formatter, it returns an unexpected number:

let formatter = NumberFormatter()
formatter.numberStyle = .decimal

let ynumber = formatter.number(from: y)
print("formatted number: ", ynumber)
print("formatted decimal value: ", ynumber?.decimalValue)
> formatted number:  Optional(0.94)
> formatted decimal value: Optional(0.9399999999999997952) // expected Optional(0.94)

Is there a way to make Decimal always return the same number as the value it is initialized with?

Thanks!

  • Hence the reason for visual formatters. Don't use the Decimal type unless you really need to.

Add a Comment

Accepted Reply

This will work.

let formatter = NumberFormatter()
formatter.numberStyle = .decimal

let x : Double = 0.94
let string = formatter.string(for: x) ?? "?"
print("x as string =", string)

let num = try Decimal(string, format: .number)
print("As decimal =", num)

let numPlus10 = num + 10
print("numPlus10 =", numPlus10) // To show it is effectively a number, not a String

You get:

  • x as string = 0.94
  • As decimal = 0.94
  • numPlus10 = 10.94
  • Thank you! This solution worked as expected; I had missed this throwing initializer init(_:format:lenient:) in the docs.

Add a Comment

Replies

This should work:

let x : Double = 0.94
print(Decimal(x))

let formatter = NumberFormatter()
formatter.numberStyle = .decimal

let string = formatter.string(for: x) ?? "?"
print(string)

giving:

  • 0.9399999999999997952
  • 0.94

That's for display. Of course, you should keep the raw value x if you need to reuse.
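If the goal is a Decimal with a known scale rather than a display string, another sketch (assuming two fractional digits is the intended precision) is to round the Decimal in place with NSDecimalRound instead of round-tripping through a string:

```swift
import Foundation

// Keep the raw Double, and when a Decimal is needed, round it to
// the intended scale rather than converting via a string.
let x: Double = 0.94
var raw = Decimal(x)                        // 0.9399999999999997952
var rounded = Decimal()
NSDecimalRound(&rounded, &raw, 2, .plain)   // scale 2, standard rounding
print(rounded)                              // 0.94
```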

Note: how is y defined in

let ynumber = formatter.number(from: y)

But

let ynumber = formatter.number(from: string) ?? 0.0

gives:

  • 0.9399999999999999

because of the precision limits of the Double conversion (with the default settings, number(from:) returns an NSNumber backed by a Double).

  • Sorry, I didn't put that in: let y = "0.94". In the end I need to convert String to NSDecimalNumber but it seems to be impossible without converting it to Decimal first. So, the result shouldn't be a String, it should be a Decimal or NSDecimalNumber.
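A sketch for going straight from String to NSDecimalNumber, without a Double intermediate: NSDecimalNumber(string:) parses the digits directly, and NumberFormatter also has a generatesDecimalNumbers property that makes number(from:) return NSDecimalNumber (though whether that path always preserves full decimal precision has been reported to vary, so the string initializer is the safer bet):

```swift
import Foundation

// Direct String -> NSDecimalNumber, no Double intermediate.
let direct = NSDecimalNumber(string: "0.94")
print(direct)  // 0.94

// Alternatively, ask the formatter itself for decimal-backed numbers.
let formatter = NumberFormatter()
formatter.numberStyle = .decimal
formatter.generatesDecimalNumbers = true
let parsed = formatter.number(from: "0.94")  // NSDecimalNumber when parsing succeeds
print(parsed?.decimalValue ?? 0)
```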

Add a Comment

Hence the reason for visual formatters. Don't use the Decimal type unless you really need to.

var x:Double = 0.94
var result = ""
print(x, terminator: "", to: &result)
print(result) // 0.94
