How to understand the type conversion in Int + Double?

Extending the example from the “Integer and Floating-Point Conversion” section of “The Swift Programming Language” (Swift 5.5):


3 + 0.14 // allowed

let three = 3
let rest = 0.14

3 + rest // allowed
0.14 + three // compile error
three + 0.14 // compile error

I don’t understand why the last two lines are treated as compile errors. Can anyone help to explain a bit? Thanks.

Accepted Answer by OOPer

I don’t understand why the last two lines are treated as compile errors. Can anyone help to explain a bit?

You may need to know two things.

  • In Swift, addition of Int and Double is not allowed. (The binary operator + is not defined for (Int, Double) or for (Double, Int).)

  • In Swift, the type of a literal is determined by the context in which it appears. (Both points are illustrated in the sketch after this list.)

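Both points can be checked directly. A minimal sketch (the variable names here are just for illustration):

let i: Int = 1
let d: Double = 0.5

// i + d               // error: binary operator '+' cannot be applied to
//                     // operands of type 'Int' and 'Double'

let ok = Double(i) + d // 1.5; an explicit conversion makes both operands Double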

In your first example:

3 + 0.14 // allowed

Here 3 is interpreted as Double (this may not be what you expect), and 0.14 is interpreted as Double as well.

The integer literal 3 can be interpreted both as Int and as Double depending on the context.
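
For example, the same literal can end up with different types (a short sketch):

let a = 3            // no other context: the integer literal defaults to Int
let b: Double = 3    // the same literal, inferred as Double from the annotation
let c = 3 + 0.14     // here 3 must be a Double, so c is a Double (3.14)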

In this declaration:

let three = 3

The type of three is inferred as Int, since there is no type hint in the declaration, and the type of rest is inferred as Double.
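
You can confirm the inferred types with type(of:):

let three = 3             // as above
let rest = 0.14
print(type(of: three))    // Int
print(type(of: rest))     // Double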

In the following line, 3 is again inferred as Double from its context.

3 + rest // allowed

Thus, the last two lines cause errors:

0.14 + three // compile error
three + 0.14 // compile error

This is because type inference for three has already finished: it has the fixed type Int, and there is no + overload that accepts an Int and a Double together.
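
If you do want to mix them, convert one operand explicitly so that both sides of + have the same type, for example:

let three = 3
let rest = 0.14

Double(three) + 0.14    // 3.14
rest + Double(three)    // 3.14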


Thanks @OOPer. Much clearer now.
