A possible workaround looks a little bit strange:
Int8(bitPattern: UInt8("ff", radix: 16)!)
I think this is implemented well. It at least matches what you read when you Force Click on it.
... if the value it denotes in the given `radix` is not representable, the result is `nil`
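A minimal illustration of that documented rule (a sketch, assuming a Swift playground; exact optional formatting may differ):
Int8("7f", radix: 16)  // Optional(127), 0x7F fits into Int8
Int8("80", radix: 16)  // nil, 0x80 is 128, which exceeds Int8.max
Int8("-80", radix: 16) // Optional(-128), with the sign the value is representable
UInt8("ff", radix: 16) // Optional(255), fits into UInt8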
Do you think that "ff" is 'not unambiguously representable' as the Int8 value -1?
It is indeed ambiguous. 2's complement is an implementation detail that is only assumed by you because it's always been that way for you.
This is the thing that is not ambiguous:
Int8("-1", radix: 16)
Jessy, did you check the question? I asked why
Int8("ff", radix: 16) == nil
I didn't ask why
Int8("-1", radix: 16) == -1
Please step back a moment. Think about what FF is. FF is 255.
255 cannot be represented by an integer type that can only go up to 127.
You must stop being blinded by anger and understand that I understand what 2's complement is, or you won't be able to understand the rationale of those of us who think what you're asking about should indeed be nil, or throw an error.
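A quick way to check the range argument in a playground (a sketch, not from the original post):
Int8.min               // -128
Int8.max               // 127
UInt8("ff", radix: 16) // Optional(255), fits an unsigned byte
Int8("ff", radix: 16)  // nil, 255 > Int8.max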
Sorry, Jessy, if my question was not clear enough. My expectation is that
Int8("ff", radix: 16) == -1
UInt8("ff", radix: 16) == 255
Int8("11111111", radix: 2) == -1
UInt8("11111111", radix: 2) == 256The requested type is well known, the value is representable ...
If I compare it to
let d = Double(-1) // -1.0
let i = Int(-1) // -1
the compiler is able to recognise what to do without any trouble. A 'byte' represented by the 'binary' string "11111111" could be -1 if I ask for an Int8 value, or 255 if I ask for a UInt8 value. Unfortunately, that doesn't work. My question is why? Is there some good reason for this behavior?
As the documentation states, the initializer uses a regular expression to parse the string. The regex doesn't know what type of value it is being asked to return (8/16/32/64-bit, signed/unsigned). What happens is that "ff" and the radix are passed to the parser and it returns 255, which cannot possibly fit into an 8-bit signed value, whose range is -128 to 127.
What you want to do is this. First convert the hex string to its numeric bit value, then feed that into the signed int constructor, which will use the bit pattern and preserve the sign.
let num = Int8(bitPattern: UInt8("ff", radix: 16)!)
BTW, your last example of UInt8("11111111", radix: 2) equals 255, not 256.
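If the forced unwrap is a concern, the same two-step idea can be written with Optional.map instead (a sketch, not part of the original answer):
let num = UInt8("ff", radix: 16).map { Int8(bitPattern: $0) } // Optional(-1); nil if the string does not parse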
I wonder if, semantically (and performance-wise), it makes more sense represented like this:
Int8(truncatingBitPattern: UInt("baddab00ff", radix: 0x10)!)
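For reference, truncatingBitPattern simply keeps the low-order bits of the wider value (a sketch using the same pre-Swift-4 API; later Swift versions renamed it init(truncatingIfNeeded:)):
let wide = UInt("baddab00ff", radix: 0x10)! // 0xBADDAB00FF
Int8(truncatingBitPattern: wide)            // -1, only the low byte 0xFF survives
Int16(truncatingBitPattern: wide)           // 255, only the low two bytes 0x00FF survive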
Thank you for the answer! The 256 result was my typo, sorry for that (it is clear from the text anyway ...), and yes, I am using
Int8(bitPattern: UInt8("ff", radix: 16)!)
as you can see in my question. I don't like to depend on forced unwrapping, so from a simple single-line solution I finally got something that works, but is also not very nice:
func fromHexaString<T: SignedIntegerType>(hs: String) -> T? {
    // Parse the string as an unsigned bit pattern first.
    if let ui = UInt(hs, radix: 0x10) {
        // Reinterpret that bit pattern as the requested signed type.
        switch T.self {
        case is Int8.Type:
            return Int8(truncatingBitPattern: ui) as? T
        case is Int16.Type:
            return Int16(truncatingBitPattern: ui) as? T
        case is Int32.Type:
            return Int32(truncatingBitPattern: ui) as? T
        case is Int64.Type:
            return Int64(bitPattern: UInt64(ui)) as? T
        case is Int.Type:
            return Int(bitPattern: ui) as? T
        default:
            break
        }
    }
    return nil
}
let i8: Int8? = fromHexaString("ff")
let i16: Int16? = fromHexaString("ffff")
let i32: Int32? = fromHexaString("ffffffff")
let i64: Int64? = fromHexaString("ffffffffffffffff")
let i: Int? = fromHexaString("ffffffffffffffff")
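One caveat with the truncating approach (an observation, not from the original posts): if the string encodes more bits than the requested type holds, the high-order bits are silently dropped instead of returning nil:
let odd: Int8? = fromHexaString("1ff") // Optional(-1), the leading 1 is truncated away rather than reported as an error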
Thank you very much, Jessy! 'truncatingBitPattern' makes a lot of sense to me!