I use the following code to get a `Character` from a Unicode codepoint:

```swift
let c = Character(Unicode.Scalar("12345")!)
```

It seems very complex and clumsy. Is there a shortcut to achieve the same goal?
No.

And the right syntax is

```swift
let c = Character(Unicode.Scalar(12345)!)
```

(`Unicode.Scalar("12345")!` causes a runtime crash: the failable initializer returns `nil` for that string, so force-unwrapping it traps.)
If you often work with `Character` and Unicode code points, you can write an extension of your own:

```swift
extension Character {
    init?(_ codePoint: UInt32) {
        guard let us = Unicode.Scalar(codePoint) else {
            return nil
        }
        self = Character(us)
    }
}

print(Character(12345)!)
```

Or you can propose a new initializer for `Character` at swift.org.
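For what it's worth, here is a quick sketch of why the `guard` in the extension is needed: `Unicode.Scalar(_:)` is failable because not every `UInt32` is a valid Unicode scalar value (surrogates 0xD800...0xDFFF and anything above 0x10FFFF are rejected).

```swift
// Unicode.Scalar(_:) returns nil for values that are not valid
// Unicode scalars, which is exactly what the extension guards on.
print(Unicode.Scalar(12345 as UInt32) != nil)    // true: U+3039 is a valid scalar
print(Unicode.Scalar(0xD800 as UInt32) != nil)   // false: surrogate code point
print(Unicode.Scalar(0x110000 as UInt32) != nil) // false: above U+10FFFF
```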
Presumably you’re doing this a lot, and I’m curious why that is. Can you explain more of the backstory here?
When I run into issues like this I generally find it's because I'm working in the wrong view. For example, if I'm parsing a string in a network protocol, it's better to use the UTF-8 view. The code units are then `UInt8`, and I can initialise them from numeric values directly and from ASCII 'character' values using `init(ascii:)`.
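To illustrate that approach, here's a minimal sketch (the header string and names are my own, not from the thread) that splits a protocol-style line at the first colon by walking the UTF-8 view and comparing `UInt8` code units built with `UInt8(ascii:)`:

```swift
// Parse "name: value" by scanning UTF-8 code units directly.
// UInt8(ascii:) converts an ASCII scalar literal to its byte value.
let header = "Content-Length: 42"
let colon = UInt8(ascii: ":")   // 0x3A
let space = UInt8(ascii: " ")   // 0x20

let utf8 = header.utf8
if let i = utf8.firstIndex(of: colon) {
    let name = String(decoding: utf8[..<i], as: UTF8.self)
    let rest = utf8[utf8.index(after: i)...].drop(while: { $0 == space })
    let value = String(decoding: rest, as: UTF8.self)
    print(name)   // Content-Length
    print(value)  // 42
}
```

No `Character` or `Unicode.Scalar` values are ever created here; everything stays in `UInt8`, which is cheap and avoids grapheme-cluster processing entirely.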
Share and Enjoy
—
Quinn “The Eskimo!”
Apple Developer Relations, Developer Technical Support, Core OS/Hardware
let myEmail = "eskimo" + "1" + "@apple.com"