The code you posted is troubling. Specifically, the Base64 data, when decoded, doesn’t yield valid ASCII. Consider this tweaked version of your code:
func test() {
    let base64String = "1ZwoNohdE8Nteis/IXl1rg=="
    if let decodedData = Data(base64Encoded: base64String, options: .ignoreUnknownCharacters) {
        print("OK, data: \((decodedData as NSData).debugDescription)")
    } else {
        print("NG")
    }
}
This outputs the same value on iOS 17.6.1 and iOS 18.0, namely:
OK, data: <d59c2836 885d13c3 6d7a2b3f 217975ae>
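You don’t have to eyeball that hex dump to see the problem. Here’s a quick sketch that counts the bytes outside the 7-bit ASCII range; the checkBytes function and the range check are mine, not something from your code:
func checkBytes() {
    let decodedData = Data([
        0xd5, 0x9c, 0x28, 0x36, 0x88, 0x5d, 0x13, 0xc3,
        0x6d, 0x7a, 0x2b, 0x3f, 0x21, 0x79, 0x75, 0xae,
    ])
    // ASCII only covers 0x00 through 0x7f, so any byte with the high bit set is out of range.
    let highBitBytes = decodedData.filter { $0 > 0x7f }
    print("\(highBitBytes.count) of \(decodedData.count) bytes are outside the ASCII range")
}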
But the resulting data has lots of high-bit-set bytes, and thus isn’t ASCII. So you’re relying on the behaviour of String.init(data:encoding:) when you pass it data that’s not ASCII. Which brings me to a further tweaked version of your code:
func test() {
    let decodedData = Data([
        0xd5, 0x9c, 0x28, 0x36, 0x88, 0x5d, 0x13, 0xc3,
        0x6d, 0x7a, 0x2b, 0x3f, 0x21, 0x79, 0x75, 0xae,
    ])
    if let decodedString = String(data: decodedData, encoding: .ascii) {
        print("OK, string: \(decodedString)")
    } else {
        print("NG")
    }
}
On iOS 17.6.1 it prints this:
OK, string: Õ(6]Ãmz+?!yu®
On iOS 18.0 it prints this:
NG
IMO that’s an improvement. The iOS 17 behaviour is nonsense. iOS 18 has correctly failed to decode your string because it’s not even close to ASCII.
Digging further into the hex dumps, it looks like iOS 17 was treating .ascii as a synonym for .isoLatin1. If you want the old behaviour (and, to be clear, that old behaviour makes no sense to me) you can get it by passing .isoLatin1 instead of .ascii. My advice, however, is that you look at your code to figure out why you were trying to decode non-ASCII data as ASCII.
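Concretely, that change looks like this in the second snippet. Because Latin-1 assigns a character to every byte value, this decode should always succeed:
func test() {
    let decodedData = Data([
        0xd5, 0x9c, 0x28, 0x36, 0x88, 0x5d, 0x13, 0xc3,
        0x6d, 0x7a, 0x2b, 0x3f, 0x21, 0x79, 0x75, 0xae,
    ])
    // Latin-1 maps every byte value 0x00 through 0xff to a character,
    // so this initialiser won’t reject the data.
    if let decodedString = String(data: decodedData, encoding: .isoLatin1) {
        print("OK, string: \(decodedString)")
    } else {
        print("NG")
    }
}
That should print the same Õ(6]Ãmz+?!yu® string that iOS 17 printed for .ascii.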
Share and Enjoy
—
Quinn “The Eskimo!” @ Developer Technical Support @ Apple
let myEmail = "eskimo" + "1" + "@" + "apple.com"