So I can confirm this has become a mess. A device can become uncalibrated, probably when it stays indoors for a significant amount of time.
The issue is that going outside with a clear GPS signal is not a good solution.
It might take A LOT of time for the device to recover (my watch, for example, never re-calibrated). And even if it does, an app using CMAltimeter might behave erratically in the meantime, because of the inaccurate events being delivered.
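This doesn't fix calibration itself, but as a small mitigation an app can at least drop events whose reported accuracy is clearly off. A minimal sketch, assuming absolute altitude updates and an arbitrary 10 m accuracy threshold (not a documented value):

import CoreMotion

let altimeter = CMAltimeter()

func startAltitudeUpdates() {
    guard CMAltimeter.isAbsoluteAltitudeAvailable() else { return }
    altimeter.startAbsoluteAltitudeUpdates(to: .main) { data, error in
        guard let data, error == nil else { return }
        // Skip events whose reported accuracy suggests the device is still uncalibrated.
        // The 10 m cut-off is an assumption for illustration only.
        guard data.accuracy < 10 else { return }
        print("Altitude: \(data.altitude) m (±\(data.accuracy) m)")
    }
}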
I don't know what happened with the latest updates, but I have 2 production apps facing major issues.
Changing
let thumbMaxSize = min(2400, maxImageSize)
to
let thumbMaxSize = min(1200, maxImageSize)
that is, simply reducing the image size, fixed the issue.
I don't know what changed in 18.2 in that regard, but I must say I'm happy that my use case allows for reducing the image size.
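For context, here is a minimal sketch of how a size cap like this might be applied when generating a thumbnail with Image I/O. The surrounding pipeline (function name, use of CGImageSource) is an assumption for illustration, not necessarily what my apps actually do:

import Foundation
import CoreGraphics
import ImageIO

// Hypothetical helper: caps the thumbnail's largest dimension.
func makeThumbnail(at url: URL, maxImageSize: CGFloat) -> CGImage? {
    // Lowering the cap from 2400 to 1200 is what avoided the issue on 18.2.
    let thumbMaxSize = min(1200, maxImageSize)
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil) else { return nil }
    let options: [CFString: Any] = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceCreateThumbnailWithTransform: true,
        kCGImageSourceThumbnailMaxPixelSize: thumbMaxSize
    ]
    return CGImageSourceCreateThumbnailAtIndex(source, 0, options as CFDictionary)
}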
Any details would be appreciated, so I can make sure everything is working as it should. Thanks!
So I ran further tests:
Setting contentType: no success -> title is indexed, but not textContent
attributeSet.contentType = UTType.plainText.identifier
attributeSet.textContent = ocrText
I tried printing the entity's textContent just before indexing:
NSLog("Indexing entity with OCR = \(entity.attributeSet.textContent ?? "")")
try await CSSearchableIndex.default().indexAppEntities([entity])
It correctly prints out the textContent of the entity.
--> Core Spotlight indexes the entity, but seems to ignore the textContent field
I modified Apple's AppIntentsJournal sample project to fill in the textContent field in the indexed attribute set.
--> textContent is correctly indexed in that project
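For reference, the entity has roughly the following shape in both projects. All names here (ScannedPageEntity, ocrText, ScannedPageQuery) are placeholders for this sketch, not the actual types from either project:

import Foundation
import AppIntents
import CoreSpotlight
import UniformTypeIdentifiers

struct ScannedPageEntity: AppEntity, IndexedEntity {
    static let typeDisplayRepresentation = TypeDisplayRepresentation(name: "Scanned Page")
    static let defaultQuery = ScannedPageQuery()

    let id: UUID
    let title: String
    let ocrText: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(title)")
    }

    // The attribute set handed to Core Spotlight: title and textContent are both filled in.
    var attributeSet: CSSearchableItemAttributeSet {
        let attributes = CSSearchableItemAttributeSet(contentType: .plainText)
        attributes.title = title
        attributes.textContent = ocrText
        return attributes
    }
}

struct ScannedPageQuery: EntityQuery {
    func entities(for identifiers: [UUID]) async throws -> [ScannedPageEntity] { [] }
}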
The entities are almost identical in both projects.
What could cause a difference in behavior from CoreSpotlight?
@DTS Engineer: I filed the bug report: FB16995719
I tested the old way for indexing:
indexSearchableItems(_:completionHandler:)
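The indexing code for that test looks roughly like this (the identifiers and the OCR text are placeholders for this sketch):

import Foundation
import CoreSpotlight
import UniformTypeIdentifiers

let ocrText = "recognized text from the image"

let attributes = CSSearchableItemAttributeSet(contentType: .plainText)
attributes.title = "Scanned page"
attributes.textContent = ocrText

let item = CSSearchableItem(uniqueIdentifier: "page-1",
                            domainIdentifier: "scannedPages",
                            attributeSet: attributes)

CSSearchableIndex.default().indexSearchableItems([item]) { error in
    if let error {
        NSLog("Indexing failed: \(error.localizedDescription)")
    }
}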
The issue remains the same. The textContent is NOT found when using Spotlight.
However, querying Spotlight on textContent from within the app returns the right result.
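For completeness, the in-app query that does return the item is along these lines (the search term "invoice" is only an example):

import Foundation
import CoreSpotlight

let context = CSSearchQueryContext()
context.fetchAttributes = ["title", "textContent"]

let query = CSSearchQuery(queryString: "textContent == \"*invoice*\"cd",
                          queryContext: context)
query.foundItemsHandler = { items in
    for item in items {
        NSLog("Found \(item.uniqueIdentifier): \(item.attributeSet.textContent ?? "")")
    }
}
query.completionHandler = { error in
    if let error { NSLog("Query failed: \(error.localizedDescription)") }
}
query.start()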
This is so strange.
Best regards
Yes, I perform OCR on multiple images in parallel. This is a use case of the app. Typically:
a background process might OCR images that haven't been OCR'd yet
the user can add new images on which OCR has to be performed
etc...
Moreover, as suggested in the WWDC video, I wanted to use the new Swift Concurrency API to perform multiple requests in parallel. But as mentioned in the topic, their provided code won't work with RecognizeTextRequest.
My current workaround, unfortunately, is to serialize every OCR request to avoid the issue. But this doesn't fit the spirit of the app or its architecture.
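Concretely, the serialized workaround just awaits each request before starting the next, along these lines (function and variable names are illustrative, not the app's actual code):

import Foundation
import Vision

// Serial workaround: one RecognizeTextRequest at a time,
// awaited before the next image is processed.
func recognizeTextSequentially(in urls: [URL]) async throws -> [String] {
    var results: [String] = []
    for url in urls {
        let request = RecognizeTextRequest()
        let observations = try await request.perform(on: url)
        let text = observations
            .compactMap { $0.topCandidates(1).first?.string }
            .joined(separator: "\n")
        results.append(text)
    }
    return results
}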
So far, no. But I don't think my iOS 26 user base is large enough to conclude anything. I'll report here if I see any crash from iOS 26.
It happens quite often in production, though. Is there anything I can do to at least mitigate the issue?