macOS Tahoe 26.2 added an RDMA capability for Thunderbolt 5 interfaces. This has been demonstrated to significantly decrease latency while maintaining bandwidth for "clustered" Apple Silicon devices with TB5. What are the ideal and the maximum RDMA burst widths for transfers over RDMA-enabled Thunderbolt 5 interfaces?
Application is crashing: AXSpeech
EXC_BAD_ACCESS KERN_INVALID_ADDRESS 0x000056f023efbeb0
Crashed: AXSpeech
0 libobjc.A.dylib 0x4820 objc_msgSend + 32
1 libsystem_trace.dylib 0x6c34 _os_log_fmt_flatten_object + 116
2 libsystem_trace.dylib 0x5344 _os_log_impl_flatten_and_send + 1884
3 libsystem_trace.dylib 0x4bd0 _os_log + 152
4 libsystem_trace.dylib 0x9c48 _os_log_error_impl + 24
5 TextToSpeech 0xd0a8c _pcre2_xclass_8
6 TextToSpeech 0x3bc04 TTSSpeechUnitTestingMode
7 TextToSpeech 0x3f128 TTSSpeechUnitTestingMode
8 AXCoreUtilities 0xad38 -[NSArray(AXExtras)
ax_flatMappedArrayUsingBlock:] + 204
9 TextToSpeech 0x3eb18 TTSSpeechUnitTestingMode
10 TextToSpeech 0x3c948 TTSSpeechUnitTestingMode
11 TextToSpeech 0x48824
AXAVSpeechSynthesisVoiceFromTTSSpeechVoice
12 TextToSpeech 0x49804 AXAVSpeechSynthesisVoiceFromTTSSpeechVoice
13 Foundation 0xf6064 __NSThreadPerformPerform + 264
14 CoreFoundation 0x37acc CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION + 28
15 CoreFoundation 0x36d48 __CFRunLoopDoSource0 + 176
16 CoreFoundation 0x354fc __CFRunLoopDoSources0 + 244
17 CoreFoundation 0x34238 __CFRunLoopRun + 828
18 CoreFoundation 0x33e18 CFRunLoopRunSpecific + 608
19 Foundation 0x2d4cc -[NSRunLoop(NSRunLoop) runMode:beforeDate:] + 212
20 TextToSpeech 0x24b88 TTSCFAttributedStringCreateStringByBracketingAttributeWithString
21 Foundation 0xb3154 NSThread__start + 732
22 libsystem_pthread.dylib 0x24d4 _pthread_start + 136
23 libsystem_pthread.dylib 0x1a10 thread_start + 8
I have a terrible crash problem in my app when I use AVSpeechSynthesizer, and I can't reproduce it. Here is my code (the synthesizer lives in a singleton):

- (void)stopSpeech {
    if ([self.synthesizer isPaused]) {
        return;
    }
    if ([self.synthesizer isSpeaking]) {
        BOOL isSpeech = [self.synthesizer stopSpeakingAtBoundary:AVSpeechBoundaryImmediate];
        if (!isSpeech) {
            [self.synthesizer stopSpeakingAtBoundary:AVSpeechBoundaryWord];
        }
    }
    self.stopBlock ? self.stopBlock() : nil;
}
- (AVSpeechSynthesizer *)synthesizer {
    if (!_synthesizer) {
        _synthesizer = [[AVSpeechSynthesizer alloc] init];
        _synthesizer.delegate = self;
    }
    return _synthesizer;
}

When the user leaves the page, I call the stopSpeech method. Then I get a lot of crash reports. Here is a crash log:

# Crashlytics - plaintext stacktrace downloaded by liweican at Mon, 13 May 2019 03:03:24 GMT
# URL: https://fabric.io/youdao-dict/ios/apps/com.youdao.udictionary/issues/5a904ed88cb3c2fa63ad7ed3?time=last-thirty-days/sessions/b1747d91bafc4680ab0ca8e3a702c52c_DNE_0_v2
# Organization: zzz
# Platform: ios
# Application: U-Dictionary
# Version: 3.0.5.4
# Bundle Identifier: com.youdao.UDictionary
# Issue ID: 5a904ed88cb3c2fa63ad7ed3
# Session ID: b1747d91bafc4680ab0ca8e3a702c52c_DNE_0_v2
# Date: 2019-05-13T02:27:00Z
# OS Version: 12.2.0 (16E227)
# Device: iPhone 8 Plus
# RAM Free: 17%
# Disk Free: 64.6%
#19. Crashed: AXSpeech
0 libsystem_pthread.dylib 0x19c15e5b8 pthread_mutex_lock$VARIANT$armv81 + 102
1 CoreFoundation 0x19c4cf84c CFRunLoopSourceSignal + 68
2 Foundation 0x19cfc7280 performQueueDequeue + 464
3 Foundation 0x19cfc680c __NSThreadPerformPerform + 136
4 CoreFoundation 0x19c4d22bc __CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION__ + 24
5 CoreFoundation 0x19c4d223c __CFRunLoopDoSource0 + 88
6 CoreFoundation 0x19c4d1b74 __CFRunLoopDoSources0 + 256
7 CoreFoundation 0x19c4cca60 __CFRunLoopRun + 1004
8 CoreFoundation 0x19c4cc354 CFRunLoopRunSpecific + 436
9 Foundation 0x19ce99fcc -[NSRunLoop(NSRunLoop) runMode:beforeDate:] + 300
10 libAXSpeechManager.dylib 0x1ac16c94c -[AXSpeechThread main] + 264
11 Foundation 0x19cfc66e4 __NSThread__start__ + 984
12 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128
13 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44
14 libsystem_pthread.dylib 0x19c163cdc thread_start + 4
--
#0. com.apple.main-thread
0 libsystem_malloc.dylib 0x19c11ce24 small_free_list_remove_ptr_no_clear + 768
1 libsystem_malloc.dylib 0x19c11f094 small_malloc_from_free_list + 296
2 libsystem_malloc.dylib 0x19c11f094 small_malloc_from_free_list + 296
3 libsystem_malloc.dylib 0x19c11d63c small_malloc_should_clear + 224
4 libsystem_malloc.dylib 0x19c11adcc szone_malloc_should_clear + 132
5 libsystem_malloc.dylib 0x19c123c18 malloc_zone_malloc + 156
6 CoreFoundation 0x19c569ab4 __CFBasicHashRehash + 300
7 CoreFoundation 0x19c56b430 __CFBasicHashAddValue + 96
8 CoreFoundation 0x19c56ab9c CFBasicHashAddValue + 2160
9 CoreFoundation 0x19c49f3bc CFDictionaryAddValue + 260
10 CoreFoundation 0x19c572ee8 __54-[CFPrefsSource mergeIntoDictionary:sourceDictionary:]_block_invoke + 28
11 CoreFoundation 0x19c49f0b4 __CFDictionaryApplyFunction_block_invoke + 24
12 CoreFoundation 0x19c568b7c CFBasicHashApply + 116
13 CoreFoundation 0x19c49f090 CFDictionaryApplyFunction + 168
14 CoreFoundation 0x19c42f504 -[CFPrefsSource mergeIntoDictionary:sourceDictionary:] + 136
15 CoreFoundation 0x19c4bcd38 -[CFPrefsSearchListSource alreadylocked_getDictionary:] + 644
16 CoreFoundation 0x19c42e71c -[CFPrefsSearchListSource alreadylocked_copyValueForKey:] + 152
17 CoreFoundation 0x19c42e660 -[CFPrefsSource copyValueForKey:] + 60
18 CoreFoundation 0x19c579e88 __76-[_CFXPreferences copyAppValueForKey:identifier:container:configurationURL:]_block_invoke + 40
19 CoreFoundation 0x19c4bdff4 __108-[_CFXPreferences(SearchListAdditions) withSearchListForIdentifier:container:cloudConfigurationURL:perform:]_block_invoke + 272
20 CoreFoundation 0x19c4bda38 normalizeQuintuplet + 340
21 CoreFoundation 0x19c42c634 -[_CFXPreferences(SearchListAdditions) withSearchListForIdentifier:container:cloudConfigurationURL:perform:] + 108
22 CoreFoundation 0x19c42cec0 -[_CFXPreferences copyAppValueForKey:identifier:container:configurationURL:] + 148
23 CoreFoundation 0x19c57c2d0 _CFPreferencesCopyAppValueWithContainerAndConfiguration + 124
24 TextInput 0x1a450e550 -[TIPreferencesController valueForPreferenceKey:] + 460
25 UIKitCore 0x1c87c71f8 -[UIKeyboardPreferencesController handBias] + 36
26 UIKitCore 0x1c887275c -[UIKeyboardLayoutStar showKeyboardWithInputTraits:screenTraits:splitTraits:] + 320
27 UIKitCore 0x1c88f4240 -[UIKeyboardImpl finishLayoutChangeWithArguments:] + 492
28 UIKitCore 0x1c88f47c8 -[UIKeyboardImpl updateLayout] + 1208
29 UIKitCore 0x1c88eaad0 -[UIKeyboardImpl updateLayoutIfNecessary] + 448
30 UIKitCore 0x1c88eab9c -[UIKeyboardImpl setFrame:] + 140
31 UIKitCore 0x1c88d5d60 -[UIKeyboard activate] + 652
32 UIKitCore 0x1c894c90c -[UIKeyboardAutomatic activate] + 128
33 UIKitCore 0x1c88d5158 -[UIKeyboard setFrame:] + 296
34 UIKitCore 0x1c88d81b0 -[UIKeyboard _didChangeKeyplaneWithContext:] + 228
35 UIKitCore 0x1c88f4aa0 -[UIKeyboardImpl didMoveToSuperview] + 136
36 UIKitCore 0x1c8f2ad84 __45-[UIView(Hierarchy) _postMovedFromSuperview:]_block_invoke + 888
37 UIKitCore 0x1c8f2a970 -[UIView(Hierarchy) _postMovedFromSuperview:] + 760
38 UIKitCore 0x1c8f39ddc -[UIView(Internal) _addSubview:positioned:relativeTo:] + 1740
39 UIKitCore 0x1c88d5d84 -[UIKeyboard activate] + 688
40 UIKitCore 0x1c894c90c -[UIKeyboardAutomatic activate] + 128
41 UIKitCore 0x1c893b3a4 -[UIPeripheralHost(UIKitInternal) _reloadInputViewsForResponder:] + 1332
42 UIKitCore 0x1c8ae66d8 -[UIResponder(UIResponderInputViewAdditions) reloadInputViews] + 80
43 UIKitCore 0x1c8ae23bc -[UIResponder becomeFirstResponder] + 804
44 UIKitCore 0x1c8f2a560 -[UIView(Hierarchy) becomeFirstResponder] + 156
45 UIKitCore 0x1c8d93e84 -[UITextField becomeFirstResponder] + 244
46 UIKitCore 0x1c8d578dc -[UITextInteractionAssistant(UITextInteractionAssistant_Internal) setFirstResponderIfNecessary] + 192
47 UIKitCore 0x1c8d45d8c -[UITextSelectionInteraction oneFingerTap:] + 3136
48 UIKitCore 0x1c86e0bcc -[UIGestureRecognizerTarget _sendActionWithGestureRecognizer:] + 64
49 UIKitCore 0x1c86e8dd4 _UIGestureRecognizerSendTargetActions + 124
50 UIKitCore 0x1c86e6778 _UIGestureRecognizerSendActions + 316
51 UIKitCore 0x1c86e5ca4 -[UIGestureRecognizer _updateGestureWithEvent:buttonEvent:] + 760
52 UIKitCore 0x1c86d9d80 _UIGestureEnvironmentUpdate + 2180
53 UIKitCore 0x1c86d94b0 -[UIGestureEnvironment _deliverEvent:toGestureRecognizers:usingBlock:] + 384
54 UIKitCore 0x1c86d9290 -[UIGestureEnvironment _updateForEvent:window:] + 204
55 UIKitCore 0x1c8af14a8 -[UIWindow sendEvent:] + 3112
56 UIKitCore 0x1c8ad1534 -[UIApplication sendEvent:] + 340
57 UIKitCore 0x1c8b977c0 __dispatchPreprocessedEventFromEventQueue + 1768
58 UIKitCore 0x1c8b99eec __handleEventQueueInternal + 4828
59 UIKitCore 0x1c8b9311c __handleHIDEventFetcherDrain + 152
60 CoreFoundation 0x19c4d22bc __CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION__ + 24
61 CoreFoundation 0x19c4d223c __CFRunLoopDoSource0 + 88
62 CoreFoundation 0x19c4d1b24 __CFRunLoopDoSources0 + 176
63 CoreFoundation 0x19c4cca60 __CFRunLoopRun + 1004
64 CoreFoundation 0x19c4cc354 CFRunLoopRunSpecific + 436
65 GraphicsServices 0x19e6cc79c GSEventRunModal + 104
66 UIKitCore 0x1c8ab7b68 UIApplicationMain + 212
67 UDictionary 0x10517e138 main (main.m:17)
68 libdyld.dylib 0x19bf928e0 start + 4
#1. Thread
0 libsystem_kernel.dylib 0x19c0deb74 __workq_kernreturn + 8
1 libsystem_pthread.dylib 0x19c161138 _pthread_wqthread + 340
2 libsystem_pthread.dylib 0x19c163cd4 start_wqthread + 4
#2. com.apple.uikit.eventfetch-thread
0 libsystem_kernel.dylib 0x19c0d30f4 mach_msg_trap + 8
1 libsystem_kernel.dylib 0x19c0d25a0 mach_msg + 72
2 CoreFoundation 0x19c4d1cb4 __CFRunLoopServiceMachPort + 236
3 CoreFoundation 0x19c4ccbc4 __CFRunLoopRun + 1360
4 CoreFoundation 0x19c4cc354 CFRunLoopRunSpecific + 436
5 Foundation 0x19ce99fcc -[NSRunLoop(NSRunLoop) runMode:beforeDate:] + 300
6 Foundation 0x19ce99e5c -[NSRunLoop(NSRunLoop) runUntilDate:] + 96
7 UIKitCore 0x1c8b9d540 -[UIEventFetcher threadMain] + 136
8 Foundation 0x19cfc66e4 __NSThread__start__ + 984
9 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128
10 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44
11 libsystem_pthread.dylib 0x19c163cdc thread_start + 4
#3. JavaScriptCore bmalloc scavenger
0 libsystem_kernel.dylib 0x19c0ddee4 __psynch_cvwait + 8
1 libsystem_pthread.dylib 0x19c15d4a4 _pthread_cond_wait$VARIANT$armv81 + 628
2 libc++.1.dylib 0x19b6b5090 std::__1::condition_variable::wait(std::__1::unique_lock<std::__1::mutex>&) + 24
3 JavaScriptCore 0x1a36a2238 void std::__1::condition_variable_any::wait<std::__1::unique_lock<bmalloc::Mutex> >(std::__1::unique_lock<bmalloc::Mutex>&) + 108
4 JavaScriptCore 0x1a36a622c bmalloc::Scavenger::threadRunLoop() + 176
5 JavaScriptCore 0x1a36a59a4 bmalloc::Scavenger::Scavenger(std::__1::lock_guard<bmalloc::Mutex>&) + 10
6 JavaScriptCore 0x1a36a73e4 std::__1::__thread_specific_ptr<std::__1::__thread_struct>::set_pointer(std::__1::__thread_struct*) + 38
7 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128
8 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44
9 libsystem_pthread.dylib 0x19c163cdc thread_start + 4
#4. WebThread
0 libsystem_kernel.dylib 0x19c0d30f4 mach_msg_trap + 8
1 libsystem_kernel.dylib 0x19c0d25a0 mach_msg + 72
2 CoreFoundation 0x19c4d1cb4 __CFRunLoopServiceMachPort + 236
3 CoreFoundation 0x19c4ccbc4 __CFRunLoopRun + 1360
4 CoreFoundation 0x19c4cc354 CFRunLoopRunSpecific + 436
5 WebCore 0x1a5126480 RunWebThread(void*) + 600
6 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128
7 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44
8 libsystem_pthread.dylib 0x19c163cdc thread_start + 4
#5. com.twitter.crashlytics.ios.MachExceptionServer
0 UDictionary 0x1058a5564 CLSProcessRecordAllThreads (CLSProcess.c:376)
1 UDictionary 0x1058a594c CLSProcessRecordAllThreads (CLSProcess.c:407)
2 UDictionary 0x1058952dc CLSHandler (CLSHandler.m:26)
3 UDictionary 0x1058906cc CLSMachExceptionServer (CLSMachException.c:446)
4 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128
5 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44
6 libsystem_pthread.dylib 0x19c163cdc thread_start + 4
#6. com.apple.NSURLConnectionLoader
0 libsystem_kernel.dylib 0x19c0d30f4 mach_msg_trap + 8
1 libsystem_kernel.dylib 0x19c0d25a0 mach_msg + 72
2 CoreFoundation 0x19c4d1cb4 __CFRunLoopServiceMachPort + 236
3 CoreFoundation 0x19c4ccbc4 __CFRunLoopRun + 1360
4 CoreFoundation 0x19c4cc354 CFRunLoopRunSpecific + 436
5 CFNetwork 0x19cae574c -[__CoreSchedulingSetRunnable runForever] + 216
6 Foundation 0x19cfc66e4 __NSThread__start__ + 984
7 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128
8 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44
9 libsystem_pthread.dylib 0x19c163cdc thread_start + 4
#7. AVAudioSession Notify Thread
0 libsystem_kernel.dylib 0x19c0d30f4 mach_msg_trap + 8
1 libsystem_kernel.dylib 0x19c0d25a0 mach_msg + 72
2 CoreFoundation 0x19c4d1cb4 __CFRunLoopServiceMachPort + 236
3 CoreFoundation 0x19c4ccbc4 __CFRunLoopRun + 1360
4 CoreFoundation 0x19c4cc354 CFRunLoopRunSpecific + 436
5 AVFAudio 0x1a238a378 GenericRunLoopThread::Entry(void*) + 156
6 AVFAudio 0x1a23b4c60 CAPThread::Entry(CAPThread*) + 88
7 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128
8 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44
9 libsystem_pthread.dylib 0x19c163cdc thread_start + 4
#8. WebCore: LocalStorage
0 libsystem_kernel.dylib 0x19c0ddee4 __psynch_cvwait + 8
1 libsystem_pthread.dylib 0x19c15d4a4 _pthread_cond_wait$VARIANT$armv81 + 628
2 JavaScriptCore 0x1a3668ce4 ***::ThreadCondition::timedWait(***::Mutex&, ***::WallTime) + 80
3 JavaScriptCore 0x1a364f96c ***::ParkingLot::parkConditionallyImpl(void const*, ***::ScopedLambda<bool ()> const&, ***::ScopedLambda<void ()> const&, ***::TimeWithDynamicClockType const&) + 2004
4 WebKitLegacy 0x1a67b6ea8 bool ***::Condition::waitUntil<***::Lock>(***::Lock&, ***::TimeWithDynamicClockType const&) + 184
5 WebKitLegacy 0x1a67b9ba4 std::__1::unique_ptr<***::Function<void ()>, std::__1::default_delete<***::Function<void ()> > > ***::MessageQueue<***::Function<void ()> >::waitForMessageFilteredWithTimeout<***::MessageQueue<***::Function<void ()> >::waitForMessage()::'lambda'(***::Function<void ()> const&)>(***::MessageQueueWaitResult&, ***::MessageQueue<***::Function<void ()> >::waitForMessage()::'lambda'(***::Function<void ()> const&)&&, ***::WallTime) + 156
6 WebKitLegacy 0x1a67b91c0 WebCore::StorageThread::threadEntryPoint() + 68
7 JavaScriptCore 0x1a3666f88 ***::Thread::entryPoint(***::Thread::NewThreadContext*) + 260
8 JavaScriptCore 0x1a3668494 ***::wtfThreadEntryPoint(void*) + 12
9 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128
10 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44
11 libsystem_pthread.dylib 0x19c163cdc thread_start + 4
#9. com.apple.CoreMotion.MotionThread
0 libsystem_kernel.dylib 0x19c0d30f4 mach_msg_trap + 8
1 libsystem_kernel.dylib 0x19c0d25a0 mach_msg + 72
2 CoreFoundation 0x19c4d1cb4 __CFRunLoopServiceMachPort + 236
3 CoreFoundation 0x19c4ccbc4 __CFRunLoopRun + 1360
4 CoreFoundation 0x19c4cc354 CFRunLoopRunSpecific + 436
5 CoreFoundation 0x19c4cd0b0 CFRunLoopRun + 80
6 CoreMotion 0x1a1df0240 (Missing)
7 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128
8 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44
9 libsystem_pthread.dylib 0x19c163cdc thread_start + 4
#10. Thread
0 libsystem_kernel.dylib 0x19c0deb74 __workq_kernreturn + 8
1 libsystem_pthread.dylib 0x19c161138 _pthread_wqthread + 340
2 libsystem_pthread.dylib 0x19c163cd4 start_wqthread + 4
#11. Thread
0 libsystem_kernel.dylib 0x19c0deb74 __workq_kernreturn + 8
1 libsystem_pthread.dylib 0x19c1611f8 _pthread_wqthread + 532
2 libsystem_pthread.dylib 0x19c163cd4 start_wqthread + 4
#12. com.apple.CFStream.LegacyThread
0 libsystem_kernel.dylib 0x19c0d30f4 mach_msg_trap + 8
1 libsystem_kernel.dylib 0x19c0d25a0 mach_msg + 72
2 CoreFoundation 0x19c4d1cb4 __CFRunLoopServiceMachPort + 236
3 CoreFoundation 0x19c4ccbc4 __CFRunLoopRun + 1360
4 CoreFoundation 0x19c4cc354 CFRunLoopRunSpecific + 436
5 CoreFoundation 0x19c4e5094 _legacyStreamRunLoop_workThread + 260
6 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128
7 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44
8 libsystem_pthread.dylib 0x19c163cdc thread_start + 4
#13. Thread
0 libsystem_pthread.dylib 0x19c163cd0 start_wqthread + 190
#14. Thread
0 libsystem_kernel.dylib 0x19c0deb74 __workq_kernreturn + 8
1 libsystem_pthread.dylib 0x19c161138 _pthread_wqthread + 340
2 libsystem_pthread.dylib 0x19c163cd4 start_wqthread + 4
#15. Thread
0 libsystem_kernel.dylib 0x19c0deb74 __workq_kernreturn + 8
1 libsystem_pthread.dylib 0x19c161138 _pthread_wqthread + 340
2 libsystem_pthread.dylib 0x19c163cd4 start_wqthread + 4
#16. Thread
0 libsystem_kernel.dylib 0x19c0d3148 semaphore_timedwait_trap + 8
1 libdispatch.dylib 0x19bf50a4c _dispatch_sema4_timedwait$VARIANT$armv81 + 64
2 libdispatch.dylib 0x19bf513a8 _dispatch_semaphore_wait_slow + 72
3 libdispatch.dylib 0x19bf647c8 _dispatch_worker_thread + 344
4 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128
5 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44
6 libsystem_pthread.dylib 0x19c163cdc thread_start + 4
#17. Thread
0 libsystem_kernel.dylib 0x19c0d3148 semaphore_timedwait_trap + 8
1 libdispatch.dylib 0x19bf50a4c _dispatch_sema4_timedwait$VARIANT$armv81 + 64
2 libdispatch.dylib 0x19bf513a8 _dispatch_semaphore_wait_slow + 72
3 libdispatch.dylib 0x19bf647c8 _dispatch_worker_thread + 344
4 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128
5 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44
6 libsystem_pthread.dylib 0x19c163cdc thread_start + 4
#18. Thread
0 libsystem_kernel.dylib 0x19c0d3148 semaphore_timedwait_trap + 8
1 libdispatch.dylib 0x19bf50a4c _dispatch_sema4_timedwait$VARIANT$armv81 + 64
2 libdispatch.dylib 0x19bf513a8 _dispatch_semaphore_wait_slow + 72
3 libdispatch.dylib 0x19bf647c8 _dispatch_worker_thread + 344
4 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128
5 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44
6 libsystem_pthread.dylib 0x19c163cdc thread_start + 4
#19. Crashed: AXSpeech
0 libsystem_pthread.dylib 0x19c15e5b8 pthread_mutex_lock$VARIANT$armv81 + 102
1 CoreFoundation 0x19c4cf84c CFRunLoopSourceSignal + 68
2 Foundation 0x19cfc7280 performQueueDequeue + 464
3 Foundation 0x19cfc680c __NSThreadPerformPerform + 136
4 CoreFoundation 0x19c4d22bc __CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION__ + 24
5 CoreFoundation 0x19c4d223c __CFRunLoopDoSource0 + 88
6 CoreFoundation 0x19c4d1b74 __CFRunLoopDoSources0 + 256
7 CoreFoundation 0x19c4cca60 __CFRunLoopRun + 1004
8 CoreFoundation 0x19c4cc354 CFRunLoopRunSpecific + 436
9 Foundation 0x19ce99fcc -[NSRunLoop(NSRunLoop) runMode:beforeDate:] + 300
10 libAXSpeechManager.dylib 0x1ac16c94c -[AXSpeechThread main] + 264
11 Foundation 0x19cfc66e4 __NSThread__start__ + 984
12 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128
13 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44
14 libsystem_pthread.dylib 0x19c163cdc thread_start + 4
#20. AXSpeech
0 (Missing) 0x1071ba524 (Missing)
1 (Missing) 0x1071b3e7c (Missing)
2 (Missing) 0x10718fba4 (Missing)
3 (Missing) 0x107184bc8 (Missing)
4 libdyld.dylib 0x19bf95908 dlopen + 176
5 CoreFoundation 0x19c5483e8 _CFBundleDlfcnLoadBundle + 140
6 CoreFoundation 0x19c486918 _CFBundleLoadExecutableAndReturnError + 352
7 Foundation 0x19ced5734 -[NSBundle loadAndReturnError:] + 428
8 TextToSpeech 0x1abfff800 TTSSpeechUnitTestingMode + 1020
9 libdispatch.dylib 0x19bf817d4 _dispatch_client_callout + 16
10 libdispatch.dylib 0x19bf52040 _dispatch_once_callout + 28
11 TextToSpeech 0x1abfff478 TTSSpeechUnitTestingMode + 116
12 libobjc.A.dylib 0x19b7173cc CALLING_SOME_+initialize_METHOD + 24
13 libobjc.A.dylib 0x19b71cee0 initializeNonMetaClass + 296
14 libobjc.A.dylib 0x19b71e640 initializeAndMaybeRelock(objc_class*, objc_object*, mutex_tt<false>&, bool) + 260
15 libobjc.A.dylib 0x19b7265a4 lookUpImpOrForward + 244
16 libobjc.A.dylib 0x19b733858 _objc_msgSend_uncached + 56
17 libAXSpeechManager.dylib 0x1ac167324 -[AXSpeechManager _initialize] + 68
18 Foundation 0x19cfc68d4 __NSThreadPerformPerform + 336
19 CoreFoundation 0x19c4d22bc __CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION__ + 24
20 CoreFoundation 0x19c4d223c __CFRunLoopDoSource0 + 88
21 CoreFoundation 0x19c4d1b74 __CFRunLoopDoSources0 + 256
22 CoreFoundation 0x19c4cca60 __CFRunLoopRun + 1004
23 CoreFoundation 0x19c4cc354 CFRunLoopRunSpecific + 436
24 Foundation 0x19ce99fcc -[NSRunLoop(NSRunLoop) runMode:beforeDate:] + 300
25 libAXSpeechManager.dylib 0x1ac16c94c -[AXSpeechThread main] + 264
26 Foundation 0x19cfc66e4 __NSThread__start__ + 984
27 libsystem_pthread.dylib 0x19c1602c0 _pthread_body + 128
28 libsystem_pthread.dylib 0x19c160220 _pthread_start + 44
29 libsystem_pthread.dylib 0x19c163cdc thread_start + 4

I changed my code like this, but it still has the same problem:

- (void)stopSpeech {
    if (self.synthesizer != nil && [self.synthesizer isPaused]) {
        return;
    }
    // if ([self.synthesizer isSpeaking]) {
    //     BOOL isSpeech = [self.synthesizer stopSpeakingAtBoundary:AVSpeechBoundaryImmediate];
    //     if (!isSpeech) {
    //         [self.synthesizer stopSpeakingAtBoundary:AVSpeechBoundaryWord];
    //     }
    // }
    if (self.synthesizer != nil) {
        [self.synthesizer stopSpeakingAtBoundary:AVSpeechBoundaryImmediate];
        self.stopBlock ? self.stopBlock() : nil;
    }
}
Hello, my app is crashing a lot with this issue. I can't reproduce the problem, but I can see it occurring on users' devices. The Crashlytics report shows the following lines:

Crashed: AXSpeech
0 libsystem_pthread.dylib 0x1824386bc pthread_mutex_lock$VARIANT$mp + 278
1 CoreFoundation 0x1826d3a34 CFRunLoopSourceSignal + 68
2 Foundation 0x18319ec90 performQueueDequeue + 468
3 Foundation 0x18325a020 __NSThreadPerformPerform + 136
4 CoreFoundation 0x1827b7404 __CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION__ + 24
5 CoreFoundation 0x1827b6ce0 __CFRunLoopDoSources0 + 456
6 CoreFoundation 0x1827b479c __CFRunLoopRun + 1204
7 CoreFoundation 0x1826d4da8 CFRunLoopRunSpecific + 552
8 Foundation 0x183149674 -[NSRunLoop(NSRunLoop) runMode:beforeDate:] + 304
9 libAXSpeechManager.dylib 0x192852830 -[AXSpeechThread main] + 284
10 Foundation 0x183259efc __NSThread__start__ + 1040
11 libsystem_pthread.dylib 0x182435220 _pthread_body + 272
12 libsystem_pthread.dylib 0x182435110 _pthread_body + 290
13 libsystem_pthread.dylib 0x182433b10 thread_start + 4

The crash occurs on different threads (never the main thread). It is driving me crazy... Can anybody help me? Thanks a lot.
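A pattern that often mitigates crashes like the ones above is confining every call into the synthesizer to a single thread, on the assumption that the AXSpeech worker is racing with calls made from other queues. This is a sketch only, not a confirmed fix; the SpeechController name and stopBlock property are illustrative stand-ins for the singleton described earlier in the thread.

```swift
import AVFoundation

// Sketch: funnel all AVSpeechSynthesizer access through the main thread so
// start/stop calls can never race each other. SpeechController is an
// illustrative name, not a class from the original post.
final class SpeechController: NSObject, AVSpeechSynthesizerDelegate {
    static let shared = SpeechController()
    private let synthesizer = AVSpeechSynthesizer()
    var stopBlock: (() -> Void)?

    private override init() {
        super.init()
        synthesizer.delegate = self
    }

    func stopSpeech() {
        // Hop to the main thread if we were called from anywhere else.
        guard Thread.isMainThread else {
            DispatchQueue.main.async { self.stopSpeech() }
            return
        }
        if synthesizer.isPaused { return }
        if synthesizer.isSpeaking {
            synthesizer.stopSpeaking(at: .immediate)
        }
        stopBlock?()
    }
}
```

If the crash disappears once access is serialized this way, that would point to cross-thread use of the shared synthesizer rather than a bug in the stop logic itself.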
Hello,
My goal is to enable users to perform a freeform search request for any product I sell using a spoken phrase, for example, "Hey Siri, search GAMING CONSOLES on MyCatalogApp". The result would launch MyCatalogApp and navigate to a search results page displaying gaming consoles.
I have defined a SearchIntent (using the .system.search schema) and a Shortcut to accomplish this.
However, Siri doesn't seem to be able to correctly parse the spoken phrase, extract the search string, and provide it as the criteria term within SearchIntent.
What am I doing wrong?
Here is the SearchIntent. The print() statement should output the search string (in the scenario above, "GAMING CONSOLES"), but it doesn't work.
import AppIntents

@available(iOS 17.2, *)
@AppIntent(schema: .system.search)
struct SearchIntent: ShowInAppSearchResultsIntent {
    static var searchScopes: [StringSearchScope] = [.general]

    @Parameter(title: "Criteria")
    var criteria: StringSearchCriteria

    static var title: LocalizedStringResource = "Search with MyCatalogApp"

    @MainActor
    func perform() async throws -> some IntentResult {
        let searchString = criteria.term
        print("**** Search String: \(searchString) ****") // tmp debugging
        try await MyCatalogSearchHelper.search(for: searchString) // fetch results via my async fetch API
        return .result()
    }
}
Here's the Shortcuts definition:

import AppIntents

@available(iOS 17.2, *)
struct Shortcuts: AppShortcutsProvider {
    @AppShortcutsBuilder
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: SearchIntent(),
            phrases: ["Search for \(\.$criteria) on \(.applicationName)."],
            shortTitle: "Search",
            systemImageName: "magnifyingglass"
        )
    }
}
Thanks for any help!
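One variable worth isolating is whether the schema macro is involved at all: ShowInAppSearchResultsIntent can be adopted directly. The sketch below is for comparison only, not a confirmed fix; PlainSearchIntent and its body are illustrative, and a plain conformance may or may not change how Siri parses the spoken phrase.

```swift
import AppIntents

// Minimal direct conformance to ShowInAppSearchResultsIntent, without the
// schema macro. If this variant also never receives the spoken term, the
// problem is more likely in phrase matching than in the schema annotation.
@available(iOS 17.2, *)
struct PlainSearchIntent: ShowInAppSearchResultsIntent {
    static var title: LocalizedStringResource = "Search with MyCatalogApp"
    static var searchScopes: [StringSearchScope] = [.general]

    @Parameter(title: "Criteria")
    var criteria: StringSearchCriteria

    @MainActor
    func perform() async throws -> some IntentResult {
        // Siri should place the extracted search string in criteria.term.
        print("Search String: \(criteria.term)")
        return .result()
    }
}
```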
I don't know if these forums are any good for rumors or plans, but does anybody know whether Apple plans to release a library for training reinforcement-learning models? It would be handy, when implementing games in Swift for example, to be able to train the computer players on the same code.
Subject: Technical Report: Float32 Precision Ceiling & Memory Fragmentation in JAX/Metal Workloads on M3
To: Metal Developer Relations
Hello,
I am reporting a repeatable numerical saturation point encountered during sustained recursive high-order differential workloads on the Apple M3 (16 GB unified memory) using the JAX Metal backend.
Workload Characteristics:
Large-scale vector projections across multi-dimensional industrial datasets
Repeated high-order finite-difference calculations
Heavy use of jax.grad and lax.cond inside long-running loops
Observation:
Under these conditions, the Metal/MPS backend consistently enters a terminal quantization lock where outputs saturate at a fixed scalar value (2.0000), followed by system-wide NaN propagation. This appears to be a precision-limited boundary in the JAX-Metal bridge when handling high-order operations with cubic time-scale denominators.
I have identified the specific threshold where recursive high-order tensor derivatives exceed the numerical resolution of 32-bit consumer architectures, necessitating a migration to a dedicated 64-bit industrial stack.
I have prepared a minimal synthetic test script (randomized vectors only, no proprietary logic) that reliably reproduces the allocator fragmentation and saturation behavior. Let me know if your team would like the telemetry for XLA/MPS optimization purposes.
Best regards,
Alex Severson
Architect, QuantumPulse AI
I’m trying to integrate Apple’s Translation framework in a Swift 6 project with Approachable Concurrency enabled.
I’m following the code here: https://developer.apple.com/documentation/translation/translating-text-within-your-app#Offer-a-custom-translation
And, specifically, inside the following code
.translationTask(configuration) { session in
    do {
        // Use the session the task provides to translate the text.
        let response = try await session.translate(sourceText)
        // Update the view with the translated result.
        targetText = response.targetText
    } catch {
        // Handle any errors.
    }
}
On the try await session.translate(…) line, the compiler complains that “Sending ‘session’ risks causing data races”.
Extended error message:
Sending main actor-isolated 'session' to @concurrent instance method 'translate' risks causing data races between @concurrent and main actor-isolated uses
I’ve downloaded Apple’s sample code (at the top of linked webpage), it compiles fine as-is on Xcode 26.4, but fails with the same error as soon as I switch the Swift Language Mode to Swift 6 in the project.
How can I fix this?
Hello All,
I’m working on a computer-vision–heavy iOS application that uses the camera, LiDAR depth maps, and semantic segmentation to reason about the environment (object identification, localization and measurement - not just visualization).
Current architecture
I initially built the image pipeline around CIImage as a unifying abstraction. It seemed like a good idea because:
CIImage integrates cleanly with Vision, ARKit, AVFoundation, Metal, Core Graphics, etc.
It provides a rich set of out-of-the-box transforms and filters.
It is immutable and thread-safe, which significantly simplified concurrency in a multi-queue pipeline.
The LiDAR depth maps, semantic segmentation masks, etc. were treated as CIImages, with conversion to CVPixelBuffer or MTLTexture only at the edges when required.
Problem
I’ve run into cases where Core Image transformations do not preserve numeric fidelity for non-visual data.
Example:
Rendering a CIImage-backed segmentation mask into a larger CVPixelBuffer can cause label values to change in predictable but incorrect ways.
This occurs even when:
using nearest-neighbor sampling
disabling color management (workingColorSpace / outputColorSpace = NSNull)
applying identity or simple affine transforms
I’ve confirmed via controlled tests that:
Metal → CVPixelBuffer paths preserve values correctly
CIImage → CVPixelBuffer paths can introduce value changes when resampling or expanding the render target
This makes CIImage unsafe as a source of numeric truth for segmentation masks and depth-based logic, even though it works well for visualization, and I should have realized this much sooner.
Direction I’m considering
I’m now considering refactoring toward more intent-based abstractions instead of a single image type, for example:
Visual images: CIImage (camera frames, overlays, debugging, UI)
Scalar fields: depth / confidence maps backed by CVPixelBuffer + Metal
Label maps: segmentation masks backed by integer-preserving buffers (no interpolation, no transforms)
In this model, CIImage would still be used extensively — but primarily for visualization and perceptual processing, not as the container for numerically sensitive data.
Thread safety concern
One of the original advantages of CIImage was that it is thread-safe by design, and that was my biggest incentive.
For CVPixelBuffer / MTLTexture–backed data, I’m considering enforcing thread safety explicitly via:
Swift Concurrency (actor-owned data, explicit ownership)
Questions
For those who have experience with CV / AR / imaging-heavy iOS apps, I was hoping to learn the following:
Is this separation of image intent (visual vs numeric vs categorical) a reasonable architectural direction?
Do you generally keep CIImage at the heart of your pipeline, or push it to the edges (visualization only)?
How do you manage thread safety and ownership when working heavily with CVPixelBuffer and Metal? Using actor-based abstractions, GCD, or something ad hoc?
Are there any best practices or gotchas around using Core Image with depth maps or segmentation masks that I should be aware of?
I’d really appreciate any guidance or experience-based advice. I suspect I’ve hit a boundary of Core Image’s design, and I’m trying to refactor in a way that doesn't involve too much immediate tech debt and remains robust and maintainable long-term.
Thank you in advance!
Also submitted as feedback (ID: FB20612561).
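The "label maps" idea above can be sketched as an actor-owned integer buffer, assuming Swift Concurrency is used for ownership as described in the post. SegmentationLabelMap is an illustrative name; the point is that labels live in a one-component 8-bit CVPixelBuffer that Core Image never touches, and the actor serializes all access.

```swift
import CoreVideo

// Sketch: an actor-owned segmentation label map. Labels are stored in a raw
// 8-bit-per-pixel CVPixelBuffer, so no Core Image resampling or color
// management can alter the values, and the actor guarantees exclusive access.
actor SegmentationLabelMap {
    private let buffer: CVPixelBuffer

    init?(width: Int, height: Int) {
        var pb: CVPixelBuffer?
        let status = CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                                         kCVPixelFormatType_OneComponent8,
                                         nil, &pb)
        guard status == kCVReturnSuccess, let pb else { return nil }
        buffer = pb
    }

    func label(x: Int, y: Int) -> UInt8 {
        CVPixelBufferLockBaseAddress(buffer, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(buffer, .readOnly) }
        let base = CVPixelBufferGetBaseAddress(buffer)!
        let bytesPerRow = CVPixelBufferGetBytesPerRow(buffer)
        return base.load(fromByteOffset: y * bytesPerRow + x, as: UInt8.self)
    }
}
```

Whether this is the right boundary depends on how often labels are read from rendering code; the actor hop has a cost, so batch reads (e.g. copying a region at once) may be preferable to per-pixel queries.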
Tensorflow-metal fails on tensorflow versions above 2.18.1, but works fine on tensorflow 2.18.1
In a new python 3.12 virtual environment:
pip install tensorflow
pip install tensorflow-metal
python -c "import tensorflow as tf"
Prints error:
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "/Users//pt/venv/lib/python3.12/site-packages/tensorflow/__init__.py", line 438, in <module>
_ll.load_library(_plugin_dir)
File "/Users//pt/venv/lib/python3.12/site-packages/tensorflow/python/framework/load_library.py", line 151, in load_library
py_tf.TF_LoadLibrary(lib)
tensorflow.python.framework.errors_impl.NotFoundError: dlopen(/Users//pt/venv/lib/python3.12/site-packages/tensorflow-plugins/libmetal_plugin.dylib, 0x0006): Library not loaded: @rpath/_pywrap_tensorflow_internal.so
Referenced from: <8B62586B-B082-3113-93AB-FD766A9960AE> /Users//pt/venv/lib/python3.12/site-packages/tensorflow-plugins/libmetal_plugin.dylib
Reason: tried: '/Users//pt/venv/lib/python3.12/site-packages/tensorflow-plugins/../_solib_darwin_arm64/_U@local_Uconfig_Utf_S_S_C_Upywrap_Utensorflow_Uinternal___Uexternal_Slocal_Uconfig_Utf/_pywrap_tensorflow_internal.so' (no such file), '/Users//pt/venv/lib/python3.12/site-packages/tensorflow-plugins/../_solib_darwin_arm64/_U@local_Uconfig_Utf_S_S_C_Upywrap_Utensorflow_Uinternal___Uexternal_Slocal_Uconfig_Utf/_pywrap_tensorflow_internal.so' (no such file), '/opt/homebrew/lib/_pywrap_tensorflow_internal.so' (no such file), '/System/Volumes/Preboot/Cryptexes/OS/opt/homebrew/lib/_pywrap_tensorflow_internal.so' (no such file)
Topic:
Machine Learning & AI
SubTopic:
General
Tags:
Developer Tools
Metal
Machine Learning
tensorflow-metal
Environment:
macOS 26.2 (Tahoe)
Xcode 16.3
Apple Silicon (M4)
Sandboxed Mac App Store app
Description:
Repeated use of VNRecognizeTextRequest causes permanent memory growth in the host process. The physical footprint increases by approximately 3-15 MB per OCR call and never returns to baseline, even after all references to the request, handler, observations, and image are released.
```swift
private func selectAndProcessImage() {
let panel = NSOpenPanel()
panel.allowedContentTypes = [.image]
panel.allowsMultipleSelection = false
panel.canChooseDirectories = false
panel.message = "Select an image for OCR processing"
guard panel.runModal() == .OK, let url = panel.url else { return }
selectedImageURL = url
isProcessing = true
recognizedText = "Processing..."
// Run OCR on a background thread to keep UI responsive
let workItem = DispatchWorkItem {
let result = performOCR(on: url)
DispatchQueue.main.async {
recognizedText = result
isProcessing = false
}
}
DispatchQueue.global(qos: .userInitiated).async(execute: workItem)
}
private func performOCR(on url: URL) -> String {
// Wrap EVERYTHING in autoreleasepool so all ObjC objects are drained immediately
let resultText: String = autoreleasepool {
// Load image and convert to CVPixelBuffer for explicit memory control
guard let imageData = try? Data(contentsOf: url) else {
return "Error: Could not read image file."
}
guard let nsImage = NSImage(data: imageData) else {
return "Error: Could not create image from file data."
}
guard let cgImage = nsImage.cgImage(forProposedRect: nil, context: nil, hints: nil) else {
return "Error: Could not create CGImage."
}
let width = cgImage.width
let height = cgImage.height
// Create a CVPixelBuffer from the CGImage
var pixelBuffer: CVPixelBuffer?
let attrs: [String: Any] = [
kCVPixelBufferCGImageCompatibilityKey as String: true,
kCVPixelBufferCGBitmapContextCompatibilityKey as String: true
]
let status = CVPixelBufferCreate(
kCFAllocatorDefault,
width,
height,
kCVPixelFormatType_32ARGB,
attrs as CFDictionary,
&pixelBuffer
)
guard status == kCVReturnSuccess, let buffer = pixelBuffer else {
return "Error: Could not create CVPixelBuffer (status: \(status))."
}
// Draw the CGImage into the pixel buffer
CVPixelBufferLockBaseAddress(buffer, [])
guard let context = CGContext(
data: CVPixelBufferGetBaseAddress(buffer),
width: width,
height: height,
bitsPerComponent: 8,
bytesPerRow: CVPixelBufferGetBytesPerRow(buffer),
space: CGColorSpaceCreateDeviceRGB(),
bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue
) else {
CVPixelBufferUnlockBaseAddress(buffer, [])
return "Error: Could not create CGContext for pixel buffer."
}
context.draw(cgImage, in: CGRect(x: 0, y: 0, width: width, height: height))
CVPixelBufferUnlockBaseAddress(buffer, [])
// Run OCR
let requestHandler = VNImageRequestHandler(cvPixelBuffer: buffer, options: [:])
let request = VNRecognizeTextRequest()
request.recognitionLevel = .accurate
request.usesLanguageCorrection = true
do {
try requestHandler.perform([request])
} catch {
return "Error during OCR: \(error.localizedDescription)"
}
guard let observations = request.results, !observations.isEmpty else {
return "No text found in image."
}
let lines = observations.compactMap { observation in
observation.topCandidates(1).first?.string
}
// Explicitly nil out the pixel buffer before the pool drains
pixelBuffer = nil
return lines.joined(separator: "\n")
}
// Everything — Data, NSImage, CGImage, CVPixelBuffer, VN objects — released here
return resultText
}
```
I get the following error when running this command in a Jupyter notebook:
v = tf.Variable(initial_value=tf.random.normal(shape=(3, 1)))
v[0, 0].assign(3.)
Environment:
python == 3.11.14
tensorflow==2.19.1
tensorflow-metal==1.2.0
{
"name": "InvalidArgumentError",
"message": "Cannot assign a device for operation ResourceStridedSliceAssign: Could not satisfy explicit device specification '/job:localhost/replica:0/task:0/device:GPU:0' because no supported kernel for GPU devices is available.\nColocation Debug Info:\nColocation group had the following types and supported devices: \nRoot Member(assigned_device_name_index_=1 requested_device_name_='/job:localhost/replica:0/task:0/device:GPU:0' assigned_device_name_='/job:localhost/replica:0/task:0/device:GPU:0' resource_device_name_='/job:localhost/replica:0/task:0/device:GPU:0' supported_device_types_=[CPU] possible_devices_=[]\nResourceStridedSliceAssign: CPU \n_Arg: GPU CPU \n\nColocation members, user-requested devices, and framework assigned devices, if any:\n ref (_Arg) framework assigned device=/job:localhost/replica:0/task:0/device:GPU:0\n ResourceStridedSliceAssign (ResourceStridedSliceAssign) /job:localhost/replica:0/task:0/device:GPU:0\n\nOp: ResourceStridedSliceAssign\n
[...]
[[{{node ResourceStridedSliceAssign}}]] [Op:ResourceStridedSliceAssign] name: strided_slice/_assign"
}
It seems like the ResourceStridedSliceAssign operation is not implemented for the GPU.
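If that is the case, one workaround (a sketch on my part, not something confirmed by the plugin's documentation) is to pin the variable, and therefore the sliced assignment, to the CPU so TensorFlow uses the CPU kernel for that op:

```python
import tensorflow as tf

# Possible workaround: place the variable on the CPU so the
# ResourceStridedSliceAssign op runs on a device that supports it.
with tf.device('/CPU:0'):
    v = tf.Variable(initial_value=tf.random.normal(shape=(3, 1)))
    v[0, 0].assign(3.)

print(float(v[0, 0]))  # → 3.0
```

Enabling soft device placement with `tf.config.set_soft_device_placement(True)` before creating the variable may also let TensorFlow fall back to the CPU automatically instead of raising the error.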
Hello,
I am interested in using jax-metal to train ML models using Apple Silicon. I understand this is experimental.
After installing jax-metal according to https://developer.apple.com/metal/jax/, my python code fails with the following error
JaxRuntimeError: UNKNOWN: -:0:0: error: unknown attribute code: 22
-:0:0: note: in bytecode version 6 produced by: StableHLO_v1.12.1
My issue is identical to the one reported here https://github.com/jax-ml/jax/issues/26968#issuecomment-2733120325, and is fixed by pinning to jax-metal 0.1.1, jax 0.5.0, and jaxlib 0.5.0.
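For reference, that pin can be applied in one step (versions exactly as listed above):

```shell
pip install "jax-metal==0.1.1" "jax==0.5.0" "jaxlib==0.5.0"
```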
Thank you!
After watching the "What's new in App Intents" session, I'm attempting to create an intent conforming to URLRepresentableIntent. The video states that as long as my AppEntity conforms to URLRepresentableEntity, I should not have to provide a perform method. My application will be launched automatically and passed the appropriate URL.
This seems to work in that my application is launched and is passed a URL, but the URL is in the form: FeatureEntity/{id}.
Am I missing something, or is there a trick that enables it to pass along the URL specified in the AppEntity itself?
struct MyExampleIntent: OpenIntent, URLRepresentableIntent {
static let title: LocalizedStringResource = "Open Feature"
static var parameterSummary: some ParameterSummary {
Summary("Open \(\.$target)")
}
@Parameter(title: "My feature", description: "The feature to open.")
var target: FeatureEntity
}
struct FeatureEntity: AppEntity {
// ...
}
extension FeatureEntity: URLRepresentableEntity {
static var urlRepresentation: URLRepresentation {
"https://myurl.com/\(.id)"
}
}
Hi everyone,
I’m exploring ideas around on-device analysis of user typing behavior on iPhone, and I’d love input from others who’ve worked in this area or thought about similar problems.
Conceptually, I’m interested in things like:
High-level sentiment or tone inferred from what a user types over time using ML-models
Identifying a user’s most important or frequent topics over a recent window (e.g., “last week”)
Aggregated insights rather than raw text (privacy-preserving summaries, e.g., your typo rate by hour to infer highly efficient time slots, or a "take-a-break" warning when typing errors increase)
I understand the significant privacy restrictions around keyboard input on iOS, especially for third-party keyboards and system text fields. I’m not trying to bypass those constraints—rather, I’m curious about what’s realistically possible within Apple’s frameworks and policies. (For instance, Grammarly as a correction tool includes some information about tone)
Questions I’m thinking through:
Are there any recommended approaches for on-device text analysis that don’t rely on capturing raw keystrokes?
Has anyone used NLP / Core ML / Natural Language successfully for similar summarization or sentiment tasks, scoped only to user-explicit input?
For custom keyboards, what kinds of derived or transient signals (if any) are acceptable to process and summarize locally?
Any design patterns that balance usefulness with Apple’s privacy expectations?
If you’ve built something adjacent—journaling, writing analytics, well-being apps, etc.—I’d appreciate hearing what worked, what didn’t, and what Apple reviewers were comfortable with.
Thanks in advance for any ideas or references 🙏
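For the sentiment piece specifically, the Natural Language framework can score explicit user-provided text entirely on-device; here is a minimal sketch (the sample string is made up):

```swift
import NaturalLanguage

// Score a paragraph of user-provided text on-device.
// The .sentimentScore scheme yields a value in -1.0...1.0 as a string tag.
let text = "Had a great writing session this morning, very focused."
let tagger = NLTagger(tagSchemes: [.sentimentScore])
tagger.string = text
let (tag, _) = tagger.tag(at: text.startIndex,
                          unit: .paragraph,
                          scheme: .sentimentScore)
print(tag?.rawValue ?? "no score")
```

Because the scoring never leaves the device, aggregating these values into hourly or weekly summaries (and discarding the raw text) seems like the kind of design that fits the privacy constraints you describe.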
I'm using python 3.9.6, tensorflow 2.20.0, tensorflow-metal 1.2.0, and when I try to run
import tensorflow as tf
It gives
Traceback (most recent call last):
File "/Users/haoduoyu/Code/demo.py", line 1, in <module>
import tensorflow as tf
File "/Users/haoduoyu/Code/test/lib/python3.9/site-packages/tensorflow/__init__.py", line 438, in <module>
_ll.load_library(_plugin_dir)
File "/Users/haoduoyu/Code/test/lib/python3.9/site-packages/tensorflow/python/framework/load_library.py", line 151, in load_library
py_tf.TF_LoadLibrary(lib)
tensorflow.python.framework.errors_impl.NotFoundError: dlopen(/Users/haoduoyu/Code/test/lib/python3.9/site-packages/tensorflow-plugins/libmetal_plugin.dylib, 0x0006): Library not loaded: @rpath/_pywrap_tensorflow_internal.so
Referenced from: <8B62586B-B082-3113-93AB-FD766A9960AE> /Users/haoduoyu/Code/test/lib/python3.9/site-packages/tensorflow-plugins/libmetal_plugin.dylib
Reason: tried: '/Users/haoduoyu/Code/test/lib/python3.9/site-packages/tensorflow-plugins/../_solib_darwin_arm64/_U@local_Uconfig_Utf_S_S_C_Upywrap_Utensorflow_Uinternal___Uexternal_Slocal_Uconfig_Utf/_pywrap_tensorflow_internal.so' (no such file), '/Users/haoduoyu/Code/test/lib/python3.9/site-packages/tensorflow-plugins/../_solib_darwin_arm64/_U@local_Uconfig_Utf_S_S_C_Upywrap_Utensorflow_Uinternal___Uexternal_Slocal_Uconfig_Utf/_pywrap_tensorflow_internal.so' (no such file)
Once I uninstall tensorflow-metal, the error goes away. How can I fix this problem?
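Per the report earlier on this page, the plugin reportedly still works with TensorFlow pinned back to 2.18.1, so one thing to try (an assumption on my part that this combination also works with your Python version) is:

```shell
pip uninstall -y tensorflow tensorflow-metal
pip install "tensorflow==2.18.1" tensorflow-metal
```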
I have a series of shortcuts that I’ve written that use the “Use Model” action to do various things. For example, I have a shortcut “Clipboard Markdown to Notes” that takes the content of the clipboard, creates a new note in Notes, converts the markdown content to rich text, adds it to the note etc.
One key step is to analyze the markdown content with “Use Model” and generate a short descriptive title for the note.
I use the on-device model for this, but sometimes the content and prompt exceed the context window size and the action fails with an error message to that effect.
In that case, I’d like to either repeat the action using the Cloud model, or, if the error was a refusal, to prompt the user to enter a title to use.
I‘ve tried using an IF based on whether the response had any text in it, but that didn’t work. No matter what I’ve tried, I can’t seem to find a way to catch the error from Use Model, determine what the error was, and take appropriate action.
Is there a way to do this?
(And by the way, a huge ”thank you” to whoever had the idea of making AppIntents visible in Shortcuts and adding the Use Model action — has made a huge difference already, and it lets us see what Siri will be able to use as well.)
It is vital for Apple to refine its OCR models to correctly distinguish between Khmer and Thai scripts. Incorrectly labeling Khmer text as Thai is more than a technical bug; it is a culturally insensitive error that impacts national identity, especially given the current geopolitical climate between Cambodia and Thailand. Implementing a more robust language-detection threshold would prevent these harmful misidentifications.
There is a significant logic flaw in the VNRecognizeTextRequest language detection when processing Khmer script. When the property automaticallyDetectsLanguage is set to true, the Vision framework frequently misidentifies Khmer characters as Thai.
While both scripts share historical roots, they are distinct languages with different alphabets. Currently, the model’s confidence threshold for distinguishing between these two scripts is too low, leading to incorrect OCR output in both developer-facing APIs and Apple’s native ecosystem (Preview, Live Text, and Photos).
import SwiftUI
import Vision
class TextExtractor {
func extractText(from data: Data, completion: @escaping (String) -> Void) {
let request = VNRecognizeTextRequest { (request, error) in
guard let observations = request.results as? [VNRecognizedTextObservation] else {
completion("No text found.")
return
}
let recognizedStrings = observations.compactMap { observation -> String? in
guard let str = observation.topCandidates(1).first?.string else { return nil }
return "{text: \(str), confidence: \(observation.confidence)}"
}
completion(recognizedStrings.joined(separator: "\n"))
}
request.automaticallyDetectsLanguage = true // <-- This is the issue.
request.recognitionLevel = .accurate
let handler = VNImageRequestHandler(data: data, options: [:])
DispatchQueue.global(qos: .background).async {
do {
try handler.perform([request])
} catch {
completion("Failed to perform OCR: \(error.localizedDescription)")
}
}
}
}
Recognizing Khmer
Confidence score is low for Khmer text (the output is in Thai, with a low confidence score).
Recognizing English
Confidence score is high, as expected.
Recognizing Thai
Confidence score is high, as expected.
Issues on Preview, Photos
Khmer text
Copied text
Kouk Pring Chroum Temple [19121 รอาสายสุกตีนานยารรีสใหิสรราภูชิตีนนสุฐตีย์ [รุก
เผือชิษาธอยกัตธ์ตายตราพาษชาณา ถวเชยาใบสราเบรถทีมูสินตราพาษชาณา ทีมูโษา เช็ก
อาษเชิษฐอารายสุกบดตพรธุรฯ ตากร"สุก"ผาตากรธกรธุกเยากสเผาพศฐตาสาย รัอรณาษ"ตีพย"
สเผาพกรกฐาภูชิสาเครๆผู:สุกรตีพาสเผาพสรอสายใผิตรรารตีพสๆ เดียอลายสุกตีน
ธาราชรติ ธิพรหณาะพูชุบละเาหLunet De Lajonquiere ผารูกรสาราพารผรผาสิตภพ ตารสิทูก ธิพิ
คุณที่นสายเระพบพเคเผาหนารเกะทรนภาษเราภุพเสารเราษทีเลิกสญาเราหรุฬารชสเกาก เรากุม
สงสอบานตรเราะากกต่ายภากายระตารุกเตียน
Recommended Solutions
1. Set a Threshold
Filter out detected results whose confidence is less than or equal to 0.5, so that low-quality text is not emitted and cannot cause this issue.
For example,
let recognizedStrings = observations.compactMap { observation -> String? in
guard observation.confidence > 0.5,
let str = observation.topCandidates(1).first?.string else {
return nil
}
return "{text: \(str), confidence: \(observation.confidence)}"
}
2. Add Khmer Language Support
This issue would not happen if the model had the capability to detect and recognize images containing Khmer text.
Doc2Text GitHub: https://github.com/seanghay/Doc2Text-Swift
Hi everyone,
I'm experiencing an inconsistent behavior with the Translation framework on iOS 18. The LanguageAvailability.status() API reports language models as .installed, but translation fails with Code 16.
Setup:
Using translationTask modifier with TranslationSession
Batch translation with explicit source/target languages
Languages: Portuguese→English, German→English
Issue:
let status = await LanguageAvailability().status(from: sourceLang, to: targetLang) // Returns: .installed
// But translation fails:
let responses = try await session.translations(from: requests)
// Error: TranslationErrorDomain Code=16 "Offline models not available"
Logs:
Language model installed: pt -> en
Language model installed: de -> en
Starting translation: de -> en
Error Domain=TranslationErrorDomain Code=16 "Translation failed" NSLocalizedFailureReason=Offline models not available for language pair
What I've tried:
Re-downloading languages in Settings
Using source: nil for auto-detection
Fresh TranslationSession.Configuration each time
Questions:
Is there a way to force model re-validation/re-download programmatically?
Should translationTask show download popup when Code 16 occurs?
Has anyone found a reliable workaround?
I've seen similar reports in threads 791357 and 777113. Any guidance appreciated!
Thanks!
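One pattern worth trying, sketched under the assumption that your session is created with the translationTask modifier on iOS 18: calling TranslationSession's prepareTranslation() before the batch, which asks the system to validate the models and present the download UI if they are missing:

```swift
import Translation

// Possible workaround: ensure the language models are actually ready
// (prompting the user to download them if needed) before translating.
func translateWithPreparation(
    _ requests: [TranslationSession.Request],
    using session: TranslationSession
) async throws -> [TranslationSession.Response] {
    try await session.prepareTranslation()  // may show the download sheet
    return try await session.translations(from: requests)
}
```

If prepareTranslation() succeeds but translations(from:) still throws Code 16, that would point at a model-validation bug rather than app-side usage, and a feedback report with a sysdiagnose would be the next step.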
Environment
macOS 26
Xcode Version 26.0 beta 7 (17A5305k)
Simulator: iPhone 16 Pro
iOS 26
Problem
NLContextualEmbedding.load() fails with the following error
In simulator
Failed to load embedding from MIL representation: filesystem error: in create_directories: Permission denied ["/var/db/com.apple.naturallanguaged/com.apple.e5rt.e5bundlecache"]
filesystem error: in create_directories: Permission denied ["/var/db/com.apple.naturallanguaged/com.apple.e5rt.e5bundlecache"]
Failed to load embedding model 'mul_Latn' - '5C45D94E-BAB4-4927-94B6-8B5745C46289'
assetRequestFailed(Optional(Error Domain=NLNaturalLanguageErrorDomain Code=7 "Embedding model requires compilation" UserInfo={NSLocalizedDescription=Embedding model requires compilation}))
This happens in a #Playground.
I'm new to this embedding model, and I'm not sure whether the failure is caused by my code or my environment.
Code snippet
import Foundation
import NaturalLanguage
import Playgrounds
#Playground {
// Prefer initializing by script for broader coverage; returns NLContextualEmbedding?
guard let embeddingModel = NLContextualEmbedding(script: .latin) else {
print("Failed to create NLContextualEmbedding")
return
}
print(embeddingModel.hasAvailableAssets)
do {
try embeddingModel.load()
print("Model loaded")
} catch {
print("Failed to load model: \(error)")
}
}
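Since the failure looks like an asset download/compilation problem rather than an obvious code bug, one thing that may be worth trying (an assumption on my part) is requesting the assets explicitly before calling load(), so the system gets a chance to fetch and compile them first:

```swift
import NaturalLanguage

// Possible pre-step: request the embedding assets explicitly so the
// system can download/compile them before load() is attempted.
if let model = NLContextualEmbedding(script: .latin) {
    model.requestAssets { result, error in
        switch result {
        case .available:
            do {
                try model.load()
                print("Model loaded")
            } catch {
                print("Load failed: \(error)")
            }
        default:
            print("Assets not available: \(String(describing: error))")
        }
    }
}
```

Note that the "Permission denied" on /var/db/com.apple.naturallanguaged suggests a sandbox/simulator environment issue, in which case testing on a physical device may behave differently.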