Sound and Haptics


Create meaningful and delightful experiences that engage a wider range of senses with sound and haptic design.

Posts under Sound and Haptics tag

6 Posts

Post

Replies

Boosts

Views

Activity

macOS 26 – NSSound/CoreAudio causes SIGILL crash in caulk allocator
Hi everyone,

We are the engineering team behind an enterprise communications application for macOS. We are experiencing a critical crash on macOS 26 that did not occur on any previous macOS version, and we are seeking clarification from Apple engineers or anyone who may have insight into this behaviour.

Environment

Architecture: x86_64
macOS: 26.4.1 (25E253)
Hardware: Mac15,13 (MacBook Pro)
Exception: SIGILL / ILL_ILLOPC
Crashed Thread: Thread 0 (Main Thread)
Trigger: Playing a notification sound via NSSound during an incoming call

Crash Stack

0  caulk             consolidating_free_map::maybe_create_free_node + 119  ← SIGILL
1  caulk             tiered_allocator + 1469
2  caulk             exported_resource::do_allocate + 15
3  AudioToolboxCore  EABLImpl::create + 204
4  CoreAudio         AUNotQuiteSoSimpleTimeFactory + 33267
8  AudioToolboxCore  AudioUnitInitialize + 189
9  AudioToolbox      XAudioUnit::Initialize + 19
10 AudioToolbox      MESubmixGraph::initialize + 125
11 AudioToolbox      MESubmixGraph::connectInputChannel + 1172
12 AudioToolbox      MEDeviceStreamClient::AddRunningClient + 509
15 AudioToolbox      AudioQueueObject::StartRunning + 194
16 AudioToolbox      AudioQueueObject::Start + 1447
22 AudioToolbox      AQ::API::V2Impl::AudioQueueStartWithFlags + 805
23 AVFAudio          AVAudioPlayerCpp::playQueue + 354
24 AVFAudio          AVAudioPlayerCpp::DoAction + 134
25 AVFAudio          -[AVAudioPlayer play] + 26
26 AppKit            -[NSSound play] + 100
27 Our App           -[AudioHelper tryToStartSound:ofType:] + 569
28 Our App           block_invoke + 59

Behaviour Difference Between macOS Versions

The exact same code path that triggers this crash on macOS 26 works without any issue on macOS 14 and macOS 15: no crash, no warning, no log output of any kind. The crash occurs inside Apple's private caulk memory allocator during CoreAudio audio engine initialisation, triggered by a call to [NSSound play]. The SIGILL / ILL_ILLOPC at maybe_create_free_node + 119 suggests a hard ud2 trap, i.e. an intentional abort guard inserted at compile time.
This strongly suggests that something changed in macOS 26 within NSSound / CoreAudio / caulk that causes this code path to fail in a way it previously did not.

Questions

We have the following specific questions:

1. Was there a deliberate threading policy change in NSSound / CoreAudio in macOS 26?
2. Is the SIGILL in caulk::consolidating_free_map::maybe_create_free_node an intentional thread-affinity assertion introduced in macOS 26?
3. Are there any other NSSound / AVAudioPlayer / AudioQueue APIs that have similarly tightened their requirements in macOS 26 that we should be aware of?
4. Is there a migration guide, release note, or WWDC session covering CoreAudio changes in macOS 26 that we may have missed?
5. Has anyone else in the developer community encountered a similar SIGILL crash in caulk on macOS 26 during audio playback?
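Until the behaviour is clarified, one defensive experiment is to guarantee that the play call always executes on the main thread. This is only a sketch under the assumption (the same one raised in question 2 above) that the trap is a thread-affinity assertion; the helper name and sound name are ours, not from the post, and this is not a confirmed fix.

```swift
import Foundation

// Hypothetical helper: run a block on the main thread, synchronously
// if we are already there, otherwise by dispatching to the main queue.
func onMainThread(_ work: @escaping () -> Void) {
    if Thread.isMainThread {
        work()
    } else {
        DispatchQueue.main.async(execute: work)
    }
}

// In the post's scenario the call site would look roughly like:
// onMainThread { NSSound(named: "IncomingCall")?.play() }
```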
3
0
553
1d
hapticpatternlibrary.plist error with Text entry fields in Simulator only
When I have a TextField or TextEditor, tapping into it produces these two console entries about 18 times each:

CHHapticPattern.mm:487 +[CHHapticPattern patternForKey:error:]: Failed to read pattern library data: Error Domain=NSCocoaErrorDomain Code=260 "The file “hapticpatternlibrary.plist” couldn’t be opened because there is no such file." UserInfo={NSFilePath=/Library/Audio/Tunings/Generic/Haptics/Library/hapticpatternlibrary.plist, NSURL=file:///Library/Audio/Tunings/Generic/Haptics/Library/hapticpatternlibrary.plist, NSUnderlyingError=0x600000ca1b30 {Error Domain=NSPOSIXErrorDomain Code=2 "No such file or directory"}}

<_UIKBFeedbackGenerator: 0x600003505290>: Error creating CHHapticPattern: Error Domain=NSCocoaErrorDomain Code=260 "The file “hapticpatternlibrary.plist” couldn’t be opened because there is no such file." UserInfo={NSFilePath=/Library/Audio/Tunings/Generic/Haptics/Library/hapticpatternlibrary.plist, NSURL=file:///Library/Audio/Tunings/Generic/Haptics/Library/hapticpatternlibrary.plist, NSUnderlyingError=0x600000ca1b30 {Error Domain=NSPOSIXErrorDomain Code=2 "No such file or directory"}}

My app does not use haptics. This doesn't appear to cause any issues, although entering text can feel a bit sluggish (even on device); I am unable to determine whether that is related. Nonetheless, it definitely is a lot of log noise.

Code to reproduce in the Simulator (Xcode 26.2; iOS 26 or 18, with iPhone 16 Pro or iPhone 17 Pro):

import SwiftUI

struct ContentView: View {
    @State private var textEntered: String = ""
    @State private var textEntered2: String = ""
    @State private var textEntered3: String = ""

    var body: some View {
        VStack {
            Spacer()
            TextField("Tap Here", text: $textEntered)
            TextField("Tap Here Too", text: $textEntered2)
            TextEditor(text: $textEntered3)
                .overlay(RoundedRectangle(cornerRadius: 8).strokeBorder(.primary, lineWidth: 1))
                .frame(height: 100)
            Spacer()
        }
    }
}

#Preview {
    ContentView()
}

Tapping back and forth in these fields generates the errors each time.
Thanks, Steve
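Note the logs above are emitted by UIKit's internal keyboard feedback generator, which app code cannot gate. For completeness, an app that creates its own haptics can avoid touching the (missing) Simulator pattern library with the documented capability check; the function below is an illustrative sketch, not something from the original report.

```swift
import CoreHaptics

// Only build a haptic engine when the hardware actually supports
// haptics. On the Simulator supportsHaptics is false, so the pattern
// library at /Library/Audio/Tunings/... is never loaded.
func makeHapticEngine() -> CHHapticEngine? {
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else {
        return nil  // no haptic hardware (e.g. Simulator): skip entirely
    }
    return try? CHHapticEngine()
}
```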
4
0
973
Mar ’26
Hybrid Wired-to-Wireless Audio Mode Using AirPods Charging Case
Many Apple users own both Bluetooth earphones (AirPods) and traditional wired earphones. While Bluetooth audio provides freedom of movement, some users still prefer wired earphones for comfort, sound profile, or personal preference. However, plugging wired earphones directly into an iPhone can feel restrictive and inconvenient during daily use. This proposal suggests a hybrid audio approach where wired earphones can be connected to a Bluetooth-enabled AirPods charging case (or a similar Apple-designed module), allowing users to enjoy wired earphones without a physical connection to the iPhone.

#Problem Statement
* Wired earphones offer consistent audio quality and zero latency
* Bluetooth earphones provide freedom from cables
* Users must currently choose one or the other
* Plugging wired earphones into an iPhone limits movement and can feel intrusive in daily scenarios (walking, commuting, working)

There is no native Apple solution that allows wired earphones to function wirelessly while maintaining Apple’s audio experience standards.

#Proposed Solution
Introduce a Wired-to-Wireless Audio Mode through the AirPods charging case or a dedicated Apple Bluetooth audio bridge.

How it works:
1. The user plugs wired earphones into the AirPods case (or a future AirPods accessory port)
2. The case acts as a Bluetooth audio transmitter
3. Audio is streamed wirelessly from the iPhone to the case
4. The case outputs audio to the wired earphones

#User Experience
* No cable connected to the iPhone
* Familiar wired earphone sound
* Freedom of movement similar to Bluetooth earbuds

UX flow:
1. Plug wired earphones into the AirPods case
2. iPhone automatically detects: “Wired Earphones via AirPods Case”
3. Seamless pairing using the existing AirPods framework
4. Audio controls, volume, and switching handled through iOS
5. No additional apps required

#Key Benefits
* Combines wired sound reliability with wireless convenience
* Reduces physical cable disturbance during use
* Extends the usefulness of existing wired earphones
* Minimal learning curve for users
* Fits naturally into Apple’s ecosystem and design philosophy

#Privacy & Performance Considerations
* On-device audio processing only
* No cloud involvement
* Low-latency audio using Apple’s proprietary Bluetooth codecs
* Power-efficient usage leveraging the AirPods case battery

#Target Users
* Users who prefer wired earphones but want wireless freedom
* Commuters and walkers
* Developers and professionals who multitask
* Users sensitive to Bluetooth earbud fit or comfort

#Ecosystem Fit
* Builds on the existing AirPods pairing and audio stack
* Aligns with Apple’s focus on seamless UX
* Could be implemented via new AirPods hardware, a firmware update plus accessory, or a dedicated Apple audio bridge
1
0
326
Jan ’26
Apple Pencil Pro Haptic Initiation
Hello,

The Apple Pencil Pro brought with it the UICanvasFeedbackGenerator API, which lets us trigger haptic feedback on discrete events initiated by the pencil. That works fine. My question, then: is it possible, or are we "allowed", to trigger haptic feedback on events that weren't initiated by the pencil? For example, say the user is dragging a slider with a finger of their left hand while holding the pencil in their right hand: would it be possible to make the pencil vibrate to indicate that the dragged slider knob reached a certain point? Or is the rule that vibration is only possible/allowed when the pencil itself generated a touch? Thanks!
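For reference, the documented flow looks roughly like the sketch below (the view and call sites are hypothetical, not from the post). As we read the docs, the app only reports the semantic event; the system then decides where the feedback is played (pencil, device, or nowhere) based on how the event was generated, which is why finger-driven events may not reach the pencil.

```swift
import UIKit

// Attach the generator to the view the pencil interacts with, then
// report discrete events; the system routes the resulting haptic.
final class CanvasHaptics {
    private let generator: UICanvasFeedbackGenerator

    init(canvasView: UIView) {
        generator = UICanvasFeedbackGenerator(view: canvasView)
    }

    // Call when an object snaps into alignment at `location`
    // (in the canvas view's coordinate space).
    func snapped(at location: CGPoint) {
        generator.alignmentOccurred(at: location)
    }
}
```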
3
0
271
Sep ’25
Crackling speakers on MacBook M1 Pro
My MacBook's speakers have started crackling on every sound since macOS 26 Beta 1, and the problem is still present on Beta 9. It happens especially when Simulator is open.
1
0
269
Jan ’26
Code snippet
Part of the .swift file that controls the haptic buzz for each heart beat. However, there is a 3-5 second delay between the actual heart beat and the watch haptic/buzz for that beat.
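Without the snippet itself it is hard to diagnose, but one common cause is that heart-rate samples are delivered in delayed batches rather than per beat. The sketch below (all names are ours, assuming HealthKit delivery on watchOS) buzzes in the query's updateHandler, i.e. as soon as a sample arrives; note the sensor itself reports at multi-second intervals outside a workout session, which alone can explain a 3-5 second gap.

```swift
import HealthKit
import WatchKit

// Play a haptic as soon as each heart-rate delivery arrives.
func startHeartRateBuzz(store: HKHealthStore) {
    guard let heartRate = HKQuantityType.quantityType(forIdentifier: .heartRate) else { return }

    let buzz: (HKAnchoredObjectQuery, [HKSample]?, [HKDeletedObject]?, HKQueryAnchor?, Error?) -> Void = { _, samples, _, _, _ in
        guard let samples, !samples.isEmpty else { return }
        DispatchQueue.main.async {
            WKInterfaceDevice.current().play(.click)  // one buzz per delivery
        }
    }

    let query = HKAnchoredObjectQuery(type: heartRate,
                                      predicate: nil,
                                      anchor: nil,
                                      limit: HKObjectQueryNoLimit,
                                      resultsHandler: buzz)
    query.updateHandler = buzz  // fires for every subsequent batch
    store.execute(query)
}
```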
0
0
123
Nov ’25