My MacBook speakers have started crackling on every sound since macOS 26 Beta 1, and the problem is still present in Beta 9.
It happens especially when Simulator is open.
Hello,
I had submitted a question to clarify which components have accessibility APIs that trigger haptics for VoiceOver users: https://developer.apple.com/forums/thread/773182.
That question stems from a more direct one about specific components: do tablists and disclosures natively include haptics, screen reader hints, or other states or properties to indicate to screen reader users where the component begins or ends?
In some web experiences, screen reader hint text states "end of..." or "entering..." as a way to mark the boundaries of these inline dialogs.
I asked about haptics in the prior thread because I do not recall a natively implemented version of this beyond some haptic cues, and I have not experienced those consistently, so I am not sure whether they are an intended native Swift implementation or perhaps something custom.
Topic:
Accessibility & Inclusion
SubTopic:
General
Tags:
iOS
Accessibility
Sound and Haptics
Core Haptics
Is there a way to configure the APNS notification sound volume to be louder?
I am implementing custom sounds (narrative sentences) for APNS. The custom sound does play, but it is not loud enough, even though I have set the device volume and the "Ringtone and Alerts" volume to maximum.
I tried amplifying the custom sound file; it does play louder, but the gain is minimal if I want to keep the sound from distorting.
I also tried using a Notification Service Extension with AVAudioPlayer and AVAudioSession to play the sound. It plays louder at maximum volume compared with relying on the default sound payload in APNS, but AVAudioPlayer and AVAudioSession do not seem to be usable when the application is in the background or killed. Is there any other alternative I could use?
I want to implement a feature where a custom notification sound file is downloaded from the server when the app is first launched and stored locally on the device. When a push notification arrives, the stored sound should be played in all app states: foreground, background, and terminated (killed).
Does anyone have an idea how to implement this in iOS? Specifically, I am looking for guidance on the following (a rough sketch follows the list):
1) Downloading and storing the sound file securely on the device.
2) Using the locally stored file for push notification sounds.
3) Ensuring the sound plays correctly in all states, including when the app is not running.
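One approach, to my understanding (worth verifying against the UNNotificationSound documentation): the system itself plays notification sounds that live either in the app bundle or in the Library/Sounds folder of the app's container, even when the app is killed, so the app only needs to put the file there and the payload just names it. A minimal sketch, with a placeholder URL and file name:

import Foundation

// Download a sound file once (e.g. at first launch) and store it in
// Library/Sounds, where the system looks for notification sounds.
// remoteURL and fileName are placeholders for illustration.
func storeNotificationSound(from remoteURL: URL, named fileName: String) {
    URLSession.shared.downloadTask(with: remoteURL) { tempURL, _, error in
        guard let tempURL, error == nil else { return }
        do {
            let library = try FileManager.default.url(for: .libraryDirectory,
                                                      in: .userDomainMask,
                                                      appropriateFor: nil,
                                                      create: true)
            let soundsDir = library.appendingPathComponent("Sounds", isDirectory: true)
            try FileManager.default.createDirectory(at: soundsDir,
                                                    withIntermediateDirectories: true)
            let destination = soundsDir.appendingPathComponent(fileName)
            try? FileManager.default.removeItem(at: destination)
            try FileManager.default.moveItem(at: tempURL, to: destination)
        } catch {
            print("Failed to store notification sound: \(error)")
        }
    }.resume()
}

The APNS payload would then reference the stored file by name, e.g. "sound": "narration.caf", and the system plays it in any app state. Note the documented 30-second limit on notification sounds.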
Topic:
Community
SubTopic:
Apple Developers
Tags:
APNS
User Notifications
Sound and Haptics
Notification Center
I've built an Apple Watch app in Xcode with two Swift files (ContentView.swift and InterfaceController.swift). The sound and haptic feedback code is in InterfaceController.swift, but after building, no sound plays on the Apple Watch when the Digital Crown is rotated: the haptic feedback fires, but the sound does not.
Code:
import WatchKit
import AVFoundation

class InterfaceController: WKInterfaceController {
    // ... your UI elements

    // Keep a strong reference: a local AVAudioPlayer is deallocated as soon
    // as the method returns, which silences playback.
    private var audioPlayer: AVAudioPlayer?

    func playSelectionHapticAndSound() {
        // Play a haptic feedback pattern
        WKInterfaceDevice.current().play(.success)

        // Load and play a selection sound effect
        guard let soundURL = Bundle.main.url(forResource: "spin", withExtension: "wav") else { return }
        do {
            audioPlayer = try AVAudioPlayer(contentsOf: soundURL)
            audioPlayer?.play()
        } catch {
            print("Error playing sound: \(error)")
        }
    }
}
Hello, my submission is based on haptics; without them the app doesn't make sense, and only a real iPhone can provide them. But it says that Xcode playgrounds will be tested on the Simulator.
Is that indeed the case? What can I do?
Thank you in advance!
I would love to see a feature where the iPhone vibrates when a call is answered by the other person. This would be helpful for users who are walking or holding their phone away from their ear while waiting for the call to connect. The vibration would provide a subtle yet effective notification that the call has been picked up, reducing unnecessary screen-checking.
This feature could be optional and toggled in settings under Sounds & Haptics > Call Feedback for users who prefer it.
I want to understand which component types are intended to have associated hint text, haptic feedback, or an earcon for VoiceOver screen reader users. Is there a list somewhere, or a HIG guideline for which transition types should have a sound?
Some transitions in Apple apps generally include different beep sounds, such as:
opening a new screen
screen dimming
when a VoiceOver user swipes from the header/navbar to the body
a scraping sound when swiping up or down a page
the beginning or end of the body section
in Calculator, when swiping from one row to the next
opening a pop-up menu
I would also appreciate any direction on which code strings are associated with these sounds, and how custom components can adopt these sounds, haptics, or hints where they are expected (a small example of what I mean is below). On the other hand, I don't want to take that information and dictate that every component needs a specific beep type, since these sounds appear to be used for specific purposes.
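For context, this is the kind of custom-component annotation I'm experimenting with: a minimal SwiftUI sketch where the component, labels, and hint strings are my own placeholders, not anything from Apple guidance:

import SwiftUI
import UIKit

// Hypothetical custom disclosure; all strings here are placeholders.
struct CustomDisclosure: View {
    @State private var isExpanded = false

    var body: some View {
        VStack(alignment: .leading) {
            Button(isExpanded ? "Hide details" : "Show details") {
                isExpanded.toggle()
                // One option for a spoken boundary cue, similar to the
                // web "entering / end of" pattern:
                UIAccessibility.post(
                    notification: .announcement,
                    argument: isExpanded ? "Entering details section" : "End of details section")
            }
            .accessibilityHint("Expands or collapses the details section")

            if isExpanded {
                Text("Details go here")
            }
        }
    }
}

What I can't find documented is when a system earcon or haptic would fire automatically instead, which is the core of my question.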
I checked the latest release notes for the latest beta, and there doesn't seem to be a fix for this. Basically, the vibrations you receive when you long-press a message to react, or hold down on an app on the Home Screen, seem to stop working after a while.
This issue recurs randomly.
Steps to repro:
Not fully sure on this, but I'm on an iPhone 16 Pro Max running the iOS 18.3 dev beta described in the title. I have the default haptics enabled, in which you receive a vibration when you long-press a message in iMessage or Messenger, and also when you long-press an app on the Home Screen.
These seem to stop working (along with all other vibrations apart from calls and notifications) after a while. The only workaround is to restart the iPhone entirely.
Is anyone else facing the same issue?
Our watch app, Regatta Timer, is a specialised countdown timer for sailing competitions. It is crucial that the beeps & haptics continue when 'wrist down' on always-on displays. We tried to enable this by adding a 'background mode', but that only works in the Xcode Apple Watch simulator, not on an actual device with an always-on display. Any idea how we can get this working on the Apple Watch device as well?
In ContentView.swift we currently added this code:
WKInterfaceDevice.current().play(sound)
but that doesn't work, regardless of whether we guard it with phase == .active or not.
STEPS TO REPRODUCE
Install on an ACTUAL DEVICE with an always-on display.
Start the countdown timer: beeps & haptics are OK (each minute, ...).
Do 'wrist down': the countdown timer continues on the dimmed display, but the sounds & haptics stop working until you raise your wrist to wake the display.
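One avenue we're considering, though we haven't verified it on an actual device yet: an extended runtime session (WKExtendedRuntimeSession), which as we understand it is designed to keep a session alive while the wrist is down. The session type is declared under Background Modes in Signing & Capabilities, and notifyUser(hapticType:repeatHandler:) is documented for alerting the user during a session. A minimal sketch:

import WatchKit

// Sketch: keep the countdown alive wrist-down via an extended runtime session.
final class CountdownSessionController: NSObject, WKExtendedRuntimeSessionDelegate {
    private var session: WKExtendedRuntimeSession?

    func beginCountdown() {
        let newSession = WKExtendedRuntimeSession()
        newSession.delegate = self
        newSession.start()            // or start(at:) to schedule the start time
        session = newSession
    }

    func signalMinuteMark() {
        // Alert the user while the session is running.
        session?.notifyUser(hapticType: .notification)
    }

    // MARK: WKExtendedRuntimeSessionDelegate
    func extendedRuntimeSessionDidStart(_ extendedRuntimeSession: WKExtendedRuntimeSession) {}
    func extendedRuntimeSessionWillExpire(_ extendedRuntimeSession: WKExtendedRuntimeSession) {}
    func extendedRuntimeSession(_ extendedRuntimeSession: WKExtendedRuntimeSession,
                                didInvalidateWith reason: WKExtendedRuntimeSessionInvalidationReason,
                                error: Error?) {}
}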
I updated to developer beta 18.3 on my iPad (7th generation). After the update, sound isn't working. I tried restarting and completely resetting the device, but no luck. Also, the YouTube app isn't working after the update.
I have an iPhone 14 with iOS 18 installed, and I've noticed the vibration no longer works. The haptic feedback setting is set to "Always On", but I get no vibration on notifications or incoming calls, and keyboard feedback does not work either.
I also noticed another strange thing: in the ringtones menu, tones do not play when I select them to preview them.
I tried updating to iOS 18.1, but even with this version I have the same problem. Could it therefore be a hardware problem rather than a software problem?
Is anyone else having the same problem as me?
Thanks
New in iOS 17.5 is UIImpactFeedbackGenerator/impactOccurred(at:), which generates haptic feedback at a specific screen location.
https://developer.apple.com/documentation/uikit/uiimpactfeedbackgenerator/4403143-impactoccurred
Which devices support this?
How does this work if the Taptic Engine has a fixed physical location?
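My current reading, which is an assumption rather than anything confirmed: the generator is attached to a view via the new iOS 17.5 initializer, and the location passed to impactOccurred(at:) is in that view's coordinate space; on an iPhone with its single fixed Taptic Engine, the location presumably just gives the system extra context, and the distinction should matter more on hardware with multiple actuators. A short usage sketch:

import UIKit

final class BoardViewController: UIViewController {
    // iOS 17.5+: attach the generator to the view whose coordinate
    // space the locations will be expressed in.
    private lazy var impact = UIImpactFeedbackGenerator(style: .medium, view: view)

    @objc private func handleTap(_ recognizer: UITapGestureRecognizer) {
        let location = recognizer.location(in: view)
        impact.impactOccurred(at: location) // point in the attached view's coordinates
    }
}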
After installing iOS 18, the iPhone has consistently low volume, and from this low level it suddenly drops further when playing some media, then returns to its (already low) normal level by itself. I have tried several suggestions from the Internet, like a forced restart, restarting several times, and turning Vocal Shortcuts on and off again, but nothing helped. Any suggestions? It's an iPhone 16 Pro Max, by the way.
Hello everyone,
I’m experiencing occasional crashes in my app related to the Core Haptics framework, specifically when creating haptic pattern players. The crashes are intermittent and not easily reproducible, so I’m hoping to get some guidance on what might be causing them.
It seems to be connected to the audio resource I'm using within the AHAP file.
Setup:
I use AVAudioSession and AVAudioEngine to record and play continuous audio. After activating the audio session and setting up the audio engine, I initialize the CHHapticEngine as follows:
let engine = try CHHapticEngine(audioSession: .sharedInstance())
...
try engine?.start()
// Recreate all haptic pattern players you had created.
let pattern = createPatternFromAHAP(Pattern.thinking.rawValue)!
thinkingHapticPlayer = try? engine?.makePlayer(with: pattern)
// Repeat for other players...
AHAP file:
"Pattern":
[
... haptic events
{
"Event":
{
"Time": 0.0,
"EventType": "AudioCustom",
"EventWaveformPath": "voice.chat.thinking.mp3",
"EventParameters":
[
{ "ParameterID": "AudioVolume", "ParameterValue": 0.7 }
]
}
}
]
I’m receiving the following crash report:
Crashed: com.apple.main-thread
EXC_BREAKPOINT 0x00000001ba525c68
0  CoreHaptics  +[CHHapticEngine(CHHapticEngineInternal) doRegisterAudioResource:options:fromPattern:player:error:].cold.1 + 104
1  CoreHaptics  +[CHHapticEngine(CHHapticEngineInternal) doRegisterAudioResource:options:fromPattern:player:error:].cold.1 + 104
2  CoreHaptics  +[CHHapticEngine(CHHapticEngineInternal) doRegisterAudioResource:options:fromPattern:player:error:] + 3784
3  CoreHaptics  -[CHHapticPattern resolveExternalResources:error:] + 388
4  CoreHaptics  -[PatternPlayer initWithPlayable:engine:privileged:error:] + 560
5  CoreHaptics  -[CHHapticEngine createPlayerWithPattern:error:] + 256
6  Mind  VoiceChatHapticEngine.swift:170  VoiceChatHapticEngine.createThinkingHapticPlayer() + 170
Has anyone encountered similar crashes when working with CHHapticEngine and haptic patterns that contain an AudioCustom event?
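In the meantime I'm adding a defensive path around player creation. This is only a sketch against my own code (createPatternFromAHAP, Pattern.thinking, and the file name are from my project), on the assumption that a missing or unresolvable waveform file is what trips doRegisterAudioResource:

// Sketch (inside my VoiceChatHapticEngine class): verify the external
// audio resource resolves before creating the player, and recreate
// players after the engine resets.
private func createThinkingHapticPlayerSafely() {
    guard Bundle.main.url(forResource: "voice.chat.thinking", withExtension: "mp3") != nil else {
        print("AHAP audio resource missing from bundle")
        return
    }
    guard let pattern = createPatternFromAHAP(Pattern.thinking.rawValue) else { return }
    thinkingHapticPlayer = try? engine?.makePlayer(with: pattern)
}

private func installResetHandler() {
    engine?.resetHandler = { [weak self] in
        // The engine stopped and reset (e.g. after an audio session
        // interruption); restart it and recreate all pattern players.
        try? self?.engine?.start()
        self?.createThinkingHapticPlayerSafely()
    }
}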
Thank you so much for your help.
Ondrej
Haptics are often represented as audio for presentation purposes by Apple in videos and learning resources.
I am curious if:
...Apple has released, or is willing to release, any tools that may have been used to synthesize audio representing haptic patterns (such as in their WWDC19 audio-haptic presentation)?
...there are any current tools available that take haptic instructions as input (like AHAP) and output an audio file?
...there is some low-level access to the signal that drives the Taptic Engine, so that it can be repurposed as an audio stream?
...you have any other suggestions!
I can imagine some crude solutions that hack together preexisting synthesizers, fudging together a process to convert AHAP to MIDI instructions and dialing in some synth settings to mimic the behaviour of an actuator, but I'm not too interested in that rabbit hole just yet. A rough sketch of the more direct rendering idea is below.
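To make that concrete, here is roughly what I mean by the direct approach. This is entirely my own guesswork: it reads an AHAP file, renders each HapticTransient event as a short decaying sine burst with an invented sharpness-to-frequency mapping, and writes the result to an audio file. Continuous events, intensity curves, and the real actuator response are all ignored.

import AVFoundation
import Foundation

// Crude sketch: render each HapticTransient event in an AHAP file as a
// short decaying sine burst and write the mix to an audio file (.caf).
func renderAHAPToAudio(ahapURL: URL, outputURL: URL) throws {
    let data = try Data(contentsOf: ahapURL)
    let root = try JSONSerialization.jsonObject(with: data) as? [String: Any] ?? [:]
    let pattern = root["Pattern"] as? [[String: Any]] ?? []

    var events: [(time: Double, sharpness: Double)] = []
    for entry in pattern {
        guard let event = entry["Event"] as? [String: Any],
              event["EventType"] as? String == "HapticTransient",
              let time = event["Time"] as? Double else { continue }
        var sharpness = 0.5
        for param in (event["EventParameters"] as? [[String: Any]] ?? [])
            where param["ParameterID"] as? String == "HapticSharpness" {
            sharpness = param["ParameterValue"] as? Double ?? sharpness
        }
        events.append((time, sharpness))
    }

    let sampleRate = 44_100.0
    let duration = (events.map { $0.time }.max() ?? 0) + 0.5
    let frameCount = AVAudioFrameCount(duration * sampleRate)
    let format = AVAudioFormat(standardFormatWithSampleRate: sampleRate, channels: 1)!
    let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frameCount)!
    buffer.frameLength = frameCount
    let samples = buffer.floatChannelData![0]

    for event in events {
        // Invented mapping: sharpness 0...1 -> roughly 80...300 Hz.
        let frequency = 80.0 + 220.0 * event.sharpness
        let start = Int(event.time * sampleRate)
        let burstLength = Int(0.08 * sampleRate) // 80 ms decaying burst
        for i in 0..<burstLength where start + i < Int(frameCount) {
            let t = Double(i) / sampleRate
            let envelope = exp(-t * 40.0)
            samples[start + i] += Float(sin(2.0 * .pi * frequency * t) * envelope * 0.8)
        }
    }

    let file = try AVAudioFile(forWriting: outputURL, settings: format.settings)
    try file.write(from: buffer)
}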
Any thoughts? Very curious what the process was for the WWDC videos and audio examples in their documentation...
Thank you!
Just installed iOS 18 Beta 3.
I am seeing AccessibilityUIServer holding the microphone, and this is causing no notification sounds, the inability to use Siri by voice, and a grayed-out volume control.
If I start to play anything with sound, AccessibilityUIServer releases the microphone and I am able to use the app.
Calls still work, since AccessibilityUIServer releases the microphone and the phone rings.
Feedback ID is FB14241838.
Topic:
Accessibility & Inclusion
SubTopic:
General
Tags:
Audio
Accessibility
Sound Analysis
Sound and Haptics
Hi everyone,
I'm currently facing an issue with AVAudioPlayer in my SwiftUI project. Despite ensuring that the sound file "buttonsound.mp3" is properly added to the project's resources (I dragged and dropped it into Xcode), the application is still unable to locate the file when attempting to play it.
Here's the simplified version of the code I'm using:
import SwiftUI
import AVFoundation

struct ContentView: View {
    var body: some View {
        VStack {
            Button("Play sound") {
                playSound(named: "buttonsound", ofType: "mp3")
            }
        }
    }
}

// Keep a strong reference: a purely local AVAudioPlayer can be deallocated
// before playback starts, which produces silence even when the file loads.
private var audioPlayer: AVAudioPlayer?

func playSound(named name: String, ofType type: String) {
    // If this guard fires, check the file's Target Membership in Xcode;
    // a file dragged into the navigator isn't always added to the app target.
    guard let soundURL = Bundle.main.url(forResource: name, withExtension: type) else {
        print("Sound file not found")
        return
    }
    do {
        audioPlayer = try AVAudioPlayer(contentsOf: soundURL)
        audioPlayer?.prepareToPlay()
        audioPlayer?.play()
    } catch {
        print("Error playing sound: \(error.localizedDescription)")
    }
}