
Reply to ppq.apple.com returning 502 Bad Gateway - Unable to verify developer apps on device
I've gotten OK, Bad Gateway, and Gateway Timeout responses on four different systems. Even when OK is returned, it takes a while to get a response. This started almost exactly at 10 AM Pacific time for me. Like many of you, I used the security command-line tool to check my certs, checked my firewall (pfSense), and even switched my build system and my target iPad to my iPhone's personal hotspot to bypass my infrastructure. Still no joy. After reading your posts, I have become convinced that this is an Apple services issue, even though their status page says everything is working.

More information: I have another project (Project-B) I'm working on, which is being developed on a different machine. Project-B installs on the same target iPad with no issues. I migrated the failing project (Project-A) over to that machine and added the missing certificates to the keychain so that it would build. Even on this second system, I consistently get the error for Project-A but not for Project-B. So apparently this failure is specific to the type of certificate or provisioning for the target app, it seems.

Project-A: hardware app using DriverKit, custom USB hardware, microphone, camera, etc. Only viable on iPadOS with an M-series processor.
Project-B: SwiftUI, SwiftData, document-based. Uses a FLIR camera but does not require DriverKit. Also uses the microphone and camera.

Are any of you using DriverKit or working with anything that corresponds to Project-A? Have you tried building a simple sample app that does nothing special; does it work?
1w
Reply to How to safely switch between mic configurations on iOS?
The call to setVoiceProcessingEnabled(_:) is synchronous and triggers a configuration change in the engine, so you must read the formats from the input and output nodes after changing the voice-processing state to ensure you're configuring your graph with the correct formats. Additionally, there is some sample code available as part of the AVEchoTouch project - https://developer.apple.com/documentation/avfaudio/using-voice-processing?language=objc
Topic: Media Technologies SubTopic: Audio Tags:
Nov ’25
Reply to downsampling in AVAudioEngine graph
From what I have observed, downsampling or upsampling is automatic if you use a mixer. Specifically, what I observed was that one end of the mixer was attached to the input node, which was running at 44.1 kHz. The other end of the mixer was connected to an AVAudioPlayerNode running at 48 kHz. I could hear the network-transmitted audio coming out of the speaker. This confused me at first until I looked at the documentation for AVAudioMixerNode: "The mixer accepts input at any sample rate and efficiently combines sample rate conversions. It also accepts any channel count and correctly upmixes or downmixes to the output channel count."
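To illustrate what that sample-rate conversion amounts to conceptually, here is a naive linear-interpolation resampler in plain Swift. This is only a sketch of the idea (the function name is made up for this example, and the mixer's real converter is far higher quality than linear interpolation):

```swift
// Naive linear-interpolation resampler; a conceptual sketch only,
// not what AVAudioMixerNode actually uses internally.
func resampleLinear(_ input: [Float], from srcRate: Double, to dstRate: Double) -> [Float] {
    guard !input.isEmpty, srcRate > 0, dstRate > 0 else { return [] }
    let outCount = Int((Double(input.count) * dstRate / srcRate).rounded())
    guard outCount > 1, input.count > 1 else { return input }
    var out = [Float](repeating: 0, count: outCount)
    for i in 0..<outCount {
        // Map the output index back into the input's time base.
        let pos = Double(i) * Double(input.count - 1) / Double(outCount - 1)
        let lo = Int(pos)
        let hi = min(lo + 1, input.count - 1)
        let frac = Float(pos - Double(lo))
        out[i] = input[lo] * (1 - frac) + input[hi] * frac
    }
    return out
}
```

Upsampling a 4-sample buffer from 44,100 to 88,200 yields 8 samples with the endpoints preserved.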
Topic: Media Technologies SubTopic: Audio Tags:
Oct ’25
Reply to Handling AVAudioEngine Configuration Change
@milesegan Were there any memory management issues switching from AVAudioPlayerNode to AVAudioSourceNode? I'm using the player node now and am having issues when the audio engine goes through a configuration change. When this happens, I stop the engine, remove the player node(s), re-attach and re-connect the player nodes, and then restart the engine. I wrote this code before realizing source nodes were a thing. I'm hoping that using a source node makes things simpler and requires less dynamic coordination. My thinking is that I can have the requisite number of source nodes connected to a mixer and just leave that configuration in place for the duration of my app. Then, when one of my two or three dedicated inputs comes online, I can feed buffers into the source nodes and not worry about adding and removing player nodes. From your experience, does this sound like it would work? Would you be willing to share some code showing how you configure your engine with the source node?
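For what it's worth, the "feed buffers into the source nodes" part can be decoupled from rendering with a small FIFO that the source node's render block drains. The type below is a hypothetical sketch of that idea only; it is not thread-safe as written, and a real render callback runs on a real-time thread, so production code would need a lock-free queue:

```swift
// Hypothetical minimal FIFO a source node's render block could drain.
// NOT thread-safe; a real implementation needs a lock-free queue
// because render callbacks run on a real-time audio thread.
final class SampleFIFO {
    private var storage: [Float] = []

    // Producer side: called when one of the inputs delivers a buffer.
    func push(_ samples: [Float]) {
        storage.append(contentsOf: samples)
    }

    // Consumer side: always returns exactly `count` samples,
    // zero-padding on underrun so the engine is fed silence
    // between inputs instead of stale data.
    func pop(_ count: Int) -> [Float] {
        let available = min(count, storage.count)
        var out = Array(storage.prefix(available))
        storage.removeFirst(available)
        if out.count < count {
            out.append(contentsOf: [Float](repeating: 0, count: count - out.count))
        }
        return out
    }
}
```

The appeal of this shape is that the source nodes and mixer stay attached permanently; only the FIFO contents change as inputs come and go.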
Topic: Media Technologies SubTopic: Audio Tags:
Oct ’25
Reply to Is AVAudioPCMFormatFloat32 required for playing a buffer with AVAudioEngine / AVAudioPlayerNode
In my experience, things only consistently work when using Float32 non-interleaved samples. This seems to be the requirement for the audio engine's input and output nodes, as well as for playing back audio with the player node. I am also recording data to disk in this format. Any time I tried to use Int16 interleaved data, the API results were negative. I had to perform my own conversions between these two formats because the third-party library I was using for remote-conference audio only accepted Int16 interleaved data in both directions.
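For anyone doing the same conversion by hand, the scaling step from Float32 samples in [-1, 1] to Int16 looks roughly like this. This is a hand-rolled sketch of the idea, not the AVAudioConverter API itself:

```swift
// Clamp-and-scale conversion from Float32 [-1, 1] samples to Int16.
// Sketch only; AVAudioConverter handles this (plus dithering and
// interleaving) for you when given matching input/output formats.
func floatToInt16(_ samples: [Float]) -> [Int16] {
    samples.map { sample in
        // Clamp first so the multiply can never overflow Int16.
        let clamped = max(-1.0, min(1.0, sample))
        return Int16(clamped * Float(Int16.max))
    }
}
```

Note the clamp before scaling: without it, a sample slightly above 1.0 would trap when converted to Int16.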
Topic: Media Technologies SubTopic: Audio Tags:
Oct ’25
Reply to When to set AVAudioSession's preferredInput?
Probably way too late, but perhaps someone else will benefit from the discussion. I have observed that the system will always jump to the most recently plugged-in microphone, presumably because a person who just plugged in a microphone wants to use it immediately. My suggestion is to monitor the route-change notifications and re-assert your preference by calling setPreferredInput again. I have not tested this, but give it a try.
Topic: Media Technologies SubTopic: Audio Tags:
Oct ’25
Reply to DriverKit driver doesn't appear in Settings when installed with iPad app
The Apple defect for iOS was solved last year. If you are still having issues on iOS, make sure you are running the latest version of iOS as a first step. If that requirement has been met, you have some other technical issue preventing your driver from being recognized by the system. If you are using macOS, there may be other issues, such as your Xcode configuration or not having your system properly set up for driver development. I don't have all the details for macOS, but the documentation provided by Apple should help in this instance. It might help if you created a separate issue, perhaps linked to this one, where you completely describe your situation and the problem you're experiencing. Be sure to include code and configuration snippets so we can see exactly what you are doing to register and activate your driver.
Topic: App & System Services SubTopic: Drivers Tags:
Apr ’25
Reply to DriverKit: Check that driver is enabled on iPadOS
I too have the same requirement. With my USB driver, I cannot tell the difference between the device being unplugged and the driver not being activated via the Settings app. I need to be able to direct the user to flip the switch when I know for certain that it has not yet been flipped. Neither the IOKit nor the SystemExtensions framework is available on iOS. Does anyone know of a workaround for this issue?
Topic: App & System Services SubTopic: Core OS Tags:
Jan ’25
Reply to IOServiceOpen fails with -308 Error (smUnExBusError)
"If I do what I think I need to be doing to unmap the memory, when I try to open the service again, it fails. If I skip the step where I do that unmapping, the service opens successfully." Are you saying everything works if you don't unmap the memory? That is, when you open the device again without attempting to unmap memory, can you communicate successfully with the device and proceed as normal? The way this is worded, it is unclear to me.
Topic: App & System Services SubTopic: Drivers Tags:
Nov ’24
Reply to AVAudioPCMBuffer Memory Management
Did you ever figure this out? I've been doing the same thing. I don't get any crashes, but when I hand the buffers off to LiveSwitch for playback, there is no audio signal.

1. I receive the buffer from a tap on bus zero.
2. I send the buffer to the publisher.
3. The buffer is received, potentially by up to two consumers (currently one).
4. The buffer is converted using AVAudioConverter from Float32 to Int16, which is required by the LiveSwitch APIs.
5. The buffer memory is converted to NSMutableData (required by LiveSwitch).
6. The buffer is wrapped in an FMLiveSwitchAudioFrame.
7. The buffer is raised to LiveSwitch for processing.

Result: no signal.
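One step that's easy to get wrong in a pipeline like this is the interleaving: AVAudioPCMBuffer stores Float32 samples non-interleaved (one plane per channel), while Int16 consumers typically expect interleaved frames (L R L R ...). A minimal sketch of that step (hypothetical helper, not a LiveSwitch API):

```swift
// Interleave per-channel sample planes into a single frame-ordered
// array, e.g. [[L...], [R...]] -> [L R L R ...].
// Hypothetical helper for illustration; returns [] on mismatched input.
func interleave(_ channels: [[Int16]]) -> [Int16] {
    guard let frames = channels.first?.count,
          channels.allSatisfy({ $0.count == frames }) else { return [] }
    var out: [Int16] = []
    out.reserveCapacity(frames * channels.count)
    for frame in 0..<frames {
        for channel in channels {
            out.append(channel[frame])
        }
    }
    return out
}
```

If the converter output is handed over still in planar order, the far end sees garbage or silence even though no call fails, which matches the "no crash, no signal" symptom.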
Topic: Media Technologies SubTopic: Audio Tags:
Aug ’24
Reply to ppq.apple.com returning 502 Bad Gateway - Unable to verify developer apps on device
Quinn, I met you face-to-face back in 2005. Glad you are still around.
1w
Reply to How do I change AVAudioSession settings to be not defaultToSpeaker?
When I had this problem, I seem to recall that I had to add the .allowBluetoothA2DP option to get my AirPods to work.
Topic: Media Technologies SubTopic: Audio Tags:
Oct ’25
Reply to Does AVFoundation on iOS/iPadOS support capturing from USB cameras?
Yes, it does (assuming I'm correct that AVCaptureDevice is part of the AVFoundation framework). I have a USB-connected camera that I use to capture video and still images via the AVCaptureDevice APIs.
Topic: Media Technologies SubTopic: Audio Tags:
Oct ’24
Reply to Has iOS 18 changed the threshold for decoding base64 into ASCII code?
I am also having an issue with this change in my code. Why doesn't switching from .ascii to .utf8 work? Doesn't this allow for the high bit to be set? Sorry for the ignorance but I look forward to Quinn's answer.
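From what I understand (and I may be wrong), switching to .utf8 only helps if the decoded bytes happen to form valid UTF-8. UTF-8 permits high-bit bytes only inside well-formed multi-byte sequences, so arbitrary binary output of a base64 decode can still fail to decode, whereas an encoding like .isoLatin1 maps every single byte to a character. A quick sketch:

```swift
import Foundation

// 0xFF can never appear in well-formed UTF-8, so UTF-8 decoding fails,
// while ISO Latin-1 maps every byte value to a character.
let bytes = Data([0x41, 0xFF])
let asUTF8 = String(data: bytes, encoding: .utf8)        // nil
let asLatin1 = String(data: bytes, encoding: .isoLatin1) // "A\u{FF}"
```

So "allows the high bit" isn't quite enough: the bytes after the high-bit lead byte must also form a valid sequence.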
Topic: App & System Services SubTopic: Core OS Tags:
Oct ’24
Reply to DriverKit driver doesn't appear in Settings when installed with iPad app
This is now an issue with iPadOS 18. On iPadOS 17.7 my driver shows up in Settings just fine. After recompiling with Xcode 16 and installing my app (containing my driver) on iPadOS 18, the app shows up in Settings, but the driver-enable toggle is missing. When I plug in my custom USB device, the app cannot detect it.
Topic: App & System Services SubTopic: Drivers Tags:
Sep ’24