Reply to AVCam Example: Can I use a Actor instead of a DispatchQueue for capture session activity?
@enodev Thank you for your response. However, the answer seems to skirt the question. I don't want to use both an Actor and a DispatchQueue, wrapped or nested in some way; I want to use one in place of the other. Your answer seems to imply that this is not possible, and that an Actor used with AVFoundation APIs that must run off the main thread, for blocking reasons, would still need a DispatchQueue.
May ’24
Reply to AVAudioPCMBuffer Memory Management
Did you ever figure this out? I've been doing the same thing. I don't get any crashes, but when I hand the buffers off to LiveSwitch for playback, there is no audio signal. My pipeline:
- I receive the buffer from a tap on bus zero.
- I send the buffer to the publisher.
- The buffer is received, potentially by up to two consumers (currently one).
- The buffer is converted with AVAudioConverter from Float32 to Int16, which the LiveSwitch APIs require.
- The buffer's memory is converted to NSMutableData (required by LiveSwitch).
- The buffer is wrapped in an FMLiveSwitchAudioFrame.
- The frame is raised to LiveSwitch for processing.
Result: no signal.
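For reference, here is a minimal sketch of the conversion step I describe above. The sample rate, channel count, and helper name are illustrative, not taken from my actual code:

```swift
import AVFoundation

// Hypothetical conversion step: Float32 non-interleaved -> Int16 interleaved.
let inputFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                sampleRate: 48_000,
                                channels: 1,
                                interleaved: false)!
let outputFormat = AVAudioFormat(commonFormat: .pcmFormatInt16,
                                 sampleRate: 48_000,
                                 channels: 1,
                                 interleaved: true)!
let converter = AVAudioConverter(from: inputFormat, to: outputFormat)!

func convert(_ inputBuffer: AVAudioPCMBuffer) -> AVAudioPCMBuffer? {
    guard let outputBuffer = AVAudioPCMBuffer(pcmFormat: outputFormat,
                                              frameCapacity: inputBuffer.frameLength)
    else { return nil }
    var error: NSError?
    var consumed = false
    converter.convert(to: outputBuffer, error: &error) { _, outStatus in
        // Hand the converter the single buffer once, then report no more data.
        if consumed {
            outStatus.pointee = .noDataNow
            return nil
        }
        consumed = true
        outStatus.pointee = .haveData
        return inputBuffer
    }
    return error == nil ? outputBuffer : nil
}
```

One thing worth checking in a no-signal situation is whether frameLength on the output buffer is non-zero after the conversion; a zero-length buffer plays as silence.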
Topic: Media Technologies SubTopic: Audio Tags:
Aug ’24
Reply to IOServiceOpen fails with -308 Error (smUnExBusError)
If I do what I think I need to be doing to unmap the memory, when I try to open the service again, it fails. If I skip the unmapping step, the service opens successfully. Are you saying everything works if you don't unmap the memory? That is, when you open the device again without attempting to unmap the memory, can you communicate with the device successfully and proceed as normal? The way this is worded is unclear to me.
Topic: App & System Services SubTopic: Drivers Tags:
Nov ’24
Reply to DriverKit: Check that driver is enabled on iPadOS
I have the same requirement. With my USB driver, I cannot tell the difference between the device being unplugged and the driver not being activated via the Settings app. I need to be able to direct the user to flip the switch when I know for certain that it has not yet been flipped. Neither the IOKit nor the SystemExtensions framework is available on iOS. Does anyone know of a workaround for this issue?
Topic: App & System Services SubTopic: Core OS Tags:
Jan ’25
Reply to DriverKit driver doesn't appear in Settings when installed with iPad app
The Apple defect for iOS was fixed last year. If you are still having issues on iOS, first make sure you are running the latest version of iOS. If you are, then some other technical issue is preventing your driver from being recognized by the system. On macOS, there may be other causes, such as your Xcode configuration or a system not properly set up for driver development; I don't have all the details for macOS, but Apple's documentation should help there. It might help if you created a separate issue, perhaps linked to this one, that completely describes your situation and the problem you're experiencing. Be sure to include code and configuration snippets so we can see exactly what you are doing to register and activate your driver.
Topic: App & System Services SubTopic: Drivers Tags:
Apr ’25
Reply to When to set AVAudioSession's preferredInput?
Probably way too late, but perhaps someone else will benefit from the discussion. I have observed that the system always jumps to the most recently plugged-in microphone. I assume this is because Apple assumes that a person who just plugged in a microphone wants to use it immediately. My suggestion is to monitor the route-change notifications and re-assert your choice by calling setPreferredInput again. I have not tested this, but give it a try.
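Something along these lines is what I have in mind (untested; findPreferredMic and its .usbAudio filter are placeholders for whatever selection logic you actually need):

```swift
import AVFoundation

// Hypothetical helper that picks the input port you want to stay on.
func findPreferredMic() -> AVAudioSessionPortDescription? {
    AVAudioSession.sharedInstance().availableInputs?
        .first { $0.portType == .usbAudio }
}

// Re-assert the preferred input after every route change, since the
// system may have jumped to a newly plugged-in microphone.
let observer = NotificationCenter.default.addObserver(
    forName: AVAudioSession.routeChangeNotification,
    object: nil,
    queue: .main
) { _ in
    guard let mic = findPreferredMic() else { return }
    try? AVAudioSession.sharedInstance().setPreferredInput(mic)
}
```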
Topic: Media Technologies SubTopic: Audio Tags:
Oct ’25
Reply to Is AVAudioPCMFormatFloat32 required for playing a buffer with AVAudioEngine / AVAudioPlayerNode
In my experience, things only work consistently with Float32 non-interleaved samples. This seems to be the requirement for the audio engine's input and output nodes, as well as for playing back audio with the player node. I also record data to disk in this format. Any time I tried to use Int16 interleaved data, the API results were negative. I had to perform my own conversions between the two formats because the third-party library I was using for remote-conference audio only accepted Int16 interleaved data in both directions.
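Concretely, the format that worked for me is what AVFoundation calls the "standard" format, which is deinterleaved Float32. A sketch (sample rate and channel count are examples; match your hardware):

```swift
import AVFoundation

// The "standard" format is Float32, non-interleaved, so this is
// equivalent to AVAudioFormat(commonFormat: .pcmFormatFloat32,
// sampleRate: 48_000, channels: 2, interleaved: false).
let format = AVAudioFormat(standardFormatWithSampleRate: 48_000, channels: 2)!

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
engine.attach(player)
engine.connect(player, to: engine.mainMixerNode, format: format)
```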
Topic: Media Technologies SubTopic: Audio Tags:
Oct ’25
Reply to Handling AVAudioEngine Configuration Change
@milesegan Were there any memory management issues switching from AVAudioPlayerNode to AVAudioSourceNode? I'm using the player node now and am having issues when the audio engine goes through a configuration change. When this happens I stop the engine, remove the player node(s), re-attach and re-connect the player nodes, and then restart the engine. I wrote this code before realizing source nodes were a thing. I'm hoping that using a source node makes things simpler and requires less dynamic coordination. My thinking is that I can have the requisite number of source nodes connected to a mixer and just leave that configuration in place for the duration of my app. Then, when one of my two or three dedicated inputs comes online, I can feed buffers into the source nodes and not worry about adding and removing player nodes. From your experience, does this sound like it would work? Would you be willing to share some code showing how you configure your engine with the source node?
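To make the idea concrete, here is roughly the static topology I have in mind (untested; the render blocks just emit silence where the real code would pull frames from per-input storage):

```swift
import AVFoundation

// A fixed set of source nodes feeding one mixer, left connected
// for the life of the app.
let engine = AVAudioEngine()
let format = AVAudioFormat(standardFormatWithSampleRate: 48_000, channels: 1)!
let mixer = AVAudioMixerNode()
engine.attach(mixer)
engine.connect(mixer, to: engine.mainMixerNode, format: format)

let sources: [AVAudioSourceNode] = (0..<3).map { _ in
    let node = AVAudioSourceNode(format: format) { _, _, _, audioBufferList -> OSStatus in
        // Real code would copy frames for this input from its own queue;
        // render silence while the input is offline.
        let buffers = UnsafeMutableAudioBufferListPointer(audioBufferList)
        for buffer in buffers {
            memset(buffer.mData, 0, Int(buffer.mDataByteSize))
        }
        return noErr
    }
    engine.attach(node)
    engine.connect(node, to: mixer, format: format)
    return node
}
```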
Topic: Media Technologies SubTopic: Audio Tags:
Oct ’25
Reply to Core Animation Layer in SwiftUI app does not update unless I rotate the device.
Solved. Must call setNeedsDisplay from the main thread.
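For anyone landing here, the fix looks like this (shapeLayer stands in for whatever CALayer you are drawing into):

```swift
import QuartzCore
import Dispatch

// Hypothetical layer; substitute the CALayer your view actually draws.
let shapeLayer = CALayer()

func refresh() {
    // setNeedsDisplay must run on the main thread, or the layer may not
    // redraw until something else (like a rotation) forces a layout pass.
    DispatchQueue.main.async {
        shapeLayer.setNeedsDisplay()
    }
}
```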
Topic: UI Frameworks SubTopic: SwiftUI Tags:
May ’24
Reply to Change Mic output format in AVAudioEngine
Can Apple weigh in on this issue? I'm currently receiving samples as Float values, but I also need signed 16-bit integer values. I don't want to convert them on the fly if I don't have to, or if I can somehow get them for free at the source. Apple: how do we make this happen in code?
Topic: Media Technologies SubTopic: Audio Tags:
Jul ’24
Reply to DriverKit driver doesn't appear in Settings when installed with iPad app
This is now an issue with iPadOS 18. On iPadOS 17.7 my driver shows up in Settings just fine. After recompiling with Xcode 16 and installing my app (containing my driver) on iPadOS 18, the app shows up in Settings, but the driver-enable button is missing. When I plug in my custom USB device, the app cannot detect it.
Topic: App & System Services SubTopic: Drivers Tags:
Sep ’24
Reply to Has iOS 18 changed the threshold for decoding base64 into ASCII code?
I am also having an issue with this change in my code. Why doesn't switching from .ascii to .utf8 work? Doesn't this allow for the high bit to be set? Sorry for the ignorance but I look forward to Quinn's answer.
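To illustrate the decode step in question (the payload here is made up; the decoded bytes happen to be valid UTF-8, which is not guaranteed for arbitrary base64 data):

```swift
import Foundation

let base64 = "aGVsbG8g4pyT"   // decodes to the UTF-8 bytes of "hello ✓"
if let data = Data(base64Encoded: base64) {
    // On systems where .ascii is strict, any byte above 0x7F makes this nil.
    let ascii = String(data: data, encoding: .ascii)
    // .utf8 only succeeds when the decoded bytes happen to form valid UTF-8;
    // arbitrary binary output of base64 usually does not.
    let utf8 = String(data: data, encoding: .utf8)
    print(ascii as Any, utf8 as Any)
}
```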
Topic: App & System Services SubTopic: Core OS Tags:
Oct ’24
Reply to Does AVFoundation on iOS/iPadOS support capturing from USB cameras?
Yes, it does (assuming I'm correct that AVCaptureDevice is part of the AVFoundation framework). I have a USB-connected camera that I use to capture video and still images via the AVCaptureDevice APIs.
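A sketch of how discovery can work; note that the .external device type requires iPadOS 17 or later, and on earlier systems external cameras do not show up at all:

```swift
import AVFoundation

// Look for externally connected (e.g. USB) cameras.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.external],
    mediaType: .video,
    position: .unspecified
)
if let camera = discovery.devices.first {
    print("Found external camera: \(camera.localizedName)")
}
```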
Topic: Media Technologies SubTopic: Audio Tags:
Oct ’24
Reply to How do I change AVAudioSession settings to be not defaultToSpeaker?
When I had this problem, I seem to recall that I had to add the .allowBluetoothA2DP option to get my AirPods to work.
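As best I can reconstruct it, the session setup looked something like this (untested against current systems; category and mode are what I believe I was using):

```swift
import AVFoundation

let session = AVAudioSession.sharedInstance()
do {
    // .allowBluetoothA2DP is the option that got my AirPods working
    // instead of audio being forced to the built-in speaker.
    try session.setCategory(.playAndRecord,
                            mode: .default,
                            options: [.allowBluetoothA2DP])
    try session.setActive(true)
} catch {
    print("Audio session configuration failed: \(error)")
}
```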
Topic: Media Technologies SubTopic: Audio Tags:
Oct ’25