
Reply to Internal Apple Developer team member does not appear in TestFlight. (User is in limbo)
It looks like the main culprit here was a lack of patience on our part. Put another way, the portal is slow when handling some operations and doesn't do a great job of reporting current status. In any case, coming back to the problem the next day showed the user and the portal in their appropriate states, and we were able to continue with our testing.
May ’24
Reply to AVCam Example: Can I use a Actor instead of a DispatchQueue for capture session activity?
@enodev Thank you for your response. However, the answer seems to skirt the question. I don't want to use both an actor and a DispatchQueue, perhaps in some wrapped or nested form; I want to use one in place of the other. Your answer would seem to imply that this is not possible, and that using an actor with AVFoundation APIs that must run off the main thread (because they block) would still require a DispatchQueue.
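To make the question concrete, here is a minimal sketch of what I have in mind, assuming the actor owns the capture session outright. The CaptureService name and the configure()/start() methods are just illustrative; they are not taken from the AVCam sample.

```swift
import AVFoundation

// Hypothetical actor standing in for the sample's serial session queue.
actor CaptureService {
    private let session = AVCaptureSession()

    enum CaptureError: Error { case setupFailed }

    // Session configuration is serialized by the actor, so callers never
    // block the main thread waiting on a queue hop.
    func configure() throws {
        session.beginConfiguration()
        defer { session.commitConfiguration() }

        guard let device = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: device),
              session.canAddInput(input) else {
            throw CaptureError.setupFailed
        }
        session.addInput(input)
    }

    // startRunning() blocks, which should be acceptable here because
    // actor methods run off the main thread when awaited from it.
    func start() {
        session.startRunning()
    }
}
```

Callers would await these methods (for example, try await captureService.configure()), and that is exactly the part I am unsure AVFoundation tolerates in place of a dedicated serial queue.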
May ’24
Reply to AVAudioPCMBuffer Memory Management
Did you ever figure this out? I've been doing the same thing. I don't get any crashes, but when I hand the buffers off to LiveSwitch for playback, there is no audio signal. The flow is:

- I receive the buffer from a tap on bus zero.
- I send the buffer to the publisher.
- The buffer is received, potentially by up to two consumers (currently one).
- The buffer is converted with AVAudioConverter from Float32 to Int16, which is required for consumption by the LiveSwitch APIs.
- The buffer memory is converted to NSMutableData (required by LiveSwitch).
- The buffer is wrapped in an FMLiveSwitchAudioFrame.
- The buffer is raised to LiveSwitch for processing.

Result: no signal.
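For reference, here is a stripped-down sketch of the Float32 → Int16 conversion step, assuming no sample-rate change. The NSMutableData / FMLiveSwitchAudioFrame wrapping is omitted because I don't want to misquote the LiveSwitch signatures from memory.

```swift
import AVFoundation

// Convert a Float32 tap buffer to Int16 interleaved at the same
// sample rate and channel count. Returns nil if conversion fails.
func convertToInt16(_ input: AVAudioPCMBuffer) -> AVAudioPCMBuffer? {
    guard let outFormat = AVAudioFormat(commonFormat: .pcmFormatInt16,
                                        sampleRate: input.format.sampleRate,
                                        channels: input.format.channelCount,
                                        interleaved: true),
          let converter = AVAudioConverter(from: input.format, to: outFormat),
          let output = AVAudioPCMBuffer(pcmFormat: outFormat,
                                        frameCapacity: input.frameLength) else {
        return nil
    }

    var consumed = false
    var error: NSError?
    let status = converter.convert(to: output, error: &error) { _, outStatus in
        // Feed the source buffer to the converter exactly once.
        if consumed {
            outStatus.pointee = .noDataNow
            return nil
        }
        consumed = true
        outStatus.pointee = .haveData
        return input
    }

    return (error == nil && status != .error) ? output : nil
}
```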
Topic: Media Technologies SubTopic: Audio Tags:
Aug ’24
Reply to IOServiceOpen fails with -308 Error (smUnExBusError)
If I do what I think I need to be doing to unmap the memory, when I try to open the service again, it fails. If I skip the step where I do that unmapping, the service opens successfully. Are you saying everything works if you don't unmap the memory? That is, when you open the device again without attempting to unmap the memory, can you communicate with the device successfully and proceed as normal? The way this is worded, it is unclear to me.
Topic: App & System Services SubTopic: Drivers Tags:
Nov ’24
Reply to DriverKit: Check that driver is enabled on iPadOS
I have the same requirement. With my USB driver, I cannot tell the difference between the device being unplugged and the driver not being activated via the Settings app. I need to be able to direct the user to flip the switch when I know for certain that it has not yet been flipped. Neither the IOKit nor the SystemExtensions framework is available on iOS. Does anyone know of a workaround for this?
Topic: App & System Services SubTopic: Core OS Tags:
Jan ’25
Reply to DriverKit driver doesn't appear in Settings when installed with iPad app
The Apple defect for iOS was resolved last year. If you are still having issues on iOS, make sure you are running the latest version of iOS as a first step. If that requirement has been met, you have some other technical issue preventing your driver from being recognized by the system. If you are using macOS, there may be other issues, such as your Xcode configuration or your system not being properly set up for driver development. I don't have all the details for macOS, but the documentation provided by Apple should help in this instance. It might help if you created a separate thread, perhaps linked to this one, that completely describes your situation and the problem you're experiencing. Be sure to include code and configuration snippets so we can see exactly what you are doing to register and activate your driver.
Topic: App & System Services SubTopic: Drivers Tags:
Apr ’25
Reply to When to set AVAudioSession's preferredInput?
Probably way too late, but perhaps someone else will benefit from the discussion. I have observed that the system will always jump to the most recently plugged-in microphone. I assume this is because a person who just plugged in a microphone presumably wants to use it immediately. My suggestion is to monitor the route change notifications and re-assert your choice by calling setPreferredInput again. I have not tested this, but give it a try.
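Untested, but something along these lines is what I mean. The portType check is only a placeholder for however you decide which input you actually want:

```swift
import AVFoundation

final class InputRoutePinner {
    private var observer: NSObjectProtocol?

    // Re-assert the preferred input whenever the system changes the route.
    func start() {
        observer = NotificationCenter.default.addObserver(
            forName: AVAudioSession.routeChangeNotification,
            object: nil,
            queue: .main
        ) { [weak self] _ in
            self?.reassertPreferredInput()
        }
    }

    private func reassertPreferredInput() {
        let session = AVAudioSession.sharedInstance()
        // Placeholder criterion: prefer a USB microphone if one is present.
        guard let wanted = session.availableInputs?.first(where: {
            $0.portType == .usbAudio
        }) else { return }

        do {
            try session.setPreferredInput(wanted)
        } catch {
            print("setPreferredInput failed: \(error)")
        }
    }

    deinit {
        if let observer { NotificationCenter.default.removeObserver(observer) }
    }
}
```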
Topic: Media Technologies SubTopic: Audio Tags:
Oct ’25
Reply to Is AVAudioPCMFormatFloat32 required for playing a buffer with AVAudioEngine / AVAudioPlayerNode
In my experience, things only consistently work when using Float32 non-interleaved samples. This seems to be the requirement for the audio engine's input and output nodes, as well as for playing back audio with the player node. I am also recording data to disk in this format. Any time I tried to use Int16 interleaved data, the results were negative. I had to perform my own conversions to and from these two formats because the third-party library I was using for remote-conference audio only accepted Int16 interleaved data in both directions.
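For illustration, here is the shape of the configuration that has worked for me. The 48 kHz stereo numbers are just example values:

```swift
import AVFoundation

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()

// Float32, non-interleaved: the combination that has worked reliably
// for me with the engine's nodes and the player node.
let format = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                           sampleRate: 48_000,
                           channels: 2,
                           interleaved: false)!

engine.attach(player)
engine.connect(player, to: engine.mainMixerNode, format: format)

do {
    try engine.start()

    // Silence, just to show the buffer shape; real samples would come
    // from your capture or conversion pipeline.
    let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: 4_800)!
    buffer.frameLength = buffer.frameCapacity

    player.scheduleBuffer(buffer, completionHandler: nil)
    player.play()
} catch {
    print("Engine failed to start: \(error)")
}
```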
Topic: Media Technologies SubTopic: Audio Tags:
Oct ’25