
Reply to Is it possible to stream video from a UVC (USB Video Class) camera on an iPhone 15?
Nope. External UVC devices are only supported on iPads with USB-C connectors. API support arrived in iOS 17, and the limitation is mentioned in the "Support external cameras in your iPadOS app" presentation from WWDC 2023. The documentation for AVCaptureDevice.DeviceType.external says "On iPad, external devices are those that conform to the UVC (USB Video Class) specification.", but it does not make clear that support is limited to USB-C iPads only.
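For illustration, a minimal Swift sketch (my own, not from the thread) of discovering an external UVC camera with AVCaptureDevice.DiscoverySession; it only returns devices on a USB-C iPad running iPadOS 17 or later, and would be expected to find nothing on an iPhone 15:

import AVFoundation

// Discover external (UVC) cameras. The .external device type requires iOS/iPadOS 17.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.external],
    mediaType: .video,
    position: .unspecified)

if let camera = discovery.devices.first {
    print("Found external camera: \(camera.localizedName)")
} else {
    print("No external camera available on this device")
}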
Topic: Media Technologies SubTopic: Streaming Tags:
Apr ’25
Reply to Does USB Audio in macOS support 12 channels with 16bit PCM samples?
I can't tell you anything about the audio drivers in macOS, sorry. By "how custom" I meant "do you write the firmware".

It doesn't really matter how much bandwidth is available on the bus. The device declares a number of alternate interfaces in its USB descriptor, each of which asks the OS for a different amount of USB bandwidth. This enables multiple devices to share the same bus with predictable results. It is possible that there is something a bit odd about your device's USB descriptor. You could post it here. Use the usbdiagnose tool to fetch the descriptor (it is in /usr/bin).

Have you checked the system log to see if it is complaining about the device, either from the USB subsystem or from the Core Audio subsystem? Since the log floods with irrelevant messages, and you don't know what to look for until you've seen it, live filtering on predicates is not very useful. I would connect the device, then execute log collect --last 1m, which writes the last minute of log data to a file that you can peruse later, or export and examine with a text editor.

The tech note you referenced says "Snow Leopard (and later) is able to stream 10 channels of 32-bit audio at 192 kHz to and from a USB Audio 2.0 device", so bandwidth or system performance is unlikely to be your limitation. There are USB audio mixers available for macOS with more than 12 channels at 192 kHz and 16 bits. Have you compared your device's behavior with one of those?
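For what it's worth, here is a small Swift sketch (my own illustration, not something from the thread) that asks the Core Audio HAL how many input channels it actually publishes for a device, so you can compare that with the 12 channels the descriptor claims. The deviceID is assumed to have been obtained elsewhere, for example from kAudioHardwarePropertyDevices.

import CoreAudio

// Sum the channel counts of all input streams the HAL reports for a device.
func inputChannelCount(of deviceID: AudioObjectID) -> Int {
    var address = AudioObjectPropertyAddress(
        mSelector: kAudioDevicePropertyStreamConfiguration,
        mScope: kAudioObjectPropertyScopeInput,
        mElement: kAudioObjectPropertyElementMain)
    var size: UInt32 = 0
    guard AudioObjectGetPropertyDataSize(deviceID, &address, 0, nil, &size) == noErr else { return 0 }

    let raw = UnsafeMutableRawPointer.allocate(byteCount: Int(size),
                                               alignment: MemoryLayout<AudioBufferList>.alignment)
    defer { raw.deallocate() }
    guard AudioObjectGetPropertyData(deviceID, &address, 0, nil, &size, raw) == noErr else { return 0 }

    let bufferList = UnsafeMutableAudioBufferListPointer(raw.assumingMemoryBound(to: AudioBufferList.self))
    return bufferList.reduce(0) { $0 + Int($1.mNumberChannels) }
}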
Topic: App & System Services SubTopic: Drivers Tags:
Apr ’25
Reply to Transfer an unpublished app
The transfer process for a published app is designed to preserve user reviews, privacy statements, revenue, etc., and to let users upgrade the app in the future rather than download a new one. If the app was never published, then from an App Store point of view there's nothing to transfer. You said "we need to transfer it to another Apple Developer account", so you don't need to create a new developer account - it already exists. Was the app granted some special entitlements which were tied to the old developer account? If so, you'd need to apply for those again.
Apr ’25
Reply to About USB accessory certification
I question why a card reader would not present a mass storage interface. External USB storage devices are supported by iPadOS. Why is yours using HID? How does it present itself on macOS, or on Windows? If the reader has some special functions which can only be controlled over HID, you'll need to write a dext to communicate with it, because iPadOS' support for HID extends only to pointing devices, keyboards, and game controllers. There's no HID manager on iOS, so you need your own app (or you supply code to third party app vendors) to send your vendor-specific HID commands to your device, and a dext. Note that dexts are only supported on iPads with M-series processors.
Apr ’25
Reply to Value column missing for Info.plist
Open the Info.plist in its own editor by selecting Info.plist in the Navigator on the left. You'll find that the width of the Key column has been made very wide, and this is tied to the width of the Key field in the Info panel. If you can't see the column-width handles in the Info.plist editor, you can scroll the whole display to the left to bring them into view. The Target's Info tab editor can't do that.
Mar ’25
Reply to SwiftUI update master list from detail view
The short answer is: because the id of the view in your ForEach is \.self (line 31). If you just write ForEach($cars) { $car in your code will work.

With the code you posted, you can edit all the fields, but you won't see your changes in the detail view until you close the view by going Back, then opening a new CarDetailView. The equality function is used to check whether a structure has changed. If you make the equality function depend only on the id, and have no means to change the id, then when you edit your Car the new struct is considered equal to the old struct value, and your 'new' struct won't be saved to the cars array.

I have to admit I spent a lot of time today trying to understand why there's a difference between ForEach($cars, id: \.self) { $car in and ForEach($cars) { $car in but could not find a satisfactory explanation. In fact, your code worked in the simulator for me, but not on a real phone, and not in Xcode's live preview. On the phone it did accept edits, but the values propagated to the cars array were different, while in the live preview I saw the behavior you described. The id used in the ForEach needs to be stable - \.self keeps changing if you edit the fields.
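To make that concrete, here is a minimal sketch of the shape I'd expect to work, giving Car a stable id instead of using id: \.self (the Car fields and the CarDetailView layout are just assumptions based on your post):

import SwiftUI

struct Car: Identifiable {
    let id = UUID()            // stable identity: editing the other fields never changes it
    var make: String
    var model: String
}

struct CarListView: View {
    @State private var cars = [Car(make: "Example", model: "One"),
                               Car(make: "Example", model: "Two")]

    var body: some View {
        NavigationStack {
            List {
                ForEach($cars) { $car in            // identity comes from Car.id, not \.self
                    NavigationLink(car.make) {
                        CarDetailView(car: $car)
                    }
                }
            }
        }
    }
}

struct CarDetailView: View {
    @Binding var car: Car       // edits write straight back into the cars array

    var body: some View {
        Form {
            TextField("Make", text: $car.make)
            TextField("Model", text: $car.model)
        }
    }
}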
Topic: UI Frameworks SubTopic: SwiftUI Tags:
Mar ’25
Reply to CoreAudio HAL plugin vs dext
Yes, our driver is purely virtual, and we know about the best practice note - that's what I'm asking about. We're not using the device only to capture system audio. We're building a software mixer. We create multiple audio mixes from different sources, such as apps and audio inputs, at different levels. We provide the mixed audio to the system as input devices so other apps can use it. As far as I can tell, an audio tap can be used for mixing, but without volume control. I tried to add per-input and per-mix volume control by cascading aggregate devices, but an aggregate of aggregate devices doesn't work - the first-level aggregate device is considered "off-line" by the second-level aggregate device.
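To illustrate, this is roughly how the cascading is built (a simplified sketch with placeholder names, not our production code): each level is created with AudioHardwareCreateAggregateDevice from the UIDs of the level below.

import CoreAudio

// Create an aggregate device from a list of sub-device UIDs.
// The name and UID strings passed in are placeholders.
func makeAggregate(named name: String, uid: String, from subDeviceUIDs: [String]) -> AudioObjectID? {
    let subDevices = subDeviceUIDs.map { [kAudioSubDeviceUIDKey: $0] }
    let description: [String: Any] = [
        kAudioAggregateDeviceNameKey: name,
        kAudioAggregateDeviceUIDKey: uid,
        kAudioAggregateDeviceSubDeviceListKey: subDevices,
        kAudioAggregateDeviceIsPrivateKey: 0,
    ]
    var deviceID = AudioObjectID(kAudioObjectUnknown)
    let status = AudioHardwareCreateAggregateDevice(description as CFDictionary, &deviceID)
    return status == noErr ? deviceID : nil
}

// Cascading: build a first-level aggregate from hardware device UIDs, then try
// to use that aggregate's UID as a sub-device of a second-level aggregate -
// the second level is where the "off-line" problem appears.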
Topic: Media Technologies SubTopic: Audio Tags:
Mar ’25
Reply to Testing endpoint security on a virtual Mac
For real devices, someone (it might be the Account Owners, or perhaps any Admin) can go to the Devices section of the Accounts tab of developer.apple.com, and add the "Provisioning UDID" of a device to the list of devices. I've never tried this for a virtual Mac. You can find the Provisioning UDID of your Mac in the Hardware section of System Information. Bear in mind that you can have a maximum of 100 devices on your developer account, and that list can only be pruned once a year. So if you re-create your virtual Mac often, and it has a different Provisioning UDID every time, you may run out of devices. If you build your software on a Mac with Xcode, with automatic signing turned on, that Mac will be automatically added to the list of developer devices. The process of manually adding to the list is only necessary for pure "victim" devices which need to run development software.
Topic: App & System Services SubTopic: Core OS Tags:
Feb ’25
Reply to Swift Testing environment differences from regular executable
I lied. Inadvertently. Sorry. I put all the code (create and find) into one function and ran it from a standalone executable - it fails. If I create two separate standalone executables, one that creates and another that finds, and run them one after the other, the find executable succeeds. In the single-executable case, I can add this line between creation and finding to make it work: RunLoop.main.run(until: Date().addingTimeInterval(0.5)) 0.1 seconds was not long enough, and I'm not sure about 0.5 seconds because I've only tried it once. I'm interested in doing this in a test because I'd like to create and destroy many such plugins to check for leaks, but I'm not keen on hard-coded delays. Is there a way to detect when CoreAudio is ready to be interrogated again (rather like IORegistryWaitQuiet)? And is there a way to run a run loop within a Swift Test?
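For reference, a stripped-down sketch of the single-executable sequence I'm describing; createPlugin() and findPlugin() are hypothetical stand-ins for the real CoreAudio plug-in creation and lookup code:

import Foundation

func createPlugin() { /* hypothetical: create the virtual audio device */ }
func findPlugin() -> Bool { /* hypothetical: ask the HAL for the device again */ return false }

createPlugin()
// Without this spin of the run loop, the subsequent lookup fails; 0.1 s was not long enough.
RunLoop.main.run(until: Date().addingTimeInterval(0.5))
let found = findPlugin()
print("found: \(found)")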
Feb ’25