When opening the USB device in response to the first match notification, the USB interface iterator (since 10.14) no longer returns a USB interface. However, if the USB device is already connected at the time IOServiceAddMatchingNotification() is called, and it is therefore iterated with the resulting iterator (without a match notification), then the interfaces are found, and from then on the USB interface iterator works. Once the interface iterator has successfully returned a USB interface, it also works in reaction to a USB match notification. (Therefore a reboot is necessary to reproduce the problem once the iterator has found a USB interface.)

Interestingly, it doesn't seem to be a simple timing problem, because a delay doesn't help, but a breakpoint does...

Any suggestion is highly appreciated. Thanks!
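(For reference, our setup reduces to roughly the following sketch; error handling is omitted and the matching dictionary is simplified:)

#include <IOKit/IOKitLib.h>
#include <IOKit/usb/IOUSBLib.h>

static IONotificationPortRef gNotifyPort;
static io_iterator_t gAddedIter;

static void DeviceAdded(void *refCon, io_iterator_t iterator)
{
    io_service_t device;
    while ((device = IOIteratorNext(iterator)) != IO_OBJECT_NULL) {
        // Open the device and look for its interfaces here; this is the
        // step that fails when triggered by the first match notification.
        IOObjectRelease(device);
    }
}

void StartUSBNotifications()
{
    gNotifyPort = IONotificationPortCreate(kIOMainPortDefault); // kIOMasterPortDefault pre-macOS 12
    CFRunLoopAddSource(CFRunLoopGetCurrent(),
                       IONotificationPortGetRunLoopSource(gNotifyPort),
                       kCFRunLoopDefaultMode);
    IOServiceAddMatchingNotification(gNotifyPort, kIOFirstMatchNotification,
                                     IOServiceMatching(kIOUSBDeviceClassName),
                                     DeviceAdded, NULL, &gAddedIter);
    // Drain the iterator once to arm the notification; already-connected
    // devices show up here, and for those the interface iterator works.
    DeviceAdded(NULL, gAddedIter);
}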
The very little and outdated 'documentation' shared by Apple about CoreAudio and CoreMIDI server plugins suggests using syslog for logging.
At least since Big Sur, syslog output doesn't end up anywhere.
(So, while you seem to think it's OK not to document your APIs, you could at least remove non-working APIs! Not doing so causes unnecessary and frustrating bug hunting.)
Should we replace syslog with unified logging?
For debugging purposes only, our plugins write to our own log files. Where can I find suitable locations? Where is this documented?
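(For what it's worth, the unified-logging variant we would try looks like this; the subsystem and category strings are placeholders:)

#include <os/log.h>

// Placeholder subsystem/category; ours would match the plugin's bundle ID.
static os_log_t PluginLog()
{
    static os_log_t log = os_log_create("com.example.audioplugin", "server-plugin");
    return log;
}

// Usage: os_log(PluginLog(), "IO cycle %u started", (unsigned)cycle);
// Then watch with: log stream --predicate 'subsystem == "com.example.audioplugin"'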
Thanks,
hagen.
Hi,
to be able to receive IOServiceAddMatchingNotification callbacks we need to attach to an appropriate CFRunLoop/IONotificationPort. To avoid race conditions, the matching notifications would ideally be serialized with the CoreAudio notifications/callbacks.
How can this be achieved? Attaching to the runloop returned by CFRunLoopGetCurrent() does not yield any notifications at all; CFRunLoopGetMain() leads to notifications asynchronous to the CoreAudio callbacks.
There is a set of deprecated AudioHardwareAdd/RemoveRunLoopSource() functions, but deprecation aside, at least on Big Sur on Apple Silicon they do not lead to any notifications either.
So how is this supposed to be implemented? Do we really need to introduce locks, even around the process calls? Wasn't it the purpose of runloops to manage exactly these kinds of situations? And, more importantly: where is the documentation?
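(For context, the dispatch-queue variant we have been experimenting with looks roughly like this; the queue is our own serial queue, and serializing it with the CoreAudio callbacks is exactly the open question:)

#include <IOKit/IOKitLib.h>
#include <dispatch/dispatch.h>

// A serial queue of our own; notifications arrive serialized on it,
// but not serialized with the CoreAudio callbacks.
static dispatch_queue_t gNotifyQueue =
    dispatch_queue_create("com.example.audioplugin.ionotify", DISPATCH_QUEUE_SERIAL);

void SetupNotifications()
{
    IONotificationPortRef port = IONotificationPortCreate(kIOMainPortDefault);
    // Deliver notification callbacks on our queue instead of a runloop.
    IONotificationPortSetDispatchQueue(port, gNotifyQueue);
    // ... IOServiceAddMatchingNotification(port, ...) as usual ...
}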
Thanks for any hints,
all the best,
hagen.
Since we have to encode/decode the audio stream to/from our audio device anyway, and we are using NEON SIMD to do so, we could just convert it into a stream of floats on the fly.
Since float is the native CoreAudio sample format, we could probably avoid an additional int-to-float/float-to-int conversion by CoreAudio this way.
Does this make sense?
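(In that case the physical format we would announce would look something like this, sketched for a hypothetical 2-channel device:)

#include <CoreAudio/CoreAudioTypes.h>

AudioStreamBasicDescription MakeFloatFormat(Float64 sampleRate, UInt32 channels)
{
    AudioStreamBasicDescription asbd = {};
    asbd.mSampleRate       = sampleRate;
    asbd.mFormatID         = kAudioFormatLinearPCM;
    asbd.mFormatFlags      = kAudioFormatFlagIsFloat | kAudioFormatFlagIsPacked;
    asbd.mBitsPerChannel   = 32;
    asbd.mChannelsPerFrame = channels;
    asbd.mBytesPerFrame    = channels * sizeof(Float32);
    asbd.mFramesPerPacket  = 1;
    asbd.mBytesPerPacket   = asbd.mBytesPerFrame;
    return asbd;
}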
Thanks,
hagen
Hi,
our CoreAudio server plugin provides the standard kAudioVolumeControlClassID, kAudioMuteControlClassID, and kAudioSoloControlClassID, including kAudioDataSourceControlClassID.
But it looks like controls can be created in a generic way. Due to the signal-processing capabilities of our device it could provide many more controls, but is there any application that is able to present those generic controls?
Will Audio MIDI Setup.app or AU Lab be able to display them? Any DAW?
Thanks,
hagen
At least under macOS Sonoma 14.2.1, kAudioFormatFlagIsBigEndian for 24-bit audio doesn't seem to be supported by the CoreAudio engine when it provides kAudioServerPlugInIOOperationWriteMix streaming buffers to our CoreAudio server plugin.
Is that correct and to be expected? Or how should the AudioStreamBasicDescription be filled out in response to a kAudioStreamPropertyPhysicalFormat request to correctly announce 24-bit big-endian audio to CoreAudio?
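(For reference, the format we are trying to announce corresponds roughly to this sketch, here for 2 channels with packed 3-byte samples:)

#include <CoreAudio/CoreAudioTypes.h>

AudioStreamBasicDescription asbd = {};
asbd.mSampleRate       = 48000.0;
asbd.mFormatID         = kAudioFormatLinearPCM;
asbd.mFormatFlags      = kAudioFormatFlagIsSignedInteger
                       | kAudioFormatFlagIsBigEndian
                       | kAudioFormatFlagIsPacked;
asbd.mBitsPerChannel   = 24;
asbd.mChannelsPerFrame = 2;
asbd.mBytesPerFrame    = 2 * 3;   // 2 channels, packed 24-bit samples
asbd.mFramesPerPacket  = 1;
asbd.mBytesPerPacket   = asbd.mBytesPerFrame;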
Thanks, hagen.
While most of C++20 seems to be available, <syncstream> is missing (Xcode 15.2 (15C500b), macOS SDK 14.2), even with
CLANG_CXX_LANGUAGE_STANDARD = c++20
How can this be made available?
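(As a stopgap we are using a trivial mutex-guarded substitute along these lines:)

#include <iostream>
#include <mutex>
#include <sstream>

// Minimal stand-in for std::osyncstream: collect output in a local
// buffer and flush it under a global lock on destruction.
class SyncOut {
public:
    ~SyncOut() {
        static std::mutex m;
        std::lock_guard<std::mutex> lock(m);
        std::cout << buf_.str();
    }
    template <typename T>
    SyncOut &operator<<(const T &v) { buf_ << v; return *this; }
private:
    std::ostringstream buf_;
};

// Usage: SyncOut() << "thread-safe line " << 42 << '\n';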
How can an app obtain the valid range for the thread_time_constraint_policy_data_t values passed to thread_policy_set()?
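(For reference, the policy is set roughly like this; the nanosecond values are arbitrary examples, and it is exactly their permissible range that I'm asking about:)

#include <mach/mach.h>
#include <mach/mach_time.h>

bool SetTimeConstraintPolicy(thread_t thread, uint64_t periodNs,
                             uint64_t computationNs, uint64_t constraintNs)
{
    // thread_time_constraint_policy expects Mach absolute-time units.
    mach_timebase_info_data_t tb;
    mach_timebase_info(&tb);
    auto toAbs = [&](uint64_t ns) {
        return (uint32_t)(ns * tb.denom / tb.numer);
    };

    thread_time_constraint_policy_data_t policy;
    policy.period      = toAbs(periodNs);
    policy.computation = toAbs(computationNs);
    policy.constraint  = toAbs(constraintNs);
    policy.preemptible = TRUE;

    return thread_policy_set(thread, THREAD_TIME_CONSTRAINT_POLICY,
                             (thread_policy_t)&policy,
                             THREAD_TIME_CONSTRAINT_POLICY_COUNT) == KERN_SUCCESS;
}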
Hi,
we have multiple threads in our CoreAudio server plugin carrying out necessary asynchronous work (namely handling USB callbacks and shuffling the required data to the IO).
Although these threads have been set up with the appropriate THREAD_TIME_CONSTRAINT_POLICY (which does improve things), on M* processors there is an extremely high, non-realtime amount of jitter of >10 ms(!).
Now either the runloop notification from the USB stack arrives that late, or the thread driving the runloop hasn't been set up to handle the callbacks in a timely manner.
Since AudioUnit threads that have to meet the frame deadlines can join the workgroup of the audio device, is there a similar opportunity for CoreAudio server plugin threads? And if so, how should they be set up correctly?
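(What we'd hope for is something along the lines of the os_workgroup API, sketched here under the assumption that a workgroup for the device can be obtained from somewhere; how a server plugin gets hold of it is the open question:)

#include <os/workgroup.h>

void WorkerLoop(os_workgroup_t wg)
{
    os_workgroup_join_token_s token;
    if (os_workgroup_join(wg, &token) == 0) {
        // ... do the per-cycle work aligned with the device's IO deadlines ...
        os_workgroup_leave(wg, &token);
    }
}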
Thanks for any hints! Or pointing me to the docs :)
Hi,
we have a .pkg install package consisting of various sub-packages. One of them contains presets and needs to be installed to the default preset location /Library/Audio/Presets. If this non-binary preset package is the only one in a .pkg choice, notarization fails with:
"logFormatVersion": 1,
"jobId": "*",
"status": "Invalid",
"statusSummary": "Archive contains critical validation errors",
"statusCode": 4000,
"archiveFilename": "mypackage.pkg.zip",
"uploadDate": "2024-08-22T21:24:03.251Z",
"sha256": "*",
"ticketContents": null,
"issues": [
{
"severity": "error",
"code": null,
"path": "mypackage.pkg.zip",
"message": "Package mypackage.pkg.zip has no signed executables or bundles. No tickets can be generated.",
"docUrl": null,
"architecture": null
},
{
"severity": "warning",
"code": null,
"path": "mypackage.pkg.zip/mypackage.pkg",
"message": "b\"Invalid component package: mypackage_vstpreset Distribution file's value: #com.mycompany.mypackage.vstpreset.pkg\\n\"",
"docUrl": null,
"architecture": null
}
]
}
Not sure, but maybe it's worth noting that the offending sub-package only generates a warning, while the parent package seems to escalate this into an error.
How can a non-binary sub-package be included in a notarized parent package?
Any hints or thoughts are highly appreciated. Thanks!
I have various main Xcode projects referencing the same Xcode sub-projects. However, Xcode only allows one of the main projects to be open with the shared sub-project available for accessing its files and building.
How can I create main projects that are able to open a shared sub-project at the same time?
Hi,
our CoreAudio server plugin supports different clock sources. A switch may result in a change of the selectable sample rates (and other settings). On a clock source switch, the plugin reconfigures the set of available kAudioStreamPropertyAvailablePhysicalFormats and announces the change via AudioServerPlugInHostInterface::PropertiesChanged(). However, at least Audio MIDI Setup seems to ignore the change and doesn't update its UI. The changes are only reflected after selecting another device and re-selecting the device of interest. (Latest macOS, M4 Mac mini.)
Is this a bug? Or is our CoreAudio server plugin required to indicate the change in the list of available audio formats differently?
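(For reference, the announcement is made roughly like this; gHost is the AudioServerPlugInHostRef handed to us in Initialize(), and streamID stands for our stream's object ID:)

#include <CoreAudio/AudioServerPlugIn.h>

void AnnounceFormatListChanged(AudioServerPlugInHostRef gHost, AudioObjectID streamID)
{
    AudioObjectPropertyAddress addr = {
        kAudioStreamPropertyAvailablePhysicalFormats,
        kAudioObjectPropertyScopeGlobal,
        kAudioObjectPropertyElementMain
    };
    gHost->PropertiesChanged(gHost, streamID, 1, &addr);
}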
Thanks!
Hi,
macOS (latest macOS, latest hardware, though it doesn't seem to matter) seems to prevent CoreMIDI driver logging via the standard logging facilities (syslog, unified logging).
The only way to log something is to write to a file at one of the rare write-accessible locations for CoreMIDI.
How is this supposed to work? Any hint is highly appreciated. Thanks!
Hi,
when a CoreMIDI driver controls physical hardware, it is probably quite common to have to throttle the amount of MIDI data received from the system.
What comes to mind is to simply delay returning from the MIDIDriverInterface::Send() callback to the calling process. While the application trying to send MIDI indeed stalls until the callback returns, this seems to be only a side effect of a generally stalled CoreMIDI server. Between the callbacks the application can send as much MIDI data as it wants to CoreMIDI; its buffering seems to be endless... However, the hardware might not be able to play out all the data.
It seems there is no way to signal an overflow/full-buffer situation back to the application/CoreMIDI. How is this supposed to work? (A sketch of the delaying approach follows below.)
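(Sketch of the delaying approach described above; HardwareCanAccept() and EnqueueToHardware() are placeholders for our driver's internals:)

#include <CoreMIDI/CoreMIDI.h>
#include <unistd.h>

// Hypothetical driver internals.
bool HardwareCanAccept(const MIDIPacketList *pktlist);
void EnqueueToHardware(const MIDIPacketList *pktlist);

// MIDIDriverInterface::Send - block until the hardware can take the data.
// This stalls the sending client (and apparently the whole MIDI server),
// but there is no error code to report "buffer full" back to the client.
OSStatus MyDriverSend(MIDIDriverRef self, const MIDIPacketList *pktlist,
                      void *destRefCon1, void *destRefCon2)
{
    while (!HardwareCanAccept(pktlist))
        usleep(1000); // crude back-pressure
    EnqueueToHardware(pktlist);
    return noErr;
}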
Thanks, any hints or pointers are highly appreciated!
Hagen.
How can a DriverKit extension (dext) be activated from C++? Specifically, we need to load our dext from our C++ CoreAudio server plugin.
I guess it all boils down to how to call the OSSystemExtensionRequest interface (Swift/Objective-C) from C++...
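(What we have in mind is an Objective-C++ bridge along these lines, compiled as a .mm file; the bundle identifier is a placeholder and the delegate is reduced to the bare minimum:)

// ActivateDext.mm - Objective-C++, callable from plain C++ via ActivateDext().
#import <SystemExtensions/SystemExtensions.h>

@interface DextRequestDelegate : NSObject <OSSystemExtensionRequestDelegate>
@end

@implementation DextRequestDelegate
- (OSSystemExtensionReplacementAction)request:(OSSystemExtensionRequest *)request
                  actionForReplacingExtension:(OSSystemExtensionProperties *)existing
                                withExtension:(OSSystemExtensionProperties *)ext {
    return OSSystemExtensionReplacementActionReplace;
}
- (void)requestNeedsUserApproval:(OSSystemExtensionRequest *)request {}
- (void)request:(OSSystemExtensionRequest *)request
    didFinishWithResult:(OSSystemExtensionRequestResult)result {}
- (void)request:(OSSystemExtensionRequest *)request
    didFailWithError:(NSError *)error {}
@end

extern "C" void ActivateDext(void)
{
    static DextRequestDelegate *delegate = [DextRequestDelegate new];
    OSSystemExtensionRequest *req =
        [OSSystemExtensionRequest activationRequestForExtension:@"com.example.mydext" // placeholder
                                                          queue:dispatch_get_main_queue()];
    req.delegate = delegate;
    [[OSSystemExtensionManager sharedManager] submitRequest:req];
}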
Thanks for any hints!