I am converting the example code in Learning Core Audio by Adamson & Avila to Swift. In one of the examples, they use Apple's CARingBuffer C++ code. In trying to get that working, I get a warning:
'OSAtomicCompareAndSwap32Barrier' is deprecated: first deprecated in macOS 10.12 - Use std::atomic_compare_exchange_strong() from <atomic> instead
I'm not familiar with C++ and I'm having trouble figuring out how to use atomic_compare_exchange_strong(). I've also had trouble figuring out what OSAtomicCompareAndSwap32Barrier is supposed to do.
The only place it is called in CARingBuffer is
void CARingBuffer::SetTimeBounds(SampleTime startTime, SampleTime endTime)
{
    UInt32 nextPtr = mTimeBoundsQueuePtr + 1;
    UInt32 index = nextPtr & kGeneralRingTimeBoundsQueueMask;

    mTimeBoundsQueue[index].mStartTime = startTime;
    mTimeBoundsQueue[index].mEndTime = endTime;
    mTimeBoundsQueue[index].mUpdateCounter = nextPtr;

    CAAtomicCompareAndSwap32Barrier(mTimeBoundsQueuePtr, mTimeBoundsQueuePtr + 1, (SInt32*)&mTimeBoundsQueuePtr);
}
The call to CAAtomicCompareAndSwap32Barrier directly calls OSAtomicCompareAndSwap32Barrier.
Even with the deprecation warning, the code performs as expected, but I'd like to eliminate the warning.
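From reading the <atomic> documentation, I think the replacement would look something like this, assuming mTimeBoundsQueuePtr were redeclared in CARingBuffer.h as std::atomic<UInt32> (this is an untested sketch on my part, not working code):

#include <atomic>

// Untested sketch, assuming the member is redeclared in CARingBuffer.h as
//     std::atomic<UInt32> mTimeBoundsQueuePtr;
void CARingBuffer::SetTimeBounds(SampleTime startTime, SampleTime endTime)
{
    UInt32 nextPtr = mTimeBoundsQueuePtr + 1;   // atomic load of the current value, plus 1
    UInt32 index = nextPtr & kGeneralRingTimeBoundsQueueMask;

    mTimeBoundsQueue[index].mStartTime = startTime;
    mTimeBoundsQueue[index].mEndTime = endTime;
    mTimeBoundsQueue[index].mUpdateCounter = nextPtr;

    // compare_exchange_strong with the default std::memory_order_seq_cst
    // should provide at least as strong a barrier as
    // OSAtomicCompareAndSwap32Barrier did.
    UInt32 expected = nextPtr - 1;
    mTimeBoundsQueuePtr.compare_exchange_strong(expected, nextPtr);
}

Is that roughly the right idea?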
I've written some code that can be compiled in two different ways using
#if XXX
...
#else
...
#endif
I then set up two different project targets. In one target under Swift Compiler - Custom Flags / Active Compilation Conditions, I define XXX. In the other, I don't.
Using the two project targets, I can compile the program two different ways.
However, if I introduce an error into a section of code that is going to be ignored, Xcode reports an error and won't compile.
Does the compiler truly ignore code that is in a failed #if block or does the code end up in the compiled code with a runtime check to not run?
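For reference, here's a stripped-down version of the pattern (XXX stands in for the condition I add under Active Compilation Conditions in one target; buildVariantName is just a made-up function for illustration):

func buildVariantName() -> String {
    #if XXX
    // Only compiled when XXX is listed under Active Compilation Conditions.
    return "variant A"
    #else
    // Compiled in the target that does not define XXX.
    return "variant B"
    #endif
}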
When running my code, I got the following messages in the console:
malloc: *** error for object 0x16fdfee70: pointer being freed was not allocated
malloc: *** set a breakpoint in malloc_error_break to debug
I understand the first message is coming from a bug in my code. The second is trying to help me figure out the issue.
The question is how do I set this helpful breakpoint?
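(My guess is that this means a symbolic breakpoint on the function malloc_error_break, set either in Xcode's breakpoint navigator or with the lldb command "breakpoint set --name malloc_error_break", but I'd like to confirm that.)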
FWIW, I've seen similar "set a breakpoint in ..." messages before in other cases.
This is on an M1 Mac mini with macOS Monterey.
I am trying to write an audio network using AVAudioEngine as opposed to AUGraph (which I understand is deprecated in favor of AVAudioEngine). My code works properly with AUGraph.
The input is a microphone, which has a sample rate of 8 kHz. In the render proc, the data is written to a ring buffer. Debugging shows that the render proc is called every 0.064 seconds and writes 512 samples (8000 × 0.064 = 512).
The program creates an AVAudioSourceNode. The render block for that node pulls data from the above ring buffer. But debugging shows that it is trying to take 512 samples about every 0.0107 seconds. That works out to 48000 samples per second, which is the output device sample rate. Obviously the ring buffer can't keep up.
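Here's a stripped-down version of the source node setup. (The real render block reads from the ring buffer; here it just zero-fills so the snippet stands on its own.)

import AVFoundation

let srcNode = AVAudioSourceNode { _, _, frameCount, audioBufferList -> OSStatus in
    // In the real code this copies frameCount samples out of the ring buffer.
    // Debug logging here shows the requests arriving at a 48 kHz rate,
    // not the 8 kHz rate the ring buffer is being filled at.
    let ablPointer = UnsafeMutableAudioBufferListPointer(audioBufferList)
    for buffer in ablPointer {
        memset(buffer.mData, 0, Int(buffer.mDataByteSize))
    }
    return noErr
}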
In the statement connecting the above source node to the AVAudioEngine's mixer node, I specify (at least I think I do) a sample rate of 8000, but it still seems to be running at 48000.
let inputFormat = AVAudioFormat(
    commonFormat: outputFormat.commonFormat,
    sampleRate: 8000,
    channels: 1,
    interleaved: outputFormat.isInterleaved)
engine.connect(srcNode, to: mixerNode, fromBus: 0, toBus: 0, format: inputFormat)
Also, looking at the microphone input using Audio MIDI Setup shows that the microphone format is 8000 Hz, 1 channel, 16-bit integer, but when I examine the input format of the AudioNode it is reported as 8000 Hz, 1 channel, 32-bit float. The input node is using HAL. Obviously, somewhere in the internals of the node the samples are being converted from 16-bit ints to 32-bit floats. Is there a way to also have the sample rate changed?
Am I doing this wrong? The HAL node was used with AUGraph. Is there a different node that should be used with AVAudioEngine? I see that AVAudioEngine has an input node, but it seems that if I connect it to the microphone, the input goes straight to the hardware output without going through the mixer node (where I want to mix in other audio sources).
The original AUGraph code was modeled after the code in "Learning Core Audio" by Adamson & Avila, which, although it is old (pre-dating Swift and AVAudioEngine), is the only detailed reference on CoreAudio that I have been able to find. Is there a newer reference?
Thanks,
Mark
I have downloaded the WWDC signal generator example code for the 2019 session 510, "What's New in AVAudioEngine,"
at link
When I run it in Xcode 13.2 on macOS 12.3 on an M1 Mac mini, on line 99
let mainMixer = engine.mainMixerNode
I get 9 lines
2022-03-30 21:09:19.288011-0400 SignalGenerator[52247:995478] throwing -10878
2022-03-30 21:09:19.288351-0400 SignalGenerator[52247:995478] throwing -10878
2022-03-30 21:09:19.288385-0400 SignalGenerator[52247:995478] throwing -10878
2022-03-30 21:09:19.288415-0400 SignalGenerator[52247:995478] throwing -10878
2022-03-30 21:09:19.288440-0400 SignalGenerator[52247:995478] throwing -10878
2022-03-30 21:09:19.288467-0400 SignalGenerator[52247:995478] throwing -10878
2022-03-30 21:09:19.288491-0400 SignalGenerator[52247:995478] throwing -10878
2022-03-30 21:09:19.288534-0400 SignalGenerator[52247:995478] throwing -10878
2022-03-30 21:09:19.288598-0400 SignalGenerator[52247:995478] throwing -10878
in the console output.
-10878 is kAudioUnitErr_InvalidParameter (invalid parameter).
But the program seems to run as expected.
Can this just be ignored, or does it indicate improper setup?
I'm trying to figure out how to set the volume of a CoreAudio AudioUnit. I found the parameter kHALOutputParam_Volume, but I can't find anything about it.
I called AudioUnitGetPropertyInfo and that told me that the parameter is 4 bytes long and writeable. How can I find out whether that is an Int32, UInt32, Float32 or some other type and what acceptable values are and mean?
I used AudioUnitGetProperty and read it as either Int32 (512) or Float32 (7.17e-43).
Is there any documentation on this and other parameters?
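For what it's worth, here's the kind of call I'd expect to work if kHALOutputParam_Volume behaves like a normal AudioUnit parameter (outputUnit would be my AUHAL output unit; I'm not sure Global scope and element 0 are right):

import AudioToolbox

// Guess at reading the volume as a regular parameter; AudioUnitParameterValue
// is a Float32. I'm not certain about the scope and element.
func readHALOutputVolume(of outputUnit: AudioUnit) -> Float32? {
    var volume: AudioUnitParameterValue = 0
    let status = AudioUnitGetParameter(outputUnit,
                                       kHALOutputParam_Volume,
                                       kAudioUnitScope_Global,
                                       0,
                                       &volume)
    return status == noErr ? volume : nil
}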
I'm teaching myself CoreAudio programming on the Mac (I have a need).
Sometimes the documentation refers to a .h file. Can those files be viewed in Xcode? If so, how do I find them?
For example the help page for AudioUnitParameter says this in the Overview:
This data structure is used by functions declared in the AudioToolbox/AudioUnitUtilities.h header file in macOS.
Is it possible to determine which error matched the pattern in a catch statement? Also, is it possible to get any associated value(s) with the error that was matched?
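To make the question concrete, here's a toy example (FetchError is made up):

enum FetchError: Error {
    case timedOut(seconds: Double)
    case badStatus(code: Int)
}

func fetch() throws {
    throw FetchError.badStatus(code: 404)
}

do {
    try fetch()
} catch FetchError.timedOut(let seconds) {
    // The pattern itself says which case matched and binds the associated value.
    print("timed out after \(seconds) s")
} catch let error as FetchError {
    // Here `error` is a FetchError; a switch could recover the case
    // and its associated values.
    print("some other FetchError:", error)
} catch {
    print("unrelated error:", error)
}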
Since the upgrade to Xcode 13.2, every time I run my program, the output window is filling up with messages that include CVDisplayLink.
What's this all about?
I'm using Xcode 13.2 to write SwiftUI apps for personal use on my M1 Mac mini with Monterey 12.2. After upgrading to 12.2, the first time I launched one of my apps, I got two pop-up screens (see attachments).
I'm using a personal team account for development.
Rebuilding the app seems to clear things up. Do I have to rebuild apps after every OS upgrade?
Mark
I'm developing an iPad app in Xcode 13.1. I'm using the iPad Air 2 simulator (that's the physical iPad I have). I've set the simulator to be in landscape mode with the home button to the right. When my app loads, it always loads in portrait mode. If I rotate the simulator to portrait and back to landscape, the app screen rotates as desired. Why isn't the app screen properly rotated when the app first launches in the simulator?
Thanks,
Mark
Apple's automatic update of Xcode 13.1 to 13.2 broke two of my projects and I'm dead in the water. I was using the App Store version. Is it possible to download Xcode 13.1? All I can find is 13.2.
With Xcode 13.2, every time I rebuild my Mac application, the first time I run it, it asks for permission to use my current location (the app does use that). With Xcode 13.1, it only asked the first time I ran the app, not every time I rebuilt it. Now it seems to treat it as a new app on every rebuild. I looked in Privacy settings and the app is only listed once. Is my Mac filling up with permissions to use my location for each build of my app?
struct ModelDemo {
    let callback: () -> ()
}

final class ViewModelDemo {
    let modelDemo: ModelDemo

    init() {
        modelDemo = ModelDemo(callback: self.modelCallback)
    }

    private func modelCallback() {
    }
}
The above generates the error "'self' used before all stored properties are initialized."
I understand the error, but this is a common programming pattern for me in other languages. Is there a preferred way to do something like this in Swift?
My current workaround is to make the callback property of ModelDemo into an optional and initialize it to nil, then to set it from the ViewModelDemo init() after it (and other ViewModel properties) have been initialized.
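In code, the workaround looks roughly like this (simplified):

struct ModelDemo {
    // Optional so ModelDemo can be created before the callback exists.
    var callback: (() -> ())? = nil
}

final class ViewModelDemo {
    var modelDemo = ModelDemo()

    init() {
        // All stored properties already have values here, so using self is allowed.
        modelDemo.callback = self.modelCallback
    }

    private func modelCallback() {
    }
}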
FWIW, the intent of the code is to give the Model a way to inform the ViewModel that something in the Model has changed. Since the Model is supposed to be isolated from the View, I don't think I should use ObservableObject, as that's a SwiftUI feature.
When a Swift program is executing a switch statement, does it look at each case in turn to see which one to execute? Would it make sense to put the cases more likely to be chosen closer to the top of the statement?
With a switch statement on an enum with a large number of cases (over 100), would it make sense to replace the switch with a dictionary of closures?
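Something like this toy example is what I have in mind:

enum Command {
    case start, stop, reset    // imagine 100+ cases
}

let handlers: [Command: () -> Void] = [
    .start: { print("starting") },
    .stop:  { print("stopping") },
    .reset: { print("resetting") },
]

func handle(_ command: Command) {
    // One hashed lookup instead of walking down a long list of cases.
    handlers[command]?()
}

handle(.stop)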