Hello,
I am setting up Mac minis as CI machines (using gitlab-runner) for my team. We develop mostly audio software, and some of our unit tests use audio input through AVAudioSession/AVAudioEngine.
These CI jobs trigger a microphone authorization pop-up on the Mac minis, asking for permission to give gitlab-runner access to the microphone. Once the authorization is given, subsequent jobs run fine.
My issue is that the Mac minis are updated regularly with scripts, and since the path of the gitlab-runner binary, installed with Homebrew, changes with every version, the pop-up is triggered again every time gitlab-runner is updated.
As we add more and more CI runners, maintaining this manually is becoming impossible.
Is there a way to either disable this security check or script the microphone authorization for a given binary?
Thank you for your help!
Tom
Hello,
I'm facing an issue with Xcode 15 and iOS 17: it seems impossible to get AVAudioEngine's audio input node to work in the simulator.
inputNode reports a 0-channel, 0 Hz input format,
connecting the input node to any other node or installing a tap on it fails systematically.
What we tested:
Everything works fine on iOS simulators <= 16.4, even with Xcode 15.
Nothing works on iOS simulator 17.0 on Xcode 15.
Everything works fine on iOS 17.0 device with Xcode 15.
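For concreteness, here is a minimal sketch of the kind of setup that fails (assuming record permission has already been granted; buffer size and category are arbitrary):

```swift
import AVFoundation

do {
    // Assumes microphone permission has already been granted.
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord)
    try session.setActive(true)

    let engine = AVAudioEngine()
    let format = engine.inputNode.inputFormat(forBus: 0)
    // On the iOS 17.0 simulator this reports 0 ch / 0 Hz; on device and on
    // simulators <= 16.4 it reports a valid hardware format.
    print("input format: \(format.channelCount) ch, \(format.sampleRate) Hz")

    // Installing a tap with an invalid (0 ch / 0 Hz) format fails.
    engine.inputNode.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
        // Process microphone buffers here.
    }
    try engine.start()
} catch {
    print("Setup failed: \(error)")
}
```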
More details on this here: https://github.com/Fesongs/InputNodeFormat
Any idea on this? Something I'm missing?
Thanks for your help 🙏
Tom
PS: I filed a bug on Feedback Assistant, but it usually takes ages to get any answer so I'm also trying here 😉
Topic: Media Technologies
SubTopic: Audio
Tags: AVAudioSession, AVAudioEngine, AVAudioNode, AVFoundation
Hello,
Here is an issue I encountered recently. Does anybody have feedback on this?
Issue encountered
AVAudioFile
throws when opening WAV files and MPEG-DASH files that have an .mp3 extension,
works fine with many other tested combinations of format and extension (for example, an AIFF file with an .mp3 extension is read by AVAudioFile without error).
The Music app, AVAudioFile and ExtAudioFile all fail on the same files.
However, previewing an audio file in Finder (select the file and hit the space bar) works regardless of the file extension.
Why do I consider this an issue?
AVAudioFile seems to rely on the file extension to guess the audio format sometimes, but not always, which leads to unexpected errors.
I would expect AVAudioFile to deal properly with wrong extensions for all supported audio formats.
⚠️ This behaviour can cause real trouble in iOS and macOS applications using audio files coming from the user, which often have unreliable extensions.
I published some code to easily reproduce the issue:
https://github.com/ThomasHezard/AVAudioFileFormatIssue
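In essence, the repro boils down to the following (the path is hypothetical; the file contains valid WAV data but carries an .mp3 extension):

```swift
import AVFoundation

// Hypothetical path: a file with valid WAV data but a wrong (.mp3) extension.
let url = URL(fileURLWithPath: "/tmp/actually-a-wav.mp3")
do {
    let file = try AVAudioFile(forReading: url)
    print("Opened, format: \(file.processingFormat)")
} catch {
    // This is the unexpected error: the audio data itself is perfectly valid.
    print("AVAudioFile threw: \(error)")
}
```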
Thank you everybody, have a great day 😎
Hello,
I encountered an issue with AVAssetExportSession.
My iOS app uses an AVAssetExportSession with AVAssetExportPresetPassthrough to export audio files from the media library for later use.
I noticed that if I export the same m4a file several times, the resulting file is never the same: the exported files sound exactly the same, but a few bytes differ between them.
This does not seem to be the case for other formats (wav and aiff for example).
This is actually an issue for me for several reasons.
Do you know if this is an expected behaviour or a bug?
Is there a way to obtain the same output every time in the case of m4a files?
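For context, the export is essentially the following (a sketch; paths are hypothetical):

```swift
import AVFoundation

// Passthrough export sketch; both paths are hypothetical.
let asset = AVAsset(url: URL(fileURLWithPath: "/path/to/source.m4a"))
guard let export = AVAssetExportSession(asset: asset,
                                        presetName: AVAssetExportPresetPassthrough) else {
    fatalError("Passthrough preset unavailable for this asset")
}
export.outputFileType = .m4a
export.outputURL = URL(fileURLWithPath: "/path/to/exported.m4a")
export.exportAsynchronously {
    // Running this twice on the same m4a input produces two files that
    // sound identical but differ by a few bytes.
    print("status: \(export.status.rawValue), error: \(String(describing: export.error))")
}
```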
Thank you for your help 🙏
Archiving for Mac Catalyst: Unable to find a destination matching the provided destination specifier
Hello,
I export xcframeworks for iOS, iOS simulator and macCatalyst with a script calling xcodebuild commands:
archive for iphoneos
archive for iphonesimulator
archive for maccatalyst
create an xcframework from the three frameworks, with dSYMs and BCSymbolMaps.
I'm struggling with one of my projects refusing to archive for macCatalyst.
The macCatalyst archiving command is basically the following (same for all my projects):
xcodebuild archive -project $PROJECT_PATH -scheme $SCHEME -destination "generic/platform=macOS,variant=Mac Catalyst" [...]
The error I obtain is the following (formatted with xcpretty, I replaced sensitive info with XXX):
▸ xcodebuild: error: Unable to find a destination matching the provided destination specifier:
▸ { generic:1, platform:macOS, variant:Mac Catalyst }
▸ Available destinations for the "XXX" scheme:
▸ { platform:macOS, arch:arm64, variant:Mac Catalyst, id:XXX }
▸ { platform:macOS, arch:x86_64, variant:Mac Catalyst, id:XXX }
▸ { platform:macOS, arch:arm64, variant:Designed for [iPad,iPhone], id:XXX }
▸ { platform:iOS Simulator, id:XXX, OS:15.5, name:test-simulator }
▸ Ineligible destinations for the "XXX" scheme:
▸ { platform:iOS, id:dvtdevice-DVTiPhonePlaceholder-iphoneos:placeholder, name:Any iOS Device }
▸ { platform:iOS Simulator, id:dvtdevice-DVTiOSDeviceSimulatorPlaceholder-iphonesimulator:placeholder, name:Any iOS Simulator Device }
I compared with another project that archives fine; the only difference I found is the deployment targets (the failing project needs iOS 14.0+):
[screenshots comparing the deployment targets of the project archiving for Mac Catalyst and the project NOT archiving for Mac Catalyst]
In Xcode, the "Any Mac" destination is available for the first project, but not for the second [screenshots].
I have the same issue on Intel and Apple Silicon Macs, with Xcode 13.2, 13.3 and 13.4.
I tried changing the deployment targets of the first project to reproduce the issue, but it still archives for Mac Catalyst without any problem, even though I cannot find any other difference between the two projects' settings, which really bugs me 😭
I tried adding name=Any Mac to the destination, as suggested here and there on the web for this issue, but it did not help.
Any help would be really appreciated 🤗
Hello all,
I've been working with custom AudioUnits for AVAudioEngine on iOS for a while (Audio Unit v3), and I now want to build an Audio Unit plug-in for macOS DAWs (Logic Pro or Ableton Live, for example). I've been reading about the subject for a while, but I can't manage to understand how it works.
It seems that part of the solution is Audio Unit Extensions, as I saw in WWDC 2015 session 508 and WWDC 2016 session 507, but I don't get how I can reuse the .appex product in Logic Pro or Ableton Live. I've used third-party Audio Units in these DAWs, but I always get them as .component bundles, which are copied into the /Library/Audio/Plug-Ins directory. I tried copying the .appex into the same folder, but it doesn't seem to be recognized by Logic Pro.
Any idea what I am missing here?
Thanks to you all 🙂
Tom
Hi,
I'm working with USB MIDI devices in iOS apps, and I'm currently looking for a way to identify them. I know for a fact that reading the MIDI properties through the CoreMIDI API is not sufficiently reliable, as some devices return only generic information like "USB MIDI DEVICE" but are nonetheless recognized by other apps.
After some research, it appears that the best solution is to get the USB ID of the device (VID/PID: vendor ID and product ID). My question is the following: how can I get the USB ID of a USB MIDI device on iOS? The IOKit API is not available on iOS, so what is the solution to access USB devices on iOS?
Thank you all for your attention,
Best,
Thomas
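For reference, the CoreMIDI properties I mentioned can be read like this; as described above, some devices only return generic strings here, which is exactly the problem:

```swift
import CoreMIDI

// Enumerate MIDI devices and print the name/manufacturer properties.
// Some USB devices only report generic values ("USB MIDI DEVICE") here,
// which is why these strings are not reliable identifiers on their own.
for index in 0..<MIDIGetNumberOfDevices() {
    let device = MIDIGetDevice(index)
    var name: Unmanaged<CFString>?
    var manufacturer: Unmanaged<CFString>?
    MIDIObjectGetStringProperty(device, kMIDIPropertyName, &name)
    MIDIObjectGetStringProperty(device, kMIDIPropertyManufacturer, &manufacturer)
    let nameString = (name?.takeRetainedValue() as String?) ?? "?"
    let manufacturerString = (manufacturer?.takeRetainedValue() as String?) ?? "?"
    print("\(nameString) / \(manufacturerString)")
}
```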