Can someone help me identify the source of this crash that I see in crash logs?
@AVFoundation engineers, my users are seeing the following crash, which I cannot reproduce on my end.
*** -[__NSArrayI objectAtIndexedSubscript:]: index 2 beyond bounds [0 .. 1]
Fatal Exception: NSRangeException
0 CoreFoundation 0x99288 __exceptionPreprocess
1 libobjc.A.dylib 0x16744 objc_exception_throw
2 CoreFoundation 0x1a431c -[__NSCFString characterAtIndex:].cold.1
3 CoreFoundation 0x4c96c CFDateFormatterCreateStringWithAbsoluteTime
4 AVFCapture 0x6cad4 -[AVCaptureConnection getAvgAudioLevelForChannel:]
All I am doing is this:
func updateMeters() {
    var channelCount = 0
    var decibels: [Float] = []
    if let audioConnection = self.audioConnection {
        for audioChannel in audioConnection.audioChannels {
            decibels.append(audioChannel.averagePowerLevel)
            channelCount += 1
        }
    }
}
What am I doing wrong?
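For reference, here is a more defensive variant I am considering. This is only a sketch built on an assumption: that audioChannels can change underneath the loop when the audio route changes (for example, a 3-channel mic disconnecting mid-meter-update), which would match the "index 2 beyond bounds [0 .. 1]" in the trace. I am not certain this avoids the crash, since the out-of-bounds access appears to happen inside AVFCapture itself.

```swift
import AVFoundation

func updateMetersDefensively() {
    guard let connection = self.audioConnection else { return }
    // Snapshot the channel array once, then re-check the live count before
    // each read in case a route change shrinks it mid-iteration.
    let channels = connection.audioChannels
    var decibels: [Float] = []
    for (index, channel) in channels.enumerated() {
        guard index < connection.audioChannels.count else { break }
        decibels.append(channel.averagePowerLevel)
    }
}
```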
I see in Crashlytics that a few users are hitting this exception when connecting the inputNode to the mainMixerNode of AVAudioEngine:
Fatal Exception: com.apple.coreaudio.avfaudio
required condition is false:
format.sampleRate == hwFormat.sampleRate
Here is my code:
self.engine = AVAudioEngine()
let format = engine.inputNode.inputFormat(forBus: 0)
//main mixer node is connected to output node by default
engine.connect(self.engine.inputNode, to: self.engine.mainMixerNode, format: format)
I just want to understand how this error can occur and what the right fix is.
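My current guess is that the hardware format is momentarily invalid (for example, during a route change, or before the AVAudioSession is active), so the format captured earlier no longer matches the hardware when connect is called. A guard like the following might avoid the crash; connectInput is a hypothetical helper name, and the checks are an assumption, not a documented fix:

```swift
import AVFoundation

func connectInput(of engine: AVAudioEngine) {
    let input = engine.inputNode
    // Re-query the hardware format immediately before connecting; a stale or
    // zero-sample-rate format would trip the sampleRate == hwFormat.sampleRate
    // assertion inside AVAudioEngine.
    let hwFormat = input.inputFormat(forBus: 0)
    guard hwFormat.sampleRate > 0, hwFormat.channelCount > 0 else {
        return // bail out (or retry once the session is active) instead of crashing
    }
    engine.connect(input, to: engine.mainMixerNode, format: hwFormat)
}
```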
I filed a bug, and its status in Feedback Assistant now shows "Potential fix identified - In iOS 15". But the bug is still present in iOS 15 beta 6. What does this status mean? Does it mean the fix will ship in the final iOS 15 release?
iOS 16 has serious bugs in UIKit with no known workarounds, so serious that I had to remove my app from sale. I am obviously affected and desperate to know whether anyone has found a workaround, or whether UIKit engineers can suggest one.
To summarise: calling setNeedsUpdateOfSupportedInterfaceOrientations shortly (within 0.5 seconds) after app launch does not trigger autorotation. It does trigger autorotation when the app is launched from the debugger, but not when the app is launched directly on the device. Perhaps the debugger introduces a delay that is sufficient for autorotation to trigger? I have filed FB11516363, but I desperately need a workaround to get my app back on the App Store.
I get this error when I run my code on an iOS 14 device using Xcode 14.
dyld: Symbol not found: _$s7SwiftUI15GraphicsContextV4fill_4with5styleyAA4PathV_AC7ShadingVAA9FillStyleVtF
Referenced from: ******
Expected in: /System/Library/Frameworks/SwiftUI.framework/SwiftUI
I do not get any errors when I run the same code on an iOS 15 or iOS 16 device.
How do I resolve this issue?
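My working assumption is that the missing symbol is GraphicsContext.fill(_:with:style:), which only exists on iOS 15 and later, and that referencing it without an availability gate makes dyld resolve it eagerly at launch on iOS 14. A sketch of the gated approach (MeterView is a hypothetical view name):

```swift
import SwiftUI

struct MeterView: View {
    var body: some View {
        if #available(iOS 15.0, *) {
            // Canvas and GraphicsContext are iOS 15+; gating them here makes
            // the symbol reference weakly linked, so iOS 14 can still launch.
            Canvas { context, size in
                context.fill(Path(CGRect(origin: .zero, size: size)),
                             with: .color(.red))
            }
        } else {
            Color.red // iOS 14 fallback drawing
        }
    }
}
```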
This is another strange issue on iOS 16, and one among the many that cause autorotation problems on the platform. At app startup, there is a mismatch between -[UIViewController interfaceOrientation] (now deprecated), -[UIViewController supportedInterfaceOrientations] (which returns .landscapeRight), and -[UIWindowScene interfaceOrientation].
public var windowOrientation: UIInterfaceOrientation {
return view.window?.windowScene?.interfaceOrientation ?? .unknown
}
override func viewDidLayoutSubviews() {
super.viewDidLayoutSubviews()
layoutInterfaceForOrientation(windowOrientation)
}
override var supportedInterfaceOrientations: UIInterfaceOrientationMask {
    return .landscapeRight
}
Putting a breakpoint in viewDidLayoutSubviews reveals that windowOrientation is .portrait while interfaceOrientation is .landscapeRight.
Is there any way this can be fixed, or is any workaround possible?
I am unable to understand the sentence "Default is YES for all apps linked on or after iOS 15.4" found in Apple's documentation. What does it actually mean? Is the default YES for any app running on iOS 15.4 or later? Or is it YES only when the app is built against the iOS 15.4 SDK or later? Or is it something else?
I am seeing a strange issue with autorotation on iOS 16 that is not seen in other iOS versions. Worse, the issue is NOT seen if I connect the device to Xcode and debug. It is ONLY seen when I launch the app directly on the device after installing it, which is why I have been unable to identify a fix. Here is a summary of the issue.
I disable autorotation in the app until the camera session starts running. Once the camera session is running, I fire a notification to force autorotation to the current device orientation.
var disableAutoRotation: Bool {
    return !cameraSessionRunning
}
override var supportedInterfaceOrientations: UIInterfaceOrientationMask {
var orientations:UIInterfaceOrientationMask = .landscapeRight
if !self.disableAutoRotation {
orientations = .all
}
return orientations
}
func cameraSessionStartedRunning(_ session:AVCaptureSession?) {
DispatchQueue.main.asyncAfter(deadline: .now(), execute: {
/*
* HELP: This code takes effect only when debugging directly from Xcode,
* not when launching the app directly on the device!
*/
cameraSessionRunning = true
if #available(iOS 16.0, *) {
UIView.performWithoutAnimation {
self.setNeedsUpdateOfSupportedInterfaceOrientations()
}
} else {
// Fallback on earlier versions
UIViewController.attemptRotationToDeviceOrientation()
}
self.layoutInterfaceForOrientation(self.windowOrientation)
})
}
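Until a real fix is known, this is the workaround I am experimenting with. It rests on an assumption: that the immediate call is ignored when the app launches outside the debugger, so a second call after launch has settled may succeed. The 0.5 s delay is a guess, not a documented value.

```swift
import UIKit

@available(iOS 16.0, *)
extension UIViewController {
    // Hypothetical helper: request the orientation update now, then retry
    // once after a short delay in case the first request is dropped.
    func forceAutorotationWithRetry(delay: TimeInterval = 0.5) {
        UIView.performWithoutAnimation {
            self.setNeedsUpdateOfSupportedInterfaceOrientations()
        }
        DispatchQueue.main.asyncAfter(deadline: .now() + delay) { [weak self] in
            self?.setNeedsUpdateOfSupportedInterfaceOrientations()
        }
    }
}
```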
With a 4-channel audio input, I get an error while recording a movie using AVAssetWriter with the LPCM codec. Are 4 audio channels not supported by AVAssetWriterInput? Here are my compression settings:
var aclSize:size_t = 0
var currentChannelLayout:UnsafePointer<AudioChannelLayout>? = nil
/*
* outputAudioFormat = CMSampleBufferGetFormatDescription(sampleBuffer)
* for the latest sample buffer received in captureOutput sampleBufferDelegate
*/
if let outputFormat = outputAudioFormat {
currentChannelLayout = CMAudioFormatDescriptionGetChannelLayout(outputFormat, sizeOut: &aclSize)
}
var currentChannelLayoutData:Data = Data()
if let currentChannelLayout = currentChannelLayout, aclSize > 0 {
currentChannelLayoutData = Data.init(bytes: currentChannelLayout, count: aclSize)
}
let numChannels = AVAudioSession.sharedInstance().inputNumberOfChannels
audioSettings[AVSampleRateKey] = 48000.0
audioSettings[AVFormatIDKey] = kAudioFormatLinearPCM
audioSettings[AVLinearPCMIsBigEndianKey] = false
audioSettings[AVLinearPCMBitDepthKey] = 16
audioSettings[AVNumberOfChannelsKey] = numChannels
audioSettings[AVLinearPCMIsFloatKey] = false
audioSettings[AVLinearPCMIsNonInterleaved] = false
audioSettings[AVChannelLayoutKey] = currentChannelLayoutData
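One thing I plan to try is building the channel layout explicitly instead of copying it from the format description, in case the copied layout is what the writer rejects. This is only a sketch: kAudioChannelLayoutTag_Quadraphonic is an assumption about what a 4-channel input delivers, and the real hardware tag may differ.

```swift
import AVFoundation

// Build an explicit 4-channel layout and serialize it for AVChannelLayoutKey.
var layout = AudioChannelLayout()
layout.mChannelLayoutTag = kAudioChannelLayoutTag_Quadraphonic // assumed tag
let layoutData = withUnsafeBytes(of: &layout) { Data($0) }
audioSettings[AVChannelLayoutKey] = layoutData
```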
It is very hard to tell from AVFoundation error codes what the exact error is. For instance, I see the following message and do not know what the error code means:
This looks like a new bug in iOS 16.1 (beta 5) where AVMultiCamSession outputs silent audio frames when both the back and front mics have been added to it. The issue is not seen in iOS 16.0.3 or earlier. I cannot reproduce it with the AVMultiCamPIP sample code, so I suspect some AVAudioSession or AVMultiCamSession configuration in my own code is the cause. Setting captureSession.usesApplicationAudioSession = true also avoids the issue, but then I do not get audio samples from both microphones.
Here is the code:
public func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection)
{
if let videoDataOutput = output as? AVCaptureVideoDataOutput {
processVideoSampleBuffer(sampleBuffer, fromOutput: videoDataOutput)
} else if let audioDataOutput = output as? AVCaptureAudioDataOutput {
processAudioSampleBuffer(sampleBuffer, fromOutput: audioDataOutput)
}
}
private var lastDumpTime:TimeInterval?
private func processAudioSampleBuffer(_ sampleBuffer: CMSampleBuffer, fromOutput audioDataOutput: AVCaptureAudioDataOutput) {
if lastDumpTime == nil {
lastDumpTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer).seconds
}
let time = CMSampleBufferGetPresentationTimeStamp(sampleBuffer).seconds
if time - lastDumpTime! >= 1.0 {
dumpAudioSampleBuffer(sampleBuffer)
lastDumpTime = time
}
}
private func dumpAudioSampleBuffer(_ sampleBuffer:CMSampleBuffer) {
NSLog("Dumping audio sample buffer")
var audioBufferList = AudioBufferList(mNumberBuffers: 1,
mBuffers: AudioBuffer(mNumberChannels: 0, mDataByteSize: 0, mData: nil))
var buffer: CMBlockBuffer? = nil
CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(sampleBuffer, bufferListSizeNeededOut: nil, bufferListOut: &audioBufferList, bufferListSize: MemoryLayout.size(ofValue: audioBufferList), blockBufferAllocator: nil, blockBufferMemoryAllocator: nil, flags: UInt32(kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment), blockBufferOut: &buffer)
// Create UnsafeBufferPointer from the variable length array starting at audioBufferList.mBuffers
withUnsafePointer(to: &audioBufferList.mBuffers) { ptr in
let buffers = UnsafeBufferPointer<AudioBuffer>(start: ptr, count: Int(audioBufferList.mNumberBuffers))
for buf in buffers {
// Create UnsafeBufferPointer<Int16> from the buffer data pointer
let numSamples = Int(buf.mDataByteSize)/MemoryLayout<Int16>.stride
let samples = buf.mData!.bindMemory(to: Int16.self, capacity: numSamples)
for i in 0..<numSamples {
NSLog("Sample \(samples[i])")
}
}
}
}
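Since the suspicion is that the buffers are silent, a compact way to confirm it is to log one peak value per buffer instead of every sample. This is a hypothetical helper, assuming interleaved 16-bit integer samples as in the dump code above; a peak that stays at 0 would confirm true silence.

```swift
import AVFoundation

// Return the peak absolute amplitude of one AudioBuffer of Int16 samples.
private func peakAmplitude(of buf: AudioBuffer) -> Int {
    guard let data = buf.mData else { return 0 }
    let count = Int(buf.mDataByteSize) / MemoryLayout<Int16>.stride
    let samples = data.bindMemory(to: Int16.self, capacity: count)
    var peak = 0
    for i in 0..<count {
        peak = max(peak, abs(Int(samples[i])))
    }
    return peak // 0 for every buffer means the audio really is silent
}
```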
And here is the output:
Dump Audio Samples
AVFoundation has serious issues in iOS 16.1 beta 5. None of these issues are seen in iOS 16.0 or earlier.
The following code fails regularly when switching between AVCaptureMultiCamSession and AVCaptureSession. It turns out that the assetWriter.canApply(outputSettings:) condition is false for no apparent reason.
if assetWriter?.canApply(outputSettings: audioSettings!, forMediaType: AVMediaType.audio) ?? false {
}
I dumped the audioSettings dictionary and here it is:
It looks like the number of channels reported by AVAudioSession is 3, and that is the issue. But how did that happen? There is probably a bug where AVCaptureMultiCamSession teardown and deallocation leaves the audio session in a bad state.
When using AVAssetWriter with AVCaptureMultiCamSession, the recorded video often has no audible audio under the same audio settings dictionary dumped above. The video does contain an audio track, but it appears to be entirely silent. The same code works perfectly on all other iOS versions. I verified that audio sample buffers are indeed vended during recording, but they are very likely silent buffers.
Is anyone aware of these issues?
I have the following audio compression settings which fail with AVAssetWriter (mov container, HEVC codec, kAudioFormatMPEG4AAC format ID):
["AVSampleRateKey": 48000, "AVFormatIDKey": 1633772320, "AVNumberOfChannelsKey": 1, "AVEncoderBitRatePerChannelKey": 128000, "AVChannelLayoutKey": <02006500 00000000 00000000 00000000 00000000 00000000 00000000 00000000>]
Here is the code line that fails:
if _assetWriter?.canApply(outputSettings: audioSettings!, forMediaType: AVMediaType.audio) ?? false {
} else {
/* Failure */
}
I want to understand what is wrong. I cannot reproduce this on my end (it is only reproducible on a user's device with a particular microphone).
Is it mandatory to provide a value for AVChannelLayoutKey in the dictionary when using kAudioFormatMPEG4AAC? That could be the culprit.
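Two hypothetical variants I plan to test are shown below: an explicit mono layout tag, and omitting AVChannelLayoutKey entirely for single-channel AAC. Both are guesses at this point, not confirmed fixes.

```swift
import AVFoundation

// Variant (1): explicit mono channel layout.
var monoLayout = AudioChannelLayout()
monoLayout.mChannelLayoutTag = kAudioChannelLayoutTag_Mono
var settings: [String: Any] = [
    AVFormatIDKey: kAudioFormatMPEG4AAC,
    AVSampleRateKey: 48000,
    AVNumberOfChannelsKey: 1,
    AVEncoderBitRatePerChannelKey: 128000,
    AVChannelLayoutKey: withUnsafeBytes(of: &monoLayout) { Data($0) }
]

// Variant (2): drop the layout key for mono and let the writer infer it.
settings.removeValue(forKey: AVChannelLayoutKey)
```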
I want to know under what conditions -[AVCaptureSession synchronizationClock] can be nil. Some of my users on iOS 16.1 are hitting this error (synchronizationClock == nil), and it is not reproducible on my side.