
Workaround for iOS 16 Autorotation bugs
iOS 16 has serious UIKit bugs with no known workarounds, severe enough that I had to remove my app from sale. I am directly affected and desperate to know whether anyone has found a workaround, or whether UIKit engineers can suggest one. To summarise: calling setNeedsUpdateOfSupportedInterfaceOrientations shortly (within 0.5 seconds) after app launch does not trigger autorotation. It does trigger autorotation when the app is launched from the debugger, but not when the app is launched directly. Perhaps the debugger incurs a delay that is sufficient for autorotation to kick in? I have filed FB11516363, but I need a workaround urgently so I can get my app back on the App Store.
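If the debugger's extra startup latency really is what makes the call succeed, one obvious experiment is to defer the call yourself. This is a minimal sketch of that delay-based idea, not a confirmed fix; the 1-second delay and the view controller context are assumptions:

import UIKit

class PlayerViewController: UIViewController {

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // Hypothetical workaround: wait until well after launch before
        // asking UIKit to re-evaluate supported interface orientations.
        DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) { [weak self] in
            self?.setNeedsUpdateOfSupportedInterfaceOrientations()
        }
    }

    override var supportedInterfaceOrientations: UIInterfaceOrientationMask {
        return .landscapeRight
    }
}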
0 replies · 0 boosts · 1.8k views · Sep ’22
Xcode 14 SwiftUI dyld symbol not found
I get this error when I run my code on an iOS 14 device using Xcode 14:

dyld: Symbol not found: _$s7SwiftUI15GraphicsContextV4fill_4with5styleyAA4PathV_AC7ShadingVAA9FillStyleVtF
Referenced from: ******
Expected in: /System/Library/Frameworks/SwiftUI.framework/SwiftUI

I do not get any errors when I run the same code on an iOS 15 or iOS 16 device. How do I resolve this issue?
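The mangled symbol demangles to GraphicsContext.fill(_:with:style:), and GraphicsContext (with Canvas) only exists in SwiftUI on iOS 15 and later, which would explain why the dynamic linker fails on an iOS 14 device. A hedged sketch of the usual availability guard, assuming the call site can fall back gracefully; WaveformView and the fallback view are illustrative names:

import SwiftUI

struct WaveformView: View {   // hypothetical view for illustration
    var body: some View {
        if #available(iOS 15.0, *) {
            // Canvas/GraphicsContext are iOS 15+ symbols; referencing them
            // only inside an availability check lets them be weak-linked.
            Canvas { context, size in
                var path = Path()
                path.addRect(CGRect(origin: .zero, size: size))
                context.fill(path, with: .color(.blue))
            }
        } else {
            Rectangle().fill(Color.blue)   // iOS 14 fallback
        }
    }
}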
1 reply · 0 boosts · 1.4k views · Sep ’22
iOS 16 mismatch between windowScene & UIViewController interfaceOrientation
This is another strange issue on iOS 16, and one among the many woes that cause autorotation problems on the platform. At app startup there is a mismatch between -[UIViewController interfaceOrientation] (now deprecated), -[UIViewController supportedInterfaceOrientations] (which returns .landscapeRight), and the window scene's interfaceOrientation:

public var windowOrientation: UIInterfaceOrientation {
    return view.window?.windowScene?.interfaceOrientation ?? .unknown
}

override func viewDidLayoutSubviews() {
    super.viewDidLayoutSubviews()
    layoutInterfaceForOrientation(windowOrientation)
}

override var supportedInterfaceOrientations: UIInterfaceOrientationMask {
    return .landscapeRight
}

Putting a breakpoint in viewDidLayoutSubviews reveals that windowOrientation is .portrait while interfaceOrientation is .landscapeRight. Is there any way this can be fixed, or any workaround possible?
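One defensive pattern is to ignore the early layout passes in which the scene still reports an orientation the controller does not support, on the theory that those passes arrive before the scene has settled. A speculative sketch, not a confirmed fix; windowOrientation and layoutInterfaceForOrientation are the post's own members, and mask(for:) is a hypothetical helper:

private func mask(for orientation: UIInterfaceOrientation) -> UIInterfaceOrientationMask {
    switch orientation {
    case .portrait:           return .portrait
    case .portraitUpsideDown: return .portraitUpsideDown
    case .landscapeLeft:      return .landscapeLeft
    case .landscapeRight:     return .landscapeRight
    default:                  return []
    }
}

override func viewDidLayoutSubviews() {
    super.viewDidLayoutSubviews()
    // Skip layout while the scene reports .unknown or an unsupported
    // orientation (e.g. .portrait at startup on a landscape-only screen).
    guard windowOrientation != .unknown,
          supportedInterfaceOrientations.contains(mask(for: windowOrientation)) else { return }
    layoutInterfaceForOrientation(windowOrientation)
}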
0 replies · 0 boosts · 555 views · Sep ’22
AVAssetWriter audio compression settings for multichannel audio
With a 4-channel audio input, I get an error while recording a movie using AVAssetWriter with the LPCM codec. Are 4 audio channels not supported by AVAssetWriterInput? Here are my compression settings:

var aclSize: size_t = 0
var currentChannelLayout: UnsafePointer<AudioChannelLayout>? = nil

/*
 * outputAudioFormat = CMSampleBufferGetFormatDescription(sampleBuffer)
 * for the latest sample buffer received in the captureOutput sample buffer delegate
 */
if let outputFormat = outputAudioFormat {
    currentChannelLayout = CMAudioFormatDescriptionGetChannelLayout(outputFormat, sizeOut: &aclSize)
}

var currentChannelLayoutData = Data()
if let currentChannelLayout = currentChannelLayout, aclSize > 0 {
    currentChannelLayoutData = Data(bytes: currentChannelLayout, count: aclSize)
}

let numChannels = AVAudioSession.sharedInstance().inputNumberOfChannels

audioSettings[AVSampleRateKey] = 48000.0
audioSettings[AVFormatIDKey] = kAudioFormatLinearPCM
audioSettings[AVLinearPCMIsBigEndianKey] = false
audioSettings[AVLinearPCMBitDepthKey] = 16
audioSettings[AVNumberOfChannelsKey] = numChannels
audioSettings[AVLinearPCMIsFloatKey] = false
audioSettings[AVLinearPCMIsNonInterleaved] = false
audioSettings[AVChannelLayoutKey] = currentChannelLayoutData
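To isolate whether the writer or the settings dictionary is at fault, it can help to validate the dictionary with canApply(outputSettings:forMediaType:) before creating the input. A small sketch under that assumption; makeAudioInput and the logging are placeholders:

import AVFoundation

func makeAudioInput(writer: AVAssetWriter,
                    audioSettings: [String: Any]) -> AVAssetWriterInput? {
    // Ask the writer up front whether it accepts these LPCM settings;
    // a false here points at the settings rather than the append path.
    guard writer.canApply(outputSettings: audioSettings, forMediaType: .audio) else {
        NSLog("Writer rejected audio settings: \(audioSettings)")
        return nil
    }
    let input = AVAssetWriterInput(mediaType: .audio, outputSettings: audioSettings)
    input.expectsMediaDataInRealTime = true
    return input
}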
1 reply · 0 boosts · 1.1k views · Oct ’22
iOS 16.1 beta 5 AVAudioSession Error
We are getting this error on iOS 16.1 beta 5 that we never saw on any previous iOS version:

[as_client] AVAudioSession_iOS.mm:2374 Failed to set category, error: 'what'

I wonder if there is any known workaround. iOS 16 has been a nightmare: a lot of AVFoundation code breaks or behaves unpredictably, and this particular issue is new in iOS 16.1.
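For anyone trying to reproduce, the failure presumably surfaces as a thrown error from setCategory. A minimal sketch of the call with explicit error logging; the category, mode, and options here are assumptions, not necessarily the configuration that triggers the bug:

import AVFoundation

func configureAudioSession() {
    let session = AVAudioSession.sharedInstance()
    do {
        // Hypothetical configuration; the exact category/mode/options
        // that trigger the iOS 16.1 b5 failure are not known.
        try session.setCategory(.playAndRecord,
                                mode: .videoRecording,
                                options: [.allowBluetooth, .defaultToSpeaker])
        try session.setActive(true)
    } catch {
        NSLog("AVAudioSession configuration failed: \(error)")
    }
}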
2 replies · 0 boosts · 2.4k views · Nov ’22
Serious AVFoundation bugs in iOS 16.1 beta 5
AVFoundation has serious issues in iOS 16.1 beta 5. None of these issues are seen on iOS 16.0 or earlier. The following code fails regularly when switching between AVCaptureMultiCamSession & AVCaptureSession; it turns out the assetWriter.canApply(outputSettings:) condition is false for no apparent reason:

if assetWriter?.canApply(outputSettings: audioSettings!, forMediaType: AVMediaType.audio) ?? false {
}

I dumped the audioSettings dictionary, and the number of channels reported by AVAudioSession is 3, which appears to be the problem. But how did that happen? Probably there is a bug where AVCaptureMultiCamSession teardown and deallocation leaves the audio session in a bad state. Also, when using AVAssetWriter with AVCaptureMultiCamSession, many recordings end up with no audible audio under the same audio settings dictionary: there is an audio track in the video, but it seems to be silent. The same code works perfectly on all other iOS versions. I checked that audio sample buffers are indeed vended during recording, but it is very likely they are silent buffers. Is anyone aware of these issues?
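One way to catch the bad state early is to sanity-check the audio session's channel count before building the settings dictionary, since three input channels after a multicam teardown is the suspicious value here. A speculative diagnostic sketch, not a fix; the clamp-to-stereo policy is my assumption:

import AVFoundation

func validatedChannelCount() -> Int {
    let channels = AVAudioSession.sharedInstance().inputNumberOfChannels
    // After tearing down an AVCaptureMultiCamSession, the session has been
    // observed (in this report) to claim 3 input channels. Clamp to stereo
    // as a defensive measure until the underlying bug is understood.
    if channels > 2 {
        NSLog("Unexpected inputNumberOfChannels = \(channels); clamping to 2")
        return 2
    }
    return channels
}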
1 reply · 0 boosts · 875 views · Oct ’22
iOS 16.1(b5) - AVMultiCamSession emitting silent audio frames
This seems like a new bug in iOS 16.1 (b5) where AVCaptureMultiCamSession outputs silent audio frames once both the back & front mics have been added to it. The issue is not seen on iOS 16.0.3 or earlier. I can't reproduce it with the AVMultiCamPIP sample code, so I believe some AVAudioSession or AVCaptureMultiCamSession configuration in my code is the cause. Setting captureSession.usesApplicationAudioSession = true fixes the issue, but then I do not get the audio samples from both microphones. Here is the code:

public func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    if let videoDataOutput = output as? AVCaptureVideoDataOutput {
        processVideoSampleBuffer(sampleBuffer, fromOutput: videoDataOutput)
    } else if let audioDataOutput = output as? AVCaptureAudioDataOutput {
        processAudioSampleBuffer(sampleBuffer, fromOutput: audioDataOutput)
    }
}

private var lastDumpTime: TimeInterval?

private func processAudioSampleBuffer(_ sampleBuffer: CMSampleBuffer, fromOutput audioDataOutput: AVCaptureAudioDataOutput) {
    if lastDumpTime == nil {
        lastDumpTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer).seconds
    }
    let time = CMSampleBufferGetPresentationTimeStamp(sampleBuffer).seconds
    if time - lastDumpTime! >= 1.0 {
        dumpAudioSampleBuffer(sampleBuffer)
        lastDumpTime = time
    }
}

private func dumpAudioSampleBuffer(_ sampleBuffer: CMSampleBuffer) {
    NSLog("Dumping audio sample buffer")
    var audioBufferList = AudioBufferList(mNumberBuffers: 1,
        mBuffers: AudioBuffer(mNumberChannels: 0, mDataByteSize: 0, mData: nil))
    var buffer: CMBlockBuffer? = nil
    CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(sampleBuffer,
        bufferListSizeNeededOut: nil,
        bufferListOut: &audioBufferList,
        bufferListSize: MemoryLayout.size(ofValue: audioBufferList),
        blockBufferAllocator: nil,
        blockBufferMemoryAllocator: nil,
        flags: UInt32(kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment),
        blockBufferOut: &buffer)

    // Create an UnsafeBufferPointer over the variable-length array starting at audioBufferList.mBuffers
    withUnsafePointer(to: &audioBufferList.mBuffers) { ptr in
        let buffers = UnsafeBufferPointer<AudioBuffer>(start: ptr, count: Int(audioBufferList.mNumberBuffers))
        for buf in buffers {
            // Interpret the buffer's data as 16-bit samples
            let numSamples = Int(buf.mDataByteSize) / MemoryLayout<Int16>.stride
            let samples = buf.mData!.bindMemory(to: Int16.self, capacity: numSamples)
            for i in 0..<numSamples {
                NSLog("Sample \(samples[i])")
            }
        }
    }
}

And here is the output: Dump Audio Samples
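Rather than logging every sample, a quicker way to confirm the silent-buffers theory is to compute the peak amplitude per buffer and log only that. A hedged sketch built on the same extraction code as dumpAudioSampleBuffer above, assuming 16-bit interleaved LPCM in a single AudioBuffer:

import AVFoundation

// Returns the peak absolute Int16 sample in the buffer, or nil on failure.
// A stream of peaks at or near 0 would confirm the buffers are truly silent.
private func peakAmplitude(of sampleBuffer: CMSampleBuffer) -> Int16? {
    var audioBufferList = AudioBufferList(mNumberBuffers: 1,
        mBuffers: AudioBuffer(mNumberChannels: 0, mDataByteSize: 0, mData: nil))
    var blockBuffer: CMBlockBuffer? = nil
    let status = CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(sampleBuffer,
        bufferListSizeNeededOut: nil,
        bufferListOut: &audioBufferList,
        bufferListSize: MemoryLayout.size(ofValue: audioBufferList),
        blockBufferAllocator: nil,
        blockBufferMemoryAllocator: nil,
        flags: UInt32(kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment),
        blockBufferOut: &blockBuffer)
    guard status == noErr, let data = audioBufferList.mBuffers.mData else { return nil }

    let numSamples = Int(audioBufferList.mBuffers.mDataByteSize) / MemoryLayout<Int16>.stride
    let samples = data.bindMemory(to: Int16.self, capacity: numSamples)
    var peak: Int16 = 0
    for i in 0..<numSamples {
        // Clamp to Int16.max so abs(Int16.min) cannot overflow the cast.
        let magnitude = min(Int(Int16.max), abs(Int(samples[i])))
        peak = max(peak, Int16(magnitude))
    }
    return peak
}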
2 replies · 0 boosts · 841 views · Oct ’22
AVAssetWriter audio settings failure with compression settings
I have the following audio compression settings which fail with AVAssetWriter (mov container, HEVC video codec, kAudioFormatMPEG4AAC audio format ID):

["AVSampleRateKey": 48000,
 "AVFormatIDKey": 1633772320,
 "AVNumberOfChannelsKey": 1,
 "AVEncoderBitRatePerChannelKey": 128000,
 "AVChannelLayoutKey": <02006500 00000000 00000000 00000000 00000000 00000000 00000000 00000000>]

Here is the line that fails:

if _assetWriter?.canApply(outputSettings: audioSettings!, forMediaType: AVMediaType.audio) ?? false {
} else {
    /* Failure */
}

I want to understand what is wrong, but I cannot reproduce it at my end (it is only reproducible on a user's device with a particular microphone). Is it mandatory to provide a value for AVChannelLayoutKey in the dictionary with kAudioFormatMPEG4AAC? That could be a possible culprit.
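If AVChannelLayoutKey really is the culprit, one experiment is to omit it entirely for mono AAC and let the encoder pick a default layout. A hedged sketch of that test configuration; it encodes the hypothesis above, not a documented requirement:

import AVFoundation

// Mono AAC settings without AVChannelLayoutKey, to test whether the
// user's channel layout data is what canApply(outputSettings:) rejects.
let aacSettingsWithoutLayout: [String: Any] = [
    AVFormatIDKey: kAudioFormatMPEG4AAC,
    AVSampleRateKey: 48_000,
    AVNumberOfChannelsKey: 1,
    AVEncoderBitRatePerChannelKey: 128_000
]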
1 reply · 0 boosts · 1.1k views · Nov ’22
iPad Pro M2 ProRes unavailable
I have the following code to determine ProRes and HDR support on iOS devices:

extension AVCaptureDevice.Format {
    var supports10bitHDR: Bool {
        let mediaType = CMFormatDescriptionGetMediaType(formatDescription)
        let mediaSubtype = CMFormatDescriptionGetMediaSubType(formatDescription)
        return mediaType == kCMMediaType_Video && mediaSubtype == kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange
    }

    var supportsProRes422: Bool {
        let mediaType = CMFormatDescriptionGetMediaType(formatDescription)
        let mediaSubtype = CMFormatDescriptionGetMediaSubType(formatDescription)
        return mediaType == kCMMediaType_Video && mediaSubtype == kCVPixelFormatType_422YpCbCr10BiPlanarVideoRange
    }
}

On iPad Pro M2, supportsProRes422 returns false for every format of the wide-angle camera. Is this a bug, or intentional?
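For diagnosing this, it may help to dump every format's four-character media subtype and see whether an 'x422' format is present at all on the M2 iPad. A small hedged sketch; the device lookup and FourCC printing are assumptions:

import AVFoundation

func dumpFormatSubtypes() {
    // Assumed device lookup; adjust position/device type as needed.
    guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                               for: .video,
                                               position: .back) else { return }
    for format in device.formats {
        let subtype = CMFormatDescriptionGetMediaSubType(format.formatDescription)
        // Render the FourCC as text, e.g. '420v', 'x420', 'x422'.
        let chars = (0..<4).map { i -> Character in
            Character(UnicodeScalar(UInt8((subtype >> ((3 - i) * 8)) & 0xFF)))
        }
        NSLog("Format subtype: \(String(chars))")
    }
}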
1 reply · 0 boosts · 1.1k views · Jan ’23
AVAssetWriter sample rate AV drift
I have an AVAssetWriter, and I set the audio compression settings dictionary after validating it with the canApply(outputSettings:forMediaType:) API. One of the fields in the compression settings is the audio sample rate, set via AVSampleRateKey. My question: if the sample rate I set in this key differs from the sample rate of the audio sample buffers that are appended, can this cause the audio to drift away from the video? Is setting an arbitrary sample rate in asset writer settings not recommended?
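One conservative approach is to read the sample rate straight from the incoming buffers' format description and echo it into the settings, so the two can never disagree. A hedged sketch under that assumption:

import AVFoundation

// Derive AVSampleRateKey from the buffer's own format description
// rather than hard-coding a value that might not match the stream.
func sampleRate(of sampleBuffer: CMSampleBuffer) -> Double? {
    guard let format = CMSampleBufferGetFormatDescription(sampleBuffer),
          let asbd = CMAudioFormatDescriptionGetStreamBasicDescription(format) else {
        return nil
    }
    return asbd.pointee.mSampleRate
}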
1 reply · 0 boosts · 933 views · Dec ’22
CVPixelBuffer HDR10 to BT.709 conversion using AVAssetWriter
I am getting CMSampleBuffers in kCVPixelFormatType_422YpCbCr10BiPlanarVideoRange format from the camera. The pixel buffers are 10-bit HDR. I need to record using the ProRes422 codec but in a non-HDR format. I am not sure what a reliable way of doing this is, so I am reaching out here. What I did was simply set the AVAssetWriter compression dictionary as follows:

compressionSettings[AVVideoTransferFunctionKey] = AVVideoTransferFunction_ITU_R_709_2
compressionSettings[AVVideoColorPrimariesKey] = AVVideoColorPrimaries_ITU_R_709_2
compressionSettings[AVVideoYCbCrMatrixKey] = AVVideoYCbCrMatrix_ITU_R_709_2

It works, and the resulting recording reports its color space as HD 1-1-1 with the Apple ProRes codec. But I am not sure whether AVAssetWriter actually performed a colorspace conversion from HDR10 to BT.709, or simply clipped the out-of-range colors. I need a definitive way to achieve this. Apple's native Camera app appears to do it, but I am not sure how.
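For reference, the documented place for these three keys in an output settings dictionary is a nested dictionary under AVVideoColorPropertiesKey. A hedged sketch of full video settings along those lines; the dimensions are placeholders, and whether this triggers a true tone-mapping conversion rather than a tag-only change is exactly the open question:

import AVFoundation

// BT.709 ProRes 422 output settings; AVVideoColorPropertiesKey is the
// documented container for the three color keys used above.
let videoSettings: [String: Any] = [
    AVVideoCodecKey: AVVideoCodecType.proRes422,
    AVVideoWidthKey: 1920,    // placeholder dimensions
    AVVideoHeightKey: 1080,
    AVVideoColorPropertiesKey: [
        AVVideoColorPrimariesKey: AVVideoColorPrimaries_ITU_R_709_2,
        AVVideoTransferFunctionKey: AVVideoTransferFunction_ITU_R_709_2,
        AVVideoYCbCrMatrixKey: AVVideoYCbCrMatrix_ITU_R_709_2
    ]
]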
0 replies · 0 boosts · 712 views · Mar ’23