iOS 12 seems to have a lot of bugs in its AVFoundation code. The latest one is with AVPlayer: "Cannot remove an observer <NSKeyValueObservance 0x280625ce0> for the key path "currentItem.status" from <AVPlayerInternal 0x107626840>, most likely because the value for the key "currentItem" has changed without an appropriate KVO notification being sent. Check the KVO-compliance of the AVPlayerInternal class." Has anyone come across this?
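One workaround I'm experimenting with (a sketch only, assuming block-based KVO is an option; names here are illustrative): let NSKeyValueObservation manage the observer lifetime instead of pairing addObserver/removeObserver manually, which avoids the manual removal that crashes when currentItem changes behind your back.

import AVFoundation

final class PlayerStatusWatcher {
    private let player = AVPlayer()
    private var statusObservation: NSKeyValueObservation?

    func startObserving() {
        // The observation token removes the observer automatically when it
        // is invalidated or deallocated; no manual removeObserver call.
        statusObservation = player.observe(\.currentItem?.status, options: [.new]) { _, change in
            guard let status = change.newValue ?? nil else { return }
            if status == .readyToPlay {
                // React to the item becoming ready to play.
            }
        }
    }

    func stopObserving() {
        statusObservation?.invalidate()
        statusObservation = nil
    }
}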
Is there a way to get stereo audio on iPhone XS using the RemoteIO unit, or any of the Core Audio APIs? I need to record audio in stereo.
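For reference, here is a sketch of how one might probe for stereo capture support via AVAudioSession (assuming iOS 14 or later, where the .stereo polar pattern was introduced; on earlier releases the built-in mics appear to report mono-only patterns):

import AVFoundation

// Check whether any built-in mic data source advertises a stereo
// polar pattern (.stereo requires the iOS 14 SDK or later).
func builtInMicSupportsStereo() -> Bool {
    let session = AVAudioSession.sharedInstance()
    guard let builtInMic = session.availableInputs?.first(where: { $0.portType == .builtInMic }) else {
        return false
    }
    return builtInMic.dataSources?.contains { source in
        source.supportedPolarPatterns?.contains(.stereo) ?? false
    } ?? false
}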
iOS 13/13.1 autorotation seems to behave differently than iOS 12. For instance, my app allows the user to lock the interface orientation to portrait or landscape mode in settings. If portrait orientation lock is enabled on the device and I return only .landscape from supportedInterfaceOrientations, the interface remains stuck in portrait mode until I disable the device's portrait orientation lock. This was not the case with iOS 12. In fact, supportedInterfaceOrientations is not even called! UIViewController.attemptRotationToDeviceOrientation() also does not work in such cases.

The root of the problem: I temporarily return false from shouldAutorotate while the app is initializing, and once everything is initialized I call UIViewController.attemptRotationToDeviceOrientation() to trigger autorotation. This triggers autorotation on iOS 12, but on iOS 13.1 it doesn't work. I believe this is a bug in iOS 13.1. Is there a known workaround?
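For context, a minimal sketch of the pattern I'm describing (names are illustrative):

import UIKit

class PlayerViewController: UIViewController {
    // Pinned to false while the app initializes, flipped later.
    var isInitialized = false

    override var shouldAutorotate: Bool {
        return isInitialized
    }

    override var supportedInterfaceOrientations: UIInterfaceOrientationMask {
        return .landscape
    }

    func finishInitialization() {
        isInitialized = true
        // Triggers rotation on iOS 12; appears to be ignored on iOS 13.1
        // when the device's portrait orientation lock is engaged.
        UIViewController.attemptRotationToDeviceOrientation()
    }
}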
I have an iPhone 11 Pro running iOS 14 beta 2. In one of my projects (and apparently only that project), the view controller doesn't autorotate when connected to the Xcode (11 or 12) debugger. When not connected to the debugger, autorotation works fine. None of the autorotation methods are called, and it's not clear what could be blocking rotation. If I connect any iOS 13 device instead, autorotation works even with the debugger attached. Is this a known issue with iOS 14/Xcode?
Dear Apple Engineers,
This has happened twice with an iPhone 11 Pro running iOS 14 beta 4. I have now filed bug FB8334182 and attached the videos and photos I recorded, because once the device enters this state, nothing works: sysdiagnose fails to capture, the iPhone can no longer be synced to a Mac, and all apps either crash or freeze. Deleting files from the system has no effect, and honestly, it is not clear why the iPhone's storage usage keeps increasing. The app I was debugging showed its disk usage growing by gigabytes when it was not saving anything anywhere. Now the iPhone is stuck in a boot loop at the Apple logo, and the only option is a hard reset with a fresh install of iOS.
There is no option in Feedback Assistant to report that the issue is with iOS itself; I had to report it against UIKit instead!
When using AVCaptureVideoDataOutput/AVCaptureAudioDataOutput and AVAssetWriter to record video with cinematic (extended) video stabilization, the audio lags the video by up to 1-1.5 seconds, and as a result playback of the recording appears frozen for the last 1-1.5 seconds. This does not happen when using AVCaptureMovieFileOutput. Can this be fixed, or is there a workaround to synchronize the audio and video frames? How does AVCaptureMovieFileOutput handle it?
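One mitigation I've been considering (a sketch, not a confirmed fix): anchor the AVAssetWriter session to the first video buffer's timestamp and drop audio buffers that precede it, so both tracks share a start time despite the extra latency the stabilization pipeline adds to video delivery.

import AVFoundation

// Sketch: start the session on the first *video* timestamp and discard
// earlier audio, keeping the tracks aligned despite stabilization latency.
final class SyncedWriter {
    private let writer: AVAssetWriter
    private let videoInput: AVAssetWriterInput
    private let audioInput: AVAssetWriterInput
    private var sessionStartTime = CMTime.invalid

    init(writer: AVAssetWriter, videoInput: AVAssetWriterInput, audioInput: AVAssetWriterInput) {
        self.writer = writer
        self.videoInput = videoInput
        self.audioInput = audioInput
    }

    func appendVideo(_ sampleBuffer: CMSampleBuffer) {
        let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        if !sessionStartTime.isValid {
            sessionStartTime = pts
            writer.startSession(atSourceTime: pts)
        }
        if videoInput.isReadyForMoreMediaData {
            videoInput.append(sampleBuffer)
        }
    }

    func appendAudio(_ sampleBuffer: CMSampleBuffer) {
        let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        // Drop audio until the session has started with the first video frame.
        guard sessionStartTime.isValid, pts >= sessionStartTime,
              audioInput.isReadyForMoreMediaData else { return }
        audioInput.append(sampleBuffer)
    }
}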
I am converting HEVC video to H264 video using AVFoundation APIs. However, I have two questions:
How do I detect the encoder used in the input file using AVFoundation?
How do I calculate a bitrate for the output H264 file that matches the quality of the input HEVC file? I need to pass this bitrate to the AVAssetWriter compression settings dictionary. (A sketch covering both questions follows below.)
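For context, a sketch of what I've pieced together so far (the 1.75x scale factor is my own rough assumption, since H264 typically needs a higher bitrate than HEVC for similar quality):

import AVFoundation
import CoreMedia

// Read the source codec from the track's format description and derive a
// starting bitrate for AVAssetWriter from the track's estimated data rate.
func inspectSource(asset: AVAsset) -> (codec: CMVideoCodecType, suggestedH264Bitrate: Float)? {
    guard let track = asset.tracks(withMediaType: .video).first,
          let format = track.formatDescriptions.first else { return nil }
    let desc = format as! CMFormatDescription
    let codec = CMFormatDescriptionGetMediaSubType(desc) // kCMVideoCodecType_HEVC for HEVC input
    let suggestedBitrate = track.estimatedDataRate * 1.75 // bits per second
    return (codec, suggestedBitrate)
}

The suggested bitrate would then go into the AVVideoCompressionPropertiesKey dictionary under AVVideoAverageBitRateKey in the writer's output settings.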
MTLCommandBuffer's present(_ drawable: MTLDrawable, afterMinimumDuration: CFTimeInterval) is not found in Xcode 12 GM. The following call on MTLCommandBuffer throws a compilation error:
commandBuffer.present(drawable, afterMinimumDuration: 1.0 / Double(self.preferredFramesPerSecond))
Incorrect argument label in call (have ':afterMinimumDuration:', expected ':atTime:')
What is the fix?
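The only workaround I've found so far (based on my assumption that the error appears when building for the simulator, which supports Metal in Xcode 12 but not this particular API) is to guard the call:

import Metal

// Fall back to plain present(_:) where the paced variant is unavailable.
func present(_ drawable: MTLDrawable, via commandBuffer: MTLCommandBuffer, preferredFramesPerSecond: Int) {
    #if targetEnvironment(simulator)
    commandBuffer.present(drawable)
    #else
    commandBuffer.present(drawable, afterMinimumDuration: 1.0 / Double(preferredFramesPerSecond))
    #endif
}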
Ours is a video camera app. It displays the following prompt for the microphone usage permission (Privacy - Microphone Usage Description): "The app needs Microphone Access to record Audio". But now Apple seems to be rejecting it:
Guideline 5.1.1 - Legal - Privacy - Data Collection and Storage
"We noticed that your app requests the user’s consent to access their microphone but does not clarify the use of the microphone in the applicable purpose string."
I'm wondering what is not obvious here. A video recorder app needs microphone access to record audio, nothing more, nothing less! How do I fix this? The app has shipped with this string for 8 years now and no such concern was raised before.
Dear AVFoundation Engineers & other AVFoundation developers,
In the context of a multilayer video editing timeline with 4 or more layers, I want to know whether it is a problem to have just one AVVideoCompositionInstruction for the entire time range of the timeline. Its requiredSourceTrackIDs would contain all the tracks added to the AVMutableComposition, containsTweening would be true, and so on. Then, at any frame time, the custom compositor could consult its own internal data structures and blend the video frames of the different tracks as required. Is there anything wrong with this approach from a performance perspective, especially on newer iOS devices (iPhone 7 or later)?
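To make the question concrete, here is a sketch of the single instruction I have in mind (illustrative only):

import AVFoundation

// One instruction spanning the whole timeline; the custom compositor is
// handed every source track's frames at each render time.
final class TimelineInstruction: NSObject, AVVideoCompositionInstructionProtocol {
    let timeRange: CMTimeRange
    let enablePostProcessing = false
    let containsTweening = true // frames vary over time, so no frame caching
    let requiredSourceTrackIDs: [NSValue]?
    let passthroughTrackID = kCMPersistentTrackID_Invalid

    init(timeRange: CMTimeRange, trackIDs: [CMPersistentTrackID]) {
        self.timeRange = timeRange
        self.requiredSourceTrackIDs = trackIDs.map { NSNumber(value: $0) }
        super.init()
    }
}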
I see a weird bug. I have the following code:
func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
    application.isIdleTimerDisabled = true
    ...
However, this has no effect on some iOS devices. This was not the case before, when the code was built with Xcode 11.x. Is this a known bug?
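The only workaround I've found (an assumption on my part that something resets the flag during launch, not a documented fix) is to re-apply it once the app becomes active:

import UIKit

// In the app delegate: re-assert the idle timer setting after launch.
func applicationDidBecomeActive(_ application: UIApplication) {
    application.isIdleTimerDisabled = true
}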
On iOS 14, when we have an AVCaptureSession set up with AVCaptureVideoDataOutput and AVCaptureAudioDataOutput, and a RemoteIO unit configured as well, the AVCaptureSession sometimes takes up to 3-4 seconds to start, especially when also changing the AVAudioSession category from playback to playAndRecord. Is this because iOS 14 is overall slow and buggy in its initial release, or is it something specific to AVFoundation?
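The mitigations I'm trying (a sketch; these are assumptions, not confirmed fixes) are to keep session work off the main thread and to stop AVCaptureSession from reconfiguring the audio session on every start:

import AVFoundation

let sessionQueue = DispatchQueue(label: "capture.session.queue")

func startCapture(_ session: AVCaptureSession) {
    sessionQueue.async {
        // Manage the audio session manually so the category switch happens
        // once, outside AVCaptureSession's startup path.
        session.automaticallyConfiguresApplicationAudioSession = false
        try? AVAudioSession.sharedInstance().setCategory(.playAndRecord, mode: .default, options: [.defaultToSpeaker])
        try? AVAudioSession.sharedInstance().setActive(true)
        session.startRunning() // blocking; never call on the main thread
    }
}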
PHPhotoLibrary gives an AVComposition rather than an AVURLAsset for a video recorded in SlowMo. I want to insert SlowMo videos into another AVMutableComposition, which is my editing timeline, so I need a way to insert this AVComposition into it. The hack I used before was to load the tracks and segments and find the media URL of the asset:
AVCompositionTrack *track = [avAsset tracks][0];
AVCompositionTrackSegment *segment = track.segments[0];
mediaURL = [segment sourceURL];
Once I had the mediaURL, I was able to create a new AVAsset that could be inserted into the AVMutableComposition. But I wonder if there is a cleaner approach that allows the SlowMo video composition to be inserted directly into the timeline's AVMutableComposition?
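One alternative I'm aware of (a sketch; like the sourceURL hack, it loses the slow-motion ramp, which would have to be re-applied with scaleTimeRange(_:toDuration:)) is to ask PHImageManager for the .original version, which hands back the underlying AVURLAsset:

import Photos
import AVFoundation

func requestOriginalVideo(for asset: PHAsset, completion: @escaping (AVURLAsset?) -> Void) {
    let options = PHVideoRequestOptions()
    options.version = .original // skip the SlowMo ramp composition
    options.isNetworkAccessAllowed = true
    PHImageManager.default().requestAVAsset(forVideo: asset, options: options) { avAsset, _, _ in
        completion(avAsset as? AVURLAsset)
    }
}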
I am using the following code to prompt for Photo Library access with an add-only prompt:
PHPhotoLibrary.requestAuthorization(for: .addOnly) { status in
    handler(status)
}
However, this still shows the wrong prompt: "App Would Like to Access Your Photos". Why is that, and what can I do to show "App Would Like to Add to Your Photos", as shown in the WWDC video?
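My current guess (an assumption, not confirmed): the add-only prompt requires building against the iOS 14 SDK and an NSPhotoLibraryAddUsageDescription entry in Info.plist, and without that key the system may fall back to the full-access prompt. A guarded sketch:

import Photos

func requestAddOnlyAccess(handler: @escaping (PHAuthorizationStatus) -> Void) {
    if #available(iOS 14, *) {
        // Requires NSPhotoLibraryAddUsageDescription in Info.plist
        // (assumption: its absence triggers the full-access prompt).
        PHPhotoLibrary.requestAuthorization(for: .addOnly) { status in
            DispatchQueue.main.async { handler(status) }
        }
    } else {
        PHPhotoLibrary.requestAuthorization { status in
            DispatchQueue.main.async { handler(status) }
        }
    }
}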
PHPickerViewController allows access to copies of photo library assets, and can also return PHAssets in the results. To get PHAssets instead of file copies, I do:
let photolibrary = PHPhotoLibrary.shared()
var configuration = PHPickerConfiguration(photoLibrary: photolibrary)
configuration.filter = .videos
configuration.selectionLimit = 0
let picker = PHPickerViewController(configuration: configuration)
picker.delegate = self
self.present(picker, animated: true, completion: nil)
And then,
func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
    picker.dismiss(animated: true) {
        let identifiers: [String] = results.compactMap(\.assetIdentifier)
        let fetchResult = PHAsset.fetchAssets(withLocalIdentifiers: identifiers, options: nil)
        NSLog("\(identifiers), \(fetchResult)")
    }
}
But the problem is that once the photo picker is dismissed, the app prompts for Photo Library access, which is confusing: the user already implicitly granted access to the selected assets in PHPickerViewController, so PHPhotoLibrary should load those assets directly. Is there any way to avoid the Photo Library permission prompt? The other option, copying the assets into the app, is a waste of space for editing applications.
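For completeness, this is the copy-based route I'd like to avoid (no Photo Library prompt, but it duplicates the file into the app's container):

import PhotosUI
import UniformTypeIdentifiers

func loadVideoCopy(from result: PHPickerResult, completion: @escaping (URL?) -> Void) {
    let provider = result.itemProvider
    guard provider.hasItemConformingToTypeIdentifier(UTType.movie.identifier) else {
        completion(nil)
        return
    }
    _ = provider.loadFileRepresentation(forTypeIdentifier: UTType.movie.identifier) { tempURL, _ in
        guard let tempURL = tempURL else { completion(nil); return }
        // The temporary file is deleted when this handler returns, so copy it out.
        let destination = FileManager.default.temporaryDirectory.appendingPathComponent(tempURL.lastPathComponent)
        try? FileManager.default.removeItem(at: destination)
        do {
            try FileManager.default.copyItem(at: tempURL, to: destination)
            completion(destination)
        } catch {
            completion(nil)
        }
    }
}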