Post · Replies · Boosts · Views · Activity

AVCaptureDevice isCenterStageEnabled KVO not fired
@AVFoundationEngineers I am trying to observe the isCenterStageEnabled class property as follows:

    AVCaptureDevice.self.addObserver(self,
                                     forKeyPath: "isCenterStageEnabled",
                                     options: [.initial, .new],
                                     context: &CapturePipeline.centerStageContext)

I have set centerStageControlMode to .cooperative. KVO fires only when I change AVCaptureDevice.isCenterStageEnabled in my own code. It is NOT fired when the user toggles Center Stage from Control Center. Is this a bug?
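For context, a minimal sketch of the observation setup being described (CapturePipeline, its static context, and the observer override are assumptions about the surrounding code, not confirmed by the post):

    import AVFoundation

    class CapturePipeline: NSObject {
        // Hypothetical static context used to disambiguate KVO callbacks.
        static var centerStageContext = 0

        func startObservingCenterStage() {
            // isCenterStageEnabled is a *class* property, so the observer is
            // added to AVCaptureDevice.self rather than to a device instance.
            AVCaptureDevice.centerStageControlMode = .cooperative
            AVCaptureDevice.self.addObserver(self,
                                             forKeyPath: "isCenterStageEnabled",
                                             options: [.initial, .new],
                                             context: &CapturePipeline.centerStageContext)
        }

        override func observeValue(forKeyPath keyPath: String?,
                                   of object: Any?,
                                   change: [NSKeyValueChangeKey: Any]?,
                                   context: UnsafeMutableRawPointer?) {
            guard context == &CapturePipeline.centerStageContext else {
                super.observeValue(forKeyPath: keyPath, of: object, change: change, context: context)
                return
            }
            print("Center Stage enabled: \(AVCaptureDevice.isCenterStageEnabled)")
        }
    }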
0
0
530
Oct ’21
Metal Core Image passing sampler arguments
I am trying to use a CIColorKernel or CIBlendKernel with sampler arguments, but the program crashes. Here is my shader code, which compiles successfully:

    extern "C" float4 wipeLinear(coreimage::sampler t1, coreimage::sampler t2, float time)
    {
        float2 coord1 = t1.coord();
        float2 coord2 = t2.coord();

        float4 innerRect = t2.extent();

        float minX = innerRect.x + time * innerRect.z;
        float minY = innerRect.y + time * innerRect.w;
        float cropWidth = (1 - time) * innerRect.w;
        float cropHeight = (1 - time) * innerRect.z;

        float4 s1 = t1.sample(coord1);
        float4 s2 = t2.sample(coord2);

        if (coord1.x > minX && coord1.x < minX + cropWidth &&
            coord1.y > minY && coord1.y <= minY + cropHeight) {
            return s1;
        } else {
            return s2;
        }
    }

And it crashes on initialization:

    class CIWipeRenderer: CIFilter {
        var backgroundImage: CIImage?
        var foregroundImage: CIImage?
        var inputTime: Float = 0.0

        static var kernel: CIColorKernel = { () -> CIColorKernel in
            let url = Bundle.main.url(forResource: "AppCIKernels", withExtension: "ci.metallib")!
            let data = try! Data(contentsOf: url)
            return try! CIColorKernel(functionName: "wipeLinear", fromMetalLibraryData: data) // Crashes here!!!!
        }()

        override var outputImage: CIImage? {
            guard let backgroundImage = backgroundImage else { return nil }
            guard let foregroundImage = foregroundImage else { return nil }
            return CIWipeRenderer.kernel.apply(extent: backgroundImage.extent,
                                               arguments: [backgroundImage, foregroundImage, inputTime])
        }
    }

It crashes in the try line with the following error:

    Fatal error: 'try!' expression unexpectedly raised an error: Foundation._GenericObjCError.nilError

If I replace the kernel code with the following, it works like a charm:

    extern "C" float4 wipeLinear(coreimage::sample_t s1, coreimage::sample_t s2, float time)
    {
        return mix(s1, s2, time);
    }
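For comparison, a minimal sketch of loading the same function as a general CIKernel, the kernel class whose functions take coreimage::sampler arguments (CIColorKernel functions take coreimage::sample_t, which would explain why the sample_t variant loads fine; whether this fits the intended pipeline is an assumption):

    import CoreImage

    // Sampler-based kernel functions are general kernels, so they are loaded
    // with CIKernel rather than CIColorKernel.
    static var generalKernel: CIKernel = {
        let url = Bundle.main.url(forResource: "AppCIKernels", withExtension: "ci.metallib")!
        let data = try! Data(contentsOf: url)
        return try! CIKernel(functionName: "wipeLinear", fromMetalLibraryData: data)
    }()

    // Applying a general kernel requires a region-of-interest callback;
    // returning the destination rect unchanged is the simplest choice.
    // let output = generalKernel.apply(extent: backgroundImage.extent,
    //                                  roiCallback: { _, rect in rect },
    //                                  arguments: [backgroundImage, foregroundImage, inputTime])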
1
0
1.6k
Oct ’23
CIColorCube load lut data
I have a .cube file storing LUT data, such as this:

    TITLE "Cool LUT"
    LUT_3D_SIZE 64
    0.0000 0.0000 0.0000
    0.0000 0.0000 0.0000
    0.0157 0.0000 0.0000
    0.0353 0.0000 0.0000

My question is: how do I build the NSData that the CIColorCube filter requires? When using Metal, I convert this data into an MTLTexture using AdobeLUTParser. I am not sure what to do in the case of Core Image.
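A minimal sketch of one way to do the conversion (the parsing rules are assumptions based on the snippet above; real .cube files may also contain comments and DOMAIN_MIN/DOMAIN_MAX lines):

    import CoreImage

    // Parse .cube text into RGBA float data for CIColorCube, which expects
    // size * size * size * 4 floats.
    func colorCubeData(fromCubeFile text: String) -> (data: Data, size: Int)? {
        var size = 0
        var values: [Float] = []
        for line in text.split(whereSeparator: \.isNewline) {
            let trimmed = line.trimmingCharacters(in: .whitespaces)
            if trimmed.hasPrefix("TITLE") || trimmed.hasPrefix("#") { continue }
            if trimmed.hasPrefix("LUT_3D_SIZE") {
                size = Int(trimmed.split(separator: " ").last ?? "") ?? 0
                continue
            }
            let parts = trimmed.split(separator: " ").compactMap { Float($0) }
            if parts.count == 3 {
                values.append(contentsOf: parts)
                values.append(1.0) // CIColorCube wants RGBA; alpha defaults to 1
            }
        }
        guard size > 0, values.count == size * size * size * 4 else { return nil }
        return (values.withUnsafeBufferPointer { Data(buffer: $0) }, size)
    }

    // Usage sketch:
    // let filter = CIFilter(name: "CIColorCube")!
    // filter.setValue(result.size, forKey: "inputCubeDimension")
    // filter.setValue(result.data, forKey: "inputCubeData")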
0
0
916
Dec ’21
AVVideoComposition startRequest early return without calling finishWithComposedVideoFrame
@AVFoundationEngineers I browsed through the AVCustomEdit sample code and noticed the following:

    - (void)startVideoCompositionRequest:(AVAsynchronousVideoCompositionRequest *)request
    {
        @autoreleasepool {
            dispatch_async(_renderingQueue, ^() {
                // Check if all pending requests have been cancelled
                if (_shouldCancelAllRequests) {
                    [request finishCancelledRequest];
                } else {
                    NSError *err = nil;
                    // Get the next rendered pixel buffer
                    CVPixelBufferRef resultPixels = [self newRenderedPixelBufferForRequest:request error:&err];
                    if (resultPixels) {
                        // The resulting pixel buffer from the OpenGL renderer is passed along to the request
                        [request finishWithComposedVideoFrame:resultPixels];
                        CFRelease(resultPixels);
                    } else {
                        [request finishWithError:err];
                    }
                }
            });
        }
    }

startVideoCompositionRequest: returns early, without calling finishWithComposedVideoFrame:, because the processing takes place asynchronously on a dispatch queue. However, the documentation of startVideoCompositionRequest: states the following:

    Note that if the custom compositor's implementation of -startVideoCompositionRequest: returns without
    finishing the composition immediately, it may be invoked again with another composition request before
    the prior request is finished; therefore in such cases the custom compositor should be prepared to
    manage multiple composition requests.

But I don't see anything in the code that is prepared to handle multiple composition requests for the same video frame. How is one supposed to handle this case?
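For illustration, a minimal Swift sketch (the class name and the rendering step are hypothetical, not taken from AVCustomEdit) of one way to stay safe with multiple outstanding requests: funnel every request through one serial queue so they are finished in the order they arrived, even when several are in flight:

    import AVFoundation

    final class QueueingCompositor: NSObject, AVVideoCompositing {
        private let renderingQueue = DispatchQueue(label: "compositor.rendering")
        private var cancelAll = false

        let sourcePixelBufferAttributes: [String: Any]? =
            [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
        let requiredPixelBufferAttributesForRenderContext: [String: Any] =
            [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]

        func renderContextChanged(_ newRenderContext: AVVideoCompositionRenderContext) {}

        func startRequest(_ request: AVAsynchronousVideoCompositionRequest) {
            // Each request is enqueued; the serial queue guarantees that
            // overlapping requests are rendered and finished one at a time.
            renderingQueue.async { [weak self] in
                guard let self = self else { return }
                if self.cancelAll {
                    request.finishCancelledRequest()
                } else if let buffer = request.renderContext.newPixelBuffer() {
                    // A real implementation would draw into `buffer` here.
                    request.finish(withComposedVideoFrame: buffer)
                } else {
                    request.finish(with: NSError(domain: "Compositor", code: -1))
                }
            }
        }

        func cancelAllPendingVideoCompositionRequests() {
            renderingQueue.sync { cancelAll = true }
            renderingQueue.async { [weak self] in self?.cancelAll = false }
        }
    }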
0
0
581
Jan ’22
AVPlayer seek completion handler not called
@AVFoundation Engineers, I hit this bug repeatedly when using AVComposition & AVVideoComposition: sometimes the AVPlayer seek-to-time completion handler is not called. I check a flag for whether a seek is in progress before placing another seek request, but if the completion handler is never invoked, all further seeks stall because the flag stays true. What is a reliable way to know a seek is not in progress before initiating another seek request?

    playerSeeking = true
    player.seek(to: time, toleranceBefore: .zero, toleranceAfter: .zero) { [weak self] completed in
        if !completed {
            NSLog("Seek not completed \(time.seconds)")
        }
        guard let self = self else {
            return
        }
        self.playerSeeking = false
        if self.player.rate == 0.0 {
            self.updateButtonStates()
        }
    }
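For reference, a minimal sketch of the seek-coalescing pattern from Apple's technical note on smooth seeking (assuming a player property on the owning class): remember only the newest requested time and issue the next seek from inside the completion handler, so a stale callback cannot strand a flag.

    import AVFoundation

    private var isSeekInProgress = false
    private var chaseTime = CMTime.zero

    func seekSmoothly(to newChaseTime: CMTime) {
        guard newChaseTime != chaseTime else { return }
        chaseTime = newChaseTime
        if !isSeekInProgress { actuallySeekToTime() }
    }

    private func actuallySeekToTime() {
        isSeekInProgress = true
        let seekTimeInProgress = chaseTime
        player.seek(to: seekTimeInProgress,
                    toleranceBefore: .zero,
                    toleranceAfter: .zero) { [weak self] _ in
            guard let self = self else { return }
            if self.chaseTime == seekTimeInProgress {
                // No newer request arrived while seeking; we are done.
                self.isSeekInProgress = false
            } else {
                // A newer time was requested meanwhile; chase it.
                self.actuallySeekToTime()
            }
        }
    }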
2
0
1.8k
Feb ’22
Combining two apps in Xcode
I have two Xcode workspaces, and both workspaces have multiple targets. Each target in both workspaces has a storyboard file called Main.storyboard. I need to combine one target from the first workspace with a target from the second. What is the right approach to merging the targets?
0
0
402
Feb ’22
Urgent: Xcode git remove files of another project
I dragged a group of files from one Xcode project to another and noticed that Xcode copied the whole project folder instead of just the selected group of files. I immediately selected the copied project folder and deleted it, choosing "Remove References Only", as I had a bad experience earlier when I chose to delete the files (in which case the files got deleted from the original project). But now when I commit the changes to git, it shows me a list of 1300 files from the original project to commit, even though they are no longer in the project. I searched all the project subfolders and those files are nowhere in the project. What do I do to make git forget those files?
0
0
602
Feb ’22
How to debug crash - Thread 1: EXC_BAD_INSTRUCTION (code=1, subcode=0x40343276)
I have a strange crash on an iOS device (EXC_BAD_INSTRUCTION). I have a custom UIControl called ScrollingScrubber; all it has is a UIScrollView, and it fires .valueChanged events while the user is scrolling. That's where the crash sometimes happens, and I have no idea how to debug it further.

    var value: CGFloat = 0

    override open func beginTracking(_ touch: UITouch, with event: UIEvent?) -> Bool {
        isUserDragging = scroller.isDragging
        if isUserDragging {
            sendActions(for: .editingDidBegin)
        }
        return isUserDragging
    }

    override open func continueTracking(_ touch: UITouch, with event: UIEvent?) -> Bool {
        value = (scroller.contentOffset.y + scroller.contentInset.top) / scroller.contentSize.height
        sendActions(for: .valueChanged) // It sometimes crashes here with EXC_BAD_INSTRUCTION, why?????
        return true
    }
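One defensive sketch worth trying (an assumption, not a confirmed diagnosis): if contentSize.height is ever zero during tracking, value becomes NaN or infinite, and any .valueChanged handler that later converts it to an integer will trap with EXC_BAD_INSTRUCTION.

    import UIKit

    override open func continueTracking(_ touch: UITouch, with event: UIEvent?) -> Bool {
        let contentHeight = scroller.contentSize.height
        // Guard against a zero content size, which would produce NaN/inf.
        guard contentHeight > 0 else { return true }
        value = (scroller.contentOffset.y + scroller.contentInset.top) / contentHeight
        // Only notify when the value is a finite number.
        guard value.isFinite else { return true }
        sendActions(for: .valueChanged)
        return true
    }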
0
0
390
Apr ’22
RemoteIO to AVAudioEngine port
I have a RemoteIO unit that successfully plays back the microphone samples in real time via attached headphones. I need to port the same functionality to AVAudioEngine, but I can't seem to make a head start. Here is my code; all I do is connect the inputNode to a playerNode, which crashes.

    var engine: AVAudioEngine!
    var playerNode: AVAudioPlayerNode!
    var mixer: AVAudioMixerNode!
    var engineRunning = false

    private func setupAudioSession() {
        let options: AVAudioSession.CategoryOptions = [.allowBluetooth, .allowBluetoothA2DP]
        do {
            try AVAudioSession.sharedInstance().setCategory(.playAndRecord, mode: .default, options: options)
            try AVAudioSession.sharedInstance().setAllowHapticsAndSystemSoundsDuringRecording(true)
        } catch {
            MPLog("Could not set audio session category")
        }

        let audioSession = AVAudioSession.sharedInstance()
        do {
            try audioSession.setActive(false)
            try audioSession.setPreferredSampleRate(Double(44100))
        } catch {
            print("Unable to deactivate audio session")
        }
        do {
            try audioSession.setActive(true)
        } catch {
            print("Unable to activate audio session")
        }
    }

    private func setupAudioEngine() {
        self.engine = AVAudioEngine()
        self.playerNode = AVAudioPlayerNode()
        self.engine.attach(self.playerNode)
        engine.connect(self.engine.inputNode, to: self.playerNode, format: nil)
        do {
            try self.engine.start()
        } catch {
            print("error couldn't start engine")
        }
        engineRunning = true
    }

But starting AVAudioEngine causes a crash:

    libc++abi: terminating with uncaught exception of type NSException
    *** Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio',
    reason: 'required condition is false: inDestImpl->NumberInputs() > 0 || graphNodeDest->CanResizeNumberOfInputs()'
    terminating with uncaught exception of type NSException

How do I get realtime record and playback of mic samples via headphones working?
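For reference, a minimal sketch of live mic monitoring with AVAudioEngine (assuming monitoring through headphones is the goal): AVAudioPlayerNode has no inputs to connect into, so the input node is routed to the main mixer instead, which does accept inputs.

    import AVFoundation

    private func setupMonitoring() {
        engine = AVAudioEngine()
        let input = engine.inputNode
        // Use the input's native format to avoid a format-mismatch exception.
        let format = input.outputFormat(forBus: 0)
        // Route mic input to the output path via the main mixer node.
        engine.connect(input, to: engine.mainMixerNode, format: format)
        do {
            try engine.start()
            engineRunning = true
        } catch {
            print("Could not start engine: \(error)")
        }
    }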
1
0
1.2k
Apr ’22
iOS Settings app dark mode matching colors
I have a UITableViewController with a grouped table view. No matter what I try, I can't match the dark mode colors of the native Settings app on iOS 14. I tried the following:

    self.tableView.backgroundColor = UIColor.systemGroupedBackground

And in tableView(_:cellForRowAt:), I set:

    cell.backgroundColor = UIColor.secondarySystemGroupedBackground

This matches the colors in light mode but not in dark mode.
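A small sketch of one thing worth ruling out (an assumption, not a confirmed match for Settings): with the inset-grouped style introduced in iOS 13, UIKit supplies the grouped palette for both appearances without any manual color assignments.

    import UIKit

    // Let UIKit pick the grouped background colors for light and dark mode
    // instead of assigning them by hand.
    let controller = UITableViewController(style: .insetGrouped)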
Topic: UI Frameworks · SubTopic: UIKit
0
0
365
Apr ’22
External display window switch at runtime programmatically
I display my view on an external display using UIScene, as follows, by selecting an appropriate UISceneConfiguration:

    // MARK: UISceneSession Lifecycle

    @available(iOS 13.0, *)
    func application(_ application: UIApplication,
                     configurationForConnecting connectingSceneSession: UISceneSession,
                     options: UIScene.ConnectionOptions) -> UISceneConfiguration {
        // Called when a new scene session is being created.
        // Use this method to select a configuration to create the new scene with.
        // return UISceneConfiguration(name: "Default Configuration", sessionRole: connectingSceneSession.role)

        // This switch is not strictly necessary; however, I found it useful for debugging.
        switch connectingSceneSession.role {
        case .windowApplication:
            return UISceneConfiguration(name: "Default Configuration", sessionRole: connectingSceneSession.role)
        case .windowExternalDisplay:
            return UISceneConfiguration(name: "External Screen", sessionRole: connectingSceneSession.role)
        default:
            fatalError("Unknown Configuration \(connectingSceneSession.role.rawValue)")
        }
    }

The above API is called automatically when the external screen is connected or disconnected. My question is whether there is any way, or any API, to disable/enable the external screen display at runtime (without the user disconnecting the HDMI cable)?
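For reference, a minimal sketch of one documented way to tear down the external-display scene at runtime (assuming you keep a reference to the connected UISceneSession for the external screen; the function name is hypothetical):

    import UIKit

    func dismissExternalScene(_ session: UISceneSession) {
        // Ask UIKit to destroy the scene session driving the external screen.
        let options = UISceneDestructionRequestOptions()
        UIApplication.shared.requestSceneSessionDestruction(session,
                                                            options: options) { error in
            print("Could not destroy scene session: \(error)")
        }
    }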
Topic: UI Frameworks · SubTopic: UIKit
0
0
934
May ’22
AVPlayerViewController content on both external screen and iOS device
I see that AVPlayerViewController automatically hides the video preview on the iOS device the moment it detects an external display. I want to present the video preview on the external display and on the iOS device at the same time. The audio should be routed to the external screen via HDMI and not played back on the iOS device. Is it possible to do this using standard APIs? I can think of a few ways, listed below, and would like to hear from the AVFoundation team and others what is recommended, or what the way out is:

1. Create two AVPlayerViewControllers and two AVPlayers, one of them attached to the externalScreen window, both displaying the same local video file. Audio would be muted for the player controller displaying video on the iOS device. Some play/pause/seek syncing code is required to keep both players in sync.

2. Use only one AVPlayerViewController that AirPlays to the external screen. Attach an AVPlayerItemVideoOutput to the playerItem, intercept the video frames, and display them using Metal/Core Image on the iOS device.

3. Use two AVPlayerLayers and add them to the external and device screens. Video frames will be replicated to both the external screen and the iOS device, but I am not sure whether it is possible to route audio only to the external screen via HDMI while muting it locally.
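For illustration, a minimal sketch of option 3 (videoURL, deviceView, and externalView are hypothetical; whether HDMI-only audio routing is achievable remains the open question above): multiple AVPlayerLayers may share a single AVPlayer, which mirrors the video to both screens without any syncing code.

    import AVFoundation
    import UIKit

    let player = AVPlayer(url: videoURL)

    // Two layers, one player: both layers render the same video frames.
    let deviceLayer = AVPlayerLayer(player: player)
    deviceLayer.frame = deviceView.bounds
    deviceView.layer.addSublayer(deviceLayer)

    let externalLayer = AVPlayerLayer(player: player)
    externalLayer.frame = externalView.bounds
    externalView.layer.addSublayer(externalLayer)

    player.play()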
0
0
809
May ’22