
Can MTKView display HDR CIImage/CVPixelBuffer?
I have tried everything, but it appears impossible to get MTKView to display the full range of colors of an HDR CIImage made from a CVPixelBuffer (in 10-bit YUV format). Only built-in layers such as AVCaptureVideoPreviewLayer, AVPlayerLayer, and AVSampleBufferDisplayLayer are able to fully display HDR images on iOS. Is MTKView incapable of displaying the full BT.2020 HLG color range? Why does MTKView clip colors even when I set colorPixelFormat to bgra10_xr or bgra10_xr_srgb?

```swift
convenience init(frame: CGRect, contentScale: CGFloat) {
    self.init(frame: frame)
    contentScaleFactor = contentScale
}

convenience init(frame: CGRect) {
    let device = MetalCamera.metalDevice
    self.init(frame: frame, device: device)
    colorPixelFormat = .bgra10_xr
    self.preferredFramesPerSecond = 30
}

override init(frame frameRect: CGRect, device: MTLDevice?) {
    guard let device = device else {
        fatalError("Can't use Metal")
    }
    guard let cmdQueue = device.makeCommandQueue(maxCommandBufferCount: 5) else {
        fatalError("Can't make Command Queue")
    }
    commandQueue = cmdQueue
    context = CIContext(mtlDevice: device, options: [CIContextOption.cacheIntermediates: false])
    super.init(frame: frameRect, device: device)
    self.framebufferOnly = false
    self.clearColor = MTLClearColor(red: 0, green: 0, blue: 0, alpha: 0)
}
```

And then the rendering code:

```swift
override func draw(_ rect: CGRect) {
    guard let image = self.image else {
        return
    }
    let dRect = self.bounds
    let drawImage: CIImage
    let targetSize = dRect.size
    let imageSize = image.extent.size
    let scalingFactor = min(targetSize.width / imageSize.width, targetSize.height / imageSize.height)
    let scalingTransform = CGAffineTransform(scaleX: scalingFactor, y: scalingFactor)
    let translation = CGPoint(x: (targetSize.width - imageSize.width * scalingFactor) / 2,
                              y: (targetSize.height - imageSize.height * scalingFactor) / 2)
    let translationTransform = CGAffineTransform(translationX: translation.x, y: translation.y)
    let scalingTranslationTransform = scalingTransform.concatenating(translationTransform)
    drawImage = image.transformed(by: scalingTranslationTransform)

    let commandBuffer = commandQueue.makeCommandBufferWithUnretainedReferences()
    guard let texture = self.currentDrawable?.texture else {
        return
    }
    var colorSpace: CGColorSpace
    if #available(iOS 14.0, *) {
        colorSpace = CGColorSpace(name: CGColorSpace.itur_2100_HLG)!
    } else {
        // Fallback on earlier versions
        colorSpace = drawImage.colorSpace ?? CGColorSpaceCreateDeviceRGB()
    }
    NSLog("Image \(colorSpace.name), \(image.colorSpace?.name)")
    context.render(drawImage, to: texture, commandBuffer: commandBuffer, bounds: dRect, colorSpace: colorSpace)
    commandBuffer?.present(self.currentDrawable!, afterMinimumDuration: 1.0 / Double(self.preferredFramesPerSecond))
    commandBuffer?.commit()
}
```
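The aspect-fit math in draw(_:) can be isolated into a small pure function, which makes it easy to verify independently of Metal and Core Image. This is a sketch; the helper name and tuple shape are mine, not part of the code above:

```swift
import Foundation

/// Computes the scale factor and centering offsets needed to aspect-fit
/// an image of `imageSize` inside `targetSize` (same math as in draw(_:)).
func aspectFit(imageSize: (w: Double, h: Double),
               targetSize: (w: Double, h: Double)) -> (scale: Double, tx: Double, ty: Double) {
    let scale = min(targetSize.w / imageSize.w, targetSize.h / imageSize.h)
    let tx = (targetSize.w - imageSize.w * scale) / 2
    let ty = (targetSize.h - imageSize.h * scale) / 2
    return (scale, tx, ty)
}
```

For a 1920x1080 frame fitted into a 960x960 view this yields a scale of 0.5 with the image vertically centered, matching what the transforms in draw(_:) produce.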
Replies: 1 · Boosts: 0 · Views: 2.2k · Apr ’23
MCBrowserViewController error -72008
I get the following error when configuring MCBrowserViewController to look for nearby peers, despite adding the required entries to Info.plist:

[MCNearbyServiceBrowser] NSNetServiceBrowser did not search with error dict [{
    NSNetServicesErrorCode = "-72008";
    NSNetServicesErrorDomain = 10;
}]

The Info.plist entries:

```xml
<key>NSLocalNetworkUsageDescription</key>
<string>Need permission to discover and connect to My Service running on peer iOS device</string>
<key>NSBonjourServices</key>
<array>
    <string>_my-server._tcp</string>
    <string>_my-server._udp</string>
</array>
```

Here is my code:

```swift
let browser = MCBrowserViewController(serviceType: "my-server", session: session)
browser.delegate = self
browser.minimumNumberOfPeers = kMCSessionMinimumNumberOfPeers
browser.maximumNumberOfPeers = 1
self.present(browser, animated: true, completion: nil)
```
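A common trigger for Bonjour browse failures is a serviceType that violates the documented naming rules, or NSBonjourServices entries that do not exactly match it. A quick check against the documented constraints (1 to 15 characters; lowercase ASCII letters, digits, and hyphens; at least one letter; no leading or trailing hyphen) can be run before constructing the browser. This validator is my own sketch, not a MultipeerConnectivity API:

```swift
import Foundation

/// Checks a Multipeer Connectivity serviceType against the documented
/// Bonjour naming rules: 1-15 characters; lowercase ASCII letters,
/// digits, and hyphens; must contain at least one letter; must not
/// begin or end with a hyphen.
func isValidServiceType(_ type: String) -> Bool {
    guard (1...15).contains(type.count) else { return false }
    guard !type.hasPrefix("-"), !type.hasSuffix("-") else { return false }
    let allowed = Set("abcdefghijklmnopqrstuvwxyz0123456789-")
    guard type.allSatisfy({ allowed.contains($0) }) else { return false }
    return type.contains(where: { $0.isLetter })
}
```

"my-server" passes this check, so in that case the mismatch is more likely between the serviceType and the `_my-server._tcp` / `_my-server._udp` strings the OS expects to find in Info.plist.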
Replies: 1 · Boosts: 0 · Views: 1k · Oct ’21
AVAssetWriter error -12870 after building with Xcode 13
The App Store version of my app works perfectly fine. But the moment I build the same code with Xcode 13, AVAssetWriter fails with errors at the very beginning, on both iOS 14 and iOS 15. This happens with a multicam session only.

Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (-12780)}

I am wondering what could be wrong. I even passed default (recommended) settings to AVAssetWriter. Update: It turned out to be an issue with audio settings.
Replies: 2 · Boosts: 0 · Views: 813 · Oct ’21
AVCaptureDevice isCenterStageEnabled KVO not fired
@AVFoundationEngineers I am trying to observe the isCenterStageEnabled property as follows:

```swift
AVCaptureDevice.self.addObserver(self, forKeyPath: "isCenterStageEnabled", options: [.initial, .new], context: &CapturePipeline.centerStageContext)
```

I have set centerStageControlMode to .cooperative. The KVO fires only when I change AVCaptureDevice.isCenterStageEnabled in my own code. It is NOT fired when the user toggles Center Stage from Control Center. Is this a bug?
Replies: 0 · Boosts: 0 · Views: 540 · Oct ’21
Metal Core Image passing sampler arguments
I am trying to use a CIColorKernel or CIBlendKernel with sampler arguments but the program crashes. Here is my shader code, which compiles successfully:

```metal
extern "C" float4 wipeLinear(coreimage::sampler t1, coreimage::sampler t2, float time) {
    float2 coord1 = t1.coord();
    float2 coord2 = t2.coord();
    float4 innerRect = t2.extent();

    float minX = innerRect.x + time * innerRect.z;
    float minY = innerRect.y + time * innerRect.w;
    float cropWidth = (1 - time) * innerRect.w;
    float cropHeight = (1 - time) * innerRect.z;

    float4 s1 = t1.sample(coord1);
    float4 s2 = t2.sample(coord2);
    if (coord1.x > minX && coord1.x < minX + cropWidth &&
        coord1.y > minY && coord1.y <= minY + cropHeight) {
        return s1;
    } else {
        return s2;
    }
}
```

And it crashes on initialization:

```swift
class CIWipeRenderer: CIFilter {
    var backgroundImage: CIImage?
    var foregroundImage: CIImage?
    var inputTime: Float = 0.0

    static var kernel: CIColorKernel = { () -> CIColorKernel in
        let url = Bundle.main.url(forResource: "AppCIKernels", withExtension: "ci.metallib")!
        let data = try! Data(contentsOf: url)
        return try! CIColorKernel(functionName: "wipeLinear", fromMetalLibraryData: data) // Crashes here!!!!
    }()

    override var outputImage: CIImage? {
        guard let backgroundImage = backgroundImage else { return nil }
        guard let foregroundImage = foregroundImage else { return nil }
        return CIWipeRenderer.kernel.apply(extent: backgroundImage.extent, arguments: [backgroundImage, foregroundImage, inputTime])
    }
}
```

It crashes on the try line with the following error:

Fatal error: 'try!' expression unexpectedly raised an error: Foundation._GenericObjCError.nilError

If I replace the kernel code with the following, it works like a charm:

```metal
extern "C" float4 wipeLinear(coreimage::sample_t s1, coreimage::sample_t s2, float time) {
    return mix(s1, s2, time);
}
```
Replies: 1 · Boosts: 0 · Views: 1.6k · Oct ’23
CIColorCube load lut data
I have a .cube file storing LUT data, such as this:

```
TITLE "Cool LUT"
LUT_3D_SIZE 64
0.0000 0.0000 0.0000
0.0000 0.0000 0.0000
0.0157 0.0000 0.0000
0.0353 0.0000 0.0000
```

My question is: how do I build the NSData that the CIColorCube filter expects? When using Metal, I convert this data into an MTLTexture using AdobeLUTParser. I am not sure what to do in the Core Image case.
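For reference, CIColorCube expects a block of RGBA float32 values, while a .cube file stores RGB triples, so each triple needs an alpha of 1.0 appended. Parsing along these lines may work; this is a sketch that assumes a well-formed file and does minimal error handling:

```swift
import Foundation

/// Parses the text of a .cube file into (size, RGBA float data) suitable
/// for CIColorCube's inputCubeDimension / inputCubeData, appending an
/// alpha of 1.0 to each RGB triple.
func parseCubeLUT(_ text: String) -> (size: Int, data: Data)? {
    var size = 0
    var floats: [Float] = []
    for line in text.split(whereSeparator: \.isNewline) {
        let trimmed = line.trimmingCharacters(in: .whitespaces)
        if trimmed.isEmpty || trimmed.hasPrefix("#") || trimmed.hasPrefix("TITLE") { continue }
        if trimmed.hasPrefix("LUT_3D_SIZE") {
            size = Int(String(trimmed.split(separator: " ").last ?? "")) ?? 0
            continue
        }
        let parts = trimmed.split(separator: " ").compactMap { Float(String($0)) }
        if parts.count == 3 {
            floats.append(contentsOf: parts)
            floats.append(1.0) // alpha
        }
    }
    guard size > 0, floats.count == size * size * size * 4 else { return nil }
    return (size, floats.withUnsafeBufferPointer { Data(buffer: $0) })
}
```

The resulting data would then go into the filter's inputCubeData, with inputCubeDimension set to the parsed size.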
Replies: 0 · Boosts: 0 · Views: 929 · Dec ’21
AVVideoComposition startRequest early return without calling finishWithComposedVideoFrame
@AVFoundationEngineers I browsed through the AVCustomEdit sample code and noticed the following:

```objc
- (void)startVideoCompositionRequest:(AVAsynchronousVideoCompositionRequest *)request
{
    @autoreleasepool {
        dispatch_async(_renderingQueue, ^() {
            // Check if all pending requests have been cancelled
            if (_shouldCancelAllRequests) {
                [request finishCancelledRequest];
            } else {
                NSError *err = nil;
                // Get the next rendered pixel buffer
                CVPixelBufferRef resultPixels = [self newRenderedPixelBufferForRequest:request error:&err];
                if (resultPixels) {
                    // The resulting pixel buffer from the OpenGL renderer is passed along to the request
                    [request finishWithComposedVideoFrame:resultPixels];
                    CFRelease(resultPixels);
                } else {
                    [request finishWithError:err];
                }
            }
        });
    }
}
```

startVideoCompositionRequest: returns early without calling finishWithComposedVideoFrame:, as the processing takes place asynchronously on a dispatch queue. However, the documentation of startVideoCompositionRequest: states the following:

Note that if the custom compositor's implementation of -startVideoCompositionRequest: returns without finishing the composition immediately, it may be invoked again with another composition request before the prior request is finished; therefore in such cases the custom compositor should be prepared to manage multiple composition requests.

But I don't see anything in the code that is prepared to handle multiple composition requests for the same video frame. How is one supposed to handle this case?
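One reading of the sample is that the serial _renderingQueue itself is the "management": overlapping requests simply pile up on the queue and are finished in arrival order, and the cancel flag drains them. A minimal Swift sketch of that pattern, with a hypothetical Request type standing in for AVAsynchronousVideoCompositionRequest:

```swift
import Dispatch

/// A toy stand-in for AVAsynchronousVideoCompositionRequest.
final class Request {
    let id: Int
    init(id: Int) { self.id = id }
}

/// Serializes overlapping composition requests on one serial queue, so
/// each is finished in arrival order even if start(_:) is called again
/// before the previous request completes.
final class SerialCompositor {
    private let renderingQueue = DispatchQueue(label: "compositor.rendering")
    private let stateQueue = DispatchQueue(label: "compositor.state")
    private(set) var finishedIDs: [Int] = []

    func start(_ request: Request) {
        renderingQueue.async {
            // Simulate rendering, then "finish" the request in order.
            self.stateQueue.sync { self.finishedIDs.append(request.id) }
        }
    }

    /// Blocks until every queued request has been finished.
    func waitUntilIdle() {
        renderingQueue.sync {}
    }
}
```

Because the queue is serial, a second request submitted mid-render cannot overtake the first; that is the whole concurrency story the documentation is alluding to.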
Replies: 0 · Boosts: 0 · Views: 593 · Jan ’22
AVPlayer seek completion handler not called
@AVFoundation Engineers, I hit this bug repeatedly when using AVComposition & AVVideoComposition: sometimes the AVPlayer seek-to-time completion handler is not called. I check a flag for whether a seek is in progress before placing another seek request, but if the completion handler is never invoked, all further seeks stall because the flag remains true. What is a reliable way to know a seek is not in progress before initiating another seek request?

```swift
playerSeeking = true
player.seek(to: time, toleranceBefore: .zero, toleranceAfter: .zero) { [weak self] completed in
    if !completed {
        NSLog("Seek not completed \(time.seconds)")
    }
    guard let self = self else {
        return
    }
    self.playerSeeking = false
    if self.player.rate == 0.0 {
        self.updateButtonStates()
    }
}
```
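One workaround for stalled seeks is the coalescing pattern Apple describes for smooth scrubbing: instead of gating on an in-progress flag, remember only the latest requested time and issue the next seek from the previous one's completion handler. A sketch with the player abstracted behind a closure so the logic is testable; the type and method names are mine, not AVFoundation API:

```swift
import Foundation

/// Coalesces rapid-fire seek requests: only the most recent target time
/// is kept, and the next seek is issued from the previous completion,
/// so a single dropped completion cannot stall an unbounded queue.
final class SeekCoalescer {
    private var isSeeking = false
    private var pendingTime: Double?
    private let performSeek: (Double, @escaping () -> Void) -> Void

    /// `performSeek` would wrap player.seek(to:toleranceBefore:toleranceAfter:).
    init(performSeek: @escaping (Double, @escaping () -> Void) -> Void) {
        self.performSeek = performSeek
    }

    func seek(to time: Double) {
        pendingTime = time            // newest target wins
        guard !isSeeking else { return }
        startNextSeek()
    }

    private func startNextSeek() {
        guard let target = pendingTime else { isSeeking = false; return }
        pendingTime = nil
        isSeeking = true
        performSeek(target) { [weak self] in
            self?.startNextSeek()     // drain whatever was queued meanwhile
        }
    }
}
```

Intermediate targets are simply overwritten, so the player only ever performs one seek per completed seek, which also keeps scrubbing responsive.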
Replies: 2 · Boosts: 0 · Views: 1.8k · Feb ’22
Combining two apps in Xcode
I have two Xcode workspaces, and both workspaces have multiple targets. Each target in both workspaces has a storyboard file called Main.storyboard. My problem is to combine one target in the first workspace with a second target in the other. What is the right approach to merge the targets?
Replies: 0 · Boosts: 0 · Views: 412 · Feb ’22
Urgent: Xcode git remove files of another project
I dragged a group of files from one Xcode project to another. I noticed Xcode copied the whole project folder instead of the selected group of files. I immediately selected the copied project folder and deleted it, choosing "Remove References Only", as I had a bad experience earlier where I chose to delete the files (in which case the files got deleted from the original project). But now when I commit the changes to git, it shows me a list of 1300 files from the original project to commit, even though they are not in the project. I searched all the project subfolders and those files are nowhere in the project. What do I do to make git forget those files?
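If the files are gone from the working tree but still sitting in git's index, unstaging them is usually enough. A hedged sketch of the commands worth trying, demonstrated in a scratch repository (check `git status` in the real project first, and the folder path is a placeholder for the accidentally copied folder):

```shell
# Demonstration in a scratch repo; the same commands apply in the real project.
cd "$(mktemp -d)"
git init -q .
git config user.email you@example.com
git config user.name you

# Simulate the accidentally staged folder.
mkdir -p CopiedProject && echo data > CopiedProject/file.txt
git add .

# Drop the folder from git's index without touching files on disk.
git rm -r -q --cached CopiedProject

# Nothing is tracked any more, but the file still exists on disk.
git ls-files
```

In the real project, `git reset` (with no arguments) is the blunter alternative: it unstages everything that was added while leaving the working tree untouched.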
Replies: 0 · Boosts: 0 · Views: 611 · Feb ’22
How to debug crash - Thread 1: EXC_BAD_INSTRUCTION (code=1, subcode=0x40343276)
I have a strange crash on an iOS device (EXC_BAD_INSTRUCTION). I have a custom UIControl called ScrollingScrubber; all it has is a UIScrollView, and it fires .valueChanged events while the user is scrolling. That's where the crash sometimes happens, and I have no idea how to debug it further.

```swift
var value: CGFloat = 0

override open func beginTracking(_ touch: UITouch, with event: UIEvent?) -> Bool {
    isUserDragging = scroller.isDragging
    if isUserDragging {
        sendActions(for: .editingDidBegin)
    }
    return isUserDragging
}

override open func continueTracking(_ touch: UITouch, with event: UIEvent?) -> Bool {
    value = (scroller.contentOffset.y + scroller.contentInset.top) / scroller.contentSize.height
    sendActions(for: .valueChanged) // It sometimes crashes here with EXC_BAD_INSTRUCTION. Why?
    return true
}
```
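One thing worth ruling out: if contentSize.height is ever zero while tracking (for example before layout), the division produces infinity or NaN, and feeding such a value to downstream consumers can trip assertions. A defensive version of the progress computation as a pure function; this is a guess at a possible cause, not a confirmed diagnosis:

```swift
import Foundation

/// Computes scrubber progress from scroll metrics, returning nil instead
/// of inf/NaN when the content height is not yet laid out.
func scrubberProgress(offsetY: Double, insetTop: Double, contentHeight: Double) -> Double? {
    guard contentHeight > 0 else { return nil }
    let progress = (offsetY + insetTop) / contentHeight
    return progress.isFinite ? progress : nil
}
```

In continueTracking one would skip sendActions(for: .valueChanged) whenever the helper returns nil, and attach a symbolicated crash log plus an exception breakpoint to find the real faulting frame.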
Replies: 0 · Boosts: 0 · Views: 400 · Apr ’22
How to observe AVCaptureDevice.isCenterStageEnabled?
WWDC 2021 session 10047 recommends observing changes in AVCaptureDevice.isCenterStageEnabled, which is a class property. But how exactly do we observe a class property in Swift?
Replies: 3 · Boosts: 0 · Views: 1.6k · Oct ’21
AVAssetWriter settings for ProRes422
What are the right AVAssetWriter compression settings for editing an existing ProRes video?
Replies: 5 · Boosts: 0 · Views: 1k · Oct ’21
AppStoreConnect hourly sales data down
Hourly sales data has not been updated in App Store Connect for the past 3 days. Two days ago, the same website running on different devices was showing different hourly sales figures, and now they all show zero sales. Is this a problem on the client side or a server outage?
Replies: 2 · Boosts: 0 · Views: 863 · Mar ’22
Xcode jump to definition does not work for many symbols
The screenshot says it all: it's a very irritating issue in Xcode where, for many built-in symbols, on selection it shows "Cut, Copy, ..." instead of "Jump to Definition". It happens most often when jumping from Swift to Objective-C symbols. I am wondering if there is any fix?
Replies: 0 · Boosts: 0 · Views: 602 · Apr ’22