UITableView detailTextLabel equivalent in UIListContentConfiguration
I see a deprecation warning when using detailTextLabel of UITableViewCell:

```swift
@available(iOS, introduced: 3.0, deprecated: 100000, message: "Use UIListContentConfiguration instead, this property will be deprecated in a future release.")
open var detailTextLabel: UILabel? { get } // default is nil. label will be created if necessary (and the current style supports a detail label).
```

But it is not clear how to use UIListContentConfiguration to reproduce a detailTextLabel on the right side of the cell. I only see secondaryText in UIListContentConfiguration, and it is always displayed as a subtitle. How does one use UIListContentConfiguration as a replacement?
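For what it may be worth, UIListContentConfiguration.valueCell() appears to be the counterpart of the old .value1 cell style: it lays out secondaryText on the trailing edge rather than as a subtitle. A minimal sketch (the helper name and call site are illustrative, assuming a standard cellForRowAt context):

```swift
import UIKit

// Sketch: a value-cell configuration places secondaryText on the
// trailing edge, mirroring the old detailTextLabel placement.
func configure(_ cell: UITableViewCell, title: String, detail: String) {
    var content = UIListContentConfiguration.valueCell()
    content.text = title            // leading, like textLabel
    content.secondaryText = detail  // trailing, like detailTextLabel
    cell.contentConfiguration = content
}
```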
Replies
2
Boosts
0
Views
1.3k
Activity
Apr ’22
RemoteIO to AVAudioEngine port
I have a RemoteIO unit that successfully plays back the microphone samples in realtime via attached headphones. I need to port the same functionality to AVAudioEngine, but I can't seem to make a head start. Here is my code; all I do is connect inputNode to playerNode, which crashes.

```swift
var engine: AVAudioEngine!
var playerNode: AVAudioPlayerNode!
var mixer: AVAudioMixerNode!
var engineRunning = false

private func setupAudioSession() {
    let options: AVAudioSession.CategoryOptions = [.allowBluetooth, .allowBluetoothA2DP]
    do {
        try AVAudioSession.sharedInstance().setCategory(.playAndRecord, mode: .default, options: options)
        try AVAudioSession.sharedInstance().setAllowHapticsAndSystemSoundsDuringRecording(true)
    } catch {
        MPLog("Could not set audio session category")
    }
    let audioSession = AVAudioSession.sharedInstance()
    do {
        try audioSession.setActive(false)
        try audioSession.setPreferredSampleRate(Double(44100))
    } catch {
        print("Unable to deactivate Audio session")
    }
    do {
        try audioSession.setActive(true)
    } catch {
        print("Unable to activate AudioSession")
    }
}

private func setupAudioEngine() {
    self.engine = AVAudioEngine()
    self.playerNode = AVAudioPlayerNode()
    self.engine.attach(self.playerNode)
    engine.connect(self.engine.inputNode, to: self.playerNode, format: nil)
    do {
        try self.engine.start()
    } catch {
        print("error couldn't start engine")
    }
    engineRunning = true
}
```

But starting AVAudioEngine causes a crash:

```
libc++abi: terminating with uncaught exception of type NSException
*** Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'required condition is false: inDestImpl->NumberInputs() > 0 || graphNodeDest->CanResizeNumberOfInputs()'
terminating with uncaught exception of type NSException
```

How do I get realtime record and playback of mic samples via headphones working?
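The exception message suggests the destination node accepts no inbound connections, and AVAudioPlayerNode has no input busses: it is a source node for scheduled buffers and files, not a sink for the live input. A minimal sketch of mic monitoring (an assumption about the intended topology, not a confirmed fix): route inputNode straight into mainMixerNode instead.

```swift
import AVFoundation

// Sketch: monitor the microphone through headphones with AVAudioEngine.
// AVAudioPlayerNode accepts no inbound connections; for live monitoring,
// connect inputNode to the main mixer instead.
func startMonitoring() throws -> AVAudioEngine {
    let engine = AVAudioEngine()
    let input = engine.inputNode
    let format = input.inputFormat(forBus: 0)   // use the hardware input format
    engine.connect(input, to: engine.mainMixerNode, format: format)
    try engine.start()
    return engine
}
```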
Replies
1
Boosts
0
Views
1.2k
Activity
Apr ’22
Xcode jump to definition does not work for many symbols
The screenshot says it all: it's a very irritating issue in Xcode where, for many built-in symbols, selecting one shows "Cut, Copy, ..." instead of "Jump to Definition". It happens most often when jumping from Swift to Objective-C symbols. Wondering if there is any fix?
Replies
0
Boosts
0
Views
602
Activity
Apr ’22
How to debug crash - Thread 1: EXC_BAD_INSTRUCTION (code=1, subcode=0x40343276)
I have a strange crash on an iOS device (EXC_BAD_INSTRUCTION). I have a custom UIControl called ScrollingScrubber; all it has is a UIScrollView, and it fires .valueChanged events while the user is scrolling. That's where the crash sometimes happens, and I have no idea how to debug it further.

```swift
var value: CGFloat = 0

override open func beginTracking(_ touch: UITouch, with event: UIEvent?) -> Bool {
    isUserDragging = scroller.isDragging
    if isUserDragging {
        sendActions(for: .editingDidBegin)
    }
    return isUserDragging
}

override open func continueTracking(_ touch: UITouch, with event: UIEvent?) -> Bool {
    value = (scroller.contentOffset.y + scroller.contentInset.top) / scroller.contentSize.height
    sendActions(for: .valueChanged) // It sometimes crashes here with EXC_BAD_INSTRUCTION, why?
    return true
}
```
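One thing worth ruling out (an assumption, not a confirmed diagnosis): if scroller.contentSize.height is ever 0 during tracking, the division produces NaN or infinity, which can trap later inside a .valueChanged target. A defensive sketch of the value computation (the function name is illustrative):

```swift
import CoreGraphics

// Sketch: compute the scrubber value defensively so a zero content
// height can never propagate NaN/inf into .valueChanged targets.
func scrubberValue(offsetY: CGFloat, insetTop: CGFloat, contentHeight: CGFloat) -> CGFloat {
    guard contentHeight > 0 else { return 0 }   // avoid divide-by-zero
    let raw = (offsetY + insetTop) / contentHeight
    return min(max(raw, 0), 1)                  // clamp to [0, 1]
}
```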
Replies
0
Boosts
0
Views
400
Activity
Apr ’22
App Store Connect hourly sales data down
Hourly sales data has not updated in App Store Connect for the past 3 days. Two days ago, the same website running on different devices was showing different hourly sales figures, and now they all show zero sales. Is it a problem on the client side or a server outage?
Replies
2
Boosts
0
Views
863
Activity
Mar ’22
AVPlayer audio metering for AVComposition
I have an AVComposition playing back via AVPlayer, where the AVComposition has multiple audio tracks with an audioMix applied. My question is: how can one compute audio meter values for the audio playing back through AVPlayer? Using MTAudioProcessingTap, it seems you can only get a callback for one track at a time. But if that route has to be used, it's not clear how to get sample values of all the audio tracks at a given time in a single callback.
Replies
1
Boosts
0
Views
1.6k
Activity
Mar ’22
Autolayout warning "NSLayoutConstraint is being configured with a constant that exceeds internal limits"
I have a subclass of UIScrollView called MyScrollView. There is a subview called contentView inside MyScrollView. The width constraint is set to the contentSize of MyScrollView.

```swift
private func setupSubviews() {
    contentView = ContentView()
    contentView.backgroundColor = UIColor.blue
    contentView.translatesAutoresizingMaskIntoConstraints = false
    contentView.isUserInteractionEnabled = true
    self.addSubview(contentView)

    contentView.leadingAnchor.constraint(equalTo: self.leadingAnchor).isActive = true
    contentView.trailingAnchor.constraint(equalTo: self.trailingAnchor).isActive = true
    contentView.topAnchor.constraint(equalTo: self.topAnchor).isActive = true
    contentView.bottomAnchor.constraint(equalTo: self.bottomAnchor).isActive = true

    // create contentView's width and height constraints
    cvWidthConstraint = contentView.widthAnchor.constraint(equalToConstant: 0.0)
    cvHeightConstraint = contentView.heightAnchor.constraint(equalToConstant: 0.0)
    // activate them
    cvWidthConstraint.isActive = true
    cvHeightConstraint.isActive = true

    cvWidthConstraint.constant = myWidthConstant // <--- problem here if myWidthConstant is very high, such as 512000
    cvHeightConstraint.constant = frame.height
    contentView.layoutIfNeeded()
}
```

The problem is that if I set cvWidthConstraint.constant to a very high value such as 512000, I get a warning:

```
This NSLayoutConstraint is being configured with a constant that exceeds internal limits. A smaller value will be substituted, but this problem should be fixed. Break on BOOL _NSLayoutConstraintNumberExceedsLimit(void) to debug. This will be logged only once. This may break in the future.
```

How does one set a UIScrollView content size to very high values?
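One possible workaround (an assumption, not a confirmed fix): skip the width constraint entirely and drive contentSize directly, which is not subject to the constraint-constant limit, sizing the content view manually. A sketch (class and property names are illustrative):

```swift
import UIKit

// Sketch: a very large scrollable width without an Auto Layout
// constant, by assigning contentSize directly in layoutSubviews.
final class WideScrollView: UIScrollView {
    let contentView = UIView()
    var myWidthConstant: CGFloat = 512_000   // hypothetical large width

    override func layoutSubviews() {
        super.layoutSubviews()
        contentSize = CGSize(width: myWidthConstant, height: bounds.height)
        contentView.frame = CGRect(origin: .zero, size: contentSize)
    }
}
```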
Replies
1
Boosts
0
Views
1.2k
Activity
Mar ’22
What exactly is the use of CIKernel DOD?
I wrote the following Metal Core Image kernel to produce a constant red color:

```metal
extern "C" float4 redKernel(coreimage::sampler inputImage, coreimage::destination dest)
{
    return float4(1.0, 0.0, 0.0, 1.0);
}
```

And then I have this in Swift code:

```swift
class CIMetalRedColorKernel: CIFilter {
    var inputImage: CIImage?

    static var kernel: CIKernel = { () -> CIKernel in
        let bundle = Bundle.main
        let url = bundle.url(forResource: "Kernels", withExtension: "ci.metallib")!
        let data = try! Data(contentsOf: url)
        return try! CIKernel(functionName: "redKernel", fromMetalLibraryData: data)
    }()

    override var outputImage: CIImage? {
        guard let inputImage = inputImage else {
            return nil
        }
        let dod = inputImage.extent
        return CIMetalRedColorKernel.kernel.apply(extent: dod, roiCallback: { index, rect in
            return rect
        }, arguments: [inputImage])
    }
}
```

As you can see, the DOD is given as the extent of the input image. But when I run the filter, I get red pixels beyond the extent of the input image (the DOD). Why? I have multiple filters chained together and the overall size is 1920x1080. Isn't the red filter supposed to run only over the DOD rectangle passed to it, and produce clear pixels for anything outside the DOD?
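A hedged note from my reading of Core Image's model: the DOD declares where the output may be non-transparent for tiling and concatenation purposes, but a renderer can still evaluate the kernel outside that rect; if a hard boundary is needed, cropping the result enforces it explicitly. A sketch:

```swift
import CoreImage

// Sketch: clamp a kernel's output to its declared domain of definition
// by cropping the produced CIImage to that rect.
func clampedOutput(of image: CIImage, using kernel: CIKernel) -> CIImage? {
    let dod = image.extent
    let out = kernel.apply(extent: dod,
                           roiCallback: { _, rect in rect },
                           arguments: [image])
    return out?.cropped(to: dod)   // everything outside dod becomes clear
}
```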
Replies
4
Boosts
0
Views
1.7k
Activity
Feb ’22
Urgent: Xcode git shows files of another project
I dragged a group of files from one Xcode project to another. I noticed Xcode copied the whole project folder instead of the selected group of files. I immediately selected the copied project folder and deleted it, choosing "Remove References Only", as I had a bad experience earlier where I chose to delete the files (in which case the files got deleted from the original project). But now when I commit the changes to git, it shows me a list of 1300 files from the original project to commit, even though they are not in the project. I searched all the project subfolders and those files are nowhere in the project. What do I do to make git forget those files?
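A possible cleanup (a sketch under the assumption the stray files were staged but never wanted): unstage them with git rm --cached, then ignore the folder so git stops suggesting them. Demonstrated below in a throwaway repo with a hypothetical Vendor/ folder standing in for the copied project folder:

```shell
# Demo in a throwaway repo: stage a folder by mistake, then make git forget it.
set -e
tmp=$(mktemp -d) && cd "$tmp"
git init -q .
mkdir Vendor && echo data > Vendor/file.txt
git add Vendor                      # the accidental staging
git rm -r -q --cached Vendor        # remove from the index, keep files on disk
echo "Vendor/" >> .gitignore        # stop git from re-listing them
git status --porcelain              # Vendor files no longer staged for commit
```

If the files were already committed at some point, the same git rm --cached step works against the committed index; the files stay on disk but leave version control at the next commit.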
Replies
0
Boosts
0
Views
611
Activity
Feb ’22
Combining two apps in Xcode
I have two Xcode workspaces, and both have multiple targets. Each target in both workspaces has a storyboard file called Main.storyboard. My problem is to combine one target in the first workspace with a second target in the other. What is the right approach to merge the targets?
Replies
0
Boosts
0
Views
412
Activity
Feb ’22
AVPlayer seek completion handler not called
@AVFoundation Engineers, I hit this bug repeatedly when using AVComposition & AVVideoComposition. Sometimes the AVPlayer seek-to-time completion handler is not called. I check a flag for whether a seek is in progress before placing another seek request. But if the completion handler is never invoked, all further seeks stall because the flag remains true. What is a reliable way to know a seek is not in progress before initiating another seek request?

```swift
playerSeeking = true
player.seek(to: time, toleranceBefore: .zero, toleranceAfter: .zero) { [weak self] completed in
    if !completed {
        NSLog("Seek not completed \(time.seconds)")
    }
    guard let self = self else {
        return
    }
    self.playerSeeking = false
    if self.player.rate == 0.0 {
        self.updateButtonStates()
    }
}
```
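One commonly suggested pattern (a sketch, and not guaranteed to cover the case of a completion handler that never fires at all): coalesce seeks so at most one is in flight, remembering only the latest requested time and issuing it when the current seek's handler reports in. The class name is illustrative:

```swift
import AVFoundation

// Sketch: coalesce seek requests so at most one is in flight;
// the newest requested time wins once the current seek completes.
final class SeekCoalescer {
    private let player: AVPlayer
    private var isSeekInProgress = false
    private var chaseTime = CMTime.zero

    init(player: AVPlayer) { self.player = player }

    func seek(to time: CMTime) {
        chaseTime = time
        if !isSeekInProgress { seekToChaseTime() }
    }

    private func seekToChaseTime() {
        isSeekInProgress = true
        let target = chaseTime
        player.seek(to: target, toleranceBefore: .zero, toleranceAfter: .zero) { [weak self] _ in
            guard let self = self else { return }
            if CMTimeCompare(target, self.chaseTime) == 0 {
                self.isSeekInProgress = false   // caught up with the latest request
            } else {
                self.seekToChaseTime()          // a newer target arrived meanwhile
            }
        }
    }
}
```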
Replies
2
Boosts
0
Views
1.8k
Activity
Feb ’22
Passing MTLTexture to Metal Core Image Kernel
Is it possible to pass MTLTexture to Metal Core Image Kernel? How can Metal resources be shared with Core Image?
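One route that appears to exist for this (hedged: check platform availability and color-space handling for your pipeline): CIImage can wrap a Metal texture directly, so a texture rendered by your own Metal pass can enter a Core Image graph and be passed to a kernel like any other image argument. A sketch:

```swift
import CoreImage
import Metal

// Sketch: wrap an existing MTLTexture as a CIImage so it can be
// passed to a CIKernel as an ordinary image argument.
func ciImage(from texture: MTLTexture, colorSpace: CGColorSpace) -> CIImage? {
    CIImage(mtlTexture: texture, options: [.colorSpace: colorSpace])
}
```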
Replies
1
Boosts
0
Views
1.2k
Activity
Feb ’22
AVVideoComposition startRequest early return without calling finishWithComposedVideoFrame
@AVFoundationEngineers I browsed through the AVCustomEdit sample code and noticed the following:

```objc
- (void)startVideoCompositionRequest:(AVAsynchronousVideoCompositionRequest *)request
{
    @autoreleasepool {
        dispatch_async(_renderingQueue, ^() {
            // Check if all pending requests have been cancelled
            if (_shouldCancelAllRequests) {
                [request finishCancelledRequest];
            } else {
                NSError *err = nil;
                // Get the next rendered pixel buffer
                CVPixelBufferRef resultPixels = [self newRenderedPixelBufferForRequest:request error:&err];
                if (resultPixels) {
                    // The resulting pixel buffer from the OpenGL renderer is passed along to the request
                    [request finishWithComposedVideoFrame:resultPixels];
                    CFRelease(resultPixels);
                } else {
                    [request finishWithError:err];
                }
            }
        });
    }
}
```

startVideoCompositionRequest: returns early without calling finishWithComposedVideoFrame, since the processing takes place asynchronously on a dispatch queue. However, the documentation of startVideoCompositionRequest: states the following:

"Note that if the custom compositor's implementation of -startVideoCompositionRequest: returns without finishing the composition immediately, it may be invoked again with another composition request before the prior request is finished; therefore in such cases the custom compositor should be prepared to manage multiple composition requests."

But I don't see anything in the code that is prepared to handle multiple composition requests for the same video frame. How is one supposed to handle this case?
Replies
0
Boosts
0
Views
593
Activity
Jan ’22
CIColorCube load lut data
I have a .cube file storing LUT data, such as this:

```
TITLE "Cool LUT"
LUT_3D_SIZE 64
0.0000 0.0000 0.0000
0.0000 0.0000 0.0000
0.0157 0.0000 0.0000
0.0353 0.0000 0.0000
```

My question is how do I build the NSData that the CIColorCube filter requires? When using Metal, I convert this data into an MTLTexture using AdobeLUTParser. I am not sure what to do in the case of Core Image.
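A hedged sketch of the conversion (assumptions: the .cube file stores size³ RGB triples in the standard order, and CIColorCube expects size³ RGBA float32 values, so an opaque alpha is appended to each triple; the function name is illustrative):

```swift
import CoreImage

// Sketch: parse .cube text into the RGBA float data CIColorCube expects.
func colorCubeData(fromCubeText text: String) -> (data: Data, dimension: Int)? {
    var dimension = 0
    var floats: [Float] = []
    for line in text.split(separator: "\n") {
        let trimmed = line.trimmingCharacters(in: .whitespaces)
        if trimmed.isEmpty || trimmed.hasPrefix("#") || trimmed.hasPrefix("TITLE") { continue }
        if trimmed.hasPrefix("LUT_3D_SIZE") {
            dimension = Int(trimmed.split(separator: " ").last ?? "") ?? 0
            continue
        }
        let parts = trimmed.split(separator: " ").compactMap { Float($0) }
        if parts.count == 3 {
            floats.append(contentsOf: parts)
            floats.append(1.0)   // CIColorCube wants RGBA; append opaque alpha
        }
    }
    guard dimension > 0, floats.count == dimension * dimension * dimension * 4 else { return nil }
    return (floats.withUnsafeBufferPointer { Data(buffer: $0) }, dimension)
}
```

The resulting data and dimension would then go into the filter's inputCubeData and inputCubeDimension parameters.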
Replies
0
Boosts
0
Views
929
Activity
Dec ’21
AVAssetWriter settings for ProRes422
For editing ProRes videos, what are the right compression settings for AVAssetWriter? This is for editing an existing ProRes video.
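A hedged sketch of output settings that may be a starting point (the keys are standard AVFoundation keys; ProRes encoders take no bitrate-style compression properties, so the dictionary is minimal, and the dimensions here are illustrative):

```swift
import AVFoundation

// Sketch: video output settings for writing Apple ProRes 422
// with AVAssetWriter.
func proRes422Settings(width: Int, height: Int) -> [String: Any] {
    [
        AVVideoCodecKey: AVVideoCodecType.proRes422,
        AVVideoWidthKey: width,
        AVVideoHeightKey: height
    ]
}

// Usage sketch:
// let input = AVAssetWriterInput(mediaType: .video,
//                                outputSettings: proRes422Settings(width: 1920, height: 1080))
```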
Replies
5
Boosts
0
Views
1k
Activity
Oct ’21