How to know whether a device supports the captureTextFromCamera API?
The new captureTextFromCamera API lets a device run OCR from a UIResponder, but how can I know whether a given device supports OCR or not? e.g. iPhone X: NO, iPhone 13: YES
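A minimal sketch of a runtime probe, assuming iOS 15 or later (where `captureTextFromCamera(_:)` exists); this only asks a responder whether it can perform the action, which is how the capability is surfaced:

```swift
import UIKit

// Sketch: ask any UIResponder whether it can perform the Live Text
// capture action. Expected to be true only on devices whose camera
// hardware supports Live Text.
func supportsCaptureTextFromCamera() -> Bool {
    let tf = UITextField()
    return tf.canPerformAction(#selector(UIResponder.captureTextFromCamera(_:)),
                               withSender: nil)
}
```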
Replies
6
Boosts
0
Views
1.3k
Activity
Oct ’21
The value of view.layer.presentation() is not accurate.
I add two views on screen: one blue, the other orange. I use a CADisplayLink to update the orange view's frame from blueView.layer.presentation()!.frame. As expected, the orange view should cover the blue view completely, but we can see a gap, and a little of the blue view shows through during the keyboard animation. Here is the demo code:

```swift
class SecondViewController: UIViewController {

    let textField = UITextField()
    let blueBox = UIView()
    let realBox = UIView()

    lazy var displayLink = CADisplayLink(target: self, selector: #selector(self.onDisplayLink))

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        self.textField.resignFirstResponder()
    }

    @objc private func onDisplayLink() {
        // Follow the blue box's presentation (on-screen) frame.
        self.realBox.frame = CGRect(x: self.realBox.frame.origin.x,
                                    y: self.blueBox.layer.presentation()!.frame.origin.y,
                                    width: self.view.bounds.width,
                                    height: 100)
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        self.view.backgroundColor = .white

        textField.frame = .init(x: 100, y: 100, width: 100, height: 100)
        textField.backgroundColor = .red
        self.view.addSubview(textField)

        realBox.backgroundColor = .orange

        blueBox.backgroundColor = .blue
        blueBox.frame = .init(x: 0, y: self.view.bounds.height - 100, width: self.view.bounds.width, height: 100)
        self.view.addSubview(blueBox)
        self.view.addSubview(realBox)
        realBox.frame = .init(x: 0, y: self.view.bounds.height - 100, width: self.view.bounds.width, height: 100)

        NotificationCenter.default.addObserver(forName: UIResponder.keyboardWillChangeFrameNotification, object: nil, queue: .main) { noti in
            let userInfo = noti.userInfo!
            // The userInfo value is an NSValue wrapping a CGRect.
            let endFrame = (userInfo[UIResponder.keyboardFrameEndUserInfoKey] as! NSValue).cgRectValue
            let isOpen = endFrame.intersects(self.view.bounds)
            self.blueBox.frame = .init(x: 0,
                                       y: isOpen ? self.view.bounds.height - 100 - endFrame.height
                                                 : self.view.bounds.height - 100,
                                       width: self.view.bounds.width,
                                       height: 100)
        }

        if #available(iOS 15.0, *) {
            self.displayLink.preferredFrameRateRange = .init(minimum: 60, maximum: 120, preferred: 120)
        } else {
            self.displayLink.preferredFramesPerSecond = 120
        }
        self.displayLink.add(to: .main, forMode: .common)
    }
}
```

So how can I get an accurate value for the layer as currently displayed on screen? Here is the video: https://github.com/luckysmg/daily_images/blob/main/RPReplay_Final1661168764.mov?raw=true
Replies
3
Boosts
2
Views
1.7k
Activity
Aug ’22
CAMetalLayer.nextDrawable() is very time-consuming, taking 5ms-12ms
I don't think CAMetalLayer.nextDrawable() should be very time-consuming; it should take less than 1ms. But occasionally it takes more than 5ms, around 7ms-13ms, which is a very long time. How can we optimize this? It causes rendering jank.
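Two mitigations that are commonly suggested, sketched here as a hedged illustration rather than a confirmed fix for this report: keep three drawables in flight, and acquire the drawable as late in the frame as possible so that encoding work never blocks on it. The encoder setup is a placeholder:

```swift
import Metal
import QuartzCore

// Sketch: encode everything that does not need the drawable first,
// then acquire the drawable for only the final pass.
func renderFrame(layer: CAMetalLayer, queue: MTLCommandQueue) {
    layer.maximumDrawableCount = 3            // triple buffering
    guard let commandBuffer = queue.makeCommandBuffer() else { return }

    // ... encode passes that target offscreen textures here ...

    // nextDrawable() is where the CPU blocks when the system is backed
    // up, so call it last and hold the drawable as briefly as possible.
    guard let drawable = layer.nextDrawable() else { return }
    // ... encode only the final pass targeting drawable.texture ...
    commandBuffer.present(drawable)
    commandBuffer.commit()
}
```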
Replies
3
Boosts
0
Views
2.3k
Activity
Jan ’23
How to get animation interpolation during animation?
I want to get an animation's interpolated value while the animation is running, with something like the code below:

```swift
Animator(begin: 100, end: 200).addListener { value in
    // Called whenever the animation updates.
    print(value)
}
```

But I can't find any API like that in the iOS SDK, and I don't want to create a UIView or CALayer just to read its presentation layer for this value. How can I do that? 🤔
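Since the SDK has no such listener API, here is a minimal sketch of a hand-rolled value animator driven by CADisplayLink; the `Animator` type, its parameters, and the linear easing are assumptions, not SDK API:

```swift
import UIKit

// Sketch: a display-link-driven animator that reports interpolated
// values to a listener without touching any view or layer.
final class Animator {
    private let begin: CGFloat
    private let end: CGFloat
    private let duration: CFTimeInterval
    private var startTime: CFTimeInterval = 0
    private var displayLink: CADisplayLink?
    private var listener: ((CGFloat) -> Void)?

    init(begin: CGFloat, end: CGFloat, duration: CFTimeInterval = 0.25) {
        self.begin = begin
        self.end = end
        self.duration = duration
    }

    func addListener(_ listener: @escaping (CGFloat) -> Void) {
        self.listener = listener
        startTime = CACurrentMediaTime()
        displayLink = CADisplayLink(target: self, selector: #selector(tick))
        displayLink?.add(to: .main, forMode: .common)
    }

    @objc private func tick() {
        let t = min((CACurrentMediaTime() - startTime) / duration, 1)
        // Linear interpolation; swap in an easing curve as needed.
        listener?(begin + (end - begin) * CGFloat(t))
        if t >= 1 { displayLink?.invalidate() }
    }
}
```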
Replies
2
Boosts
0
Views
1.4k
Activity
Oct ’21
Can captureTextFromCamera API availability change while the app is running?
I use this method to judge whether a device supports the captureTextFromCamera API:

```swift
let tf = UITextField()
tf.canPerformAction(#selector(UIResponder.captureTextFromCamera(_:)), withSender: nil)
```

This returns false on iPhone X and true on iPhone 13 Pro. So I want to know: on an iPhone 13 Pro, can this method ever return false? Or is the value fixed for each device? Can I assume it will always return true on iPhone 13 Pro and always false on iPhone X?
Topic: UI Frameworks SubTopic: UIKit Tags:
Replies
2
Boosts
0
Views
1.1k
Activity
Jun ’22
Is there a way to increase the frequency of the touchesMoved method?
I have set CADisableMinimumFrameDurationOnPhone to YES, and my device is an iPhone 13 Pro, whose refresh rate is 120Hz max. But the touchesMoved callback fires at 60Hz (16.66ms) in a UIViewController; it should be 120Hz (0.008s) on an iPhone 13 Pro. Here is the test code:

```swift
import UIKit

class HomeViewController: UIViewController, UIScrollViewDelegate {

    var oldTime: CFAbsoluteTime = 0

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        let time = CFAbsoluteTimeGetCurrent()
        print(time - oldTime) // 0.016s, but it should be 0.008s on iPhone 13 Pro
        oldTime = time
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        self.view.backgroundColor = .red
    }
}
```

So how can I increase this frequency to 120Hz?
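One hedged workaround, assuming the goal is finer-grained touch samples rather than more callbacks: UIKit delivers intermediate samples as coalesced touches, which can be read inside the existing callback even when touchesMoved itself fires at 60Hz:

```swift
import UIKit

class HighRateTouchViewController: UIViewController {
    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first,
              let coalesced = event?.coalescedTouches(for: touch) else { return }
        // Each coalesced touch carries its own timestamp and location,
        // so the effective sampling rate can exceed the callback rate.
        for t in coalesced {
            print(t.timestamp, t.location(in: self.view))
        }
    }
}
```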
Replies
2
Boosts
1
Views
1.7k
Activity
Jul ’22
How to pass a custom object in MTAudioProcessingTapCallbacks?
I wrote the code below to pass my custom object Engine through MTAudioProcessingTapCallbacks. Here is the code:

```swift
func getTap() -> MTAudioProcessingTap? {
    var tap: Unmanaged<MTAudioProcessingTap>?

    func onInit(tap: MTAudioProcessingTap,
                clientInfo: UnsafeMutableRawPointer?,
                tapStorageOut: UnsafeMutablePointer<UnsafeMutableRawPointer?>) {
        let engine = Engine()
        tapStorageOut.pointee = Unmanaged<Engine>.passUnretained(engine).toOpaque()
    }

    var callback = MTAudioProcessingTapCallbacks(
        version: kMTAudioProcessingTapCallbacksVersion_0,
        clientInfo: nil,
        init: onInit,
        finalize: nil,
        prepare: nil,
        unprepare: nil
    ) { tap, numberFrames, flags, bufferListInOut, numberFramesOut, flagsOut in
        guard MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut, flagsOut, nil, numberFramesOut) == noErr else {
            preconditionFailure()
        }
        let storage = MTAudioProcessingTapGetStorage(tap)
        let engine = Unmanaged<Engine>.fromOpaque(storage).takeUnretainedValue()
        // This line crashes:
        // ClientProcessingTapManager (14): EXC_BAD_ACCESS (code=1, address=0x544453e46ea0)
        engine.dealWith(bufferPtr: bufferListInOut)
    }

    guard MTAudioProcessingTapCreate(kCFAllocatorDefault, &callback, kMTAudioProcessingTapCreationFlag_PostEffects, &tap) == noErr else {
        fatalError()
    }
    return tap?.takeRetainedValue()
}
```

How can I do it?
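A hedged sketch of one likely fix, under the assumption that the crash comes from the Engine being deallocated: it is only passed unretained in the init callback and nothing else owns it. Retaining it into the tap's storage and releasing it in a finalize callback is the usual pattern; names mirror the question's code:

```swift
import MediaToolbox

// Sketch: the init callback retains the Engine so the tap storage
// keeps it alive; the finalize callback balances that retain.
func onInit(tap: MTAudioProcessingTap,
            clientInfo: UnsafeMutableRawPointer?,
            tapStorageOut: UnsafeMutablePointer<UnsafeMutableRawPointer?>) {
    let engine = Engine()
    // passRetained (+1) keeps the Engine alive for the tap's lifetime.
    tapStorageOut.pointee = Unmanaged<Engine>.passRetained(engine).toOpaque()
}

func onFinalize(tap: MTAudioProcessingTap) {
    // Balance the retain from onInit when the tap is destroyed.
    Unmanaged<Engine>.fromOpaque(MTAudioProcessingTapGetStorage(tap)).release()
}
```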
Replies
1
Boosts
0
Views
1.4k
Activity
Nov ’21
AVAudioUnitTimePitch has a render latency
AVAudioUnitTimePitch.latency is 0.09s on my test devices, which introduces a small delay when rendering audio with AVAudioEngine. I just want to change the pitch while playing audio, so how can I avoid this latency?
Replies
1
Boosts
0
Views
1.9k
Activity
Mar ’23
iOS device compatibility reference is outdated; where is the new doc now?
Here is the old doc: https://developer.apple.com/library/archive/documentation/DeviceInformation/Reference/iOSDeviceCompatibility/Displays/Displays.html#//apple_ref/doc/uid/TP40013599-CH108-SW1 Where is the new one?
Topic: UI Frameworks SubTopic: UIKit Tags:
Replies
1
Boosts
1
Views
967
Activity
Aug ’22
How to get the height of padding in UINavigationBar
Is the value of the top padding of a UINavigationBar UIApplication.shared.statusBarFrame.height, or UIApplication.shared.keyWindow?.safeAreaInsets.top? See the blue box in this image: https://github.com/luckysmg/daily_images/blob/main/20220928153515.jpg?raw=true I ask because the two values are the same on devices older than the iPhone 14 Pro, but they differ on the iPhone 14 Pro and iPhone 14 Pro Max. Which is the right value to use?
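A minimal sketch of a third option that sidesteps the choice, assuming the bar is managed by a UINavigationController: measure where the bar actually sits in its window, so the padding is whatever the system reserved above it on this particular device:

```swift
import UIKit

// Sketch: the distance from the top of the window to the top of the
// navigation bar is the actual padding, regardless of whether
// statusBarFrame and safeAreaInsets.top agree on this device.
func topPadding(of navigationController: UINavigationController) -> CGFloat {
    let bar = navigationController.navigationBar
    guard let window = bar.window else { return 0 }
    return bar.convert(bar.bounds, to: window).minY
}
```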
Replies
1
Boosts
0
Views
1.1k
Activity
Oct ’22
How to get keyboard animation curve stops and interpolation?
The keyboard animation is private. How can we get the interpolated values of the keyboard animation through an API?
Topic: UI Frameworks SubTopic: UIKit Tags:
Replies
1
Boosts
0
Views
986
Activity
Nov ’22
Crash when set CAMetalLayer.presentsWithTransaction=true in background thread
When metalLayer.presentsWithTransaction = true is set and we call drawable.present() on a background thread (our rendering thread is not the main thread), the app sometimes crashes with: *** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'Modifications to the layout engine must not be performed from a background thread after it has been accessed from the main thread.' Any solution?
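For reference, a hedged sketch of the presentation pattern Apple documents for presentsWithTransaction with off-main-thread rendering (commit, wait until scheduled, then present). Note it does not by itself fix UIKit or layout-engine access from a background thread, which must stay on the main thread:

```swift
import Metal
import QuartzCore

// Sketch: with presentsWithTransaction enabled, present the drawable
// only after the command buffer is scheduled, so the present lands in
// the current Core Animation transaction.
func drawOffMainThread(layer: CAMetalLayer, queue: MTLCommandQueue) {
    guard let drawable = layer.nextDrawable(),
          let commandBuffer = queue.makeCommandBuffer() else { return }
    // ... encode rendering into commandBuffer here ...
    commandBuffer.commit()
    commandBuffer.waitUntilScheduled()
    drawable.present()
}
```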
Replies
1
Boosts
0
Views
1.3k
Activity
Jan ’23
iOS 16.6 CAMetalLayer nextDrawable takes a long time
As the title describes: before iOS 16.6 rendering was smooth, but on iOS 16.6 it lags. iOS 17 beta 4 seems to be OK. Maybe Apple should post a hotfix for iOS 16.6.
Replies
1
Boosts
1
Views
977
Activity
Aug ’23
How to export audio file from `AVAudioEngine`
I have an AVAudioEngine, but I don't know how to export the audio it produces to a file. Can anyone help?
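A hedged sketch of one common approach, using the engine's offline manual rendering mode; the output URL, `duration`, and the 4096-frame buffer size are placeholders, and the engine is assumed to already have its nodes connected:

```swift
import AVFoundation

// Sketch: render an already-configured engine offline and write the
// result to a file. `duration` is the total number of frames to render.
func exportAudio(engine: AVAudioEngine, duration: AVAudioFramePosition, to url: URL) throws {
    let format = engine.mainMixerNode.outputFormat(forBus: 0)
    engine.stop() // manual rendering mode can only be enabled while stopped
    try engine.enableManualRenderingMode(.offline, format: format, maximumFrameCount: 4096)
    try engine.start()

    let file = try AVAudioFile(forWriting: url, settings: format.settings)
    let buffer = AVAudioPCMBuffer(pcmFormat: engine.manualRenderingFormat,
                                  frameCapacity: engine.manualRenderingMaximumFrameCount)!

    while engine.manualRenderingSampleTime < duration {
        let remaining = duration - engine.manualRenderingSampleTime
        let frames = min(AVAudioFrameCount(remaining), buffer.frameCapacity)
        let status = try engine.renderOffline(frames, to: buffer)
        if status == .success {
            try file.write(from: buffer)
        } else {
            break // source could not supply data; stop exporting
        }
    }
    engine.stop()
}
```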
Replies
0
Boosts
0
Views
1.2k
Activity
Nov ’21
How to process the audio bufferList with AVAudioEngine?
I have an AVMutableAudioMix and use an MTAudioProcessingTap to process the audio data. But after I pass the buffer to an AVAudioEngine and render it with renderOffline, the audio has no effects applied. How can I do this? Any ideas? Here is the code for the MTAudioProcessingTapProcessCallback:

```swift
var callback = MTAudioProcessingTapCallbacks(
    version: kMTAudioProcessingTapCallbacksVersion_0,
    clientInfo: UnsafeMutableRawPointer(Unmanaged.passUnretained(self.engine).toOpaque()),
    init: tapInit,
    finalize: tapFinalize,
    prepare: tapPrepare,
    unprepare: tapUnprepare
) { tap, numberFrames, flags, bufferListInOut, numberFramesOut, flagsOut in
    guard MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut, flagsOut, nil, numberFramesOut) == noErr else {
        preconditionFailure()
    }
    let storage = MTAudioProcessingTapGetStorage(tap)
    let engine = Unmanaged<Engine>.fromOpaque(storage).takeUnretainedValue()
    // Render the audio with effects applied.
    engine.render(bufferPtr: bufferListInOut, numberOfFrames: numberFrames)
}
```

And here is the Engine code:

```swift
class Engine {
    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()
    let pitchEffect = AVAudioUnitTimePitch()
    let reverbEffect = AVAudioUnitReverb()
    let rateEffect = AVAudioUnitVarispeed()
    let volumeEffect = AVAudioUnitEQ()
    let format = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: 44100, channels: 2, interleaved: false)!

    init() {
        engine.attach(player)
        engine.attach(pitchEffect)
        engine.attach(reverbEffect)
        engine.attach(rateEffect)
        engine.attach(volumeEffect)

        engine.connect(player, to: pitchEffect, format: format)
        engine.connect(pitchEffect, to: reverbEffect, format: format)
        engine.connect(reverbEffect, to: rateEffect, format: format)
        engine.connect(rateEffect, to: volumeEffect, format: format)
        engine.connect(volumeEffect, to: engine.mainMixerNode, format: format)

        try! engine.enableManualRenderingMode(.offline, format: format, maximumFrameCount: 4096)

        reverbEffect.loadFactoryPreset(.largeRoom2)
        reverbEffect.wetDryMix = 100
        pitchEffect.pitch = 2100

        try! engine.start()
        player.play()
    }

    func render(bufferPtr: UnsafeMutablePointer<AudioBufferList>, numberOfFrames: CMItemCount) {
        let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: 4096)!
        buffer.frameLength = AVAudioFrameCount(numberOfFrames)
        buffer.mutableAudioBufferList.pointee = bufferPtr.pointee
        self.player.scheduleBuffer(buffer) {
            try! self.engine.renderOffline(AVAudioFrameCount(numberOfFrames), to: buffer)
        }
    }
}
```
Replies
0
Boosts
0
Views
1.5k
Activity
Nov ’21