What I want is to rotate the video from 0 to 90 degrees over the first 2 seconds (0–2 s).
Here is the code:
let layerIns = AVMutableVideoCompositionLayerInstruction(assetTrack: track)
let endTransform = CGAffineTransform(rotationAngle: CGFloat(Double.pi / 2))
let timeRange = CMTimeRange(start: .zero, duration: CMTime(seconds: 2, preferredTimescale: 600))
layerIns.setTransformRamp(fromStart: .identity, toEnd: endTransform, timeRange: timeRange)
But the center of rotation is the top-left corner, which makes the rotation look strange.
How can I change the video layer's anchor point so the rotation happens around the center instead of the top-left corner?
Related question on Stack Overflow: https://stackoverflow.com/questions/65575655/how-to-change-the-settransformramp-s-transform-anchor-point
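One workaround I'm aware of is to compose the rotation with translations so it pivots around the center, and to express both ends of the ramp that way. Below is a rough sketch; renderSize is an assumed stand-in for the composition's render size (or the track's natural size), and I'm not sure the ramp interpolates these composed matrices the way I want, which is why I'd still like to change the anchor point itself.
// Hypothetical helper: builds a rotation that pivots around the center of the frame
// instead of the top-left origin. renderSize is an assumption, not part of my code above.
func centeredRotation(by angle: CGFloat, in renderSize: CGSize) -> CGAffineTransform {
    CGAffineTransform(translationX: renderSize.width / 2, y: renderSize.height / 2)
        .rotated(by: angle)
        .translatedBy(x: -renderSize.width / 2, y: -renderSize.height / 2)
}

let startCentered = centeredRotation(by: 0, in: renderSize)
let endCentered = centeredRotation(by: .pi / 2, in: renderSize)
layerIns.setTransformRamp(fromStart: startCentered, toEnd: endCentered, timeRange: timeRange)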
The new captureTextFromCamera API lets a device use OCR through UIResponder, but how can I know whether a given device supports OCR or not?
e.g.:
iPhone X: NO,
iPhone 13: YES
I want to get the interpolated value of an animation while it is running, just like the code below:
Animator(begin:100,end:200).addListener({value in
//here will be called when animation updates.
print(value)
})
But I can't find any API like the code above in the iOS SDK, and I don't want to create a UIView or CALayer just to read its presentation layer for this value.
So how can I do that? 🤔
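The closest thing I can think of is driving the interpolation myself with a CADisplayLink, roughly like the sketch below (the Animator type and its API here are my own, not anything from the SDK, and it only does linear interpolation), but I'd prefer a built-in API if one exists.
import UIKit

// A display-link driven animator: interpolates between two values and reports every frame.
final class Animator {
    private let begin: CGFloat
    private let end: CGFloat
    private let duration: CFTimeInterval
    private var startTime: CFTimeInterval = 0
    private var listener: ((CGFloat) -> Void)?
    private var displayLink: CADisplayLink?

    init(begin: CGFloat, end: CGFloat, duration: CFTimeInterval = 0.25) {
        self.begin = begin
        self.end = end
        self.duration = duration
    }

    func addListener(_ listener: @escaping (CGFloat) -> Void) {
        self.listener = listener
    }

    func start() {
        startTime = CACurrentMediaTime()
        // The display link retains its target, so the animator stays alive until it finishes.
        displayLink = CADisplayLink(target: self, selector: #selector(tick))
        displayLink?.add(to: .main, forMode: .common)
    }

    @objc private func tick() {
        // Linear progress; any easing curve could be applied here instead.
        let progress = min((CACurrentMediaTime() - startTime) / duration, 1)
        listener?(begin + (end - begin) * CGFloat(progress))
        if progress >= 1 { displayLink?.invalidate() }
    }
}

let animator = Animator(begin: 100, end: 200, duration: 2)
animator.addListener { value in
    print(value) // called every frame while the animation runs
}
animator.start()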
I have an AVAudioEngine, but I don't know how to export the audio that the engine produces to a file. Can anyone help?
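As far as I understand the offline manual rendering mode, something like the sketch below should work (engine and player are assumed to be an already configured AVAudioEngine and an attached AVAudioPlayerNode, and inputURL / outputURL are assumed), but I'm not sure whether this is the intended way:
import AVFoundation

func exportEngineOutput(engine: AVAudioEngine,
                        player: AVAudioPlayerNode,
                        inputURL: URL,
                        outputURL: URL) throws {
    let sourceFile = try AVAudioFile(forReading: inputURL)
    let format = sourceFile.processingFormat

    // Switch the (stopped) engine to offline manual rendering before starting it.
    engine.stop()
    try engine.enableManualRenderingMode(.offline, format: format, maximumFrameCount: 4096)
    try engine.start()
    player.scheduleFile(sourceFile, at: nil)
    player.play()

    let outputFile = try AVAudioFile(forWriting: outputURL,
                                     settings: sourceFile.fileFormat.settings)
    let buffer = AVAudioPCMBuffer(pcmFormat: engine.manualRenderingFormat,
                                  frameCapacity: engine.manualRenderingMaximumFrameCount)!

    // Pull rendered audio from the engine in chunks and append each chunk to the file.
    while engine.manualRenderingSampleTime < sourceFile.length {
        let framesLeft = sourceFile.length - engine.manualRenderingSampleTime
        let framesToRender = min(AVAudioFrameCount(framesLeft), buffer.frameCapacity)
        if try engine.renderOffline(framesToRender, to: buffer) == .success {
            try outputFile.write(from: buffer)
        }
    }
}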
I wrote the code below to pass my custom object Engine through the MTAudioProcessingTapCallbacks.
Here is the code:
func getTap() -> MTAudioProcessingTap? {
    var tap: Unmanaged<MTAudioProcessingTap>?

    func onInit(tap: MTAudioProcessingTap, clientInfo: UnsafeMutableRawPointer?, tapStorageOut: UnsafeMutablePointer<UnsafeMutableRawPointer?>) {
        let engine = Engine()
        tapStorageOut.pointee = Unmanaged<Engine>.passUnretained(engine).toOpaque()
    }

    var callback = MTAudioProcessingTapCallbacks(version: kMTAudioProcessingTapCallbacksVersion_0, clientInfo: nil, init: onInit, finalize: nil, prepare: nil, unprepare: nil) { tap, numberFrames, flags, bufferListInOut, numberFramesOut, flagsOut in
        guard MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut, flagsOut, nil, numberFramesOut) == noErr else {
            preconditionFailure()
        }
        let storage = MTAudioProcessingTapGetStorage(tap)
        let engine = Unmanaged<Engine>.fromOpaque(storage).takeUnretainedValue()
        // This line crashed:
        // ClientProcessingTapManager (14): EXC_BAD_ACCESS (code=1, address=0x544453e46ea0)
        engine.dealWith(bufferPtr: bufferListInOut)
    }

    guard MTAudioProcessingTapCreate(kCFAllocatorDefault, &callback, kMTAudioProcessingTapCreationFlag_PostEffects, &tap) == noErr else {
        fatalError()
    }
    return tap?.takeRetainedValue()
}
How can I do it?
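My current guess (and it is only a guess) is that passUnretained does not keep the locally created Engine alive, so it may already have been deallocated by the time the process callback runs. A sketch of what I'm considering instead, retaining the object in the init callback and releasing it in finalize:
func tapInit(tap: MTAudioProcessingTap, clientInfo: UnsafeMutableRawPointer?, tapStorageOut: UnsafeMutablePointer<UnsafeMutableRawPointer?>) {
    // passRetained keeps a +1 reference alive inside the tap's storage.
    tapStorageOut.pointee = Unmanaged.passRetained(Engine()).toOpaque()
}

func tapFinalize(tap: MTAudioProcessingTap) {
    // Balance the +1 from tapInit when the tap is destroyed.
    Unmanaged<Engine>.fromOpaque(MTAudioProcessingTapGetStorage(tap)).release()
}
// ...and pass these as init: and finalize: when creating MTAudioProcessingTapCallbacks.
Is that the right way to manage the object's lifetime here?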
I have an AVMutableAudioMix and use an MTAudioProcessingTap to process the audio data. But after I pass the buffer to an AVAudioEngine and render it with renderOffline, the audio has no effects at all... How can I do this? Any ideas?
Here is the code for the MTAudioProcessingTapProcessCallback:
var callback = MTAudioProcessingTapCallbacks(version: kMTAudioProcessingTapCallbacksVersion_0, clientInfo: UnsafeMutableRawPointer(Unmanaged.passUnretained(self.engine).toOpaque()), init: tapInit, finalize: tapFinalize, prepare: tapPrepare, unprepare: tapUnprepare) { tap, numberFrames, flags, bufferListInOut, numberFramesOut, flagsOut in
    guard MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut, flagsOut, nil, numberFramesOut) == noErr else {
        preconditionFailure()
    }
    let storage = MTAudioProcessingTapGetStorage(tap)
    let engine = Unmanaged<Engine>.fromOpaque(storage).takeUnretainedValue()
    // render the audio with effect
    engine.render(bufferPtr: bufferListInOut, numberOfFrames: numberFrames)
}
And here is the Engine code
class Engine {
    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()
    let pitchEffect = AVAudioUnitTimePitch()
    let reverbEffect = AVAudioUnitReverb()
    let rateEffect = AVAudioUnitVarispeed()
    let volumeEffect = AVAudioUnitEQ()
    let format = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: 44100, channels: 2, interleaved: false)!

    init() {
        engine.attach(player)
        engine.attach(pitchEffect)
        engine.attach(reverbEffect)
        engine.attach(rateEffect)
        engine.attach(volumeEffect)
        engine.connect(player, to: pitchEffect, format: format)
        engine.connect(pitchEffect, to: reverbEffect, format: format)
        engine.connect(reverbEffect, to: rateEffect, format: format)
        engine.connect(rateEffect, to: volumeEffect, format: format)
        engine.connect(volumeEffect, to: engine.mainMixerNode, format: format)
        try! engine.enableManualRenderingMode(.offline, format: format, maximumFrameCount: 4096)
        reverbEffect.loadFactoryPreset(AVAudioUnitReverbPreset.largeRoom2)
        reverbEffect.wetDryMix = 100
        pitchEffect.pitch = 2100
        try! engine.start()
        player.play()
    }

    func render(bufferPtr: UnsafeMutablePointer<AudioBufferList>, numberOfFrames: CMItemCount) {
        let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: 4096)!
        buffer.frameLength = AVAudioFrameCount(numberOfFrames)
        buffer.mutableAudioBufferList.pointee = bufferPtr.pointee
        self.player.scheduleBuffer(buffer) {
            try! self.engine.renderOffline(AVAudioFrameCount(numberOfFrames), to: buffer)
        }
    }
}
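For reference, here is a rough sketch of an alternative render implementation I'm considering: it renders synchronously inside the tap callback instead of inside the scheduleBuffer completion handler. It assumes the engine is already in .offline manual rendering mode (as in init()) and that the tap's stream format matches format; I don't know whether this is the right approach.
func render(bufferPtr: UnsafeMutablePointer<AudioBufferList>, numberOfFrames: CMItemCount) {
    let frames = AVAudioFrameCount(numberOfFrames)
    guard let input = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frames),
          let output = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frames) else { return }
    input.frameLength = frames

    // Copy the tap's source samples into the input buffer.
    let tapBuffers = UnsafeMutableAudioBufferListPointer(bufferPtr)
    let inputBuffers = UnsafeMutableAudioBufferListPointer(input.mutableAudioBufferList)
    for (src, dst) in zip(tapBuffers, inputBuffers) {
        memcpy(dst.mData, src.mData, Int(src.mDataByteSize))
    }

    // Feed the input to the player and pull it through the effect chain synchronously.
    player.scheduleBuffer(input, completionHandler: nil)
    guard (try? engine.renderOffline(frames, to: output)) == .success else { return }

    // Copy the processed samples back into the tap's buffer list.
    let outputBuffers = UnsafeMutableAudioBufferListPointer(output.mutableAudioBufferList)
    for (src, dst) in zip(outputBuffers, tapBuffers) {
        memcpy(dst.mData, src.mData, Int(src.mDataByteSize))
    }
}
I'm also aware that the time-pitch and varispeed units can change how much output is produced for a given amount of input, so a strict frame-for-frame mapping inside the tap may not line up.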
The GarageBand app can import both a MIDI file and a recorded audio file into a single player and play them together.
My app needs the same feature, but I don't know how to implement it.
I have tried AVAudioSequencer, but it can only load and play MIDI files.
I have tried AVPlayer and AVPlayerItem, but it seems they can't load a MIDI file.
So how can I combine a MIDI file and an audio file into a single AVPlayerItem (or anything else) to play them together?
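The closest thing I've sketched so far drives both from one AVAudioEngine: an AVAudioSequencer plays the MIDI file through an AVAudioUnitSampler, and an AVAudioPlayerNode plays the recorded audio. In the sketch below, midiURL, audioURL, and soundBankURL are assumed inputs, and this isn't an AVPlayerItem, so I'm not sure it's the right direction:
import AVFoundation
import AudioToolbox

func playMIDIAndAudio(midiURL: URL, audioURL: URL, soundBankURL: URL) throws {
    let engine = AVAudioEngine()
    let sampler = AVAudioUnitSampler()
    let player = AVAudioPlayerNode()

    engine.attach(sampler)
    engine.attach(player)
    engine.connect(sampler, to: engine.mainMixerNode, format: nil)
    engine.connect(player, to: engine.mainMixerNode, format: nil)

    // The sequencer plays the loaded MIDI data through the attached instrument node.
    let sequencer = AVAudioSequencer(audioEngine: engine)
    try sequencer.load(from: midiURL, options: [])
    try sampler.loadSoundBankInstrument(at: soundBankURL,
                                        program: 0,
                                        bankMSB: UInt8(kAUSampler_DefaultMelodicBankMSB),
                                        bankLSB: UInt8(kAUSampler_DefaultBankLSB))

    let audioFile = try AVAudioFile(forReading: audioURL)
    try engine.start()

    // Start the audio file and the MIDI sequence together.
    player.scheduleFile(audioFile, at: nil)
    sequencer.prepareToPlay()
    player.play()
    try sequencer.start()
}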
This is the crash log from Firebase.
Fatal Exception: NSInvalidArgumentException
*** -[AVAssetWriter addInput:] Format ID 'lpcm' is not compatible with file type com.apple.m4a-audio
But I can't reproduce the crash ...
This is the demo code. Does anyone know where the problem is?
let normalOutputSettings: [String: Any] = [
    AVFormatIDKey: kAudioFormatLinearPCM,
    AVSampleRateKey: 44100,
    AVNumberOfChannelsKey: 2,
    AVLinearPCMBitDepthKey: 16,
    AVLinearPCMIsNonInterleaved: false,
    AVLinearPCMIsFloatKey: false,
    AVLinearPCMIsBigEndianKey: false
]
let writerInput = AVAssetWriterInput(mediaType: .audio, outputSettings: normalOutputSettings)
let outputURL = URL(fileURLWithPath: NSTemporaryDirectory() + UUID().uuidString + ".m4a")
self.writer = try! AVAssetWriter(outputURL: outputURL, fileType: fileType)
writer?.add(writerInput)
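My current reading of the exception (which may be wrong) is that linear PCM output settings simply aren't accepted for the com.apple.m4a-audio file type, so either the settings should be AAC when writing .m4a, or the LPCM settings should go with a container such as .wav or .caf. A sketch of the AAC variant:
let aacOutputSettings: [String: Any] = [
    AVFormatIDKey: kAudioFormatMPEG4AAC,
    AVSampleRateKey: 44100,
    AVNumberOfChannelsKey: 2,
    AVEncoderBitRateKey: 128_000
]
let aacWriterInput = AVAssetWriterInput(mediaType: .audio, outputSettings: aacOutputSettings)
let writer = try! AVAssetWriter(outputURL: outputURL, fileType: .m4a) // outputURL as in the demo above
if writer.canAdd(aacWriterInput) {
    writer.add(aacWriterInput)
}
But since I can't reproduce the crash, I'd still like to understand which code path produces the LPCM/m4a combination.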
I use this method to check whether a device supports the captureTextFromCamera API:
let tf = UITextField()
tf.canPerformAction(#selector(UIResponder.captureTextFromCamera(_:)), withSender: nil)
This returns false on iPhone X and true on iPhone 13 Pro.
So I want to know: on iPhone 13 Pro, can
tf.canPerformAction(#selector(UIResponder.captureTextFromCamera(_:)), withSender: nil)
ever return false? Or is the value fixed for a given device?
Can I assume that this method will always return true on iPhone 13 Pro and always return false on iPhone X?
AVAudioUnitTimePitch.latency is 0.09 s on my debug devices.
This introduces a small delay when rendering audio with AVAudioEngine.
I just want to change the pitch while the audio is playing.
So how can I avoid this latency?
I have set CADisableMinimumFrameDurationOnPhone to YES in Info.plist.
My device is an iPhone 13 Pro, whose maximum refresh rate is 120 Hz.
But the touchesMoved callback is called at a frequency of 60 Hz (16.66 ms) in my UIViewController; it should be 120 Hz (about 8 ms) on an iPhone 13 Pro.
Here is the test code:
import UIKit

class HomeViewController: UIViewController, UIScrollViewDelegate {

    var oldTime: CFAbsoluteTime = 0

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        let time = CFAbsoluteTimeGetCurrent()
        print(time - oldTime) // ~0.016 s, but it should be ~0.008 s on iPhone 13 Pro
        oldTime = time
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        self.view.backgroundColor = .red
    }
}
So how can I increase this frequency to 120 Hz?
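One thing I've considered (I'm not sure it helps) is reading the coalesced touches from the UIEvent inside the same view controller, which at least exposes the intermediate samples even when touchesMoved itself arrives at 60 Hz; I don't know whether the delivery frequency itself can be raised:
override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
    guard let touch = touches.first, let event = event else { return }
    // Coalesced touches carry the samples that arrived since the previous
    // touchesMoved delivery, each with its own timestamp and location.
    for coalesced in event.coalescedTouches(for: touch) ?? [] {
        print(coalesced.timestamp, coalesced.location(in: view))
    }
}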
I add two views on screen: one blue, one orange.
I use a CADisplayLink to update the orange view's frame from blueBox.layer.presentation()?.frame.
The orange view should cover the blue view completely, but during the keyboard animation there is a gap and a small part of the blue view is visible.
Here is the demo code:
class SecondViewController: UIViewController {

    let textField = UITextField()

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        self.textField.resignFirstResponder()
    }

    lazy var displayLink = CADisplayLink(target: self, selector: #selector(self.onDisplayLink))

    @objc private func onDisplayLink() {
        self.realBox.frame = CGRect(x: self.realBox.frame.origin.x,
                                    y: self.blueBox.layer.presentation()!.frame.origin.y,
                                    width: self.view.bounds.width,
                                    height: 100)
    }

    let blueBox = UIView()
    let realBox = UIView()

    override func viewDidLoad() {
        super.viewDidLoad()
        self.view.backgroundColor = .white

        textField.frame = .init(x: 100, y: 100, width: 100, height: 100)
        textField.backgroundColor = .red
        self.view.addSubview(textField)

        realBox.backgroundColor = .orange
        blueBox.backgroundColor = .blue
        blueBox.frame = .init(x: 0, y: self.view.bounds.height - 100, width: self.view.bounds.width, height: 100)
        self.view.addSubview(blueBox)
        self.view.addSubview(realBox)
        realBox.frame = .init(x: 0, y: self.view.bounds.height - 100, width: self.view.bounds.width, height: 100)

        NotificationCenter.default.addObserver(forName: UIResponder.keyboardWillChangeFrameNotification, object: nil, queue: .main) { noti in
            let userInfo = noti.userInfo!
            let endFrame = userInfo[UIResponder.keyboardFrameEndUserInfoKey] as! CGRect
            let isOpen = endFrame.intersects(self.view.bounds)
            self.blueBox.frame = .init(x: 0,
                                       y: isOpen ? self.view.bounds.height - 100 - endFrame.height : self.view.bounds.height - 100,
                                       width: self.view.bounds.width,
                                       height: 100)
        }

        if #available(iOS 15.0, *) {
            self.displayLink.preferredFrameRateRange = .init(minimum: 60, maximum: 120, preferred: 120)
        } else {
            self.displayLink.preferredFramesPerSecond = 120
        }
        self.displayLink.add(to: .main, forMode: .common)
    }
}
So how can I get an accurate value for the frame of the layer as it is currently displayed on screen?
Here is the video:
https://github.com/luckysmg/daily_images/blob/main/RPReplay_Final1661168764.mov?raw=true
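For what it's worth, on iOS 15+ I could probably avoid sampling the presentation layer entirely by pinning the orange view to view.keyboardLayoutGuide and letting Auto Layout follow the keyboard, roughly as sketched below; but I'd still like to know how to read an accurate on-screen frame in the general case.
// Sketch: constrain realBox to the keyboard layout guide instead of driving its
// frame from a CADisplayLink (iOS 15+ only).
if #available(iOS 15.0, *) {
    realBox.translatesAutoresizingMaskIntoConstraints = false
    NSLayoutConstraint.activate([
        realBox.bottomAnchor.constraint(equalTo: view.keyboardLayoutGuide.topAnchor),
        realBox.leadingAnchor.constraint(equalTo: view.leadingAnchor),
        realBox.trailingAnchor.constraint(equalTo: view.trailingAnchor),
        realBox.heightAnchor.constraint(equalToConstant: 100)
    ])
}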
Here is the old document: https://developer.apple.com/library/archive/documentation/DeviceInformation/Reference/iOSDeviceCompatibility/Displays/Displays.html#//apple_ref/doc/uid/TP40013599-CH108-SW1
Where is the new version of it?
Is the top padding of a UINavigationBar equal to
UIApplication.shared.statusBarFrame.height
or
UIApplication.shared.keyWindow?.safeAreaInsets.top ?
Here is an image; see the blue box in it:
https://github.com/luckysmg/daily_images/blob/main/20220928153515.jpg?raw=true
I'm asking because the two values are the same on devices older than the iPhone 14 Pro, but they differ on the iPhone 14 Pro and iPhone 14 Pro Max.
Which is the right value to use?
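For reference, a small sketch that logs both values at runtime so they can be compared per device (using the scene's statusBarManager, since statusBarFrame on UIApplication is deprecated):
if let scene = UIApplication.shared.connectedScenes.first as? UIWindowScene,
   let window = scene.windows.first {
    let statusBarHeight = scene.statusBarManager?.statusBarFrame.height ?? 0
    let safeAreaTop = window.safeAreaInsets.top
    print("statusBar:", statusBarHeight, "safeAreaTop:", safeAreaTop)
}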
The keyboard animation is private. How can we get the interpolated values of the keyboard animation using an API?