The screenshot says it all: a very irritating issue in Xcode where, for many built-in symbols, the selection menu shows "Cut, Copy, ..." instead of "Jump to Definition". It happens most often when jumping from Swift to Objective-C symbols. Is there any fix?
I have a RemoteIO unit that successfully plays back microphone samples in real time through attached headphones. I need to port the same functionality to AVAudioEngine, but I can't seem to get a head start. Here is my code; all I do is connect the inputNode to a playerNode, which crashes.
var engine: AVAudioEngine!
var playerNode: AVAudioPlayerNode!
var mixer: AVAudioMixerNode!
var engineRunning = false

private func setupAudioSession() {
    let options: AVAudioSession.CategoryOptions = [.allowBluetooth, .allowBluetoothA2DP]
    do {
        try AVAudioSession.sharedInstance().setCategory(AVAudioSession.Category.playAndRecord, mode: AVAudioSession.Mode.default, options: options)
        try AVAudioSession.sharedInstance().setAllowHapticsAndSystemSoundsDuringRecording(true)
    } catch {
        MPLog("Could not set audio session category")
    }
    let audioSession = AVAudioSession.sharedInstance()
    do {
        try audioSession.setActive(false)
        try audioSession.setPreferredSampleRate(Double(44100))
    } catch {
        print("Unable to deactivate Audio session")
    }
    do {
        try audioSession.setActive(true)
    } catch {
        print("Unable to activate AudioSession")
    }
}

private func setupAudioEngine() {
    self.engine = AVAudioEngine()
    self.playerNode = AVAudioPlayerNode()
    self.engine.attach(self.playerNode)
    engine.connect(self.engine.inputNode, to: self.playerNode, format: nil)
    do {
        try self.engine.start()
    } catch {
        print("error couldn't start engine")
    }
    engineRunning = true
}
But starting AVAudioEngine causes a crash:
libc++abi: terminating with uncaught exception of type NSException
*** Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason:
'required condition is false: inDestImpl->NumberInputs() > 0 || graphNodeDest->CanResizeNumberOfInputs()'
terminating with uncaught exception of type NSException
How do I get real-time recording and playback of mic samples through headphones working?
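For reference, the assertion seems to complain that AVAudioPlayerNode has no input busses, so connecting the input node into it can never work. Below is a minimal passthrough sketch, assuming the audio session from setupAudioSession() is already active, that connects the input straight to the main mixer instead:
private func setupPassthroughEngine() {
    engine = AVAudioEngine()
    let input = engine.inputNode
    let format = input.outputFormat(forBus: 0)   // use the hardware format to avoid a format mismatch
    engine.connect(input, to: engine.mainMixerNode, format: format)
    engine.prepare()
    do {
        try engine.start()
        engineRunning = true
    } catch {
        print("Could not start engine: \(error)")
        engineRunning = false
    }
}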
I see there is a deprecation warning when using detailTextLabel of UITableViewCell.
@available(iOS, introduced: 3.0, deprecated: 100000, message: "Use UIListContentConfiguration instead, this property will be deprecated in a future release.")
open var detailTextLabel: UILabel? { get } // default is nil. label will be created if necessary (and the current style supports a detail label).
But it is not clear how to use UIListContentConfiguration to reproduce a detailTextLabel that sits on the right side of the cell. I only see secondaryText in UIListContentConfiguration, and it is always displayed as a subtitle. How does one use UIListContentConfiguration as a replacement?
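For the right-aligned detail, the valueCell() variant of UIListContentConfiguration appears to be the intended replacement for the .value1 style. A sketch, assuming iOS 14+ and that this runs inside tableView(_:cellForRowAt:):
var content = UIListContentConfiguration.valueCell()   // secondaryText is laid out trailing, like .value1
content.text = "Title"
content.secondaryText = "Detail"
cell.contentConfiguration = content
The subtitleCell() variant keeps the stacked subtitle layout if that is ever needed instead.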
I have a UITableViewController with a grouped table view. No matter what I try, I can't match the dark mode colors of the native Settings app on iOS 14. I tried the following:
self.tableView.backgroundColor = UIColor.systemGroupedBackground
And in tableView(_:cellForRowAt:), I set
cell.backgroundColor = UIColor.secondarySystemGroupedBackground
This matches colors for light mode but not for dark mode.
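One thing I have not tried yet (a sketch, assuming iOS 14 and the newer configuration API): let the system resolve the grouped-cell background instead of hard-coding colors, which should track whatever the system grouped appearance resolves to in both light and dark mode.
cell.backgroundConfiguration = UIBackgroundConfiguration.listGroupedCell()
tableView.backgroundColor = UIColor.systemGroupedBackground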
I display my view on an external display using UIScene, selecting an appropriate UISceneConfiguration as follows:
// MARK: UISceneSession Lifecycle
@available(iOS 13.0, *)
func application(_ application: UIApplication, configurationForConnecting connectingSceneSession: UISceneSession, options: UIScene.ConnectionOptions) -> UISceneConfiguration {
    // Called when a new scene session is being created.
    // Use this method to select a configuration to create the new scene with.
    // return UISceneConfiguration(name: "Default Configuration", sessionRole: connectingSceneSession.role)
    // This is not necessary; however, I found it useful for debugging
    switch connectingSceneSession.role {
    case .windowApplication:
        return UISceneConfiguration(name: "Default Configuration", sessionRole: connectingSceneSession.role)
    case .windowExternalDisplay:
        return UISceneConfiguration(name: "External Screen", sessionRole: connectingSceneSession.role)
    default:
        fatalError("Unknown Configuration \(connectingSceneSession.role.rawValue)")
    }
}
The above API is called automatically when an external screen is connected or disconnected. My question is whether there is any way or API to disable/enable the external screen display at runtime (without the user disconnecting the HDMI cable)?
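The closest thing I can think of (a sketch, not verified): tearing down and re-requesting the external-display scene session at runtime. externalDisplaySession is assumed to be stored somewhere when the scene connects, and whether requestSceneSessionActivation actually re-creates a scene for the external-display role is an open question:
func setExternalDisplay(enabled: Bool, session externalDisplaySession: UISceneSession) {
    if enabled {
        // Ask the system to bring the external-display scene back.
        UIApplication.shared.requestSceneSessionActivation(externalDisplaySession,
                                                           userActivity: nil,
                                                           options: nil,
                                                           errorHandler: nil)
    } else {
        // Tear the external-display scene down without unplugging the cable.
        UIApplication.shared.requestSceneSessionDestruction(externalDisplaySession,
                                                            options: nil,
                                                            errorHandler: nil)
    }
}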
I see that AVPlayerViewController automatically hides the video preview on the iOS device the moment it detects an external display. I want to present the video preview on the external display and on the iOS device at the same time, with the audio routed to the external screen via HDMI and not played back on the iOS device. Is it possible to do this using standard APIs? I can think of a few approaches, listed below, and would like to hear from the AVFoundation team and others which is recommended, or whether there is a better way out:
1. Create two AVPlayerViewControllers and two AVPlayers. One of them is attached to the externalScreen window, and both display the same local video file. Audio would be muted for the player controller displaying video on the iOS device. Some play/pause/seek syncing code would be required to keep both players in sync.
2. Use only one AVPlayerViewController that AirPlays to the external screen. Attach an AVPlayerItemVideoOutput to the playerItem, intercept the video frames, and display them using Metal/Core Image on the iOS device.
3. Use two AVPlayerLayers and add them to the external and device screens. Video frames would be replicated to both the external screen and the iOS device, but I am not sure whether it is possible to route audio only to the external screen via HDMI while keeping it muted locally (a sketch of this approach follows below).
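For reference, a minimal sketch of the third option, assuming externalWindow is a window attached to the external UIWindowScene, deviceView is a view in the device's own hierarchy, and videoURL points to the local file. A single AVPlayer can drive multiple AVPlayerLayer instances, so no manual syncing should be needed:
let player = AVPlayer(url: videoURL)
player.allowsExternalPlayback = false        // keep rendering local so both layers receive frames

let deviceLayer = AVPlayerLayer(player: player)
deviceLayer.frame = deviceView.bounds
deviceView.layer.addSublayer(deviceLayer)

let externalLayer = AVPlayerLayer(player: player)
externalLayer.frame = externalWindow.bounds
externalWindow.layer.addSublayer(externalLayer)

player.play()
With an HDMI adapter connected, system audio is routed to the HDMI output by default, so the device speaker should stay silent anyway; whether that holds in every configuration is something I still need to verify.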
I am using AVAudioSession with the playAndRecord category as follows:
private func setupAudioSessionForRecording() {
    let audioSession = AVAudioSession.sharedInstance()
    do {
        try audioSession.setActive(false)
        try audioSession.setPreferredSampleRate(Double(48000))
    } catch {
        NSLog("Unable to deactivate Audio session")
    }
    let options: AVAudioSession.CategoryOptions = [.allowAirPlay, .mixWithOthers]
    do {
        try AVAudioSession.sharedInstance().setCategory(AVAudioSession.Category.playAndRecord, mode: AVAudioSession.Mode.default, options: options)
    } catch {
        NSLog("Could not set audio session category \(error)")
    }
    do {
        try audioSession.setActive(true)
    } catch {
        NSLog("Unable to activate AudioSession")
    }
}
Next, I use AVAudioEngine to repeat what I say into the microphone through the external speakers (on a TV connected to the iPhone with an HDMI cable).
// MARK: - AudioEngine
var engine: AVAudioEngine!
var playerNode: AVAudioPlayerNode!
var mixer: AVAudioMixerNode!
var audioEngineRunning = false

public func setupAudioEngine() {
    self.engine = AVAudioEngine()
    engine.connect(self.engine.inputNode, to: self.engine.outputNode, format: nil)
    do {
        engine.prepare()
        try self.engine.start()
    } catch {
        print("error couldn't start engine")
    }
    audioEngineRunning = true
}

public func stopAudioEngine() {
    engine.stop()
    audioEngineRunning = false
}
The issue is that after I speak for a few seconds, I hear a kind of reverb/humming noise that keeps getting amplified and repeated. If I use a RemoteIO unit instead, no such noise comes out of the speakers. I am not sure whether my AVAudioEngine setup is correct. I have tried all kinds of AVAudioSession configurations, but nothing changes.
A sample recording of the background speaker noise is posted in this Stack Overflow question: https://stackoverflow.com/questions/72170548/echo-when-using-avaudioengine-over-hdmi#comment127514327_72170548
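One guess is that this is acoustic feedback: the mic keeps picking up the TV speakers and the engine re-amplifies its own output. A sketch of something worth trying, assuming iOS 13+, is enabling the system's built-in voice processing (echo cancellation) on the input node before wiring the graph:
public func setupAudioEngine() {
    self.engine = AVAudioEngine()
    do {
        // Voice processing adds echo cancellation / automatic gain control on the I/O unit.
        try engine.inputNode.setVoiceProcessingEnabled(true)
    } catch {
        print("Could not enable voice processing: \(error)")
    }
    engine.connect(engine.inputNode, to: engine.outputNode, format: engine.inputNode.outputFormat(forBus: 0))
    engine.prepare()
    do {
        try engine.start()
        audioEngineRunning = true
    } catch {
        print("error couldn't start engine: \(error)")
        audioEngineRunning = false
    }
}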
I am trying to use AVAudioEngine to listen to mic samples and play them back simultaneously through an external speaker or headphones (assuming they are attached to the iOS device). I tried the following using AVAudioPlayerNode and it works, but there is too much delay in the audio playback. Is there a way to hear the sound in real time without delay? I wonder why the scheduleBuffer API has so much delay.
var engine: AVAudioEngine!
var playerNode: AVAudioPlayerNode!
var mixer: AVAudioMixerNode!
var audioEngineRunning = false

public func setupAudioEngine() {
    self.engine = AVAudioEngine()
    let input = engine.inputNode
    let format = input.inputFormat(forBus: 0)
    playerNode = AVAudioPlayerNode()
    engine.attach(playerNode)
    self.mixer = engine.mainMixerNode
    engine.connect(self.playerNode, to: self.mixer, format: playerNode.outputFormat(forBus: 0))
    engine.inputNode.installTap(onBus: 0, bufferSize: 4096, format: format, block: { buffer, time in
        self.playerNode.scheduleBuffer(buffer, completionHandler: nil)
    })
    do {
        engine.prepare()
        try self.engine.start()
        audioEngineRunning = true
        self.playerNode.play()
    } catch {
        print("error couldn't start engine")
        audioEngineRunning = false
    }
}
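For reference, two adjustments I am considering (a sketch, assuming the delay comes mostly from the default I/O buffer plus the tap-to-scheduleBuffer hop): ask the session for a smaller I/O buffer, and for pure passthrough skip the player node entirely by connecting the input to the main mixer, so nothing has to be scheduled at all:
try? AVAudioSession.sharedInstance().setPreferredIOBufferDuration(0.005)   // ~5 ms; the hardware may round this up

engine = AVAudioEngine()
let input = engine.inputNode
let format = input.outputFormat(forBus: 0)
engine.connect(input, to: engine.mainMixerNode, format: format)   // no tap, no scheduleBuffer
engine.prepare()
try? engine.start()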
I have a UISceneConfiguration for the external screen, which is used when an external display is connected to the iOS device.
// MARK: UISceneSession Lifecycle
@available(iOS 13.0, *)
func application(_ application: UIApplication, configurationForConnecting connectingSceneSession: UISceneSession, options: UIScene.ConnectionOptions) -> UISceneConfiguration {
    // Called when a new scene session is being created.
    // Use this method to select a configuration to create the new scene with.
    switch connectingSceneSession.role {
    case .windowApplication:
        return UISceneConfiguration(name: "Default Configuration", sessionRole: connectingSceneSession.role)
    case .windowExternalDisplay:
        return UISceneConfiguration(name: "External Screen", sessionRole: connectingSceneSession.role)
    default:
        fatalError("Unknown Configuration \(connectingSceneSession.role.rawValue)")
    }
}
I display a custom view on the external screen this way, in a new UIScene linked to the external display. But the problem now is that if I also have an AVPlayerViewController in the flow of the application, it no longer displays to the external screen. I suppose AVPlayerViewController does its own configuration for external display playback, but now that I have a custom view embedded on the external screen, it is unable to override it. What do I need to do so that AVPlayerViewController can display content on the external screen the way it normally does?
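I am not aware of a documented way to hand the display back while my custom scene holds it. The only workaround I can think of (an untested sketch) is to tear the custom external-display scene down just before presenting the player, so the system falls back to its normal external playback behaviour; playerViewController is assumed to already exist:
if let externalSession = UIApplication.shared.openSessions.first(where: { $0.role == .windowExternalDisplay }) {
    UIApplication.shared.requestSceneSessionDestruction(externalSession, options: nil, errorHandler: nil)
}
present(playerViewController, animated: true)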
I have an AVPlayerLayer and AVPlayer setup for playback on external screen as follows:
var player = AVPlayer()
playerView.player = player
player.usesExternalPlaybackWhileExternalScreenIsActive = true
player.allowsExternalPlayback = true
playerView is just a UIView that has an AVPlayerLayer as its main layer. This code works and automatically starts displaying and playing the video on the external screen. The thing is, I want an option to invert the AVPlayerLayer on the external screen. I tried setting a transform on playerView, but it is ignored on the external screen. How do I gain more control over the external screen window?
I also tried manually adding playerView to the external screen's window and setting
player.usesExternalPlaybackWhileExternalScreenIsActive = true
I can also display the AVPlayerLayer manually this way. But again, setting a transform on this screen has no effect on the external display, so it may also be a UIKit issue.
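One more thing I intend to try (a sketch, not verified on hardware): with external playback disabled, the external window just hosts an ordinary AVPlayerLayer, so a transform applied to the layer itself, rather than to the hosting view, might stick. externalWindow is assumed to be the UIWindow on the external UIWindowScene:
player.allowsExternalPlayback = false
player.usesExternalPlaybackWhileExternalScreenIsActive = false

let playerLayer = AVPlayerLayer(player: player)
playerLayer.frame = externalWindow.bounds
playerLayer.setAffineTransform(CGAffineTransform(scaleX: 1, y: -1))   // vertical flip on the layer, not the view
externalWindow.layer.addSublayer(playerLayer)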
I need to implement a text editor using UITextView that supports:
Bold/Italic/Underline
Color, font, and font size changes
Paragraph alignment
List format (bullets, numbers, etc.)
Selecting text anywhere in the text view and changing the properties of the selection
So far I have managed to do it without NSTextStorage, but it seems I am hitting limits. For instance, to change the font, I use UIFontPickerViewController and apply the chosen font as follows:
func fontPickerViewControllerDidPickFont(_ viewController: UIFontPickerViewController) {
    if let selectedFontDesc = viewController.selectedFontDescriptor {
        let font = UIFont(descriptor: selectedFontDesc, size: selectedFontDesc.pointSize)
        self.selectedFont = font
        self.textView.typingAttributes = [
            NSAttributedString.Key.foregroundColor: self.selectedColor ?? UIColor.white,
            NSAttributedString.Key.font: self.selectedFont ?? UIFont.preferredFont(forTextStyle: .body, compatibleWith: nil)
        ]
        if let range = self.textView.selectedTextRange, let selectedFont = selectedFont {
            let attributedText = NSMutableAttributedString(attributedString: self.textView.attributedText)
            let location = textView.offset(from: textView.beginningOfDocument, to: range.start)
            let length = textView.offset(from: range.start, to: range.end)
            let nsRange = NSRange(location: location, length: length)
            attributedText.setAttributes([NSAttributedString.Key.font: selectedFont], range: nsRange)
            self.textView.attributedText = attributedText
        }
    }
}
This works, but the problem is that it resets the color and other properties of the selected text. I need a way to leave the existing attributes of the selected text undisturbed. I suspect the way to do it is by using NSTextStorage, but I can't find anything good on the internet that explains the right use of NSTextStorage to achieve this.
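A sketch of what might avoid the reset, using the same variables as in the snippet above: addAttributes(_:range:) merges into whatever is already set on the range, whereas setAttributes(_:range:) replaces every attribute in it.
let attributedText = NSMutableAttributedString(attributedString: textView.attributedText)
attributedText.addAttributes([.font: selectedFont], range: nsRange)   // color, underline, etc. on the range are preserved
textView.attributedText = attributedText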
I am trying to develop a tone curve filter using Metal or Core Image, because I find the CIToneCurve filter limiting (it accepts at most 5 points, the spline it uses is not documented, and sometimes the output is a black image even with 4 points). Moreover, it's not straightforward to have separate R, G, and B curves. I decided to explore other libraries that implement tone curves, and the only one I know of is GPUImage (a few others borrow code from the same library). But its source code is too cryptic to understand, and I have doubts about the way it generates the lookup texture: https://stackoverflow.com/questions/70516363/gpuimage-tone-curve-rgbcomposite-filter
Can someone explain how to correctly implement R, G, B, and RGB composite curve filters like in the Mac Photos app?
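A sketch of one possible direction, assuming the per-channel curves can be evaluated as plain Swift closures mapping 0...1 to 0...1 (curveR/curveG/curveB below are hypothetical, e.g. sampled from a Catmull-Rom spline through the control points): bake the curves into a 3D lookup table and apply it with the built-in color cube filter, which sidesteps CIToneCurve's 5-point limit. An RGB composite curve could be folded in by applying its mapping after each per-channel curve while filling the cube.
import CoreImage
import CoreImage.CIFilterBuiltins

func applyToneCurves(to image: CIImage,
                     curveR: (Float) -> Float,
                     curveG: (Float) -> Float,
                     curveB: (Float) -> Float,
                     cubeSize: Int = 64) -> CIImage? {
    var cube = [Float]()
    cube.reserveCapacity(cubeSize * cubeSize * cubeSize * 4)
    for b in 0..<cubeSize {
        for g in 0..<cubeSize {
            for r in 0..<cubeSize {
                // Red varies fastest, as the color cube filter expects.
                cube.append(curveR(Float(r) / Float(cubeSize - 1)))
                cube.append(curveG(Float(g) / Float(cubeSize - 1)))
                cube.append(curveB(Float(b) / Float(cubeSize - 1)))
                cube.append(1.0)
            }
        }
    }
    let filter = CIFilter.colorCubeWithColorSpace()
    filter.inputImage = image
    filter.cubeDimension = Float(cubeSize)
    filter.cubeData = cube.withUnsafeBufferPointer { Data(buffer: $0) }
    if let srgb = CGColorSpace(name: CGColorSpace.sRGB) {
        filter.colorSpace = srgb
    }
    return filter.outputImage
}
Usage would just be applyToneCurves(to: ciImage, curveR: splineR, curveG: splineG, curveB: splineB), where splineR/splineG/splineB are whatever spline evaluators the UI produces.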
I have a UICollectionViewCell defined in a storyboard with a UILabel added to its contentView. The collection view uses a flow layout, and I return a fixed cell size from the flow layout delegate as follows:
let sizeOfItem = CGFloat(210.0)
func collectionView(_ collectionView: UICollectionView, layout collectionViewLayout: UICollectionViewLayout, sizeForItemAt indexPath: IndexPath) -> CGSize {
    return CGSize(width: sizeOfItem, height: sizeOfItem)
}
I added the following constraint for the UILabel in the storyboard, and the cell automatically starts resizing itself to match the size of the text in the UILabel. This is not what I want: I want the cell to be a fixed size and the label to autoshrink instead.
I even tried setting the label's contentHuggingPriority to the lowest value (i.e. 1). This stops the cell from auto-shrinking when the text in the label is small, but the cell still grows when the text is longer. I don't want that to happen either. I want the cell to remain a fixed size and the label to adapt its font and stay constrained within the cell.
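A sketch of the label-side settings that might help, assuming the cell size stays fixed by the flow layout delegate and label is the storyboard outlet: let the label shrink its font and give it low priorities so it never pushes outward on the cell.
label.adjustsFontSizeToFitWidth = true
label.minimumScaleFactor = 0.5
label.numberOfLines = 1
label.setContentCompressionResistancePriority(.defaultLow, for: .horizontal)
label.setContentHuggingPriority(.defaultLow, for: .horizontal)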
I have a UICollectionViewFlowLayout assigned to a UICollectionView. Its delegate method returns a fixed CGSize from sizeForItemAt. The problem is that if the UICollectionViewCell contains a UIImageView or any other dynamically sized view with Auto Layout constraints pinning its leading, trailing, top, and bottom edges to the contentView, the cell resizes depending on the size of the image in the imageView. Is it possible to get fixed-size cells with UICollectionViewFlowLayout? I tried setting intrinsicContentSize for the UIImageView in the storyboard, but it doesn't work.
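A sketch of what seems to be the underlying cause, assuming the layout comes from a storyboard: with Estimated Size left at Automatic (the Xcode 11+ default), the flow layout self-sizes cells from their Auto Layout constraints and effectively ignores sizeForItemAt. Turning estimation off restores the fixed size; the same can be done by setting Estimated Size to None in the storyboard.
if let flowLayout = collectionView.collectionViewLayout as? UICollectionViewFlowLayout {
    flowLayout.estimatedItemSize = .zero   // disable self-sizing so sizeForItemAt wins
}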
I have this code to adjust a UITextView when the keyboard is about to be displayed.
@objc func keyboardWillShow(notification: NSNotification) {
    if let rectValue = notification.userInfo?[UIResponder.keyboardFrameEndUserInfoKey] as? NSValue {
        let keyboardSize = rectValue.cgRectValue.size
        let contentInsets: UIEdgeInsets = UIEdgeInsets(top: 0, left: 0, bottom: keyboardSize.height - view.safeAreaInsets.bottom, right: 0)
        textView.contentInset = contentInsets
        textView.scrollIndicatorInsets = contentInsets
    }
}
I see that setting contentInset automatically creates a scrolling animation (after the keyboard appears fully) that makes the text at the cursor visible. The animation happens automatically and there is no way to disable it. Sometimes I want to scroll to a specific location in the textView when the keyboard is presented (preferably animating it along with the keyboard appearance), but the implicit contentInset animation makes that impossible: first the keyboard animates, then the contentInset animates, and only then can I animate to my chosen location in the textView. This is awkward. What is the right way to achieve this?
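A sketch of the direction I am considering, assuming the implicit scroll comes from UIKit reacting to the inset change: apply the inset without animation, then drive the scroll myself using the keyboard's own duration and curve from the notification. targetRect is a placeholder for wherever the scroll should actually land, and whether performWithoutAnimation fully suppresses the implicit adjustment is something I still need to verify.
@objc func keyboardWillShow(notification: NSNotification) {
    guard let userInfo = notification.userInfo,
          let endFrame = (userInfo[UIResponder.keyboardFrameEndUserInfoKey] as? NSValue)?.cgRectValue,
          let duration = userInfo[UIResponder.keyboardAnimationDurationUserInfoKey] as? Double,
          let curveRaw = userInfo[UIResponder.keyboardAnimationCurveUserInfoKey] as? UInt
    else { return }

    let insets = UIEdgeInsets(top: 0, left: 0,
                              bottom: endFrame.height - view.safeAreaInsets.bottom, right: 0)
    // Apply the inset without letting UIKit animate it.
    UIView.performWithoutAnimation {
        self.textView.contentInset = insets
        self.textView.scrollIndicatorInsets = insets
        self.textView.layoutIfNeeded()
    }

    // Scroll to my own target alongside the keyboard, using its duration and curve.
    let targetRect = textView.caretRect(for: textView.selectedTextRange?.start ?? textView.endOfDocument)
    UIView.animate(withDuration: duration, delay: 0,
                   options: UIView.AnimationOptions(rawValue: curveRaw << 16)) {
        self.textView.scrollRectToVisible(targetRect, animated: false)
    }
}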