I have a UISceneConfiguration for the external screen, which is used when an external display is connected to the iOS device:
// MARK: UISceneSession Lifecycle
@available(iOS 13.0, *)
func application(_ application: UIApplication, configurationForConnecting connectingSceneSession: UISceneSession, options: UIScene.ConnectionOptions) -> UISceneConfiguration {
    // Called when a new scene session is being created.
    // Use this method to select a configuration to create the new scene with.
    switch connectingSceneSession.role {
    case .windowApplication:
        return UISceneConfiguration(name: "Default Configuration", sessionRole: connectingSceneSession.role)
    case .windowExternalDisplay:
        return UISceneConfiguration(name: "External Screen", sessionRole: connectingSceneSession.role)
    default:
        fatalError("Unknown Configuration \(connectingSceneSession.role.rawValue)")
    }
}
This displays a custom view on the external screen, in a new UIScene tied to the external display. The problem is that if I also have an AVPlayerViewController in the app's flow, it no longer displays to the external screen. I suppose AVPlayerViewController does its own configuration for external-display playback, but now that I have a custom view embedded on the external screen, it is unable to override it. What do I need to do so that AVPlayerViewController can display content to the external screen the way it normally does?
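Update: the workaround I am considering (an assumption on my part, nothing in the docs confirms it) is to hand the screen back to the player by tearing down my custom external-display scene while the player is presented:

// externalSceneSession: a hypothetical property holding the UISceneSession
// saved when the external display connected. Destroying it should free the
// screen for AVPlayerViewController's own external playback (assumption).
if let externalSession = externalSceneSession {
    UIApplication.shared.requestSceneSessionDestruction(externalSession, options: nil)
}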
I am using AVAudioSession with the playAndRecord category as follows:
private func setupAudioSessionForRecording() {
    let audioSession = AVAudioSession.sharedInstance()
    do {
        try audioSession.setActive(false)
        try audioSession.setPreferredSampleRate(Double(48000))
    } catch {
        NSLog("Unable to deactivate Audio session")
    }
    let options: AVAudioSession.CategoryOptions = [.allowAirPlay, .mixWithOthers]
    do {
        try AVAudioSession.sharedInstance().setCategory(AVAudioSession.Category.playAndRecord, mode: AVAudioSession.Mode.default, options: options)
    } catch {
        NSLog("Could not set audio session category \(error)")
    }
    do {
        try audioSession.setActive(true)
    } catch {
        NSLog("Unable to activate AudioSession")
    }
}
Next, I use AVAudioEngine to repeat what I say into the microphone through the external speakers (on a TV connected to the iPhone with an HDMI cable).
// MARK: - AudioEngine
var engine: AVAudioEngine!
var playerNode: AVAudioPlayerNode!
var mixer: AVAudioMixerNode!
var audioEngineRunning = false

public func setupAudioEngine() {
    self.engine = AVAudioEngine()
    engine.connect(self.engine.inputNode, to: self.engine.outputNode, format: nil)
    do {
        engine.prepare()
        try self.engine.start()
        audioEngineRunning = true // only mark as running if start() succeeded
    } catch {
        print("error couldn't start engine")
    }
}

public func stopAudioEngine() {
    engine.stop()
    audioEngineRunning = false
}
The issue is that I hear a kind of reverb/humming noise a few seconds after I start speaking, and it keeps getting amplified and repeated. If I use a RemoteIO unit instead, no such noise comes out of the speakers. I am not sure whether my AVAudioEngine setup is correct. I have tried all kinds of AVAudioSession configurations, but nothing changes.
A sample recording with the background speaker noise is posted on Stack Overflow: https://stackoverflow.com/questions/72170548/echo-when-using-avaudioengine-over-hdmi#comment127514327_72170548
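Update: one thing I am going to try (an assumption, not a confirmed fix): if the noise is acoustic feedback from the TV speakers being picked up by the mic, AVAudioEngine's built-in voice processing (echo cancellation, iOS 13+) might suppress it. A minimal sketch:

import AVFoundation

func makeVoiceProcessingEngine() throws -> AVAudioEngine {
    let engine = AVAudioEngine()
    // Must be enabled before the engine starts; enabling it on the input
    // node applies voice processing to the paired output node as well.
    try engine.inputNode.setVoiceProcessingEnabled(true)
    engine.connect(engine.inputNode, to: engine.outputNode, format: engine.inputNode.inputFormat(forBus: 0))
    engine.prepare()
    try engine.start()
    return engine
}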
I am trying to use AVAudioEngine to listen to mic samples and play them back simultaneously via the external speaker or headphones (assuming they are attached to the iOS device). I tried the following using AVAudioPlayerNode and it works, but there is too much delay in the audio playback. Is there a way to hear the sound in real time, without the delay? I wonder why the scheduleBuffer API has so much latency.
var engine: AVAudioEngine!
var playerNode: AVAudioPlayerNode!
var mixer: AVAudioMixerNode!
var audioEngineRunning = false

public func setupAudioEngine() {
    self.engine = AVAudioEngine()
    let input = engine.inputNode
    let format = input.inputFormat(forBus: 0)
    playerNode = AVAudioPlayerNode()
    engine.attach(playerNode)
    self.mixer = engine.mainMixerNode
    engine.connect(self.playerNode, to: self.mixer, format: playerNode.outputFormat(forBus: 0))
    engine.inputNode.installTap(onBus: 0, bufferSize: 4096, format: format, block: { buffer, time in
        self.playerNode.scheduleBuffer(buffer, completionHandler: nil)
    })
    do {
        engine.prepare()
        try self.engine.start()
        audioEngineRunning = true
        self.playerNode.play()
    } catch {
        print("error couldn't start engine")
        audioEngineRunning = false
    }
}
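Update: a lower-latency variant I am experimenting with, under the assumption that most of the delay comes from the tap-plus-scheduleBuffer round trip rather than from the hardware: connect the input node straight to the mixer and request small I/O buffers.

func setupLowLatencyMonitoring() throws -> AVAudioEngine {
    // A hint to the system, not a guarantee; ~5 ms buffers
    try AVAudioSession.sharedInstance().setPreferredIOBufferDuration(0.005)
    let engine = AVAudioEngine()
    let format = engine.inputNode.inputFormat(forBus: 0)
    // No player node, no scheduling: samples flow directly to the mixer/output
    engine.connect(engine.inputNode, to: engine.mainMixerNode, format: format)
    engine.prepare()
    try engine.start()
    return engine
}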
I have an AVPlayerLayer and AVPlayer set up for playback on an external screen as follows:
var player = AVPlayer()
playerView.player = player
player.usesExternalPlaybackWhileExternalScreenIsActive = true
player.allowsExternalPlayback = true
playerView is just a UIView that has an AVPlayerLayer as its main layer. This code works and automatically starts displaying and playing the video on the external screen. The thing is, I want an option to invert the AVPlayerLayer on the external screen. I tried setting a transform on playerView, but that is ignored on the external screen. How do I gain more control over the external screen window?
I also tried manually adding playerView to the external screen's window and setting
player.usesExternalPlaybackWhileExternalScreenIsActive = true
I can also display the AVPlayerLayer manually this way. But again, setting a transform on this view has no effect on the external display, so it may also be a UIKit issue.
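Update: the direction I plan to explore (an assumption, not a confirmed solution) is to opt out of AVPlayer's external playback entirely, so the video is rendered by the AVPlayerLayer inside a window I own on the external screen, where UIKit transforms should apply:

player.allowsExternalPlayback = false
// externalWindow: a hypothetical UIWindow attached to the external scene/screen
externalWindow.addSubview(playerView)
playerView.transform = CGAffineTransform(scaleX: 1, y: -1) // vertical flip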
I need to implement a text editor using UITextView that supports:
Bold/Italic/Underline
Color, font, and font size changes
Paragraph alignment
List format (bullets, numbers, etc.)
Custom selection of text anywhere in the text view, with the ability to change its properties
So far I have managed to do it without NSTextStorage, but it seems I am hitting limits. For instance, to change the font, I use UIFontPickerViewController and change the font as follows:
func fontPickerViewControllerDidPickFont(_ viewController: UIFontPickerViewController) {
    if let selectedFontDesc = viewController.selectedFontDescriptor {
        let font = UIFont(descriptor: selectedFontDesc, size: selectedFontDesc.pointSize)
        self.selectedFont = font
        self.textView.typingAttributes = [
            NSAttributedString.Key.foregroundColor: self.selectedColor ?? UIColor.white,
            NSAttributedString.Key.font: self.selectedFont ?? UIFont.preferredFont(forTextStyle: .body, compatibleWith: nil)
        ]
        if let range = self.textView.selectedTextRange, let selectedFont = selectedFont {
            let attributedText = NSMutableAttributedString(attributedString: self.textView.attributedText)
            let location = textView.offset(from: textView.beginningOfDocument, to: range.start)
            let length = textView.offset(from: range.start, to: range.end)
            let nsRange = NSRange(location: location, length: length)
            attributedText.setAttributes([NSAttributedString.Key.font: selectedFont], range: nsRange)
            self.textView.attributedText = attributedText
        }
    }
}
This works, but the problem is that it resets the color and other properties of the selected text. I need a way to leave the existing attributes of the selected text undisturbed. I suspect the answer involves NSTextStorage, but I can't find anything good on the internet that explains the right way to use NSTextStorage to achieve this.
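Update: rereading the docs, setAttributes(_:range:) replaces the entire attribute dictionary for the range, which would explain the reset. addAttribute(_:value:range:) merges instead, touching only the named key. A minimal sketch of the change I am going to try:

let attributedText = NSMutableAttributedString(attributedString: self.textView.attributedText)
// Overrides only .font; foregroundColor, underline, etc. are left intact
attributedText.addAttribute(.font, value: selectedFont, range: nsRange)
self.textView.attributedText = attributedText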
I have a UICollectionViewCell defined in a storyboard, with a UILabel added to its contentView. The collection view uses a flow layout, and I return a fixed cell size from the flow layout delegate as follows:
let sizeOfItem = CGFloat(210.0)

func collectionView(_ collectionView: UICollectionView, layout collectionViewLayout: UICollectionViewLayout, sizeForItemAt indexPath: IndexPath) -> CGSize {
    return CGSize(width: sizeOfItem, height: sizeOfItem)
}
I added constraints for the UILabel in the storyboard, and the cell automatically starts resizing itself to match the text size of the label. This is not what I want: I want the cell to stay a fixed size and the label to autoshrink instead.
I even tried setting the label's contentHuggingPriority to the lowest value (i.e. 1). This stops the cell from auto-shrinking when the text in the label is small, but the cell still grows when the text is bigger. I don't want that to happen either. I want the cell to remain a fixed size and the label to adapt its font and stay constrained within the cell, as in the sketch below.
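Update: a sketch of what I am trying next, under the assumption that the growth comes from the label's intrinsic content size winning over the fixed item size:

label.adjustsFontSizeToFitWidth = true // shrink the font instead of growing the cell
label.minimumScaleFactor = 0.5
// Let the fixed-size cell compress the label rather than the other way round
label.setContentCompressionResistancePriority(.defaultLow, for: .horizontal)
label.setContentCompressionResistancePriority(.defaultLow, for: .vertical)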
I have a UICollectionViewFlowLayout assigned to a UICollectionView. Its delegate returns a fixed CGSize from sizeForItemAt. The problem is that if the UICollectionViewCell contains a UIImageView (or any other dynamically sized view) with autolayout constraints pinning its leading, trailing, top, and bottom edges to the contentView, the cell resizes to match the size of the image in the image view. Is it possible to get fixed-size cells with UICollectionViewFlowLayout? I tried setting an intrinsicContentSize for the UIImageView in the storyboard, but it doesn't work.
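Update: I noticed that flow layout self-sizing is driven by estimatedItemSize, and storyboards created with recent Xcode versions default the estimate to Automatic. Forcing it back to .zero (my working assumption) should make sizeForItemAt authoritative again:

// Disable self-sizing so the delegate's fixed size wins
if let flow = collectionView.collectionViewLayout as? UICollectionViewFlowLayout {
    flow.estimatedItemSize = .zero
}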
This issue is driving me crazy. I load an NSAttributedString into a UITextView, and within moments of loading, the foregroundColor attribute of the text is erased (i.e. becomes white) without me doing anything. Here are the code and the NSLog dump. How do I debug this?
class ScriptEditingView: UITextView, UITextViewDelegate {
    var defaultFont = UIFont.preferredFont(forTextStyle: .body)
    var defaultTextColor = UIColor.white

    private func commonInit() {
        self.font = UIFont.preferredFont(forTextStyle: .body)
        self.allowsEditingTextAttributes = true
        self.textColor = defaultTextColor
        self.backgroundColor = UIColor.black
        self.isOpaque = true
        self.isEditable = true
        self.isSelectable = true
        self.dataDetectorTypes = []
        self.showsHorizontalScrollIndicator = false
    }
}
And then in my ViewController that contains the UITextView, I have this code:
textView = ScriptEditingView(frame: newTextViewRect, textContainer: nil)
textView.delegate = self
view.addSubview(textView)
textView.allowsEditingTextAttributes = true

let guide = view.safeAreaLayoutGuide
textView.translatesAutoresizingMaskIntoConstraints = false
NSLayoutConstraint.activate([
    textView.leadingAnchor.constraint(equalTo: guide.leadingAnchor),
    textView.trailingAnchor.constraint(equalTo: guide.trailingAnchor),
    textView.topAnchor.constraint(equalTo: view.topAnchor),
    textView.bottomAnchor.constraint(equalTo: view.bottomAnchor)
])
textView.attributedText = attributedString
NSLog("Attributed now")
dumpAttributesOfText()
DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) {
    NSLog("Attributes after 1 sec")
    self.dumpAttributesOfText()
}
And here is code to dump attributes of text:
private func dumpAttributesOfText() {
    textView.attributedText?.enumerateAttributes(in: NSRange(location: 0, length: textView.attributedText!.length), options: .longestEffectiveRangeNotRequired, using: { dictionary, range, stop in
        NSLog("range \(range)")
        if let font = dictionary[.font] as? UIFont {
            NSLog("Font at range \(range) - \(font.fontName), \(font.pointSize)")
        }
        if let foregroundColor = dictionary[.foregroundColor] as? UIColor {
            NSLog("Foregroundcolor \(foregroundColor) at range \(range)")
        }
        if let underline = dictionary[.underlineStyle] as? Int {
            NSLog("Underline \(underline) at range \(range)")
        }
    })
}
The logs show this:
2022-07-02 13:16:02.841199+0400 MyApp[12054:922491] Attributed now
2022-07-02 13:16:02.841370+0400 MyApp[12054:922491] range {0, 14}
2022-07-02 13:16:02.841486+0400 MyApp[12054:922491] Font at range {0, 14} - HelveticaNeue, 30.0
2022-07-02 13:16:02.841586+0400 MyApp[12054:922491] Foregroundcolor UIExtendedGrayColorSpace 1 1 at range {0, 14}
2022-07-02 13:16:02.841681+0400 MyApp[12054:922491] range {14, 6}
2022-07-02 13:16:02.841770+0400 MyApp[12054:922491] Font at range {14, 6} - HelveticaNeue, 30.0
2022-07-02 13:16:02.841855+0400 MyApp[12054:922491] Foregroundcolor kCGColorSpaceModelRGB 0.96863 0.80784 0.27451 1 at range {14, 6}
2022-07-02 13:16:03.934816+0400 MyApp[12054:922491] Attributes after 1 sec
2022-07-02 13:16:03.935087+0400 MyApp[12054:922491] range {0, 20}
2022-07-02 13:16:03.935183+0400 MyApp[12054:922491] Font at range {0, 20} - HelveticaNeue, 30.0
2022-07-02 13:16:03.935255+0400 MyApp[12054:922491] Foregroundcolor UIExtendedGrayColorSpace 1 1 at range {0, 20}
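Update: to find out when (and from which call stack) the attributes get rewritten, I am going to watch the text storage directly; a sketch, assuming the text view's NSTextStorage delegate is free for the app to use:

extension ScriptEditingView: NSTextStorageDelegate {
    func textStorage(_ textStorage: NSTextStorage,
                     didProcessEditing editedMask: NSTextStorage.EditActions,
                     range editedRange: NSRange,
                     changeInLength delta: Int) {
        if editedMask.contains(.editedAttributes) {
            // Set a breakpoint here and inspect the stack to see who edits the attributes
            NSLog("Attributes changed in range \(editedRange)")
        }
    }
}
// In commonInit: textStorage.delegate = self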
I have this code to adjust the UITextView when the keyboard is about to be displayed:
@objc func keyboardWillShow(notification: NSNotification) {
    if let rectValue = notification.userInfo?[UIResponder.keyboardFrameEndUserInfoKey] as? NSValue {
        let keyboardSize = rectValue.cgRectValue.size
        let contentInsets: UIEdgeInsets = UIEdgeInsets(top: 0, left: 0, bottom: keyboardSize.height - view.safeAreaInsets.bottom, right: 0)
        textView.contentInset = contentInsets
        textView.scrollIndicatorInsets = contentInsets
    }
}
Setting contentInset automatically creates a scrolling animation (after the keyboard appears fully) that makes the text at the cursor visible. The animation happens automatically, and there is no way to disable it. Sometimes I want to scroll to a specific location in the text view when the keyboard is presented (preferably animated alongside the keyboard's appearance), but the implicit contentInset animation makes this impossible: first the keyboard animates, then the contentInset animates, and only then can I animate to my chosen location in the text view. This is awkward. What is the right way to achieve this?
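Update: the approach I am experimenting with (a sketch, assuming the animation parameters in the keyboard notification's userInfo are reliable) is to apply the insets and my own scroll inside a single animation that mirrors the keyboard's curve:

@objc func keyboardWillShow(notification: NSNotification) {
    guard let info = notification.userInfo,
          let endFrame = (info[UIResponder.keyboardFrameEndUserInfoKey] as? NSValue)?.cgRectValue,
          let duration = info[UIResponder.keyboardAnimationDurationUserInfoKey] as? Double,
          let curve = info[UIResponder.keyboardAnimationCurveUserInfoKey] as? UInt else { return }
    let insets = UIEdgeInsets(top: 0, left: 0, bottom: endFrame.height - view.safeAreaInsets.bottom, right: 0)
    UIView.animate(withDuration: duration, delay: 0,
                   options: UIView.AnimationOptions(rawValue: curve << 16)) {
        self.textView.contentInset = insets
        self.textView.scrollIndicatorInsets = insets
        // targetRange: a hypothetical property holding the location I want visible
        self.textView.scrollRangeToVisible(self.targetRange)
    }
}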
Can someone help me identify the source of this crash that I see in my crash logs?
@AVFoundation engineers, my users are seeing the following crash, which I cannot reproduce at my end.
*** -[__NSArrayI objectAtIndexedSubscript:]: index 2 beyond bounds [0 .. 1]
Fatal Exception: NSRangeException
0 CoreFoundation 0x99288 __exceptionPreprocess
1 libobjc.A.dylib 0x16744 objc_exception_throw
2 CoreFoundation 0x1a431c -[__NSCFString characterAtIndex:].cold.1
3 CoreFoundation 0x4c96c CFDateFormatterCreateStringWithAbsoluteTime
4 AVFCapture 0x6cad4 -[AVCaptureConnection getAvgAudioLevelForChannel:]
All I am doing is this:
func updateMeters() {
    var channelCount = 0
    var decibels: [Float] = []
    if let audioConnection = self.audioConnection {
        for audioChannel in audioConnection.audioChannels {
            decibels.append(audioChannel.averagePowerLevel)
            channelCount = channelCount + 1
        }
    }
}
What am I doing wrong?
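Update: my current guess (unverified) is that the channel array mutates during a route change, e.g. stereo to mono, while it is being read. I am going to read the levels in a single pass and bail out when the connection is inactive:

func currentPowerLevels() -> [Float] {
    guard let connection = self.audioConnection, connection.isActive else { return [] }
    // Single pass over the channels; no index-based access
    return connection.audioChannels.map { $0.averagePowerLevel }
}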
I have the following code to connect the inputNode to the mainMixerNode of AVAudioEngine:
public func setupAudioEngine() {
    self.engine = AVAudioEngine()
    let format = engine.inputNode.inputFormat(forBus: 0)
    // The main mixer node is connected to the output node by default
    engine.connect(self.engine.inputNode, to: self.engine.mainMixerNode, format: format)
    do {
        engine.prepare()
        try self.engine.start()
        engineRunning = true // only mark as running if start() succeeded
    } catch {
        print("error couldn't start engine")
    }
}
But I am seeing a crash in the Crashlytics dashboard (which I can't reproduce):
Fatal Exception: com.apple.coreaudio.avfaudio
required condition is false: IsFormatSampleRateAndChannelCountValid(format)
Before calling setupAudioEngine, I make sure the AVAudioSession category is not playback (where the mic is unavailable). The function is called from the handler for the audio route change notification, and I check this condition specifically there. Can someone tell me what I am doing wrong?
Fatal Exception: com.apple.coreaudio.avfaudio
0 CoreFoundation 0x99288 __exceptionPreprocess
1 libobjc.A.dylib 0x16744 objc_exception_throw
2 CoreFoundation 0x17048c -[NSException initWithCoder:]
3 AVFAudio 0x9f64 AVAE_RaiseException(NSString*, ...)
4 AVFAudio 0x55738 AVAudioEngineGraph::_Connect(AVAudioNodeImplBase*, AVAudioNodeImplBase*, unsigned int, unsigned int, AVAudioFormat*)
5 AVFAudio 0x5cce0 AVAudioEngineGraph::Connect(AVAudioNode*, AVAudioNode*, unsigned long, unsigned long, AVAudioFormat*)
6 AVFAudio 0xdf1a8 AVAudioEngineImpl::Connect(AVAudioNode*, AVAudioNode*, unsigned long, unsigned long, AVAudioFormat*)
7 AVFAudio 0xe0fc8 -[AVAudioEngine connect:to:format:]
8 MyApp 0xa6af8 setupAudioEngine + 701 (MicrophoneOutput.swift:701)
9 MyApp 0xa46f0 handleRouteChange + 378 (MicrophoneOutput.swift:378)
10 MyApp 0xa4f50 @objc MicrophoneOutput.handleRouteChange(note:)
11 CoreFoundation 0x2a834 __CFNOTIFICATIONCENTER_IS_CALLING_OUT_TO_AN_OBSERVER__
12 CoreFoundation 0xc6fd4 ___CFXRegistrationPost_block_invoke
13 CoreFoundation 0x9a1d0 _CFXRegistrationPost
14 CoreFoundation 0x408ac _CFXNotificationPost
15 Foundation 0x1b754 -[NSNotificationCenter postNotificationName:object:userInfo:]
16 AudioSession 0x56f0 (anonymous namespace)::HandleRouteChange(AVAudioSession*, NSDictionary*)
17 AudioSession 0x5cbc invocation function for block in avfaudio::AVAudioSessionPropertyListener(void*, unsigned int, unsigned int, void const*)
18 libdispatch.dylib 0x1e6c _dispatch_call_block_and_release
19 libdispatch.dylib 0x3a30 _dispatch_client_callout
20 libdispatch.dylib 0x11f48 _dispatch_main_queue_drain
21 libdispatch.dylib 0x11b98 _dispatch_main_queue_callback_4CF
22 CoreFoundation 0x51800 __CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__
23 CoreFoundation 0xb704 __CFRunLoopRun
24 CoreFoundation 0x1ebc8 CFRunLoopRunSpecific
25 GraphicsServices 0x1374 GSEventRunModal
26 UIKitCore 0x514648 -[UIApplication _run]
27 UIKitCore 0x295d90 UIApplicationMain
28 libswiftUIKit.dylib 0x30ecc UIApplicationMain(_:_:_:_:)
29 MyApp 0xc358 main (WhiteBalanceUI.swift)
30 ??? 0x104b1dce4 (Missing)
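Update: the guard I am adding before connecting, under the assumption that the crash fires when the input node transiently reports an invalid format (0 Hz / 0 channels) in the middle of a route change:

let format = engine.inputNode.inputFormat(forBus: 0)
guard format.sampleRate > 0, format.channelCount > 0 else {
    NSLog("Input format not ready (\(format.sampleRate) Hz, \(format.channelCount) ch); skipping engine setup")
    return
}
engine.connect(engine.inputNode, to: engine.mainMixerNode, format: format)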
iOS 16 has serious UIKit bugs with no known workarounds, so severe that I had to remove my app from sale. Obviously I am affected and desperate to know whether anyone has found a workaround, or whether UIKit engineers can suggest one.
To summarize: calling setNeedsUpdateOfSupportedInterfaceOrientations shortly (within 0.5 seconds) after app launch doesn't trigger autorotation. It does trigger autorotation when the app is launched from the debugger, but not when the app is launched directly. Maybe the debugger incurs some delay that is sufficient to trigger autorotation? I have filed FB11516363, but I desperately need a workaround to get my app back on the App Store.
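Update: the only workaround I can think of, which follows directly from the 0.5-second observation above (the threshold is an observation, not a documented value), is to defer the call past the launch window where it appears to be dropped:

DispatchQueue.main.asyncAfter(deadline: .now() + 0.5) {
    // Re-request the orientation update after the window where it is ignored
    self.setNeedsUpdateOfSupportedInterfaceOrientations()
}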
I get this error when I run my code on an iOS 14 device using Xcode 14:
dyld: Symbol not found: _$s7SwiftUI15GraphicsContextV4fill_4with5styleyAA4PathV_AC7ShadingVAA9FillStyleVtF
Referenced from: ******
Expected in: /System/Library/Frameworks/SwiftUI.framework/SwiftUI
I do not get any errors when I run the same code on an iOS 15 or iOS 16 device.
How do I resolve this issue?
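Update: GraphicsContext (and its fill method) is an iOS 15.0+ API, so I suspect some code path references it without availability gating. A sketch of the gating I am going to verify, assuming proper @available annotations let the symbol be weak-linked on iOS 14:

if #available(iOS 15.0, *) {
    // Canvas/GraphicsContext drawing lives here, inside code marked @available(iOS 15.0, *)
} else {
    // iOS 14 fallback drawing path
}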
This is another strange issue on iOS 16, one among the many woes that cause autorotation problems on the platform. At app startup, there is a mismatch between -[UIViewController interfaceOrientation] (now deprecated), -[UIViewController supportedInterfaceOrientations] (which returns .landscapeRight), and the window scene's interfaceOrientation.
public var windowOrientation: UIInterfaceOrientation {
    return view.window?.windowScene?.interfaceOrientation ?? .unknown
}

override func viewDidLayoutSubviews() {
    super.viewDidLayoutSubviews()
    layoutInterfaceForOrientation(windowOrientation)
}

override var supportedInterfaceOrientations: UIInterfaceOrientationMask {
    return .landscapeRight
}
Putting a breakpoint in viewDidLayoutSubviews reveals that windowOrientation is portrait while interfaceOrientation is .landscapeRight.
Is there any way this can be fixed, or any workaround possible?
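Update: the stopgap I am testing, under the assumption that the scene's value is only transiently wrong at startup:

override func viewDidLayoutSubviews() {
    super.viewDidLayoutSubviews()
    var orientation = windowOrientation
    if orientation == .portrait {
        // Early-launch mismatch on iOS 16: trust the supported mask over the
        // transient scene value (assumption)
        orientation = .landscapeRight
    }
    layoutInterfaceForOrientation(orientation)
}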