
How do I output different sounds to headphones and speakers while simultaneously recording, all without using AVAudioSession.Category.multiRoute?
I need to find a way to record from the mic while outputting two different sound streams to two different devices (speaker and headphones). I've done a fair bit of reading on AVAudioSession.Category.multiRoute but haven't found any modern examples. @theanalogkid posted a nice example using Objective-C nine years ago, but others have noted that the code isn't readily translatable to Swift. To make matters worse, this is one of the very few examples of how to properly use multirouting. The official documentation is lacking, to say the least, and the WWDC 2012 session is, well, old enough to attend middle school and be a Taylor Swift fan, but definitely not in Swift.

The few relevant forum posts here are spread over this middle schooler's life span and are likely outdated, with most having no responses other than the poster's own plaintive echo. They don't paint a pretty picture of .multiRoute's health: a recent poster noted that the volume buttons don't work in this mode and, after contacting DTS, found that there's no fix; another found that it simply doesn't work on certain devices; and so on. Audio is giving me enough of a headache, so I'd like to avoid slogging through this if possible. .multiRoute feels like the developer mode of AVAudioSession, but without documentation.

tl;dr: Without using .multiRoute, is there a way for an app to output to two different devices while simultaneously recording audio? If .multiRoute is the only way to achieve this, can someone give me a quick rundown of how this category works?
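For reference, here's the minimal session setup I've pieced together from the old material. The category and calls below are real AVFoundation API, but the actual routing and channel-mapping behavior is exactly the part I can't verify, so treat this as a sketch:

    import AVFoundation

    // Sketch: activate the multi-route category and inspect what it exposes.
    func configureMultiRouteSession() throws {
        let session = AVAudioSession.sharedInstance()
        // .multiRoute is supposed to surface each attached output (speaker,
        // headphones, USB, HDMI) as distinct channels rather than one route.
        try session.setCategory(.multiRoute, mode: .default, options: [])
        try session.setActive(true)
        // Sending different audio to specific outputs would then happen via
        // channel maps on AVAudioEngine nodes; this just lists what the
        // session currently sees.
        for output in session.currentRoute.outputs {
            print(output.portType.rawValue, output.channels?.count ?? 0)
        }
    }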
Replies: 1 · Boosts: 0 · Views: 803 · Aug ’24
How do I stop Tasks from choking up animations?
I'm making a loading screen, but I can't figure out how to make the loading indicator animate smoothly while work is being performed. I've tried a variety of tactics, including confining the animation to a .userInitiated Task, downgrading the loading Task to .background, and using TaskGroups. All of these resulted in hangs, freezes, or incredibly long load times. I've noticed that standard ProgressViews work fine when under load, but the documentation doesn't indicate why this is the case. Customized ProgressViews (via .progressViewStyle()) don't share this trait and also choke up. Finding out why might solve half the problem. Note: I want to avoid the async complications that come with using nonisolated functions. I've used them elsewhere, but this isn't the place for them.
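For concreteness, here's a stripped-down sketch of the kind of setup that stutters for me. loadEverything is a stand-in for the real work, and the Task.sleep just simulates load:

    import SwiftUI

    struct LoadingScreen: View {
        @State private var isLoading = true

        var body: some View {
            ZStack {
                if isLoading {
                    // The stock indicator keeps spinning under load;
                    // swapping in a custom .progressViewStyle is what
                    // stutters for me.
                    ProgressView("Loading…")
                }
            }
            .task(priority: .background) {
                await loadEverything()
                isLoading = false
            }
        }

        // Stand-in for the real loading work.
        func loadEverything() async {
            try? await Task.sleep(for: .seconds(2))
        }
    }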
Replies: 9 · Boosts: 0 · Views: 1.3k · Aug ’24
[watchOS 11, Xcode 16] Can't get watch to appear in Xcode
This morning I bought my first-ever Apple Watch for the sole purpose of development and proceeded to spend six hours failing at the first step of development: getting the device to enter Developer Mode and connect to Xcode. Since I'm not seeing any watchOS 11 posts on this issue, it might just be me, which is why I'm making a new thread specific to watchOS 11, Xcode 16, and maybe Series 10. Some particulars for my case:

Overall
- Followed the Xcode 16.0 documentation: "On a watchOS device that you use for development, go to Settings > Privacy > Developer Mode. To toggle Developer Mode, use the Developer Mode switch. To pair an Apple Watch to a Mac, connect its companion iPhone to the Mac with a cable, and ensure that the iPhone is paired for development. After this step, follow any instructions on the Apple Watch to trust the Mac. When paired through an iPhone running iOS 17 or later, Xcode connects to the Apple Watch over Wi-Fi."
- Tried all the folk remedies listed in the (many) previous posts on enabling Developer Mode and connecting to Xcode.

iOS 18.0
- In Developer Mode.
- Connected to macOS via USB; trusts the computer.

watchOS 11.0
- The prompt to trust the computer appears and trust is established.
- A 'Developer Mode' list item never appears at the end of the 'Privacy' menu under 'Settings'.
- A 'Developer' item sometimes appears at the end of 'Settings', despite my never having seen or toggled 'Developer Mode' under 'Privacy'. This persists across reboots.
- Is it possible that watchOS 11 eliminated the item under Settings > Privacy? If so, the documentation is not up to date.

Xcode 16.0
- The watch never appears under 'Manage Run Destinations'.
- After installing the sample app to the phone and then attempting to install the watchOS app via the iOS Watch app, a "Cannot install at this time" alert appears.
- The app icon appears on the watch, and tapping it leads to an alert saying "This app cannot be installed because its integrity could not be verified", despite Wi-Fi working.
- Watch apps for other apps (e.g. Apple Store) can be installed successfully via the iOS Watch app.
- The above suggests the watch isn't truly in Developer Mode, despite Settings > Developer appearing and persisting across reboots.
- The network path from Xcode to watchOS should be clear: I reconfigured the router so that devices on the same network can talk to each other; the iPad and iPhone appear with a network icon when not connected via cable, and Xcode can run code on them; and the watch is on the same network as the iPad and iPhone.

macOS 15.0
- Due to security policy, I cannot use Wi-Fi (disabled both physically and via sudo /usr/sbin/networksetup -setnetworkserviceenabled 'Wi-Fi' off). It's possible that Xcode can only establish a connection to watchOS via Wi-Fi and not via Ethernet bridged to Wi-Fi; if so, a confirmation would be hugely helpful, as this is currently my prime suspect. Since Wi-Fi cannot be re-enabled, I'm trying workarounds like connecting the watch to the phone's hotspot (doesn't work) and somehow using the phone to provide a network to the Mac.
- Due to security policy, the firewall is configured to block all incoming connections. This shouldn't be an issue, since Xcode doesn't need incoming connections to see non-watch devices.
- Due to security policy, mDNSResponder and mDNSResponderHelper are disabled. This also shouldn't be an issue, but I'm including it just in case.
Replies: 2 · Boosts: 2 · Views: 2.3k · Sep ’24
How would you make a View that magnifies the View(s) beneath it?
I need a magnifying glass function for one of my SwiftUI Views, but I can't find a way to implement it as needed. I found a YouTube video where the author renders the view twice, overlaying the second copy over the first, then scaling and masking it to create the illusion of magnification, but this is expensive and doesn't work in many cases where more complex views are presented (e.g. a LazyVGrid). I've also explored continually capturing partial screenshots and scaling them up to create the illusion of magnification, but there's no straightforward way to achieve this with SwiftUI without getting into the messiness of UIViewRepresentables. Any help would be greatly appreciated.
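For context, here's the core of that render-twice trick as I understand it. This is a minimal, self-contained sketch (the demo content, sizes, and names are mine, not from the video), and it also shows why it gets expensive: the entire content is laid out twice:

    import SwiftUI

    struct MagnifierDemo: View {
        @State private var lensCenter = CGPoint(x: 150, y: 200)
        private let lensRadius: CGFloat = 70
        private let zoom: CGFloat = 2
        private let size = CGSize(width: 300, height: 400)

        // The content to magnify, rendered twice below.
        private var content: some View {
            Text("Sample content")
                .frame(width: size.width, height: size.height)
                .background(Color.yellow)
        }

        var body: some View {
            ZStack {
                content
                // Second copy, scaled around the lens center and clipped
                // to a circle, which creates the magnification illusion.
                content
                    .scaleEffect(zoom, anchor: UnitPoint(x: lensCenter.x / size.width,
                                                         y: lensCenter.y / size.height))
                    .mask {
                        Circle()
                            .frame(width: lensRadius * 2, height: lensRadius * 2)
                            .position(lensCenter)
                    }
            }
            .frame(width: size.width, height: size.height)
            .gesture(DragGesture().onChanged { lensCenter = $0.location })
        }
    }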
Replies: 2 · Boosts: 0 · Views: 514 · Oct ’24
Does the SpriteView of an SKScene have layers? Unable to get magnifying glass view to work with scene.
I'm trying to make a magnifying glass that shows up when the user presses a button and follows the user's finger as it's dragged across the screen. I came across a UIKit-based solution (https://github.com/niczyja/MagnifyingGlass-Swift), but when implemented in my SKScene, only the crosshairs are shown. Through experimentation I've found that the line magnifiedView?.layer.render(in: context) in:

    public override func draw(_ rect: CGRect) {
        guard let context = UIGraphicsGetCurrentContext() else { return }
        context.translateBy(x: radius, y: radius)
        context.scaleBy(x: scale, y: scale)
        context.translateBy(x: -magnifiedPoint.x, y: -magnifiedPoint.y)
        removeFromSuperview()
        magnifiedView?.layer.render(in: context)
        magnifiedView?.addSubview(self)
    }

can be removed without altering the situation, suggesting that line is not working as it should. But this is where I hit a brick wall. The view below is shown but not offset or magnified, and any attempt to add something to context results in a black magnifying glass. Does anyone know why this is? I don't think it's an issue with the code, so I suspect it's something specific to SpriteKit or SKScene, likely related to how CALayers work. Any pointers would be greatly appreciated.

Full code below:

    import UIKit

    public class MagnifyingGlassView: UIView {
        public weak var magnifiedView: UIView? = nil {
            didSet {
                removeFromSuperview()
                magnifiedView?.addSubview(self)
            }
        }

        public var magnifiedPoint: CGPoint = .zero {
            didSet {
                center = .init(x: magnifiedPoint.x + offset.x,
                               y: magnifiedPoint.y + offset.y)
            }
        }

        public var offset: CGPoint = .zero

        public var radius: CGFloat = 50 {
            didSet {
                frame = .init(origin: frame.origin,
                              size: .init(width: radius * 2, height: radius * 2))
                layer.cornerRadius = radius
                crosshair.path = crosshairPath(for: radius)
            }
        }

        public var scale: CGFloat = 2

        public var borderColor: UIColor = .lightGray {
            didSet { layer.borderColor = borderColor.cgColor }
        }

        public var borderWidth: CGFloat = 3 {
            didSet { layer.borderWidth = borderWidth }
        }

        public var showsCrosshair = true {
            didSet { crosshair.isHidden = !showsCrosshair }
        }

        public var crosshairColor: UIColor = .lightGray {
            didSet { crosshair.strokeColor = crosshairColor.cgColor }
        }

        public var crosshairWidth: CGFloat = 5 {
            didSet { crosshair.lineWidth = crosshairWidth }
        }

        private let crosshair: CAShapeLayer = CAShapeLayer()

        public convenience init(offset: CGPoint = .zero,
                                radius: CGFloat = 50,
                                scale: CGFloat = 2,
                                borderColor: UIColor = .lightGray,
                                borderWidth: CGFloat = 3,
                                showsCrosshair: Bool = true,
                                crosshairColor: UIColor = .lightGray,
                                crosshairWidth: CGFloat = 0.5) {
            self.init(frame: .zero)
            layer.masksToBounds = true
            layer.addSublayer(crosshair)
            defer {
                self.offset = offset
                self.radius = radius
                self.scale = scale
                self.borderColor = borderColor
                self.borderWidth = borderWidth
                self.showsCrosshair = showsCrosshair
                self.crosshairColor = crosshairColor
                self.crosshairWidth = crosshairWidth
            }
        }

        public func magnify(at point: CGPoint) {
            guard magnifiedView != nil else { return }
            magnifiedPoint = point
            layer.setNeedsDisplay()
        }

        private func crosshairPath(for radius: CGFloat) -> CGPath {
            let path = CGMutablePath()
            path.move(to: .init(x: radius, y: 0))
            path.addLine(to: .init(x: radius, y: bounds.height))
            path.move(to: .init(x: 0, y: radius))
            path.addLine(to: .init(x: bounds.width, y: radius))
            return path
        }

        public override func draw(_ rect: CGRect) {
            guard let context = UIGraphicsGetCurrentContext() else { return }
            context.translateBy(x: radius, y: radius)
            context.scaleBy(x: scale, y: scale)
            context.translateBy(x: -magnifiedPoint.x, y: -magnifiedPoint.y)
            removeFromSuperview()
            magnifiedView?.layer.render(in: context)
            // If the line above is disabled, no change.
            // Possible that nothing's being rendered into the context.
            // Could it be that the SKScene's view has no layer?
            magnifiedView?.addSubview(self)
        }
    }
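One avenue I plan to test, for what it's worth: SpriteKit renders with Metal, and CALayer's render(in:) is known not to capture GPU-rendered content, which would explain the empty context. Below is a sketch of a replacement draw(_:) using UIView.drawHierarchy(in:afterScreenUpdates:), a real UIKit API that snapshots the on-screen pixels; whether it interacts well with the removeFromSuperview() dance here is untested:

    // Hypothetical drop-in for MagnifyingGlassView.draw(_:) above; untested.
    public override func draw(_ rect: CGRect) {
        guard let context = UIGraphicsGetCurrentContext(),
              let view = magnifiedView else { return }
        context.translateBy(x: radius, y: radius)
        context.scaleBy(x: scale, y: scale)
        context.translateBy(x: -magnifiedPoint.x, y: -magnifiedPoint.y)
        removeFromSuperview()
        // Snapshots the rendered frame (including Metal content) instead of
        // replaying Core Animation drawing the way layer.render(in:) does.
        view.drawHierarchy(in: view.bounds, afterScreenUpdates: false)
        view.addSubview(self)
    }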
Replies: 0 · Boosts: 0 · Views: 660 · Nov ’24
How do I make a UIViewRepresentable beneath SwiftUI elements ignore touches to these elements?
Hello, and an early "Merry Christmas" to all,

I'm building a SwiftUI app, and one of my Views is a fullscreen UIViewRepresentable (SpriteView) beneath a SwiftUI interface. Whenever the user interacts with any SwiftUI element, the UIView registers a hit in touchesBegan(). For example, my UIView has logic for pinching (not implemented via UIGestureRecognizer), so whenever the user holds down a SwiftUI element while touching the UIView, that counts as two touches to the UIView and invokes the pinching logic.

Things I've tried to block SwiftUI from passing the gesture down to the UIView:
- Adding opaque elements beneath control elements
- Adding gestures to the elements above
- Adding gesture masks to the gestures above
- Converting eligible elements to Buttons (since those seem immune)
- Adding SpriteViews beneath those elements to absorb gestures

So far nothing has worked. As long as the UIView is beneath SwiftUI elements, any interaction with those elements registers as a hit. The obvious solution is to track each SwiftUI element's size and coordinates with respect to the UIView's coordinate space and then use exclusion areas (sketched below), but this is both a pain and expensive, and I find it hard to believe this is the best fix for such a seemingly basic problem. I'm probably overlooking something basic, so any suggestions will be greatly appreciated.
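For reference, a minimal sketch of that exclusion-area fallback, using a PreferenceKey to report each control's frame in a named coordinate space. ControlFrameKey, reportFrame, and the excludedRects plumbing are my own names, and the representable itself is stubbed out:

    import SwiftUI

    // Collects the frames of all controls that should block touches.
    struct ControlFrameKey: PreferenceKey {
        static var defaultValue: [CGRect] = []
        static func reduce(value: inout [CGRect], nextValue: () -> [CGRect]) {
            value.append(contentsOf: nextValue())
        }
    }

    extension View {
        // Report this view's frame in the shared "game" coordinate space.
        func reportFrame() -> some View {
            background(GeometryReader { proxy in
                Color.clear.preference(key: ControlFrameKey.self,
                                       value: [proxy.frame(in: .named("game"))])
            })
        }
    }

    struct GameScreen: View {
        @State private var excludedRects: [CGRect] = []

        var body: some View {
            ZStack {
                // SpriteViewRepresentable(excludedRects: excludedRects)
                // (hypothetical: its UIView would ignore touches that land
                // inside any of the excluded rects)
                Button("Pause") {}
                    .reportFrame()
            }
            .coordinateSpace(name: "game")
            .onPreferenceChange(ControlFrameKey.self) { excludedRects = $0 }
        }
    }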
Replies: 0 · Boosts: 0 · Views: 434 · Dec ’24
Are default fonts under UIFont.familyNames licensed for use by iOS developers?
This is a follow-up to my previous question: How to attribute/credit Apple Fonts added to app?

In that previous post, I misremembered what I did and said I found the fonts via macOS's Font Book, when I had actually come across UIFont.familyNames. Since these are included via UIKit, the legal implications should be different. I looked at various license agreements that govern iOS app development but haven't found anything mentioning fonts. Since these fonts ship as part of UIKit, it's reasonable to assume that developers are allowed to include them--but in what ways? Am I allowed to let users create, say, documents with these fonts? Am I only allowed to display them?

There are 84 font families, and judging by their Font Book entries, there is a wide range of licenses and restrictions. It seems unnecessarily harsh to have every iOS developer verify each one and figure out which they can legally keep if they want to offer their users access to all of them (for, say, a text-editing app). There must be some overarching rule that supersedes or encapsulates them, but this rule isn't clear to me after hours of research. I'm not a lawyer, and I don't think Apple expects every app developer to consult their lawyers on whether they can use system fonts. I'm about to send an email to Apple's legal team (I will post their response here if allowed), but in the meantime I want to hear what other devs think about this.

In Xcode, entering UIFont.familyNames returns the following:

["Academy Engraved LET", "Al Nile", "American Typewriter", "Apple Color Emoji", "Apple SD Gothic Neo", "Apple Symbols", "Arial", "Arial Hebrew", "Arial Rounded MT Bold", "Avenir", "Avenir Next", "Avenir Next Condensed", "Baskerville", "Bodoni 72", "Bodoni 72 Oldstyle", "Bodoni 72 Smallcaps", "Bodoni Ornaments", "Bradley Hand", "Chalkboard SE", "Chalkduster", "Charter", "Cochin", "Copperplate", "Courier New", "Damascus", "Devanagari Sangam MN", "Didot", "DIN Alternate", "DIN Condensed", "Euphemia UCAS", "Farah", "Futura", "Galvji", "Geeza Pro", "Georgia", "Gill Sans", "Grantha Sangam MN", "Helvetica", "Helvetica Neue", "Hiragino Maru Gothic ProN", "Hiragino Mincho ProN", "Hiragino Sans", "Hoefler Text", "Impact", "Kailasa", "Kefa", "Khmer Sangam MN", "Kohinoor Bangla", "Kohinoor Devanagari", "Kohinoor Gujarati", "Kohinoor Telugu", "Lao Sangam MN", "Malayalam Sangam MN", "Marker Felt", "Menlo", "Mishafi", "Mukta Mahee", "Myanmar Sangam MN", "Noteworthy", "Noto Nastaliq Urdu", "Noto Sans Kannada", "Noto Sans Myanmar", "Noto Sans Oriya", "Optima", "Palatino", "Papyrus", "Party LET", "PingFang HK", "PingFang SC", "PingFang TC", "Rockwell", "Savoye LET", "Sinhala Sangam MN", "Snell Roundhand", "STIX Two Math", "STIX Two Text", "Symbol", "Tamil Sangam MN", "Thonburi", "Times New Roman", "Trebuchet MS", "Verdana", "Zapf Dingbats", "Zapfino"]
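For anyone who wants to audit these themselves, this small snippet (both calls are standard UIKit API) prints every concrete font face per family, which is what I've been cross-referencing against Font Book:

    import UIKit

    // List each font family and the specific faces it contains.
    for family in UIFont.familyNames.sorted() {
        print(family, UIFont.fontNames(forFamilyName: family))
    }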
Topic: Design · SubTopic: General
Replies: 1 · Boosts: 0 · Views: 854 · Jan ’25
How to attribute/credit Apple Fonts added to app?
My app lets users create things with text, and I've included Apple fonts so users can format their text with them. These were fonts I found in the Font Book app that comes with macOS. My assumption is that these, like the San Francisco font, can be distributed with apps. Do I need to attribute these fonts to their creators and publish a license on my "About" page? If so, where do I find the license(s), and what is the proper way of publishing them? Is there anything else I should know? Please let me know if this should've been posted under a different topic and tag.
Topic: Design · SubTopic: General
Replies: 3 · Boosts: 0 · Views: 1.2k · Jan ’25
New in SF Symbols 7: ipod.and.vision.pro.
Why? Why stop there? (Why not ipod.and.imacg3? applenewton.and.vision.pro?) I get why the older ipod symbols exist, but these new pairings are odd. If anyone ever sees these restricted symbols in the wild, or even just someone using a Vision Pro and an iPod (Touch) together in a way that's not contrived, please do let me know!
Topic: Design · SubTopic: General
Replies: 1 · Boosts: 0 · Views: 777 · Oct ’25
[iOS 26] Can no longer detect whether iPhone has notch
I'm currently using the extension below to determine whether an iPhone has a notch so I can adjust my UI accordingly.

    extension UIDevice {
        var hasNotch: Bool {
            if userInterfaceIdiom == .phone,
               let window = UIApplication.shared.connectedScenes
                   .compactMap({ $0 as? UIWindowScene })
                   .flatMap({ $0.windows })
                   .first(where: { $0.isKeyWindow }) {
                return window.safeAreaInsets.bottom > 0
            }
            return false
        }
    }

(Adapted from https://stackoverflow.com/questions/73946911/how-to-detect-users-device-has-dynamic-island-in-uikit)

This no longer works in iOS 26, and I have yet to find a similar method that works. Does anyone have any fixes?
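In the meantime, here's a variant I'm considering, based on the fact that notched and Dynamic Island phones have historically reported a top safe-area inset well above the 20 points of flat-top phones. Whether iOS 26 preserves that distinction is exactly what I can't confirm, so treat this as a sketch:

    import UIKit

    extension UIDevice {
        // Hypothetical alternative: infer the notch from the top inset.
        var hasNotchViaTopInset: Bool {
            guard userInterfaceIdiom == .phone else { return false }
            let keyWindow = UIApplication.shared.connectedScenes
                .compactMap { $0 as? UIWindowScene }
                .flatMap { $0.windows }
                .first { $0.isKeyWindow }
            // Flat-top iPhones have reported 20pt; notched models 44pt+.
            return (keyWindow?.safeAreaInsets.top ?? 0) > 20
        }
    }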
Replies: 4 · Boosts: 0 · Views: 196 · Oct ’25