I've defined a URL scheme for my application, and that's being honored by iOS. But the function that's supposed to handle the URL in my application (as documented here) is never called.
The documentation doesn't say exactly where this is supposed to go. I've tried it in my App struct:
@main
struct MyGreatApp: App
{
    var body: some Scene
    {
        WindowGroup
        {
            MainView()
        }
    }

    // Handle custom URLs, specifically the ones sent in invitation E-mails or texts.
    func application(_ application: UIApplication,
                     open theURL: URL,
                     options: [UIApplication.OpenURLOptionsKey : Any] = [:]) -> Bool
    {
        // Determine who sent the URL.
        let sendingAppID = options[.sourceApplication]
        print("source application = \(sendingAppID ?? "Unknown")")
        ...
And I also tried putting this at the file level. No dice either way. Anybody have an idea why?
To head off things I've seen in other posts: I'm not using scenes, and there's no SceneDelegate.
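For reference, here is a minimal sketch of the SwiftUI-lifecycle route via the .onOpenURL(perform:) modifier on the root view, in case that is the intended replacement for the delegate callback (the handler body is just illustrative):

import SwiftUI

@main
struct MyGreatApp: App
{
    var body: some Scene
    {
        WindowGroup
        {
            MainView()
                // Called when iOS opens the app with a URL for the registered scheme.
                .onOpenURL { url in
                    print("Received URL: \(url)")   // placeholder handling
                }
        }
    }
}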
Hello, community and Apple engineers. I need your help.
Our app has the following issue: NavigationStack pushes a view twice if the NavigationStack is inside TabView and NavigationStack uses a navigation path of custom Hashable elements.
Our app exhibits the issue when built with Xcode 18 Beta 13 and run on iOS 18.0. The same issue happened on previous Xcode 18 beta versions.
The issue does not occur on iOS 17.x, and everything worked well before the iOS 18.0 beta releases.
I was able to reproduce the same issue in a clean project with two simple views. I will paste the code below.
Several notes:
We use a centralised routing system in our app where all possible routes for the navigation path are implemented in a View extension called withAppRouter().
We have an enum RouterDestination that contains all possible routes; it is resolved in the withAppRouter() extension.
We use a Router class that contains @Published var path: [RouterDestination] = [], and this @Published property is bound to the NavigationStack. In the real app we need access to this path property for programmatic navigation purposes.
Our app uses the ObservableObject / @StateObject approach.
import SwiftUI

struct ContentView: View {
    @StateObject private var router = Router()

    var body: some View {
        TabView {
            NavigationStack(path: $router.path) {
                NavigationLink(value: RouterDestination.next, label: {
                    Label("Next", systemImage: "plus.circle.fill")
                })
                .withAppRouter()
            }
        }
    }
}

enum RouterDestination: Hashable {
    case next
}

struct SecondView: View {
    var body: some View {
        Text("Screen 2")
    }
}

class Router: ObservableObject {
    @Published var path: [RouterDestination] = []
}

extension View {
    func withAppRouter() -> some View {
        navigationDestination(for: RouterDestination.self) { destination in
            switch destination {
            case .next:
                return SecondView()
            }
        }
    }
}
Below you can see the GIF with the issue:
What I tried to do:
Using the iOS 17+ @Observable approach. It didn't help.
Using @State var path: [RouterDestination] = [] directly inside the view seems to help (see the sketch below), but it is not what we want: we need this property to be @Published and located inside the Router class, where we can access it and use it for programmatic navigation if needed.
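For reference, a minimal sketch of the @State workaround described above (the rest of the sample code is unchanged):

struct ContentView: View {
    // Workaround: keeping the path directly in the view avoids the double push,
    // but it loses the programmatic access we need from the Router class.
    @State private var path: [RouterDestination] = []

    var body: some View {
        TabView {
            NavigationStack(path: $path) {
                NavigationLink(value: RouterDestination.next, label: {
                    Label("Next", systemImage: "plus.circle.fill")
                })
                .withAppRouter()
            }
        }
    }
}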
I ask Apple engineers to help with this, please, and if it is a bug in the iOS 18 beta, please fix it in an upcoming iOS 18.0 release.
There is a serious usability issue with PHPickerViewController in a UIKit app running on macOS 26 via Mac Catalyst when the Mac Catalyst interface is set to “Scaled to Match iPad”. Mouse clicks and other pointer interactions do not register at the correct position. This means you have to click in the wrong position to select a photo and to close the picker. This basically makes it unusable.
To demonstrate, use Xcode 26 on macOS 26 to create a new iOS app project based on Swift/Storyboard. Then update ViewController.swift with the following code:
import UIKit
import PhotosUI

class ViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()

        var cfg = UIButton.Configuration.plain()
        cfg.title = "Photo Picker"
        let button = UIButton(configuration: cfg, primaryAction: UIAction(handler: { _ in
            self.showPicker()
        }))
        button.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(button)
        NSLayoutConstraint.activate([
            button.centerXAnchor.constraint(equalTo: view.safeAreaLayoutGuide.centerXAnchor),
            button.centerYAnchor.constraint(equalTo: view.safeAreaLayoutGuide.centerYAnchor),
        ])
    }

    private func showPicker() {
        var config = PHPickerConfiguration()
        config.selectionLimit = 10
        config.selection = .ordered
        let vc = PHPickerViewController(configuration: config)
        vc.delegate = self
        self.present(vc, animated: true)
    }
}

extension ViewController: PHPickerViewControllerDelegate {
    func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
        print("Picked \(results.count) photos")
        dismiss(animated: true)
    }
}
Then go to the "Supported Destinations" section of the project target. Add a "Mac (Mac Catalyst)" destination. Then under the "Deployment Information" section, make sure the "Mac Catalyst Interface" setting is "Scaled to Match iPad".
Then build and run the app on a Mac (using the Mac Catalyst destination) with macOS 26.0.1. Make sure the Mac has a dozen or so pictures in the Photo Library to fully demonstrate the issue. When the app is run, a simple screen appears with one button in the middle. Click the button to bring up the PHPickerViewController. Now try to interact with the picker interface. Note that all pointer interactions are in the wrong place on the screen. This makes it nearly impossible to choose the correct photos and close the picker.
Quit the app. Select the project and go to the General tab. In the "Deployment Info" change the “Mac Catalyst Interface” setting to “Optimize for Mac” and run the app again. Now the photo picker works just fine.
If you run the app on a Mac running macOS 15 then the photo picker works just fine with either “Mac Catalyst Interface” setting.
The problem only happens under macOS 26.0 (I do not have the macOS 26.1 beta to test) when the “Mac Catalyst Interface” setting is “Scaled to Match iPad”. This is critical for my app: I cannot use “Optimize for Mac”, as there are far too many issues with that setting (with UIStepper and UIPickerView, for a start). So it is critical to the usability of my app under macOS 26 that this issue be resolved.
It is expected that PHPickerViewController responds correctly to pointer events on macOS 26 when running a Mac Catalyst app set to “Scaled to Match iPad”.
A version of this has been filed as FB20503207
I integrated an Advanced App Clip Experience into my app. When trying to test the App Clip Card, the card does not appear when I tap the link on my device that is associated with the Advanced App Clip Experience.
Here is some contextual information:
Testing on device running on iOS 18.3.1
Associated Domains:
Main target app: applinks:
App clips target app: applinks: and appclips:
Archived and uploaded build to App Store Connect.
Green "Testing" status via Testflight.
On Distribution tab, green "Valid" status for build domain.
Advanced App Clip Experience green "Received" status.
Developer App Clip Testing Diagnostics:
Green "Register Advanced Experience" status
Green "App Clip Code" status
Warning "App Clip Published on App Store"
Orange Circle "Associated Domains"
After looking at countless threads, I still cannot for the life of me find a solution to test the App Clip. Any guidance would be extremely appreciated. I have also submitted a support ticket with case ID #102552504973.
I'm using GoogleMaps in my project.
The legacy preview works well, but the new preview (Xcode 16.3.1 beta) produces an error. It doesn't seem to find GoogleMaps.a.
== PREVIEW UPDATE ERROR:
FailedToLaunchAppError: Failed to launch ***
==================================
| [Remote] JITError
|
| ==================================
|
| | [Remote] CouldNotLoadInputStaticArchiveFile: Could not load static archive during preview: /Users/xxx/Library/Developer/Xcode/DerivedData/BOA-eiluspltxasszsfkpqrnnsxsjhth/Build/Products/Debug_BOA_Inhouse-iphonesimulator/GoogleMaps.a
| |
| | path: /Users/xxx/Library/Developer/Xcode/DerivedData/BOA-eiluspltxasszsfkpqrnnsxsjhth/Build/Products/Debug_BOA_Inhouse-iphonesimulator/GoogleMaps.a
| |
| | ==================================
| |
| | | [Remote] XOJITError
| | |
| | | XOJITError: arm64 slice of /Users/xxx/Library/Developer/Xcode/DerivedData/BOA-eiluspltxasszsfkpqrnnsxsjhth/Build/Products/Debug_BOA_Inhouse-iphonesimulator/GoogleMaps.a does not contain an archive
I am developing an app in SwiftUI using Xcode 12.3, deployment target iOS 14.0. The launch screen is set up through Info.plist by specifying 'background color' and 'image name'. The file used for 'image name' is from the asset catalog (PNG format, size 300 x 300, with corresponding @2x and @3x resolutions). What I have observed: when the app is installed for the first time, the launch image is centered at its original resolution, but all subsequent launches show the launch image stretched to cover the full screen. Any ideas why this is happening and how to get more consistent behavior either way?
I have tried the 'respect safe area' option, but it does not make a difference.
Thank you.
In tvOS 18 the onMoveCommand is missing the first press after a view is loaded and every time the direction is changed. It also misses the first press on a button after a focus change. This appears to only impact the newer silver remote and not the older black remote or IR remotes.
With the code below, press any direction three times and it will only log twice.
import SwiftUI

struct ButtonTest: View {
    var body: some View {
        VStack {
            Button {
                debugPrint("button 1")
            } label: {
                Text("Button 1")
            }
            Button {
                debugPrint("button 2")
            } label: {
                Text("Button 2")
            }
            Button {
                debugPrint("button 3")
            } label: {
                Text("Button 3")
            }
        }
        .onMoveCommand(perform: { direction in
            debugPrint("move \(direction)")
        })
        .padding()
    }
}
I have set up a collection view cell programmatically, which I register via the cell registration of a diffable data source. I have defined a delegate for the cell and set the view controller that contains the collection view as the cell's delegate. However, calling the delegate method from the cell does not invoke the method in the view controller.
This is my code:
// In my view controller:
cell.delegate = self

// In my collection view cell:
delegate?.repromptLLMForIconName(iconName, index: index, emotion: emotion, red: red, green: green, blue: blue)
Although the delegate call in the collection view cell is executed, my method implementation in the view controller is not. What could possibly be going wrong?
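For reference, here is a simplified sketch of how the pieces are wired up (the protocol, type, and parameter names are simplified stand-ins for the real ones, which take more parameters):

import UIKit

// Simplified stand-ins for the real protocol and types.
protocol IconCellDelegate: AnyObject {
    func repromptLLMForIconName(_ iconName: String, index: Int)
}

final class IconCell: UICollectionViewCell {
    // Must be weak to avoid a retain cycle, and must be non-nil when the cell calls it.
    weak var delegate: IconCellDelegate?

    func didTapReprompt() {
        delegate?.repromptLLMForIconName("star.fill", index: 0)
    }
}

final class IconsViewController: UIViewController, IconCellDelegate {

    // The registration handler runs for every dequeue, so the delegate has to be
    // assigned here; otherwise reused cells end up with a nil delegate.
    private lazy var cellRegistration = UICollectionView.CellRegistration<IconCell, String> {
        [weak self] cell, _, _ in
        cell.delegate = self
    }

    func repromptLLMForIconName(_ iconName: String, index: Int) {
        print("Reprompting for \(iconName) at index \(index)")
    }
}

The cells are then dequeued with dequeueConfiguredReusableCell(using:for:item:) from the diffable data source's cell provider.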
Is there a way to access an Icon Composer .icon file in Swift or Objective-C? Any way to get this into an NSImage object that I can display in an image view? Thanks.
Basic Information
Please provide a descriptive title for your feedback:
Sheet presentationDetents breaks after rapid open/dismiss cycles
Which platform is most relevant for your report?
iOS
Description
Steps to Reproduce:
1. Create a sheet with presentationDetents([.medium]).
2. Rapidly perform these actions multiple times (usually 3-4 times):
a. Open the sheet
b. Immediately scroll down to dismiss
3. Open the sheet again.
4. Observe that the sheet now appears at .large size, ignoring the .medium detent.
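A minimal sketch of the setup described in the steps above (view names and content are illustrative):

import SwiftUI

struct DetentRepro: View {
    @State private var showSheet = false

    var body: some View {
        Button("Open sheet") {
            showSheet = true
        }
        .sheet(isPresented: $showSheet) {
            ScrollView {
                Text("Sheet content")
                    .padding()
            }
            // After several rapid open/drag-to-dismiss cycles, the sheet starts
            // appearing at .large instead of respecting this detent.
            .presentationDetents([.medium])
        }
    }
}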
Expected Result:
Sheet should consistently maintain .medium size regardless of how quickly it is opened and dismissed.
Actual Result:
After rapid open/dismiss cycles, the sheet ignores the .medium detent and appears at .large size.
Reproduction Rate:
Occurs consistently after 3-4 rapid open/dismiss cycles
More likely to occur with faster open/dismiss actions
Configuration:
iOS 18
Xcode 16.0 (16A242d)
SwiftUI
Device: iPhone 14
Xcode downloaded a crash report for my app which I don't quite understand. It seems the following line caused the crash:
myEntity.image = newImage
where myEntity is of type MyEntity:
class MyEntity: NSObject, Identifiable {
    @objc dynamic var image: NSImage!
    ...
}
The code is called on the main thread. According to the crash report, thread 0 makes that assignment, and at the same time thread 16 is calling [NSImageView asynchronousPreparation:prepareResultUsingParameters:].
What could cause such a crash? Could I be doing something wrong or is this a bug in macOS?
crash.crash
I'm trying to piece together how I should reason about multiple windows, activating menus and items.
For instance, I have an email command. If a document is open, it emails just that document. If the collection view holding that document is open, it generates an attachment representing the collection. (This happens with buttons right now, but it really feels like a menu item.) How do I tell which window is active (document or collection) in order to have the right menu item available?
If the user is adding a document, I don't want the "New" command available; I only want one editing view. I saw sample code to include or remove commands, but not to disable them. I feel like there's a whole conceptual layer about the interplay with scenes that I want to understand, but I don't know where to look for documentation. I searched here, but there are hardly any threads on this.
TIA
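In case it helps frame the question, here is a rough sketch of the direction I've been exploring, assuming UIKit main-menu commands (DocumentViewController and emailDocument(_:) are illustrative names, not my real code). My understanding is that menu actions travel the key window's responder chain, which seems to be how "which window is active" is normally answered, and that canPerformAction(_:withSender:) controls whether an item is enabled:

import UIKit

class AppDelegate: UIResponder, UIApplicationDelegate {
    // Build the main menu once; the system then decides per-window whether each
    // command is enabled by walking the key window's responder chain.
    override func buildMenu(with builder: UIMenuBuilder) {
        super.buildMenu(with: builder)
        guard builder.system == .main else { return }

        let email = UICommand(title: "Email",
                              action: #selector(DocumentViewController.emailDocument(_:)))
        builder.insertChild(UIMenu(title: "", options: .displayInline, children: [email]),
                            atEndOfMenu: .file)
    }
}

class DocumentViewController: UIViewController {
    // Only reached when this view controller is in the active window's responder chain.
    @objc func emailDocument(_ sender: Any?) {
        // Email just this document...
    }

    // Returning false greys the menu item out for this window's current state.
    override func canPerformAction(_ action: Selector, withSender sender: Any?) -> Bool {
        if action == #selector(emailDocument(_:)) {
            return true // e.g. return false while an editing session is in progress
        }
        return super.canPerformAction(action, withSender: sender)
    }
}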
I'm working on a NavigationStack based app. Somewhere I'm using:
@Environment(\.dismiss) private var dismiss
and when trying to navigate to that view it gets stuck.
I used Self._printChanges() and discovered the environment variable dismiss is changing repeatedly. Obviously I am not changing that variable explicitly. I wasn't able to reproduce this in a small project so far, but does anybody have any idea what kind of thing I could be doing that might be causing this issue?
iOS 17.0.3
I was hoping for an update to SwiftData that adopts shared and public CloudKit containers, in the same way it does for the private CloudKit container.
So firstly, a big request to any Apple devs reading, for this to be a thing!
Secondly, what would be a sensible way of adding a shared container in CloudKit to an existing app that is already using SwiftData?
Would it be possible to use the new DataStore method to manage CloudKit syncing with a public or shared container?
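For context, this is roughly the extent of what can be configured today as far as I can tell: only the private database can be targeted (Item and the container identifier are placeholders):

import SwiftData

@Model
final class Item {
    var title: String
    init(title: String) { self.title = title }
}

func makeContainer() throws -> ModelContainer {
    // As far as I can tell, .automatic, .none, and .private(_:) are the only
    // CloudKit options today; there is no .shared or .public counterpart.
    let configuration = ModelConfiguration(
        schema: Schema([Item.self]),
        cloudKitDatabase: .private("iCloud.com.example.MyApp")
    )
    return try ModelContainer(for: Item.self, configurations: configuration)
}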
I am trying to implement Live Activities in my app. I am following the Apple docs.
Link: https://developer.apple.com/documentation/activitykit/displaying-live-data-with-live-activities
Example code:
struct LockScreenLiveActivityView: View {
    let context: ActivityViewContext<PizzaDeliveryAttributes>

    var body: some View {
        VStack {
            Spacer()
            Text("\(context.state.driverName) is on their way with your pizza!")
            Spacer()
            HStack {
                Spacer()
                Label {
                    Text("\(context.attributes.numberOfPizzas) Pizzas")
                } icon: {
                    Image(systemName: "bag")
                        .foregroundColor(.indigo)
                }
                .font(.title2)
                Spacer()
                Label {
                    Text(timerInterval: context.state.deliveryTimer, countsDown: true)
                        .multilineTextAlignment(.center)
                        .frame(width: 50)
                        .monospacedDigit()
                } icon: {
                    Image(systemName: "timer")
                        .foregroundColor(.indigo)
                }
                .font(.title2)
                Spacer()
            }
            Spacer()
        }
        .activitySystemActionForegroundColor(.indigo)
        .activityBackgroundTint(.cyan)
    }
}
Actually, the code is pretty straightforward. We can use the timerInterval for count-down animation. But when the timer ends, I want to update the Live Activity view. If the user re-opens the app, I can update it, but what happens if the user doesn't open the app? Is there a way to update the live activity without using push notifications?
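For reference, the only non-push route I'm aware of is updating from my own code while the app gets execution time, roughly like this (the ContentState field names are assumed from the view code above, and the "finished" values are illustrative):

import ActivityKit

// Illustrative update from app code. This only runs while the app itself gets
// execution time (foreground, or an existing background opportunity), which is
// exactly the limitation I'm asking about.
func updateDeliveryActivities() async {
    for activity in Activity<PizzaDeliveryAttributes>.activities {
        let finishedState = PizzaDeliveryAttributes.ContentState(
            driverName: activity.content.state.driverName,
            deliveryTimer: Date.now...Date.now   // assumed fields; illustrative "timer ended" value
        )
        await activity.update(ActivityContent(state: finishedState, staleDate: nil))
    }
}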
The following code won't work:
- (void)windowDidLoad {
    [super windowDidLoad];
    self.window.isVisible = NO;
}
The app's main (and only) window still shows on application startup (in a minimal, newly created app).
One of my published apps in the App Store relies on this behavior, which had been working for many years, ever since I started Xcode development.
Fatal Exception: NSInternalInconsistencyException
Cannot remove an observer <WKWebView 0x135137800> for the key path "configuration.enforcesChildRestrictions" from <STScreenTimeConfigurationObserver 0x13c6d7460>, most likely because the value for the key "configuration" has changed without an appropriate KVO notification being sent. Check the KVO-compliance of the STScreenTimeConfigurationObserver class.
I noticed that on iOS 26, WKWebView registers an STScreenTimeConfigurationObserver. Is this an iOS 26 system issue? What should I do?
When I build my app for iPadOS (either 26 or 18.5), as well as iOS 16.5, from Xcode 26 with UIDesignRequiresCompatibility enabled, my app crashes as it loads the main UIViewController, a subclassed UITabBarController that is loaded programmatically from a storyboard by another SplashScreen view controller.
On i(Pad)OS 18.5 I get this error:
Thread 1: "Could not instantiate class named _TtGC5UIKit17UICoreHostingViewVCS_21ToolbarVisualProvider8RootView_ because no class named _TtGC5UIKit17UICoreHostingViewVCS_21ToolbarVisualProvider8RootView_ was found; the class needs to be defined in source code or linked in from a library (ensure the class is part of the correct target)"
On iPadOS 26 I get this error:
UIKitCore/UICoreHostingView.swift:54: Fatal error: init(coder:) has not been implemented
There is no issue building from Xcode 16.4, regardless of targeted i(Pad)OS.
How can we performantly scroll to a target location using TextKit 2?
Hi everyone,
I'm building a custom text editor using TextKit 2 and would like to scroll to a target location efficiently. For instance, I would like to move to the end of a document seamlessly, similar to how users can do in standard text editors by using CMD + Down.
Background:
NSTextView and TextEdit on macOS can navigate to the end of large documents in milliseconds. However, after reading the documentation and experimenting with various ideas using TextKit 2's APIs, it's not clear how third-party developers are supposed to achieve this.
My Code:
Here's the code I use to move the selection to the end of the document and scroll the viewport to reveal the selection.
override func moveToEndOfDocument(_ sender: Any?) {
    textLayoutManager.ensureLayout(for: textLayoutManager.documentRange)

    let targetLocation = textLayoutManager.documentRange.endLocation
    let beforeTargetLocation = textLayoutManager.location(targetLocation, offsetBy: -1)!

    textLayoutManager.textViewportLayoutController.layoutViewport()

    guard let textLayoutFragment = textLayoutManager.textLayoutFragment(for: beforeTargetLocation) else {
        return
    }
    guard let textLineFragment = textLayoutFragment.textLineFragment(for: targetLocation, isUpstreamAffinity: true) else {
        return
    }

    let lineFrame = textLayoutFragment.layoutFragmentFrame
    let lineFragmentFrame = textLineFragment.typographicBounds.offsetBy(dx: 0, dy: lineFrame.minY)

    scrollToVisible(lineFragmentFrame)
}
While this code works as intended, it is very inefficient because ensureLayout(_:) is incredibly expensive and can take seconds for large documents.
Issues Encountered:
In my attempts, I have come across the following two issues.
Estimated Frames: The frames of NSTextLayoutFragment and NSTextLineFragment are approximate and not precise enough for scrolling unless the text layout fragment has been fully laid out.
Laying out all text is expensive: The frames become accurate once NSTextLayoutManager's ensureLayout(for:) method has been called with a range covering the entire document. However, ensureLayout(for:) is resource-intensive and can take seconds for large documents. NSTextView, on the other hand, accomplishes the same scrolling to the end of a document in milliseconds.
I've tried using NSTextViewportLayoutController's relocateViewport(to:) without success. It's unclear to me whether this function is intended for a use case like mine. If it is, I would appreciate some guidance on its proper usage.
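For reference, this is the shape of the relocation attempt (it was not successful):

override func moveToEndOfDocument(_ sender: Any?) {
    // Alternative attempt: relocate the viewport to the end of the document
    // without forcing a full layout pass first. This did not work for me.
    let targetLocation = textLayoutManager.documentRange.endLocation
    textLayoutManager.textViewportLayoutController.relocateViewport(to: targetLocation)
}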
Configuration:
I'm testing on macOS Sonoma 14.5 (23F79), Swift (AppKit), Xcode 15.4 (15F31d).
I'm working on a multi-platform project written in AppKit and UIKit, so I'm looking for either a single solution that works in both AppKit and UIKit or two solutions, one for each UI framework.
Question:
How can third-party developers scroll to a target location, specifically the end of a document, performantly using TextKit 2?
Steps to Reproduce:
The issue can be reproduced using the example project (download from link below) by following these steps:
Open the example project.
Run the example app on a Mac. The example app shows an uneditable text view in a scroll view. The text view displays a long text.
Press the "Move to End of Document" toolbar item.
Notice that the text view has scrolled to the bottom, but this took several seconds (~3 seconds on my MacBook Pro 16-inch, 2021). The duration will be shown in Xcode's log.
You can open the ExampleTextView.swift file and find the implementation of moveToEndOfDocument(_:). Comment out line 84 where the ensureLayout(_:) is called, rerun the app, and then select "Move to End of Document" again. This time, you will notice that the text view moves fast but does not end up at the bottom of the document.
You may also open the large-file.json in the project, the same file that the example app displays, in TextEdit, and press CMD+Down to move to the end of the document. Notice that TextEdit does this in mere milliseconds.
Example Project:
The example project is located on GitHub:
https://github.com/simonbs/apple-developer-forums/tree/main/how-can-we-performantly-scroll-to-a-target-location-using-textkit-2
Any advice or guidance on how to achieve this with TextKit 2 would be greatly appreciated.
Thanks in advance!
Best regards,
Simon
We're building a UGC AR app and are leveraging App Clips to distribute AR experiences without app download.
Since earlier this week, many of our users are reporting that sharing experiences as an App Clip doesn't work anymore. They are getting the message "AppClip unavailable" on a little card. We attached a QR code so you can try it yourself, plus a link to a different experience. We have already tried with multiple experiences and on multiple devices.
https://scenery.app/experience/1C925FDE-E49A-489B-BA14-58A4E532E645
Interestingly, we can't pinpoint the issue to an exact device or OS.
We tested on many devices, and on most of them the App Clip is displayed as unavailable, stating "App Clip unavailable", whereas it works on a few. It all worked fine last week (before September 12th).
iPhone 13 Pro Max, iOS 26: works
iPhone SE, iOS 17: works
iPhone 16 Pro, iOS 26: doesn't work
iPhone 12 Pro Max, iOS 26: doesn't work
iPhone 12 mini, iOS 18: doesn't work
iPad 9th gen, iOS 26: doesn't work
Please help. Our users are very dissatisfied as they expect this to work and it's a crucial feature.
We already filed a radar via Feedback Assistant:
FB20303890