I'm running macOS 15.5. I now have Xcode 26 and I'm testing my iOS app against iOS 26. I've encountered several UIKit / iOS 26 bugs I'd like to report. In Feedback Assistant I choose "Developer Technologies & SDKs". Eventually I get asked "What build does the issue occur on?". The list of choices is:
iOS 18.2 Seed 4
iOS 18.1.1
iOS 17.7.2
An earlier iOS build
I'm not sure
So how do I report this as an iOS 26 beta 1 issue?
Under macOS 26 and iPadOS 26, the Help menu in many cases has an "App Help" menu item. This item has the following icon:
I need to use this in my own app. I am unable to find this icon in SF Symbols 7 beta. I've scanned all of the icons under "What's New". I've searched for "help", "light", and "bulb" and this icon does not appear.
Does anyone know if it's even a new SF Symbol? Or does anyone know of a way to use this icon?
I know there are several existing threads on this topic but things keep changing.
The release notes for Xcode 26 beta 3 have the following statements for a couple of resolved issues:
Asset Catalog
Fixed: Unable to set Icon Composer icon as alternate iOS icon (153305178) (FB18025356)
Icon Composer
Fixed: Icon Composer icons back deploy to older versions of iOS, macOS, and watchOS with inconsistent rendering. (152258860)
I had a working solution under beta 1 and beta 2 for both of these. But under beta 3, I am now seeing the new glass icons for my app when running on a simulated iOS 18 device. This is happening for both the main app icon and any alternates. This contradicts the statement that beta 3 fixes this issue.
There is no documentation (that I can find) describing how you are supposed to support old icons for iOS 18 and new glass icons for iOS 26. There is no documentation for how to support alternate glass icons for iOS 26.
What I'm doing at the moment (and what worked before beta 3) is to keep the normal iOS 18 app icons in the asset catalog and add the new glass .icon files to the project, with the glass .icon filenames matching the names of the app icons in the asset catalog. This worked under beta 1 and beta 2. And despite the Xcode 26 beta 3 release notes stating that Icon Composer icons no longer back deploy to iOS 18, I'm seeing the opposite: under beta 3 the glass icons do back deploy.
Does anyone have a working solution that supports old iOS 18 app icons and new iOS 26 glass icons using Xcode 26 beta 3?
Note, all of my testing is with simulated iOS devices and I'm running Xcode 26 beta 3 under macOS 15.5. Maybe that's an issue?
Before I file a bug report I wanted to verify that I'm not missing something.
If I set up a view controller inside a navigation controller and add a subview constrained to the view controller's view's layoutMarginsGuide (leadingAnchor or trailingAnchor), then in several cases under iOS 26 the subview does not line up with buttons added to the navigation bar. Under iOS 18 everything lines up as expected.
To demonstrate, create a new iOS project based on Swift/Storyboard. Set up the storyboard to show a UINavigationController with one UIViewController. Then in ViewController.swift (the one embedded in the navigation controller), use the following code:
import UIKit

class ViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        view.backgroundColor = .yellow
        title = "Layout Margins"

        let leftCancel = UIBarButtonItem(systemItem: .cancel)
        navigationItem.leftBarButtonItem = leftCancel
        let rightCancel = UIBarButtonItem(systemItem: .cancel)
        navigationItem.rightBarButtonItem = rightCancel

        let leftView = UIView()
        leftView.backgroundColor = .blue
        leftView.translatesAutoresizingMaskIntoConstraints = false
        self.view.addSubview(leftView)

        let rightView = UIView()
        rightView.backgroundColor = .red
        rightView.translatesAutoresizingMaskIntoConstraints = false
        self.view.addSubview(rightView)

        NSLayoutConstraint.activate([
            leftView.widthAnchor.constraint(equalToConstant: 80),
            leftView.heightAnchor.constraint(equalToConstant: 80),
            leftView.leadingAnchor.constraint(equalTo: self.view.layoutMarginsGuide.leadingAnchor),
            leftView.topAnchor.constraint(equalTo: self.view.layoutMarginsGuide.topAnchor),
            rightView.widthAnchor.constraint(equalToConstant: 80),
            rightView.heightAnchor.constraint(equalToConstant: 80),
            rightView.trailingAnchor.constraint(equalTo: self.view.layoutMarginsGuide.trailingAnchor),
            rightView.topAnchor.constraint(equalTo: self.view.layoutMarginsGuide.topAnchor),
        ])
    }
}
This adds a "Cancel" button to both ends of the navigation bar and it adds two little square views lined up with the leading and trailing layout margins.
Here are the results:
iPad running iPadOS 26 beta 3 (note the misalignment). This is really jarring when trying to align another glass button below the cancel button:
iPad running iPadOS 18.5 (aligned just fine):
iPhone in portrait running iOS 26 beta (aligned just fine):
iPhone in landscape running iOS 26 beta (no alignment at all):
iPhone in portrait running iOS 18.5 (aligned just fine):
iPhone in landscape running iOS 18.5 (aligned just fine):
Under iOS 26 on an iPhone (simulator at least) in portrait, the cancel buttons line up with the colored squares. That's good. In landscape, the colored squares have much larger margins as expected (due to the larger safe areas caused by the notch), but the cancel buttons in the navigation bar are not using the same margins. This one is debatable. Under iOS 18 the cancel buttons use larger margins to match the larger safe area. But I can see why under iOS 26 they changed this since the navigation bar doesn't interfere with the notch. But it's inconsistent.
Under iOS 26 on an iPad (simulator at least), it's wrong in any orientation. Despite the lack of any notch or need for a larger safe area, the colored squares are indented just a bit more than the buttons in the navigation bar. I see no reason for this. Under iOS 18 everything lines up as expected.
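For the bug report, it may also help to capture the discrepancy numerically rather than only with screenshots. Here's a minimal sketch, assuming it's added to the same ViewController; whether the bar button placement actually derives from the navigation bar's layoutMargins is my assumption, so the log is only for comparison:
override func viewDidLayoutSubviews() {
    super.viewDidLayoutSubviews()
    // Log the resolved margins and safe area so the iPad mismatch can be quantified.
    let viewMargins = view.layoutMargins
    let barMargins = navigationController?.navigationBar.layoutMargins
    print("view margins: \(viewMargins)")
    print("nav bar margins: \(String(describing: barMargins))")
    print("safe area: \(view.safeAreaInsets)")
}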
My real question at this point: are the mismatched margins on an iPad under iOS 26, between the buttons in the navigation bar and other views added to the view controller, a likely bug, or am I missing something?
I'm creating a UITabBarController with a simple array of UITab instances and setting its mode to .tabSidebar. How do I prevent editing? I don't want the Edit button to appear at all. I've tried setting the tab controller's customizableViewControllers property to both nil and an empty array, but neither prevents the Edit button from appearing. I scanned the various delegates and I don't see any relevant methods.
So far I’ve tested this on a simulated iPad running iPadOS 26 using Xcode 26 RC.
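For reference, here is a minimal sketch of the setup described above (the tab titles, images, and identifiers are placeholders I made up). Even with customizableViewControllers cleared both ways, the Edit button still shows up in the sidebar:
import UIKit

// Sketch only: builds a sidebar-style tab bar controller from plain UITabs.
func makeTabController() -> UITabBarController {
    let tabs = ["First", "Second", "Third"].map { title in
        UITab(title: title, image: UIImage(systemName: "folder"), identifier: title) { _ in
            UINavigationController(rootViewController: UIViewController())
        }
    }
    let tabBarController = UITabBarController(tabs: tabs)
    tabBarController.mode = .tabSidebar

    // Neither of these prevents the Edit button from appearing:
    tabBarController.customizableViewControllers = nil
    // tabBarController.customizableViewControllers = []

    return tabBarController
}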
The UIResponderStandardEditActions protocol includes pasteAndMatchStyle:. UITextView conforms to UIResponderStandardEditActions. But I can't find a way to get that menu item to appear; I only get the standard "Paste" item. I've tried overriding pasteAndMatchStyle: in a subclass of UITextView. I've overridden canPerformAction:withSender:, but it never gets called with the pasteAndMatchStyle: selector. I've implemented the textView:editMenuForTextInRange:suggestedActions: delegate method; it's called, but the suggested actions do not include a "Paste and Match Style" action (or key command).
I came up with an ugly hack that involved overriding buildMenuWithBuilder: and adding my own key command after the paste command. But this shouldn't be necessary considering it's supposed to be a standard edit action.
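For what it's worth, here is roughly what that hack looks like. This is a sketch, not a recommendation; the subclass name, the menu identifier (.standardEdit), and the Option-Shift-Command-V shortcut are my assumptions about where and how the command should appear:
import UIKit

// Ugly workaround sketch: manually add a "Paste and Match Style" command to the
// edit menu from a UITextView subclass.
class MatchStyleTextView: UITextView {
    override func buildMenu(with builder: UIMenuBuilder) {
        super.buildMenu(with: builder)

        let command = UIKeyCommand(
            title: "Paste and Match Style",
            action: #selector(UIResponderStandardEditActions.pasteAndMatchStyle(_:)),
            input: "V",
            modifierFlags: [.command, .option, .shift]
        )
        // Place the command right after the standard Cut/Copy/Paste group.
        builder.insertSibling(UIMenu(options: .displayInline, children: [command]),
                              afterMenu: .standardEdit)
    }
}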
So what's the trick to make the "Paste and Match Style" edit menu appear properly in a UITextView? I'm testing with iOS 17, 18, and 26.
My UIKit/Mac Catalyst app supports a user opening multiple windows (multiple scenes). One of these is a special scene that shows content that I want to appear in front of all other app windows/scenes, even while the user is interacting with one of the app's other scenes. I do not need this special scene to stay in front of the windows of other apps, just in front of the windows of my own app.
While I'm not 100% sure, it seems that AppKit supports this through the NSWindow level property. I can't find any equivalent feature in UIKit/Mac Catalyst. UIWindow windowLevel is not the same thing since that only affects the order of windows within a given scene. I need an entire scene (and its windows) to stay in front of my app's other scenes (and their windows).
I don't see anything relevant in UIWindow, UIScene, UIWindowScene, UISceneSession, UIScene.ActivationRequestOptions, or UIWindowScene.ActivationRequestOptions.
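For comparison, this is the AppKit behavior I mean; a one-line sketch, AppKit only, and not something I can reach from a Catalyst UIWindow as far as I can tell:
import AppKit

// AppKit: keeps a window in front of the app's normal windows.
func keepInFront(_ window: NSWindow) {
    window.level = .floating
}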
Under iPadOS 26.0 and 26.1, if a view controller is presented with a presentation style of fullScreen or pageSheet, and the view controller is set up with a UISearchController that has obscuresBackgroundDuringPresentation set to true, then cancelling the search also dismisses the view controller when it should not.
To replicate, create a new iOS project based on Swift/Storyboard using Xcode 26.0 or Xcode 26.1. Update ViewController.swift with the following code:
import UIKit

class ViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        view.backgroundColor = .systemBackground
        title = "Root"

        navigationItem.rightBarButtonItems = [
            UIBarButtonItem(title: "Full", primaryAction: .init(handler: { _ in
                self.showModal(with: .fullScreen)
            })),
            UIBarButtonItem(title: "Page", primaryAction: .init(handler: { _ in
                self.showModal(with: .pageSheet)
            })),
            UIBarButtonItem(title: "Form", primaryAction: .init(handler: { _ in
                self.showModal(with: .formSheet)
            })),
        ]
    }

    private func showModal(with style: UIModalPresentationStyle) {
        let vc = ModalViewController()
        let nc = UINavigationController(rootViewController: vc)
        // This triggers the double dismiss bug when set to either pageSheet or fullScreen.
        // If set to formSheet then it works fine.
        // Bug is only on iPad with iPadOS 26.0 or 26.1 beta 2.
        // Works fine on iPhone (any iOS) and iPadOS 18 as well as macOS 26.0 (not tested with other versions of macOS).
        nc.modalPresentationStyle = style
        self.present(nc, animated: true)
    }
}
Then add a new file named ModalViewController.swift with the following code:
import UIKit

class ModalViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        title = "Modal"
        view.backgroundColor = .systemBackground
        setupSearch()
    }

    private func setupSearch() {
        let sc = UISearchController(searchResultsController: UIViewController())
        sc.delegate = self // Just for debugging - being set or not does not affect the bug
        sc.obscuresBackgroundDuringPresentation = true // Critical to reproducing the bug
        navigationItem.searchController = sc
        navigationItem.preferredSearchBarPlacement = .stacked
    }

    // When the search is cancelled by tapping on the grayed out area below the search bar,
    // this is called twice when it should only be called once. This happens only if the
    // view controller is presented with a fullScreen or pageSheet presentation style.
    // The end result is that the first call properly dismisses the search controller.
    // The second call results in this view controller being dismissed when it should not be.
    override func dismiss(animated flag: Bool, completion: (() -> Void)? = nil) {
        print("dismiss ViewController")
        // Set breakpoint on the following line
        super.dismiss(animated: flag, completion: completion)
    }
}

extension ModalViewController: UISearchControllerDelegate {
    func willDismissSearchController(_ searchController: UISearchController) {
        print("willDismissSearchController")
    }

    func didDismissSearchController(_ searchController: UISearchController) {
        print("didDismissSearchController")
    }
}
Build and run the app on a simulated or real iPad running iPadOS 26.0 or 26.1 (beta 2). A root window appears with 3 buttons in the navbar. Each button displays the same view controller but with a different modalPresentationStyle.
Tap the Form button. This displays a modal view controller with formSheet style. Tap on the search field. Then tap on the grayed out area of the view controller to cancel the search. This all works just fine. Dismiss the modal (drag it down).
Now tap either the Page or Full button. These display the same modal view controller with pageSheet or fullScreen style respectively. Tap on the search field. Then tap on the grayed out area of the view controller to cancel the search. This time, not only is the search cancelled, but the view controller is also dismissed. This is because the view controller’s dismiss(animated:completion:) method is being called twice.
See ViewController.swift for the code that presents the modal. See ModalViewController.swift for the code that sets up the search controller. Both contain lots of comments.
Besides the use of the fullScreen or pageSheet presentation style, reproducing the bug also requires the search controller's obscuresBackgroundDuringPresentation property to be set to true. It's the tap on that obscured background to cancel the search that results in the double call to dismiss. With the breakpoint set in the overridden dismiss(animated:completion:) function, you can see the two stack traces that lead to the call to dismiss. When presented as a formSheet, the second call to dismiss is not made.
This issue does not affect iPadOS 18 or any version of iOS on iPhones. Nor does it affect the app when built with Mac Catalyst on macOS 26.0 (untested with macOS 15 or 26.1).
In short, it is expected that cancelling the search in a presented view controller should not also result in the view controller being dismissed.
Tested with Xcode 26.1 beta 2 and Xcode 26.0. Tested with iPadOS 26.1 beta 2 (real and simulated) and iPadOS 26.0 (simulated).
A version of this post was submitted as FB20569327
Just watched the Targeting Content with Multiple Windows video. My app uses UIApplicationShortcutItem and UIApplication.shortcutItems, so I thought it would be useful to set the targetContentIdentifier property of each shortcut item so that when the user selects one of the shortcuts, they can be brought to the most appropriate scene in the app. But UIApplicationShortcutItem's targetContentIdentifier is a read-only property. Is this a mistake?
The documentation for UISceneActivationConditions states (emphasis mine):
Many different objects contain a targetContentIdentifier property, including NSUserActivity, UNNotificationContent, and UIApplicationShortcutItem. When creating those objects, fill that property with a value that uniquely describes the event and matches your scenes' predicates.
Kind of hard to do when the property is read-only. Am I missing something, or is this an API bug?
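For context, here is a sketch of what I'm trying to do. The scene-side predicates use documented UISceneActivationConditions properties; the shortcut-side assignment through UIMutableApplicationShortcutItem is my guess at a workaround and is exactly the part I'm unsure about. All identifier strings are made up for illustration:
import UIKit

// Scene side: declare which target content identifiers this scene handles.
func configureActivation(for scene: UIWindowScene) {
    let identifier = "com.example.detail-scene"
    scene.activationConditions.canActivateForTargetContentIdentifierPredicate =
        NSPredicate(format: "self == %@", identifier)
    scene.activationConditions.prefersToActivateForTargetContentIdentifierPredicate =
        NSPredicate(format: "self == %@", identifier)
}

// Shortcut side: UIApplicationShortcutItem's targetContentIdentifier is read-only,
// so the best I can see is the mutable subclass (unverified that this is the intent).
func makeShortcutItem() -> UIApplicationShortcutItem {
    let item = UIMutableApplicationShortcutItem(type: "com.example.open-detail",
                                                localizedTitle: "Open Detail")
    item.targetContentIdentifier = "com.example.detail-scene"
    return item
}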
As best as I can tell, the only messages sent to a MEMessageActionHandler for processing are those going to the Inbox. I'm trying to write a Mail extension that processes incoming messages, most importantly those that go straight to the Junk folder, but none of those messages go to the extension.
Is there some setting I may be missing or can a MEMessageActionHandler only process emails going to the Inbox?
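For reference, here is a stripped-down version of the kind of handler I mean (the class name and decision logic are placeholders). In my testing it is only ever consulted for messages delivered to the Inbox:
import MailKit

// Message action handler: Mail asks for a decision about incoming messages.
// Messages routed straight to the Junk folder never seem to reach this method.
class MessageActionHandler: NSObject, MEMessageActionHandler {
    func decideAction(for message: MEMessage,
                      completionHandler: @escaping (MEMessageActionDecision?) -> Void) {
        // Placeholder: mark everything that arrives here as read.
        completionHandler(.action(.markAsRead))
    }
}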
I have an Intel Mac running macOS Monterey 12.6. I'm running Xcode 14.2.
I created a new iOS project. I deleted the "Mac (Designed for iPad)" destination and added the "Mac (Mac Catalyst)" destination.
Oddly, unlike the iOS minimum deployment target, there is no obvious place to set the macOS minimum deployment target; I had to go to the target's Build Settings and change it there.
I then chose "My Mac (Mac Catalyst)" as the build target and built the project. So far, so good.
I then tried to run the app. I get a dialog with the following message:
Could not launch “SampleApp”
The app is incompatible with the current version of macOS. Please check the app's deployment target.
Clicking on the Details button gives lots of info. Some things I see are:
...
"device_osBuild" = "12.6 (21G115)";
...
"sdk_canonicalName" = "macosx13.1";
"sdk_osVersion" = "13.1";
But I'm not running macOS 13.1. I have macOS 12.6. Oddly there's no mention of the deployment target being set to 12.4 (which I set in the target's Build Settings).
If I go to Xcode -> Preferences -> Platforms, I see "macOS 13.1" listed as built-in. Clicking the + button offers no option for macOS.
So my question is how do I support running a Mac Catalyst app on an Intel Mac running macOS 12.6 while using Xcode 14.2?
Can it be done? What steps do I need to take?
I've reloaded the project. I've restarted Xcode. I've done a clean build. Nothing changes.
I have a brand new macOS app (built with Mac Catalyst and based on a long-existing iOS app) that I've submitted to App Store Connect for review. It was rejected because my Contacts purpose string was not deemed sufficient (despite being the same one the iOS version of the app has used for years).
Anyway, I made a change to the privacy string and submitted a new build. The new build was rejected, and the rejection screenshot showed the About screen with the new build number but the privacy string from the original build, not the new one. I verified that my archive did in fact have an Info.plist with the updated privacy string, so I resubmitted that build for review, and it was rejected again for the same reason, despite the reviewer confirming, at my request, that a fresh install of the latest build was used.
So I changed the privacy string again and submitted a 3rd new build, again verifying the archive that I was sending through the Xcode Organizer did have the updated (now 3rd) privacy string.
And again the app has been rejected. Despite 2 new builds, the reviewers are still seeing the original privacy string.
Does anyone have any ideas on how to get this resolved?
Is there any way for a 3rd party macOS app to receive some sort of notification for a change to the Text Size accessibility setting in the Settings app? I have not been able to find any API for this. Several Apple apps (Mail, Notes, and others) update text size based on the setting. I'd like to do the same in my own macOS app.
When using a UISearchController set up with a UISearchBar that has scope buttons, with the search controller's scopeBarActivation property set to .onSearchActivation and the navigation item's preferredSearchBarPlacement property set to .integrated or .integratedButton (so the search bar/button appears in the navigation bar), the scope buttons never appear. But space is made for where they should appear.
Some relevant code in a UIViewController shown as the root view controller of a UINavigationController:
private func setupSearch() {
    let sc = UISearchController(searchResultsController: UIViewController())
    sc.delegate = self
    sc.obscuresBackgroundDuringPresentation = true

    // Setup search bar with scope buttons
    let bar = sc.searchBar
    bar.scopeButtonTitles = [ "One", "Two", "Three", "Four" ]
    bar.selectedScopeButtonIndex = 0
    bar.delegate = self

    // Apply the search controller to the nav bar
    navigationItem.searchController = sc

    // BUG - Under iOS/iPadOS 26 RC, using .onSearchActivation results in the scope buttons never appearing at all
    // when using integrated placement in the nav bar.
    // Ensure the scope buttons appear immediately upon activating the search controller
    sc.scopeBarActivation = .onSearchActivation
    // This works but doesn't show the scope buttons until the user starts typing - that's too late for my needs
    //sc.scopeBarActivation = .automatic

    if #available(iOS 26.0, *) {
        // Under iOS 26 put the search icon in the nav bar - same issue for .integrated and .integratedButton
        navigationItem.preferredSearchBarPlacement = .integrated // .integratedButton
        // My toolbar is full so I need the search in the navigation bar
        navigationItem.searchBarPlacementAllowsToolbarIntegration = false // Ensure it's in the nav bar
    } else {
        // Under iOS 18 put the search bar in the nav bar below the title
        navigationItem.preferredSearchBarPlacement = .stacked
    }
}
I need the search bar in the navigation bar since the toolbar is full. And I need the scope buttons to appear immediately upon search activation.
This problem happens on any real or simulated iPhone or iPad running iOS/iPadOS 26 RC.
For the stock Camera app there is a Grid setting. Is there any way to get a UIImagePickerController with a source type of Camera to honor that Grid setting? Or would this need to be an enhancement request for Apple?