I've been getting intermittent failures in Xcode when compiling my app for multiple platforms, because it fails to compile a Metal shader:
The Metal Toolchain was not installed and could not compile the Metal source files. Download the Metal Toolchain from Xcode > Settings > Components and try again.
Sometimes if I re-run it, it works fine. Then I'll run it again, and it will fail.
If you tell me to file a feedback, please tell me what information would be useful and actionable, because this is all I have.
For my first build, my Package.resolved was not committed to the repository. I've fixed that, and if I check my main branch on GitHub I can see the Package.resolved file in the xcshareddata directory.
Even so, Xcode Cloud is telling me that the file is missing and is failing to start my builds.
Could there be a caching issue going on?
My .gitignore file is empty.
Is there anything new this year to support reordering OutlineGroup items, or items across sections in a multi-section list?
I really want to write my sidebar in SwiftUI, but user-driven ordering is a must for me.
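For context, here's the per-section pattern I'm limited to today (a minimal sketch; the list contents are made up). As far as I can tell, .onMove only reorders within a single ForEach, so the user can never drag an item from one section into another:

import SwiftUI

// Sketch of where I'm stuck: .onMove reorders within one section's ForEach,
// but there's no obvious way to move an item between sections.
struct SidebarSketch: View {
    @State private var favorites = ["Inbox", "Today"]
    @State private var projects = ["Alpha", "Beta"]

    var body: some View {
        List {
            Section("Favorites") {
                ForEach(favorites, id: \.self) { Text($0) }
                    .onMove { favorites.move(fromOffsets: $0, toOffset: $1) }
            }
            Section("Projects") {
                ForEach(projects, id: \.self) { Text($0) }
                    .onMove { projects.move(fromOffsets: $0, toOffset: $1) }
            }
        }
    }
}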
Attempting to launch a widget in Debug mode on Sonoma from Xcode 15 is failing with the following message:
attach failed (Not allowed to attach to process. Look in the console messages (Console.app), near the debugserver entries, when the attach failed. The subsystem that denied the attach permission will likely have logged an informative message about why it was denied.)
Looking in console I see this message:
macOSTaskPolicy: (com.apple.debugserver) may not get the task control port of (MacGalleryWidget) (pid: 1851): (MacGalleryWidget) is hardened, (MacGalleryWidget) doesn't have get-task-allow, (com.apple.debugserver) is a declared debugger(com.apple.debugserver) is not a declared read-only debugger
What Xcode settings should I be looking at to rectify this? I suspect I may have something that's out of whack.
I'd like an Image subview of a lock screen widget to render as itself, and not with the multiply-like effect it gets today.
I've tried .widgetAccentable(true) and .widgetAccentable(false), but neither gives the appearance I'm looking for.
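For reference, here's roughly the view in question (a minimal sketch; the image name is made up):

import SwiftUI
import WidgetKit

// Sketch of the lock screen widget view; the image name is illustrative.
struct LockScreenImageView: View {
    var body: some View {
        Image("thumbnail")
            .resizable()
            .widgetAccentable(false) // also tried true; both keep the multiply-like effect
    }
}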
Is there maybe a new modifier that lets me "force" the rendering mode? Hoping there is and it's just not jumping out at me.
Thanks for your help.
My Xcode is only showing me the "Designed for iPad" Vision Pro archive destination, despite having the native Vision Pro destination chosen in my target settings.
Any thoughts on how to fix this?
Hello, I'm trying to accept drags from outside my app to create a new row in a list.
I've observed .onInsert not getting called in this scenario and I'm curious if it's 100% not possible, or if there's an obscure view modifier that I am missing.
Thank you.
import SwiftUI
import UniformTypeIdentifiers

struct ContentView: View {
    @State var data = ["One", "Two", "Three"]

    var body: some View {
        HStack {
            List {
                ForEach(data, id: \.self) { item in
                    Text(item)
                }
                .onMove { indices, newOffset in
                    data.move(fromOffsets: indices, toOffset: newOffset)
                }
                .onInsert(of: [UTType.plainText]) { index, items in
                    // WORKS: called for the in-app "DragMe" drag below.
                    data.insert("new", at: index)
                }
                .onInsert(of: [UTType.data]) { index, items in
                    // Never called, even for drags coming from outside the app.
                    data.insert("OUTSIDE", at: index)
                }
            }
            Text("DragMe")
                .onDrag {
                    NSItemProvider(item: "DragMe" as NSString,
                                   typeIdentifier: UTType.plainText.identifier)
                }
        }
    }
}
Are there selection capabilities built into the new container APIs?
I would like to ensure that I can spawn a context menu for multiple selected items in my custom-layout container.
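For comparison, List already supports this via a selection binding plus contextMenu(forSelectionType:); the sketch below (with a made-up Item type) shows the behavior I'm hoping the new container APIs can match:

import SwiftUI

// What I'm after, shown with List for comparison; my container uses a custom layout.
struct SelectableListSketch: View {
    struct Item: Identifiable, Hashable { let id = UUID(); let name: String }
    let items: [Item]
    @State private var selection = Set<Item.ID>()

    var body: some View {
        List(items, selection: $selection) { item in
            Text(item.name)
        }
        .contextMenu(forSelectionType: Item.ID.self) { ids in
            Button("Remove \(ids.count) Items") { /* removal logic */ }
        }
    }
}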
Have the requirements to support swipe-to-dismiss from a Quick Look view controller changed in iOS 18? I am noticing that my app no longer supports gestural dismissal in an iOS 18 build.
Note this is a QLPreviewController presented from a UIViewController that sits in a SwiftUI view hierarchy as part of a UIViewControllerRepresentable.
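A stripped-down sketch of that hierarchy (the type names and the present-on-appear trigger are illustrative, not my actual code):

import SwiftUI
import UIKit
import QuickLook

// SwiftUI wrapper hosting the UIViewController that presents the preview.
struct QuickLookHost: UIViewControllerRepresentable {
    let fileURL: URL

    func makeUIViewController(context: Context) -> UIViewController {
        HostViewController(fileURL: fileURL)
    }

    func updateUIViewController(_ uiViewController: UIViewController, context: Context) {}
}

final class HostViewController: UIViewController, QLPreviewControllerDataSource {
    let fileURL: URL

    init(fileURL: URL) {
        self.fileURL = fileURL
        super.init(nibName: nil, bundle: nil)
    }

    required init?(coder: NSCoder) { fatalError("not used in this sketch") }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        guard presentedViewController == nil else { return }
        let preview = QLPreviewController()
        preview.dataSource = self
        present(preview, animated: true) // swipe-to-dismiss worked here before iOS 18
    }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }

    func previewController(_ controller: QLPreviewController, previewItemAt index: Int) -> QLPreviewItem {
        fileURL as NSURL
    }
}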
So, I've declared an App Intent that indicates my app can "Open files" that conform to UTType.image.
I've got an @AssistantEntity(schema: .files.file) and an
@AssistantIntent(schema: .files.openFile) declared.
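Roughly, the intent side looks like this (a sketch assuming the usual OpenIntent shape; FileEntity is my assistant entity, with its body trimmed here):

import AppIntents

// Sketch of my openFile intent; FileEntity is the @AssistantEntity(schema: .files.file) type.
@AssistantIntent(schema: .files.openFile)
struct OpenFileIntent: OpenIntent {
    var target: FileEntity

    @MainActor
    func perform() async throws -> some IntentResult {
        // A breakpoint here is never hit when invoking via type-to-Siri.
        .result()
    }
}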
So I navigate to the Files app, Quick Look an image, and open type-to-Siri.
I tell Siri "open this in " and all it does is act like "open ". No breakpoint is hit in my intent's perform method.
Am I doing something wrong? How can I test these cross-app behaviors?
Are they... not actually possible? Does an "OpenIntent" only work on my app's own URLs and not on file URLs from other apps?
Trying to upload an iOS 26 app archive with Xcode 26 beta 7, I'm getting ITMS-90717: Invalid large app icon, meaning my app is not eligible for TestFlight testing.
My app contains an Icon Composer .icon asset, so I'm not sure what it's complaining about. I'm not seeing anything in the release notes about this, and I'm not sure whether I'm doing something wrong.
Hey All,

Been digging around the internet looking for this one, and while Stack Overflow has some relevant solutions, none are working for me.

My view hierarchy is the following:

View
---> UISplitViewController.view (set as a child view controller)
--------> rootViewController.view (set as the main view controller of the split view)
--------> detailViewController.view (set as the detail view controller of the split view)

Via the iPhone 6 simulator (where the split view is always collapsed) I present a modal view controller with the following code:

UINavigationController *navigationController = [[UINavigationController alloc] initWithRootViewController:viewController];
[navigationController.navigationBar setBarStyle:UIBarStyleBlack];
[navigationController setModalPresentationStyle:UIModalPresentationPopover];
navigationController.popoverPresentationController.sourceView = view;
navigationController.popoverPresentationController.barButtonItem = barButtonItem;
navigationController.popoverPresentationController.delegate = self;
[self presentViewController:navigationController animated:YES completion:nil];

I dismiss the presented controller from that view controller by calling:

[self dismissViewControllerAnimated:YES completion:nil];

If I set animated to NO I don't have any problems, but it looks bad and doesn't make sense. I see some posts regarding this and custom presentation methods, but I'm not using anything custom here. Any help is appreciated!

EDIT: On iPhone the modal presentation style should default to UIModalPresentationOverFullScreen, so I tried setting the presentation style directly to that, and it worked! If I set the presentation style to UIModalPresentationFullScreen I get the same behavior: a black screen after dismissing.
I want to create a feature where a user can stick images from my app onto their walls. I want to persist their placements between launches, and use pinch and pan gestures to manipulate the images.
I see lots of articles going back a few years that show how to do this in ARKit, but going through WWDC videos I’m seeing a trend toward RealityKit, and am starting to think that’s the “right” thing to learn.
Is RealityKit the most up-to-date secret sauce? Is there a sample project like this one, but using RealityKit?
https://developer.apple.com/documentation/arkit/environmental_analysis/placing_objects_and_handling_3d_interaction
Hello, I’ve noticed that when I set the image of a picture frame asset in Reality Composer it will change its size and aspect ratio to match the image. That’s pretty nice!
I would like to let a user dynamically modify that picture while the app is running. Is this possible? Or are the model's properties you set in the composer locked in when you export?
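To frame the question, this is roughly what I hoped to do at runtime (a sketch; `frameEntity` standing in for the picture frame's ModelEntity, and "photo" for a bundled image):

import RealityKit
import UIKit

// Sketch: swap the picture frame's material for a new texture at runtime.
func setPicture(on frameEntity: ModelEntity) throws {
    let texture = try TextureResource.load(named: "photo")
    var material = SimpleMaterial()
    material.color = .init(tint: .white, texture: .init(texture))
    frameEntity.model?.materials = [material]
}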
Background
So, I've got an anchor that I add to my session after performing a raycast from a user's tap.
This anchor is named "PictureAnchor".
This anchor is not getting saved in my scene's world map, and I'm not sure why.
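The anchor is created roughly like this (a sketch; sceneView stands in for my ARView, and the plane target and alignment are assumptions):

import ARKit
import RealityKit

// Sketch of how "PictureAnchor" gets added after a tap (tapLocation in view coordinates).
func addPictureAnchor(in sceneView: ARView, at tapLocation: CGPoint) {
    guard let result = sceneView.raycast(from: tapLocation,
                                         allowing: .estimatedPlane,
                                         alignment: .vertical).first else { return }
    let anchor = ARAnchor(name: "PictureAnchor", transform: result.worldTransform)
    sceneView.session.add(anchor: anchor)
}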
Information Gathering
I keep an eye on my session by outputting some information in
func session(_ session: ARSession, didUpdate frame: ARFrame)
As the ARFrames are processed, I look at the scene's anchors via sceneView.scene.anchors.filter({ $0.name == "PictureAnchor" }), and I see that my anchor is present in the scene's anchors.
However, when I check the anchors of the ARFrame itself via frame.anchors.filter, my PictureAnchor is never present.
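Concretely, the check looks like this (a sketch; sceneView is the app's ARView):

import ARKit
import RealityKit

// Delegate excerpt: the anchor shows up in the scene but never in the frame.
final class SessionObserver: NSObject, ARSessionDelegate {
    let sceneView: ARView
    init(sceneView: ARView) { self.sceneView = sceneView }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        let inScene = sceneView.scene.anchors.filter { $0.name == "PictureAnchor" }
        let inFrame = frame.anchors.filter { $0.name == "PictureAnchor" }
        // For me this consistently reports the anchor in the scene but not the frame.
        print("scene: \(inScene.count), frame: \(inFrame.count)")
    }
}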
Furthermore, if I "save" the worldMap, an Anchor named PictureAnchor is not present.
Note: I could be totally wrong on how to read the data inside a saved world map, but I'm taking the anchors array at face value.
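For completeness, this is how I'm inspecting the saved map (a sketch; session is the ARView's ARSession):

import ARKit

// Save-time inspection, taking map.anchors at face value as noted above.
func inspectWorldMap(of session: ARSession) {
    session.getCurrentWorldMap { map, error in
        guard let map else { return }
        let saved = map.anchors.filter { $0.name == "PictureAnchor" }
        print("PictureAnchor in saved world map: \(saved.count)") // always 0 for me
    }
}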
Other Information
I've noticed that the AR Persistence sample project actually checks for the anchor to be present in the ARFrame's anchors before permitting a save, but this condition is never happening for me.
I also noticed that my scene can have over 100 anchors, and the frame can have over 40, but only around 8 or 16 anchors are saved to the world map.
Main Question Restated
So, my main question is: why is my user-added "PictureAnchor" not present in the ARWorldMap when I save my scene's map?
I see that it's present in the scene's anchors, but not present in the ARFrame's anchors.
A model entity is visible in the scene after being attached to this anchor as well.