Post · Replies · Boosts · Views · Activity

[SwiftUI][DragDrop][iPadOS] Drop into TabView Sidebar Tab not triggering. How to debug?
Are there tools to inspect why a drop is not triggering during drag and drop in a SwiftUI app? I've declared .draggable on the dragged view and .dropDestination on the receiving TabContent Tab. This combination of modifiers works in a smaller demo app I have, but not in my more complex one. Is there a way to debug this in SwiftUI? I'd like to see whether the drag-and-drop pasteboard actually contains what I think it should. Notably, "TabContent" has a far more restricted list of modifiers that can be used on it.
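The setup is roughly the following (a trimmed-down sketch with a plain String payload and placeholder tab names; the real app uses its own Transferable type, so treat the exact names here as illustrative only):

import SwiftUI

// Minimal sketch of the combination described above. Assumptions: a String
// payload (String is Transferable), made-up tab names, iPadOS 18+.
struct LibraryRootView: View {
    let photoIDs = ["sunset", "beach", "forest"]

    var body: some View {
        TabView {
            Tab("Library", systemImage: "photo.on.rectangle") {
                List(photoIDs, id: \.self) { id in
                    Text(id)
                        .draggable(id) // drag source on the row's view
                }
            }

            Tab("Favorites", systemImage: "star") {
                Text("Favorites")
            }
            // Drop target declared on the Tab (TabContent) itself, not on a View,
            // so the sidebar tab can receive the drag.
            .dropDestination(for: String.self) { droppedIDs in
                print("Dropped onto Favorites tab:", droppedIDs)
            }
        }
        .tabViewStyle(.sidebarAdaptable) // sidebar presentation on iPad
    }
}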
Replies: 0 · Boosts: 0 · Views: 84 · Activity: 3w
Xcode Cloud 26b7 Metal Compilation Failure
I've been getting intermittent failures in Xcode Cloud when compiling my app for multiple platforms because it fails to compile a Metal shader: "The Metal Toolchain was not installed and could not compile the Metal source files. Download the Metal Toolchain from Xcode > Settings > Components and try again." Sometimes if I re-run the build, it works fine. Then I'll run it again, and it will fail. If you tell me to file a feedback, please tell me what information would be useful and actionable, because this is all I have.
Replies: 11 · Boosts: 7 · Views: 872 · Activity: Sep ’25
iOS26 - ITMS-90717: Invalid large app icon
Trying to upload an iOS 26 app archive with Xcode 26 beta 7 and getting ITMS-90717: Invalid large app icon, which means my app is not eligible for TestFlight testing. My app contains an Icon Composer .icon asset, so I'm not sure what it's complaining about. I'm not seeing anything in the release notes about this, and I'm not sure whether I'm doing something wrong or not.
Replies: 6 · Boosts: 1 · Views: 284 · Activity: Sep ’25
Unable to launch + Attach to Mac Widget (Sonoma + Xcode 15)
Attempting to launch a widget in Debug mode on Sonoma from Xcode 15 fails with the following message:

attach failed (Not allowed to attach to process. Look in the console messages (Console.app), near the debugserver entries, when the attach failed. The subsystem that denied the attach permission will likely have logged an informative message about why it was denied.)

Looking in Console I see this message:

macOSTaskPolicy: (com.apple.debugserver) may not get the task control port of (MacGalleryWidget) (pid: 1851): (MacGalleryWidget) is hardened, (MacGalleryWidget) doesn't have get-task-allow, (com.apple.debugserver) is a declared debugger, (com.apple.debugserver) is not a declared read-only debugger

What Xcode settings should I be looking at to rectify this? I suspect I may have something that's out of whack.
Replies: 1 · Boosts: 4 · Views: 1.2k · Activity: May ’25
[18.2b2] How do I test an OpenIntent?
So, I've declared an AppIntent that indicates my app can "Open files" that conform to UTType.Image. I've got an @AssistantEntity(schema: .files.file) and an @AssistantIntent(schema: .files.openFile) declared. So I navigate to the Files app, Quick Look an image, and open Type to Siri. I tell Siri "open this in " and all it does is act like "open ". No breakpoint is hit in my intent's perform method. Am I doing something wrong? How can I test these cross-app behaviors? Are they... not actually possible? Does an "OpenIntent" only work on my app's own URLs and not on file URLs from other apps?
Replies: 2 · Boosts: 1 · Views: 518 · Activity: Nov ’24
[iOS18] QLPreviewController - No more swipe to dismiss?
Have the requirements to support swipe to dismiss from a Quick Look view controller changed in iOS 18? I am noticing that my app no longer supports gestural dismissal in an iOS 18 build. Note: this is a QLPreviewController presented from a UIViewController that is embedded in a SwiftUI view hierarchy via UIViewControllerRepresentable.
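For reference, the presentation path looks roughly like this (a simplified sketch; the type names and file URL are placeholders rather than the app's actual code):

import SwiftUI
import UIKit
import QuickLook

// Simplified sketch of the setup described above: a UIViewController wrapped in
// a UIViewControllerRepresentable presents a QLPreviewController. Names and the
// file URL are placeholders.
struct PreviewHost: UIViewControllerRepresentable {
    let fileURL: URL

    func makeUIViewController(context: Context) -> PreviewHostViewController {
        PreviewHostViewController(fileURL: fileURL)
    }

    func updateUIViewController(_ uiViewController: PreviewHostViewController, context: Context) {}
}

final class PreviewHostViewController: UIViewController, QLPreviewControllerDataSource {
    let fileURL: URL
    private var didPresentPreview = false

    init(fileURL: URL) {
        self.fileURL = fileURL
        super.init(nibName: nil, bundle: nil)
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        guard !didPresentPreview else { return }
        didPresentPreview = true

        // Under iOS 17 this presentation could be dismissed with a downward swipe;
        // the same setup no longer allows gestural dismissal on iOS 18.
        let preview = QLPreviewController()
        preview.dataSource = self
        present(preview, animated: true)
    }

    // MARK: QLPreviewControllerDataSource

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }

    func previewController(_ controller: QLPreviewController, previewItemAt index: Int) -> QLPreviewItem {
        fileURL as NSURL // NSURL conforms to QLPreviewItem
    }
}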
Replies: 4 · Boosts: 1 · Views: 1.8k · Activity: Oct ’24
Page-Curl Shader -- Pixel transparency check is wrong?
Given that I do not understand much at all about how to write shaders, and I do not understand the math associated with page-curl effects, I am trying to implement a page-curl shader for use on SwiftUI views. I've lifted a shader from HIROKI IKEUCHI that I believe they lifted from a non-Metal shader resource online, and I'm trying to digest it.

One thing I want to do is to paint the "underside" of the view with a given color and maintain the transparency of rounded corners when they are flipped over. So, if an underside pixel is "clear", then I want to sample the pixel at that position on the original layer instead of the "curl effect" pixel. There are two comments in the shader below where I check the alpha and underside flags and paint the color red as a debug test.

The shader gives this result: the outside of those rounded corners is appropriately red, and the white border pixels are detected as "not clear". But the "inner" portion of the border is... mistakenly red? I don't get it. Any help would be appreciated. I feel tapped out and I don't have any IRL resources I can ask.

//
//  PageCurl.metal
//  ShaderDemo3
//
//  Created by HIROKI IKEUCHI on 2023/10/17.
//

#include <metal_stdlib>
#include <SwiftUI/SwiftUI_Metal.h>
using namespace metal;

#define pi float(3.14159265359)
#define blue half4(0.0, 0.0, 1.0, 1.0)
#define red half4(1.0, 0.0, 0.0, 1.0)
#define radius float(0.4)

// Returns the color for this pixel
[[ stitchable ]] half4 pageCurl
(
    float2 _position,
    SwiftUI::Layer layer,
    float4 bounds,
    float2 _clickedPoint,
    float2 _mouseCursor
) {
    half4 undersideColor = half4(0.5, 0.5, 1.0, 1.0);

    float2 originalPosition = _position;

    // Correct the y coordinate
    float2 position = float2(_position.x, bounds.w - _position.y);
    float2 clickedPoint = float2(_clickedPoint.x, bounds.w - _clickedPoint.y);
    float2 mouseCursor = float2(_mouseCursor.x, bounds.w - _mouseCursor.y);

    float aspect = bounds.z / bounds.w;

    float2 uv = position * float2(aspect, 1.) / bounds.zw;
    float2 mouse = mouseCursor.xy * float2(aspect, 1.) / bounds.zw;
    float2 mouseDir = normalize(abs(clickedPoint.xy) - mouseCursor.xy);
    float2 origin = clamp(mouse - mouseDir * mouse.x / mouseDir.x, 0., 1.);

    float mouseDist = clamp(length(mouse - origin)
                            + (aspect - (abs(clickedPoint.x) / bounds.z) * aspect) / mouseDir.x,
                            0., aspect / mouseDir.x);

    if (mouseDir.x < 0.) {
        mouseDist = distance(mouse, origin);
    }

    float proj = dot(uv - origin, mouseDir);
    float dist = proj - mouseDist;

    float2 linePoint = uv - dist * mouseDir;

    half4 pixel = layer.sample(position);

    if (dist > radius) {
        pixel = half4(0.0, 0.0, 0.0, 0.0); // background behind curling layer (note: 0.0 opacity)
        pixel.rgb *= pow(clamp(dist - radius, 0., 1.) * 1.5, .2);
    } else if (dist >= 0.0) {
        // THIS PORTION HANDLES THE CURL SHADED PORTION OF THE RESULT

        // map to cylinder point
        float theta = asin(dist / radius);
        float2 p2 = linePoint + mouseDir * (pi - theta) * radius;
        float2 p1 = linePoint + mouseDir * theta * radius;
        bool underside = (p2.x <= aspect && p2.y <= 1. && p2.x > 0. && p2.y > 0.);
        uv = underside ? p2 : p1;
        uv = float2(uv.x, 1.0 - uv.y); // invert y
        pixel = layer.sample(uv * float2(1. / aspect, 1.) * float2(bounds[2], bounds[3])); // ME <----

        if (underside && pixel.a == 0.0) { // <---- PIXEL.A IS 0.0 WHYYYYY
            pixel = red;
        }

        // Commented out while debugging alpha issues
        // if (underside && pixel.a == 0.0) {
        //     pixel = layer.sample(originalPosition);
        // } else if (underside) {
        //     pixel = undersideColor; // underside
        // }

        // Shadow the pixel being returned
        pixel.rgb *= pow(clamp((radius - dist) / radius, 0., 1.), .2);
    } else {
        // THIS PORTION HANDLES THE NON-CURL-SHADED PORTION OF THE SAMPLING.
        float2 p = linePoint + mouseDir * (abs(dist) + pi * radius);
        bool underside = (p.x <= aspect && p.y <= 1. && p.x > 0. && p.y > 0.);
        uv = underside ? p : uv;
        uv = float2(uv.x, 1.0 - uv.y); // invert y
        pixel = layer.sample(uv * float2(1. / aspect, 1.) * float2(bounds[2], bounds[3])); // ME

        if (underside && pixel.a == 0.0) { // <---- PIXEL.A IS 0.0 WHYYYYY
            pixel = red;
        }

        // Commented out while debugging alpha issues
        // if (underside && pixel.a == 0.0) {
        //     // If the new underside pixel is clear, we should sample the original image's pixel.
        //     pixel = layer.sample(originalPosition);
        // } else if (underside) {
        //     pixel = undersideColor;
        // }
    }

    return pixel;
}
Replies: 1 · Boosts: 0 · Views: 764 · Activity: Oct ’24
[NewbQs] Is this possible with AppIntentDomains?
As a user, when viewing a photo or image, I want to be able to tell Siri, "add this to ", similar to the example from the WWDC presentation where a photo is added to a note in the Notes app. Is this... possible with App Intent domains as they are documented? I see domains like open-file and open-photo, but I don't know whether those are appropriate for this kind of functionality.
Replies: 1 · Boosts: 0 · Views: 600 · Activity: Sep ’24