Page-Curl Shader -- Pixel transparency check is wrong?
I don't understand much about writing shaders, or the math behind page-curl effects, but I'm trying to implement a page-curl shader for use on SwiftUI views. I've lifted a shader from HIROKI IKEUCHI that I believe they in turn adapted from a non-Metal shader resource online, and I'm trying to digest it. One thing I want to do is paint the "underside" of the view with a given color while maintaining the transparency of rounded corners when they are flipped over. So, if an underside pixel is "clear," I want to sample the pixel at that position on the original layer instead of using the "curl effect" pixel.

There are two comments in the shader below where I check the alpha and underside flags and paint the pixel red as a debug test. The shader gives this result: the outside of the rounded corners is appropriately red, and the white border pixels are detected as "not clear." But the "inner" portion of the border is... mistakenly red? I don't get it. Any help would be appreciated; I feel tapped out and I don't have any IRL resources I can ask.

//
//  PageCurl.metal
//  ShaderDemo3
//
//  Created by HIROKI IKEUCHI on 2023/10/17.
//

#include <metal_stdlib>
#include <SwiftUI/SwiftUI_Metal.h>
using namespace metal;

#define pi float(3.14159265359)
#define blue half4(0.0, 0.0, 1.0, 1.0)
#define red half4(1.0, 0.0, 0.0, 1.0)
#define radius float(0.4)

// Returns the color of this pixel
[[ stitchable ]] half4 pageCurl(
    float2 _position,
    SwiftUI::Layer layer,
    float4 bounds,
    float2 _clickedPoint,
    float2 _mouseCursor
) {
    half4 undersideColor = half4(0.5, 0.5, 1.0, 1.0);

    float2 originalPosition = _position;

    // Correct the y coordinate
    float2 position = float2(_position.x, bounds.w - _position.y);
    float2 clickedPoint = float2(_clickedPoint.x, bounds.w - _clickedPoint.y);
    float2 mouseCursor = float2(_mouseCursor.x, bounds.w - _mouseCursor.y);

    float aspect = bounds.z / bounds.w;

    float2 uv = position * float2(aspect, 1.) / bounds.zw;
    float2 mouse = mouseCursor.xy * float2(aspect, 1.) / bounds.zw;
    float2 mouseDir = normalize(abs(clickedPoint.xy) - mouseCursor.xy);
    float2 origin = clamp(mouse - mouseDir * mouse.x / mouseDir.x, 0., 1.);

    float mouseDist = clamp(length(mouse - origin)
                            + (aspect - (abs(clickedPoint.x) / bounds.z) * aspect) / mouseDir.x,
                            0., aspect / mouseDir.x);

    if (mouseDir.x < 0.) {
        mouseDist = distance(mouse, origin);
    }

    float proj = dot(uv - origin, mouseDir);
    float dist = proj - mouseDist;

    float2 linePoint = uv - dist * mouseDir;

    half4 pixel = layer.sample(position);

    if (dist > radius) {
        pixel = half4(0.0, 0.0, 0.0, 0.0); // background behind curling layer (note: 0.0 opacity)
        pixel.rgb *= pow(clamp(dist - radius, 0., 1.) * 1.5, .2);
    } else if (dist >= 0.0) {
        // THIS PORTION HANDLES THE CURL SHADED PORTION OF THE RESULT

        // map to cylinder point
        float theta = asin(dist / radius);
        float2 p2 = linePoint + mouseDir * (pi - theta) * radius;
        float2 p1 = linePoint + mouseDir * theta * radius;
        bool underside = (p2.x <= aspect && p2.y <= 1. && p2.x > 0. && p2.y > 0.);
        uv = underside ? p2 : p1;
        uv = float2(uv.x, 1.0 - uv.y); // invert y
        pixel = layer.sample(uv * float2(1. / aspect, 1.) * float2(bounds[2], bounds[3])); // ME

        if (underside && pixel.a == 0.0) { // <---- PIXEL.A IS 0.0 WHYYYYY
            pixel = red;
        }

        // Commented out while debugging alpha issues
        // if (underside && pixel.a == 0.0) {
        //     pixel = layer.sample(originalPosition);
        // } else if (underside) {
        //     pixel = undersideColor; // underside
        // }

        // Shadow the pixel being returned
        pixel.rgb *= pow(clamp((radius - dist) / radius, 0., 1.), .2);
    } else {
        // THIS PORTION HANDLES THE NON-CURL-SHADED PORTION OF THE SAMPLING.
        float2 p = linePoint + mouseDir * (abs(dist) + pi * radius);
        bool underside = (p.x <= aspect && p.y <= 1. && p.x > 0. && p.y > 0.);
        uv = underside ? p : uv;
        uv = float2(uv.x, 1.0 - uv.y); // invert y
        pixel = layer.sample(uv * float2(1. / aspect, 1.) * float2(bounds[2], bounds[3])); // ME

        if (underside && pixel.a == 0.0) { // <---- PIXEL.A IS 0.0 WHYYYYY
            pixel = red;
        }

        // Commented out while debugging alpha issues
        // if (underside && pixel.a == 0.0) {
        //     // If the new underside pixel is clear, we should sample the original image's pixel.
        //     pixel = layer.sample(originalPosition);
        // } else if (underside) {
        //     pixel = undersideColor;
        // }
    }

    return pixel;
}
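For context, a shader with this parameter list would be attached from SwiftUI via layerEffect, roughly like the sketch below. CurledCard and its state names are hypothetical; position and layer are supplied automatically, and the remaining arguments map to bounds, _clickedPoint, and _mouseCursor. Worth noting while debugging alpha: maxSampleOffset caps how far from the destination pixel the layer may be sampled, and samples beyond it can come back clear.

import SwiftUI

struct CurledCard: View {
    @State private var clickedPoint: CGPoint = .zero
    @State private var dragLocation: CGPoint = .zero

    var body: some View {
        RoundedRectangle(cornerRadius: 24)
            .fill(.white)
            .frame(width: 300, height: 400)
            .layerEffect(
                ShaderLibrary.pageCurl(
                    .boundingRect,          // bounds
                    .float2(clickedPoint),  // _clickedPoint
                    .float2(dragLocation)   // _mouseCursor
                ),
                // Samples farther than this from the destination pixel may
                // return clear, so be generous while diagnosing alpha issues.
                maxSampleOffset: CGSize(width: 400, height: 400)
            )
            .gesture(
                DragGesture(minimumDistance: 0)
                    .onChanged { value in
                        if clickedPoint == .zero { clickedPoint = value.startLocation }
                        dragLocation = value.location
                    }
            )
    }
}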
Replies: 1 · Boosts: 0 · Views: 734 · Activity: Oct ’24
[Getting Started] Selecting default navLink on startup.
So, I've got a SwiftUI app with a triple-column split view. When the app starts on an 11-inch iPad, the primary column is offscreen. The primary column has a List full of NavigationLinks, like so:

List {
    ForEach(items, id: \.itemID) { item in
        NavigationLink(tag: item, selection: $selectedItem) {
            ItemDetailsView(item: item)
            ...

Now, the selection of the first column in the split view cascades through the rest of the app, so populating it is pretty important. I've tried having selectedItem be set from an EnvironmentObject. I've also tried setting it in onAppear. Everything I try only causes the selection to "pop into place" whenever I expose the primary column of the sidebar. Am I going about this the wrong way? Is it because the sidebar is hidden by default?
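For concreteness, the onAppear variant described above would look roughly like this sketch. Item, its title property, and SidebarList are hypothetical names; tag:selection: requires the tag type to be Hashable.

import SwiftUI

struct SidebarList: View {
    let items: [Item]
    @State private var selectedItem: Item?

    var body: some View {
        List {
            ForEach(items, id: \.itemID) { item in
                NavigationLink(tag: item, selection: $selectedItem) {
                    ItemDetailsView(item: item)
                } label: {
                    Text(item.title)
                }
            }
        }
        .onAppear {
            // Seed the cascading selection with a default once the list
            // is on screen.
            if selectedItem == nil {
                selectedItem = items.first
            }
        }
    }
}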
Replies: 1 · Boosts: 0 · Views: 824 · Activity: Jul ’21
[iOS15] Fruta Doesn't Launch With Populated State
Asking with the WWDC-10220 tag because Fruta is the sample code for this presentation. When launching Fruta on a landscape 11-inch iPad Pro, the "State" of the application is completely empty until the user taps the back button. After tapping back, everything appears to pop into place. Is this expected behavior in SwiftUI when using split screens? NavigationLinks are finicky, and I'm expecting the programmatic setting of the primary column's "selection" to be resolved at launch, not when the user taps back.
Replies: 0 · Boosts: 0 · Views: 893 · Activity: Jul ’21
File type limitations of sendResource?
Hello, I'm noticing odd behavior when I try to send a package file as a resource to a peer. A file like this is basically a folder with an extension, and despite receiving a progress object from the send call, I'm not seeing it rise past 0.0%. Attaching it to a ProgressView also does not show any progress. The completion handler of the send is never called with an error, and the receiver only gets the "finished receiving" callback if the host cancels or disconnects. I didn't see anything in the sendResource documentation about bundle files not being supported, but it wouldn't surprise me. Any thoughts?
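For reference, the send-side pattern in question is sketched below; session, peer, and packageURL are assumed, with packageURL pointing at the package directory.

import MultipeerConnectivity

// Send the package as a resource and keep the returned Progress.
let progress = session.sendResource(
    at: packageURL,
    withName: packageURL.lastPathComponent,
    toPeer: peer
) { error in
    // For a regular flat file this fires on completion or failure;
    // for the package described above it reportedly never fires.
    if let error {
        print("sendResource failed:", error)
    }
}

// Observe progress; for a package this reportedly stays at 0%.
let observation = progress?.observe(\.fractionCompleted, options: [.new]) { p, _ in
    print(String(format: "sent %.1f%%", p.fractionCompleted * 100))
}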
Replies: 1 · Boosts: 0 · Views: 899 · Activity: Feb ’22
Create a World Anchor from a Spatial Tap Gesture?
At 19:56 in the video, it's mentioned that we can use a SpatialTapGesture to "identify a position in the world" to make a world anchor. Which API calls are used to make this happen? World anchors are created with 4x4 matrices, and a SpatialTapGesture doesn't seem to generate one of those. Any ideas?
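Not confirmed from the session, but one plausible bridge between the two APIs is sketched below: convert the tap's 3D location to scene space, then wrap it in a translation-only 4x4 matrix for the WorldAnchor initializer. worldTracking is an assumed, already-running WorldTrackingProvider.

import ARKit
import RealityKit
import SwiftUI

SpatialTapGesture()
    .targetedToAnyEntity()
    .onEnded { value in
        // Tap location in scene space.
        let position: SIMD3<Float> = value.convert(
            value.location3D, from: .local, to: .scene)

        // Build a 4x4 matrix that carries only the translation.
        var transform = matrix_identity_float4x4
        transform.columns.3 = SIMD4<Float>(position, 1)

        let anchor = WorldAnchor(originFromAnchorTransform: transform)
        Task { try? await worldTracking.addAnchor(anchor) }
    }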
Replies: 0 · Boosts: 0 · Views: 828 · Activity: Aug ’23
RayCasting to Surface Returns Inconsistent Results?
On Xcode 15.1.0b2, when raycasting to a collision surface, the collisions tend to be inconsistent. Here are my results: green cylinders are hits, and red cylinders are raycasts that returned no collision results. NOTE: This raycast is triggered by a tap gesture recognizer registering on the cube... so it's weird to me that the tap would work but the raycast not collide with anything. Is this something that just performs poorly in the simulator?

My raycasting command is:

guard let pose = self.arSessionController.worldTracking.queryDeviceAnchor(atTimestamp: CACurrentMediaTime()) else {
    print("FAILED TO GET POSITION")
    return
}
let transform = Transform(matrix: pose.originFromAnchorTransform)
let locationOfDevice = transform.translation
let raycastResult = scene.raycast(from: locationOfDevice, to: destination, relativeTo: nil)

where destination is retrieved in a tap gesture handler via:

let worldPosition: SIMD3<Float> = value.convert(value.location3D, from: .local, to: .scene)

Any findings would be appreciated.
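For what it's worth, here is a sketch of the two snippets joined into a single tap handler. arSessionController and its worldTracking provider are assumed from the original code; query: .nearest is an illustrative narrowing of the default .all.

import RealityKit
import SwiftUI

SpatialTapGesture()
    .targetedToAnyEntity()
    .onEnded { value in
        // Tap location in scene space, used as the ray's endpoint.
        let destination: SIMD3<Float> = value.convert(
            value.location3D, from: .local, to: .scene)

        // Device pose supplies the ray's starting point.
        guard let pose = arSessionController.worldTracking
            .queryDeviceAnchor(atTimestamp: CACurrentMediaTime()) else {
            print("FAILED TO GET POSITION")
            return
        }
        let locationOfDevice = Transform(matrix: pose.originFromAnchorTransform).translation

        // .nearest returns only the closest hit; the original call
        // relies on the default, which is .all.
        let hits = value.entity.scene?.raycast(
            from: locationOfDevice,
            to: destination,
            query: .nearest,
            relativeTo: nil) ?? []
        print(hits.isEmpty ? "no collision" : "hits: \(hits.count)")
    }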
Replies: 2 · Boosts: 0 · Views: 733 · Activity: Nov ’23
How does an indirect drag gesture work?
Hello, I've got a few questions about drag gestures in immersive scenes on visionOS. Once a user initiates a drag gesture, are their eyes involved in the gesture anymore? If not, and the user is dragging something farther away, how far can they move it using indirect gestures? I assume the user's range of motion is limited because their hands are in their lap, so could they move something multiple meters along a distant wall? How can the user cancel the gesture if they don't like the anticipated/telegraphed result? I'm trying to craft a good experience, and it's difficult without some of these details. I have still not heard back on my devkit application. Thank you for any help.
Replies: 2 · Boosts: 0 · Views: 729 · Activity: Dec ’23