Reply to Metal and VisionOS
Currently it’s available only in single-app, VR / no-passthrough mode. Look for CompositorServices. If you want Metal with passthrough, rather than only going through RealityKit, that’s unavailable (unfortunately, in my opinion). Please send feedback requests with specific use cases if you want that.
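For anyone landing here, a minimal sketch of what that entry point looks like, assuming CompositorLayer’s default configuration is acceptable; startRenderLoop is a hypothetical stand-in for your own Metal render loop:

import SwiftUI
import CompositorServices

@main
struct MetalImmersiveApp: App {
    var body: some Scene {
        // A full space that hands you a LayerRenderer instead of a RealityKit view.
        ImmersiveSpace(id: "MetalSpace") {
            CompositorLayer { layerRenderer in
                startRenderLoop(layerRenderer)   // hypothetical helper, defined below
            }
        }
        .immersionStyle(selection: .constant(.full), in: .full)
    }
}

// Hypothetical placeholder: spin up a thread that owns your MTLDevice and command
// queue and pulls frames and drawables from the LayerRenderer.
func startRenderLoop(_ layerRenderer: LayerRenderer) {
    // ...
}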
Topic: App & System Services SubTopic: Core OS Tags:
Jul ’23
Reply to On vision pro, what is the max walkable distance in unbounded passthrough mode?
To rephrase: in AR mode you can walk around, is that correct? In VR mode you are limited to the 1.5 m radius for now. So in AR mode with passthrough video on, where that limit is not imposed, how far from the origin are you allowed to walk? Is there a limit, is it essentially boundless with tracking that holds for arbitrarily large spaces, or is there some maximum boundary based on tracking limits?
Topic: Graphics & Games SubTopic: General Tags:
Jul ’23
Reply to Generating vertex data in compute shader
First, did you profile why the vertices are expensive to compute before going for this solution? It’s also unclear how you’re computing the vertices, since you haven’t provided code or an algorithm for that part, so it’s hard to tell whether the compute step itself is optimal. Using compute successfully relies heavily on exploiting parallelism, so make sure a compute kernel actually fits the problem. Roughly, I imagine you could allocate one gigantic buffer; there’s no need for multiple. Conceptually split the buffer into fixed-size sections (X vertices each), each handled by a specified number of threads; you can tune that size. Beyond that it’s tricky to help, but with more specific info it’ll be easier.
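To make the single-buffer idea concrete, here’s a rough host-side sketch; the kernel name generate_vertices, the vertex count, and the SIMD4<Float> layout are placeholders, and the kernel itself (one GPU thread writes one vertex) isn’t shown:

import Metal

// Sketch only: one large vertex buffer, one GPU thread per vertex.
let device = MTLCreateSystemDefaultDevice()!
let vertexCount = 1_000_000                                   // placeholder
let vertexStride = MemoryLayout<SIMD4<Float>>.stride          // e.g. position only
let vertexBuffer = device.makeBuffer(length: vertexCount * vertexStride,
                                     options: .storageModePrivate)!

let library = device.makeDefaultLibrary()!
let kernel = library.makeFunction(name: "generate_vertices")! // hypothetical kernel
let pipeline = try! device.makeComputePipelineState(function: kernel)

let queue = device.makeCommandQueue()!
let commandBuffer = queue.makeCommandBuffer()!
let encoder = commandBuffer.makeComputeCommandEncoder()!
encoder.setComputePipelineState(pipeline)
encoder.setBuffer(vertexBuffer, offset: 0, index: 0)

// Dispatch one thread per vertex; Metal splits the grid into threadgroups.
let threadsPerGroup = MTLSize(width: pipeline.threadExecutionWidth, height: 1, depth: 1)
encoder.dispatchThreads(MTLSize(width: vertexCount, height: 1, depth: 1),
                        threadsPerThreadgroup: threadsPerGroup)
encoder.endEncoding()
commandBuffer.commit()
// The same vertexBuffer can then be bound to a render encoder with no CPU copy.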
Topic: Graphics & Games SubTopic: General Tags:
Jul ’23
Reply to Can visionOS app be used to scan QRCode?
As a suggestion to the developers: maybe add a way to request things from the system, like “give me the QR codes in the scene and their ARAnchors,” that could be handled without giving camera data to the client application. Essentially, the system could offer a set of system-level algorithms the developer can query, similar to the existing data providers.
Topic: App & System Services SubTopic: Core OS Tags:
Jul ’23
Reply to Using digital crown on Vision Pro Simulator
The release notes for Xcode 15 beta 2 (https://developer.apple.com/documentation/xcode-release-notes/xcode-15-release-notes) say that a GUI-based emulation of the crown is a missing feature, and that the workaround is to emulate it yourself with a function call:

“There is no UI for simulating Apple Vision Pro’s immersion crown. (109429267) Workaround: Use XCTest’s XCUIDevice.rotateDigitalCrown(delta:) method.”

The header comment for that method:

/*!
 * Rotate the digital crown by a specified amount.
 *
 * @param rotationalDelta
 *   The amount by which to rotate the digital crown. A value of 1.0 represents one full rotation.
 *   The value’s sign indicates the rotation’s direction, but the sign is adjusted based on the crown’s orientation.
 *   Positive values always indicate an upward scrolling gesture, while negative numbers indicate a downward scrolling gesture.
 */
- (void)rotateDigitalCrownByDelta:(CGFloat)rotationalDelta;

PROBLEM: I can’t figure out how to use the function. Xcode reports that the library does not exist when I run the application within the simulator. Maybe it’s not currently possible.
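In case it helps anyone else, a minimal sketch of how the workaround seems intended to be used, i.e. from a UI test target rather than from the app itself (XCTest isn’t linked into the app process, which could explain the missing-library error); the names here are placeholders:

import XCTest

final class CrownSimulationTests: XCTestCase {
    func testScrollWithDigitalCrown() {
        let app = XCUIApplication()
        app.launch()

        // Half of one full rotation in the "upward" scroll direction.
        // XCUIDevice is part of XCTest, so this runs from a UI test target, not app code.
        XCUIDevice.shared.rotateDigitalCrown(delta: 0.5)
    }
}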
Jun ’23
Reply to `MTKView` on visionOS
Not an engineer, but my guess is that using Metal at the moment involves a really different setup and doesn’t go through a straight UIView/NSView-style application. Rather, you call into CompositorServices, as in this talk: https://developer.apple.com/videos/play/wwdc2023/10089/
Topic: Graphics & Games SubTopic: General Tags:
Jun ’23
Reply to Immersion space becoming inactive on stepping outside of the "system-defined boundary"
They limit mobility to 1.5 m from the starting point. However, as I write in this thread, there are tons of use cases for larger tracking areas in VR (actually most of the ones I’m interested in, and several being actively researched): https://developer.apple.com/forums/thread/731449. I hope that eventually the user will be able to define a larger safety boundary for controlled environments where it’s fine to walk around. I guess filing feedback reports en masse with our use cases is the best way to push things forward.
Topic: UI Frameworks SubTopic: SwiftUI Tags:
Jun ’23
Reply to Custom renderpipeline & shader confusion
Currently, custom rendering with Metal (and custom rendering in general) is not allowed on visionOS, except in full immersive VR mode :(. For full immersive mode, you have to use CompositorServices. There’s an example doc and a WWDC talk (https://developer.apple.com/videos/play/wwdc2023/10089/) but no sample project (yet?).

For passthrough mode, there is reason to believe the standalone full-AR passthrough mode could be updated in the future to support custom rendering too, but it’s not a given. Check out the discussion I had here, starting at this post, to understand the current limitations: https://developer.apple.com/forums/thread/731506?answerId=755464022#755464022. I’d suggest filing feature requests for custom rendering support, because I also think it’s super important not to be limited to the default RealityKit renderer in passthrough mode. They want to see use cases. Personally, I think custom rendering is a must.
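For reference, this is roughly what the CompositorServices frame loop looks like in full immersive mode, pieced together from memory of that WWDC session, so treat the exact names as a sketch and verify against the docs; the actual render-pass encoding is omitted:

import CompositorServices
import Metal

// Rough frame-loop structure for a full immersive Metal app (sketch only).
func runRenderLoop(_ layerRenderer: LayerRenderer) {
    let device = MTLCreateSystemDefaultDevice()!
    let commandQueue = device.makeCommandQueue()!

    while true {
        switch layerRenderer.state {
        case .invalidated:
            return                              // the immersive space was dismissed
        case .paused:
            layerRenderer.waitUntilRunning()    // blocked until the layer is visible again
        default:
            autoreleasepool {
                guard let frame = layerRenderer.queryNextFrame() else { return }

                frame.startUpdate()
                // Per-frame work that doesn't depend on the device pose goes here.
                frame.endUpdate()

                guard let timing = frame.predictTiming() else { return }
                LayerRenderer.Clock().wait(until: timing.optimalInputTime)

                frame.startSubmission()
                guard let drawable = frame.queryDrawable() else { return }
                let commandBuffer = commandQueue.makeCommandBuffer()!
                // Encode your render passes against the drawable's textures/views here.
                drawable.encodePresent(commandBuffer: commandBuffer)
                commandBuffer.commit()
                frame.endSubmission()
            }
        }
    }
}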
Topic: Graphics & Games SubTopic: General Tags:
Jun ’23