
Sample Project for WWDC24 Session 10092 (Metal with Passthrough)?
It’s great that we’ll be able to use Metal custom renderers in passthrough mode on visionOS (https://developer.apple.com/wwdc24/10092). This is a lot of complicated set-up, however, and it’s unclear how occlusion and custom algorithms/raytracing will work in tandem with scene understanding. May we have a project template and/or sample, preferably with the C API and not just Swift? This would be much appreciated and helpful to everyone who wants this set-up. I’d like to see the whole process. Thank you for introducing this feature!
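In the meantime, here’s my best guess at the minimal Swift entry point, pieced together from the session: a CompositorLayer inside an ImmersiveSpace, with the .mixed immersion style providing the passthrough. The configuration values and the render-loop stub are my assumptions, not from an official sample:

```swift
import SwiftUI
import CompositorServices

@main
struct PassthroughMetalApp: App {
    var body: some Scene {
        ImmersiveSpace(id: "MetalSpace") {
            CompositorLayer(configuration: MetalLayerConfiguration()) { layerRenderer in
                // Hand the LayerRenderer off to a dedicated render thread.
                // (The C API exposes the same object as cp_layer_renderer_t.)
                startRenderLoop(layerRenderer)
            }
        }
        // .mixed is the new part from the session: Metal content
        // composited over passthrough rather than a fully immersive scene.
        .immersionStyle(selection: .constant(.mixed), in: .mixed)
    }
}

struct MetalLayerConfiguration: CompositorLayerConfiguration {
    func makeConfiguration(capabilities: LayerRenderer.Capabilities,
                           configuration: inout LayerRenderer.Configuration) {
        // Formats here are my guesses; pick whatever your pipeline needs.
        configuration.colorFormat = .bgra8Unorm_srgb
        configuration.depthFormat = .depth32Float
        configuration.isFoveationEnabled = capabilities.supportsFoveation
    }
}

func startRenderLoop(_ renderer: LayerRenderer) {
    // Placeholder: spawn a thread that queries frames from `renderer`,
    // encodes Metal work, and submits drawables each display refresh.
}
```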
3 replies · 1 boost · 1.1k views · Nov ’24
WKWebView 120 Hz Support
I'm developing an application that needs smooth framerates within a WKWebView that interacts with native code. However, requestAnimationFrame is still throttled to 60 Hz by default, even though all my target devices (the iPad Pro, for example) have supported 120 Hz for a long time already. I noticed that the latest Safari in the 18.3 beta supports unlocked framerates, but only behind Safari feature flags, and to my knowledge those flags do not apply to WKWebView. Is there a way to enable an unlocked requestAnimationFrame rate in WKWebView? (Driving JS at a faster rate from the native side almost certainly won't work, since WKWebView will still render at its own rate.) This is an experimental application for internal use, and I'm okay with temporary beta solutions if any are available.
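For anyone else investigating, here's the kind of probe I've been using to confirm the cap. A sketch; FPSProbe and the "fpsProbe" handler name are my own:

```swift
import WebKit

// Counts requestAnimationFrame callbacks for one second inside the page
// and reports the total to native code, to confirm whether the web view
// is capped near 60 Hz.
final class FPSProbe: NSObject, WKScriptMessageHandler {
    func userContentController(_ userContentController: WKUserContentController,
                               didReceive message: WKScriptMessage) {
        print("requestAnimationFrame fired \(message.body) times in 1 s")
    }
}

func installFPSProbe(on configuration: WKWebViewConfiguration) {
    let js = """
    (function () {
        let frames = 0;
        const start = performance.now();
        function tick(now) {
            frames += 1;
            if (now - start < 1000) { requestAnimationFrame(tick); }
            else { window.webkit.messageHandlers.fpsProbe.postMessage(frames); }
        }
        requestAnimationFrame(tick);
    })();
    """
    let controller = configuration.userContentController
    controller.add(FPSProbe(), name: "fpsProbe")
    controller.addUserScript(WKUserScript(source: js,
                                          injectionTime: .atDocumentEnd,
                                          forMainFrameOnly: true))
}
```

On my devices this consistently reports ~60 frames, which is what prompted the question.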
2 replies · 1 boost · 699 views · Jan ’25
Unexpected Behavior: PointerEvents Do Not Permit Simultaneous Pencil and Multitouch Input. Discussing Workarounds
For many years, I've noticed that although in native code I can handle continuous and simultaneous Apple Pencil and touch inputs using UIKit, the PointerEvents implementation in Safari and WKWebView only lets you use one input type at a time: putting the Apple Pencil down blocks touch input until it's lifted, and touch input likewise blocks Apple Pencil input. It's as though requiresExclusiveTouchType has been set in the underlying WebKit implementation.

There's decades of research (e.g. https://dl.acm.org/doi/10.1145/1866029.1866036) and several native applications in production showing that multimodal inputs open up many unique and useful applications and interactions. Even simple "hold object with finger" + "draw with stylus" controls are the norm. I recently built a native application using simultaneous multimodal inputs, but it's impossible to port to the web due to this unexpected behavior of PointerEvents (touch events and mouse events exhibit the same behavior). I've researched and attempted every possible flag, change, and CSS rule to get this working, but I believe the behind-the-scenes implementation is what's blocking simultaneous touch types.

This is unexpected and undesired behavior because it's inconsistent with native behavior. If it's unintended, fixing it should be a high priority for creating better user experiences on the iPad. If it's intended, I don't believe the restriction is reasonable (even if simultaneous input is more complex and mostly serves advanced applications). Please expose a way to support simultaneous touch types on iPadOS/iOS in both Safari and WKWebView; at minimum, may we have a discussion on how to support the desired behavior? The simplest solution I can think of is a WebKit-specific boolean in Safari and WKWebView called requiresExclusiveTouchType, defaulting to true to keep the current behavior, and settable to false to get the more flexible behavior I'm expecting.
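For comparison, this is the native UIKit behavior I'm referring to (a minimal sketch): raw touches of both types arrive concurrently, and UITouch.type lets you route each stream to a different tool. (For gesture recognizers, the equivalent is clearing UIGestureRecognizer.requiresExclusiveTouchType.)

```swift
import UIKit

// UIKit delivers pencil and finger touches concurrently; UITouch.type
// distinguishes them, enabling "hold with finger, draw with stylus".
final class MultimodalCanvasView: UIView {
    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches {
            switch touch.type {
            case .pencil:
                break // begin a stroke
            case .direct:
                break // begin a hold/manipulation gesture
            default:
                break
            }
        }
    }
}
```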
2 replies · 0 boosts · 633 views · Jan ’25
GroupActivities on iPadOS?
I forgot to ask this during my lab session, but I noticed iPadOS is not listed among the supported OSes on the GroupActivities documentation page. iPadOS supports FaceTime, so is it that GroupActivities doesn't work on iPadOS? That would be a crying shame, since one of the examples specifically involved drawing collaboratively, and the iPad is the perfect device for that use case. EDIT: "Coordinate media experiences with Group Activities" mentions iPadOS support, in which case the first page I linked might just be missing an OS entry.
1 reply · 0 boosts · 1.1k views · Jun ’21
What is the purpose of CameraOutput in visionOS’s RealityRenderer?
Related to “what you can do in visionOS,” what are all of these camera-related functionalities for? (They're not yet described in the documentation.) https://developer.apple.com/documentation/realitykit/realityrenderer/cameraoutput/colortextures https://developer.apple.com/documentation/realitykit/realityrenderer/cameraoutput/relativeviewport What are the intended use cases? Is this the equivalent of render-to-texture? I also see some interop with raw Metal happening here.
1 reply · 0 boosts · 964 views · Jun ’23
Runtime texture input for ShaderGraphMaterial?
For the MaterialX shader graph, the given example hard-codes two textures that are blended at runtime (https://developer.apple.com/documentation/visionos/designing-realitykit-content-with-reality-composer-pro#Build-materials-in-Shader-Graph). Can I instead generate textures at runtime and pass them as dynamic inputs to the material, or must every texture be known when the material is created? If procedural texture-setting is possible, how is it done, given that the example shows a material with hard-coded textures? EDIT: It looks like the answer is ”yes,” since setParameter accepts a TextureResource: https://developer.apple.com/documentation/realitykit/materialparameters/value/textureresource(_:)?changes=l_7 However, how do you turn an MTLTexture into a TextureResource?
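If that's right, I'd expect something like the following to work. A sketch: "BaseTexture" stands in for whatever input name the graph promotes, and the CGImage round-trip is just my assumption for getting Metal output into a TextureResource (orientation and color-space handling omitted):

```swift
import Metal
import CoreImage
import RealityKit

// Assign a runtime-generated texture to a Shader Graph material input.
func applyRuntimeTexture(_ mtlTexture: MTLTexture,
                         to material: inout ShaderGraphMaterial) throws {
    // One possible MTLTexture -> TextureResource route: via CGImage.
    guard let ciImage = CIImage(mtlTexture: mtlTexture, options: nil),
          let cgImage = CIContext().createCGImage(ciImage, from: ciImage.extent)
    else { return }
    let resource = try TextureResource.generate(from: cgImage,
                                                options: .init(semantic: .color))
    // "BaseTexture" is a hypothetical promoted input name from the
    // Reality Composer Pro graph.
    try material.setParameter(name: "BaseTexture",
                              value: .textureResource(resource))
}
```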
1 reply · 2 boosts · 1.2k views · Jul ’23
When is full hand tracking available on Vision Pro?
I’m still a little unsure about the various spaces and capabilities. I’d like to make full use of hand tracking, joints and all. In the mode with passthrough and a single application present (not the Shared Space), is that available? (I'm pretty sure the answer is “yes,” but I’d like to confirm.) What is this mode called in the system? A mixed-immersion Full Space?
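Assuming the answer is “yes,” my understanding is that it would look something like this with ARKit's HandTrackingProvider once the app is in a Full Space and the user grants permission (a sketch, untested):

```swift
import ARKit

// Stream hand-joint updates from ARKit on visionOS.
func trackHands() async throws {
    let session = ARKitSession()
    let handTracking = HandTrackingProvider()
    try await session.run([handTracking])
    for await update in handTracking.anchorUpdates {
        let anchor = update.anchor
        guard anchor.isTracked, let skeleton = anchor.handSkeleton else { continue }
        // Every joint is available; index fingertip shown as an example.
        let indexTip = skeleton.joint(.indexFingerTip)
        print("\(anchor.chirality) index tip transform:",
              indexTip.anchorFromJointTransform)
    }
}
```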
1 reply · 0 boosts · 1.2k views · Jul ’23
External Peripheral Support on Vision Pro?
Does the Vision Pro allow USB peripherals like cameras, microphones, or video feeds from an iPhone or iPad? Can I use AVFoundation to access external camera feeds or microphones? Note that I am not asking about the internal cameras, which I am aware are off-limits. One use case is to support multiple viewing angles, comparable to what we do with slide projectors: for example, draw on an iPad lying flat on your desk while wearing the Vision Pro in full passthrough mode, and simultaneously mirror the iPad’s screen onto multiple walls in real time at minimum latency (over a Thunderbolt connection), similar to how I can use QuickTime on macOS to mirror my iPad’s screen.
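For context, this is how external cameras and microphones are discovered with AVFoundation on iPadOS 17 and later; whether visionOS exposes the same .external device type is essentially what I'm asking (a sketch):

```swift
import AVFoundation

// Enumerate external (e.g. UVC) cameras and microphones.
func listExternalDevices() {
    let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.external, .microphone],
        mediaType: nil,
        position: .unspecified
    )
    for device in discovery.devices {
        print(device.localizedName, "-", device.deviceType.rawValue)
    }
}
```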
1 reply · 0 boosts · 646 views · Feb ’24