Post · Replies · Boosts · Views · Activity

Placement of model inside volumetric window?
I am having trouble placing a model inside a volumetric window. I have a model - just a simple cube created in Reality Composer Pro that is 0.2m on a side and centered at the origin - and I want to display it in a volumetric window that is 1.0m on a side while preserving the cube's original 0.2m size. The small cube appears flush against the back and top of the larger volumetric window. Is it possible to initially position the model inside the volume? For example, can the model be placed flush against the bottom and front of the volumetric window? (Note: the actual use case is placing 3D terrain, which tends to be mostly flat like a pizza box, flush against the bottom of the volumetric window.)
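Something like this sketch is what I have in mind, using GeometryReader3D plus content.convert(_:from:to:) to find the volume's bounds in RealityKit coordinates (the "Cube" entity name and the 0.1m offsets are placeholders based on the sizes above):

```swift
import SwiftUI
import RealityKit

struct CubeVolume: View {
    var body: some View {
        GeometryReader3D { geometry in
            RealityView { content in
                if let model = try? await Entity(named: "Cube") {
                    content.add(model)
                }
            } update: { content in
                // Convert the window's bounds from SwiftUI points into
                // RealityKit meters (origin at the volume's center).
                let bounds = content.convert(geometry.frame(in: .local),
                                             from: .local, to: content)
                if let model = content.entities.first {
                    // Flush against the bottom and front of the volume,
                    // offset by the cube's 0.1m half-extent.
                    model.position.y = bounds.min.y + 0.1
                    model.position.z = bounds.max.z - 0.1
                }
            }
        }
    }
}
```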
2 replies · 2 boosts · 738 views · Dec ’24
Triangle count and texture size budget for RealityKit on visionOS
In the past, Apple recommended restricting USDZ models to a maximum of 100,000 triangles and texture sizes of 2048x2048 for AR Quick Look (and, I think, for RealityKit on iOS in general). Does Apple have any recommended maximum polygon counts for visionOS? Is it the same for models running in a volumetric window in the Shared Space and in an ImmersiveSpace? What is the recommended texture size for visionOS? (I seem to recall 8192x8192, but I can't find it now.)
2 replies · 0 boosts · 1.6k views · Jun ’24
AnchorEntity to ARAnchor (or vice versa)
Given an AnchorEntity from, say, RealityKit's Scene anchors collection, is it possible to retrieve the ARAnchor that was used when creating the AnchorEntity? Looking through the AnchorEntity documentation (https://developer.apple.com/documentation/realitykit/anchorentity), it seems that while you can create an AnchorEntity using an ARAnchor, there is no way to retrieve that ARAnchor afterwards. Alternatively, the ARSession delegate functions receive a list of ARAnchors or an ARFrame that has ARAnchors, but I could not find an approach to retrieve AnchorEntities that might be associated with any of these ARAnchors. Given an ARAnchor, is there a way to get an AnchorEntity associated with it? Thanks,
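In the meantime, I've been considering just keeping my own mapping from ARAnchor.identifier to AnchorEntity, roughly like this sketch (the class and method names are placeholders, not any RealityKit API):

```swift
import ARKit
import RealityKit

// A workaround sketch: RealityKit doesn't appear to expose the ARAnchor an
// AnchorEntity was created from, so track the association by identifier.
final class AnchorRegistry {
    private var entities: [UUID: AnchorEntity] = [:]

    func makeAnchorEntity(for anchor: ARAnchor,
                          in scene: RealityKit.Scene) -> AnchorEntity {
        let entity = AnchorEntity(anchor: anchor)
        entities[anchor.identifier] = entity
        scene.addAnchor(entity)
        return entity
    }

    // Given an ARAnchor (e.g., from a session delegate callback), look up
    // the AnchorEntity created for it, if any.
    func entity(for anchor: ARAnchor) -> AnchorEntity? {
        entities[anchor.identifier]
    }

    func remove(_ anchor: ARAnchor, from scene: RealityKit.Scene) {
        if let entity = entities.removeValue(forKey: anchor.identifier) {
            scene.removeAnchor(entity)
        }
    }
}
```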
1 reply · 0 boosts · 1.4k views · Mar ’21
RealityKit playAnimation with transitionDuration causes a blink/glitch frame
I am experiencing a single video-frame glitch when transitioning from one RealityKit entity animation to another when transitionDuration is non-zero. This is with the current RealityKit and iOS 14.6 (i.e., not the betas). Is this a known issue? Have people succeeded in transitioning from one animation to another with a non-zero transition time and no strange blink?

Background: I loaded two USDZ models, each with a different animation. One model will be shown, but the AnimationResource from the second model will (at some point) be applied to the first model. I originally created the models with Adobe's Mixamo site (they are characters moving), downloaded the .fbx files, and then converted them to USDZ with Apple's Reality Converter. I start the first model (robot) with its animation, then at some point I apply the animation from the second model (nextAnimationToPlay) to the original model (robot). If transitionDuration is set to something other than 0, a single video-frame glitch (or blink) appears before the animation transition occurs (that single frame may be the model's original T-pose, but I'm not certain).

```swift
robot.playAnimation(nextAnimationToPlay, transitionDuration: 1.0, startsPaused: false)
```

If transitionDuration is set to 0, there is no glitch, but then I lose the smooth transition. I have tried variations. For example, setting startsPaused to true and then calling resume() on the playback controller; also, waiting until the current animation completes before calling playAnimation() with the next animation. Still, I get the quick blink. Any suggestions or pointers would be appreciated. Thanks,
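For reference, the "wait until the current animation completes" variation looked roughly like this (the names are from my project, and the subscription variable is just a placeholder):

```swift
import Combine
import RealityKit

var animationSubscription: Cancellable?

func chainAnimation(in arView: ARView, robot: Entity, next: AnimationResource) {
    // Wait for the current clip to finish, then cross-fade into the next
    // clip; this variation still shows the one-frame blink for me.
    animationSubscription = arView.scene.subscribe(
        to: AnimationEvents.PlaybackCompleted.self, on: robot
    ) { _ in
        robot.playAnimation(next, transitionDuration: 1.0, startsPaused: false)
    }
}
```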
1 reply · 0 boosts · 1.3k views · Aug ’21
StoreKit 2 demo buy buttons flip state
I've been playing with Apple's StoreKit 2 demo code (buying the cars, subscriptions, ...), and sometimes when I purchase a car, one or more of the other buttons visually flip state (e.g., a purchased checkmark changes back to the price). Leaving the StoreView and returning to it shows the correct state for each of the buttons. I am using the StoreKit configuration file Products.storekit (set in the scheme), so I'm testing in Xcode. I get this in both the simulator and on my actual phone. The issue is random; the vast majority of the time everything works perfectly. Is anyone else seeing this issue? Does anyone know how to address it?

Dev environment:
Xcode 13.0 beta 5 (13A5212g)
macOS 12.0 Beta (21A5534d)
Mac mini (M1, 2020)
1 reply · 0 boosts · 797 views · Oct ’21
Guidance on USDZ model sizes
I'm looking for documentation/guidance on USDZ and scene model sizes. My focus is on RealityKit-based apps. I found the 2018 WWDC presentation Integrating Apps and Content with AR Quick Look, which mentions a rule of thumb for a USDZ model of:

100K polygons
One set of 2048x2048 textures
10 seconds of animation

Are these numbers still recommended in 2021? Are these numbers just for AR Quick Look, or do they apply to RealityKit-based apps too? If a RealityKit scene loads several USDZ models, should the cumulative number of polygons across all models be 100K, or is the 100K number on a per-model basis? The talk mentioned that AR Quick Look will dynamically downsample textures for devices with less memory. Does RealityKit do this as well? If so, can I err on the side of providing a larger texture (e.g., 4096x4096) and trust RealityKit to downsample as appropriate for me? (I am hoping there is some documentation covering questions like this.)
1 reply · 0 boosts · 2k views · Nov ’21
Are RealityKit lights expensive?
I am finding some unexpected behavior with lights I've been adding to a RealityKit scene. For example, I created 14 PointLights, but only 8 appeared to be used to illuminate the scene. In another example, I created 7 PointLights and 7 SpotLights, and the frame rate dropped quite a bit. Are lights computationally expensive, causing some adaptive behavior by RealityKit? Should I be judicious in my use of lights in a scene? (Note: I set arView.environment.lighting.resource to a skybox with a black image; my goal was to completely control the lighting. I don't know if that added to the computational load.)
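To make the setup concrete, here is a stripped-down sketch of what I'm doing (the "blackSkybox" resource name and the light values are placeholders):

```swift
import RealityKit

// Suppress image-based lighting with a black environment resource, then
// add explicit point lights so they are the only light contribution.
func configureLighting(for arView: ARView) throws {
    arView.environment.lighting.resource =
        try EnvironmentResource.load(named: "blackSkybox")

    let light = PointLight()
    light.light.intensity = 10_000          // lumens
    light.light.attenuationRadius = 5.0     // meters
    light.position = [0, 2, 0]

    let anchor = AnchorEntity(world: .zero)
    anchor.addChild(light)
    arView.scene.addAnchor(anchor)
}
```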
1 reply · 1 boost · 1.2k views · Dec ’21
Setting lensPosition to focus at infinity
I am creating a fixed-focus camera app with the focus distance at infinity (or at least 30+ feet away). When I set lensPosition to 1.0, the images were blurry. Some tests letting autofocus do the job showed that a lensPosition of about 0.808 for my wide and telephoto lenses, and 0.84 for the ultra-wide lens, did the trick. (iPhone 13 Pro Max) Will the lensPosition needed to focus at infinity vary between devices, and between the lenses on a given device? Is there a way to determine the appropriate lensPosition at run time?
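Here's the kind of setup I'm using to lock focus (the 0.808 default is just the value autofocus settled on for my wide lens; it may not transfer to other devices or lenses):

```swift
import AVFoundation

// Lock focus at a fixed lens position instead of using autofocus.
func lockFocus(on device: AVCaptureDevice, lensPosition: Float = 0.808) throws {
    guard device.isLockingFocusWithCustomLensPositionSupported else { return }
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }
    device.setFocusModeLocked(lensPosition: lensPosition) { _ in
        // Called once the lens has moved; the CMTime argument is the
        // timestamp of the first frame captured at the new position.
    }
}
```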
1 reply · 0 boosts · 875 views · May ’22
AnchorEntity from ARAnchor bug?
I am running into a strange bug where the exact same code compiles fine in one project but generates a compiler error in another project. In particular, I am trying to create an AnchorEntity from an ARAnchor:

```swift
import ARKit
import RealityKit

func addModelTo(anchor: ARAnchor) {
    let entityAnchor = AnchorEntity(anchor: anchor)
    // ...
}
```

The compiler error message is not even consistent. Sometimes I get a single error message:

Cannot convert value of type 'ARAnchor' to expected argument type 'AnchoringComponent.Target'

Other times I get an error with two possible issues:

No exact matches in call to initializer
Candidate '() -> AnchorEntity' requires 0 arguments, but 1 was provided (RealityFoundation.AnchorEntity)
Candidate expects value of type 'AnchoringComponent.Target' for parameter #1 (got '(anchor: ARAnchor)')

I'm trying to track down why this sometimes causes an error and sometimes it does not. Any pointers?
1 reply · 0 boosts · 1.6k views · Nov ’22
State of the Union video on iOS or Apple TV
When I try to watch the WWDC22 "Platforms State of the Union" video on my Apple TV, iPhone, or iPad using the Developer app, the video only shows a single frame every few seconds and there is no audio. The video plays fine in the Developer app on my Mac. Has anyone else had this problem? Is there a workaround? (I've tried deleting the Developer app on my Apple TV and reinstalling it, but no joy.)
1 reply · 0 boosts · 830 views · Jan ’23
Multiple BodyTrackedEntities?
Can ARKit/RealityKit track multiple bodies for animation simultaneously? Reviewing Apple's CapturingBodyMotionIn3D sample code (from the WWDC 2019 session), there is no explicit linkage between the ARBodyAnchor and the loaded BodyTrackedEntity (e.g., the AnchorEntity used for the BodyTrackedEntity is not associated with the ARBodyAnchor). There seems to be some hidden linkage between the ARKit ARBodyAnchor and the RealityKit BodyTrackedEntity. Likewise, the ARFrame has a property for only a single ARBody2D (detectedBody). My interpretation is that only a single person can be tracked at a time. Is this correct?
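For context, this is the shape of the delegate code I'm looking at; as far as I can tell, only one ARBodyAnchor ever arrives per frame (the `character` property is a placeholder for the BodyTrackedEntity loaded from a rigged USDZ):

```swift
import ARKit
import RealityKit

final class BodyTracker: NSObject, ARSessionDelegate {
    var character: BodyTrackedEntity?

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let bodyAnchor as ARBodyAnchor in anchors {
            // In practice only one ARBodyAnchor shows up per frame, which
            // matches the single `detectedBody` property on ARFrame.
            character?.position = simd_make_float3(bodyAnchor.transform.columns.3)
        }
    }
}
```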
1 reply · 0 boosts · 1.2k views · Jan ’23
AnchorEntity a child of an Entity?
Reviewing Apple's AnchorEntity documentation, I see that an AnchorEntity can be a child of an Entity in the RealityKit hierarchy. Has it always been this way? In my memory, an AnchorEntity was always just the base element in a Scene. If this was a change by Apple at some point, has Apple given examples where making an AnchorEntity a child of an Entity lets you do cool things you couldn't do before?
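For concreteness, the nesting I'm asking about is simply this; it compiles because AnchorEntity is an Entity subclass:

```swift
import RealityKit

// An AnchorEntity nested under a plain Entity rather than sitting at the
// root of the Scene's anchor collection.
let root = Entity()
let nested = AnchorEntity(world: .zero)
root.addChild(nested)
```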
1 reply · 0 boosts · 876 views · Feb ’23
Logging not redacting strings in Xcode
This is probably a minor point because it wouldn't affect distributed binaries, but I thought I'd mention it in case the behavior is unexpected. After watching the WWDC 2020 session Explore logging in Swift, I tried some simple examples in a Mac command-line app. I was surprised to see the strings were all printed just fine; there was no redaction, at least when running the program from Xcode. (Even using the old os_log() approach showed the strings without needing to add %{public}@.) However, if I run the program from a Terminal shell, the string arguments are properly redacted. I actually like this behavior (showing more while running in Xcode), but I thought I'd raise the issue. Sample code is shown below.

```swift
import Foundation
import os

let logger = Logger(subsystem: "com.example.logging_test", category: "hello")

let greeting = "Hello"
let personName = "World"

logger.log("\(greeting), \(personName)")
logger.log("\(greeting, privacy: .private), \(personName)")
logger.log("\(greeting, privacy: .public), \(personName)")

os_log("%@, %@", greeting, personName)
```
1 reply · 0 boosts · 1.4k views · Mar ’23
PerspectiveCamera in portrait and landscape modes
I have an ARView in nonAR cameraMode and a PerspectiveCamera. When I rotate my iPhone from portrait to landscape mode, the size of the content shrinks. For example, the attached image shows the same scene with the phone in portrait and landscape modes. The blue cube is noticeably smaller in landscape. The size of the cube relative to the vertical space (i.e., the height of the view) is consistent in each situation. Is there a way to keep the scene (e.g., the cube) the same size whether I am in portrait or landscape mode?
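My working theory is that fieldOfViewInDegrees is a vertical field of view, so content scales with the view's height. If that's right, something like this sketch, which holds an assumed horizontal FOV constant and recomputes the vertical FOV from the view's aspect ratio, should compensate (the 60-degree value is just a placeholder):

```swift
import RealityKit
import UIKit

// Keep the horizontal field of view constant across rotations by deriving
// the vertical FOV from the current aspect ratio:
//   vFov = 2 * atan(tan(hFov / 2) / aspect)
func updateCamera(_ camera: PerspectiveCamera, viewSize: CGSize,
                  desiredHorizontalFOV: Float = 60) {
    let aspect = Float(viewSize.width / viewSize.height)
    let hRadians = desiredHorizontalFOV * .pi / 180
    let vRadians = 2 * atan(tan(hRadians / 2) / aspect)
    camera.camera.fieldOfViewInDegrees = vRadians * 180 / .pi
}
```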
1 reply · 0 boosts · 804 views · Aug ’23