Apple's own USDZ files from their website do not display properly in either Preview or Keynote when imported. They do display properly in Reality Converter, but with this error:
"Invalid USD shader node in USD file
Shader nodes must have “id” as the implementationSource, with id values that begin with “Usd”. Also, shader inputs with connections must each have a single, valid connection source."
I tried importing other models from external sources and they work without any issue at all.
Is there any potential fix or workaround for this?
Thanks in advance.
Hello all,
I am building for visionOS with another engineer and using Reality Composer Pro to validate usd files.
The starting position of my animated usdz (its position when it is first loaded) is not the same as the first frame of the animation in the usdz file.
For testing, I am using the AR Quick Look asset 'toy_biplane_idle.usdz' which demonstrates the same 'error' we're currently getting with our own usdz files.
When the usdz is loaded, it sits on the ground plane. But when the animation is played, the plane 'snaps' to the position of the first frame of the animation.
This 'snapping' behavior is giving us problems. We want the user to see the plane in its static 'load' position with the option to play the animation, but we don't want it to snap when the user presses play.
Is it possible to load the .usdz in the position specified by the first frame of the animation? What is the best way to fix this issue?
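One workaround we've been sketching (only tried against toy_biplane_idle.usdz, so treat it as an assumption, not a confirmed fix): start the animation immediately but paused, so the entity assumes the first-frame pose at load time, then resume playback when the user presses play:

import RealityKit

// Load the model and immediately pose it at the animation's first frame.
// startsPaused: true applies the pose without advancing playback, so the
// visible "snap" happens once at load instead of when the user taps play.
func loadPosedAtFirstFrame(named name: String) async throws -> (Entity, AnimationPlaybackController?) {
    let entity = try await Entity(named: name) // e.g. "toy_biplane_idle"
    guard let clip = entity.availableAnimations.first else { return (entity, nil) }
    let controller = entity.playAnimation(clip, transitionDuration: 0, startsPaused: true)
    return (entity, controller)
}

// Later, when the user presses play:
// controller?.resume()

If there's a supported way to read the first-frame transform directly from the clip instead, we'd prefer that.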
Thanks!
Hi,
I am trying to use a PhotogrammetrySample input sequence, following the WWDC video.
I modified only the following code in the sample:
let images = try FileManager.default.contentsOfDirectory(at: captureFolderManager.imagesFolder,
                                                         includingPropertiesForKeys: nil,
                                                         options: [.skipsHiddenFiles])
let inputSequence = images.lazy.compactMap { file in
    return self.loadSampleAndMask(file: file)
}
// Next line causes the exception
photogrammetrySession = try PhotogrammetrySession(
    input: inputSequence,
    configuration: configuration)

private func loadSampleAndMask(file: URL) -> PhotogrammetrySample? {
    do {
        let sample = try PhotogrammetrySample(contentsOf: file)
        return sample
    } catch {
        return nil
    }
}
I am getting the following runtime error:
Thread 1: EXC_BREAKPOINT (code=1, subcode=0x240d88904)
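Since the lazy compactMap silently drops any sample that fails to load, a diagnostic variant of loadSampleAndMask like the sketch below should at least surface per-file load errors (the extension filter is an assumption about my folder layout, not a requirement of the API):

private func loadSampleAndMask(file: URL) -> PhotogrammetrySample? {
    // Skip masks/sidecar files that may live in the same folder (assumption).
    guard ["heic", "jpg", "jpeg", "png"].contains(file.pathExtension.lowercased()) else {
        return nil
    }
    do {
        return try PhotogrammetrySample(contentsOf: file)
    } catch {
        // Surface the load error instead of swallowing it.
        print("Failed to load sample \(file.lastPathComponent): \(error)")
        return nil
    }
}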
I'm building an iOS/iPadOS app for iOS 18+ using the new RealityView in SwiftUI. (I may add visionOS, but I'm not focusing on it right now.) The 3D scene I'm rendering is fairly simple (just a few dozen vertices and a couple of textures), and I'd like to render it at 120fps on ProMotion devices if possible. I tried setting CADisableMinimumFrameDurationOnPhone to true in the Info.plist, but it had no effect: the frame rate in the GPU Report in Xcode stays capped at 60fps, and the gauge even tops out at 60.
My question is kind of the opposite of this post, which asks how to limit the frame rate of a RealityView.
I'm on Xcode 16 beta 5 on macOS Sonoma and iOS 18.0 beta 6 on my iPhone 15 Pro.
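As a sanity check (just a sketch using UIKit APIs, not a RealityKit-specific fix), I'm logging what the device and bundle actually report, to rule out the key not being read at all:

import UIKit

// Diagnostic: confirm the display supports 120 Hz and that the Info.plist
// key is visible to the app at runtime.
let maxFPS = UIScreen.main.maximumFramesPerSecond
let keyEnabled = Bundle.main.object(forInfoDictionaryKey: "CADisableMinimumFrameDurationOnPhone") as? Bool ?? false
print("maximumFramesPerSecond: \(maxFPS), CADisableMinimumFrameDurationOnPhone: \(keyEnabled)")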
Hi, I'm trying to understand how I can get 3- or 4-channel per-vertex data into the Graph Editor.
From my tests, it seems that:
the "Geometric Property" node does not give access to 4-channel data,
"Geometric Property (vector3)" does not give me access to custom properties besides the ones defined in MaterialX core
the "Texture Coordinates" node has a vector4f mode (yay!), but according to MaterialX spec, texcoords must have 2 or 3 channels, and I can't get 4-channel data to show up there either.
My assumption so far is that I must be missing some "magic" – for example, do the primvars in a file have to be in a specific order, independent of their names? Or do their names matter? (E.g., a convention like primvars:st, primvars:st1, and so on.)
Unfortunately the forum doesn't allow me to attach any USDZ or ZIP files or GDrive links; if there's a way to share a test file I'm happy to do so!
I'm using RealityKit for a scene with many static and dynamic ModelEntities simulating physics. When all the entities have simple collision generated from .generateCollisionShapes I don't see any issues, but some entities need much more complex and accurate collision. For those I've been using ShapeResource.generateStaticMesh with the mesh's data (2769 positions, 16272 face indices in this case), which works exactly as desired with a low entity count. However, once there are 600+ dynamic entities, introducing even one static entity with complex collision will reliably trigger a crash when it collides with one of the dynamic entities (not necessarily on first contact, but inevitably after multiple collisions).
If I arbitrarily limit the number of entities to a max of around 500, that seems to prevent the issue, though the likelihood appears to increase with the number of entities, so there may be a low probability of it triggering even at 500 entities that I simply haven't hit while testing.
If PhysX imposes some kind of entity or collision face/shape limit, I'd at least like to know exactly what it is, but ideally there's a way to work around this. Right now my "fix" is just arbitrarily restricting the entity count (see the sketch at the end of this post), which limits what my app can do.
The crash triggers inside
0x00000001a6790dfc in physx::PxcDiscreteNarrowPhasePCM(physx::PxcNpThreadContext&, physx::PxcNpWorkUnit const&, physx::Gu::Cache&, physx::PxsContactManagerOutput&) ()
which disassembles to the following (the crashing instruction is marked with an -> arrow at the bottom):
CoreRE`physx::PxcDiscreteNarrowPhasePCM:
...
0x1a6790df0 <+668>: mov x1, x24
0x1a6790df4 <+672>: bl 0x1a67913d8 ; physx::PxcNpCacheStreamPair::reserve(unsigned int)
0x1a6790df8 <+676>: ldrb w8, [x23]
-> 0x1a6790dfc <+680>: str w8, [x0, #0x20]
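For completeness, this is roughly the stopgap I mentioned above (a sketch; the 500 cap is an empirical number from my own testing, not a documented PhysX limit, and the function name is mine):

import RealityKit

// Empirical cap: above ~500 dynamic entities, collisions with the complex
// static-mesh collider start crashing inside PhysX, so stop spawning past it.
let maxDynamicEntities = 500

func spawnDynamicEntity(_ entity: ModelEntity, in root: Entity, currentCount: Int) -> Bool {
    guard currentCount < maxDynamicEntities else { return false }
    entity.generateCollisionShapes(recursive: true) // simple convex collision for dynamic bodies
    root.addChild(entity)
    return true
}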