Reply to VisionOS slow image tracking and enterprise API questions
Thanks for answering all my questions. I did submit feedback through Feedback Assistant (FB15332348), but for now it means I'll have to "do something" :). The CoreML approach could be a way, but it'll take time to get to the quality of tracking Apple provides on iOS. Also, it feels like the Vision Pro's cameras have problems tracking smaller images: I see them fail to track at around 30 cm, while an iPhone works a few times farther away (under perfect light conditions). So it's possible that even at 60 Hz it would still not let me do what I need. Now I'm trying to play with object tracking, although what I need to track is quite small, with rather plain and uniform textures. But if that works, then the "higher object tracking frequency" from the Enterprise APIs might be helpful. I really love what the Vision Pro can do, but I equally hate some of the limitations it currently has ;).
Topic: Spatial Computing SubTopic: ARKit Tags:
Oct ’24
Reply to VisionOS slow image tracking and enterprise API questions
@Vision Pro Engineer - a small update. There does seem to be some sort of company-size requirement: when I click the development-only request link with my individual developer account, it says "You must be the Account Holder of an Apple Developer Program for Organizations or an Apple Developer Enterprise Program to view this page." Is there someone I can contact about this?
Topic: Spatial Computing SubTopic: ARKit Tags:
Oct ’24
Reply to VisionOS slow image tracking and enterprise API questions
For anyone who finds this thread while looking for the same answers: after some conversations with Apple support, the size of the company doesn't matter as long as it qualifies for an organization or enterprise account. A single person, or a small company that is not a legal entity, can only get an individual account, and an individual account can't apply for the Enterprise APIs. So the answer to my question, "Is it possible to request Enterprise API access as a single person with a basic Apple Developer subscription?", is: nope.
Topic: Spatial Computing SubTopic: ARKit Tags:
Oct ’24
Reply to VisionPro camera frame rate
Thanks @Vision Pro Engineer for the answer. I am familiar with the feedback process, as all my questions on this forum end up in Feedback Assistant ;). Submitted under FB17273408. Regarding my other question: is there any limit on external cameras connected through the developer strap, or would I get, let's say, 60 or 90 fps?
Topic: Spatial Computing SubTopic: General Tags:
Apr ’25
Reply to RealityKit System update and timing
Thanks @Vision Pro Engineer, I had no idea about that, and it explains the double System calls (I tried your solution and it works). In my case I have some things I want to calculate once and then apply to all matched entities, so I pulled that work outside the loop and wondered what was happening, because those calculations give wrong results when executed more than once :). Regarding time flowing at different speeds, I'm starting to think that context.deltaTime is not really "the time since update was last called", but more "the time delta from the previous frame at which the results of this execution will materialize"; that would explain why the values are paced at the frame rate and "float around" real time depending on system load, or something like that.
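To illustrate what I mean by pulling shared work outside the loop, here is a minimal sketch of a RealityKit System. The component name, rotation speed, and query are made up for illustration; the point is that context.deltaTime is computed once per update and applied to every matched entity.

```swift
import RealityKit

// Hypothetical marker component for the entities this System drives.
struct SpinningComponent: Component {}

struct SpinSystem: System {
    static let query = EntityQuery(where: .has(SpinningComponent.self))

    init(scene: RealityKit.Scene) {}

    func update(context: SceneUpdateContext) {
        // Calculated once per update, not once per entity:
        // deltaTime is the same for every entity in this frame.
        let angle = Float(context.deltaTime) * .pi / 4  // 45 deg/s, arbitrary
        let step = simd_quatf(angle: angle, axis: [0, 1, 0])

        // Apply the shared result to every matched entity.
        for entity in context.entities(matching: Self.query,
                                       updatingSystemWhen: .rendering) {
            entity.orientation = step * entity.orientation
        }
    }
}
```

Calculations that depend on deltaTime but not on the individual entity (like the quaternion above) belong before the loop; repeating them per entity would both waste work and, as described above, compound the result incorrectly if they mutate shared state.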
Topic: Spatial Computing SubTopic: ARKit Tags:
May ’25
Reply to Perspective problem
Thanks @Vision Pro Engineer for the answer. I was hoping for something closer to "you need to call this function and it will be OK" ;). My environment is simple and static: I have a physical object standing on my desk with a fiducial tag glued to it, and I want to overlay a 3D model (which I call the entity here) on this physical object. I'm sitting at my desk, the cameras are not obstructed, the light is good (during the day there's also a lot of sunlight), I'm not using Travel Mode, the object is 50-60 cm from me, and I do have Apple's enterprise blessing to access the cameras :).

I did another test today: I calculated the pose only once and just displayed the 3D model at that point, with no continuous recalculation of its pose (the tracked object isn't moving right now anyway). I tried to push and pull the entity along the Z axis a bit, so it's nearer to or farther from me, but that does not seem to affect the effect (so it's not parallax). The entity always gets more displaced from its correct coordinates the closer it gets to the edge of my field of view. If I place it perfectly in the center, the position is OK; then if I rotate my head to the right (so the entity gets close to the left edge of what I see in passthrough), the entity gets more and more displaced to the right. Same with left and up/down: the displacement follows my head movement.

I recorded it through the standard "record my view", and this effect seems even stronger on the recording, so I feel it has something to do with the magic you do between the raw camera input and what the displays show: the magic is not applied to the recording, so the effect is stronger there, while in passthrough it is corrected, but not enough to be perfect. So maybe it's not the entity that's drifting, but the image of the physical object in passthrough that gets displaced? Or I'm delusional, which is also possible :D. I'll file a bug report with a video and post the number here; I just need to prepare it on something I can share.
Topic: Spatial Computing SubTopic: ARKit Tags:
3w
Reply to ManipulationComponent + Warning messages in RealityView
Oh, so that's what's causing it. Since yesterday I've been trying to figure out what I messed up for this error to show up, and I couldn't find anything; removing the call to ManipulationComponent.configureEntity made it go away :). In general it seems to be working for me: if you call ManipulationComponent.configureEntity(someEntity), it sets up all the basic things you need (otherwise, my understanding is that you have to add all the required components, like collision, yourself). Then you can do, for example, someEntity.components[ManipulationComponent.self]?.releaseBehavior = .stay (configureEntity apparently also adds that component automatically, which I discovered five minutes ago, because earlier I was creating and adding my own). But yes, the error is always there when it's called...
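For reference, here is a small sketch of the setup described above, assuming the visionOS 26 RealityKit APIs; the function and entity names are just examples.

```swift
import RealityKit

// Sketch: enable the system manipulation gestures on an entity,
// then tweak the component that configureEntity adds automatically.
func enableManipulation(on someEntity: Entity) {
    // Adds the components manipulation needs (e.g. collision and input
    // target) and also attaches a ManipulationComponent by itself.
    ManipulationComponent.configureEntity(someEntity)

    // Adjust the auto-added component instead of creating a new one.
    if var manipulation = someEntity.components[ManipulationComponent.self] {
        manipulation.releaseBehavior = .stay  // keep the entity where it was released
        someEntity.components.set(manipulation)
    }
}
```

Reading the component back, mutating it, and calling components.set is the usual pattern for value-type components; overwriting it with a freshly created ManipulationComponent would discard whatever configureEntity set up.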
Topic: UI Frameworks SubTopic: SwiftUI Tags:
1w
Reply to ManipulationComponent + Warning messages in RealityView
Yes, once I found out that this component was causing that error (thanks to @Draiis' post here), I realized that it was some other place where I had messed up some relative offsets, which were pushing the entities far away from me. But when I was debugging it, the first thing I noticed was that error, which sent me on a wild goose chase, because I assumed I was doing something wrong that made things not appear, since they "already are parented" and that can cause "unexpected behavior" :). Submitted FB20176878.
Topic: UI Frameworks SubTopic: SwiftUI Tags:
21h