Reply to Reality Converter fails to convert FBX on M1 Mac
Success! In another thread someone mentioned installing older versions of Autodesk software, and that did the trick for me too. (Note: this is still with Monterey 12.2, not the 12.3 beta where I initially had problems.) Here is the previous thread where I found the suggestion: https://developer.apple.com/forums/thread/661927
Topic: Graphics & Games SubTopic: General Tags:
Feb ’22
Reply to If a USDZ file can play multiple animations?
In a previous project I needed to create separate USDZ files for each animation - one animation per USDZ file. I loaded the first USDZ model and kept both the model and its animation. For each additional animation, I loaded its USDZ file, kept the animation, and discarded the model part. I could then apply all of the animations to the full model I loaded first.
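A rough sketch of that approach in RealityKit (the file names "model" and "wave" are placeholders; the animation-only USDZ must share the same rig as the base model):

```swift
import RealityKit

// Load the base model and keep it (it carries its own first animation),
// then load an animation-only USDZ, keep its animation, and discard
// the donor entity.
func loadModelWithExtraAnimation() throws -> (Entity, AnimationResource?) {
    let model = try Entity.load(named: "model")

    let animationDonor = try Entity.load(named: "wave")
    let extraAnimation = animationDonor.availableAnimations.first

    return (model, extraAnimation)
}

// Usage: play the borrowed animation on the full model.
// if let anim = extraAnimation { model.playAnimation(anim.repeat()) }
```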
Topic: Spatial Computing SubTopic: ARKit Tags:
Feb ’23
Reply to Y and Z axis of RealityView coordinate space appear to be flipped in visionOS Simulator
I believe this is the expected behavior and consistent with RealityKit on the iPhone when attaching an anchor to a vertical surface. Attached is an old RealityKit test I did, where I anchored the object to the wall. In this case, the green cube is along the +X axis, the red cube along the +Y axis, and the blue cube along the +Z axis. If I recall correctly, when I attached things to the ceiling, the +Y axis pointed down. In general, I believe the +Y axis is the normal of the surface.
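For reference, a minimal iOS RealityKit sketch of that test setup (sizes and offsets are arbitrary): three cubes parented to a vertical-plane anchor so you can see which way each axis points. On a wall, expect +Y (the red cube) to stick out along the surface normal.

```swift
import RealityKit
import UIKit

// Anchor three small cubes to a vertical surface to visualize the
// anchor's local axes: green = +X, red = +Y (surface normal), blue = +Z.
func makeAxisVisualization() -> AnchorEntity {
    let anchor = AnchorEntity(plane: .vertical)

    func cube(_ color: UIColor, at position: SIMD3<Float>) -> ModelEntity {
        let entity = ModelEntity(
            mesh: .generateBox(size: 0.05),
            materials: [SimpleMaterial(color: color, isMetallic: false)])
        entity.position = position
        return entity
    }

    anchor.addChild(cube(.green, at: [0.2, 0, 0])) // +X
    anchor.addChild(cube(.red,   at: [0, 0.2, 0])) // +Y
    anchor.addChild(cube(.blue,  at: [0, 0, 0.2])) // +Z
    return anchor
}
```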
Topic: Graphics & Games SubTopic: RealityKit Tags:
Jul ’23
Reply to How to import a .stl 3D model into VisionOS
Can Apple's Reality Converter convert the .stl file to .usdz? (I don't think the old version could.) On occasion, I've imported models into Blender, exported them as GLTF files, and then used Reality Converter to convert them to USDZ. One issue I've run into with this process is the unit size. For example, in Reality Converter I will change the unit size from meter to centimeter and then back to meter, and then export from Reality Converter. For some reason, this solves the problem of a model appearing at 1/100 the size you expect.
Topic: App & System Services SubTopic: Core OS Tags:
Jul ’23
Reply to Apple AR
My personal recommendation (as a regular developer; not someone from Apple) would be to start with Apple's WWDC sessions on Spatial Computing. The iPhone/iPad AR experiences will probably need a different user experience from Apple Vision Pro AR experiences. For example, on iPhone and iPad, because the user's hands are holding the device, they can't do much in the way of interacting with content. Also, Apple recommends AR experiences for iPhone & iPad only last for 1-2 minutes at a time for various reasons. See Apple's 2022 WWDC session Qualities of great AR experiences. After that, I recommend starting from Apple's oldest sessions (2019, at the bottom of that web page) and working forward in time. Finally, while Apple Vision Pro is the coolest platform for AR (I desperately want a dev kit), don't ignore the approximately 1 billion people with iPhones and iPads who could run your AR/VR applications if you target those platforms.
Topic: Spatial Computing SubTopic: ARKit Tags:
Jul ’23
Reply to Does RealityKit support Blend Shapes
I have yet to do this (I am working towards it), but there are a number of videos on YouTube showing how to rig Blender models to work with Apple's ARKit (on YouTube's site, search for "iPhone blendshapes" or "ARKit blendshapes"). I've also seen people willing to rig models for a nominal fee to support Apple's blendshapes. I think Apple's documentation on ARFaceAnchor is a good starting point. I've also played a bit with Apple's Tracking and Visualizing Faces sample app. The code is a little out of date, but it is still a good place to start.
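As a taste of what the ARFaceAnchor documentation covers, here is a small sketch of reading blend-shape coefficients from a face anchor (you would receive the anchor in an ARSessionDelegate callback; the coefficients run from 0.0, neutral, to 1.0):

```swift
import ARKit

// Read a couple of blend-shape coefficients from an ARFaceAnchor.
// These values can drive the matching morph targets on a rigged model.
func logBlendShapes(for anchor: ARFaceAnchor) {
    let blendShapes = anchor.blendShapes // [ARFaceAnchor.BlendShapeLocation: NSNumber]

    if let jawOpen = blendShapes[.jawOpen]?.floatValue,
       let leftBlink = blendShapes[.eyeBlinkLeft]?.floatValue {
        print("jawOpen: \(jawOpen), eyeBlinkLeft: \(leftBlink)")
    }
}
```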
Topic: Graphics & Games SubTopic: RealityKit Tags:
Jul ’23
Reply to iOS 16.2: Cannot load underlying module for 'ARKit'
Just bumping this up, in part because I ran into this problem again, this time while trying to track down the underlying cause of a <> issue in a SwiftUI coordinator. I started a new project today from the "Augmented Reality App" template, and now I get this ARKit error with iOS 16.4 but not iOS 16.2 (or any other iOS minimum deployment). Sometimes I feel like Apple is gaslighting me. :-) This issue also prevents Xcode from knowing a variable's type, showing <> (see screenshot below), which causes additional problems. (Again, all problems go away when I target a different minimum deployment target.) Values for "Minimum Deployments" where I get the error and where I do not:
iOS 16.4 - error
iOS 16.3 - NO error
iOS 16.2 - NO error
I can replicate the error by creating a new iOS project with the "Augmented Reality App" template and simply adding "import ARKit" to the group of imports. In this example, with the minimum deployment set to iOS 16.4, Xcode doesn't know what type foo is; it shows <>. But when I change the minimum deployment target to iOS 16.3, Xcode knows foo is of type PerspectiveCamera.
Aug ’23
Reply to 'MultipeerConnectivityService' is unavailable in visionOS?
SharePlay with visionOS appears to hide the location data of other people from the app. For example, data about the other personas (beyond their immersion status) is not exposed via APIs. I am guessing this is for privacy reasons (?). I am not sure how Apple handles (or will handle) people in the same physical room. So far, I haven't seen any examples (or WWDC videos) covering this. I look forward to some clarification and examples. One possible workaround for people in the same physical room is to anchor the virtual content to an image. Print that image on a piece of paper and place it on the floor or a table. The two participants should see the same virtual content in the same location and orientation because it is tied to something physical (the printed paper).
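A sketch of the printed-image idea using the iOS/iPadOS RealityKit image-anchor target (the resource-group and image names are placeholders; the reference image must be added to an AR Resource Group in the asset catalog, and on visionOS you would instead go through ARKit's image-tracking support):

```swift
import RealityKit

// Anchor shared content to a printed reference image so everyone in the
// room sees it in the same physical location and orientation.
func makeSharedAnchor(content: Entity) -> AnchorEntity {
    let anchor = AnchorEntity(.image(group: "AR Resources", name: "SharedMarker"))
    anchor.addChild(content)
    return anchor
}
```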
Aug ’23
Reply to PerspectiveCamera in portrait and landscape modes
I have found a workaround, but I don't know if this is a good design or not. In "Deployment Info" in Xcode, I set iPhone/iPad Orientation to Portrait only. Then, when I rotate the device on its side, the cube doesn't change size. (Note: I am getting the device's eulerAngles, converting them to a quaternion, and applying that to the PerspectiveCamera.)
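The euler-angles-to-quaternion step looks roughly like this (a sketch, not my exact code: angles are assumed to be in radians, and the multiplication order is one reasonable choice that may need adjusting for your coordinate conventions):

```swift
import RealityKit
import simd

// Build a quaternion from the device's Euler angles and apply it
// to the PerspectiveCamera's orientation.
func orient(_ camera: PerspectiveCamera, pitch: Float, yaw: Float, roll: Float) {
    let qPitch = simd_quatf(angle: pitch, axis: [1, 0, 0])
    let qYaw   = simd_quatf(angle: yaw,   axis: [0, 1, 0])
    let qRoll  = simd_quatf(angle: roll,  axis: [0, 0, 1])
    camera.orientation = qYaw * qPitch * qRoll
}
```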
Topic: Spatial Computing SubTopic: ARKit Tags:
Aug ’23
Reply to Getting to MeshAnchor.MeshClassification from MeshAnchor?
I made some progress. When creating the SceneReconstructionProvider, specify the classification mode:
let sceneReconstruction = SceneReconstructionProvider(modes: [.classification])
Then the MeshAnchor.Geometry's classification property is set. Here are some example values:
count: 11342
description: GeometrySource(count: 11342, format: MTLVertexFormat(rawValue: 45))
format: 45
componentsPerVector: 1
offset: 0
stride: 1
So I am guessing the buffer contains a sequence of values that map to the MeshAnchor.MeshClassification raw values. (Now I just need to figure out which MTLVertexFormat case has a raw value of 45 :-)
Edit: uchar is type 45, so the buffer contains a sequence of unsigned bytes.
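Putting that together, here is a sketch of pulling the classification bytes out of the geometry source's Metal buffer, assuming (per the values above) one unsigned byte per entry that maps to a MeshClassification raw value:

```swift
import ARKit

// Decode the per-entry classification bytes from a MeshAnchor's geometry.
// Assumes uchar (UInt8) entries, as observed with MTLVertexFormat raw value 45.
func classifications(for geometry: MeshAnchor.Geometry) -> [MeshAnchor.MeshClassification] {
    guard let source = geometry.classifications else { return [] }
    let base = source.buffer.contents().advanced(by: source.offset)
    return (0..<source.count).map { index in
        let raw = base.advanced(by: index * source.stride)
            .assumingMemoryBound(to: UInt8.self).pointee
        return MeshAnchor.MeshClassification(rawValue: Int(raw)) ?? .none
    }
}
```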
Topic: Spatial Computing SubTopic: ARKit Tags:
Mar ’24