
Reality Converter fails to convert FBX on M1 Mac
Does Apple have any documentation on using Reality Converter to convert FBX to USDZ on an M1 Mac? I'm trying to convert an .fbx file to USDZ with Apple's Reality Converter on an M1 Mac (macOS 12.3 Beta), but everything I've tried so far has failed. The same .fbx files convert successfully on my Intel-based iMac Pro.

Following some advice on these forums, I installed all of the packages from Autodesk (https://www.autodesk.com/developer-network/platform-technologies/fbx-sdk-2020-0):

FBX SDK 2020.0.1 Clang
FBX Python SDK Mac
FBX SDK 2020.0.1 Python Mac
FBX Extensions SDK 2020.0.1 Mac

Still no joy. I have a workaround - I still have my Intel-based iMac - but I'd like to switch over to my M1 Mac for all my development. Any pointers?

Note: I couldn't get the usdzconvert command-line tool to work on my M1 Mac either; /usr/bin/python isn't there.
7 replies · 0 boosts · 4.2k views · May ’22
flickering, double vision with raycast on iPadOS 15.0
In ARKit+RealityKit, I do a raycast from the ARView's center, then create an AnchorEntity at the result and add a target ModelEntity (a flattened cube) to the AnchorEntity.

```swift
guard let result = session.raycast(query).first else { return }
let newAnchor = AnchorEntity(raycastResult: result)
newAnchor.addChild(placementTargetEntity)
arView.scene.addAnchor(newAnchor)
```

I repeat this on each frame update via the ARSessionDelegate method session(_:didUpdate:), removing the previous AnchorEntity first. I use this as a target to show the user where the full model will be placed when they tap the screen.

This works fine under iOS 14, but I get strange results on iPadOS 15: two different placements are created on different screen updates, offset from each other and slightly rotated relative to each other.

Has anyone else had issues with raycast() or with creating an AnchorEntity from the result? Is using session(_:didUpdate:) via ARSessionDelegate to update virtual content considered bad style now? (I noticed the WWDC21 sessions used a different mechanism to update their virtual content.)

(If any Apple engineers read this: I filed feedback FB9535616 with sample code and a video of the issue.)
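For completeness, the per-frame update is wired up roughly like this. This is a condensed sketch rather than my exact code; the class name and the currentTargetAnchor property are placeholders:

```swift
import ARKit
import RealityKit

// Sketch of the per-frame placement-target update described above.
// `placementTargetEntity` is the flattened-cube target; `currentTargetAnchor`
// tracks the anchor added on the previous frame so it can be removed.
final class PlacementCoordinator: NSObject, ARSessionDelegate {
    let arView: ARView
    let placementTargetEntity: ModelEntity
    private var currentTargetAnchor: AnchorEntity?

    init(arView: ARView, placementTargetEntity: ModelEntity) {
        self.arView = arView
        self.placementTargetEntity = placementTargetEntity
        super.init()
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        let center = CGPoint(x: arView.bounds.midX, y: arView.bounds.midY)
        guard let query = arView.makeRaycastQuery(from: center,
                                                  allowing: .estimatedPlane,
                                                  alignment: .any),
              let result = session.raycast(query).first else { return }

        // Remove the previous frame's anchor before adding the new one.
        if let oldAnchor = currentTargetAnchor {
            arView.scene.removeAnchor(oldAnchor)
        }
        let newAnchor = AnchorEntity(raycastResult: result)
        newAnchor.addChild(placementTargetEntity)
        arView.scene.addAnchor(newAnchor)
        currentTargetAnchor = newAnchor
    }
}
```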
5 replies · 0 boosts · 1.2k views · Aug ’21
visionOS 3D tap location offset by ~0.35m?
I have a simple visionOS app that uses a RealityView to map floors and ceilings using PlaneDetectionProvider and PlaneAnchors. I can look at a location on the floor or ceiling, tap, and place an object at that location (currently a small cube with X-Y-Z axes sticking out of it).

The tap locations are consistently about 0.35m off along the horizontal plane (never off vertically) from where I was looking. Has anyone else run into a spatial tap gesture resulting in a location offset from where they are looking?

If I move to different locations, the offset stays the same in real space, so it doesn't appear to be associated with the orientation of the Apple Vision Pro (e.g., it isn't off a little to the left of where the headset was pointing).

Attached is an image showing this. I focused on the corner of the carpet (yellow circle), tapped my fingers to trigger a tap gesture in RealityView, extracted the location, and placed a purple cube at that location. I stood in 4 different locations (the orange squares), looked at the corner of the rug (yellow circle), and tapped. All 4 purple cubes were placed at about the same location, ~0.35m away from the look location.

Here is how I capture the tap gesture and extract the 3D location:

```swift
var myTapGesture: some Gesture {
    SpatialTapGesture()
        .targetedToAnyEntity()
        .onEnded { event in
            let location3D = event.convert(event.location3D, from: .global, to: .scene)
            let entity = event.entity
            model.handleTap(location: location3D, entity: entity)
        }
}
```

And here is how I set the position of the purple cube:

```swift
func handleTap(location: SIMD3<Float>, entity: Entity) {
    let positionEntity = Entity()
    positionEntity.setPosition(location, relativeTo: nil)
    ...
}
```
5 replies · 0 boosts · 1.6k views · Apr ’24
Reality Converter scale issue
I don't know if this is an issue with Apple's Reality Converter app or with Blender (I'm using 3.0 on the Mac), but when I export a model as .obj and import it into Reality Converter, the scale is off by a factor of 100. That is, the following workflow creates tiny (1/100 scale) entities:

Blender > [.obj] > Reality Converter > [USDZ]

But this workflow is OK:

Blender > [.glb] > Reality Converter > [USDZ]

Two workarounds are:

export as .glb/.gltf instead, or
set the scale factor to 100 in Blender when exporting .obj.

Is this a known issue, or am I doing something wrong? If it is an issue, should I file a bug report?
3 replies · 0 boosts · 1.9k views · Dec ’21
Getting to MeshAnchor.MeshClassification from MeshAnchor?
I am working with MeshAnchors and am having trouble getting to the classification of the triangles/faces. This post references MeshAnchor.Geometry, and that struct does have a property named "classifications", but it is of type GeometrySource, and I cannot find any classification information in GeometrySource. Am I missing something there? I think I am looking for something of type MeshAnchor.MeshClassification, but I cannot find any struct with a property of that type.
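In case it helps anyone answer: if classifications uses the same per-face layout as ARGeometrySource does for scene reconstruction on iOS (one 8-bit value per face, addressed via the source's buffer, offset, and stride), I would expect something like the following to pull a value out. This is a guess I have not been able to verify:

```swift
import ARKit

// Unverified sketch: read the classification of one face out of the
// classifications GeometrySource, assuming one UInt8 per face laid out
// according to the source's offset and stride.
func classification(ofFace index: Int,
                    in geometry: MeshAnchor.Geometry) -> MeshAnchor.MeshClassification? {
    guard let source = geometry.classifications, index < source.count else { return nil }
    let pointer = source.buffer.contents()
        .advanced(by: source.offset + source.stride * index)
    let rawValue = pointer.assumingMemoryBound(to: UInt8.self).pointee
    return MeshAnchor.MeshClassification(rawValue: Int(rawValue))
}
```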
3 replies · 0 boosts · 1.3k views · Feb ’25
Zombie System Extensions
I had a weird case today where an endpoint security system extension remained running even after I deleted the .app bundle. If I tried killing the process with "sudo kill -9 <pid>", the extension respawned. If I tried "sudo launchctl remove <name>", I was told I didn't have the privilege.

Searching my hard drive, I found a copy of the system extension under /Library/SystemExtensions/... I rebooted into recovery mode, deleted the extension bundle, and restarted. Everything initially looked fine, and the process did not come back. But when I then tried to re-build, re-package, re-install, and re-launch the application, the operating system complained that it could not find the system extension, even though it was there in the .app bundle.

The operating system seems to (A) create a cache/copy of the system extension bundle and (my guess) (B) maintain a link to that cache location somewhere and try to launch that cached system extension bundle.

[My hacked solution was to rename the extension, including creating a new bundle ID and associated provisioning profile.]

Has anyone encountered a system extension that would not die? Did you figure out how to kill it and clear out any caches of it?

Thanks,
10 replies · 2 boosts · 7.9k views · Feb ’22
PHAsset of older photos is missing GPS info in Swift
I've been working in Swift on iOS, accessing images via UIImagePickerController, pulling the PHAsset from the picker delegate's "info" dictionary, and then pulling GPS information from the PHAsset. For newer photos, asset.location is populated with GPS information, and the CIImage's property dictionary has {GPS} information, so all is good with newer photos.

But when I go back to images taken in 2017, asset.location is nil and there is no {GPS} information in the CIImage. However, if I export the same photo from the Photos app on my Mac and then view it in Preview, there *is* GPS information.

Am I missing some setting needed to find the GPS information in older photos using PHAsset on iOS? Thanks,
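For context, here is a condensed sketch of the flow I described (the class name is a placeholder, and it assumes photo library access has been granted so the picker can vend a PHAsset):

```swift
import Photos
import UIKit

// Condensed sketch: pull the PHAsset out of the picker's info dictionary
// and read its location, as described above.
final class PickerDelegate: NSObject, UIImagePickerControllerDelegate,
                            UINavigationControllerDelegate {
    func imagePickerController(_ picker: UIImagePickerController,
                               didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
        if let asset = info[.phAsset] as? PHAsset {
            if let location = asset.location {
                print("GPS:", location.coordinate.latitude, location.coordinate.longitude)
            } else {
                print("asset.location is nil")  // what I see for the 2017 photos
            }
        }
        picker.dismiss(animated: true)
    }
}
```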
2 replies · 0 boosts · 1.3k views · Sep ’21
RealityKit Transform rotation: choosing clockwise vs. anti-clockwise
I'm using Transform's move(to:relativeTo:duration:timingFunction:) to rotate an Entity around the Y axis in an animated fashion (e.g., over a duration of 2 seconds). Unfortunately, when I rotate from 6 radians (343.7°) to 6.6 radians (378.2°), the rotation does not continue anti-clockwise past 2π (360°) but instead goes backwards to 0.317 radians (18.2°). Is there a way to force an animated rotation about an axis to go in a clockwise or anti-clockwise direction?
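The workaround I'm experimenting with is to break the turn into steps smaller than π radians (quaternion interpolation always takes the shorter arc) and chain the steps off AnimationEvents.PlaybackCompleted. A rough, untested sketch; the names are placeholders:

```swift
import RealityKit
import Combine

// Hedged workaround sketch: split a large rotation into steps of less than
// pi radians each, so every step interpolates in the intended direction.
// `subscriptions` must be retained by the caller for the chain to fire.
var subscriptions: [Cancellable] = []

func rotate(_ entity: Entity, byAngle angle: Float, duration: TimeInterval,
            in scene: RealityKit.Scene) {
    let steps = max(1, Int(ceil(abs(angle) / (0.9 * .pi))))  // keep each step under pi
    let stepAngle = angle / Float(steps)
    let stepDuration = duration / Double(steps)
    var remaining = steps

    func step() {
        remaining -= 1
        var transform = entity.transform
        transform.rotation = simd_quatf(angle: stepAngle, axis: [0, 1, 0]) * transform.rotation
        entity.move(to: transform, relativeTo: entity.parent, duration: stepDuration)
    }

    // Kick off the next step each time the previous animation completes.
    // (A real implementation would cancel this subscription once done.)
    subscriptions.append(scene.subscribe(to: AnimationEvents.PlaybackCompleted.self, on: entity) { _ in
        if remaining > 0 { step() }
    })

    step()
}
```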
2 replies · 0 boosts · 1.7k views · Apr ’21
ARKit FPS drop when device gets hot?
During testing of my app, the frames per second (shown either in the Xcode debug navigator or via ARView's .showStatistics) sometimes drops by half and stays down there. The low FPS persists even when I kill the app completely and restart it. However, after giving my phone a break, it returns to 60 fps.

Does ARKit automatically throttle the FPS when the device gets too hot? If so, is there a signal my program can catch from ARKit or the OS telling me this is happening?
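For what it's worth, the only OS-level signal I know of is ProcessInfo's thermal state. I don't know whether ARKit keys off it, but watching it would look roughly like this:

```swift
import Foundation

// Minimal sketch: observe the system thermal state. A .serious or .critical
// state doesn't prove ARKit is throttling, but it coincides with the system
// reducing performance on a hot device.
let observer = NotificationCenter.default.addObserver(
    forName: ProcessInfo.thermalStateDidChangeNotification,
    object: nil,
    queue: .main
) { _ in
    switch ProcessInfo.processInfo.thermalState {
    case .nominal, .fair:
        print("Thermal state OK")
    case .serious, .critical:
        print("Device is hot; the system may be throttling")
    @unknown default:
        break
    }
}
```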
2 replies · 0 boosts · 1.7k views · Sep ’21
polygon count vs. triangle count?
In a previous post I asked whether 100,000 polygons is still the recommended size for USDZ Quick Look models on the web. (The answer is yes.) But I realized my polygons are four-sided and not planar, so each has to be broken into 2 triangles when rendered. Given that, should I shoot for 50,000 polygons (i.e., 100,000 triangles)? Or does the 100,000-polygon figure already assume polygons will be subdivided into triangles? (The models are generated from digital terrain (GeoTIFF) data, not in a 3D modeling tool.)
2 replies · 0 boosts · 2.4k views · Nov ’21
iOS 16.2: Cannot load underlying module for 'ARKit'
I have a strange warning in Xcode associated with ARKit. It isn't a big issue because there is a workaround, but I am curious why this is happening and how I can avoid it in the future.

I opened an old AR project in Xcode, and the editor flagged the "import ARKit" line with a strange error message:

Cannot load underlying module for 'ARKit'

Despite the error message, the code continues to build and run. I've quit and restarted Xcode, rebooted the Mac, and even deleted and redownloaded Xcode, but the error/warning is still there. With some additional testing, I discovered that I only get this message when targeting iOS 16.2, not when targeting 16.0, 16.1, 16.3, or 16.4. (I did not try pre-iOS 16 targets.)

Any idea why my Xcode no longer likes ARKit on iOS 16.2?

Development platform:

Xcode: Version 14.3.1 (14E300c)
macOS: 13.4 (22F66)
Mac: Mac Studio 2022
iOS on iPhone 14 Pro Max: 16.5 (20F66)

(Screenshot attached.)
2 replies · 1 boost · 1.4k views · Aug ’23
Xcode RC didn't have an option for the visionOS simulator
I just downloaded the latest Xcode release candidate, Version 15.0 (15A240d), and ran into some issues:

On start-up, I was not given an option to download the visionOS simulator.
I cannot create a project targeting visionOS.
I cannot build/run a hello-world app for visionOS.

In my previous Xcode beta (Version 15.0 beta 8 (15A5229m)), there was an option to download the visionOS simulator, and I could create projects for visionOS and run the code in the simulator.

The downloaded file was named "Xcode" instead of "Xcode-beta". I didn't want to get rid of the existing Xcode, so I selected Keep Both. Now I have 3 Xcodes in the Applications folder:

Xcode
Xcode copy
Xcode-beta

That is the only thing I can see that might have been different about my install.

Hardware:

Mac Studio 2022 with M1 Max
macOS Ventura 13.5.2

Any idea what I did wrong?
2 replies · 0 boosts · 1.8k views · Sep ’23