I am creating a fixed-focus camera app with the focus distance at infinity (or at least 30+ feet away).
When I set lensPosition to 1.0, the images were blurry.
Some tests letting autofocus do the job showed that a lensPosition of about 0.808 for my wide and telephoto lenses, and 0.84 for the ultra-wide lens, did the trick (iPhone 13 Max).
Will the lensPosition needed to focus at infinity vary between devices, and between the lenses on a given device?
Is there a way to determine the appropriate lensPosition at run time?
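For context, this is roughly how I'm locking focus now; the hard-coded value came from my own autofocus tests and is presumably device- and lens-specific (a sketch, not final code):

import AVFoundation

// Lock focus at a lens position measured earlier with autofocus.
// 0.808 is what autofocus reported on my wide lens; it is not a documented "infinity" value.
if let device = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back) {
    do {
        try device.lockForConfiguration()
        if device.isFocusModeSupported(.locked) {
            device.setFocusModeLocked(lensPosition: 0.808) { _ in
                // Focus is locked once this completion fires.
            }
        }
        device.unlockForConfiguration()
    } catch {
        print("Could not lock configuration: \(error)")
    }
}

Right now I get the 0.808/0.84 values by running .continuousAutoFocus on a distant subject and reading device.lensPosition afterwards, but I'd prefer something less ad hoc.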
Does Apple have any documentation on using Reality Converter to convert FBX to USDZ on an M1 Max?
I'm trying to convert an .fbx file to USDZ with Apple's Reality Converter on an M1 Mac (macOS 12.3 Beta), but everything I've tried so far has failed.
When I try to convert .fbx files on my Intel-based iMac Pro, it succeeds.
Following some advice on these forums, I tried installing all of the packages from Autodesk:
https://www.autodesk.com/developer-network/platform-technologies/fbx-sdk-2020-0
FBX SDK 2020.0.1 Clang
FBX Python SDK Mac
FBX SDK 2020.0.1 Python Mac
FBX Extensions SDK 2020.0.1 Mac
Still no joy.
I have a workaround: I still have my Intel-based iMac. But I'd like to switch over to my M1 Mac for all my development.
Any pointers?
Note: I couldn't get the usdzconvert command line tool to work on my M1 Mac either. /usr/bin/python isn't there.
I had a weird case today where an endpoint system extension remained running even after I deleted the .app bundle.

If I tried killing the process with "sudo kill -9 <pid>", the extension respawned. If I tried "sudo launchctl remove <name>", I was told I didn't have the privilege.

Searching my hard drive, I found a copy of the system extension in /Macintosh HD/Library/System Extensions/... I rebooted into recovery mode, deleted the extension bundle, and restarted. Everything initially looked fine, and the process did not come back. But when I then re-built, re-packaged, re-installed, and re-launched the application, the operating system complained that it could not find the system extension, even though it was there in the .app bundle.

The operating system seems to (A) create a cache/copy of the system extension bundle and (my guess) (B) maintain a link to that cached location somewhere, then try to launch that cached system extension bundle.

[My hacked solution was to rename the extension, including creating a new bundle ID and associated provisioning profile.]

Has anyone encountered a system extension that would not die? Did you figure out how to kill it and clear out any caches of it?

Thanks,
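P.S. For completeness, the only supported teardown path I'm aware of is having the container app submit a deactivation request before its bundle is deleted. A rough sketch (the bundle identifier is hypothetical):

import SystemExtensions

final class ExtensionRemover: NSObject, OSSystemExtensionRequestDelegate {
    func deactivate() {
        // Ask the OS to deactivate the extension; the identifier below is a placeholder.
        let request = OSSystemExtensionRequest.deactivationRequest(
            forExtensionWithIdentifier: "com.example.endpoint-extension",
            queue: .main)
        request.delegate = self
        OSSystemExtensionManager.shared.submitRequest(request)
    }

    // OSSystemExtensionRequestDelegate
    func request(_ request: OSSystemExtensionRequest,
                 didFinishWithResult result: OSSystemExtensionRequest.Result) {
        print("Deactivation finished: \(result)")
    }

    func request(_ request: OSSystemExtensionRequest, didFailWithError error: Error) {
        print("Deactivation failed: \(error)")
    }

    func requestNeedsUserApproval(_ request: OSSystemExtensionRequest) {}

    func request(_ request: OSSystemExtensionRequest,
                 actionForReplacingExtension existing: OSSystemExtensionProperties,
                 withExtension ext: OSSystemExtensionProperties)
        -> OSSystemExtensionRequest.ReplacementAction {
        .replace
    }
}

That obviously doesn't help once the app bundle is already gone, which is the situation I was in.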
Is there an equivalent to MultipeerConnectivityService that implements SynchronizationService over TCP/IP connections?
I'd like to have two users in separate locations, each with a local ARAnchor, but with a synchronized RealityKit scene graph attached to their separate ARAnchors.
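For context, the local-network setup I'd like a TCP/IP equivalent of looks roughly like this (arView is an existing ARView, and the peer name is illustrative):

import MultipeerConnectivity
import RealityKit

// Current same-network synchronization via MultipeerConnectivityService.
let peerID = MCPeerID(displayName: "player-one")
let session = MCSession(peer: peerID,
                        securityIdentity: nil,
                        encryptionPreference: .required)
arView.scene.synchronizationService = try? MultipeerConnectivityService(session: session)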
Is this possible?
Thanks,
I don't know if this is an issue with Apple's Reality Converter app or Blender (I'm using 3.0 on the Mac), but when I export a model as .obj and import it to Reality Converter, the scale is off by a factor of 100.
That is, the following workflow creates tiny (1/100 scale) entities:
Blender > [.obj] > Reality Converter > [USDZ]
But this workflow is OK:
Blender > [.glb] > Reality Converter > [USDZ]
Two workarounds are:
export as .glb/.gltf instead of .obj, or
when exporting as .obj, set the scale factor to 100 in Blender.
Is this a known issue, or am I doing something wrong?
If it is an issue, should I file a bug report?
I am finding some unexpected behavior with lights I've been adding to a RealityKit scene.
For example, I created 14 PointLights, but only 8 appeared to be used to illuminate the scene.
In another example, I created 7 PointLights and 7 SpotLights, and the frame rate dropped quite a bit.
Are lights computationally expensive, causing some adaptive behavior by RealityKit?
Should I be judicious in my use of lights for a scene?
(Note: I set arView.environment.lighting.resource to a skybox with a black image; my goal was to completely control the lighting. I don't know if that added to the computational load.)
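For reference, each light was created roughly like this; the intensity and radius values are placeholders, and anchorEntity is an existing anchor already in the scene:

import RealityKit

// One of the point lights added to the scene (values are illustrative).
let light = PointLight()
light.light.intensity = 10_000        // lumens
light.light.attenuationRadius = 5.0   // meters
light.position = [0, 2, 0]
anchorEntity.addChild(light)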
In a previous post I asked if 100,000 polygons is still the recommended size for USDZ Quick Look models on the web. (The answer is yes)
But I realize my polygons are 4-sided and not planar, so each one has to be broken down into 2 triangles when rendered.
Given that, should I shoot for 50,000 polygons (i.e., 100,000 triangles)?
Or does the 100,000 polygon statistic already assume polygons will be subdivided into triangles?
(The models are generated from digital terrain (GeoTIFF) data, not a 3D modeling tool)
I've recently added some USDZ files to a web page, and I can download and display them fine via AR Quick Look on an iPhone or iPad.
I've noticed full occlusion is active in the AR view.
Over time, the device appears to heat up and the frame rate drops.
Are there any properties I can set in the <a rel="ar" ...> HTML tag to control things like occlusion or autofocus (i.e., turn them off)?
RealityKit has a CollisionFilter to determine which entities can collide with which other ones.
Perchance, is there something similar for OcclusionMaterial?
In effect, I'd like a model with an OcclusionMaterial to be able to "occlude this entity but not that entity".
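For comparison, this is the collision-side filtering I have in mind; the groups and entity are illustrative:

import RealityKit

// Collision filtering: the box only collides with members of groupB.
let groupA = CollisionGroup(rawValue: 1 << 0)
let groupB = CollisionGroup(rawValue: 1 << 1)

let box = ModelEntity(mesh: .generateBox(size: 0.1))
box.generateCollisionShapes(recursive: false)
box.collision?.filter = CollisionFilter(group: groupA, mask: groupB)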
I've been creating USDA files manually and converting them to USDZ via Apple's usdzconvert tool (version 0.64).
In the file I set the unit size to 1 meter:
metersPerUnit = 1.0
but the USDZ keeps the unit size at 1 cm.
Apple's Reality Converter does process the metersPerUnit metadata, so that is a viable workaround for me. But sometimes I'd prefer the command-line tool.
Is there an update to the usdzconvert tool? I couldn't find one.
During my first external test using TestFlight for an In-App Purchase (iPadOS), the user was
(1) Prompted for their Apple ID & password
(2) Prompted for their password a second time
(3) (User believes) prompted for their password a third time
Are these multiple prompts for their password expected behavior, or have I done something wrong?
I'm looking for documentation/guidance on USDZ and scene model sizes. My focus is on RealityKit-based apps.
I found the 2018 WWDC presentation
Integrating Apps and Content with AR Quick Look
which mentions a rule of thumb for a USDZ model of:
100K polygons
One set of 2048x2048 textures
10 seconds of animations
Are these numbers still recommended in 2021?
Are these numbers just for AR Quick Look, or do they apply to RealityKit-based apps too?
If a RealityKit scene loads several USDZ models, should the cumulative number of polygons across all models be 100K, or is the 100K number on a per-model basis?
The talk mentioned that AR Quick Look will dynamically downsample textures for devices with less memory. Does RealityKit do this as well?
If so, can I err on the side of providing a larger texture (e.g., 4096 x 4096) and trust RealityKit to downsample it as appropriate for me?
(I am hoping there is some documentation covering questions like this)
When I create an AnchorEntity like this:
let entityAnchor = AnchorEntity(plane: [.horizontal], classification: [.floor], minimumBounds: [0.2,0.2])
and add a USDZ model to it, I get a nice ground shadow.
But if I create an AnchorEntity using an ARAnchor like this:
let entityAnchor = AnchorEntity(anchor: anchor)
I do not get that nice ground shadow.
Is there a way to get the ground shadow I get from a plane anchor, but with an AnchorEntity whose placement I can specify, or which I can attach to an ARAnchor?
[Note: for LiDAR devices, I can get a nice shadow using
config.sceneReconstruction = .mesh
arView.environment.sceneUnderstanding.options.insert(.occlusion)
arView.environment.sceneUnderstanding.options.insert(.receivesLighting)
but creating the environment mesh is computationally expensive. I'd like to avoid that if possible.]
I've been playing with Apple's StoreKit 2 demo code (buying the cars, subscriptions, ...), and sometimes when I purchase a car, one or more of the other buttons visually flips state (e.g., a purchased checkmark changes back to the price).
Leaving the StoreView and returning to it shows the correct state for each of the buttons.
I am using the StoreKit Configuration file Products.storekit (set in the scheme), so I am testing locally in Xcode.
I see this both in the simulator and on my actual phone.
The issue is random. The vast majority of the time everything works perfectly.
Is anyone else seeing this issue?
Does anyone know how to address it?
Dev environment:
Xcode 13.0 beta 5 (13A5212g)
macOS 12.0 Beta (21A5534d)
Mac mini (M1, 2020)
I've been working in Swift on iOS, accessing images via UIImagePickerController, retrieving the PHAsset from the picker delegate's "info" dictionary, and then pulling GPS information from the PHAsset.
For newer photos, the asset.location is populated with GPS information.
Also, with newer photos, CIImage's property dictionary has {GPS} information.
So all is good with newer photos.
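For reference, the retrieval path looks roughly like this (a simplified sketch of the picker delegate callback):

import UIKit
import Photos
import CoreImage

func imagePickerController(_ picker: UIImagePickerController,
                           didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
    // PHAsset from the picker (photo-library authorization already granted).
    if let asset = info[.phAsset] as? PHAsset {
        print("asset.location:", String(describing: asset.location))
    }
    // {GPS} dictionary read from the image data itself.
    if let url = info[.imageURL] as? URL,
       let ciImage = CIImage(contentsOf: url) {
        print("{GPS}:", String(describing: ciImage.properties["{GPS}"]))
    }
    picker.dismiss(animated: true)
}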
But when I go back to images taken in 2017, asset.location is nil and there is no {GPS} information in the CIImage.
However, if I export the photo from the Photos app on my Mac and then view it in Preview, there *is* GPS information.
So am I missing some settings to find the GPS information in older photos using PHAsset on iOS?
Thanks,