
Will you make the Object Capture GUI sample app available?
Will you make the GUI sample app that was used during the session available as well? Thanks!
Replies: 6 · Boosts: 0 · Views: 1.9k · Activity: Jun ’23
SceneKit - error Thread 1: "*** -[NSPathStore2 stringByAppendingPathExtension:]: nil argument
We are trying to save a scene to USDZ using the scene?.write method, which worked as expected until iOS 17. On iOS 17 we get the error Thread 1: "*** -[NSPathStore2 stringByAppendingPathExtension:]: nil argument", which seems to be a SceneKit issue. Attaching a stack-trace screenshot for reference. We used the updated URL method in scene?.write(to: url, delegate: nil), where url was generated using the .appending(path: String) method.
Replies: 2 · Boosts: 3 · Views: 1.2k · Activity: Jul ’23
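For anyone hitting the same crash, a minimal sketch of the failing call and a possible workaround: building the URL with an explicit path extension so SceneKit never has to derive one itself. The file name "model" and the temporary-directory location are placeholders, and whether this avoids the iOS 17 crash is an assumption worth verifying.

```swift
import SceneKit

func exportScene(_ scene: SCNScene) {
    // Crashes on iOS 17 when the URL was built like this, apparently
    // because SceneKit cannot derive a path extension internally:
    // let url = FileManager.default.temporaryDirectory.appending(path: "model.usdz")

    // Workaround sketch: attach the extension explicitly so
    // stringByAppendingPathExtension: never receives nil.
    let url = FileManager.default.temporaryDirectory
        .appendingPathComponent("model", isDirectory: false)
        .appendingPathExtension("usdz")

    scene.write(to: url, options: nil, delegate: nil, progressHandler: nil)
}
```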
usdzconvert realworld scale is off
Hi, I've been trying to use the command-line usdzconvert script to convert straight from OBJ + textures to USDZ, but the scaling is completely off. Even at 1000% scale-up in AR view, the model is still much smaller than its actual size. The same happens when we convert from a glTF file instead of OBJ. It doesn't happen, however, when using the Reality Converter app. Has anyone else run into this? Can anyone from Apple reproduce it? Cheers, Markus
Replies: 2 · Boosts: 0 · Views: 1.7k · Activity: May ’21
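For reference, the basic invocation being described (file names are placeholders). Since OBJ carries no intrinsic unit information, the script's own help is worth checking for a unit or scale option to declare the source units explicitly:

```shell
# Basic conversion from OBJ + textures to USDZ (file names are placeholders).
usdzconvert model.obj model.usdz

# OBJ has no unit metadata; check the installed script's help output
# for a unit/scale option before assuming one exists.
usdzconvert -h
```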
USD draco support ?
USD added Draco compression support in 19.11. Is this something Apple is also considering adopting?
Replies: 2 · Boosts: 0 · Views: 1.7k · Activity: Jun ’23
Object Capture Pivot point changes in macOS14
Has anybody noticed a pivot issue in models constructed through Object Capture? Ideally the pivot of the object should be the centre of its bounding box, but with the new macOS changes the pivot is now at (0, 0, 0), below the bounding box. Here is a quick comparison, old vs. new.
Replies: 0 · Boosts: 1 · Views: 720 · Activity: Sep ’23
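Until the behavior change is clarified, one possible workaround is to recenter the pivot on the bounding box after loading the reconstructed model in SceneKit. A sketch, assuming the model has already been loaded into an SCNNode:

```swift
import SceneKit

/// Moves a node's pivot to the center of its bounding box,
/// matching the pre-macOS 14 Object Capture placement.
func centerPivot(of node: SCNNode) {
    let (min, max) = node.boundingBox
    let center = SCNVector3(
        (min.x + max.x) / 2,
        (min.y + max.y) / 2,
        (min.z + max.z) / 2
    )
    // The pivot matrix translates the node's local origin to the center.
    node.pivot = SCNMatrix4MakeTranslation(center.x, center.y, center.z)
}
```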
Photogrammetry Session - new model?
Hi Apple Team, we noticed the following exciting entry in the changelog for the latest macOS 26 beta: "A new algorithm significantly improves PhotogrammetrySession reconstruction quality of low-texture objects not captured with the ObjectCaptureSession front end. It will be downloaded and cached once in the background when the PhotogrammetrySession is used at runtime. If network isn't available at that time, the old low quality model will be used until the new one can be downloaded. There is no code change needed to get this improved model. (145220451)" However, after trying this on the latest beta and running some tests, we do not see any difference on objects with low texture, such as single-coloured surfaces. Is there anything we are missing? The machine is definitely connected to the internet, but we have no way of knowing from the logs whether the new model is being used. Thanks!
Replies: 5 · Boosts: 1 · Views: 669 · Activity: Jul ’25
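For completeness, the reconstruction path being tested is the standard one; there is no public API to query which reconstruction model the session uses, so only the output quality can be compared. A condensed sketch (input/output paths are placeholders):

```swift
import RealityKit

// Minimal macOS reconstruction sketch; there is no code change
// involved in picking up the improved model per the changelog.
func reconstruct() async throws {
    let input = URL(fileURLWithPath: "/tmp/Images", isDirectory: true)
    let output = URL(fileURLWithPath: "/tmp/model.usdz")

    let session = try PhotogrammetrySession(input: input)
    try session.process(requests: [.modelFile(url: output, detail: .medium)])

    for try await event in session.outputs {
        switch event {
        case .requestComplete(_, let result):
            print("Finished:", result)
        case .requestError(_, let error):
            print("Failed:", error)
        default:
            break
        }
    }
}
```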
How to keep users from resizing models in AR Quick Look?
Is there a way to lock down the scale of a model in AR Quick Look?
Replies: 2 · Boosts: 0 · Views: 2.9k · Activity: Jul ’21
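One supported way (iOS 13+) is to present the model through ARQuickLookPreviewItem and turn off content scaling, which disables pinch-to-scale so the model stays at real-world size. A sketch; the class and file URL handling around it are assumptions about the asker's setup:

```swift
import QuickLook
import ARKit

final class ModelPreviewDataSource: NSObject, QLPreviewControllerDataSource {
    let modelURL: URL  // path to a .usdz file (placeholder)

    init(modelURL: URL) { self.modelURL = modelURL }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        let item = ARQuickLookPreviewItem(fileAt: modelURL)
        // Disables pinch-to-scale in AR Quick Look.
        item.allowsContentScaling = false
        return item
    }
}
```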
Supported image formats for Object Capture API
Are both TIFFs and DNG (Apple ProRAW format) currently not supported?
Replies: 1 · Boosts: 0 · Views: 1.1k · Activity: Dec ’21
2x Telephoto (enabled by quad-pixel sensor) - Developer Documentation?
Is there any developer documentation on how we can make use of this new 2x mode on the iPhone 14 Pro?
Replies: 2 · Boosts: 0 · Views: 1.3k · Activity: Oct ’22
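For what it's worth, AVFoundation appears to expose the quad-pixel 2x crop through the device format's secondaryNativeResolutionZoomFactors (iOS 16+). A sketch, assuming a back-camera device has already been obtained; treat the property name as something to verify against the AVFoundation docs:

```swift
import AVFoundation

// Find a format advertising a secondary native zoom (e.g. 2x on the
// iPhone 14 Pro's quad-pixel sensor) and zoom to it.
func enableTwoTimesMode(on device: AVCaptureDevice) throws {
    guard let format = device.formats.first(where: {
        $0.secondaryNativeResolutionZoomFactors.contains(2.0)
    }) else { return }

    try device.lockForConfiguration()
    device.activeFormat = format
    device.videoZoomFactor = 2.0
    device.unlockForConfiguration()
}
```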
Does anyone actually notice any improvements using the new ObjectCaptureSession with PhotogrammetrySession?
We have implemented all of Apple's recent additions for guided capture using LiDAR and image data via ObjectCaptureSession on the iOS side. After the capture finishes, we send our images to PhotogrammetrySession on macOS to reconstruct models at higher quality (Medium) than the Preview quality currently supported on iOS. We have now done a few side-by-side captures using the new ObjectCaptureSession vs. the traditional capture via the AVFoundation framework, but have not seen the improvements that were claimed in the session Apple hosted at WWDC. As a matter of fact, we feel the results are actually worse, because the images obtained through the new ObjectCaptureSession aren't as high quality as the images we get from AVFoundation. Are we missing something here? Is PhotogrammetrySession on macOS not using this new additional LiDAR data, or have the improvements been overstated? From the documentation it is not at all clear how the new LiDAR data gets stored and transferred. We are using iOS 17 beta 4 and macOS Sonoma beta 4 in our testing; both codebases were compiled with Xcode 15 beta 5.
Replies: 1 · Boosts: 0 · Views: 953 · Activity: Jul ’23
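For context, the iOS-side capture flow in question looks roughly like the following; the checkpoint directory is what carries the on-device reconstruction state (including depth/LiDAR-derived data) alongside the images for the macOS PhotogrammetrySession. Directory paths are placeholders:

```swift
import RealityKit
import SwiftUI

// Condensed iOS 17 guided-capture sketch. The checkpoint directory,
// if set, can be handed to PhotogrammetrySession.Configuration on
// macOS so reconstruction reuses the on-device session state.
struct CaptureView: View {
    @State private var session = ObjectCaptureSession()

    var body: some View {
        ObjectCaptureView(session: session)
            .onAppear {
                var configuration = ObjectCaptureSession.Configuration()
                configuration.checkpointDirectory =
                    URL(fileURLWithPath: "/tmp/Snapshots", isDirectory: true)
                session.start(
                    imagesDirectory: URL(fileURLWithPath: "/tmp/Images",
                                         isDirectory: true),
                    configuration: configuration
                )
            }
    }
}
```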