Post

Replies

Boosts

Views

Activity

Reply to Using digital crown on Vision Pro Simulator
Am I correct that XCTest is only usable in code testing? If so, how exactly would we simulate this during development? While models from Reality Composer Pro show up, their display is nearly identical to the Mixed style. How does one build a skybox or other objects that replace the user's room? Apologies for so many questions; the documentation seems to lack nuance on how to build much beyond making a model show up on a table.

ImmersiveSpace(id: "ImmersiveSpace") {
    ImmersiveView()
}
.immersionStyle(selection: .constant(.full), in: .full)
Jul ’23
Reply to How To Rotate A 3D Model - Vision OS
I was able to accomplish this with a DragGesture() instead, since I was having difficulties with RotateGesture3D:

Model3D(named: "Purse", bundle: realityKitContentBundle) { model in
    model
        .resizable()
        .aspectRatio(contentMode: .fit)
} placeholder: {
    ProgressView()
}
.rotation3DEffect(rotation, axis: rotationAxis)
.gesture(
    DragGesture()
        .onChanged { value in
            // Calculate the rotation angle from the drag distance
            let angle = sqrt(pow(value.translation.width, 2) + pow(value.translation.height, 2))
            rotation = Angle(degrees: Double(angle))

            // Calculate the rotation axis from the drag direction
            let axisX = -value.translation.height / CGFloat(angle)
            let axisY = value.translation.width / CGFloat(angle)
            rotationAxis = (x: axisX, y: axisY, z: 0)
        }
)
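For context, the gesture writes into two pieces of view state that aren't shown in the reply. The names below match the code, but the declarations themselves are my reconstruction of what the view would need:

```swift
// Reconstructed state the drag gesture mutates; not shown in the original reply.
@State private var rotation: Angle = .zero
@State private var rotationAxis: (x: CGFloat, y: CGFloat, z: CGFloat) = (0, 1, 0)
```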
Topic: Spatial Computing SubTopic: ARKit Tags:
Jul ’23
Reply to How To Load a ModelEntity from a URL?
This involves a few short steps:

1. Storing the assets on a remote server: you'd need to host your assets somewhere reachable by URL. Halocline (https://www.haloclinetech.com/) offers a dedicated platform for this specific need, and the best part is that it's free. Given my experience with it, I'd recommend checking it out.

2. Fetching the assets in your app: once you have a URL for your asset, you can use the script provided by Halocline (https://www.haloclinetech.com/learn/), which stores files from remote URLs in the user's temporary directory.

3. Using the assets in your app: after fetching the assets, you can incorporate them into your application as if they were locally stored.

let halocline = HaloclineHelper()
let url = URL(string: "your-halocline-url")!
let modelEntity = try await halocline.loadEntity(from: url)
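The same idea can be sketched without a third-party helper. The URL below is a placeholder; since RealityKit's ModelEntity(contentsOf:) only accepts file URLs, the asset is downloaded to a temporary location first:

```swift
import Foundation
import RealityKit

// Placeholder remote USDZ location; substitute your own hosted asset.
let remoteURL = URL(string: "https://example.com/model.usdz")!

// Download the asset to a temporary file. ModelEntity(contentsOf:)
// requires a local file URL, hence the download-then-load sequence.
let (tempURL, _) = try await URLSession.shared.download(from: remoteURL)
let localURL = FileManager.default.temporaryDirectory
    .appendingPathComponent("model.usdz")
try? FileManager.default.removeItem(at: localURL)
try FileManager.default.moveItem(at: tempURL, to: localURL)

// Load the downloaded file as a RealityKit model entity.
let modelEntity = try await ModelEntity(contentsOf: localURL)
```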
Topic: Programming Languages SubTopic: Swift Tags:
Aug ’23
Reply to Unintuitive VR/AR development experience
This is a large issue with the current documentation and developer tools. I have found basic SwiftUI tutorials more informative; starting with those would be the best bet.
Topic: UI Frameworks SubTopic: SwiftUI Tags:
Jul ’23
Reply to Restore window positions on visionOS
This would be helpful. When calling the SwiftUI openURL() action, Safari opens at the origin (below the camera) instead of in front of the user. If I open Safari via the home screen, it opens in front of the user. Seems like a bug with visionOS.
Topic: App & System Services SubTopic: Core OS Tags:
Aug ’23
Reply to How to load IBL resource on the fly?
How are you currently displaying the skybox? Are you using it in a RealityView?
Topic: Graphics & Games SubTopic: RealityKit Tags:
Aug ’23
Reply to How To Load a ModelEntity from a URL?
Disclosure: I'm affiliated with Halocline.
Topic: Programming Languages SubTopic: Swift Tags:
Aug ’23
Reply to mediastreamvalidator throws `Error injecting segment data`
Have you found a solution?
Topic: Media Technologies SubTopic: Streaming Tags:
Sep ’23
Reply to Spatial/3D Video export and code
Did you figure this out?
Topic: UI Frameworks SubTopic: SwiftUI Tags:
Oct ’23
Reply to How/where to encode MV-HEVC stereo video?
You can encode a left/right stereoscopic video into MV-HEVC on Halocline, https://haloclinetech.com. I am directly involved in the app's production, so I would be happy to help out with any of your needs.
Topic: Spatial Computing SubTopic: General Tags:
Oct ’23
Reply to Spatial/3D Video export and code
You can encode a left/right stereoscopic video into MV-HEVC on Halocline, https://haloclinetech.com. I am directly involved in its production, so I would be happy to help out with any of your needs.
Topic: UI Frameworks SubTopic: SwiftUI Tags:
Oct ’23
Reply to Stereoscopic Video Playback in Vision OS
Stereoscopic videos encoded in the MV-HEVC format can be opened in the native player as well. You can see an example of a left/right stereoscopic video encoded into MV-HEVC here: https://x.com/zacharyhandshoe/status/1713745034233749926?s=20
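As a minimal sketch of what "opening it in the native player" can look like in code (the file name "stereo.mov" is a placeholder for an MV-HEVC asset bundled with the app):

```swift
import AVKit
import SwiftUI

// Hands a bundled MV-HEVC file to the system player via AVKit.
// On Apple Vision Pro, the system player presents stereoscopic
// content without any extra configuration in this view.
struct StereoPlayerView: View {
    private let player = AVPlayer(
        url: Bundle.main.url(forResource: "stereo", withExtension: "mov")!
    )

    var body: some View {
        VideoPlayer(player: player)
            .onAppear { player.play() }
    }
}
```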
Topic: App & System Services SubTopic: Hardware Tags:
Oct ’23
Reply to Suggested guidance for creating MV-HEVC video files?
You can film regular stereoscopic left/right content and encode it into MV-HEVC using Halocline. I am affiliated with the project, so please reach out if you have any questions or concerns.
Topic: Media Technologies SubTopic: Audio Tags:
Oct ’23