Discuss Spatial Computing on Apple Platforms.

Posts under General subtopic


Look to Scroll
Hello! I’m excited to see that Look to Scroll has been included in the visionOS 26 beta. I’m aiming to build a feature where the user’s gaze at a specific edge of a view automatically scrolls it in that direction. However, I’ve experimented with ScrollView and haven’t been able to trigger this behavior. Could you advise whether additional API modifiers are necessary? Thank you!
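A possible starting point, sketched from the visionOS 26 beta SDK: gaze-driven scrolling appears to be opt-in per scroll view rather than on by default. A minimal sketch, assuming the `scrollInputBehavior(_:for:)` modifier and the `.look` input kind from the beta (treat the exact names as an assumption):

```swift
import SwiftUI

struct LookToScrollList: View {
    var body: some View {
        ScrollView {
            LazyVStack(alignment: .leading, spacing: 12) {
                ForEach(0..<100, id: \.self) { index in
                    Text("Row \(index)")
                }
            }
            .padding()
        }
        // Assumed visionOS 26 beta API: opt this scroll view in to
        // gaze-driven (look to scroll) input.
        .scrollInputBehavior(.enabled, for: .look)
    }
}
```

If the modifier alone doesn’t help, it may also be worth confirming that the content actually overflows the scroll view, since gaze scrolling has nothing to do otherwise.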
1 reply · 0 boosts · 562 views · Jul ’25

Entities moved with Manipulation Component in visionOS Beta 4 are clipped by volume bounds
In Betas 1, 2, and 3, we could pick up and inspect entities, bringing them closer while moving them outside the bounds of a volume. As of Beta 4, these entities are clipped by the bounds of the volume. I'm not sure if this is a bug or an intended change, but I filed a Feedback report (FB19005083). The release notes don't mention a change in behavior, at least not one that I can find. Is this an intentional change or a bug? Here is a video that shows the issue: https://youtu.be/ajBAaSxLL2Y In previous versions of visionOS 26, I could move these entities out of the volume and inspect them up close; releasing would return them to the volume. Now they are clipped as soon as they reach the edge of the volume. I haven't had a chance to test with windows or with the SwiftUI modifier version of manipulation.
1 reply · 4 boosts · 416 views · Jul ’25

Setting immersionStyle while in an immersive space breaks all entities
I have my immersive space set up like this:

```swift
ImmersiveSpace(id: "Theater") {
    ImmersiveTeleopView()
        .environment(appModel)
        .onAppear {
            appModel.immersiveSpaceState = .open
        }
        .onDisappear {
            appModel.immersiveSpaceState = .closed
        }
}
.immersionStyle(selection: .constant(appModel.immersionStyle.style), in: .mixed, .full)
```

This allows me to set the immersion style while in the space (from a Picker in a SwiftUI window). The scene responds correctly, but a lot of the functionality of my immersive space is gone after the change in style: I am no longer able to enable/disable entities (which I also have toggles for in the SwiftUI window). I have to exit and re-enter the immersive space to regain the ability to change the enabled state of my entities. My appModel.immersionStyle is inspired by the Compositor Services demo (although I am using a RealityView) listed at https://developer.apple.com/documentation/CompositorServices/interacting-with-virtual-content-blended-with-passthrough and looks like this:

```swift
public enum IStyle: String, CaseIterable, Identifiable {
    case mixedStyle, fullStyle
    public var id: Self { self }
    var style: ImmersionStyle {
        switch self {
        case .mixedStyle: return .mixed
        case .fullStyle: return .full
        }
    }
}

/// Maintains app-wide state
@MainActor @Observable
class AppModel {
    // Immersion style
    public var immersionStyle: IStyle = .mixedStyle
}
```
1 reply · 0 boosts · 230 views · Oct ’25

Overlaying SwiftUI content with transparency in front of RealityView
Following up on my previous question here: https://developer.apple.com/forums/thread/774262 Having solved the clipping problem, I am now trying to overlay some content in front of the RealityView. However, it looks like any content with transparency does not render in front of the RealityView, while opaque views seem to work; placing content with transparency, like glassBackgroundEffect(), behind the RealityView in a ZStack causes the entire window to flicker. Additionally, a SwiftUI attachment placed in front of the stereoscopic image plane is invisible if the user looks at it head-on at 90 degrees; however, as the user looks at it from increasingly oblique angles, the attachment gradually becomes visible again. Are these behaviors expected? What is the recommended approach to overlaying content in front of a RealityView? Thanks!
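For context, a minimal reconstruction of the arrangement being described, for anyone trying to reproduce it (view names and content are illustrative, not from the original project):

```swift
import SwiftUI
import RealityKit

struct OverlayTestView: View {
    var body: some View {
        ZStack {
            RealityView { content in
                // Stereoscopic image plane, etc., goes here.
            }
            // A transparent overlay like this reportedly fails to render
            // in front of the RealityView, while an opaque one works.
            Text("Overlay controls")
                .padding()
                .glassBackgroundEffect()
        }
    }
}
```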
1 reply · 0 boosts · 396 views · Feb ’25

Summon gesture
Can you help me write code that picks up an element a bit far from me, brings it near to me, lets me flick it around a bit, and then sends it back to its original position when I release it? Thanks a lot, Christophe
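Not an official recipe, but one way to sketch this: a drag gesture targeted to the entity that records the entity's original transform when the drag begins, follows the pinch while it moves, and animates back to the start on release. The entity, sizes, and durations below are illustrative; the collision shapes and InputTargetComponent are required for the entity to receive gestures at all.

```swift
import SwiftUI
import RealityKit

struct SummonView: View {
    @State private var originalTransform: Transform?

    var body: some View {
        RealityView { content in
            // A placeholder object placed a couple of meters away.
            let box = ModelEntity(mesh: .generateBox(size: 0.2))
            box.position = [0, 1.2, -2]
            box.components.set(InputTargetComponent())
            box.generateCollisionShapes(recursive: true)
            content.add(box)
        }
        .gesture(
            DragGesture()
                .targetedToAnyEntity()
                .onChanged { value in
                    let entity = value.entity
                    // Remember where the entity started.
                    if originalTransform == nil {
                        originalTransform = entity.transform
                    }
                    // Follow the pinch-and-drag location.
                    entity.position = value.convert(
                        value.location3D, from: .local, to: entity.parent!
                    )
                }
                .onEnded { value in
                    // Animate back to the original position on release.
                    if let transform = originalTransform {
                        value.entity.move(to: transform,
                                          relativeTo: value.entity.parent,
                                          duration: 0.4)
                    }
                    originalTransform = nil
                }
        )
    }
}
```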
1 reply · 0 boosts · 68 views · Apr ’25

Safari-like toolbar in visionOS
I like the toolbar that visionOS's Safari uses for back and forward navigation, sharing, etc. It floats above the window. My attempt to do this with ornaments isn't as satisfying, since they partially cover the window. My attempts with toolbar haven't produced visible results. Is this Safari-style toolbar for visionOS exposed by Apple in the APIs? If so, could someone point me to documentation or sample code? Thanks!
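For what it's worth, an ornament doesn't have to overlap the window: with a `.scene` attachment anchor, the `contentAlignment` parameter controls which side of the anchor the ornament hangs on. A hedged sketch (button actions and icons are placeholders):

```swift
import SwiftUI

struct FloatingToolbarView: View {
    var body: some View {
        Text("Content")
            .frame(maxWidth: .infinity, maxHeight: .infinity)
            .ornament(attachmentAnchor: .scene(.bottom), contentAlignment: .top) {
                // contentAlignment: .top hangs the ornament below the
                // window's bottom edge instead of straddling it.
                HStack(spacing: 16) {
                    Button("Back", systemImage: "chevron.backward") { /* ... */ }
                    Button("Forward", systemImage: "chevron.forward") { /* ... */ }
                    Button("Share", systemImage: "square.and.arrow.up") { /* ... */ }
                }
                .labelStyle(.iconOnly)
                .padding()
                .glassBackgroundEffect()
            }
    }
}
```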
1 reply · 0 boosts · 229 views · Oct ’25

Eye tracking data access for researchers in the medical field
Hello, esteemed tech developers. I am using the Apple Vision Pro to create an AR assistance system for the da Vinci surgical robot in a medical surgical suite, and I would like to capture eye-movement data with tester uniformity. Although the Apple Vision Pro has a superb infrared sensor for monitoring eye-movement status, Apple does not seem to offer official open access to it. (I'm aware of the many existing discussions about this, but I was still wondering if there might be an option, particularly for research labs.) Here's my Feedback number: FB16603687
1 reply · 0 boosts · 635 views · Feb ’25

Cursor display issue on attachment view in immersive space
While using Screen Mirroring in developer mode within my immersive space, I noticed an alignment issue with the computer cursor (the transparent circle). When I move it toward an attachment view, the cursor remains horizontal instead of aligning with the surface of the attachment view. It displays correctly on a 2D window; it is only wrong on an attachment view. Is this behavior a bug, or could it be caused by a missing or incorrect configuration on the attachment view? Any help is appreciated, thanks.
1 reply · 0 boosts · 94 views · Apr ’25

UICollectionViewDataSourcePrefetching does not work in a SwiftUI wrapper on visionOS
The prefetching logic for UICollectionView does not work on visionOS. I have set up a standalone test repo to demonstrate the issue; it is basically a visionOS version of Apple's guide project on implementing prefetching. In the repo you will see a simple ViewController that has a UICollectionView, wrapped inside a UIViewControllerRepresentable. On scroll, it should print "🕊️ prefetch start" to the console to demonstrate that func collectionView(_ collectionView: UICollectionView, prefetchItemsAt indexPaths: [IndexPath]) is called. However, this never happens on visionOS devices, while the same code behaves correctly on iOS devices.
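For anyone who wants to reproduce this, a minimal reconstruction of the kind of setup being described (class and reuse-identifier names are illustrative, not from the linked repo):

```swift
import SwiftUI
import UIKit

final class GridViewController: UIViewController, UICollectionViewDataSource,
                                UICollectionViewDataSourcePrefetching {
    private var collectionView: UICollectionView!

    override func viewDidLoad() {
        super.viewDidLoad()
        let layout = UICollectionViewFlowLayout()
        layout.itemSize = CGSize(width: 200, height: 200)
        collectionView = UICollectionView(frame: view.bounds,
                                          collectionViewLayout: layout)
        collectionView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        collectionView.register(UICollectionViewCell.self,
                                forCellWithReuseIdentifier: "cell")
        collectionView.dataSource = self
        collectionView.prefetchDataSource = self   // opt in to prefetching
        view.addSubview(collectionView)
    }

    func collectionView(_ collectionView: UICollectionView,
                        numberOfItemsInSection section: Int) -> Int { 1000 }

    func collectionView(_ collectionView: UICollectionView,
                        cellForItemAt indexPath: IndexPath) -> UICollectionViewCell {
        collectionView.dequeueReusableCell(withReuseIdentifier: "cell", for: indexPath)
    }

    // Fires while scrolling on iOS, but reportedly never on visionOS.
    func collectionView(_ collectionView: UICollectionView,
                        prefetchItemsAt indexPaths: [IndexPath]) {
        print("🕊️ prefetch start", indexPaths)
    }
}

struct GridView: UIViewControllerRepresentable {
    func makeUIViewController(context: Context) -> GridViewController { GridViewController() }
    func updateUIViewController(_ vc: GridViewController, context: Context) {}
}
```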
1 reply · 0 boosts · 202 views · Jul ’25

AVPlayer stutters when using AVPlayerItemVideoOutput
We’re trying to build a custom player for Unity. For this, we’re using AVPlayer with AVPlayerItemVideoOutput to get textures. However, we noticed that playback is not smooth and the stream often freezes. For testing, we used this 8K video: https://deovr.com/nwfnq1

The video was played using the following code:

```swift
@objc public func playVideo(urlString: String) {
    guard let url = URL(string: urlString) else { return }
    let pItem = AVPlayerItem(url: url)
    playerItem = pItem
    pItem.preferredForwardBufferDuration = 10.0

    let pixelBufferAttributes: [String: Any] = [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange,
        kCVPixelBufferMetalCompatibilityKey as String: true,
    ]
    let output = AVPlayerItemVideoOutput(pixelBufferAttributes: pixelBufferAttributes)
    pItem.add(output)

    playerItemObserver = pItem.observe(\.status) { [weak self] pItem, _ in
        guard pItem.status == .readyToPlay else { return }
        self?.playerItemObserver = nil
        self?.player.play()
    }

    player = AVPlayer(playerItem: pItem)
    player.currentItem?.preferredPeakBitRate = 35_000_000
}
```

When AVPlayerItemVideoOutput is attached, the video stutters and the log looks like this:

```
🟢 Playback likely to keep up
🟡 Buffer ahead: 4.08s | buffer: 4.08s
🟡 Buffer ahead: 4.08s | buffer: 4.08s
🟡 Buffer ahead: -0.07s | buffer: 0.00s
🟡 Buffer ahead: 2.94s | buffer: 3.49s
🟡 Buffer ahead: 2.50s | buffer: 4.06s
🟡 Buffer ahead: 1.74s | buffer: 4.30s
🟡 Buffer ahead: 0.74s | buffer: 4.30s
🟠 Playback may stall
🛑 Buffer empty
🟡 Buffer ahead: 0.09s | buffer: 4.30s
🟠 Playback may stall
🟠 Playback may stall
🛑 Buffer empty
🟠 Playback may stall
🟣 Buffer full
🟡 Buffer ahead: 1.41s | buffer: 1.43s
🟡 Buffer ahead: 1.41s | buffer: 1.43s
🟡 Buffer ahead: 1.07s | buffer: 1.43s
🟣 Buffer full
🟡 Buffer ahead: 0.47s | buffer: 1.65s
🟠 Playback may stall
🛑 Buffer empty
🟡 Buffer ahead: 0.10s | buffer: 1.65s
🟠 Playback may stall
🟡 Buffer ahead: 1.99s | buffer: 2.03s
🟡 Buffer ahead: 1.99s | buffer: 2.03s
🟣 Buffer full
🟣 Buffer full
🟡 Buffer ahead: 1.41s | buffer: 2.00s
🟡 Buffer ahead: 0.68s | buffer: 2.27s
🟡 Buffer ahead: 0.09s | buffer: 2.27s
🟠 Playback may stall
🛑 Buffer empty
🟠 Playback may stall
```

When we remove AVPlayerItemVideoOutput from the player, the video plays smoothly, and the output looks like this:

```
🟢 Playback likely to keep up
🟡 Buffer ahead: 1.94s | buffer: 1.94s
🟡 Buffer ahead: 1.94s | buffer: 1.94s
🟡 Buffer ahead: 1.22s | buffer: 2.22s
🟡 Buffer ahead: 1.05s | buffer: 3.05s
🟡 Buffer ahead: 1.12s | buffer: 4.12s
🟡 Buffer ahead: 1.18s | buffer: 5.18s
🟡 Buffer ahead: 0.72s | buffer: 5.72s
🟡 Buffer ahead: 1.27s | buffer: 7.28s
🟡 Buffer ahead: 2.09s | buffer: 3.03s
🟡 Buffer ahead: 4.16s | buffer: 6.10s
🟡 Buffer ahead: 6.66s | buffer: 7.09s
🟡 Buffer ahead: 5.66s | buffer: 7.09s
🟡 Buffer ahead: 4.66s | buffer: 7.09s
🟡 Buffer ahead: 4.02s | buffer: 7.45s
🟡 Buffer ahead: 3.62s | buffer: 8.05s
🟡 Buffer ahead: 2.62s | buffer: 8.05s
🟡 Buffer ahead: 2.49s | buffer: 3.53s
🟡 Buffer ahead: 2.43s | buffer: 3.38s
🟡 Buffer ahead: 1.90s | buffer: 3.85s
```

We’ve tried different attribute settings for AVPlayerItemVideoOutput. We also removed all logic related to reading frame data, but the choppy playback still remained. Can you advise whether this is a player issue or if we’re doing something wrong?
1 reply · 0 boosts · 403 views · Oct ’25

Displaying multiple immersive movies in spheres in an immersive environment
In visionOS, I'm trying to create an immersive environment featuring several spheres inside which immersive movies are visible. I'm starting from sample code that creates a sphere, sets an immersive movie as its material, and opens it as an immersive environment; this works fine. But if I create a sphere in an open immersive environment using Reality Composer Pro and set its material to an immersive movie, I can see the movie on the sphere while I am outside of it, but if I try to get inside the sphere, it disappears. What would be the right way of doing this?
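One likely explanation, offered as a guess: sphere meshes render only their outward-facing side, so the movie vanishes once the camera is inside unless the sphere is turned inside out. A minimal sketch of building such a video sphere in code (the URL and radius are placeholders):

```swift
import AVFoundation
import RealityKit

func makeVideoSphere(movieURL: URL) -> ModelEntity {
    let player = AVPlayer(url: movieURL)
    let sphere = ModelEntity(
        mesh: .generateSphere(radius: 1000),
        materials: [VideoMaterial(avPlayer: player)]
    )
    // Invert one axis so the surface faces inward and the
    // movie stays visible from inside the sphere.
    sphere.scale *= SIMD3<Float>(-1, 1, 1)
    player.play()
    return sphere
}
```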
1 reply · 1 boost · 707 views · Oct ’25

Independent gestures for multiple entities in RealityView
Hi, I'm developing an app for Apple Vision Pro. Inside the app, the user should be able to load objects from the web into the scene and then move them around (dragging and rotating) via gestures. My question: I'm working with RealityKit and RealityView. I have no issues loading one object and making it interactive by adding gestures to the entire RealityView via the .gesture() modifier, and I have also succeeded in loading multiple objects into the scene. My problem is that I can't figure out how to add gestures to multiple objects independently. I can't use Reality Composer, since I'm loading the objects dynamically into the scene. Using .gesture() didn't work for multiple objects, since every gesture needs to be targeted to a specific entity, but I have multiple entities. I also tried defining a GestureComponent and adding it to every newly loaded entity, but that doesn't seem to work, as nothing happens, even though my gesture is targeted to every entity having my GestureComponent. I only found solutions, like installGestures, that are not usable with RealityView on visionOS. I also tried following this guide: https://developer.apple.com/documentation/realitykit/transforming-realitykit-entities-with-gestures But things there feel missing and contradictory: an extension of RealityView is mentioned but never shown, and the guide states that we don't need to store translation values for every entity, yet creates an EntityGestureState.swift file to store those values; that file is then never used, and a GestureStateComponent that was never introduced is used instead, contradicting what was just said about not storing values per entity, since a new instance of it is created for every entity. In short, following the guide didn't lead me to a working solution.
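One pattern worth trying, based on the documented targetedToAnyEntity() API: attach a single gesture to the RealityView and let each gesture value resolve to whichever entity was actually hit, so one modifier can drive any number of independently movable objects. Each loaded entity needs collision shapes and an InputTargetComponent to be hit-testable. A sketch with placeholder boxes standing in for the dynamically loaded objects:

```swift
import SwiftUI
import RealityKit

struct MultiObjectView: View {
    var body: some View {
        RealityView { content in
            // Stand-ins for dynamically loaded web objects.
            for x: Float in [-0.3, 0, 0.3] {
                let box = ModelEntity(mesh: .generateBox(size: 0.15))
                box.position = [x, 1.2, -1]
                // Both components are required for gesture hit-testing.
                box.components.set(InputTargetComponent())
                box.generateCollisionShapes(recursive: true)
                content.add(box)
            }
        }
        .gesture(
            DragGesture()
                .targetedToAnyEntity()
                .onChanged { value in
                    // value.entity is whichever entity this drag started on,
                    // so each object moves independently.
                    value.entity.position = value.convert(
                        value.location3D, from: .local, to: value.entity.parent!
                    )
                }
        )
    }
}
```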
1 reply · 0 boosts · 341 views · Feb ’25

Is `ParticleEmitterComponent` implemented for `RealityKit` on iOS?
Hi there, I was looking to add a particle emitter to an augmented reality app I'm developing with RealityKit, targeting iOS. The documentation for ParticleEmitterComponent indicates that iOS 18.0+ is supported, but when I try to use ParticleEmitterComponent in my code in Xcode, I get an error that it isn't found. Furthermore, this StackOverflow post seems to indicate that particle systems are not available on iOS. Would it be possible to get clarification on this?
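For reference, a minimal usage sketch matching the documented availability; if this fails to resolve, it may be worth double-checking that the deployment target and base SDK are actually iOS 18 or newer. The emitter settings below are arbitrary:

```swift
import RealityKit

@available(iOS 18.0, *)
func makeSparkles() -> Entity {
    var emitter = ParticleEmitterComponent()
    emitter.emitterShape = .sphere
    emitter.mainEmitter.birthRate = 500
    let entity = Entity()
    entity.components.set(emitter)
    return entity
}
```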
1 reply · 0 boosts · 145 views · May ’25

ManipulationComponent in both parent and child entities
Hello, In my project, I have attached a ManipulationComponent to entity A and, as expected, I'm able to interact with it using the built-in gestures. I have another entity B, a child of A, that I would like to interact with as well, so I attempted to add a ManipulationComponent to B. However, no gestures seem to be registered on B; I can still interact with A, but B cannot be interacted with, despite both entities having ManipulationComponents. So I'm wondering if I'm just doing something wrong, if this is an issue with ManipulationComponent, or if this is a limitation of the API. Below is the code used to add the ManipulationComponent to an entity; it was run on both A and B:

```swift
let mc = ManipulationComponent()
model.components.set(mc)

var boxShape = ShapeResource.generateBox(width: 0.25, height: 0.05, depth: 0.25)
boxShape = boxShape.offsetBy(translation: simd_float3(0, -0.05, -0.25))
ManipulationComponent.configureEntity(model, collisionShapes: [boxShape])

if var mc = model.components[ManipulationComponent.self] {
    mc.releaseBehavior = .stay
    mc.dynamics.inertia = .low
    model.components.set(mc)
}
```

I am using visionOS 26.0; let me know if any additional information is needed.
1 reply · 0 boosts · 373 views · Oct ’25

The folding and unfolding effect of the NBA sand table
I saw this magical sand table; its unfolding and folding effects are similar to spreading out a deck of cards, which is very interesting, but I don't know how to achieve it. I'd like to know whether there are ways to achieve this effect, and to get some ideas. Can it be done with the existing APIs?
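There is no built-in fold/unfold effect that I know of, but a card-style fan-out can be approximated by animating each child entity from a stacked pose to a spread pose with a small stagger. A rough RealityKit sketch (all values are illustrative):

```swift
import Foundation
import RealityKit

// Fan out stacked child entities like a spread of cards.
func spreadCards(_ cards: [Entity], radius: Float = 0.4) {
    for (index, card) in cards.enumerated() {
        let t = Float(index) / Float(max(cards.count - 1, 1))
        let angle = (t - 0.5) * .pi            // spread across a half circle
        var target = card.transform
        target.translation = [radius * sin(angle), 0, -radius * cos(angle)]
        target.rotation = simd_quatf(angle: -angle, axis: [0, 1, 0])
        // Small stagger so the cards peel off one after another.
        card.move(to: target, relativeTo: card.parent,
                  duration: 0.5 + 0.05 * Double(index))
    }
}
```

Folding back is the same idea in reverse: animate every card to the shared stacked transform.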
1 reply · 0 boosts · 66 views · May ’25