Map view in SwiftUI on visionOS: Bad performance with many markers
I am trying to build a visionOS app that uses a map as its central user interface. This works fine at high zoom levels, when only a couple of markers are present. But as soon as I zoom out and the number of markers reaches hundreds or even thousands, performance becomes very poor: the map takes seconds to render, and panning is laggy as well. What makes things worse is that the SwiftUI Map does not support clustering yet. Has anyone found a solution to this?

I found this example by Apple on how to implement clustering: https://developer.apple.com/documentation/mapkit/mkannotationview/decluttering_a_map_with_mapkit_annotation_clustering It works, but it uses UIKit and storyboards, and I could not translate it into SwiftUI-compatible code.

I also found this blog post with a neat SwiftUI integration of a clusterable map: https://www.linkedin.com/pulse/map-clustering-swiftui-dmitry-%D0%B2%D0%B5l%D0%BEv-j3x7f/ However, I wasn't able to adapt it so that the map updates reactively. I want to fetch new data from our server whenever the user changes the visible region of the map or zooms in or out, and I have no idea how to carry my .onChange(of:) and .onMapCameraChange() modifiers over to the UIKit world.
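To make the question more concrete, here is a minimal sketch of the UIKit bridge I have been attempting (simplified and untested; LocationMarker and the onRegionChange callback are placeholder names I made up for this example):

import MapKit
import SwiftUI

// Placeholder marker model, just for this sketch.
struct LocationMarker: Identifiable {
    let id = UUID()
    let coordinate: CLLocationCoordinate2D
}

struct ClusteredMapView: UIViewRepresentable {
    var markers: [LocationMarker]
    var onRegionChange: (MKCoordinateRegion) -> Void   // stand-in for .onMapCameraChange()

    func makeUIView(context: Context) -> MKMapView {
        let mapView = MKMapView()
        mapView.delegate = context.coordinator
        return mapView
    }

    func updateUIView(_ mapView: MKMapView, context: Context) {
        // Crude diffing: replace all annotations whenever the marker array changes.
        mapView.removeAnnotations(mapView.annotations)
        mapView.addAnnotations(markers.map { marker -> MKPointAnnotation in
            let annotation = MKPointAnnotation()
            annotation.coordinate = marker.coordinate
            return annotation
        })
    }

    func makeCoordinator() -> Coordinator { Coordinator(self) }

    final class Coordinator: NSObject, MKMapViewDelegate {
        var parent: ClusteredMapView
        init(_ parent: ClusteredMapView) { self.parent = parent }

        // The UIKit counterpart of .onMapCameraChange(): report the new region back to SwiftUI.
        func mapView(_ mapView: MKMapView, regionDidChangeAnimated animated: Bool) {
            parent.onRegionChange(mapView.region)
        }

        // Opt the marker views into MapKit's built-in clustering.
        func mapView(_ mapView: MKMapView, viewFor annotation: MKAnnotation) -> MKAnnotationView? {
            guard !(annotation is MKClusterAnnotation) else { return nil }
            let view = MKMarkerAnnotationView(annotation: annotation, reuseIdentifier: "marker")
            view.clusteringIdentifier = "marker"
            return view
        }
    }
}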
Replies: 2 · Boosts: 1 · Views: 916 · Activity: Mar ’24
visionOS: Moving window with a Map causes app to freeze
I'm developing a map-based app for visionOS. It loads map data from a server as JSON. That works just fine, but I noticed the following effect: if I move the app's window around, the app freezes, either on the first movement or on one of the subsequent ones. The map can no longer be panned, and all other UI elements lose their interactivity as well.

I had noticed this issue before, when I was opening the map right on app startup (there it even happened without moving the window). Adding a short delay resolved that case, and there was no log message then. However, when I noticed that it also happens when I move the window around, I saw that Xcode logs an error:

+[UIView setAnimationsEnabled:] being called from a background thread. Performing any operation from a background thread on UIView or a subclass is not supported and may result in unexpected and insidious behavior.
trace=(
0 UIKitCore 0x0000000185824a24 __42+[UIView(Animation) setAnimationsEnabled:]_block_invoke + 112
1 libdispatch.dylib 0x0000000102a327e4 _dispatch_client_callout + 16
2 libdispatch.dylib 0x0000000102a34284 _dispatch_once_callout + 84
3 UIKitCore 0x0000000185824ad8 +[UIView(Animation) performWithoutAnimation:] + 56
4 SwiftUI 0x00000001c68cf1e0 OUTLINED_FUNCTION_136 + 10376
5 SwiftUI 0x00000001c782bebc OUTLINED_FUNCTION_12 + 22864
6 SwiftUI 0x00000001c78285e8 OUTLINED_FUNCTION_12 + 8316
7 SwiftUI 0x00000001c787c288 OUTLINED_FUNCTION_20 + 39264
8 SwiftUI 0x00000001c787c2cc OUTLINED_FUNCTION_20 + 39332
9 UIKitCore 0x000000018582fc24 -[UIView(CALayerDelegate) layoutSublayersOfLayer:] + 1496
10 QuartzCore 0x000000018a05cf00 _ZN2CA5Layer16layout_if_neededEPNS_11TransactionE + 440
11 QuartzCore 0x000000018a068ad0 _ZN2CA5Layer28layout_and_display_if_neededEPNS_11TransactionE + 124
12 QuartzCore 0x0000000189f80498 _ZN2CA7Context18commit_transactionEPNS_11TransactionEdPd + 460
13 QuartzCore 0x0000000189fb00b0 _ZN2CA11Transaction6commitEv + 652
14 VectorKit 0x00000001938ee620 _ZN2md12HoverSupport18updateHoverProxiesERKNSt3__16vectorINS1_10shared_ptrINS_5LabelEEEN3geo12StdAllocatorIS5_N3mdm9AllocatorEEEEE + 2468
15 VectorKit 0x0000000193afd1cc _ZN2md15StandardLabeler16layoutForDisplayERKNS_13LayoutContextE + 156
16 VectorKit 0x0000000193cf133c _ZN2md16CompositeLabeler16layoutForDisplayERKNS_13LayoutContextE + 52
17 VectorKit 0x0000000193abf318 _ZN2md12LabelManager6layoutERKNS_13LayoutContextEPKNS_20CartographicRendererERKNSt3__113unordered_setINS7_10shared_ptrINS_12LabelMapTileEEENS7_4hashISB_EENS7_8equal_toISB_EEN3geo12StdAllocatorISB_N3mdm9AllocatorEEEEERNS_8PassListE + 2904
18 VectorKit 0x0000000193cad464 _ZN2md9realistic16LabelRenderLayer6layoutERKNS_13LayoutContextE + 464
19 VectorKit 0x0000000193658b54 _ZNSt3__110__function6__funcIZN2md9realistic20RealisticRenderLayer5frameERNS2_13LayoutContextEE3$_0NS_9allocatorIS7_EEFvvEEclEv + 180
20 VectorKit 0x00000001936584cc ___ZN3geo9TaskQueue14queueAsyncTaskENSt3__110shared_ptrINS_4TaskEEEPU28objcproto17OS_dispatch_group8NSObject_block_invoke + 80
21 libdispatch.dylib 0x0000000102a30f98 _dispatch_call_block_and_release + 24
22 libdispatch.dylib 0x0000000102a327e4 _dispatch_client_callout + 16
23 libdispatch.dylib 0x0000000102a3aa80 _dispatch_lane_serial_drain + 916
24 libdispatch.dylib 0x0000000102a3b7c4 _dispatch_lane_invoke + 420
25 libdispatch.dylib 0x0000000102a3c794 _dispatch_workloop_invoke + 864
26 libdispatch.dylib 0x0000000102a481a0 _dispatch_root_queue_drain_deferred_wlh + 324
27 libdispatch.dylib 0x0000000102a475fc _dispatch_workloop_worker_thread + 488
28 libsystem_pthread.dylib 0x0000000103b0f924 _pthread_wqthread + 284
29 libsystem_pthread.dylib 0x0000000103b0e6e4 start_wqthread + 8
)

I disabled all my withAnimation() statements, and the problem persists. I also thought it might be related to my own network fetches, but as far as I can tell they all apply their changes on the main thread, and when I turn on network logging for my fetching logic, I do not see any data coming in at the time of the freeze, so I do not think they are the cause. How can I debug such a situation, so that I can find out which call actually triggered this message? I'd like to know whether it's my code or a bug in the SwiftUI Map itself.
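For what it's worth, this is roughly how my own fetching code applies its results, which is why I believe my changes land on the main thread (a simplified sketch; MapFeature and the endpoint are placeholders, not my real types):

import Foundation
import MapKit
import SwiftUI

// Placeholder model for this sketch; the real types are more involved.
struct MapFeature: Decodable {
    let latitude: Double
    let longitude: Double
}

@MainActor
final class MapDataModel: ObservableObject {
    @Published var features: [MapFeature] = []

    func reload(for region: MKCoordinateRegion) {
        Task {
            // The networking itself runs off the main thread inside URLSession
            // (in the real app the URL depends on `region`).
            let url = URL(string: "https://example.com/features")!   // placeholder endpoint
            let (data, _) = try await URLSession.shared.data(from: url)
            let decoded = try JSONDecoder().decode([MapFeature].self, from: data)
            // Because the class is @MainActor, this published update happens on the main thread.
            features = decoded
        }
    }
}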
Replies: 3 · Boosts: 0 · Views: 1k · Activity: Feb ’24
Vision Pro prescription lenses: How to get them from abroad?
We're a US company, but one of our founders is on a longer trip abroad (digital nomading) and is not expected to come back to the States soon. So we wanted to order a Vision Pro and ship it to him. However, Zeiss does not accept prescriptions from abroad. How can this be resolved? I've seen quite a number of folks from Germany already using the Vision Pro, so there must be a way around this limitation.
Replies: 1 · Boosts: 0 · Views: 578 · Activity: Feb ’24
HelloWorld example: How to make the globe in immersive space interactable?
In the HelloWorld sample, there is an immersive view with a globe in it. It spins, but the user cannot spin it themselves. I have looked at the volumetric window, where the globe can be interacted with, but if I understand it correctly, this works because the whole RealityView is rotated when the user performs a drag gesture. How could the same be accomplished for an entity inside a RealityView, in this case the globe inside the immersive view? If I just apply the dragRotation modifier, it rotates the entire RealityView, which yields a strange result: since the globe is not centered on the world origin here, it spins around the user's head. Is there a way to either translate the entire RealityView and then spin it, or to spin just an entity inside it (the globe) on user interaction? In Unity, I would simply use another GameObject as a parent of the globe, translate it, and let the user spin that.
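To make it more concrete, this is the kind of thing I have been experimenting with, targeting the gesture at the globe entity rather than at the whole view (a rough sketch; in the real sample the globe would come from the HelloWorld assets, not a generated sphere):

import RealityKit
import SwiftUI

struct GlobeView: View {
    // Placeholder globe; it carries its own collision and input-target components
    // so the gesture can be aimed at this entity specifically.
    private let globe: ModelEntity = {
        let model = ModelEntity(mesh: .generateSphere(radius: 0.3),
                                materials: [SimpleMaterial()])
        model.components.set(InputTargetComponent())
        model.generateCollisionShapes(recursive: true)
        model.position = [0, 1.5, -2]   // away from the world origin, roughly at eye height
        return model
    }()

    @State private var baseOrientation = simd_quatf(angle: 0, axis: [0, 1, 0])

    var body: some View {
        RealityView { content in
            content.add(globe)
        }
        // Target the gesture at the globe entity itself, not the whole RealityView.
        .gesture(
            DragGesture()
                .targetedToEntity(globe)
                .onChanged { value in
                    // Map horizontal drag distance to a rotation around the globe's own Y axis.
                    let angle = Float(value.translation.width) * 0.005
                    value.entity.orientation = simd_quatf(angle: angle, axis: [0, 1, 0]) * baseOrientation
                }
                .onEnded { value in
                    baseOrientation = value.entity.orientation
                }
        )
    }
}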
Replies: 1 · Boosts: 0 · Views: 609 · Activity: Feb ’24
How to use drag gestures on objects with inverted normals?
I want to build a panorama sphere around the user. The idea is that the user can interact with this panorama, i.e. pan it around and select markers placed on it, like on a map. So I set up a sphere that works like a skybox and inverted its normals (by negating the x scale) so that the material faces inward, using this code I found online:

import Combine
import Foundation
import RealityKit
import SwiftUI

extension Entity {
    func addSkybox(for skybox: Skybox) {
        let subscription = TextureResource
            .loadAsync(named: skybox.imageName)
            .sink(receiveCompletion: { completion in
                switch completion {
                case .finished:
                    break
                case let .failure(error):
                    assertionFailure("\(error)")
                }
            }, receiveValue: { [weak self] texture in
                guard let self = self else { return }
                var material = UnlitMaterial()
                material.color = .init(texture: .init(texture))
                let sphere = ModelComponent(mesh: .generateSphere(radius: 5), materials: [material])
                self.components.set(sphere)
                /// flip sphere inside out so the texture is inside
                self.scale *= .init(x: -1, y: 1, z: 1)
                self.transform.translation += SIMD3(0.0, 1.0, 0.0)
            })
        components.set(Entity.SubscriptionComponent(subscription: subscription))
    }

    struct SubscriptionComponent: Component {
        var subscription: AnyCancellable
    }
}

This works fine and looks awesome. However, I can't get a gesture to work on it. If the sphere is "normally" oriented, i.e. the user drags it "from the outside", I can do it like this:

import RealityKit
import SwiftUI

struct ImmersiveMap: View {
    @State private var rotationAngle: Float = 0.0

    var body: some View {
        RealityView { content in
            let rootEntity = Entity()
            rootEntity.addSkybox(for: .worldmap)
            rootEntity.components.set(CollisionComponent(shapes: [.generateSphere(radius: 5)]))
            rootEntity.generateCollisionShapes(recursive: true)
            rootEntity.components.set(InputTargetComponent())
            content.add(rootEntity)
        }
        .gesture(DragGesture().targetedToAnyEntity().onChanged({ _ in
            log("drag gesture")
        }))
    }
}

But if the user drags it from the inside (i.e. with the negative x scale in place), I get no drag events. Is there a way to achieve this?
Replies: 1 · Boosts: 0 · Views: 601 · Activity: Feb ’24
Weird behaviour of the keyboard in the visionOS simulator
I noticed that the keyboard behaves pretty strangely in the visionOS simulator. We tried to add a search bar, including a search field, to an ornament at the top of our app. As soon as the user starts typing, the keyboard disappears. This does not happen in Safari, so I'm wondering what is going wrong in our app. On our login screen, if the user presses Tab on the keyboard to get to the next field, the keyboard opens and closes again and again, and I have to restart the simulator to be able to log in again. Only if I click directly into the fields does it work fine. Are we doing something wrong here, or is this just a bug in the simulator that will be gone on a real device?
Replies: 0 · Boosts: 0 · Views: 423 · Activity: Mar ’24
Converting a Unity model / prefab to USDZ
We are porting an iOS Unity AR app to native visionOS. Ideally, we want to reuse our AR models in both applications. These AR models are rather simple, but converting them manually would still be time-consuming, especially when it comes to the shaders. Is anyone aware of any attempts to write conversion tools for this? Maybe in other ecosystems like Godot or Unreal, where folks also want to convert the proprietary Unity format into something else? I've seen there's an FBX converter, but that would not take care of shaders or particles. I am basically looking for something like the PolySpatial-internal conversion tools, but without the heavy weight of the rest of Unity. Alternatively, is there a way to export a Unity project to visionOS and then just take the models out of the Xcode project?
Replies: 0 · Boosts: 0 · Views: 854 · Activity: Mar ’24
Is it possible to tell Vision Pro to use just one eye for eye tracking?
I have an eye condition where my left eye does not really look straight ahead. I guess this is what makes my Vision Pro think that I am looking in a different direction (when I try typing on the virtual keyboard, I often miss a key). So I am wondering: is there a way to set it up to use only one eye as a reference? I am using only one eye anyway, because I do not have stereo vision either.
Replies: 0 · Boosts: 0 · Views: 373 · Activity: Mar ’24
Example project for Image Tracking on visionOS?
I am trying to get image tracking working on visionOS, but the documentation is pretty poor. It does not show what the SwiftUI setup should look like, or how the reference images should be provided. For the latter: I tried to just add a folder to my Assets catalog and use that as the reference image group, but the ImageTrackingProvider did not find it.
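For reference, this is roughly what I pieced together from the documentation (a sketch, and I'm not sure I have the signatures exactly right; I'm assuming the images have to live in an AR Resource Group in the asset catalog, here named "TrackedImages", rather than a plain asset folder, and that the provider only delivers data while an ImmersiveSpace is open):

import ARKit
import SwiftUI

struct ImageTrackingDemoView: View {
    @State private var session = ARKitSession()

    var body: some View {
        Text("Image tracking running…")
            .task {
                do {
                    // Assumption: the reference images come from an AR Resource Group named "TrackedImages".
                    let images = ReferenceImage.loadReferenceImages(inGroupNamed: "TrackedImages")
                    let provider = ImageTrackingProvider(referenceImages: images)
                    try await session.run([provider])
                    for await update in provider.anchorUpdates {
                        print("image anchor \(update.anchor.id): \(update.event)")
                    }
                } catch {
                    print("image tracking failed: \(error)")
                }
            }
    }
}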
Replies: 0 · Boosts: 0 · Views: 371 · Activity: Mar ’24
How to project a SwiftUI view onto a 3D object in Immersive Space?
I'd like to let the user immerse themselves in one of my views by projecting its content onto the inner side of a sphere surrounding them. Think of a video player app that surrounds the user with video previews they can select, like a 3D version of the Netflix home screen. The view should be fully interactable, not just a read-only view. Is this possible?
Replies: 0 · Boosts: 2 · Views: 423 · Activity: Feb ’24
visionOS: How to debug Safari / WKWebView?
We need to debug a website running inside a WKWebView on visionOS. To debug it, I want to connect my desktop Safari to it. However, at least in the simulator, there is no option in the Safari settings on visionOS to enable web debugging. Is this option missing, or can it be found elsewhere?
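For the WKWebView part, I'm assuming the isInspectable flag that WebKit added in the 16.4-era SDKs also applies on visionOS, so the web view can be picked up from desktop Safari's Develop menu (a sketch):

import SwiftUI
import WebKit

struct InspectableWebView: UIViewRepresentable {
    let url: URL

    func makeUIView(context: Context) -> WKWebView {
        let webView = WKWebView()
        #if DEBUG
        // Allow this web view to be inspected from desktop Safari (Develop menu).
        webView.isInspectable = true
        #endif
        webView.load(URLRequest(url: url))
        return webView
    }

    func updateUIView(_ webView: WKWebView, context: Context) {}
}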
Replies: 1 · Boosts: 0 · Views: 891 · Activity: Feb ’24
visionOS: Project SwiftUI view onto a 3D curved plane?
I'd like to map a SwiftUI view (in my case, a map) onto a curved 3D plane in an immersive view, so the user can literally immerse themselves in the map. The user should also be able to interact with the map by panning it around and selecting markers. Is this possible?
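The closest building block I have found so far is RealityView attachments, which place a flat (not curved) SwiftUI panel into the immersive scene, roughly like this (a sketch; "mapPanel" is just a made-up attachment id):

import MapKit
import RealityKit
import SwiftUI

struct ImmersiveMapPanel: View {
    var body: some View {
        RealityView { content, attachments in
            // Position the flat SwiftUI panel about 2 m in front of the user.
            if let panel = attachments.entity(for: "mapPanel") {
                panel.position = [0, 1.5, -2]
                content.add(panel)
            }
        } attachments: {
            Attachment(id: "mapPanel") {
                Map()   // a regular, interactive SwiftUI map
                    .frame(width: 800, height: 500)
            }
        }
    }
}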
Replies: 0 · Boosts: 1 · Views: 783 · Activity: Feb ’24
visionOS: Searchbar above window, like on Safari?
I'd like to place a search bar on top of the main window of my visionOS app. It should look similar to Safari's search bar and also show search results as the user types. How can this be accomplished?
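What I have been experimenting with so far is an ornament anchored above the top edge of the window, roughly like this (a sketch; the as-you-type results list is only hinted at here):

import SwiftUI

struct SearchBarWindow: View {
    @State private var query = ""

    var body: some View {
        NavigationStack {
            Text("Main content")
        }
        // A Safari-style search field floating above the window.
        .ornament(attachmentAnchor: .scene(.top)) {
            TextField("Search", text: $query)
                .textFieldStyle(.roundedBorder)
                .frame(width: 400)
                .padding()
                .glassBackgroundEffect()
            // Search results could be presented here, below the field, while typing.
        }
    }
}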
Replies: 1 · Boosts: 0 · Views: 543 · Activity: Mar ’24
Reality Composer Pro: How to add text to a scene (and manipulate it with code)
I would like to add text to a Reality Composer Pro scene and set the actual text via code. How can I achieve this? I haven't seen any "Text" element in the editor.
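The workaround I am currently considering, to make the question concrete: leave a named placeholder entity in the Reality Composer Pro scene and attach a text mesh generated at runtime (a sketch; "TextAnchor" is a placeholder entity name I made up):

import RealityKit
import SwiftUI

func makeTextEntity(_ string: String) -> ModelEntity {
    // Generate a 3D text mesh in code, since there is no "Text" element in the RCP editor.
    let mesh = MeshResource.generateText(
        string,
        extrusionDepth: 0.002,
        font: .systemFont(ofSize: 0.1)
    )
    let material = SimpleMaterial(color: .white, isMetallic: false)
    return ModelEntity(mesh: mesh, materials: [material])
}

// Usage inside a RealityView, assuming the RCP scene contains an empty entity named "TextAnchor":
// if let anchor = scene.findEntity(named: "TextAnchor") {
//     anchor.addChild(makeTextEntity("Hello from code"))
// }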
Replies: 1 · Boosts: 0 · Views: 1.7k · Activity: Mar ’24
ImageTrackingProvider on visionOS: How to add tracked images over time?
I've seen that the ImageTrackingProvider allows the tracked images to be set in its initializer. But how can I add images afterwards? We have an application that loads its images dynamically at runtime.
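The only workaround I can think of so far is to stop the session and run it again with a fresh provider that contains the enlarged image set, roughly like this (a sketch; I don't know whether this is the intended approach):

import ARKit

final class ImageTrackingController {
    private let session = ARKitSession()
    private var provider: ImageTrackingProvider?

    // Restart tracking with an updated image set, since the provider's reference
    // images apparently cannot be changed after init.
    func restartTracking(with images: [ReferenceImage]) async throws {
        session.stop()
        let newProvider = ImageTrackingProvider(referenceImages: images)
        provider = newProvider
        try await session.run([newProvider])
    }
}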
Replies: 0 · Boosts: 0 · Views: 426 · Activity: Mar ’24