Post

Replies

Boosts

Views

Activity

Remote control
Hi everyone, I’m working on a concept for an iOS app that would allow a user to remotely control an enterprise iOS device, similar to how AnyDesk or TeamViewer work on desktop. I understand that apps like TeamViewer for iOS offer screen sharing and some level of control, but not full control. Before I invest further in development, I’d like to clarify a few points:
- Is there any official Apple-supported way (public or private API) to allow remote control of an iOS device?
- Has Apple ever approved apps that allow true remote control of iOS (not just screen sharing)?
- If full control is not allowed, what are the permitted alternatives (e.g. screen broadcast via ReplayKit, remote assistance mode, etc.)?
- Would such an app be considered for enterprise distribution only (via MDM), or is there a potential App Store path?
Any insight or experience from developers who’ve tried this would be very appreciated. Thanks!
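For the ReplayKit alternative mentioned above, a minimal sketch of that path, assuming a separate Broadcast Upload Extension handles the captured frames (the extension bundle identifier here is hypothetical):

import UIKit
import ReplayKit

// Present the system broadcast picker; the user taps it to start a
// full-device screen broadcast that a Broadcast Upload Extension
// receives frame by frame and can relay to a remote viewer.
final class RemoteAssistViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        let picker = RPSystemBroadcastPickerView(frame: CGRect(x: 0, y: 0, width: 60, height: 60))
        picker.preferredExtension = "com.example.remoteassist.broadcast" // hypothetical bundle ID
        picker.showsMicrophoneButton = false
        view.addSubview(picker)
    }
}

As far as public APIs go, this provides viewing only; there is no public API for injecting touch events back into the device.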
0
0
119
Jul ’25
How to trigger scene custom behaviour in Xcode 15
Hi, I'm working on an AR app. With Reality Composer and Xcode 14 I triggered custom behaviours with just:

myScene.notifications.myBox.post()

where myScene came from:

let myScene = try! Experience.loadBox()

Now in Xcode 15 I don't have an Experience; instead, with the .reality file I have to use entities, so:

let objectAR = try! Entity.load(named: "myProject.reality")

How can I trigger my previously exported Reality Composer custom behaviour from that?
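A sketch of one workaround: the generated Experience.swift implements notifications.myBox.post() by posting a "RealityKit.NotificationTrigger" notification through NotificationCenter. The notification name and userInfo keys below come from that generated code rather than a documented API, so treat them as an assumption:

import Foundation
import RealityKit

// Re-create what the generated NotificationTrigger.post() does: post a
// "RealityKit.NotificationTrigger" notification that names the scene and
// the trigger identifier defined in Reality Composer (e.g. "myBox").
func postBehaviorNotification(named identifier: String, in scene: RealityKit.Scene) {
    NotificationCenter.default.post(
        name: NSNotification.Name("RealityKit.NotificationTrigger"),
        object: nil,
        userInfo: [
            "RealityKit.NotificationTrigger.Scene": scene,
            "RealityKit.NotificationTrigger.Identifier": identifier
        ]
    )
}

// Usage, once objectAR is anchored in an ARView:
// postBehaviorNotification(named: "myBox", in: arView.scene)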
0
1
598
Oct ’23
ARKit Anchor
Hi, I’ve implemented an ARKit app that displays a .usdz object in the real world. In this scenario, placement happens via image recognition (a Reality Composer scene). Obviously, when the image (QR marker) is not visible, the app cannot detect the anchor and will not place the object in the real world. Is it possible to recognize an image (QR marker) and, after placing the object on it, leave the object there? So basically:
- detect the marker
- place the object
- leave the object there, no longer depending on the image (marker) recognition
Thanks
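A sketch of one common approach with a RealityKit ARView: once the image anchor has positioned the object, copy the object's world transform onto a fixed world anchor and re-parent it, so losing the marker no longer affects it (arView and objectEntity are placeholder names for your view and placed entity):

import ARKit
import RealityKit

// When the image anchor is first detected and the object is placed, pin the
// object to world space so it stays put even if the marker is lost.
func pinObjectToWorld(arView: ARView, objectEntity: Entity) {
    // Current pose of the object in world space, driven by the image anchor.
    let worldTransform = objectEntity.transformMatrix(relativeTo: nil)

    // Create a fixed world-space anchor at that pose and re-parent the object.
    let worldAnchor = AnchorEntity(world: worldTransform)
    arView.scene.addAnchor(worldAnchor)
    worldAnchor.addChild(objectEntity, preservingWorldTransform: true)
}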
0
1
731
Sep ’23
AR Scanner
I'm trying to scan a real-world object with Apple's ARKit Scanner. Sometimes the scan is not perfect, so I'm wondering if I can obtain an .arobject in other ways, for example with other scanning apps, and then merge all the scans into one single, more accurate scan. I know that merging is possible: during an ARKit Scanner session the app prompts me to merge multiple scans, and in that case I can select a previous scan from the Files app. In this context I would like to add scans from other sources. Is that possible? And if so, are there other options for obtaining an .arobject, and is merging a practical way to improve the quality of object detection? Thanks
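For reference, the merge that the scanner app offers interactively is also available programmatically; a minimal sketch using ARKit's ARReferenceObject API (the file URLs are hypothetical):

import ARKit

// Load two archived scans (.arobject files), merge their feature points
// into a single reference object, and export the result.
func mergeScans(urlA: URL, urlB: URL, output: URL) throws {
    let scanA = try ARReferenceObject(archiveURL: urlA)
    let scanB = try ARReferenceObject(archiveURL: urlB)

    let merged = try scanA.merging(scanB)
    try merged.export(to: output, previewImage: nil)
}

Note that merging only succeeds when the scans share enough overlapping features; ARKit throws an error otherwise.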
0
0
664
Jul ’23
Cannot pause and deallocate ARSession using SwiftUI and ARKit
Hi, I'm developing an AR app using Apple ARKit. At the moment my AugmentedView uses the boilerplate provided by Apple's AR App project template. When I run the app in debug mode I see this warning in the console:

ARSession is being deallocated without being paused. Please pause running sessions explicitly.

Is it possible to pause an ARSession in SwiftUI? As far as I know all of this is managed by default by the OS. Is that correct? Note that I have two views: a parent view and, via a NavigationStack / NavigationLink, the child view.

Here is the code for the parent view:

import SwiftUI

struct ArIntroView: View {
    var body: some View {
        NavigationStack {
            NavigationLink(destination: AugmentedView(), label: {
                HStack {
                    Text("Go to ARView")
                }
                .padding()
            })
        }
    }
}

struct ArIntroView_Previews: PreviewProvider {
    static var previews: some View {
        ArIntroView()
    }
}

Here is the code for the child view:

import SwiftUI
import RealityKit

struct AugmentedView: View {
    @State private var showingSheet = false
    // add state to try to end AR explicitly
    // @State private var isExitTriggered: Bool = false

    var body: some View {
        ZStack {
            ARViewContainer()
                // hide the toolbar in this view
                .toolbar(.hidden, for: .tabBar)
                .ignoresSafeArea(.all)
            VStack {
                Spacer()
                HStack {
                    Button {
                        // toggle the bottom sheet
                        showingSheet.toggle()
                    } label: {
                        Text("Menu")
                            .frame(maxWidth: .infinity)
                            .padding()
                    }
                    .background()
                    .foregroundColor(.black)
                    .cornerRadius(10)
                    // present the bottom sheet
                    .sheet(isPresented: $showingSheet) {
                        HStack {
                            Button {
                                // dismiss the bottom sheet
                                showingSheet.toggle()
                            } label: {
                                Label("Exit", systemImage: "cross")
                            }
                        }
                        .presentationDetents([.medium])
                        .presentationDragIndicator(.visible)
                    }
                }
                .padding()
            }
            Spacer()
        }
    }
}

struct ARViewContainer: UIViewRepresentable {
    // @Binding var isExitTriggered: Bool
    let arView = ARView(frame: .zero)

    func makeUIView(context: Context) -> ARView {
        // Load the "Box" scene from the "Experience" Reality File
        let boxAnchor = try! Experience.loadBox()

        // Add the box anchor to the scene
        arView.scene.anchors.append(boxAnchor)

        return arView
    }

    // does not work: it removes the view and then re-creates it from makeUIView
    func updateUIView(_ uiView: ARView, context: Context) {
        // if isExitTriggered {
        //     uiView.session.pause()
        //     uiView.removeFromSuperview()
        // }
    }
}

#if DEBUG
struct Augmented_Previews: PreviewProvider {
    static var previews: some View {
        AugmentedView()
    }
}
#endif

In addition, when running the app and checking performance with Apple Instruments, I see memory leaks during the AR session, apparently related to the CoreRE library (see the attached Instruments snapshot). Any suggestions or critiques are welcome.
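One way to address the warning, as a sketch: implement UIViewRepresentable's static dismantleUIView(_:coordinator:) hook in ARViewContainer. SwiftUI calls it when it tears the representable down, which is the place to pause the session explicitly (the () coordinator type matches the default when makeCoordinator is not implemented):

struct ARViewContainer: UIViewRepresentable {
    // ... makeUIView / updateUIView as above ...

    // SwiftUI calls this static hook when the view is being torn down;
    // pause the running session here instead of in updateUIView.
    static func dismantleUIView(_ uiView: ARView, coordinator: ()) {
        uiView.session.pause()
    }
}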
2
0
1.7k
Jun ’23