
Reply to Getting ShinyTV Example to Work
I have solved this problem. It seems that adding the Associated Domains entitlement inadvertently created a second entitlements file in a different location, and the system was not resolving the entitlement properly. Once I deleted both entitlements files and started fresh, sign-in worked properly.
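For anyone who hits the same thing: once cleaned up, the single .entitlements file should contain just one Associated Domains entry, along these lines (example.com is a placeholder for your domain):

<key>com.apple.developer.associated-domains</key>
<array>
    <string>webcredentials:example.com</string>
</array>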
Topic: App & System Services SubTopic: General
4d
Reply to Getting ShinyTV Example to Work
I have an Apple TV 4K (3rd generation) and was able to generate a sysdiagnose, so thanks for that clarification. This document could probably use updating, because it makes no such distinction for 3rd generation devices. Nevertheless, going through swcutil_show.txt, ShinyTV is not listed among the SharedWebCredential apps. I have also tried with another app of mine on a different domain, both with AASA files validated using swcutil. If the AASA file is set up correctly and in the right place, and the app has the Associated Domains entitlement, what else could prevent it from appearing in swcutil_show.txt?
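For reference, the AASA file in question is the standard webcredentials form, roughly like this (the team and bundle IDs are placeholders):

{
  "webcredentials": {
    "apps": ["ABCDE12345.com.example.ShinyTV"]
  }
}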
Topic: App & System Services SubTopic: General
6d
Reply to Getting ShinyTV Example to Work
Thanks for the suggestions. I have tried streaming console data from my Apple TV a few times, and the only items relevant to my password request are from the CompanionServices subsystem and AuthenticationServices; com.apple.swc does not show up for me. Testing with swcutil came back positive on every count: the file can be downloaded, and it identifies the webcredentials section, the app ID, and the domain. I saw a suggestion in the docs to capture a sysdiagnose, but my 3rd generation device doesn't support that, and tvOS shared login does not appear to work in the Simulator.
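For reference, the subsystem filter I was using looks like this when streaming logs on a Mac with the log tool (on Apple TV, the equivalent is filtering the streamed device logs in Console.app):

log stream --predicate 'subsystem == "com.apple.swc"' --info --debug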
Topic: App & System Services SubTopic: General
1w
Reply to Custom 3D Window Using RealityView
Hi, thanks for the reply; I think I wasn't clear about what's going on. I have a window with a RealityView in it. Currently that RealityView presents a Reality Composer scene. When I look at that window in the compiled app, the contents sit physically in front of the actual window, and moving them back in the scene has no effect at all. Since posting this, I have experimented with doing a findEntity in the scene and pulling out a Transform that parents a ModelEntity. That lets me manipulate the depth of the ModelEntity relative to the Transform. But it is surprising that I can't do the same thing with the scene itself; I have to extract individual scene elements to adjust their depth.
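Since it may help to see it concretely, here is a minimal sketch of the workaround, assuming a visionOS-style RealityView; "Scene" and "ModelAnchor" are placeholder names from my project:

import SwiftUI
import RealityKit
import RealityKitContent

struct DepthAdjustedView: View {
    var body: some View {
        RealityView { content in
            // Load the Reality Composer scene ("Scene" is a placeholder name).
            if let scene = try? await Entity(named: "Scene", in: realityKitContentBundle) {
                content.add(scene)
                // Workaround: find the Transform that parents the ModelEntity
                // and move it back; offsetting the scene root had no effect.
                if let holder = scene.findEntity(named: "ModelAnchor") {
                    holder.position.z = -0.3
                }
            }
        }
    }
}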
Nov ’24
Reply to SwiftUI and @FetchRequest - modify predicate or sort dynamically?
Extending what has already been suggested, you can modify the predicate of the wrapped value to dynamically trigger an update to a SwiftUI list. In the example below, toggling the isFiltered @State causes the List to update and reload the FetchedResults with the filtered or unfiltered predicate.

struct ItemListView: View {
    @FetchRequest private var items: FetchedResults<Item>
    @State var isFiltered = false

    let element: Element
    let filteredPredicate: NSPredicate
    let unfilteredPredicate: NSPredicate

    init(element: Element) {
        self.element = element
        filteredPredicate = NSPredicate(format: "element == %@ && score > 0.85", element)
        unfilteredPredicate = NSPredicate(format: "element == %@", element)
        self._items = FetchRequest<Item>(
            entity: Item.entity(),
            sortDescriptors: [NSSortDescriptor(keyPath: \Item.name, ascending: true)],
            predicate: unfilteredPredicate,
            animation: .default)
    }

    var listItems: FetchedResults<Item> {
        // Swap the predicate on the wrapped value before returning the results.
        _items.wrappedValue.nsPredicate = isFiltered ? filteredPredicate : unfilteredPredicate
        return items
    }

    var body: some View {
        List {
            ForEach(listItems) { item in
                Text(item.name ?? "")
            }
        }
        .toolbar {
            ToolbarItem {
                Button {
                    withAnimation {
                        isFiltered.toggle()
                    }
                } label: {
                    Label("Filter Items", systemImage: isFiltered ? "star.circle.fill" : "star.circle")
                }
            }
        }
    }
}
Topic: UI Frameworks SubTopic: SwiftUI
Jun ’22
Reply to ARKit 5 Motion Capture Enabled?
I cannot speak to your difficulties. I have a pre-existing app that successfully uses motion capture on iOS 13 and 14. In my test, I ran the same app simultaneously on a device running the iOS 15 beta and another running iOS 14.6, recorded a video of each, and compared them. Motion capture works on 13.5, 14, and 15, but the new precision in ARKit 5 is supposed to be limited to devices with the A14 chip.
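For completeness, the API involved is unchanged across these releases; only the hardware gates the new precision. A minimal sketch of the support check and configuration, assuming an existing ARSession:

import ARKit

func startBodyTracking(in session: ARSession) {
    // Motion capture requires an A12 chip or later; ARKit 5's added
    // precision is supposed to need A14, but the calls are the same.
    guard ARBodyTrackingConfiguration.isSupported else { return }
    session.run(ARBodyTrackingConfiguration())
}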
Topic: Spatial Computing SubTopic: ARKit
Jul ’21
Reply to iOS 14 change in performance cost of SCNNode creation
I ended up removing my SCNNode instantiations and finding another way to handle my situation, but I have also found that I have problems modifying a node's transform while a UIKit animation is running. It seems to cause a roughly one-second hitch in my frame rate every second or so, even when the node in question is not visible. I tried the SceneKit profiling you suggested, but at least with my current setup I am not seeing any compile events, and no other clear culprits in the event durations.
Oct ’20
Reply to ARKit Behaviors Notification
When you add a Reality Composer file to a RealityKit project, Xcode generates code with properties and methods specific to your scene. If you have a Reality Composer file called MyTestApp and a scene called MyScene, and you then create a notification trigger with an identifier of HideObject, the generated code creates a notification accessible from your scene object in your app. So for example:

MyTestApp.loadMySceneAsync { result in
    switch result {
    case .success(let scene):
        scene.notifications.hideObject.post()
    default:
        break
    }
}
Topic: Spatial Computing SubTopic: ARKit
Oct ’20