What's the most effective equivalent to the accessibilityElement API when you're creating visualizations in SwiftUI using the new Canvas drawing mechanism?
When I'm drawing with Canvas, I know roughly the same positioning information as I would when drawing chart visualizations in UIKit or AppKit, but I didn't spot an equivalent API for marking sections of the visualization as associated with specific values (akin to setting the frame coordinates on a UIView accessibility element).
I've only just started playing with Canvas, so I may be wrong, but it seems like a monolithic view element that I can't easily break down into sub-components the way I can by applying separate CALayers or UIViews on which to hang accessibility indicators.
For making the equivalent bar chart in straight-up SwiftUI, is there a similar mechanism, or is this a place where I'd need to pop the escape hatch to UIKit or AppKit, do the relevant work there, and then host it inside a SwiftUI view?
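To make the ask concrete, here's a rough sketch of the pattern I'm hoping exists. The .accessibilityChildren modifier looks like a plausible hook for supplying per-bar elements over a Canvas, though I haven't verified it's the intended approach for this, and the bar values here are just placeholders:

```swift
import SwiftUI

struct BarChart: View {
    // Placeholder data for illustration only
    let values: [Double] = [3, 7, 5, 9]

    var body: some View {
        Canvas { context, size in
            let barWidth = size.width / CGFloat(values.count)
            let maxValue = values.max() ?? 1
            for (index, value) in values.enumerated() {
                let height = size.height * CGFloat(value / maxValue)
                let rect = CGRect(x: CGFloat(index) * barWidth,
                                  y: size.height - height,
                                  width: barWidth * 0.8,
                                  height: height)
                context.fill(Path(rect), with: .color(.blue))
            }
        }
        // Possible hook: expose one accessibility element per bar.
        // I haven't confirmed this is the intended pattern for Canvas.
        .accessibilityChildren {
            HStack {
                ForEach(values.indices, id: \.self) { index in
                    Rectangle()
                        .accessibilityLabel("Bar \(index + 1)")
                        .accessibilityValue("\(values[index])")
                }
            }
        }
    }
}
```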
I'd love to be able to run HTTP traces on a few specific integration-style tests that I've written using XCUITest. Is there a way, even with external scripting, to invoke the tests and have the profiler capture a trace like this?
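One avenue I've considered, but not verified, is driving it externally with xctrace while xcodebuild runs the specific test. The scheme, test identifiers, and template name below are all placeholders on my part (I'm guessing at "HTTP Traffic"; `xctrace list templates` would confirm what's actually installed):

```sh
# Run just the one UI test in the background
# (scheme and test identifiers here are hypothetical).
xcodebuild test \
  -scheme MyApp \
  -destination 'platform=iOS Simulator,name=iPhone 12' \
  -only-testing:MyAppUITests/IntegrationTests/testCheckoutFlow &

# Give the app under test a moment to launch before attaching.
sleep 5

# Attach Instruments to the app under test and record a trace.
# "HTTP Traffic" is my guess at the template name.
xctrace record \
  --template 'HTTP Traffic' \
  --attach 'MyApp' \
  --output checkout.trace
```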
Is any of the sample code from Explore advanced rendering with RealityKit 2 (the code that generates the dynamic mesh resource, or the geometry modifier) available as a standalone project?
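For context, this is the rough shape of what I'm trying to reproduce with the MeshDescriptor API; the single placeholder triangle is mine, not geometry from the session:

```swift
import RealityKit

// Minimal sketch of generating a dynamic MeshResource with RealityKit 2's
// MeshDescriptor API. The geometry is a placeholder triangle, not the
// mesh from the session.
func makeTriangleMesh() throws -> MeshResource {
    var descriptor = MeshDescriptor(name: "triangle")
    descriptor.positions = MeshBuffers.Positions([
        SIMD3<Float>(0, 0, 0),
        SIMD3<Float>(1, 0, 0),
        SIMD3<Float>(0, 1, 0),
    ])
    descriptor.primitives = .triangles([0, 1, 2])
    return try MeshResource.generate(from: [descriptor])
}
```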
If the library I want to document is primarily tracked with a Package.swift cross-platform definition, what's the optimal way to build and render documentation?
Can I still use xcodebuild docbuild with the Package.swift format for the project, or does the documentation system require a fully-defined Xcode project wrapped around it?
Or do I need to open the Package.swift file in Xcode and then generate the DocC archive from Xcode's Product menu?
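For concreteness, this is the invocation I'd expect to work if docbuild does support bare packages; the scheme name is an assumption on my part (`xcodebuild -list` would show what Xcode actually generates for the package):

```sh
# Attempting DocC generation straight from a Package.swift checkout.
# "MyLibrary" is a hypothetical scheme name for the package's library target.
xcodebuild docbuild \
  -scheme MyLibrary \
  -destination 'generic/platform=iOS' \
  -derivedDataPath ./build
```

If that works, the .doccarchive should land somewhere under the derived data path's Build/Products directory.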
Hello,
I recently downloaded the sample from Creating a Game with SceneUnderstanding - https://developer.apple.com/documentation/realitykit/creating_a_game_with_sceneunderstanding, and the out-of-the-box compilation failed with what appear to be some structural misses in RealityKit.
This is with Xcode 12.4 (12D4e) - and a number of compilation complaints appear to be Objective-C elements not being correctly bridged, or otherwise unavailable to the compiler.
Missing:
- ARView.RenderOptions
- .session within an ARView instance
- SceneUnderstanding from within ARView.Environment
- .showSceneUnderstanding from ARView.DebugOptions
and a few others. I tried changing the deployment target from iOS 14.0 to iOS 14.4, doing a clean build and rebuild, and wiping out DerivedData just to take a stab at things, but to no avail.
Is there something I can or should update to make the scene understanding elements available again on the ARView class?
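For reference, these are the sorts of call sites that fail to compile for me. The simulator guard below is purely my guess at a workaround (on the theory that these members aren't exposed when building for the simulator); it isn't from the sample:

```swift
import RealityKit
#if canImport(ARKit)
import ARKit
#endif

// Sketch of the failing call sites from the sample, gated on the
// target environment as a guessed workaround.
func configureSceneUnderstanding(for arView: ARView) {
    #if !targetEnvironment(simulator)
    arView.environment.sceneUnderstanding.options = [.occlusion, .physics]
    arView.debugOptions.insert(.showSceneUnderstanding)
    _ = arView.session // ARSession access, also reported as missing
    #endif
}
```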