I thought ARCoachingOverlayView was a nice touch, since it made each app's ARKit coaching instantly recognizable, and I used it in my ARView/ARSCNView-based apps.
Now with RealityView, is there any replacement planned?
Or should we just wrap ARCoachingOverlayView in a UIViewRepresentable?
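For now I'm leaning towards the wrapper route; a minimal sketch of what I mean, assuming the ARSession driving the experience can be passed in explicitly:

import SwiftUI
import ARKit

/// Minimal UIViewRepresentable wrapper around ARCoachingOverlayView.
/// Assumes the caller can provide the ARSession that drives the experience.
struct CoachingOverlay: UIViewRepresentable {
    let session: ARSession
    var goal: ARCoachingOverlayView.Goal = .horizontalPlane

    func makeUIView(context: Context) -> ARCoachingOverlayView {
        let overlay = ARCoachingOverlayView()
        overlay.session = session
        overlay.goal = goal
        overlay.activatesAutomatically = true
        return overlay
    }

    func updateUIView(_ uiView: ARCoachingOverlayView, context: Context) {
        uiView.goal = goal
    }
}

Overlaying that on top of the RealityView (e.g. in a ZStack) should then be enough, provided the session is the one actually running the AR experience.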
Is there any way to render a RealityView to an Image/UIImage, like we used to be able to do with SCNView.snapshot()?
ImageRenderer doesn't work because it renders a SwiftUI view hierarchy, and I need the currently presented RealityView, with the camera background and 3D scene content, the way the user sees it.
I tried UIHostingController and UIGraphicsImageRenderer, like:
extension View {
    func snapshot() -> UIImage {
        let controller = UIHostingController(rootView: self)
        let view = controller.view
        let targetSize = controller.view.intrinsicContentSize
        view?.bounds = CGRect(origin: .zero, size: targetSize)
        view?.backgroundColor = .clear
        let renderer = UIGraphicsImageRenderer(size: targetSize)
        return renderer.image { _ in
            view?.drawHierarchy(in: view!.bounds, afterScreenUpdates: true)
        }
    }
}
but that leads to the app freezing and logging an endless stream of
[CAMetalLayer nextDrawable] returning nil because allocation failed.
The same thing happens when I try
return renderer.image { ctx in
    view.layer.render(in: ctx.cgContext)
}
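For comparison, if I go back to an ARView-based setup, RealityKit's ARView does offer a snapshot API, which is what I was hoping to find for RealityView; a minimal sketch (arView being the presented ARView):

// Asynchronously captures the rendered ARView, including the camera background.
arView.snapshot(saveToHDR: false) { image in
    guard let image else { return }
    // use the UIImage here, e.g. save or share it
}

I haven't found a documented equivalent that works with a RealityView though.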
Now that SceneKit is deprecated, I didn't want to start a new app using deprecated APIs.
In sessions like "Compose interactive 3D content in Reality Composer Pro", RealityKit engineers recommended working with Reality Composer Pro to create RealityKit content packages that we embed in our RealityKit Xcode projects.
And, comparing the workflow to Unity/Unreal, I can see the reasoning, since it is nice to prepare scenes/materials/assets visually.
Now, when we also want to run an Xcode Cloud CI/CD pipeline, this seems to come into conflict:
when adding a basic *.usdz to the RealityKitContent.rkassets folder, every build we run on Xcode Cloud fails with:
Compile Reality Asset RealityKitContent.rkassets
❌realitytool requires Metal for this operation and it is not available in this build environment
I have also found a related forum post, but it was specifically about compiling a *.skybox.
Is there any standard way of efficiently showing an MTLTexture on a RealityKit Entity?
I can't find anything proper on how to, for example, generate a LowLevelTexture from an MTLTexture. The closest match was this two-year-old thread.
In the old SceneKit app, we would just do:
guard let material = someNode.geometry?.materials.first else { return }
material.diffuse.contents = mtlTexture
Our flow is as follows (for visualizing the currently detected object):
camera stream -> CoreML segmentation -> send the relevant part of the MLShapedArray tensor to a Metal compute shader that returns an MTLTexture -> show the resulting texture on a 3D object to the user.
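What I'm experimenting with so far, pieced together from the LowLevelTexture docs, is roughly the sketch below. The LowLevelTexture.Descriptor parameters, TextureResource(from:) and replace(using:) are my reading of the API, so treat it as an assumption rather than verified code:

import RealityKit
import Metal

// Sketch: back a RealityKit material with a LowLevelTexture that a Metal pipeline can write into.
func makeTexturedEntity(width: Int, height: Int) throws -> (ModelEntity, LowLevelTexture) {
    // Describe the texture we want RealityKit to own.
    let descriptor = LowLevelTexture.Descriptor(
        pixelFormat: .bgra8Unorm,
        width: width,
        height: height,
        textureUsage: [.shaderRead, .shaderWrite]
    )
    let lowLevelTexture = try LowLevelTexture(descriptor: descriptor)

    // Bridge it into a material, roughly the RealityKit analogue of
    // `material.diffuse.contents = mtlTexture` in SceneKit.
    let resource = try TextureResource(from: lowLevelTexture)
    var material = UnlitMaterial()
    material.color = .init(texture: .init(resource))

    let entity = ModelEntity(mesh: .generatePlane(width: 0.3, depth: 0.3), materials: [material])
    return (entity, lowLevelTexture)
}

// Per frame: copy the compute shader's output texture into the LowLevelTexture.
func update(_ lowLevelTexture: LowLevelTexture, from mtlTexture: MTLTexture, commandBuffer: MTLCommandBuffer) {
    let target = lowLevelTexture.replace(using: commandBuffer)
    if let blit = commandBuffer.makeBlitCommandEncoder() {
        blit.copy(from: mtlTexture, to: target)
        blit.endEncoding()
    }
}

If there is a more direct way to hand an existing MTLTexture to an Entity without the extra blit, that's exactly what I'm looking for.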
I added a basic Hello World SwiftUI view to an existing UIKit project, yet I cannot get the preview to work.
The usual error is:
MessageSendFailure: Message send failure for send render message to agent
==================================
| RemoteHumanReadableError: Could not connect to agent
|
| Bootstrap timeout after 8.0s waiting for connection from 'Identity(pid: 30286, sceneIdentifier: Optional("XcodePreviews-30286-133-static"))' on service com.apple.dt.uv.agent-preview-service
Neither this nor the generated report is very helpful.
I also created a new Xcode project to see if the same View works in a new test project, which it does.
If my project compiles without warnings or errors, but the SwiftUI preview fails, what options do I have left? (My deployment target is iOS 14, and I'm on a fresh install of Xcode 13.0.)
I tried animating the scrollTo() with withAnimation, as described in the docs: https://developer.apple.com/documentation/swiftui/scrollviewreader
withAnimation {
    scrollProxy.scrollTo(index, anchor: .center)
}
but the result is the same as if I do
withAnimation(Animation.easeIn(duration: 20)) {
    scrollProxy.scrollTo(progress.currentIndex, anchor: .center)
}
I also tried this using the example from the ScrollViewReader docs, with the result that scrolling up and down has exactly the same animation:
struct ScrollingView: View {
    @Namespace var topID
    @Namespace var bottomID

    var body: some View {
        ScrollViewReader { proxy in
            ScrollView {
                Button("Scroll to Bottom") {
                    withAnimation {
                        proxy.scrollTo(bottomID)
                    }
                }
                .id(topID)

                VStack(spacing: 0) {
                    ForEach(0..<100) { i in
                        color(fraction: Double(i) / 100)
                            .frame(height: 32)
                    }
                }

                Button("Top") {
                    withAnimation(Animation.linear(duration: 20)) {
                        proxy.scrollTo(topID)
                    }
                }
                .id(bottomID)
            }
        }
    }

    func color(fraction: Double) -> Color {
        Color(red: fraction, green: 1 - fraction, blue: 0.5)
    }
}

struct ScrollingView_Previews: PreviewProvider {
    static var previews: some View {
        ScrollingView()
    }
}
Most models are only available as glb or fbx, so I usually re-export them to usdz using Blender.
When I import them into Reality Composer Pro, the mesh, textures etc. look great, but in the Animation Library subsection all I can see is one default subtree animation.
In Blender I can see all the available animations and play them individually; the default subtree animation just plays the default idle animation.
In fact, when I open the nonlinear animation view in Blender and select a different animation as the default, the exported usdz shows the newly selected animation as the default subtree animation.
I can see in the Apple sample apps that models can have multiple animations in their Animation Library.
I'm using the latest Blender 4.5, so the usdz exporter should be working properly, shouldn't it?
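For context, this is how I check what actually ends up in the file once it's loaded in RealityKit (a small sketch; modelURL is a placeholder for the exported usdz):

import RealityKit

// Load the exported usdz and see how many animations RealityKit finds.
func inspectAnimations(at modelURL: URL) async throws {
    let entity = try await Entity(contentsOf: modelURL)
    print("Found \(entity.availableAnimations.count) animations")

    // Apple's sample models list several entries here; my Blender exports only show one.
    if let first = entity.availableAnimations.first {
        entity.playAnimation(first.repeat())
    }
}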
Hey there,
When I run the following 50 lines of code in Release mode, or turn on Optimization in Build Settings under Swift Compiler - Code Generation, I get the following crash.
Does anyone have any idea why that happens? (Xcode 13.4.1; it happens on device as well as in the simulator, on iOS 15.5 and 15.6.)
Example Project: https://github.com/Bersaelor/ResourceCrashMinimalDemo
#0 0x000000010265dd58 in assignWithCopy for Resource ()
#1 0x000000010265d73c in outlined init with copy of Resource<VoidPayload, String> ()
#2 0x000000010265d5dc in specialized Resource<>.init(url:method:query:authToken:headers:) [inlined] at /Users/konradfeiler/Source/ResourceCrashMinimalDemo/ResourceCrashMinimalDemo/ContentView.swift:51
#3 0x000000010265d584 in specialized ContentView.crash() at /Users/konradfeiler/Source/ResourceCrashMinimalDemo/ResourceCrashMinimalDemo/ContentView.swift:18
Code needed:
import SwiftUI

struct ContentView: View {
    var body: some View {
        Button(action: { crash() }, label: { Text("Create Resource") })
    }

    /// Crashes in `outlined init with copy of Resource<VoidPayload, String>`.
    func crash() {
        let testURL = URL(string: "https://www.google.com")!
        let r = Resource<VoidPayload, String>(url: testURL, method: .get, authToken: nil)
        print("r: \(r)")
    }
}

struct VoidPayload {}

enum HTTPMethod<Payload> {
    case get
    case post(Payload)
    case patch(Payload)
}

struct Resource<Payload, Response> {
    let url: URL
    let method: HTTPMethod<Payload>
    let query: [(String, String)]
    let authToken: String?
    let parse: (Data) throws -> Response
}

extension Resource where Response: Decodable {
    init(
        url: URL,
        method: HTTPMethod<Payload>,
        query: [(String, String)] = [],
        authToken: String?,
        headers: [String: String] = [:]
    ) {
        self.url = url
        self.method = method
        self.query = query
        self.authToken = authToken
        self.parse = {
            try JSONDecoder().decode(Response.self, from: $0)
        }
    }
}