AR Quick Look has two modes:
Object Mode: you can view a model in an empty space with a ground plane and a shadow
AR Mode: you can view the model in an AR context, within a real environment
Does the developer have access to this functionality (moving between camera and non-camera modes)? I'm really asking whether the camera can be disabled and re-enabled in the same session.
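As far as I can tell, AR Quick Look itself doesn't expose this switch, but if you build the viewer yourself with RealityKit, ARView's cameraMode can be flipped within the same session. A minimal sketch (the helper name is mine; only the ARView APIs are real):

```swift
import RealityKit
import UIKit

/// Sketch: toggle a custom ARView between the camera-backed AR mode and a
/// non-camera "object" presentation. This only applies to your own ARView,
/// not to AR Quick Look's QLPreviewController.
func setObjectMode(_ arView: ARView, enabled: Bool) {
    if enabled {
        // Leave AR: stop showing the camera feed, use a plain background.
        arView.cameraMode = .nonAR
        arView.environment.background = .color(.white)
    } else {
        // Return to AR: render the camera feed again.
        arView.cameraMode = .ar
        arView.environment.background = .cameraFeed()
    }
}
```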
Thanks
I'd like to know the location of the pivot point of a ModelEntity. Is there any way to get this? Alternatively, can I get it from the USDZ file?
I want to place models in a specific location, and some models have the pivot in the center while others have it at the bottom. If I can detect this, I can adjust accordingly and place them correctly. I don't have control over the models, alas.
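One way to detect this (a sketch relying on RealityKit's `visualBounds(relativeTo:)`) is to measure the model's bounding box in the entity's own coordinate space, where the pivot sits at the origin; the helper names below are my own:

```swift
import RealityKit

/// Sketch: infer where a ModelEntity's pivot sits by inspecting its visual
/// bounds relative to itself, i.e. in the space where the pivot is (0, 0, 0).
func pivotIsAtBottom(of entity: ModelEntity, tolerance: Float = 0.01) -> Bool {
    let bounds = entity.visualBounds(relativeTo: entity)
    // Pivot at the bottom: the box starts at y ≈ 0.
    // Pivot at the center: the box extends down to y ≈ -height/2.
    return abs(bounds.min.y) < tolerance
}

/// Vertical offset to add so the model's base sits on the placement point,
/// whichever convention the model uses.
func baseOffset(of entity: ModelEntity) -> Float {
    -entity.visualBounds(relativeTo: entity).min.y
}
```

Adding `baseOffset` to the placement position should give consistent results for both conventions, without needing to special-case individual models.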
Thanks,
Spiff
Hi, when I import a USDZ from the RoomPlan demo code into Blender, it results in no geometry. Xcode, on the other hand, has no problem with the model, and neither does Preview. Has anyone else had this issue? Apparently the forum won't let me upload a model here.
Hi, is it possible to disable the scrolling behavior of the SwiftUI List view? I'd like to take advantage of the new grouping features
List(content, children: \.children)
in List, and I want the list to be part of a larger scrolling view. As it stands, I get an embedded scroll view for the list, which is not my intent.
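If you can target iOS 16 or later, the `scrollDisabled(_:)` modifier may do what you want. A sketch, where `Item` and the fixed height are illustrative assumptions (a List inside a ScrollView still needs an explicit height):

```swift
import SwiftUI

/// Hypothetical hierarchical model type for the grouping demo.
struct Item: Identifiable {
    let id = UUID()
    let name: String
    var children: [Item]?
}

struct EmbeddedListView: View {
    let items: [Item]

    var body: some View {
        ScrollView {
            // ... other content of the larger scrolling view ...
            List(items, children: \.children) { item in
                Text(item.name)
            }
            .scrollDisabled(true)  // turn off the List's own scrolling (iOS 16+)
            .frame(height: 400)    // assumed height; List won't size itself in a ScrollView
        }
    }
}
```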
Thanks!
I'm recreating the AR Quick Look controller in code. One of its behaviors is to move the model to the visible center when entering Object Mode. I've hacked the ARViewContainer of the default Xcode Augmented Reality App template to demonstrate what I'm trying to do.
I think that moving the entity to (0, 0, 0) will generally not do the right thing, because the world origin will be elsewhere. What I'm not clear on is how to specify the translation for entity.move() in the code. I'm assuming I'll need to raycast using a CGPoint describing the view center to obtain the appropriate translation, but I'm not sure about the details. Thanks for any help with this.
struct ARViewContainer: UIViewRepresentable {
    let arView = ARView(frame: .zero)
    let boxAnchor = try! Experience.loadBox()

    func makeUIView(context: Context) -> ARView {
        arView.scene.anchors.append(boxAnchor)
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {
        // After 4 seconds, switch to a non-AR presentation and animate the
        // model to the world origin, which is generally not the view center.
        DispatchQueue.main.asyncAfter(deadline: .now() + 4) {
            arView.environment.background = .color(.white)
            arView.cameraMode = .nonAR
            if let entity = boxAnchor.children.first {
                let translation = SIMD3<Float>(x: 0, y: 0, z: 0)
                let transform = Transform(scale: .one,
                                          rotation: simd_quatf(),
                                          translation: translation)
                entity.move(to: transform,
                            relativeTo: nil,
                            duration: 2,
                            timingFunction: .easeInOut)
            }
        }
    }
}
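For the raycast step, here is a sketch of what I believe the approach looks like: cast from the view's center point and use the hit's world transform as the move target. It assumes an active AR session (raycasting needs scene understanding, so it should run before switching to .nonAR); the helper name is mine:

```swift
import RealityKit
import ARKit

/// Sketch: find a world-space position under the view's center by raycasting,
/// then animate the entity there. Must be called while the AR session is
/// running in camera mode.
func moveEntityToViewCenter(_ entity: Entity, in arView: ARView) {
    let center = CGPoint(x: arView.bounds.midX, y: arView.bounds.midY)
    guard let result = arView.raycast(from: center,
                                      allowing: .estimatedPlane,
                                      alignment: .any).first else { return }
    // The hit's translation is the fourth column of its world transform.
    let translation = SIMD3<Float>(result.worldTransform.columns.3.x,
                                   result.worldTransform.columns.3.y,
                                   result.worldTransform.columns.3.z)
    var transform = entity.transform
    transform.translation = translation
    // relativeTo: nil means the transform is interpreted in world space.
    entity.move(to: transform, relativeTo: nil,
                duration: 2, timingFunction: .easeInOut)
}
```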
I'm trying to render an MKPolyline on tvOS and get a runtime exception:
_validateTextureView:531: failed assertion `cannot create View from Memoryless texture.'
This same code works fine on the iPad, so I'm starting to think it just doesn't work on tvOS... unless this is a beta issue.
I don’t understand the times that are returned in forecasts.
guard let (current, hourly, daily) = try? await weatherService.weather(for: location, including: .current, .hourly, .daily(startDate: startDate, endDate: endDate)) else { return }
I’m expecting this to be the time to which the forecast applies.
let utc = hourly.forecast[hour].date
When I convert it from UTC to local time, I get something unusual for localDateString.
let dateFormatter = DateFormatter()
dateFormatter.timeZone = TimeZone.current
dateFormatter.dateFormat = "ha"
let localDateString = dateFormatter.string(from: utc)
What are these times? Shouldn’t the hourly values be times increasing in one-hour increments from startDate? Am I doing something incorrectly?
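One thing worth noting: a Date carries no time zone at all; it is an absolute instant, and the UTC-versus-local distinction only appears when formatting. A self-contained Foundation sketch with a fixed, arbitrary timestamp (chosen purely for illustration):

```swift
import Foundation

// A Date is an absolute instant; "converting from UTC" happens only in the
// formatter. The same instant formats to different hour strings per zone.
let instant = Date(timeIntervalSince1970: 1_660_000_000) // 2022-08-08 23:06:40 UTC

let utcFormatter = DateFormatter()
utcFormatter.timeZone = TimeZone(identifier: "UTC")
utcFormatter.dateFormat = "ha"
utcFormatter.locale = Locale(identifier: "en_US_POSIX") // fixed AM/PM symbols

let nyFormatter = DateFormatter()
nyFormatter.timeZone = TimeZone(identifier: "America/New_York")
nyFormatter.dateFormat = "ha"
nyFormatter.locale = Locale(identifier: "en_US_POSIX")

print(utcFormatter.string(from: instant)) // "11PM" (UTC)
print(nyFormatter.string(from: instant))  // "7PM" (same instant, EDT = UTC-4)
```

So if the hourly dates look "unusual" after formatting with TimeZone.current, the values themselves may still be correct one-hour increments; it may simply be the local-time rendering of instants that began on a UTC hour boundary.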
Thanks