I'm trying to render an MKPolyline on tvOS and get a runtime exception with
_validateTextureView:531: failed assertion `cannot create View from Memoryless texture.'
The same code works fine on the iPad, so I'm starting to think it just doesn't work on tvOS, unless this is a beta issue.
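For reference, the rendering path is the standard overlay/renderer setup, roughly like this (a trimmed-down sketch; the class name and coordinates are placeholders):

import MapKit
import UIKit

class PolylineMapViewController: UIViewController, MKMapViewDelegate {
    let mapView = MKMapView()

    override func viewDidLoad() {
        super.viewDidLoad()
        mapView.frame = view.bounds
        mapView.delegate = self
        view.addSubview(mapView)

        // Placeholder coordinates; the real data comes from elsewhere.
        let coordinates = [
            CLLocationCoordinate2D(latitude: 37.33, longitude: -122.01),
            CLLocationCoordinate2D(latitude: 37.34, longitude: -122.02)
        ]
        let polyline = MKPolyline(coordinates: coordinates, count: coordinates.count)
        mapView.addOverlay(polyline)
    }

    // Standard renderer callback for the overlay.
    func mapView(_ mapView: MKMapView, rendererFor overlay: MKOverlay) -> MKOverlayRenderer {
        if let polyline = overlay as? MKPolyline {
            let renderer = MKPolylineRenderer(polyline: polyline)
            renderer.strokeColor = .systemBlue
            renderer.lineWidth = 3
            return renderer
        }
        return MKOverlayRenderer(overlay: overlay)
    }
}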
Hi, is it possible to disable the scrolling behavior of the SwiftUI List view? I'd like to take advantage of the new hierarchical grouping feature
List(content, children: \.children)
and have the list be part of a larger scrolling view. As it stands I get an embedded scroll view for the list, which is not my intent.
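For concreteness, the arrangement I'm after looks something like this (a minimal sketch; Node, OuterView, and the fixed frame height are placeholders):

import SwiftUI

struct Node: Identifiable {
    let id = UUID()
    let name: String
    var children: [Node]?          // nil marks a leaf row
}

struct OuterView: View {
    let content: [Node]

    var body: some View {
        ScrollView {
            VStack(alignment: .leading) {
                Text("Header content that should scroll together with the list")

                // The hierarchical list; ideally it would size itself and
                // defer all scrolling to the enclosing ScrollView.
                List(content, children: \.children) { node in
                    Text(node.name)
                }
                .frame(height: 400)   // placeholder; without a fixed frame the List collapses
            }
        }
    }
}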
Thanks!
Hi, why doesn't this view scroll? The image is taller than the screen.
struct ImageScroll: View {
    @State var uiImage: UIImage

    var body: some View {
        ScrollView {
            Image(uiImage: uiImage)
                .focusable()
        }
        .focusable()
    }
}
Thanks,
Spiff
Hi, after rotating the iPad, ARCamera.trackingState transitions to .initializing and the video freezes. The application as a whole still functions. I'm unable to extract a relevant code example as this is embedded in a larger application. I am not seeing this behavior on iPhone. Is there any possibility that this is related to layout constraints? I see some warnings when the rotation occurs. Is there anything else I might examine?
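In case it helps narrow things down, I'm watching the transitions with an ordinary session observer along these lines (a logging-only sketch; the class name is mine):

import ARKit

// Logging-only observer to see exactly when and why tracking degrades.
class TrackingObserver: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, cameraDidChangeTrackingState camera: ARCamera) {
        switch camera.trackingState {
        case .normal:
            print("tracking: normal")
        case .notAvailable:
            print("tracking: not available")
        case .limited(let reason):
            print("tracking: limited, reason = \(reason)")   // .initializing shows up here after the rotation
        }
    }
}

Whatever owns the ARSession keeps a strong reference to the observer and assigns it to session.delegate, since the delegate property is weak.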
Thank you
I am presenting a list of Items which are sectioned by the categoryIdentifier property. My app uses CloudKit. When I change the categoryIdentifier for an item it is displayed in the correct section, and other running instances of the app do the same. After restarting the app, some items are grouped under an incorrect section, but the categoryIdentifier within the item is still as it was set. My question is what I'm doing wrong that the organization is incorrect after a restart. In case it matters, I'm setting this on the container:
container.viewContext.automaticallyMergesChangesFromParent = true
As an aside: it seems necessary to make the section identifier type optional (as it is in the underlying entity), like this:
SectionedFetchResults<String?, Item>
though the examples don't seem to need this.
struct ContentView: View {
    @Environment(\.managedObjectContext) private var viewContext
    @State private var isShowingItem = false
    @State private var isAddingItem = false

    @SectionedFetchRequest(
        sectionIdentifier: \.categoryIdentifier,
        sortDescriptors: [NSSortDescriptor(keyPath: \Item.name, ascending: true)],
        animation: .default) private var sectionedItems: SectionedFetchResults<String?, Item>

    var body: some View {
        NavigationView {
            List {
                ForEach(sectionedItems) { section in
                    Section(header: Text(Category.nameForId(section.id, context: viewContext)).font(.headline)) {
                        ForEach(section) { item in
                            NavigationLink(destination: ItemInputView(item: item, category: Category.get(identifier: item.category, context: viewContext))) {
                                Text(item.getName())
                            }
                        }
                    }
                }
            }
            .navigationTitle("Foo")
            .sheet(isPresented: $isAddingItem) {
                ItemInputView(item: nil, category: Category.getDefault(context: viewContext))
            }
        }
        .navigationViewStyle(.stack)
    }
}
Using the code below to fetch data for multiple URLs, I encounter a number of errors such as:
2021-12-01 20:20:32.090690-0500 foo[63170:6750061] [assertion] Error acquiring assertion: <Error Domain=RBSAssertionErrorDomain Code=2 "Specified target process does not exist" UserInfo={NSLocalizedFailureReason=Specified target process does not exist}>
2021-12-01 20:20:32.115662-0500 yerl[63170:6749861] [ProcessSuspension] 0x10baf8cc0 - ProcessAssertion: Failed to acquire RBS assertion 'ConnectionTerminationWatchdog' for process with PID=63200, error: Error Domain=RBSServiceErrorDomain Code=1 "target is not running or doesn't have entitlement com.apple.runningboard.assertions.webkit" UserInfo={NSLocalizedFailureReason=target is not running or doesn't have entitlement com.apple.runningboard.assertions.webkit}
I've added the following to my entitlements file:
<key>com.apple.security.network.client</key>
<true/>
with no change in the result. I gather that these are errors from a WKWebView but don't know how to resolve them.
@State private var metadataProvider: LPMetadataProvider?
...
metadataProvider?.startFetchingMetadata(for: url) { (linkMetadata, error) in
    guard let linkMetadata = linkMetadata, let imageProvider = linkMetadata.iconProvider else { return }
    imageProvider.loadObject(ofClass: UIImage.self) { (fetchedImage, error) in
        if let error = error {
            print(error.localizedDescription)
            return
        }
        if let uiimage = fetchedImage as? UIImage {
            DispatchQueue.main.async {
                let image = Image(uiImage: uiimage)
                self.image = image
                print("cache: miss \(url.absoluteString)")
                model.set(uiimage, for: url)
            }
        } else {
            print("no image available for \(url.absoluteString)")
        }
    }
}
Thanks in advance.
My RealityKit app uses an ARView with camera mode .nonAR. Later it puts another ARView with camera mode .ar on top of this.
When I apply layout constraints to the second view the program aborts with the following messages. If both views are of type .ar this doesn't occur; it happens only when the first view is .nonAR and the second is presented over it.
I have been unable so far to reproduce this behavior in a demo program to provide to you and the original code is complex and proprietary.
Does anyone know what is happening? I've seen other questions concerning this situation but not under the same circumstances.
2021-12-01 17:59:11.974698-0500 MyApp[10615:6672868] -[MTLTextureDescriptorInternal validateWithDevice:], line 1325: error ‘Texture Descriptor Validation
MTLTextureDescriptor has width (4294967295) greater than the maximum allowed size of 16384.
MTLTextureDescriptor has height (4294967295) greater than the maximum allowed size of 16384.
MTLTextureDescriptor has invalid pixelFormat (0).
’
-[MTLTextureDescriptorInternal validateWithDevice:]:1325: failed assertion `Texture Descriptor Validation
MTLTextureDescriptor has width (4294967295) greater than the maximum allowed size of 16384.
MTLTextureDescriptor has height (4294967295) greater than the maximum allowed size of 16384.
MTLTextureDescriptor has invalid pixelFormat (0).
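For context, the arrangement is roughly the following (an illustrative sketch only, it does not reproduce the crash; the view controller and view names are made up):

import RealityKit
import UIKit

class OverlayViewController: UIViewController {
    // First view: no camera feed.
    let backgroundView = ARView(frame: .zero, cameraMode: .nonAR, automaticallyConfigureSession: false)
    // Second view: regular AR, presented on top later.
    let cameraView = ARView(frame: .zero, cameraMode: .ar, automaticallyConfigureSession: true)

    override func viewDidLoad() {
        super.viewDidLoad()
        backgroundView.frame = view.bounds
        view.addSubview(backgroundView)
    }

    // Called later; applying constraints to the second view is what precedes the abort.
    func presentCameraView() {
        cameraView.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(cameraView)
        NSLayoutConstraint.activate([
            cameraView.leadingAnchor.constraint(equalTo: view.leadingAnchor),
            cameraView.trailingAnchor.constraint(equalTo: view.trailingAnchor),
            cameraView.topAnchor.constraint(equalTo: view.topAnchor),
            cameraView.bottomAnchor.constraint(equalTo: view.bottomAnchor)
        ])
    }
}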
I am subclassing ARView and I'm seeing the following crash on Firebase during initialization. Unfortunately I can't reproduce it locally, so I don't really know what is going on. All the devices are running iOS 15. Most of the devices have < 60 MB of available RAM, but not all; a few have more. It doesn't smell like a memory issue, but I mention it here... has anyone seen this?
It crashes in this thread:
Crashed: com.apple.root.default-qos
EXC_BAD_ACCESS KERN_INVALID_ADDRESS 0x0000000000000000
0  CoreRE  re::CompoundShape::~CompoundShape() + 168
1  CoreRE  re::CompoundShape::~CompoundShape() + 148
2  CoreRE  re::CollisionShapeAssetLoader::unloadAsset(void*) + 176
3  CoreRE  re::internal::AssetBackgroundLoader::unloadAsset(re::internal::AssetLoadItem&) + 168
4  CoreRE  re::internal::AssetBackgroundLoader::runIfNeeded(re::internal::AssetLoadItem&) + 88
after starting on the main thread:
com.apple.main-thread
0  ARKitCore   +[ARSession setRenderType:] + 1022
2  RealityKit  ARView.init(frame:cameraMode:automaticallyConfigureSession:) + 740
Hi, I'm embedding the QLPreviewController in a UIViewControllerRepresentable. When I view .usdz models I don't see the AR/Object selector at the top, nor the sharing button. I have tried presenting modally with a .sheet modifier and had the same result. What do I need to do to get the controls? Thanks, code attached.
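The wrapper boils down to something like this (a minimal sketch of what's described above; ModelPreview and the preview-item plumbing are simplified):

import SwiftUI
import QuickLook

struct ModelPreview: UIViewControllerRepresentable {
    let url: URL                       // local .usdz file

    func makeUIViewController(context: Context) -> QLPreviewController {
        let controller = QLPreviewController()
        controller.dataSource = context.coordinator
        return controller
    }

    func updateUIViewController(_ uiViewController: QLPreviewController, context: Context) {}

    func makeCoordinator() -> Coordinator { Coordinator(url: url) }

    class Coordinator: NSObject, QLPreviewControllerDataSource {
        let url: URL
        init(url: URL) { self.url = url }

        func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }

        func previewController(_ controller: QLPreviewController, previewItemAt index: Int) -> QLPreviewItem {
            url as NSURL               // NSURL conforms to QLPreviewItem
        }
    }
}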
Spiff
AR Quick Look has two modes:
Object Mode: one can view a model in an empty space with a ground plane and a shadow
AR Mode: one can view the model in an AR context, within a real environment
Does the developer have access to this functionality (moving between camera and non-camera modes)? I'm really asking whether the camera can be disabled and re-enabled in the same session.
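What I have in mind is something along these lines (a sketch; I'm assuming cameraMode can be switched on a live ARView, which is exactly what I'm asking about, and the function name is mine):

import RealityKit

// Toggle an existing ARView between the camera-backed and non-camera presentations.
func setObjectMode(_ objectMode: Bool, on arView: ARView) {
    if objectMode {
        arView.cameraMode = .nonAR
        arView.environment.background = .color(.white)     // plain backdrop instead of the camera feed
    } else {
        arView.cameraMode = .ar
        arView.environment.background = .cameraFeed()
    }
}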
Thanks
I'm recreating the ARQuickLook controller in code. One of its behaviors is to move the model to the visible center when entering Object mode. I've hacked the ARViewContainer of the default Xcode Augmented Reality App template to demonstrate what I'm trying to do.
I think that moving the entity to 0,0,0 will generally not do the right thing because the world origin will be elsewhere. What I'm not clear on is how to specify the translation for entity.move() in the code. I'm assuming I'll need to raycast using a CGPoint describing the view center to obtain the appropriate translation, but I'm not sure about the details (a sketch of what I mean follows the code below). Thanks for any help with this.
struct ARViewContainer: UIViewRepresentable {
    let arView = ARView(frame: .zero)
    let boxAnchor = try! Experience.loadBox()

    func makeUIView(context: Context) -> ARView {
        arView.scene.anchors.append(boxAnchor)
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {
        DispatchQueue.main.asyncAfter(deadline: .now() + 4) {
            arView.environment.background = .color(.white)
            arView.cameraMode = .nonAR
            if let entity = boxAnchor.children.first {
                let translation = SIMD3<Float>(x: 0, y: 0, z: 0)
                let transform = Transform(scale: .one, rotation: simd_quatf(), translation: translation)
                entity.move(to: transform, relativeTo: nil, duration: 2, timingFunction: .easeInOut)
            }
        }
    }
}
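For reference, the raycast I'm imagining would look something like this, taken from the view center (a sketch of my assumption above, not something I know to be correct; I'm not even sure a raycast is meaningful once the view is in .nonAR mode):

import ARKit
import RealityKit
import UIKit

// Hypothetical helper: raycast from the view center and move the entity to the hit point.
func moveToViewCenter(_ entity: Entity, in arView: ARView) {
    let center = CGPoint(x: arView.bounds.midX, y: arView.bounds.midY)
    let results = arView.raycast(from: center, allowing: .estimatedPlane, alignment: .any)
    guard let hit = results.first else { return }

    let translation = SIMD3<Float>(hit.worldTransform.columns.3.x,
                                   hit.worldTransform.columns.3.y,
                                   hit.worldTransform.columns.3.z)
    let transform = Transform(scale: .one, rotation: simd_quatf(), translation: translation)
    entity.move(to: transform, relativeTo: nil, duration: 2, timingFunction: .easeInOut)
}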
I'd like to know the location of the pivot point of a ModelEntity; is there any way to get this? Alternatively, can I get it from the USDZ file?
I want to place models in a specific location, and some models have the pivot in the center while others have it at the bottom. If I can detect this I can adjust accordingly and place them correctly. I don't have control over the models, alas.
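The heuristic I've been considering is to compare the entity's local bounds against its own origin, something like this (a sketch; the function name and tolerance are mine):

import RealityKit

// Rough check: bounds are taken relative to the entity itself, so if the
// bottom of the box sits near y == 0 the pivot is probably at the base,
// and if the box straddles y == 0 the pivot is probably in the center.
func pivotLooksCentered(_ entity: Entity) -> Bool {
    let bounds = entity.visualBounds(relativeTo: entity)
    let bottom = bounds.center.y - bounds.extents.y / 2
    return bottom < -0.1 * bounds.extents.y   // arbitrary tolerance
}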
Thanks,
Spiff
I'm loading a USDZ model using Entity.loadAsync(contentsOf:).
I'd like to get the dimensions of the model, and I find that visualBounds(relativeTo: nil).extents returns dimensions larger than the actual ones, while I see the correct dimensions when viewing the USDZ in Blender or when instantiating it as an MDLAsset(url:). What is the method to get the actual dimensions from an Entity?
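This is roughly how I'm comparing the two sets of numbers (a sketch; it loads synchronously just for the comparison, and the function name is mine):

import Foundation
import RealityKit
import ModelIO

// Compare the RealityKit bounds with the ModelIO bounds for the same USDZ.
func compareDimensions(of url: URL) throws {
    // RealityKit (synchronous load here just for the comparison; the app uses loadAsync).
    let entity = try Entity.load(contentsOf: url)
    let rkExtents = entity.visualBounds(relativeTo: nil).extents
    print("RealityKit extents: \(rkExtents)")

    // ModelIO
    let asset = MDLAsset(url: url)
    let box = asset.boundingBox
    print("ModelIO extents: \(box.maxBounds - box.minBounds)")
}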
Thanks
I'm implementing my first Entity Component System and am having an issue. I have a requirement that some component properties be dynamic. I do not want to create an Entity subclass that conforms to HasExampleComponent, so this was my approach. My issue is that even though the entity contains the component, I can't cast the entity to HasExampleComponent.
When I create the entity I set the component like this:
entity.components[ExampleComponent.self] = .init()
I'd appreciate a template for an ECS with component properties that can be updated from the app (I've put a sketch of the shape I'm after below the code).
Thanks
public struct ExampleComponent: Component {
    public var value = 0
}

public protocol HasExampleComponent: Entity {
    var value: Int { get set }
}

public class ExampleSystem: System {
    private static let query = EntityQuery(where: .has(ExampleComponent.self))

    public required init(scene: Scene) {}

    public func update(context: SceneUpdateContext) {
        context.scene.performQuery(Self.query).forEach { entity in
            // this won't work because entity doesn't conform to HasExampleComponent
            entity.value += 1
        }
    }
}

extension Entity {
    @available(iOS 15.0, *)
    public var value: Int? {
        get { (components[ExampleComponent.self] as? ExampleComponent)?.value ?? 0 }
        set {
            guard var component = components[ExampleComponent.self] as? ExampleComponent,
                  let newValue = newValue else { return }
            component.value = newValue
            components[ExampleComponent.self] = component
        }
    }
}
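For concreteness, this is the shape I think I'm after: keep the state in the component itself and read/write it through the components subscript, with no HasExampleComponent protocol at all (a sketch of what I believe should work, not something I've verified; the Counter names are illustrative):

import RealityKit

public struct CounterComponent: Component {
    public var value = 0
}

public class CounterSystem: System {
    private static let query = EntityQuery(where: .has(CounterComponent.self))

    public required init(scene: Scene) {}

    public func update(context: SceneUpdateContext) {
        context.scene.performQuery(Self.query).forEach { entity in
            // Read the value type, mutate it, and write it back.
            guard var counter = entity.components[CounterComponent.self] as? CounterComponent else { return }
            counter.value += 1
            entity.components[CounterComponent.self] = counter
        }
    }
}

// At app startup (iOS 15):
//   CounterComponent.registerComponent()
//   CounterSystem.registerSystem()
//
// From app code, the same subscript updates the property:
//   var counter = entity.components[CounterComponent.self] as? CounterComponent ?? CounterComponent()
//   counter.value = 42
//   entity.components[CounterComponent.self] = counter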
When using the RoomPlan UI (RoomCaptureView), one obtains the final result using
public func captureView(didPresent processedResult: CapturedRoom, error: Error?)
which then gets exported via
finalResults.export(to: url)
What is the best way to do this when only using RoomCaptureSession?
Should I just keep track of each CapturedRoom coming back in the delegate methods and use the final one?
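To be concrete, these are the two options I'm weighing, sketched together in one delegate (a sketch; I'm assuming RoomBuilder is the piece that turns the CapturedRoomData from didEndWith into a final CapturedRoom, and the class name is mine):

import Foundation
import RoomPlan

class SessionScanner: NSObject, RoomCaptureSessionDelegate {
    private var latestRoom: CapturedRoom?                       // option 1: last room seen in didUpdate
    private let builder = RoomBuilder(options: [.beautifyObjects])

    // Option 1: keep whatever the session last reported.
    func captureSession(_ session: RoomCaptureSession, didUpdate room: CapturedRoom) {
        latestRoom = room
    }

    // Option 2: build the final room from the raw data handed back when the session ends.
    func captureSession(_ session: RoomCaptureSession, didEndWith data: CapturedRoomData, error: Error?) {
        guard error == nil else { return }
        Task {
            let finalRoom = try await builder.capturedRoom(from: data)
            let url = FileManager.default.temporaryDirectory.appendingPathComponent("Room.usdz")
            try finalRoom.export(to: url)
        }
    }
}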