This is one of the files being looked for during initialization of the RoomPlan WWDC demo package, but it cannot be found since moving to iOS 18.0; it is not anywhere on the system since the upgrade.
The logged reference is:
2024-06-18 16:03:36.871062-0500 RoomPlanExampleApp[860:159744] [loading] Unable to create bundle at URL (file:///System/Library/CoreServices/SystemVersion.bundle): does not exist or not a directory (0)
Why do I get this error almost immediately on starting my rendering pass?
2024-05-29 20:02:22.744035-0500 RoomPlanExampleApp[491:10341] [] <<<< AVPointCloudData >>>> Fig assert: "_dataBuffer" at bail (AVPointCloudData.m:217) - (err=0)
2024-05-29 20:02:22.744455-0500 RoomPlanExampleApp[491:10341] [] <<<< AVPointCloudData >>>> Fig assert: "_dataBuffer" at bail (AVPointCloudData.m:217) - (err=0)
2024-05-29 20:05:54.079981-0500 RoomPlanExampleApp[491:10025] [CAMetalLayer nextDrawable] returning nil because allocation failed.
2024-05-29 20:05:54.080144-0500 RoomPlanExampleApp[491:10341] [] <<<< AVPointCloudData >>>> Fig assert: "_dataBuffer" at bail (AVPointCloudData.m:217) - (err=0)
I am running the RoomPlan demo app and keep getting the above errors. When I try to find the referenced archive in the Metal libraries, my searches come up blank: no files show up in a search that contain such identifiers. A number of messages about "deprecated" interfaces are also displayed. Is it normal to ship demo apps that are hobbled in this way?
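For now I am guarding the draw loop defensively, since the "[CAMetalLayer nextDrawable] returning nil" log suggests the drawable request itself is failing. A minimal sketch of what I mean, assuming a plain MTKViewDelegate renderer (the class and property names here are mine, not from the sample):

import MetalKit

final class Renderer: NSObject, MTKViewDelegate {
    private let commandQueue: MTLCommandQueue

    init?(device: MTLDevice) {
        guard let queue = device.makeCommandQueue() else { return nil }
        commandQueue = queue
        super.init()
    }

    func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) {}

    func draw(in view: MTKView) {
        // currentDrawable / nextDrawable can legitimately return nil
        // (allocation failure, too many drawables in flight), so skip
        // the frame instead of force-unwrapping.
        guard let drawable = view.currentDrawable,
              let descriptor = view.currentRenderPassDescriptor,
              let commandBuffer = commandQueue.makeCommandBuffer(),
              let encoder = commandBuffer.makeRenderCommandEncoder(descriptor: descriptor)
        else { return }

        // ... encode draw calls here ...
        encoder.endEncoding()
        commandBuffer.present(drawable)
        commandBuffer.commit()
    }
}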
When running a modified version of the RoomPlan demo I get frequent Session Interrupted conditions. Looking at the traces, I find a status of SensorDidPause on the interruption side of the error, but I am mystified as to how to determine which sensor paused and how to diagnose it. There appears to be a bitmap of available and active sensor devices in the sensor info passed with the session data on the error, and from the error status I can see that one or two of the motion sensors had a problem. How do I run further diagnostics on the cause of the error? I am also curious why the error occurred as soon as the AR session for my test started via the "session.run" call. The documentation in this area seems difficult to find. Attached are traces from running the test and stack dumps for the calls. Please send me guidance on how to proceed. The device in question is the iPad "iPhone(3)" that is attached to the Mac mini named "Hawkeye"; there is no known direct involvement of the Hawkeye system.
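For reference, this is the kind of diagnostic hook I am trying. A sketch assuming the public ARSessionObserver callbacks and ARError codes are the right place to look (the delegate class name is mine):

import ARKit

final class SessionDiagnostics: NSObject, ARSessionDelegate {
    func sessionWasInterrupted(_ session: ARSession) {
        // Fires when input pauses: backgrounding, Split View, the
        // camera being claimed by another process, and so on.
        print("AR session interrupted")
    }

    func sessionInterruptionEnded(_ session: ARSession) {
        print("AR session interruption ended")
    }

    func session(_ session: ARSession, didFailWithError error: Error) {
        // ARError codes are the public way to narrow down which
        // class of sensor had the problem.
        guard let arError = error as? ARError else { return }
        switch arError.code {
        case .sensorUnavailable:   print("a required sensor is unavailable")
        case .sensorFailed:        print("a required sensor failed")
        case .cameraUnauthorized:  print("camera permission was denied")
        case .worldTrackingFailed: print("world tracking failed")
        default:                   print("AR failure: \(arError.code)")
        }
        // The underlying NSError often carries more detail:
        print((error as NSError).userInfo[NSUnderlyingErrorKey] ?? "no underlying error")
    }
}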
It appears that when a class adopts multiple delegate protocols, like the following:

class RoomCaptureViewController: UIViewController,
    RoomCaptureViewDelegate, ARSCNViewDelegate,
    MTKViewDelegate, ARSessionDelegate, RoomCaptureSessionDelegate

the delivery of each message is governed by some priority-sensitive, order-based algorithm, and one message can be processed by only one delegate rather than being passed on to other delegates that lack the proper entry points. Specifically, I noted that changing the order seems to result in a delegate not getting a message that it should be seeing. Is there a "handoff" call that can be made after a delegate has seen a message but needs to pass it to another delegate for processing? This is a protocol typically used in interrupt handlers for PCIe and other messaging protocols, and I have not been able to find a similar capability in the voluminous documentation available for iOS and macOS. I would also like to know how a class dispatches a message to the particular delegate for which it was intended. Is there a detailed document explaining how the messaging protocol works that is not so fragmented as to require multiple monitors to be open in order to form a coherent picture of the messaging interface for the delegates belonging to a class?
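The closest thing to a "handoff" I have found is forwarding the call myself. A sketch, relying on ARSessionDelegate's methods being optional @objc requirements; the secondary handler here is hypothetical:

import ARKit

final class PrimaryHandler: NSObject, ARSessionDelegate {
    // Hypothetical second handler that also wants to see frames.
    private let secondary: ARSessionDelegate?

    init(secondary: ARSessionDelegate? = nil) {
        self.secondary = secondary
        super.init()
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // ... this object's own processing first ...

        // The "handoff" is just an ordinary method call: delegate
        // methods are optional @objc requirements, hence the `?`.
        secondary?.session?(session, didUpdate: frame)
    }
}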
Application fails almost immediately after initial entry to rendering code.
Could not locate file '.' in bundle.
Class for component already registered
Registering library () that already exists in shader manager. Library will be overwritten.
Resolving material name 'engine:BuiltinRenderGraphResources/AR/suFeatheringCreateMergedOcclusionMask.rematerial' as an asset path -- this usage is deprecated; instead provide a valid bundle
What additional packages need updates to run successfully? I have updated Xcode and the other packages available in the beta set.
The definition for polygonCorners given in the document at https://developer.apple.com/documentation/roomplan/capturedroom/surface/polygoncorners is woefully lacking. To validly link the sides of a polygon from the definition of a corner, there need to be incoming and outgoing line descriptors pointing to the adjacent corners of the polygon. Just as a line has two endpoints, the edges of a polygon have corners to which any corner may be linked, and in the case of a 2D polygon it is an ordered set, forwards and backwards. In the case of a 3D polygon, such a corner would likely have at least a third directional connection giving the depth of that point; in 3D geometry one point may connect to multiple points going in another direction, much like the facet cuts of a brilliant-cut diamond. This sibling relationship of the corner points would complicate the definition, but not impossibly so.
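To illustrate the point: if the corners really do arrive as an ordered ring, the incoming/outgoing links are implicit in the ordering. A sketch, assuming polygonCorners is an ordered array of simd_float3 points:

import simd

// Edge i runs from corner i to corner (i + 1) % n, closing the ring,
// so each corner's "outgoing" edge is the next index and its
// "incoming" edge is the previous one.
func edges(from corners: [simd_float3]) -> [(start: simd_float3, end: simd_float3)] {
    guard corners.count >= 2 else { return [] }
    return corners.indices.map { i in
        (corners[i], corners[(i + 1) % corners.count])
    }
}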
The dynamic library build for the Metal library fails when built from the downloaded copy: it uses the name of the download as the directory name, which contains spaces and does not yield a syntactically correct file name. Renaming the folder in which the build files reside seems to resolve that problem, but then the header files for the dynamic library get an access-denied error when compiling. Why are these demos released with such trivial problems? Surely someone has tried to run it before release, and either the demo should have been fixed or the short README should have been updated with instructions on how to set it up.
I am running a modified RoomPlan app in my test environment and I get two ARSessions active, sometimes more. It appears that the first one is created by SceneKit, because it is related to ARSCNView. Who controls that session, and what gets processed through it? I notice that I get a lot of session interruptions from sensor failures when I am doing world tracking, and the first one happens almost immediately.
When the room-capture delegates fire up, I start getting images delivered to the delegate via a second session that is collecting images. How do I tell, on the fly, which session is the SceneKit session and which is the RoomCapture session when a callback comes through the delegate? Is there a difference in the object descriptor that I can use as a differentiator? Relying on the address of the ARSession buffer being different is okay if you get your timing right. It wasn't clear from any of the documentation that there would be TWO or more AR sessions delivering data through the delegates, and the books on ARKit are not much help in determining the partition of responsibilities between the origins. The buffers arriving at the delegate functions carry no clear indication of which function is delivered through which delegate, at least none discernible from the highly fragmented documentation in the developer library. Can someone give me some guidance here? Are there sources of CLEAR documentation of what is delivered via which delegate for the various interfaces?
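The only differentiator I have found so far is object identity against sessions I already hold references to. A sketch, assuming both views are available and that RoomCaptureSession exposes its underlying ARSession (verify that property against your deployment target):

import ARKit
import RoomPlan
import SceneKit

final class SessionRouter: NSObject, ARSessionDelegate {
    private let sceneView: ARSCNView
    private let captureView: RoomCaptureView

    init(sceneView: ARSCNView, captureView: RoomCaptureView) {
        self.sceneView = sceneView
        self.captureView = captureView
        super.init()
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // ARSession has no name field; object identity (===) is the
        // practical differentiator between the two sessions.
        if session === sceneView.session {
            // Frame from the ARSCNView's own session.
        } else if session === captureView.captureSession.arSession {
            // Frame from the RoomPlan capture session.
        }
    }
}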
The latest Xcode, running with the macOS 14.5 developer beta, produces messages with a time and date in them. There are also some other fields of indeterminate origin and type.
"2024-05-06 15:37:32.383996-0500 RoomPlanExampleApp[24190:1708576] [CAMetalLayerDrawable texture] should not be called after already presenting this drawable. Get a nextDrawable instead."
Specifically, I need to know how the string [24190:1708576] relates to a location in my application so I can act on the message. I certainly can't find the text in the "[CAMetalLayerDrawable texture]" field anywhere in the user documentation OR the developer documentation. For a diagnostic message to be actionable and remedied by a user, it must identify the module and source line of the initiating code, and there must be accessible documentation explaining potential remedies. This interface fails to supply enough information to diagnose the problem. The label [CAMetalLayerDrawable texture] cannot even be found in a search of the package information attached to the Xcode release paired with the iOS and macOS system releases.
Below is a sample of error messages presented in the Xcode log during execution of a program. There is nothing in the messages that helps identify a component as the origin of the message, nor is there a locatable definition for the various labels and fields in the text. What component, or even framework, does this set of messages originate from? Your search engine is useless because it returns gibberish. It doesn't even follow the common behavior of search engines, because it takes label strings compounded from common words and searches for the common words instead of the concatenated string that is the internal variable name in the text.
2024-06-22 19:45:58.089943-0500 RoomPlanExampleApp[733:30145] [ClientDonation] (+[PPSClientDonation isRegisteredSubsystem:category:]) Permission denied: GenerativeFunctionMetrics / ANEInferenceOperationPrepareForEncode
2024-06-22 19:45:58.089967-0500 RoomPlanExampleApp[733:30145] [ClientDonation] (+[PPSClientDonation sendEventWithIdentifier:payload:]) Invalid inputs: payload={
aneModelPath = "/System/Library/PrivateFrameworks/RoomScanCore.framework/PrecompiledModels/lcnn_floorplan_model.bundle/H14G.bundle/main/segment_0__ane/net.hwx";
bundleIdentfier = "com.example.apple-samplecode.RoomPlanExampleApp9QSS565686";
}
2024-06-22 19:45:58.094770-0500 RoomPlanExampleApp[733:30145] [ClientDonation] (+[PPSClientDonation sendEventWithIdentifier:payload:]) Invalid inputs: payload={
bundleIdentfier = "com.example.apple-samplecode.RoomPlanExampleApp9QSS565686";
e5FunctionName = main;
numSegments = 1;
}
I am looking for a definition of the first error and a way to locate the context in which it occurs.
Precisely what is the difference between measurements from the LiDAR camera depth data and the depth data provided by the TrueDepth camera? Where is the camera selection done, so I can examine the code that sets up the LiDAR camera?
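My current understanding, as a hedged sketch: there is no explicit camera-selection call to examine; the configuration class implies the camera. World tracking runs the rear cameras, and opting into the .sceneDepth frame semantic adds LiDAR-derived depth to each frame, whereas ARFaceTrackingConfiguration is what drives the TrueDepth camera on the front:

import ARKit

// .sceneDepth populates ARFrame.sceneDepth with the LiDAR-derived
// depth map; face tracking (TrueDepth) is a separate configuration.
func makeWorldTrackingConfiguration() -> ARWorldTrackingConfiguration {
    let config = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        config.frameSemantics.insert(.sceneDepth)
    }
    return config
}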
I was happy to see the improvements on the capture side of RoomPlan, particularly the polygon handling for the walls. Unfortunately, walls with 5 vertices (great-room ceilings, garage exteriors, etc.) are projected in the viewers as having 4 vertices, with the top edge of the wall projected upward to the highest point between the two vertices leading to the peak. I presume from this that some assumed 4-vertex shape for all walls leads to use of the extent of the uppermost point rather than a projection of the lines to the point where they meet at the top. What viewer can be used that properly handles this misrepresentation of the imagery? I have imagery, but your upload only accepts certain file types and will not take a USDZ file, even retyped as a TXT file.
It appears that the Metal debugging interface does not support this method; at least, the function hashing algorithm does not have a pattern for it in the symbol dictionary as presented. Where do we get updated C libraries and functions that sync with what is presented in the demo kits and samples that Apple puts in the user domain? Why does this stuff get out into the wild insufficiently tested? It seems that the demo kits made available to users should be included in the test domain used to verify new code releases. I came from a development environment where the six-month release cycle involved automated execution of the test suite before anything went to beta or anywhere else.
Do I need an entitlement to use ARWorldTrackingConfiguration and get its data in the RoomPlan enhancements I am working on?
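For context, this is the minimal setup I am using, which to my knowledge needs only the NSCameraUsageDescription key in Info.plist (for the camera-permission prompt) rather than any entitlement:

import ARKit

let session = ARSession()
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = [.horizontal, .vertical]
// Requires NSCameraUsageDescription in Info.plist, but no special
// entitlement, as far as I can tell.
session.run(configuration)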