I can't create any breakpoint in Xcode after upgrading to macOS 15.4
macOS: Version 15.4 (24E248)
visionOS Simulator: 2.3
Xcode: Version 16.2 (16C5032a)
My app runs fine without any breakpoints.
But as soon as I create one, it shows me this:
Couldn't find the Objective-C runtime library in loaded images.
Message from debugger: The LLDB RPC server has crashed. You may need to manually terminate your process. The crash log is located in ~/Library/Logs/DiagnosticReports and has a prefix 'lldb-rpc-server'. Please file a bug and attach the most recent crash log.
Hi Apple,
For real-time streaming of large 3D scenes in visionOS, I plan to create Metal buffers and textures on multiple background threads and then use a compute shader to copy those Metal resources into RealityKit, minimizing main-thread usage. Given that most of RealityKit's default APIs must run on the main actor (main thread), they are not ideal for streaming data. Is this approach the best way to handle streaming data and real-time rendering?
Thank you very much.
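One route on visionOS 2 is roughly the approach described above: decode and stage data on background threads, then hand RealityKit a short GPU copy instead of going through the main-actor resource APIs. A minimal sketch, assuming the visionOS 2 `LowLevelTexture` API in RealityKit; the type and helper names here (`StreamedTextureUpdater`, the staging texture) are hypothetical, and the exact `LowLevelTexture.Descriptor` initializer should be checked against the current SDK:

```swift
import Metal
import RealityKit

// Hypothetical helper: fill a staging MTLTexture off the main thread,
// then blit it into a RealityKit LowLevelTexture with a short GPU command.
final class StreamedTextureUpdater {
    let queue: MTLCommandQueue
    let lowLevelTexture: LowLevelTexture

    init(device: MTLDevice, width: Int, height: Int) throws {
        self.queue = device.makeCommandQueue()!
        // Descriptor fields are an assumption; verify against the SDK.
        let desc = LowLevelTexture.Descriptor(pixelFormat: .bgra8Unorm,
                                              width: width,
                                              height: height)
        self.lowLevelTexture = try LowLevelTexture(descriptor: desc)
    }

    // Safe to call from a background thread once `staging` is filled.
    func update(from staging: MTLTexture) {
        guard let commandBuffer = queue.makeCommandBuffer() else { return }
        // replace(using:) hands back the MTLTexture backing the
        // LowLevelTexture for the duration of this command buffer.
        let dest = lowLevelTexture.replace(using: commandBuffer)
        if let blit = commandBuffer.makeBlitCommandEncoder() {
            blit.copy(from: staging, to: dest)
            blit.endEncoding()
        }
        commandBuffer.commit()
    }
}
```

A `TextureResource` created once from the `LowLevelTexture` (via `TextureResource(from:)`) can then be assigned to a material on the main actor a single time; subsequent GPU-side updates through the command buffer should not require hopping back to the main thread.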
Hi
I have created a texture using CVMetalTextureCacheCreateTextureFromImage, which fills in an UnsafeMutablePointer<CVMetalTexture?> out-parameter, and I pass the result to the Metal shader using CVMetalTextureGetTexture. Used this way it appears to leak memory. What is the correct way to release the texture? I tried calling CVMetalTextureCacheFlush, but it didn't help.
Thanks
Chengliang
Part of code:
var texture: CVMetalTexture? = nil
let status = CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache, pixelBuffer, nil, pixelFormat, width, height, planeIndex, &texture)
...
encoder.setTexture(CVMetalTextureGetTexture(texture!), index: 0)
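A common cause of this leak pattern is the CVMetalTexture's lifetime: it must stay alive until the GPU has finished reading it, but if it is parked in a long-lived property it is never released, and CVMetalTextureCacheFlush cannot evict a texture that still has a strong reference. One sketch of the usual lifetime rule, assuming a per-frame command buffer and the same `textureCache`, `pixelBuffer`, `pixelFormat`, and `encoder` as in the snippet above:

```swift
var cvTexture: CVMetalTexture?
let status = CVMetalTextureCacheCreateTextureFromImage(
    kCFAllocatorDefault, textureCache, pixelBuffer, nil,
    pixelFormat, width, height, planeIndex, &cvTexture)
guard status == kCVReturnSuccess,
      let cvTexture,
      let mtlTexture = CVMetalTextureGetTexture(cvTexture) else { return }

encoder.setTexture(mtlTexture, index: 0)
// ... encode draw/dispatch calls, endEncoding() ...

// Capture the CVMetalTexture so it (and its backing pixel buffer) stays
// alive until the GPU is done; it is released when this closure runs.
commandBuffer.addCompletedHandler { _ in
    _ = cvTexture
}
commandBuffer.commit()
```

Dropping the strong reference only after command-buffer completion is what lets ARC and the texture cache actually reclaim the memory; flushing the cache while something like `self.texture` still holds the object has no effect.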
I am using the Frame Capture debugging tools. They work well for vertex shader functions, but I can't debug a compute function: the debug button is grayed out and shows "Unsupported: Post vertex transform data".