Reply to How to Convert a MTLTexture into a TextureResource?
Hi Joe, it's involved, and I have not verified that I'm using all the best APIs. I made an effort to ensure that I did not make extra buffer copies. Your implementation may have a different optimal route depending on your texture source, but this shows the essence of working with the drawable queue.

func drawNextTexture(pixelBuffer: CVPixelBuffer) {
    guard let textureResource = textureResource else { return }
    guard let drawableQueue = drawableQueue else { return }
    guard let scalePipelineState = scalePipelineState else { return }
    guard let scalePipelineDescriptor = scalePipelineDescriptor else { return }
    guard let commandQueue = commandQueue else { return }
    guard let textureCache = textureCache else { return }

    let srcWidth = CVPixelBufferGetWidth(pixelBuffer)
    let srcHeight = CVPixelBufferGetHeight(pixelBuffer)

    autoreleasepool {
        var drawableTry: TextureResource.Drawable?
        do {
            drawableTry = try drawableQueue.nextDrawable() // may stall for up to 1 second
        } catch {
            print("Exception obtaining drawable: \(error)")
            return
        }
        guard let drawable = drawableTry else {
            return // no frame needed
        }
        guard let commandBuffer = commandQueue.makeCommandBuffer() else { return }

        var cvMetalTextureTry: CVMetalTexture?
        CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                  textureCache,
                                                  pixelBuffer,
                                                  nil,
                                                  .bgra8Unorm_srgb, // linear color; todo: try sRGB
                                                  srcWidth,
                                                  srcHeight,
                                                  0,
                                                  &cvMetalTextureTry)
        guard let cvMetalTexture = cvMetalTextureTry,
              let sourceTexture = CVMetalTextureGetTexture(cvMetalTexture) else {
            return
        }

        // Check whether the sizes match
        if srcWidth == textureResource.width && srcHeight == textureResource.height {
            // Sizes match: use a blit command encoder to copy the data to the drawable's texture.
            if let blitEncoder = commandBuffer.makeBlitCommandEncoder() {
                blitEncoder.copy(from: sourceTexture,
                                 sourceSlice: 0,
                                 sourceLevel: 0,
                                 sourceOrigin: MTLOrigin(x: 0, y: 0, z: 0),
                                 sourceSize: MTLSize(width: srcWidth, height: srcHeight, depth: 1),
                                 to: drawable.texture,
                                 destinationSlice: 0,
                                 destinationLevel: 0,
                                 destinationOrigin: MTLOrigin(x: 0, y: 0, z: 0))
                blitEncoder.endEncoding()
            }
        } else {
            // Sizes do not match: scale the source texture to fit the destination texture.
            let renderPassDescriptor = MTLRenderPassDescriptor()
            renderPassDescriptor.colorAttachments[0].texture = drawable.texture
            renderPassDescriptor.colorAttachments[0].loadAction = .clear
            renderPassDescriptor.colorAttachments[0].clearColor = MTLClearColorMake(0, 0, 0, 1) // clear to opaque black
            renderPassDescriptor.colorAttachments[0].storeAction = .store
            if let renderEncoder = commandBuffer.makeRenderCommandEncoder(descriptor: renderPassDescriptor) {
                renderEncoder.setRenderPipelineState(scalePipelineState)
                renderEncoder.setVertexBuffer(scaleVertexBuffer, offset: 0, index: 0)
                renderEncoder.setVertexBuffer(scaleTexCoordBuffer, offset: 0, index: 1)
                renderEncoder.setFragmentTexture(sourceTexture, index: 0)
                renderEncoder.drawPrimitives(type: .triangleStrip, vertexStart: 0, vertexCount: 4)
                renderEncoder.endEncoding()
            }
        }

        commandBuffer.present(drawable)
        commandBuffer.commit()
    }
}

Good luck.
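For context, the textureResource and drawableQueue used above have to be wired together once at setup time. A minimal sketch, assuming RealityKit's TextureResource.DrawableQueue API; the sizes, pixel format, and usage flags here are illustrative assumptions, not values from the original post:

```swift
import RealityKit

// Sketch: create a drawable queue and attach it to an existing TextureResource.
// Width, height, pixel format, and usage are assumed values for illustration.
let descriptor = TextureResource.DrawableQueue.Descriptor(
    pixelFormat: .bgra8Unorm_srgb,
    width: 1920,
    height: 1080,
    usage: [.renderTarget, .shaderRead],
    mipmapsMode: .none
)
let drawableQueue = try TextureResource.DrawableQueue(descriptor)

// Future contents of textureResource now come from the queue's drawables.
textureResource.replace(withDrawables: drawableQueue)
```

After this, each call to drawNextTexture(pixelBuffer:) pulls the next drawable from the queue and presents it into the texture resource.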
Topic: Graphics & Games SubTopic: General Tags:
Apr ’24
Reply to How to ship SDK RealityKit entity components that can be using and applied within a customer's application?
Thank you for the prompt response. I made this diagram. It shows the Component being defined in my SDK and shipped in source format. The customer would then copy the component source into their app's RCP sources directory. I will test this out soon. Two questions: First, is there a precedent for some SDK files to be shipped as source? (Yes, I guess this is like sample code.) Any recommended practices here to make this feel natural to developers? Second, can we make a feature request to allow package references, like my framework, to be added to an RCP package and have all the valid public components in the framework added to the RCP components UI? This would reduce the manual steps of app developers keeping framework components up to date. More profit!
Jun ’24
Reply to Convert SCNGeometryElement -> MTKMesh
How did this progress? Just seeing this many years later, and it still seems like SceneKit has a key limitation. Is finding an off-ramp to MetalKit a typical path?
Topic: Graphics & Games SubTopic: SceneKit Tags:
Dec ’22
Reply to How to force XCode 14.3 to install on Monterey
Will the steps in the first response cause my current version of Xcode to stop working? If so, how can I get the old version back?
Aug ’23
Reply to Suggested guidance for creating MV-HEVC video files?
Are there any updates on how to transcode or encode these files? I have stereo frames/video and want to encode them for viewing on Apple spatial video players.
Topic: Media Technologies SubTopic: Audio Tags:
Sep ’23
Reply to Reality Composer Pro | UnlitSurface and opacity maps error
@KevinTho any luck here? What did you learn or discover? Thanks in advance for sharing.
Mar ’24
Reply to Linear Interpolation
On my phone now, so apologies for the terse response. 1: Use three Remap nodes to interpolate each of R, G, and B. 2: Use a Combine3 node to merge the r, g, b values back into an RGB vector. I hope this helps.
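For anyone wiring this up later, the math a Remap node performs on one channel is just a linear interpolation between ranges. A sketch in plain Swift; the remap helper name is mine, not a Reality Composer Pro API:

```swift
// Sketch of the per-channel math behind a Remap node.
// remap is a hypothetical helper, not an RCP shader graph API.
func remap(_ x: Float, inMin: Float, inMax: Float,
           outMin: Float, outMax: Float) -> Float {
    let t = (x - inMin) / (inMax - inMin)  // normalize input to 0...1
    return outMin + t * (outMax - outMin)  // lerp into the output range
}

// Three independent remaps, then recombine — mirroring three Remap
// nodes feeding a Combine3 node in the shader graph.
let rgb: (r: Float, g: Float, b: Float) = (0.25, 0.5, 0.75)
let mapped = (r: remap(rgb.r, inMin: 0, inMax: 1, outMin: 0, outMax: 2),
              g: remap(rgb.g, inMin: 0, inMax: 1, outMin: 0, outMax: 2),
              b: remap(rgb.b, inMin: 0, inMax: 1, outMin: 0, outMax: 2))
```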
Topic: Graphics & Games SubTopic: General Tags:
Apr ’24
Reply to Animated texture or Sprite Sheet
You may be able to use a video material with a .mov file.
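A minimal sketch of that idea, assuming RealityKit's VideoMaterial driven by an AVPlayer; the asset name "sprites.mov" is a placeholder:

```swift
import RealityKit
import AVFoundation

// Sketch: animate a texture by driving a VideoMaterial with an AVPlayer.
// "sprites.mov" is a placeholder asset name, not from the original post.
guard let url = Bundle.main.url(forResource: "sprites", withExtension: "mov") else {
    fatalError("missing video asset")
}
let player = AVPlayer(url: url)
let material = VideoMaterial(avPlayer: player)

// Apply the video material to a simple plane and start playback.
let plane = ModelEntity(mesh: .generatePlane(width: 1, height: 1),
                        materials: [material])
player.play()
```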
Apr ’24
Reply to how to show spatial photo on my Application
Any solution here?
Topic: Spatial Computing SubTopic: General Tags:
Apr ’24
Reply to Metal stereo shader on Vision Pro
Any updates here? I would like a way to affect stereo layers in a Metal shader on a SwiftUI view.
Topic: Graphics & Games SubTopic: General Tags:
Apr ’24
Reply to Dev documentation search is not accurate/complete
Submitted: https://feedbackassistant.apple.com/feedback/13902697
Jun ’24
Reply to How to ship SDK RealityKit entity components that can be using and applied within a customer's application?
Adding an updated diagram to clearly show that the BubbleComponent Swift file is copied into the RCP package for the app. I hope this info is helpful to others in the future.
Jun ’24
Reply to VisionOS RealityKit
I was able to solve this using the model sort order component. I can now render spatially augmented 3D lines together with my stereoscopic texture content.
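For anyone landing here, the component in question appears to be RealityKit's ModelSortGroupComponent. A minimal sketch of applying it; the entity names are placeholders, not from the original post:

```swift
import RealityKit

// Sketch: force a draw order between two models using a sort group.
// backdropEntity and linesEntity are placeholder entities.
let sortGroup = ModelSortGroup()
backdropEntity.components.set(
    ModelSortGroupComponent(group: sortGroup, order: 0)) // drawn first
linesEntity.components.set(
    ModelSortGroupComponent(group: sortGroup, order: 1)) // drawn on top
```

Lower order values in the same group render earlier, so the lines end up composited over the stereoscopic content.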
Topic: Graphics & Games SubTopic: General Tags:
Jul ’24
Reply to How/where to encode MV-HEVC stereo video?
It is worth noting that the Apple MV-HEVC decoder does not currently support alpha. I ended up making my own video player to get spatial video with alpha.
Topic: Spatial Computing SubTopic: General Tags:
Jul ’24
Reply to How can I access Model3D entity to set shader material effects
I did find a solution for this. I generated my USD code, including my shader graph, and then use that USD text to load my Model3D. It works pretty well, and I now have a scrollable list of stereo images that looks really good on Vision Pro.
Topic: Spatial Computing SubTopic: General Tags:
Aug ’24