Hi, so I dug a little into DrawableQueue and got it working. It seems really powerful, and there is apparently no measurable performance hit.
However, I've got one little issue: the rendered texture of my drawable looks a little too bright or oversaturated. I assume this is some kind of color-mapping issue?
My setup looks like the following:
First I set up my DrawableQueue:
let descriptor = TextureResource.DrawableQueue.Descriptor(
pixelFormat: .rgba8Unorm,
width: 1440,
height: 1440,
usage: .unknown,
mipmapsMode: .none
)
… let queue = try TextureResource.DrawableQueue(descriptor)
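The queue then replaces the contents of the texture resource my CustomMaterial samples. A minimal sketch, assuming textureResource was created earlier (e.g. from a placeholder image):

// Hand the queue to the texture resource so its drawables back the texture.
// textureResource is assumed to have been created beforehand.
textureResource.replace(withDrawables: queue)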
Next I set up the MTLRenderPipelineDescriptor and the MTLRenderPipelineState:
let pipelineDescriptor = MTLRenderPipelineDescriptor()
pipelineDescriptor.sampleCount = 1
pipelineDescriptor.colorAttachments[0].pixelFormat = .rgba8Unorm
pipelineDescriptor.depthAttachmentPixelFormat = .invalid
…
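The elided part is just the usual pipeline-state creation, sketched here with placeholder function names:

// Build the pipeline state from the default library.
// "imageVertex" and "imageFragment" are placeholder names for my quad shaders,
// and device is the MTLDevice.
let library = device.makeDefaultLibrary()
pipelineDescriptor.vertexFunction = library?.makeFunction(name: "imageVertex")
pipelineDescriptor.fragmentFunction = library?.makeFunction(name: "imageFragment")
renderPipelineState = try? device.makeRenderPipelineState(descriptor: pipelineDescriptor)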
Then, at each frame, I convert the current frame's pixel buffer into Metal textures, like in the ARKit with Metal Xcode sample:
guard
let drawable = try? drawableQueue.nextDrawable(),
let commandBuffer = commandQueue?.makeCommandBuffer(),
let renderPipelineState = renderPipelineState,
let frame = arView?.session.currentFrame
else {
return
}
// update vertex coordinates with display transform
updateImagePlane(frame: frame)
let pixelBuffer = frame.capturedImage
// convert captured image into metal textures
guard
CVPixelBufferGetPlaneCount(pixelBuffer) >= 2,
let capturedImageTextureY = createTexture(
fromPixelBuffer: pixelBuffer,
pixelFormat: .r8Unorm,
planeIndex: 0
),
let capturedImageTextureCbCr = createTexture(
fromPixelBuffer: pixelBuffer,
pixelFormat: .rg8Unorm,
planeIndex: 1
)
else {
return
}
let renderPassDescriptor = MTLRenderPassDescriptor()
renderPassDescriptor.colorAttachments[0].texture = drawable.texture
renderPassDescriptor.colorAttachments[0].loadAction = .load
renderPassDescriptor.colorAttachments[0].storeAction = .store
renderPassDescriptor.renderTargetHeight = textureResource.height
renderPassDescriptor.renderTargetWidth = textureResource.width
guard let renderEncoder = commandBuffer.makeRenderCommandEncoder(descriptor: renderPassDescriptor) else {
return
}
renderEncoder.pushDebugGroup("DrawCapturedImage")
renderEncoder.setCullMode(.none)
renderEncoder.setRenderPipelineState(renderPipelineState)
renderEncoder.setVertexBuffer(imagePlaneVertexBuffer, offset: 0, index: 0)
renderEncoder.setFragmentTexture(capturedImageTextureY, index: 1)
renderEncoder.setFragmentTexture(capturedImageTextureCbCr, index: 2)
renderEncoder.drawPrimitives(type: .triangleStrip, vertexStart: 0, vertexCount: 4)
renderEncoder.endEncoding()
commandBuffer.commit()
// TextureResource.Drawable is presented via its own present() method,
// not through the command buffer.
drawable.present()
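For reference, createTexture is the CVMetalTextureCache-based helper from the ARKit with Metal template; a minimal sketch, assuming textureCache was created once via CVMetalTextureCacheCreate:

// Wraps one plane of the pixel buffer as an MTLTexture.
// textureCache is assumed to be a CVMetalTextureCache created once with
// CVMetalTextureCacheCreate(nil, nil, device, nil, &textureCache).
func createTexture(fromPixelBuffer pixelBuffer: CVPixelBuffer,
                   pixelFormat: MTLPixelFormat,
                   planeIndex: Int) -> MTLTexture? {
    let width = CVPixelBufferGetWidthOfPlane(pixelBuffer, planeIndex)
    let height = CVPixelBufferGetHeightOfPlane(pixelBuffer, planeIndex)
    var cvTexture: CVMetalTexture?
    let status = CVMetalTextureCacheCreateTextureFromImage(
        nil, textureCache, pixelBuffer, nil,
        pixelFormat, width, height, planeIndex, &cvTexture)
    guard status == kCVReturnSuccess, let cvTexture = cvTexture else {
        return nil
    }
    return CVMetalTextureGetTexture(cvTexture)
}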
In the fragment shader for my quad-mapped texture, I perform the ycbcrToRGBTransform.
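That's the standard conversion from the same template; roughly:

#include <metal_stdlib>
using namespace metal;

// Rasterizer output struct carrying the quad's texture coordinate.
typedef struct {
    float4 position [[position]];
    float2 texCoord;
} ImageColorInOut;

fragment float4 capturedImageFragmentShader(
    ImageColorInOut in [[stage_in]],
    texture2d<float, access::sample> capturedImageTextureY [[texture(1)]],
    texture2d<float, access::sample> capturedImageTextureCbCr [[texture(2)]])
{
    constexpr sampler colorSampler(mip_filter::linear, mag_filter::linear, min_filter::linear);

    // BT.601 full-range YCbCr -> RGB conversion matrix.
    const float4x4 ycbcrToRGBTransform = float4x4(
        float4(+1.0000f, +1.0000f, +1.0000f, +0.0000f),
        float4(+0.0000f, -0.3441f, +1.7720f, +0.0000f),
        float4(+1.4020f, -0.7141f, +0.0000f, +0.0000f),
        float4(-0.7010f, +0.5291f, -0.8860f, +1.0000f)
    );

    float4 ycbcr = float4(capturedImageTextureY.sample(colorSampler, in.texCoord).r,
                          capturedImageTextureCbCr.sample(colorSampler, in.texCoord).rg,
                          1.0);
    return ycbcrToRGBTransform * ycbcr;
}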
Then finally in my CustomMaterial fragment shader I just sample the texture and display it:
#include <metal_stdlib>
#include <RealityKit/RealityKit.h>

using namespace metal;

constexpr sampler samplerBilinear(filter::linear);

[[visible]]
void cameraMappingSurfaceShader(realitykit::surface_parameters params)
{
    auto surface = params.surface();
    float2 uv = params.geometry().uv0();
    // Flip uvs vertically.
    uv.y = 1.0 - uv.y;
    half4 color = params.textures().custom().sample(samplerBilinear, uv);
    surface.set_emissive_color(color.rgb);
}
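On the Swift side, the material wiring looks roughly like this (a sketch; the metalLibrary and textureResource names are illustrative):

// Create the CustomMaterial from the surface shader above and bind the
// texture resource that the DrawableQueue backs.
var material = try CustomMaterial(
    surfaceShader: CustomMaterial.SurfaceShader(
        named: "cameraMappingSurfaceShader",
        in: metalLibrary
    ),
    lightingModel: .unlit
)
material.custom.texture = CustomMaterial.Texture(textureResource)
// ...then assign the material to the entity's model.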
Almost everything looks fine; it's just a slight difference in brightness.
Do I maybe need to work with a different pixel format?
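For example, would an sRGB variant make the difference? Something like (just a guess on my part):

// Guess: request an sRGB drawable format instead.
// (The pipeline's color attachment format would have to match.)
let descriptor = TextureResource.DrawableQueue.Descriptor(
    pixelFormat: .rgba8Unorm_srgb,
    width: 1440,
    height: 1440,
    usage: .unknown,
    mipmapsMode: .none
)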
As a test, I also loaded a simple image as a texture resource and then replaced it, via DrawableQueue and a Metal texture, with the same image. This gave me similar results (too bright).
Encoding the display transform matrix will be the next step, but for now I'd like to get this working properly.
Thanks for any help!