The background
I'm finally working to convert my very old Mac kaleidoscope application, ScopeWorks, which was written in OpenGL and Objective-C, to a Multiplatform app in SwiftUI and Metal.
I'm using the MetalKit MTKView class, wrapped for SwiftUI as an NSViewRepresentable or UIViewRepresentable, and I supply an MTKViewDelegate whose draw method fetches the current render pass descriptor, creates a command buffer, sets up a render pipeline, and does its drawing.
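Here's roughly what the macOS side of the wrapper looks like (the type names MetalView and Renderer are from my project; this is a simplified sketch, not the exact code):

```swift
import SwiftUI
import MetalKit

// macOS wrapper; the iOS version is the same shape with UIViewRepresentable.
// `Renderer` is my MTKViewDelegate.
struct MetalView: NSViewRepresentable {
    let renderer: Renderer

    func makeNSView(context: Context) -> MTKView {
        let view = MTKView()
        view.device = MTLCreateSystemDefaultDevice()
        view.delegate = renderer
        view.clearColor = MTLClearColor(red: 0, green: 0, blue: 0, alpha: 1)
        return view
    }

    func updateNSView(_ nsView: MTKView, context: Context) {}
}
```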
My renderer's makePipeline method looks like this:
func makePipeline() {
    let library = device.makeDefaultLibrary()
    let pipelineDesc = MTLRenderPipelineDescriptor()
    pipelineDesc.vertexFunction = library?.makeFunction(name: "vertex_main")
    pipelineDesc.fragmentFunction = library?.makeFunction(name: "fragment_main")
    pipelineDesc.colorAttachments[0].pixelFormat = .bgra8Unorm
    pipeline = try! device.makeRenderPipelineState(descriptor: pipelineDesc)
}
And my shaders look like this:
struct VertexOut {
    float4 position [[position]];
    float2 texCoord;
};

vertex VertexOut vertex_main(const device float2* position [[buffer(0)]],
                             uint vid [[vertex_id]]) {
    VertexOut out;
    float2 pos = position[vid];
    out.position = float4(pos, 0, 1);
    out.texCoord = pos * 0.5 + 0.5; // map [-1, 1] clip space to [0, 1] texture space
    return out;
}
fragment float4 fragment_main(VertexOut in [[stage_in]],
                              texture2d<float> tex [[texture(0)]],
                              constant float4& color [[buffer(1)]]) {
    constexpr sampler s(address::repeat, filter::linear);
    // (1, 2, 3, 4) is a sentinel value meaning "use the texture color, untinted".
    float4 textureColor = {1, 2, 3, 4};
    if (all(color == textureColor)) {
        return tex.sample(s, in.texCoord);
    } else {
        return color;
    }
}
The first part of my MTKViewDelegate's draw method looks like this:
func draw(in view: MTKView) {
    guard let drawable = view.currentDrawable,
          let descriptor = view.currentRenderPassDescriptor,
          let pipeline = pipeline,
          let texture = texture else { return }
    // Set the clear color before the descriptor is consumed by the encoder;
    // setting it after makeRenderCommandEncoder has no effect.
    descriptor.colorAttachments[0].clearColor = MTLClearColor(red: 0, green: 0, blue: 0, alpha: 1)
    let commandBuffer = commandQueue.makeCommandBuffer()!
    let encoder = commandBuffer.makeRenderCommandEncoder(descriptor: descriptor)!
    encoder.setRenderPipelineState(pipeline)
    encoder.setFragmentTexture(texture, index: 0)
    // Draw six equilateral triangles forming the hexagon.
    let radius: Float = 0.6
    for i in 0..<6 {
        let angleA = Float(i) * (.pi / 3)
        let angleB = Float(i + 1) * (.pi / 3)
        let verts: [simd_float2] = [
            simd_float2(0, 0),
            simd_float2(radius * cos(angleA), radius * sin(angleA)),
            simd_float2(radius * cos(angleB), radius * sin(angleB))
        ]
        encoder.setVertexBytes(verts, length: MemoryLayout<simd_float2>.stride * 3, index: 0)
        // Pass the sentinel that tells the fragment shader to use the texture color.
        var textureColor = simd_float4(1, 2, 3, 4)
        encoder.setFragmentBytes(&textureColor, length: MemoryLayout<simd_float4>.stride, index: 1)
        encoder.drawPrimitives(type: .triangle, vertexStart: 0, vertexCount: 3)
    }
    // ... remainder of the draw method elided ...
One of the things the existing app does is load PNG or TIFF images with an alpha channel, and then overlay parts of the image on top of themselves flipped, so you get interesting Moiré patterns in the lines in the resulting kaleidoscope.
For now I'm working on a single sample image, loading it into a texture in Metal, and just rendering it as a hexagon and drawing lines for the triangles that make up the hexagon. (For now I'm using the vertex coordinates as the texture coordinates, so I get a hexagonal part of my texture rather than a single triangular part tessellated into a hexagon. I'll fix that later.)
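For context, I load the sample image into the texture roughly like this (the option choices shown are my best guess at what's relevant here, not necessarily exactly what the real app does):

```swift
import MetalKit

// Rough sketch of my texture loading via MTKTextureLoader.
func loadTexture(device: MTLDevice, url: URL) throws -> MTLTexture {
    let loader = MTKTextureLoader(device: device)
    return try loader.newTexture(url: url, options: [
        .SRGB: false,          // treat the image data as linear
        .generateMipmaps: false
    ])
}
```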
In both the iOS and macOS versions I set the clear color to black at the beginning of the draw method.
The issue
The source image is mostly transparent, but with a lot of partly transparent pixels. Here's what it looks like in Photoshop, where you can see the transparent parts as a checkerboard pattern:
(I tried to crop the original image to show the approximate part that I'm rendering in a hexagon, but it's not exact. Look for the same shapes in the different images to compare them.)
When I render my hexagon in the Metal view in the iOS version of the app, it looks like it's forcing each pixel to fully opaque or fully transparent:
And in the macOS version of the app, it seems to force ALL the pixels to opaque:
I haven't shown all the setup code, because there's a lot of it. Is there some rendering mode setup I'm missing in order to get it to draw the pixels into the output based on their opacity, including partial opacity?