Okay, after finding this question and trying what it suggested, I made some progress. However, I am using arView.session.currentFrame.smoothedSceneDepth rather than arView.session.currentFrame.estimatedDepthData.
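For context, smoothedSceneDepth is an ARDepthData, so the pixel buffer I pass into the extension below comes from its depthMap property. Roughly how I set this up (a sketch, assuming a LiDAR device; the depth(at:in:) helper name is mine):

```swift
import ARKit
import RealityKit

/// Starts a session with smoothed scene depth enabled (LiDAR devices only).
func startDepthSession(on arView: ARView) {
    let config = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.smoothedSceneDepth) {
        config.frameSemantics.insert(.smoothedSceneDepth)
    }
    arView.session.run(config)
}

/// Reads a depth value (in meters) for a screen point from the current frame.
func depth(at point: CGPoint, in arView: ARView) -> Float? {
    // smoothedSceneDepth is ARDepthData; the Float32 values live in depthMap.
    guard let depthMap = arView.session.currentFrame?.smoothedSceneDepth?.depthMap else {
        return nil
    }
    return depthMap.value(from: point)
}
```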
Here is the updated extension:
extension CVPixelBuffer {
    func value(from point: CGPoint) -> Float? {
        let width = CVPixelBufferGetWidth(self)
        let height = CVPixelBufferGetHeight(self)
        let normalizedYPosition = ((point.y / UIScreen.main.bounds.height) * 1.3).clamped(0, 1.0)
        let colPosition = Int(normalizedYPosition * CGFloat(height))
        let rowPosition = Int((1 - (point.x / UIScreen.main.bounds.width)) * CGFloat(width) * 0.8)
        return value(column: colPosition, row: rowPosition)
    }

    func value(column: Int, row: Int) -> Float? {
        guard CVPixelBufferGetPixelFormatType(self) == kCVPixelFormatType_DepthFloat32 else { return nil }
        CVPixelBufferLockBaseAddress(self, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(self, .readOnly) }
        guard let baseAddress = CVPixelBufferGetBaseAddress(self) else { return nil }
        let width = CVPixelBufferGetWidth(self)
        let index = column + (row * width)
        let offset = index * MemoryLayout<Float>.stride
        return baseAddress.load(fromByteOffset: offset, as: Float.self)
    }
}
Note that point.y maps to the column position and point.x maps to the row position, so the buffer appears to be rotated 90° relative to the view.
I suspect there is some conversion between coordinate spaces that I should be doing but am unaware of.
To get this close to working, I had to multiply the normalized Y position by 1.3, multiply the X position by 0.8, and invert the X axis by subtracting it from 1.
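In case it helps, the conversion I may be missing is ARFrame.displayTransform(for:viewportSize:), which maps normalized image coordinates into normalized view coordinates; inverting it should account for both the rotation and the aspect-fill cropping that the hand-tuned 1.3 and 0.8 factors approximate. A sketch, assuming portrait orientation (bufferPosition(of:in:viewSize:depthMap:) is my own helper name):

```swift
import ARKit

/// Converts a point in view coordinates to a (column, row) position in the
/// depth buffer, using ARKit's own transform instead of hand-tuned factors.
func bufferPosition(of viewPoint: CGPoint,
                    in frame: ARFrame,
                    viewSize: CGSize,
                    depthMap: CVPixelBuffer) -> (column: Int, row: Int)? {
    // displayTransform maps normalized image coords -> normalized view coords,
    // so its inverse maps a view point back into image space.
    let transform = frame.displayTransform(for: .portrait, viewportSize: viewSize)
    let normalizedView = CGPoint(x: viewPoint.x / viewSize.width,
                                 y: viewPoint.y / viewSize.height)
    let normalizedImage = normalizedView.applying(transform.inverted())

    let width = CVPixelBufferGetWidth(depthMap)
    let height = CVPixelBufferGetHeight(depthMap)
    let column = Int(normalizedImage.x * CGFloat(width))
    let row = Int(normalizedImage.y * CGFloat(height))
    guard (0..<width).contains(column), (0..<height).contains(row) else { return nil }
    return (column, row)
}
```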
The app still consistently crashes on this line:
let value = baseAddress.load(fromByteOffset: offset, as: Float.self)
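The crash is consistent with the computed offset landing outside the buffer: nothing bounds-checks column or row, and the index math assumes each row is exactly width * 4 bytes, whereas pixel buffer rows can be padded. A hedged alternative that guards both, using CVPixelBufferGetBytesPerRow for the row stride (safeValue is my own name, not part of my extension above):

```swift
import CoreVideo

extension CVPixelBuffer {
    /// Bounds-checked Float32 read that respects the buffer's row padding.
    func safeValue(column: Int, row: Int) -> Float? {
        guard CVPixelBufferGetPixelFormatType(self) == kCVPixelFormatType_DepthFloat32,
              (0..<CVPixelBufferGetWidth(self)).contains(column),
              (0..<CVPixelBufferGetHeight(self)).contains(row) else { return nil }

        CVPixelBufferLockBaseAddress(self, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(self, .readOnly) }
        guard let baseAddress = CVPixelBufferGetBaseAddress(self) else { return nil }

        // Rows may be padded, so step by bytes-per-row, not width * stride.
        let bytesPerRow = CVPixelBufferGetBytesPerRow(self)
        let offset = row * bytesPerRow + column * MemoryLayout<Float>.stride
        return baseAddress.load(fromByteOffset: offset, as: Float.self)
    }
}
```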