Post

Replies

Boosts

Views

Activity

Lack of AVCameraCalibrationData parity in RAW Photo workflows
I am requesting technical clarification and a formal feature addition regarding the availability of per-frame calibration data for standard RAW captures on the iPhone 17 and 17 Pro.

The Technical Gap: Currently, AVCaptureVideoDataOutput provides a direct path to the AVCameraCalibrationData object, allowing real-time access to the intrinsic matrix, radial/tangential distortion coefficients, and lens shading maps. However, this same level of geometric transparency is missing from the AVCapturePhoto RAW/ProRAW delegate. While standard RAW files contain some metadata, they lack the physics sidecar required for precise manual alignment. Because the lens assembly in the iPhone 17 Pro is dynamic, shifted constantly by Optical Image Stabilization (OIS) and high-speed voice coil motors (VCMs) for focus, a static factory calibration is mathematically insufficient for high-precision workflows.

The Problem: Without the 1:1 hardware state at the millisecond of exposure, we cannot perform accurate geometric reconstruction from RAW stills. We are forced to choose between the high dynamic range of a RAW sensor dump and the geometric precision of the video pipeline.

Final Questions: Is there a documented, supported method to force the inclusion of the AVCameraCalibrationData object, or a raw metadata sidecar, in the AVCapturePhoto workflow? If not, can Apple provide parity between the video and photo APIs, so that the "rawest" data (RAW) is accompanied by the "rawest" physics (calibration data)? Providing the pixels without the lens geometry limits the utility of the RAW format for any technical workflow requiring sub-pixel geometric integrity.
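For reference, the two delivery paths the post compares can be sketched as follows. This is a minimal sketch assuming an already-configured AVCaptureSession; the photo-path opt-in shown here exists in the SDK but is tied to virtual-device (multi-camera) captures, and whether it can ever light up for a single-camera RAW capture is exactly the question being asked.

```swift
import AVFoundation
import simd

final class CalibrationProbe: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {

    // Video path: opt in to per-frame intrinsics on the output's connection.
    func enableIntrinsics(on videoOutput: AVCaptureVideoDataOutput) {
        if let connection = videoOutput.connection(with: .video),
           connection.isCameraIntrinsicMatrixDeliverySupported {
            connection.isCameraIntrinsicMatrixDeliveryEnabled = true
        }
    }

    // The intrinsic matrix arrives as a CFData attachment wrapping a matrix_float3x3.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        if let data = CMGetAttachment(sampleBuffer,
                                      key: kCMSampleBufferAttachmentKey_CameraIntrinsicMatrix,
                                      attachmentModeOut: nil) as? Data {
            let intrinsics = data.withUnsafeBytes { $0.load(as: matrix_float3x3.self) }
            _ = intrinsics // fx, fy, cx, cy for this exact frame
        }
    }

    // Photo path: the existing opt-in. The guard matters, because enabling
    // delivery when the output does not support it raises an exception.
    func makeRawSettings(for photoOutput: AVCapturePhotoOutput,
                         rawFormat: OSType) -> AVCapturePhotoSettings {
        let settings = AVCapturePhotoSettings(rawPixelFormatType: rawFormat)
        if photoOutput.isCameraCalibrationDataDeliverySupported {
            settings.isCameraCalibrationDataDeliveryEnabled = true
        }
        return settings
    }
}

// In the AVCapturePhotoCaptureDelegate callback, photo.cameraCalibrationData
// holds the AVCameraCalibrationData instance, or nil when delivery is unsupported.
```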
0
0
76
9h
Compensating for IMU (accelerometer) thermal drift - getting device temperature?
I’m running into a hardware reality: MEMS sensor thermal drift. If a user zeroes out the tilt indoors at 20°C and then takes the phone outside into the cold, the accelerometer baseline shifts just enough as the device cools to throw off the readings. I want to apply a simple thermal compensation curve to the CoreMotion data to keep the "zero" perfectly level regardless of the weather. However, ProcessInfo.thermalState only gives broad buckets (nominal, fair, etc.), which doesn't help me calculate a continuous offset for a phone cooling down degree by degree. Is there any public API, or even a proxy metric, that can give me a rough battery or internal temperature reading? I don’t need high-resolution decimals, just a general device temperature to offset the hardware drift. Any undocumented tricks or proxy metrics anyone has used to handle this?
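Setting aside where the temperature number comes from (no public API exposes one), the compensation curve itself is straightforward. Here is a minimal sketch of a linear drift model; the struct, its fields, and the drift coefficient are all illustrative assumptions you would have to calibrate empirically per device, not anything from CoreMotion:

```swift
import Foundation

/// Illustrative linear drift model (not an Apple API): assumes the tilt offset
/// grows linearly with the temperature delta from the calibration point.
struct ThermalCompensator {
    var zeroTemperature: Double   // °C at the moment the user zeroed the tilt
    var driftPerDegree: Double    // tilt drift in radians per °C, measured empirically

    /// Subtracts the modeled thermal offset from a raw CoreMotion pitch reading.
    func compensate(rawPitch: Double, deviceTemperature: Double) -> Double {
        let offset = driftPerDegree * (deviceTemperature - zeroTemperature)
        return rawPitch - offset
    }
}

// Example: zeroed at 20 °C, device now at 0 °C. A drift of 0.0005 rad/°C
// pulls the baseline down by 0.01 rad, which compensate() adds back.
let comp = ThermalCompensator(zeroTemperature: 20, driftPerDegree: 0.0005)
let corrected = comp.compensate(rawPitch: -0.01, deviceTemperature: 0)
// corrected is 0.0 under this model
```

A piecewise-linear table (temperature breakpoints to measured offsets) works the same way if the drift turns out to be nonlinear across the device's operating range.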
1
0
154
Mar ’26