Reply to Use CoreImage filters on Vision Pro (visionOS) view
So far, there is no API in visionOS that gives developers access to the live camera feed. This is by design, most likely to protect the user's privacy: while on an iPhone you explicitly consent to sharing your surroundings with an app by pointing the camera at things, you can't really avoid doing so on Apple Vision Pro.
Topic: Spatial Computing SubTopic: ARKit
Jan ’24
Reply to CIImage.clampedToExtent() doesn't fill some edges
I guess it has to do with the affine transform you apply to the image before calling clampedToExtent(): as you can see, the output of the transform has a non-integer extent. This probably creates a row of transparent pixels at the top of the image, which clampedToExtent() then repeats outward. To avoid this, you can calculate the scale factors for your transform separately for x and y so that the resulting pixel size is integral. Alternatively, you can apply the clamping before you apply the affine transform.
Topic: Programming Languages SubTopic: Swift
Dec ’23
Reply to New option(.memoryTarget) in CIContextOption
From the comment in the header file: "A NSNumber that specifies the maximum memory footprint (in megabytes) that the CIContext allocates for render tasks. Larger values could increase memory footprint while smaller values could reduce performance." It basically sets (roughly) how much memory Core Image is allowed to use during rendering. From what I observed, the default seems to be 256 MB.
Topic: Programming Languages SubTopic: Swift
Nov ’23