
Reply to AVCaptureVideoDataOutput stops delivering frames in iOS 14
Found it: I'm getting the attachments (metadata) of the incoming sample buffers and accidentally leaked the dictionary due to wrong bridging. So instead of this:
NSDictionary* attachments = (__bridge NSDictionary* _Nullable)CMCopyDictionaryOfAttachments(NULL, sampleBuffer, kCMAttachmentModeShouldPropagate);
I should have done this:
NSDictionary* attachments = (__bridge_transfer NSDictionary* _Nullable)CMCopyDictionaryOfAttachments(NULL, sampleBuffer, kCMAttachmentModeShouldPropagate);
Interestingly, this leak caused the capture session to stop delivering new sample buffers after 126 frames, without any warning, error, or notification.
Topic: App & System Services SubTopic: Core OS Tags:
Sep ’20
Reply to How to extract the individual sub images from an heic image
I'm not sure if you are able to access sub-images using NSImage. However, you should be able to do so with a CGImageSource:
guard let source = CGImageSourceCreateWithURL(newURL as CFURL, nil) else { return }
let numSubImages = CGImageSourceGetCount(source)
for i in 0..<numSubImages {
    guard let subImage = CGImageSourceCreateImageAtIndex(source, i, nil) else { continue }
    // subImage is a CGImage; you can convert it to an NSImage if you prefer:
    let nsImage = NSImage(cgImage: subImage, size: NSZeroSize)
    // handle image...
}
Topic: Programming Languages SubTopic: Swift Tags:
Feb ’21
Reply to How to correctly handle HDR10 in custom Metal Core Image Kernel?
In your kernel, colors are usually normalized to [0.0 ... 1.0], based on the underlying color space. So even if values are stored as 10-bit integers in a texture, your shader will get them as normalized floats. I emphasized the color space above because it is used when translating the colors from the source into those normalized values. When you are using the default sRGB color space, the wide gamut of the HDR source doesn't fit into the sRGB [0.0 ... 1.0] range. That's why you may get values outside that range in your kernel. This is actually useful in most cases because most filter operations that are designed for sRGB still work then. The color invert example above, however, does not.
You have two options here that I know of:
The first is to change the workingColorSpace of the CIContext you are using to the HDR color space of the input:
let ciContext = CIContext(options: [.workingColorSpace: CGColorSpace(name: CGColorSpace.itur_2020)!])
Then all color values should fall within [0.0 ... 1.0] in your kernel, where 0.0 is the darkest HDR color value and 1.0 is the brightest. You can safely perform the inversion with 1.0 - x then. However, keep in mind that some other filters will then not produce the correct result because they assume the input to be in (linear) sRGB, which is Core Image's default.
The second option is to convert ("color match") the input into the correct color space before passing it into your kernel, and back to working space again before returning:
return kernelOutput.matchedToWorkingSpace(from: colorSpace)
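A minimal sketch of that second option could look like this (hdrKernel, inputImage, and colorSpace are just placeholder names; colorSpace would be the input's HDR space, e.g. CGColorSpace(name: CGColorSpace.itur_2020)!):
import CoreImage

func applyHDRKernel(to inputImage: CIImage, in colorSpace: CGColorSpace, using hdrKernel: CIColorKernel) -> CIImage? {
    // Convert from Core Image's working space into the image's own HDR color space...
    guard let matchedInput = inputImage.matchedFromWorkingSpace(to: colorSpace) else { return nil }
    // ...run the kernel on values normalized in that space...
    guard let kernelOutput = hdrKernel.apply(extent: matchedInput.extent, arguments: [matchedInput]) else { return nil }
    // ...and match back into working space before handing the result on.
    return kernelOutput.matchedToWorkingSpace(from: colorSpace)
}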
Topic: App & System Services SubTopic: Core OS Tags:
Apr ’21
Reply to How can I extract the data from the output image of CIAreaHistogramFilter
I don't exactly know why you are getting all zeros here, but there is a much simpler way to access the data using CIContext.render(_:toBitmap:...). Please check out the implementation (https://github.com/DigitalMasterpieces/CoreImageExtensions/blob/main/Sources/CIContext%2BValueAccess.swift) in this small helper package I wrote (https://github.com/DigitalMasterpieces/CoreImageExtensions), which contains useful Core Image extensions.
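A rough sketch of that approach could look like this (context, histogramImage, and binCount are placeholder names; CIAreaHistogram produces a 1-pixel-high image with one pixel per bin):
import CoreImage

let binCount = 256
var values = [Float](repeating: 0, count: binCount * 4)   // RGBA per bin
values.withUnsafeMutableBytes { buffer in
    // Render the 1 x binCount histogram image directly into the float buffer.
    context.render(histogramImage,
                   toBitmap: buffer.baseAddress!,
                   rowBytes: binCount * 4 * MemoryLayout<Float>.size,
                   bounds: CGRect(x: 0, y: 0, width: binCount, height: 1),
                   format: .RGBAf,
                   colorSpace: nil)   // nil avoids any color matching of the raw counts
}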
Topic: Media Technologies SubTopic: General Tags:
Apr ’21
Reply to Save bytes-edited image on specific photo album as-is
I'm not totally sure, but I think it's not possible to just add some random data to the end of an image file like this. Photos has its own database for storing the images, and I guess they perform some kind of sanitizing/cleaning when adding new entries.
You can add custom data to a PHAsset that is stored alongside the image using PHAdjustmentData, but this is meant for storing "a description of the edits made to an asset's photo, video, or Live Photo content, which allows your app to reconstruct or revert the effects of prior editing sessions." So you would be able to read this data back, but only in an app that understands it. It won't be accessible when you just export the image out of Photos as a JPEG, for instance. And the amount of data you can store this way is also limited.
However, you might be able to store the data in the image's (EXIF) metadata somewhere. This seems to me the appropriate place.
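A minimal sketch of writing a custom string into an image file's EXIF metadata with ImageIO, before adding the file to the library, could look like this (the function and parameter names are just placeholders):
import Foundation
import ImageIO

func addCustomComment(from sourceURL: URL, to destinationURL: URL, comment: String) {
    guard let source = CGImageSourceCreateWithURL(sourceURL as CFURL, nil),
          let type = CGImageSourceGetType(source),
          let destination = CGImageDestinationCreateWithURL(destinationURL as CFURL, type, 1, nil) else { return }

    // Copy the existing properties and add the custom string to the EXIF dictionary.
    var properties = (CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any]) ?? [:]
    var exif = (properties[kCGImagePropertyExifDictionary] as? [CFString: Any]) ?? [:]
    exif[kCGImagePropertyExifUserComment] = comment
    properties[kCGImagePropertyExifDictionary] = exif

    // Re-write the (unmodified) image data together with the updated metadata.
    CGImageDestinationAddImageFromSource(destination, source, 0, properties as CFDictionary)
    CGImageDestinationFinalize(destination)
}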
Topic: Media Technologies SubTopic: General Tags:
Apr ’21