Reply to AVAssetWriter & AVTimedMetadataGroup in AVMultiCamPiP
Many thanks Greg!! For anyone else looking into this, the CMMetadataFormatDescription is a bit special and perhaps not very Swifty ;) Working example below:

// Add a metadata input
let metaSpec: NSDictionary = [
    kCMMetadataFormatDescriptionMetadataSpecificationKey_Identifier as NSString:
        AVMetadataIdentifier.quickTimeMetadataLocationNote,
    kCMMetadataFormatDescriptionMetadataSpecificationKey_DataType as NSString:
        kCMMetadataBaseDataType_UTF8
]
var metaFormat: CMFormatDescription? = nil
CMMetadataFormatDescriptionCreateWithMetadataSpecifications(
    allocator: kCFAllocatorDefault,
    metadataType: kCMMetadataFormatType_Boxed,
    metadataSpecifications: [metaSpec] as CFArray,
    formatDescriptionOut: &metaFormat)

let assetWriterMetaDataInput = AVAssetWriterInput(
    mediaType: .metadata,
    outputSettings: nil,
    sourceFormatHint: metaFormat)
assetWriterMetaDataInput.expectsMediaDataInRealTime = true
assetWriter.add(assetWriterMetaDataInput)
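For completeness, a rough, untested sketch of creating the adaptor from the input above and appending an AVTimedMetadataGroup. The note value, sampleTime and duration are placeholders; in practice you would use the presentation timestamps of the sample buffers you are already writing.

// Create the adaptor after the input has been added to the writer
let metadataAdaptor = AVAssetWriterInputMetadataAdaptor(assetWriterInput: assetWriterMetaDataInput)

// Build a metadata item matching the spec above
let item = AVMutableMetadataItem()
item.identifier = AVMetadataIdentifier.quickTimeMetadataLocationNote
item.dataType = kCMMetadataBaseDataType_UTF8 as String
item.value = "My location note" as NSString

// Placeholder time: use your current sample buffer's PTS instead
let sampleTime = CMTime(seconds: 0, preferredTimescale: 600)
let group = AVTimedMetadataGroup(
    items: [item],
    timeRange: CMTimeRange(start: sampleTime, duration: CMTime(value: 1, timescale: 1)))

if assetWriterMetaDataInput.isReadyForMoreMediaData {
    let appended = metadataAdaptor.append(group)
    // handle appended == false if needed
}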
Topic: Media Technologies SubTopic: Video Tags:
Dec ’24
Reply to AVAssetWriter & AVTimedMetadataGroup in AVMultiCamPiP
Hi Greg, thanks for taking the time to look at this. I can't find the proper way to get a continuously writing metadata adaptor going; I am unable to initialise it before startWriting(). If I declare them together with the other AVAssetWriterInputs like this:

private var assetWriterMetaDataInput: AVAssetWriterInput?
private var metadataAdaptor: AVAssetWriterInputMetadataAdaptor?

and then try to set them up like this:

// Add a metadata input
let assetWriterMetaDataInput = AVAssetWriterInput(
    mediaType: .metadata,
    outputSettings: nil,
    sourceFormatHint: AVTimedMetadataGroup().copyFormatDescription())
assetWriterMetaDataInput.expectsMediaDataInRealTime = true
assetWriter.add(assetWriterMetaDataInput)
metadataAdaptor = AVAssetWriterInputMetadataAdaptor(assetWriterInput: assetWriterMetaDataInput)

it crashes with the message "Cannot create a new metadata adaptor with an asset writer that does not carry a source format hint" the moment I start appending an AVTimedMetadataGroup. If there are any examples out there I could look at, please let me know! Thanks!
Topic: Media Technologies SubTopic: Video Tags:
Dec ’24
Reply to How to visualize 16bit raw image data
This is an Accelerate-based conversion for macOS, but maybe it will help you along:

func normaliseImage(inImage: CGImage?) -> CGImage? {
    // originalnsimage, originalColorSpace, bitsperPixel and bitsperComponent
    // are properties of the surrounding class.
    guard let cgImage = inImage ?? originalnsimage.cgImage(forProposedRect: nil, context: nil, hints: nil) else { return nil }
    guard let format = vImage_CGImageFormat(cgImage: cgImage) else { return nil }
    guard let sourceBuffer = try? vImage_Buffer(cgImage: cgImage, format: format) else { return nil }
    defer { sourceBuffer.free() }

    bitsperPixel = min(cgImage.bitsPerPixel, 32)
    bitsperComponent = min(cgImage.bitsPerComponent, 8)

    // Get a workable ARGB image format to convert into
    let argbFormat = vImage_CGImageFormat(
        bitsPerComponent: bitsperComponent,
        bitsPerPixel: bitsperPixel,
        colorSpace: originalColorSpace,
        bitmapInfo: CGBitmapInfo(rawValue: CGImageAlphaInfo.first.rawValue),
        renderingIntent: .defaultIntent)!

    guard var destinationBuffer = try? vImage_Buffer(
        width: Int(sourceBuffer.width),
        height: Int(sourceBuffer.height),
        bitsPerPixel: argbFormat.bitsPerPixel) else { return nil }
    defer { destinationBuffer.free() }

    if let toRgbConverter = try? vImageConverter.make(sourceFormat: format, destinationFormat: argbFormat) {
        try? toRgbConverter.convert(source: sourceBuffer, destination: &destinationBuffer)
    }

    let result = try? destinationBuffer.createCGImage(format: argbFormat)
    return result
}
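If your starting point is raw 16-bit grayscale pixels rather than an existing CGImage, here is a rough, untested sketch of wrapping them in a CGImage first so they can be fed to the function above. rawData, width and height are placeholders, and it assumes tightly packed little-endian UInt16 samples:

import CoreGraphics
import Foundation

// Wrap raw 16-bit grayscale pixel data in a CGImage
func makeCGImage16(from rawData: Data, width: Int, height: Int) -> CGImage? {
    let bytesPerRow = width * MemoryLayout<UInt16>.size
    guard let provider = CGDataProvider(data: rawData as CFData) else { return nil }
    return CGImage(
        width: width,
        height: height,
        bitsPerComponent: 16,
        bitsPerPixel: 16,
        bytesPerRow: bytesPerRow,
        space: CGColorSpaceCreateDeviceGray(),
        bitmapInfo: CGBitmapInfo(rawValue: CGImageAlphaInfo.none.rawValue | CGBitmapInfo.byteOrder16Little.rawValue),
        provider: provider,
        decode: nil,
        shouldInterpolate: false,
        intent: .defaultIntent)
}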
Topic: Media Technologies SubTopic: General Tags:
Mar ’24