This looks like a weird Xcode 13 beta bug (still present in beta 5). Metal Core Image kernels fail to load from the library with this error:
2021-08-26 12:05:23.806226+0400 MetalFilter[23183:1751438] [api] +[CIKernel kernelWithFunctionName:fromMetalLibraryData:options:error:] Cannot initialize kernel with given library data.
[ERROR] Failed to create CIColorKernel: Error Domain=CIKernel Code=6 "(null)" UserInfo={CINonLocalizedDescriptionKey=Cannot initialize kernel with given library data.}
There is no such error with Xcode 12.5; the kernel loads fine. The failure occurs only with the Xcode 13 betas.
I have the following class in Swift:
public class EffectModel {
var type:String
var keyframeGroup:[Keyframe<EffectParam>] = []
}
public enum EffectParam<Value:Codable>:Codable {
case scalar(Value)
case keyframes([Keyframe<Value>])
public enum CodingKeys: String, CodingKey {
case rawValue, associatedValue
}
...
...
}
public class Keyframe<T:Codable> : Codable {
public var time:CMTime
public var property:String
public var value:T
enum CodingKeys: String, CodingKey {
case time
case property
case value
}
...
}
The problem is that the compiler doesn't accept the generic EffectParam and gives the error:
Generic parameter 'Value' could not be inferred
One way to solve the problem would be to redeclare the class EffectModel as
public class EffectModel<EffectParam: Codable>
But this class is used in so many other classes that I would need to add a generic parameter to every class that holds an EffectModel object, then to every class that uses those classes, and so on. That is not a workable solution for me. Is there another way to solve this in Swift using other language constructs (such as protocols)?
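One protocol-based way to keep EffectModel non-generic is type erasure: hide Value behind a protocol that exposes only what EffectModel actually needs. The sketch below makes simplifying assumptions: time is a Double instead of CMTime, EffectParamProtocol is a hypothetical name, and Codable conformance of the erased params is elided (encoding existentials needs extra machinery, e.g. concrete wrapper types).

```swift
import Foundation

// Hypothetical protocol: only the non-generic surface EffectModel needs.
public protocol EffectParamProtocol {
    var isKeyframed: Bool { get }
}

public class Keyframe<T: Codable>: Codable {
    public var time: Double   // simplified from CMTime for this sketch
    public var value: T
    public init(time: Double, value: T) {
        self.time = time
        self.value = value
    }
}

public enum EffectParam<Value: Codable>: EffectParamProtocol {
    case scalar(Value)
    case keyframes([Keyframe<Value>])

    public var isKeyframed: Bool {
        if case .keyframes = self { return true }
        return false
    }
}

// EffectModel stays non-generic: it stores existentials, so classes that
// contain an EffectModel do not need a generic parameter either.
public class EffectModel {
    var type: String
    var params: [EffectParamProtocol] = []
    init(type: String) { self.type = type }
}
```

Each element can still be a differently-typed EffectParam (Float, String, …) because only the protocol surface is visible through the array.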
The App Store version of my app works perfectly fine. But the moment I build the same code with Xcode 13, AVAssetWriter fails with errors right at the start, on both iOS 14 and iOS 15. This happens with a multicam session only.
Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (-12780)
I am wondering what could be wrong. I even passed the recommended default settings to AVAssetWriter.
Update: It turned out to be an issue with audio settings.
Hourly sales data has not updated in App Store Connect for the past three days. Two days ago, the same website open on different devices was showing different hourly sales figures, and now they all show zero sales. Is this a client-side problem or a server outage?
@AVFoundation Engineers, I hit this bug repeatedly when using AVComposition & AVVideoComposition: sometimes the AVPlayer seek-to-time completion handler is never called. I check a flag for whether a seek is in progress before placing another seek request, but if the completion handler is never invoked, all further seeks stall because the flag remains true. What is a reliable way to know a seek is no longer in progress before initiating another seek request?
playerSeeking = true
player.seek(to: time, toleranceBefore: .zero, toleranceAfter: .zero) { [weak self] completed in
if !completed {
NSLog("Seek not completed \(time.seconds)")
}
guard let self = self else {
return
}
self.playerSeeking = false
if self.player.rate == 0.0 {
self.updateButtonStates()
}
}
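One pattern that sidesteps a stuck flag is to coalesce seeks: remember only the most recently requested time, and start the next seek from the completion handler of the current one. The helper below is a hypothetical sketch (SeekCoordinator is not an AVFoundation type); the actual seek is injected as a closure so the same pattern can wrap player.seek(to:toleranceBefore:toleranceAfter:completionHandler:).

```swift
import Foundation

// Hypothetical helper: keeps at most one seek in flight and remembers only
// the latest requested target, chaining the next seek from each completion.
final class SeekCoordinator {
    private var isSeeking = false
    private var pendingTime: Double?
    private let performSeek: (Double, (Bool) -> Void) -> Void

    // performSeek would typically call player.seek(...) and forward its
    // completion flag.
    init(performSeek: @escaping (Double, (Bool) -> Void) -> Void) {
        self.performSeek = performSeek
    }

    func seek(to time: Double) {
        pendingTime = time          // coalesce: only the latest target matters
        startNextSeekIfIdle()
    }

    private func startNextSeekIfIdle() {
        guard !isSeeking, let target = pendingTime else { return }
        pendingTime = nil
        isSeeking = true
        performSeek(target) { [weak self] _ in
            guard let self = self else { return }
            self.isSeeking = false
            self.startNextSeekIfIdle() // pick up any seek requested meanwhile
        }
    }
}
```

This does not make a completion handler that truly never fires harmless, but because only the latest target is remembered, a late completion immediately drives the player to the newest requested time instead of leaving a stale Bool set forever.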
I see there is a deprecation warning when using detailTextLabel of UITableViewCell.
@available(iOS, introduced: 3.0, deprecated: 100000, message: "Use UIListContentConfiguration instead, this property will be deprecated in a future release.")
open var detailTextLabel: UILabel? { get } // default is nil. label will be created if necessary (and the current style supports a detail label).
But it is not clear how to use UIListContentConfiguration to replicate a detailTextLabel that sits on the right side of the cell. I only see secondaryText in UIListContentConfiguration, which is always displayed as a subtitle. How does one use UIListContentConfiguration as a replacement?
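For the right-aligned detail of the old .value1 style, UIListContentConfiguration has a valueCell() factory whose secondaryText is laid out on the trailing edge rather than as a subtitle. A minimal sketch (the configure(_:) helper is hypothetical; in practice this would go in tableView(_:cellForRowAt:)):

```swift
import UIKit

// Sketch: replace textLabel/detailTextLabel with a value-style content
// configuration. valueCell() places secondaryText on the trailing edge,
// like the old .value1 cell style did with detailTextLabel.
func configure(_ cell: UITableViewCell) {
    var content = UIListContentConfiguration.valueCell()
    content.text = "Title"
    content.secondaryText = "Detail"   // rendered on the right side
    cell.contentConfiguration = content
}
```

subtitleCell() is the analogue of the old .subtitle style, which is why secondaryText appears below the title with the default configurations.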
I have a UISceneConfiguration for the external screen, which is used when an external display is connected to the iOS device.
// MARK: UISceneSession Lifecycle
@available(iOS 13.0, *)
func application(_ application: UIApplication, configurationForConnecting connectingSceneSession: UISceneSession, options: UIScene.ConnectionOptions) -> UISceneConfiguration {
// Called when a new scene session is being created.
// Use this method to select a configuration to create the new scene with.
switch connectingSceneSession.role {
case .windowApplication:
return UISceneConfiguration(name: "Default Configuration", sessionRole: connectingSceneSession.role)
case .windowExternalDisplay:
return UISceneConfiguration(name: "External Screen", sessionRole: connectingSceneSession.role)
default:
fatalError("Unknown Configuration \(connectingSceneSession.role.rawValue)")
}
}
I display a custom view on the external screen this way, in a new UIScene tied to the external display. The problem is that if the app flow also includes an AVPlayerViewController, it no longer displays to the external screen. I suppose AVPlayerViewController normally does its own configuration for external-display playback, but now that I have a custom view embedded on the external screen, it is unable to take over. What do I need to do so that AVPlayerViewController can display content on the external screen the way it normally does?
@AVFoundation engineers, my users are seeing the following crash, which I cannot reproduce on my end.
*** -[__NSArrayI objectAtIndexedSubscript:]: index 2 beyond bounds [0 .. 1]
Fatal Exception: NSRangeException
0 CoreFoundation 0x99288 __exceptionPreprocess
1 libobjc.A.dylib 0x16744 objc_exception_throw
2 CoreFoundation 0x1a431c -[__NSCFString characterAtIndex:].cold.1
3 CoreFoundation 0x4c96c CFDateFormatterCreateStringWithAbsoluteTime
4 AVFCapture 0x6cad4 -[AVCaptureConnection getAvgAudioLevelForChannel:]
All I am doing is this:
func updateMeters() {
var channelCount = 0
var decibels:[Float] = []
let audioConnection = self.audioConnection
if let audioConnection = audioConnection {
for audioChannel in audioConnection.audioChannels {
decibels.append(audioChannel.averagePowerLevel)
channelCount = channelCount + 1
}
}
}
What am I doing wrong?
I see in Crashlytics few users are getting this exception when connecting the inputNode to mainMixerNode in AVAudioEngine:
Fatal Exception: com.apple.coreaudio.avfaudio
required condition is false:
format.sampleRate == hwFormat.sampleRate
Here is my code:
self.engine = AVAudioEngine()
let format = engine.inputNode.inputFormat(forBus: 0)
//main mixer node is connected to output node by default
engine.connect(self.engine.inputNode, to: self.engine.mainMixerNode, format: format)
I just want to understand how this error can occur and what the right fix is.
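A common explanation for this assertion is the hardware format changing out from under the engine (for example a route change to a headset with a different sample rate), so that a previously obtained format no longer matches the hardware. A hedged sketch, assuming the fix is to read the input format at connect time and rebuild the connection whenever the engine reports a configuration change (EngineController is a hypothetical wrapper, not an AVFoundation type):

```swift
import AVFoundation

final class EngineController {
    let engine = AVAudioEngine()
    private var observer: NSObjectProtocol?

    func start() throws {
        connectInput()
        // Rebuild the graph when the audio hardware configuration changes
        // (route change, sample-rate change, media services reset follow-up).
        observer = NotificationCenter.default.addObserver(
            forName: .AVAudioEngineConfigurationChange,
            object: engine,
            queue: .main
        ) { [weak self] _ in
            self?.connectInput()
            try? self?.engine.start()
        }
        try engine.start()
    }

    private func connectInput() {
        // Always read the format at connect time; never reuse a cached copy
        // taken before a possible route change.
        let format = engine.inputNode.inputFormat(forBus: 0)
        engine.connect(engine.inputNode, to: engine.mainMixerNode, format: format)
    }
}
```

The code in the post already queries the format right before connecting, so the remaining window would be a route change between those two lines or after engine creation, which the notification handler above is meant to cover.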
We are getting this error on iOS 16.1 beta 5 that we never saw in any earlier iOS version.
[as_client] AVAudioSession_iOS.mm:2374 Failed to set category, error: 'what'
I wonder if there is any known workaround. iOS 16 has been a nightmare: a lot of AVFoundation code breaks or becomes unpredictable in behaviour. This is a new issue introduced in iOS 16.1.
This seems like a new bug in iOS 16.1 (beta 5): AVMultiCamSession outputs silent audio frames when both the back and front mics have been added to it. The issue is not seen in iOS 16.0.3 or earlier. I can't reproduce it with the AVMultiCamPIP sample code, so I believe some AVAudioSession or AVMultiCamSession configuration in my code is causing it. Setting captureSession.usesApplicationAudioSession = true also fixes the issue, but then I do not get audio samples from both microphones.
Here is the code:
public func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection)
{
if let videoDataOutput = output as? AVCaptureVideoDataOutput {
processVideoSampleBuffer(sampleBuffer, fromOutput: videoDataOutput)
} else if let audioDataOutput = output as? AVCaptureAudioDataOutput {
processAudioSampleBuffer(sampleBuffer, fromOutput: audioDataOutput)
}
}
private var lastDumpTime:TimeInterval?
private func processAudioSampleBuffer(_ sampleBuffer: CMSampleBuffer, fromOutput audioDataOutput: AVCaptureAudioDataOutput) {
if lastDumpTime == nil {
lastDumpTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer).seconds
}
let time = CMSampleBufferGetPresentationTimeStamp(sampleBuffer).seconds
if time - lastDumpTime! >= 1.0 {
dumpAudioSampleBuffer(sampleBuffer)
lastDumpTime = time
}
}
private func dumpAudioSampleBuffer(_ sampleBuffer:CMSampleBuffer) {
NSLog("Dumping audio sample buffer")
var audioBufferList = AudioBufferList(mNumberBuffers: 1,
mBuffers: AudioBuffer(mNumberChannels: 0, mDataByteSize: 0, mData: nil))
var buffer: CMBlockBuffer? = nil
CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(sampleBuffer, bufferListSizeNeededOut: nil, bufferListOut: &audioBufferList, bufferListSize: MemoryLayout.size(ofValue: audioBufferList), blockBufferAllocator: nil, blockBufferMemoryAllocator: nil, flags: UInt32(kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment), blockBufferOut: &buffer)
// Create UnsafeBufferPointer from the variable length array starting at audioBufferList.mBuffers
withUnsafePointer(to: &audioBufferList.mBuffers) { ptr in
let buffers = UnsafeBufferPointer<AudioBuffer>(start: ptr, count: Int(audioBufferList.mNumberBuffers))
for buf in buffers {
// Create UnsafeBufferPointer<Int16> from the buffer data pointer
let numSamples = Int(buf.mDataByteSize)/MemoryLayout<Int16>.stride
let samples = buf.mData!.bindMemory(to: Int16.self, capacity: numSamples)
for i in 0..<numSamples {
NSLog("Sample \(samples[i])")
}
}
}
}
And here is the output:
Dump Audio Samples
The 24-hour report in App Store Connect Sales & Trends has been completely broken and unreliable for more than a month. When I submit a complaint, all they say is to clear cookies and caches and restart the browser. Either the complaint is not forwarded to engineering, or engineering is not acknowledging the issue.
I noticed on iOS 17 that calling AppTransaction.shared fails with the following errors. Here is the code:
let result: VerificationResult<AppTransaction> = try await AppTransaction.shared
NSLog("Result \(result)")
It never reaches the NSLog statement and instead produces the following error logs.
Error getting app transaction: Error Domain=ASDErrorDomain Code=500 "(null)" UserInfo={NSUnderlyingError=0x281815050 {Error Domain=AMSErrorDomain Code=301 "Invalid Status Code" UserInfo={NSLocalizedDescription=Invalid Status Code, AMSURL=https://mzstorekit-sb.itunes.apple.com/inApps/v1/receipts/createAppReceipt?REDACTED, AMSStatusCode=401, AMSServerPayload={
errorCode = 500317;
}, NSLocalizedFailureReason=The response has an invalid status code}}}
Received error that does not have a corresponding StoreKit Error:
Error Domain=AMSErrorDomain Code=301 "Invalid Status Code" UserInfo={NSLocalizedDescription=Invalid Status Code, AMSURL=https://mzstorekit-sb.itunes.apple.com/inApps/v1/receipts/createAppReceipt?REDACTED, AMSStatusCode=401, AMSServerPayload={
errorCode = 500317;
}, NSLocalizedFailureReason=The response has an invalid status code}
Received error that does not have a corresponding StoreKit Error:
Error Domain=ASDErrorDomain Code=500 "(null)" UserInfo={NSUnderlyingError=0x281815050 {Error Domain=AMSErrorDomain Code=301 "Invalid Status Code" UserInfo={NSLocalizedDescription=Invalid Status Code, AMSURL=https://mzstorekit-sb.itunes.apple.com/inApps/v1/receipts/createAppReceipt?REDACTED, AMSStatusCode=401, AMSServerPayload={
errorCode = 500317;
}, NSLocalizedFailureReason=The response has an invalid status code}}}
It seems AVAssetWriter is rejecting CVPixelBuffers with error -12743 when NSData is appended for kCVImageBufferAmbientViewingEnvironmentKey on HDR videos.
Here is my code:
var ambientViewingEnvironment:CMFormatDescription.Extensions.Value?
var ambientViewingEnvironmentData:NSData?
ambientViewingEnvironment = sampleBuffer.formatDescription?.extensions[.ambientViewingEnvironment]
let plist = ambientViewingEnvironment?.propertyListRepresentation
ambientViewingEnvironmentData = plist as? NSData
And then attaching this data,
CVBufferSetAttachment(dstPixelBuffer, kCVImageBufferAmbientViewingEnvironmentKey, ambientViewingEnvironmentData! as CFData, .shouldPropagate)
No matter what I do, including copying the attachment as-is from the source pixel buffer to the destination pixel buffer, the error remains:
var attachmentMode:CVAttachmentMode = .shouldPropagate
let attachment = CVBufferCopyAttachment(sourcePixelBuffer!, kCVImageBufferAmbientViewingEnvironmentKey, &attachmentMode)
NSLog("Attachment \(attachment!), mode \(attachmentMode)")
CVBufferSetAttachment(dstPixelBuffer, kCVImageBufferAmbientViewingEnvironmentKey, attachment!, attachmentMode)
I need to know if there is anything wrong in the way metadata is copied.
It looks like [[stitchable]] Metal Core Image kernels fail to get added to the default Metal library. Here is my code:
class FilterTwo: CIFilter {
var inputImage: CIImage?
var inputParam: Float = 0.0
static var kernel: CIKernel = { () -> CIKernel in
let url = Bundle.main.url(forResource: "default",
withExtension: "metallib")!
let data = try! Data(contentsOf: url)
let kernelNames = CIKernel.kernelNames(fromMetalLibraryData: data)
NSLog("Kernels \(kernelNames)")
return try! CIKernel(functionName: "secondFilter", fromMetalLibraryData: data) //<-- This fails!
}()
override var outputImage : CIImage? {
guard let inputImage = inputImage else {
return nil
}
return FilterTwo.kernel.apply(extent: inputImage.extent, roiCallback: { (index, rect) in
return rect }, arguments: [inputImage])
}
}
Here is the Metal code:
using namespace metal;
[[ stitchable ]] half4 secondFilter (coreimage::sampler inputImage, coreimage::destination dest)
{
float2 srcCoord = inputImage.coord();
half4 color = half4(inputImage.sample(srcCoord));
return color;
}
And here is the usage:
class ViewController: UIViewController {
override func viewDidLoad() {
super.viewDidLoad()
// Do any additional setup after loading the view.
let filter = FilterTwo()
filter.inputImage = CIImage(color: CIColor.red)
let outputImage = filter.outputImage!
NSLog("Output \(outputImage)")
}
}
And the output:
StitchableKernelsTesting/FilterTwo.swift:15: Fatal error: 'try!' expression unexpectedly raised an error: Error Domain=CIKernel Code=1 "(null)" UserInfo={CINonLocalizedDescriptionKey=Function does not exist in library data.}
Kernels []
reflect Function 'secondFilter' does not exist.
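A commonly reported cause of exactly this symptom (the kernel compiles, but kernelNames(fromMetalLibraryData:) returns an empty list) is that [[stitchable]] kernels require the Metal library to be linked against the Core Image framework. The usual fix, worth verifying against the Core Image documentation for your Xcode version, is a build setting on the app target:

```
Build Settings ▸ "Other Metal Linker Flags":
    -framework CoreImage
```

Without this flag the function is compiled into default.metallib but is not visible to CIKernel, which would match both the empty `Kernels []` output and the "Function 'secondFilter' does not exist" reflection error above.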