The native camera app on iPhone 15 can record videos directly to an external hard drive. Is there an API in the Photos framework to achieve the same?
This issue is driving me crazy. I load an NSAttributedString into a UITextView, and within moments of loading, the foregroundColor attribute of the text is erased (i.e. it becomes white) without me doing anything. Here is the code and the NSLog dump. How do I debug this?
import UIKit

class ScriptEditingView: UITextView, UITextViewDelegate {
    var defaultFont = UIFont.preferredFont(forTextStyle: .body)
    var defaultTextColor = UIColor.white

    // Both init paths funnel into commonInit() (elided in the original post).
    override init(frame: CGRect, textContainer: NSTextContainer?) {
        super.init(frame: frame, textContainer: textContainer)
        commonInit()
    }

    required init?(coder: NSCoder) {
        super.init(coder: coder)
        commonInit()
    }

    private func commonInit() {
        self.font = UIFont.preferredFont(forTextStyle: .body)
        self.allowsEditingTextAttributes = true
        self.textColor = defaultTextColor
        self.backgroundColor = UIColor.black
        self.isOpaque = true
        self.isEditable = true
        self.isSelectable = true
        self.dataDetectorTypes = []
        self.showsHorizontalScrollIndicator = false
    }
}
And then in my ViewController that contains the UITextView, I have this code:
textView = ScriptEditingView(frame: newTextViewRect, textContainer: nil)
textView.delegate = self
view.addSubview(textView)
textView.allowsEditingTextAttributes = true

let guide = view.safeAreaLayoutGuide
textView.translatesAutoresizingMaskIntoConstraints = false
NSLayoutConstraint.activate([
    textView.leadingAnchor.constraint(equalTo: guide.leadingAnchor),
    textView.trailingAnchor.constraint(equalTo: guide.trailingAnchor),
    textView.topAnchor.constraint(equalTo: view.topAnchor),
    textView.bottomAnchor.constraint(equalTo: view.bottomAnchor)
])

textView.attributedText = attributedString
NSLog("Attributed now")
dumpAttributesOfText()

DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) {
    NSLog("Attributes after 1 sec")
    self.dumpAttributesOfText()
}
And here is the code that dumps the attributes of the text:
private func dumpAttributesOfText() {
    guard let attributedText = textView.attributedText else { return }
    attributedText.enumerateAttributes(in: NSRange(location: 0, length: attributedText.length),
                                       options: .longestEffectiveRangeNotRequired) { dictionary, range, _ in
        NSLog(" range \(range)")
        if let font = dictionary[.font] as? UIFont {
            NSLog("Font at range \(range) - \(font.fontName), \(font.pointSize)")
        }
        if let foregroundColor = dictionary[.foregroundColor] as? UIColor {
            NSLog("Foregroundcolor \(foregroundColor) at range \(range)")
        }
        if let underline = dictionary[.underlineStyle] as? Int {
            NSLog("Underline \(underline) at range \(range)")
        }
    }
}
The logs show this:
2022-07-02 13:16:02.841199+0400 MyApp[12054:922491] Attributed now
2022-07-02 13:16:02.841370+0400 MyApp[12054:922491] range {0, 14}
2022-07-02 13:16:02.841486+0400 MyApp[12054:922491] Font at range {0, 14} - HelveticaNeue, 30.0
2022-07-02 13:16:02.841586+0400 MyApp[12054:922491] Foregroundcolor UIExtendedGrayColorSpace 1 1 at range {0, 14}
2022-07-02 13:16:02.841681+0400 MyApp[12054:922491] range {14, 6}
2022-07-02 13:16:02.841770+0400 MyApp[12054:922491] Font at range {14, 6} - HelveticaNeue, 30.0
2022-07-02 13:16:02.841855+0400 MyApp[12054:922491] Foregroundcolor kCGColorSpaceModelRGB 0.96863 0.80784 0.27451 1 at range {14, 6}
2022-07-02 13:16:03.934816+0400 MyApp[12054:922491] Attributes after 1 sec
2022-07-02 13:16:03.935087+0400 MyApp[12054:922491] range {0, 20}
2022-07-02 13:16:03.935183+0400 MyApp[12054:922491] Font at range {0, 20} - HelveticaNeue, 30.0
2022-07-02 13:16:03.935255+0400 MyApp[12054:922491] Foregroundcolor UIExtendedGrayColorSpace 1 1 at range {0, 20}
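One debugging idea I am considering (a sketch; NSTextStorageDelegate is standard UIKit API, but the wiring here is my assumption) is to watch the text storage for attribute edits, so a breakpoint in the callback reveals who is clobbering the color:

extension ScriptEditingView: NSTextStorageDelegate {
    // Fires after every edit to the backing text storage; checking the
    // edited mask isolates attribute (not character) changes.
    func textStorage(_ textStorage: NSTextStorage,
                     didProcessEditing editedMask: NSTextStorage.EditActions,
                     range editedRange: NSRange,
                     changeInLength delta: Int) {
        if editedMask.contains(.editedAttributes) {
            NSLog("Attributes edited in range \(editedRange)")
        }
    }
}

With textStorage.delegate = self set in commonInit(), a breakpoint in that callback should show the call stack of whatever resets the foreground color.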
I noticed that on iOS 17, calling AppTransaction.shared fails with the following errors. Here is the code:
let result: VerificationResult<AppTransaction> = try await AppTransaction.shared
NSLog("Result \(result)")
It never reaches the NSLog statement and instead produces the following error logs:
Error getting app transaction: Error Domain=ASDErrorDomain Code=500 "(null)" UserInfo={NSUnderlyingError=0x281815050 {Error Domain=AMSErrorDomain Code=301 "Invalid Status Code" UserInfo={NSLocalizedDescription=Invalid Status Code, AMSURL=https://mzstorekit-sb.itunes.apple.com/inApps/v1/receipts/createAppReceipt?REDACTED, AMSStatusCode=401, AMSServerPayload={
errorCode = 500317;
}, NSLocalizedFailureReason=The response has an invalid status code}}}
Received error that does not have a corresponding StoreKit Error:
Error Domain=AMSErrorDomain Code=301 "Invalid Status Code" UserInfo={NSLocalizedDescription=Invalid Status Code, AMSURL=https://mzstorekit-sb.itunes.apple.com/inApps/v1/receipts/createAppReceipt?REDACTED, AMSStatusCode=401, AMSServerPayload={
errorCode = 500317;
}, NSLocalizedFailureReason=The response has an invalid status code}
Received error that does not have a corresponding StoreKit Error:
Error Domain=ASDErrorDomain Code=500 "(null)" UserInfo={NSUnderlyingError=0x281815050 {Error Domain=AMSErrorDomain Code=301 "Invalid Status Code" UserInfo={NSLocalizedDescription=Invalid Status Code, AMSURL=https://mzstorekit-sb.itunes.apple.com/inApps/v1/receipts/createAppReceipt?REDACTED, AMSStatusCode=401, AMSServerPayload={
errorCode = 500317;
}, NSLocalizedFailureReason=The response has an invalid status code}}}
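The only workaround I can sketch (AppTransaction.refresh() is documented StoreKit 2 API; the retry strategy is mine, and whether it clears this sandbox 401 is unverified) is to catch the failure and fall back to an explicit refresh:

import StoreKit

func currentAppTransaction() async -> VerificationResult<AppTransaction>? {
    do {
        return try await AppTransaction.shared
    } catch {
        NSLog("AppTransaction.shared failed: \(error)")
        do {
            // refresh() forces a round trip to the App Store and may
            // prompt the user to authenticate.
            return try await AppTransaction.refresh()
        } catch {
            NSLog("AppTransaction.refresh() also failed: \(error)")
            return nil
        }
    }
}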
I see this error when presenting a UIActivityViewController with a video file:
[ShareSheet] Failed to request default share mode for fileURL:file:///var/mobile/Containers/Data/Application/B0EB55D3-4BF1-430A-92D8-2231AFFD9499/Documents/IMG-0155.mov error:Error Domain=NSOSStatusErrorDomain Code=-10814 "(null)" UserInfo={_LSLine=1538, _LSFunction=runEvaluator}
I don't understand whether I am doing something wrong or what the error means. The share sheet shows anyway.
I am planning to convert a paid app to freemium. I would like existing paid users to remain unaffected in this process. In this question, I am focusing on Volume Purchase users (both existing and future). The info on the Apple Developer website advises using the Original API for In-App Purchase if one needs to support Volume Purchase users:
You may need to use the Original API for in-app purchase for the following features, if your app supports them:
The Volume Purchase Program (VPP). For more information, see Device Management.
Does that mean (a) I can't use StoreKit 2 to verify receipts of volume purchases made before the app went freemium (to get the original purchase version and date), or (b) the API cannot be used to make in-app volume purchases, so users will not be able to make volume purchases from the App Store, or both?
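For context, this is the kind of StoreKit 2 lookup I would like to rely on (a sketch; originalAppVersion and originalPurchaseDate are documented AppTransaction properties, but whether they are populated for VPP purchases is exactly my question):

import StoreKit

// Sketch: read the original purchase info to decide whether a user
// bought the app before it went freemium.
func originalPurchaseInfo() async throws -> (version: String, date: Date)? {
    let shared = try await AppTransaction.shared
    guard case .verified(let appTransaction) = shared else { return nil }
    return (appTransaction.originalAppVersion, appTransaction.originalPurchaseDate)
}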
Dear AVKit Engineers,
I see a strange bug in AVKit. Even after the view controller hosting AVPlayerViewController is dismissed, I see CPU usage spiking to over 100%, caused by animation code that keeps running after the AVPlayerViewController no longer exists ([AVMobileChromelessControlsViewController __animateSliderToTintState:duration:completionHandler:]).
How does this code continue to run after the AVPlayerViewController is gone? And what can I do to fix it?
I have tried everything, but it seems impossible to get MTKView to display the full color range of an HDR CIImage made from a CVPixelBuffer (in 10-bit YUV format). Only built-in layers such as AVCaptureVideoPreviewLayer, AVPlayerLayer, and AVSampleBufferDisplayLayer are able to fully display HDR images on iOS. Is MTKView incapable of displaying the full BT.2020 HLG color range? Why does MTKView clip colors even if I set the pixel color format to bgra10_xr or bgra10_xr_srgb?
convenience init(frame: CGRect, contentScale: CGFloat) {
    self.init(frame: frame)
    contentScaleFactor = contentScale
}

convenience init(frame: CGRect) {
    let device = MetalCamera.metalDevice
    self.init(frame: frame, device: device)
    colorPixelFormat = .bgra10_xr
    self.preferredFramesPerSecond = 30
}

override init(frame frameRect: CGRect, device: MTLDevice?) {
    guard let device = device else {
        fatalError("Can't use Metal")
    }
    guard let cmdQueue = device.makeCommandQueue(maxCommandBufferCount: 5) else {
        fatalError("Can't make Command Queue")
    }
    commandQueue = cmdQueue
    context = CIContext(mtlDevice: device, options: [CIContextOption.cacheIntermediates: false])
    super.init(frame: frameRect, device: device)
    self.framebufferOnly = false
    self.clearColor = MTLClearColor(red: 0, green: 0, blue: 0, alpha: 0)
}
And here is the rendering code:
override func draw(_ rect: CGRect) {
    guard let image = self.image else {
        return
    }
    let dRect = self.bounds
    let targetSize = dRect.size
    let imageSize = image.extent.size

    // Aspect-fit the image into the view bounds.
    let scalingFactor = min(targetSize.width / imageSize.width, targetSize.height / imageSize.height)
    let scalingTransform = CGAffineTransform(scaleX: scalingFactor, y: scalingFactor)
    let translation = CGPoint(x: (targetSize.width - imageSize.width * scalingFactor) / 2,
                              y: (targetSize.height - imageSize.height * scalingFactor) / 2)
    let translationTransform = CGAffineTransform(translationX: translation.x, y: translation.y)
    let scalingTranslationTransform = scalingTransform.concatenating(translationTransform)
    let drawImage = image.transformed(by: scalingTranslationTransform)

    let commandBuffer = commandQueue.makeCommandBufferWithUnretainedReferences()
    guard let texture = self.currentDrawable?.texture else {
        return
    }

    var colorSpace: CGColorSpace
    if #available(iOS 14.0, *) {
        colorSpace = CGColorSpace(name: CGColorSpace.itur_2100_HLG)!
    } else {
        // Fallback on earlier versions
        colorSpace = drawImage.colorSpace ?? CGColorSpaceCreateDeviceRGB()
    }
    NSLog("Image \(colorSpace.name), \(image.colorSpace?.name)")

    context.render(drawImage, to: texture, commandBuffer: commandBuffer, bounds: dRect, colorSpace: colorSpace)
    commandBuffer?.present(self.currentDrawable!, afterMinimumDuration: 1.0 / Double(self.preferredFramesPerSecond))
    commandBuffer?.commit()
}
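One thing I am experimenting with (a sketch, not verified to fix the clipping) is giving the backing CAMetalLayer the same color space I render into, and opting into extended dynamic range where available:

// Sketch: tell Core Animation to interpret the drawable's pixels as
// BT.2100 HLG instead of assuming sRGB. wantsExtendedDynamicRangeContent
// is iOS 16+; whether this alone stops the clipping is unverified.
if let metalLayer = self.layer as? CAMetalLayer {
    metalLayer.colorspace = CGColorSpace(name: CGColorSpace.itur_2100_HLG)
    if #available(iOS 16.0, *) {
        metalLayer.wantsExtendedDynamicRangeContent = true
    }
}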
I understand that CVBufferSetAttachment simply appends a metadata attachment to the pixel buffer in a dictionary. But I see that no errors are raised when appending metadata that is contradictory in nature. For instance, for the sample buffers received from the camera in HDR mode, which are in 10-bit biplanar YUV 4:2:2 format, both of the following succeed:
CVBufferSetAttachment(testPixelBuffer!, kCVImageBufferColorPrimariesKey, kCVImageBufferColorPrimaries_ITU_R_2020, .shouldPropagate)
CVBufferSetAttachment(testPixelBuffer!, kCVImageBufferTransferFunctionKey, kCVImageBufferTransferFunction_ITU_R_2100_HLG, .shouldPropagate)
CVBufferSetAttachment(testPixelBuffer!, kCVImageBufferYCbCrMatrixKey, kCVImageBufferYCbCrMatrix_ITU_R_2020, .shouldPropagate)
Or
CVBufferSetAttachment(testPixelBuffer!, kCVImageBufferColorPrimariesKey, kCVImageBufferColorPrimaries_ITU_R_709_2, .shouldPropagate)
CVBufferSetAttachment(testPixelBuffer!, kCVImageBufferTransferFunctionKey, kCVImageBufferTransferFunction_ITU_R_709_2, .shouldPropagate)
CVBufferSetAttachment(testPixelBuffer!, kCVImageBufferYCbCrMatrixKey, kCVImageBufferYCbCrMatrix_ITU_R_709_2, .shouldPropagate)
So one could set BT.709 color primaries and transfer function on sample buffers that are actually 10-bit HDR. I see no errors when either kind of buffer is appended to AVAssetWriter. I am wondering how attachments actually work and how AVFoundation resolves such contradictions.
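To at least see which values a consumer would observe, the attachments can be read back (a sketch; CVBufferCopyAttachment is the iOS 15+ replacement for CVBufferGetAttachment):

// Sketch: inspect what is actually attached before the buffer reaches
// AVAssetWriter; per key, the most recent CVBufferSetAttachment wins.
if let primaries = CVBufferCopyAttachment(testPixelBuffer!, kCVImageBufferColorPrimariesKey, nil),
   let transfer = CVBufferCopyAttachment(testPixelBuffer!, kCVImageBufferTransferFunctionKey, nil) {
    NSLog("Attached primaries: \(primaries), transfer: \(transfer)")
}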
I am getting CMSampleBuffers in kCVPixelFormatType_422YpCbCr10BiPlanarVideoRange format from the camera. The pixel buffers are 10-bit HDR. I need to record using the ProRes 422 codec, but in a non-HDR format. I am not sure what a reliable way of doing this is, so I am reaching out here. What I did is simply set the AVAssetWriter compression settings dictionary as follows:
compressionSettings[AVVideoTransferFunctionKey] = AVVideoTransferFunction_ITU_R_709_2
compressionSettings[AVVideoColorPrimariesKey] = AVVideoColorPrimaries_ITU_R_709_2
compressionSettings[AVVideoYCbCrMatrixKey] = AVVideoYCbCrMatrix_ITU_R_709_2
It works, and the resulting recording shows the color space as HD 1-1-1 with the Apple ProRes codec. But I am not sure whether AVAssetWriter actually performed a color space conversion from 10-bit HDR to BT.709, or simply clipped the out-of-range colors.
I need to know a definitive way to achieve this. I see that Apple's native camera app does this, but I am not sure how.
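For reference, here is how I believe these keys are meant to be nested in the writer input's output settings (a sketch; the dimensions are placeholders, and AVVideoColorPropertiesKey is the documented container for the three color keys):

import AVFoundation

// Sketch: full output settings for a ProRes 422 writer input with
// explicit BT.709 color tagging. Width/height are placeholders.
let outputSettings: [String: Any] = [
    AVVideoCodecKey: AVVideoCodecType.proRes422,
    AVVideoWidthKey: 1920,
    AVVideoHeightKey: 1080,
    AVVideoColorPropertiesKey: [
        AVVideoColorPrimariesKey: AVVideoColorPrimaries_ITU_R_709_2,
        AVVideoTransferFunctionKey: AVVideoTransferFunction_ITU_R_709_2,
        AVVideoYCbCrMatrixKey: AVVideoYCbCrMatrix_ITU_R_709_2
    ]
]
let videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: outputSettings)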
I have the following code to connect the inputNode to the mainMixerNode of AVAudioEngine:
public func setupAudioEngine() {
    self.engine = AVAudioEngine()
    let format = engine.inputNode.inputFormat(forBus: 0)
    // The main mixer node is connected to the output node by default.
    engine.connect(self.engine.inputNode, to: self.engine.mainMixerNode, format: format)
    do {
        engine.prepare()
        try self.engine.start()
    } catch {
        print("error couldn't start engine")
    }
    engineRunning = true
}
But I am seeing a crash in the Crashlytics dashboard (which I can't reproduce):
Fatal Exception: com.apple.coreaudio.avfaudio
required condition is false: IsFormatSampleRateAndChannelCountValid(format)
Before calling setupAudioEngine, I make sure the AVAudioSession category is not .playback (where the mic is not available). The function is called where the audio route change notification is handled, and I check this condition specifically. Can someone tell me what I am doing wrong?
Fatal Exception: com.apple.coreaudio.avfaudio
0 CoreFoundation 0x99288 __exceptionPreprocess
1 libobjc.A.dylib 0x16744 objc_exception_throw
2 CoreFoundation 0x17048c -[NSException initWithCoder:]
3 AVFAudio 0x9f64 AVAE_RaiseException(NSString*, ...)
4 AVFAudio 0x55738 AVAudioEngineGraph::_Connect(AVAudioNodeImplBase*, AVAudioNodeImplBase*, unsigned int, unsigned int, AVAudioFormat*)
5 AVFAudio 0x5cce0 AVAudioEngineGraph::Connect(AVAudioNode*, AVAudioNode*, unsigned long, unsigned long, AVAudioFormat*)
6 AVFAudio 0xdf1a8 AVAudioEngineImpl::Connect(AVAudioNode*, AVAudioNode*, unsigned long, unsigned long, AVAudioFormat*)
7 AVFAudio 0xe0fc8 -[AVAudioEngine connect:to:format:]
8 MyApp 0xa6af8 setupAudioEngine + 701 (MicrophoneOutput.swift:701)
9 MyApp 0xa46f0 handleRouteChange + 378 (MicrophoneOutput.swift:378)
10 MyApp 0xa4f50 @objc MicrophoneOutput.handleRouteChange(note:)
11 CoreFoundation 0x2a834 __CFNOTIFICATIONCENTER_IS_CALLING_OUT_TO_AN_OBSERVER__
12 CoreFoundation 0xc6fd4 ___CFXRegistrationPost_block_invoke
13 CoreFoundation 0x9a1d0 _CFXRegistrationPost
14 CoreFoundation 0x408ac _CFXNotificationPost
15 Foundation 0x1b754 -[NSNotificationCenter postNotificationName:object:userInfo:]
16 AudioSession 0x56f0 (anonymous namespace)::HandleRouteChange(AVAudioSession*, NSDictionary*)
17 AudioSession 0x5cbc invocation function for block in avfaudio::AVAudioSessionPropertyListener(void*, unsigned int, unsigned int, void const*)
18 libdispatch.dylib 0x1e6c _dispatch_call_block_and_release
19 libdispatch.dylib 0x3a30 _dispatch_client_callout
20 libdispatch.dylib 0x11f48 _dispatch_main_queue_drain
21 libdispatch.dylib 0x11b98 _dispatch_main_queue_callback_4CF
22 CoreFoundation 0x51800 __CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__
23 CoreFoundation 0xb704 __CFRunLoopRun
24 CoreFoundation 0x1ebc8 CFRunLoopRunSpecific
25 GraphicsServices 0x1374 GSEventRunModal
26 UIKitCore 0x514648 -[UIApplication _run]
27 UIKitCore 0x295d90 UIApplicationMain
28 libswiftUIKit.dylib 0x30ecc UIApplicationMain(_:_:_:_:)
29 MyApp 0xc358 main (WhiteBalanceUI.swift)
30 ??? 0x104b1dce4 (Missing)
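Since the stack shows the connection being made from the route change handler, the only defensive sketch I can think of (my assumption: the input format can transiently report 0 Hz or 0 channels during a route change) is to validate the format before connecting:

// Sketch: bail out if the hardware format is not usable yet; the crash
// message points at IsFormatSampleRateAndChannelCountValid failing.
let format = engine.inputNode.inputFormat(forBus: 0)
guard format.sampleRate > 0, format.channelCount > 0 else {
    NSLog("Input format not ready (\(format)); skipping engine setup")
    return
}
engine.connect(engine.inputNode, to: engine.mainMixerNode, format: format)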
I have the following code to determine ProRes and HDR support on iOS devices.
extension AVCaptureDevice.Format {
    var supports10bitHDR: Bool {
        let mediaType = CMFormatDescriptionGetMediaType(formatDescription)
        let mediaSubtype = CMFormatDescriptionGetMediaSubType(formatDescription)
        return mediaType == kCMMediaType_Video && mediaSubtype == kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange
    }

    var supportsProRes422: Bool {
        let mediaType = CMFormatDescriptionGetMediaType(formatDescription)
        let mediaSubtype = CMFormatDescriptionGetMediaSubType(formatDescription)
        return mediaType == kCMMediaType_Video && mediaSubtype == kCVPixelFormatType_422YpCbCr10BiPlanarVideoRange
    }
}
On an iPad Pro (M2), supportsProRes422 returns false for all device formats (for the wide-angle camera). Is this a bug, or is it intentional?
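To double-check, I dump the raw FourCC of every format the device reports (a diagnostic sketch; the device lookup is an assumption about my capture setup, and per the check above the ProRes-capable format should appear as 'x422'):

import AVFoundation

// Sketch: print each format's pixel format FourCC for inspection.
func fourCCString(_ code: FourCharCode) -> String {
    let chars: [UInt8] = [
        UInt8((code >> 24) & 0xFF),
        UInt8((code >> 16) & 0xFF),
        UInt8((code >> 8) & 0xFF),
        UInt8(code & 0xFF)
    ]
    return String(bytes: chars, encoding: .ascii) ?? "????"
}

if let device = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back) {
    for format in device.formats {
        let subtype = CMFormatDescriptionGetMediaSubType(format.formatDescription)
        NSLog("Format \(fourCCString(subtype)): \(format)")
    }
}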
I got approval for the User Assigned Device Name capability, but I am now stuck on how to import it in Xcode. I successfully updated my App ID in the membership portal to reflect this permission, but I can't find anything in Xcode to import it. I opened Editor -> Add Capability, but there is no option to add User Assigned Device Name in the list!
I have an Xcode project that has multiple targets. In one of the targets (but not the others), I want to add Apple Watch support. How do I do that?
I have an AVAssetWriter, and I set the audio compression settings dictionary, checking it first with the canApply(outputSettings: audioCompressionSettings, forMediaType: .audio) API.
One of the fields in the compression settings is the audio sample rate, set via AVSampleRateKey. My question: if the sample rate I set in this key is different from the sample rate of the audio sample buffers that are appended, can this cause the audio to drift away from the video? Is setting an arbitrary sample rate in the asset writer settings not recommended?
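For reference, this is the shape of the settings I mean (a sketch; the values are placeholders, and matching AVSampleRateKey to the incoming buffers' rate is the conservative choice I am asking about):

import AVFoundation

// Sketch: AAC compression settings for an AVAssetWriterInput. Whether a
// mismatched AVSampleRateKey merely triggers resampling or causes drift
// is exactly the open question.
let audioCompressionSettings: [String: Any] = [
    AVFormatIDKey: kAudioFormatMPEG4AAC,
    AVSampleRateKey: 44_100,          // placeholder; ideally the buffers' rate
    AVNumberOfChannelsKey: 1,
    AVEncoderBitRateKey: 96_000
]
let audioInput = AVAssetWriterInput(mediaType: .audio, outputSettings: audioCompressionSettings)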
The 24-hour report in App Store Connect Sales & Trends has been completely broken and unreliable for more than a month. When I submit a complaint, all they say is to clear cookies and caches and restart the browser. Either the complaint is not forwarded to engineering, or engineering is not acknowledging the issue.