I need to implement a text editor using UITextView that supports:
Bold/Italic/Underline
Color, font, and font size changes
Paragraph alignment
List format (bullets, numbers, etc.)
Selecting text anywhere in the text view and changing its attributes
So far I have managed to do it without NSTextStorage, but it seems I am hitting limits. For instance, to change the font I use UIFontPickerViewController and apply the font as follows:
func fontPickerViewControllerDidPickFont(_ viewController: UIFontPickerViewController) {
    if let selectedFontDesc = viewController.selectedFontDescriptor {
        let font = UIFont(descriptor: selectedFontDesc, size: selectedFontDesc.pointSize)
        self.selectedFont = font
        self.textView.typingAttributes = [
            .foregroundColor: self.selectedColor ?? UIColor.white,
            .font: self.selectedFont ?? UIFont.preferredFont(forTextStyle: .body, compatibleWith: nil)
        ]
        if let range = self.textView.selectedTextRange, let selectedFont = selectedFont {
            let attributedText = NSMutableAttributedString(attributedString: self.textView.attributedText)
            let location = textView.offset(from: textView.beginningOfDocument, to: range.start)
            let length = textView.offset(from: range.start, to: range.end)
            let nsRange = NSRange(location: location, length: length)
            attributedText.setAttributes([.font: selectedFont], range: nsRange)
            self.textView.attributedText = attributedText
        }
    }
}
This works, but the problem is that it resets the color of the selected text and its other properties. I need a way to change one attribute without disturbing the existing attributes of the text under selection. I suspect the way to do this is with NSTextStorage, but I can't find anything good on the internet that explains the right use of NSTextStorage to achieve this.
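For what it's worth, the reset described above is exactly what `setAttributes(_:range:)` does: it replaces every attribute in the range, while `addAttributes(_:range:)` merges with whatever is already there. A minimal Foundation-only sketch of the difference; the string-valued keys here are hypothetical stand-ins for `.font` and `.foregroundColor`, just to keep UIKit out of it:

```swift
import Foundation

// Hypothetical stand-in keys; in UIKit these would be .font and .foregroundColor.
let fontKey = NSAttributedString.Key("font")
let colorKey = NSAttributedString.Key("color")

let text = NSMutableAttributedString(string: "Hello, world",
                                     attributes: [colorKey: "red"])

// setAttributes(_:range:) would wipe colorKey from the range;
// addAttributes(_:range:) keeps it and only adds/overwrites fontKey.
text.addAttributes([fontKey: "Helvetica"], range: NSRange(location: 0, length: 5))

let merged = text.attributes(at: 0, effectiveRange: nil)
print(merged[colorKey] as? String ?? "missing") // "red" survives
print(merged[fontKey] as? String ?? "missing")  // "Helvetica" was added
```

In a UITextView, the same merge applied through `textView.textStorage` (between `beginEditing()` and `endEditing()`) keeps the layout manager in sync without replacing `attributedText` wholesale.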
This issue is driving me crazy. I load an NSAttributedString into a UITextView, and within moments of loading, the foregroundColor attribute of the text is erased (i.e. it becomes white) without me doing anything. Here are the code and the NSLog dump. How do I debug this?
class ScriptEditingView: UITextView, UITextViewDelegate {
    var defaultFont = UIFont.preferredFont(forTextStyle: .body)
    var defaultTextColor = UIColor.white

    private func commonInit() {
        self.font = UIFont.preferredFont(forTextStyle: .body)
        self.allowsEditingTextAttributes = true
        self.textColor = defaultTextColor
        self.backgroundColor = UIColor.black
        self.isOpaque = true
        self.isEditable = true
        self.isSelectable = true
        self.dataDetectorTypes = []
        self.showsHorizontalScrollIndicator = false
    }
}
And then in my ViewController that contains the UITextView, I have this code:
textView = ScriptEditingView(frame: newTextViewRect, textContainer: nil)
textView.delegate = self
view.addSubview(textView)
textView.allowsEditingTextAttributes = true

let guide = view.safeAreaLayoutGuide
textView.translatesAutoresizingMaskIntoConstraints = false
NSLayoutConstraint.activate([
    textView.leadingAnchor.constraint(equalTo: guide.leadingAnchor),
    textView.trailingAnchor.constraint(equalTo: guide.trailingAnchor),
    textView.topAnchor.constraint(equalTo: view.topAnchor),
    textView.bottomAnchor.constraint(equalTo: view.bottomAnchor)
])

textView.attributedText = attributedString
NSLog("Attributed now")
dumpAttributesOfText()
DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) {
    NSLog("Attributes after 1 sec")
    self.dumpAttributesOfText()
}
And here is code to dump attributes of text:
private func dumpAttributesOfText() {
    guard let attributedText = textView.attributedText else { return }
    attributedText.enumerateAttributes(in: NSRange(location: 0, length: attributedText.length),
                                       options: .longestEffectiveRangeNotRequired) { dictionary, range, stop in
        NSLog("range \(range)")
        if let font = dictionary[.font] as? UIFont {
            NSLog("Font at range \(range) - \(font.fontName), \(font.pointSize)")
        }
        if let foregroundColor = dictionary[.foregroundColor] as? UIColor {
            NSLog("Foregroundcolor \(foregroundColor) at range \(range)")
        }
        if let underline = dictionary[.underlineStyle] as? Int {
            NSLog("Underline \(underline) at range \(range)")
        }
    }
}
The logs show this:
2022-07-02 13:16:02.841199+0400 MyApp[12054:922491] Attributed now
2022-07-02 13:16:02.841370+0400 MyApp[12054:922491] range {0, 14}
2022-07-02 13:16:02.841486+0400 MyApp[12054:922491] Font at range {0, 14} - HelveticaNeue, 30.0
2022-07-02 13:16:02.841586+0400 MyApp[12054:922491] Foregroundcolor UIExtendedGrayColorSpace 1 1 at range {0, 14}
2022-07-02 13:16:02.841681+0400 MyApp[12054:922491] range {14, 6}
2022-07-02 13:16:02.841770+0400 MyApp[12054:922491] Font at range {14, 6} - HelveticaNeue, 30.0
2022-07-02 13:16:02.841855+0400 MyApp[12054:922491] Foregroundcolor kCGColorSpaceModelRGB 0.96863 0.80784 0.27451 1 at range {14, 6}
2022-07-02 13:16:03.934816+0400 MyApp[12054:922491] Attributes after 1 sec
2022-07-02 13:16:03.935087+0400 MyApp[12054:922491] range {0, 20}
2022-07-02 13:16:03.935183+0400 MyApp[12054:922491] Font at range {0, 20} - HelveticaNeue, 30.0
2022-07-02 13:16:03.935255+0400 MyApp[12054:922491] Foregroundcolor UIExtendedGrayColorSpace 1 1 at range {0, 20}
I have the following code to connect inputNode to mainMixerNode of AVAudioEngine:
public func setupAudioEngine() {
    self.engine = AVAudioEngine()
    let format = engine.inputNode.inputFormat(forBus: 0)
    // The main mixer node is connected to the output node by default.
    engine.connect(self.engine.inputNode, to: self.engine.mainMixerNode, format: format)
    engine.prepare()
    do {
        try self.engine.start()
        engineRunning = true
    } catch {
        print("Error: couldn't start engine")
    }
}
But I am seeing a crash in Crashlytics dashboard (which I can't reproduce).
Fatal Exception: com.apple.coreaudio.avfaudio
required condition is false: IsFormatSampleRateAndChannelCountValid(format)
Before calling setupAudioEngine, I make sure the AVAudioSession category is not one (such as playback) where the mic is unavailable. The function is called from the audio route change notification handler, where I check this condition specifically. Can someone tell me what I am doing wrong?
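For context, the assertion in the crash names exactly what it checks: a sample rate and channel count that are both non-zero. During a route change the input node can momentarily report a degenerate format (0 Hz, 0 channels), so one defensive option is to guard before `engine.connect`. A sketch with the check pulled into a hypothetical helper; in `setupAudioEngine` you would feed it `format.sampleRate` and `format.channelCount` and bail out (or retry on the next route change) instead of connecting:

```swift
// Mirrors CoreAudio's IsFormatSampleRateAndChannelCountValid condition.
// Hypothetical helper, not an AVFoundation API.
func isConnectableFormat(sampleRate: Double, channelCount: UInt32) -> Bool {
    return sampleRate > 0 && channelCount > 0
}

// Sketch of usage inside setupAudioEngine, before engine.connect(...):
//
//   let format = engine.inputNode.inputFormat(forBus: 0)
//   guard isConnectableFormat(sampleRate: format.sampleRate,
//                             channelCount: format.channelCount) else {
//       return // input not available yet; try again on the next route change
//   }
//   engine.connect(engine.inputNode, to: engine.mainMixerNode, format: format)
```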
Fatal Exception: com.apple.coreaudio.avfaudio
0 CoreFoundation 0x99288 __exceptionPreprocess
1 libobjc.A.dylib 0x16744 objc_exception_throw
2 CoreFoundation 0x17048c -[NSException initWithCoder:]
3 AVFAudio 0x9f64 AVAE_RaiseException(NSString*, ...)
4 AVFAudio 0x55738 AVAudioEngineGraph::_Connect(AVAudioNodeImplBase*, AVAudioNodeImplBase*, unsigned int, unsigned int, AVAudioFormat*)
5 AVFAudio 0x5cce0 AVAudioEngineGraph::Connect(AVAudioNode*, AVAudioNode*, unsigned long, unsigned long, AVAudioFormat*)
6 AVFAudio 0xdf1a8 AVAudioEngineImpl::Connect(AVAudioNode*, AVAudioNode*, unsigned long, unsigned long, AVAudioFormat*)
7 AVFAudio 0xe0fc8 -[AVAudioEngine connect:to:format:]
8 MyApp 0xa6af8 setupAudioEngine + 701 (MicrophoneOutput.swift:701)
9 MyApp 0xa46f0 handleRouteChange + 378 (MicrophoneOutput.swift:378)
10 MyApp 0xa4f50 @objc MicrophoneOutput.handleRouteChange(note:)
11 CoreFoundation 0x2a834 __CFNOTIFICATIONCENTER_IS_CALLING_OUT_TO_AN_OBSERVER__
12 CoreFoundation 0xc6fd4 ___CFXRegistrationPost_block_invoke
13 CoreFoundation 0x9a1d0 _CFXRegistrationPost
14 CoreFoundation 0x408ac _CFXNotificationPost
15 Foundation 0x1b754 -[NSNotificationCenter postNotificationName:object:userInfo:]
16 AudioSession 0x56f0 (anonymous namespace)::HandleRouteChange(AVAudioSession*, NSDictionary*)
17 AudioSession 0x5cbc invocation function for block in avfaudio::AVAudioSessionPropertyListener(void*, unsigned int, unsigned int, void const*)
18 libdispatch.dylib 0x1e6c _dispatch_call_block_and_release
19 libdispatch.dylib 0x3a30 _dispatch_client_callout
20 libdispatch.dylib 0x11f48 _dispatch_main_queue_drain
21 libdispatch.dylib 0x11b98 _dispatch_main_queue_callback_4CF
22 CoreFoundation 0x51800 __CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__
23 CoreFoundation 0xb704 __CFRunLoopRun
24 CoreFoundation 0x1ebc8 CFRunLoopRunSpecific
25 GraphicsServices 0x1374 GSEventRunModal
26 UIKitCore 0x514648 -[UIApplication _run]
27 UIKitCore 0x295d90 UIApplicationMain
28 libswiftUIKit.dylib 0x30ecc UIApplicationMain(_:_:_:_:)
29 MyApp 0xc358 main (WhiteBalanceUI.swift)
30 ??? 0x104b1dce4 (Missing)
I get this error when I run my code on an iOS 14 device using Xcode 14.
dyld: Symbol not found: _$s7SwiftUI15GraphicsContextV4fill_4with5styleyAA4PathV_AC7ShadingVAA9FillStyleVtF
Referenced from: ******
Expected in: /System/Library/Frameworks/SwiftUI.framework/SwiftUI
I do not get any errors when I run the same code on an iOS 15 or iOS 16 device.
How do I resolve this issue?
I am unable to understand the sentence "Default is YES for all apps linked on or after iOS 15.4" found in Apple's documentation. What does it actually mean? Does it mean the default value is YES for apps running on iOS 15.4 or later? Or that it is YES if the app is built with the iOS 15.4 SDK or later? Or is it something else?
With a 4-channel audio input, I get an error while recording a movie using AVAssetWriter with the LPCM codec. Are 4 audio channels not supported by AVAssetWriterInput? Here are my compression settings:
var aclSize: size_t = 0
var currentChannelLayout: UnsafePointer<AudioChannelLayout>? = nil

/*
 * outputAudioFormat = CMSampleBufferGetFormatDescription(sampleBuffer)
 * for the latest sample buffer received in the captureOutput sample buffer delegate
 */
if let outputFormat = outputAudioFormat {
    currentChannelLayout = CMAudioFormatDescriptionGetChannelLayout(outputFormat, sizeOut: &aclSize)
}

var currentChannelLayoutData = Data()
if let currentChannelLayout = currentChannelLayout, aclSize > 0 {
    currentChannelLayoutData = Data(bytes: currentChannelLayout, count: aclSize)
}

let numChannels = AVAudioSession.sharedInstance().inputNumberOfChannels

audioSettings[AVSampleRateKey] = 48000.0
audioSettings[AVFormatIDKey] = kAudioFormatLinearPCM
audioSettings[AVLinearPCMIsBigEndianKey] = false
audioSettings[AVLinearPCMBitDepthKey] = 16
audioSettings[AVNumberOfChannelsKey] = numChannels
audioSettings[AVLinearPCMIsFloatKey] = false
audioSettings[AVLinearPCMIsNonInterleaved] = false
audioSettings[AVChannelLayoutKey] = currentChannelLayoutData
It's very hard to tell from AVFoundation error codes what the exact error is. For instance, I see the following message and don't know what the error code means.
AVFoundation has serious issues in iOS 16.1 beta 5. None of these issues are seen on iOS 16.0 or earlier.
The following code fails regularly when switching between AVCaptureMultiCamSession and AVCaptureSession. It turns out that the assetWriter.canApply(outputSettings:) condition is false for no apparent reason.
if assetWriter?.canApply(outputSettings: audioSettings!, forMediaType: AVMediaType.audio) ?? false {
}
I dumped audioSettings dictionary and here it is:
It looks like the number of channels in AVAudioSession is 3, and that is the issue. But how did that happen? Probably there is a bug where AVCaptureMultiCamSession teardown and deallocation is causing some issue.
Using AVAssetWriter with an AVCaptureMultiCamSession, many times no audio is recorded in the video under the same audio settings dictionary dumped above. There is an audio track in the video, but everything seems silent. The same code works perfectly on all other iOS versions. I checked that audio sample buffers are indeed vended during recording, but it's very likely they are silent buffers.
Is anyone aware of these issues?
I have the following audio compression settings which fail with AVAssetWriter (mov container, HEVC codec, kAudioFormatMPEG4AAC format ID):
["AVSampleRateKey": 48000, "AVFormatIDKey": 1633772320, "AVNumberOfChannelsKey": 1, "AVEncoderBitRatePerChannelKey": 128000, "AVChannelLayoutKey": <02006500 00000000 00000000 00000000 00000000 00000000 00000000 00000000>]
Here is the code line that fails:
if _assetWriter?.canApply(outputSettings: audioSettings!, forMediaType: AVMediaType.audio) ?? false {
} else {
/* Failure */
}
I want to understand what is wrong. I cannot reproduce it at my end (it is only reproducible on a user's device with a particular microphone).
I also need to know whether it is mandatory to provide a value for AVChannelLayoutKey in the dictionary with kAudioFormatMPEG4AAC; that could be a possible culprit.
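As an aside, the numeric AVFormatIDKey in the dumped dictionary is a FourCC, so decoding it confirms which format is actually being requested. A small helper (name hypothetical) that turns such a code back into its ASCII form:

```swift
import Foundation

// Decode a 32-bit FourCC (such as an AudioFormatID) into its 4-character ASCII form.
func fourCCString(_ code: UInt32) -> String {
    let bytes: [UInt8] = [24, 16, 8, 0].map { UInt8((code >> $0) & 0xFF) }
    return String(bytes: bytes, encoding: .ascii) ?? "????"
}

// The value from the settings dump above:
print(fourCCString(1633772320)) // "aac ", i.e. kAudioFormatMPEG4AAC
```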
I have the following code to determine ProRes and HDR support on iOS devices.
extension AVCaptureDevice.Format {
    var supports10bitHDR: Bool {
        let mediaType = CMFormatDescriptionGetMediaType(formatDescription)
        let mediaSubtype = CMFormatDescriptionGetMediaSubType(formatDescription)
        return mediaType == kCMMediaType_Video && mediaSubtype == kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange
    }

    var supportsProRes422: Bool {
        let mediaType = CMFormatDescriptionGetMediaType(formatDescription)
        let mediaSubtype = CMFormatDescriptionGetMediaSubType(formatDescription)
        return mediaType == kCMMediaType_Video && mediaSubtype == kCVPixelFormatType_422YpCbCr10BiPlanarVideoRange
    }
}
On iPad Pro M2, supportsProRes422 returns false for all device formats (for the wide-angle camera). Is it a bug or intentional?
I have an AVAssetWriter, and I set the audio compression settings dictionary using the canApply(outputSettings:forMediaType:) API.
One of the fields in the compression settings is the audio sample rate, set via AVSampleRateKey. My question is: if the sample rate I set with this key is different from the sample rate of the audio sample buffers that are appended, can this cause the audio to drift away from the video? Is setting an arbitrary sample rate in the asset writer settings not recommended?
I understand that CVBufferSetAttachment simply appends a metadata attachment to the buffer in a dictionary. But I see no errors when appending metadata that is contradictory in nature. For instance, for sample buffers received from the camera in HDR mode, which are in a 10-bit biplanar YUV 4:2:2 format, both of the following succeed:
CVBufferSetAttachment(testPixelBuffer!, kCVImageBufferColorPrimariesKey, kCVImageBufferColorPrimaries_ITU_R_2020, .shouldPropagate)
CVBufferSetAttachment(testPixelBuffer!, kCVImageBufferTransferFunctionKey, kCVImageBufferTransferFunction_ITU_R_2100_HLG, .shouldPropagate)
CVBufferSetAttachment(testPixelBuffer!, kCVImageBufferYCbCrMatrixKey, kCVImageBufferYCbCrMatrix_ITU_R_2020, .shouldPropagate)
Or
CVBufferSetAttachment(testPixelBuffer!, kCVImageBufferColorPrimariesKey, kCVImageBufferColorPrimaries_ITU_R_709_2, .shouldPropagate)
CVBufferSetAttachment(testPixelBuffer!, kCVImageBufferTransferFunctionKey, kCVImageBufferTransferFunction_ITU_R_709_2, .shouldPropagate)
CVBufferSetAttachment(testPixelBuffer!, kCVImageBufferYCbCrMatrixKey, kCVImageBufferYCbCrMatrix_ITU_R_709_2, .shouldPropagate)
So one could set the color primaries and transfer function to BT.709 on sample buffers that are actually 10-bit HDR. I see no errors when either buffer is appended to AVAssetWriter. I am wondering how attachments actually work and how AVFoundation resolves such contradictions.
I am planning to convert a paid app to freemium. I would like existing paid users to remain unaffected in the process. In this question I am focusing on Volume Purchase users (both existing and future). The info on Apple's developer website advises using the original StoreKit API if one needs to support Volume Purchase users:
You may need to use the Original API for in-app purchase for the following features, if your app supports them:
The Volume Purchase Program (VPP). For more information, see Device Management.
Does that mean that I can't use StoreKit 2 to verify receipts of volume purchases made before the app went freemium (to get the original purchase version and date), or that the API cannot be used to make in-app volume purchases (and users will not be able to make volume purchases from the App Store), or both?
I am looking to move from a paid app to freemium without upsetting my existing users. I see a WWDC 2022 session where new fields introduced in iOS 16 are used to extract the original application version with which the user purchased the app. While my app supports iOS 14 and above, I am willing to drop iOS 14 and go iOS 15 and above, as StoreKit 2 requires iOS 15 at a minimum. The code shown there is, however, only valid for iOS 16. What is the best way out for iOS 15 devices if I am using StoreKit 2? If it is not possible in StoreKit 2, how much work is involved with the original StoreKit API (in which case I could support iOS 14 as well)?
I am trying to migrate my app from paid to freemium and am facing several issues and doubts. Specifically, I am trying to use the StoreKit 2 AppTransaction API, but I am not averse to using the original StoreKit API if StoreKit 2 does not solve my problems.
Here are my questions:
AppTransaction/Receipt on launch: I see that on launch the AppTransaction.shared call initially fails in the sandbox. Does that mean that for users who purchased the app previously, the AppTransaction (or the appStoreReceipt in the original StoreKit) may not be available when they download or update the app, and I will need to ask every user to authenticate with the App Store to refresh the receipt/AppTransaction?
Volume Purchase Users: I see that StoreKit 2 is not advised for volume purchases on the Apple website. I am not sure why that is, but does it mean AppTransaction will not be available for users who made volume purchases under the VPP? Is the flow to validate VPP users different? If StoreKit 2 cannot be used, can the original StoreKit API help here, or can nothing help?